CS 217: Artificial Intelligence and Machine Learning (CS 240: AIML Lab)

Goal of the course

This course is an undergraduate introductory course on artificial intelligence and machine learning (AIML). The goal is to introduce the fundamental concepts from a mathematical viewpoint. The lab counterpart of the course puts these concepts into practice. The course is broadly divided into four components:

  1. Optimization: this component is a quick refresher on optimization, in particular, linear and convex programming.
  2. Single-agent AI: machine learning: this component considers the machine learning problem in detail. It covers supervised and unsupervised learning including neural networks.
  3. Multi-agent AI: game theory: this component takes a different viewpoint on the learning problem and predicts the outcome when multiple agents have different objectives/goals, leading to the solution concept being one of "equilibrium" rather than "optimal".
  4. Additional topics: this section considers classical AI topics of search, dynamic programming, A*, etc., and Markov decision problems, i.e., the foundation of reinforcement learning.
The prerequisites for this course are familiarity with probability, linear algebra, and calculus. Exposure to programming is necessary for the lab component. All programming assignments are expected to be done in Python using certain libraries.

Tentative plan of the course (subject to change, requires LDAP).

Announcements

  1. Very important: The courses CS 217 and CS 240 are core courses for CSE BTech 2nd-year students. Hence, no other students are allowed to credit or audit this course (except those who could not clear CS 337/335 in an earlier edition). Violation of this rule will result in your paper not being graded and an FR grade. I'm not responsible for how ASC is configured or whether it allows you to register.
  2. This course will be in slot 6 (Wed, Fri, 11:00-12:30 hrs), Venue: LH 301
  3. First meeting of the course is on Friday, January 5, 2024, at 11:00 hrs, venue LH 301.
  4. Each lecture is required to be scribed by two groups of students. The lecture-to-scribe-group assignment will be announced soon. Here is the template for scribing the lectures. Here and here are some good introductions to LaTeX. Please upload the scribed lecture notes to Moodle within 2 days of the class. I'll polish them and post them on the course page within about 3 days of the class. The less updating a draft needs, the more credit it earns, so prepare the first draft carefully.
    Lecture to Scribe Group Mapping (requires IITB LDAP).
  5. Get started: enroll yourself on Piazza. Moodle will ONLY be used for lab assignments and scribe uploads. Moodle links for CS 217 and CS 240.
  6. All class-related announcements will be made on Piazza. If you are not enrolled there, you may miss the announcements.
  7. No audit grades will be given for CS 217. However, you're welcome to sit in on the lectures and ask questions on Piazza. For CS 240, we won't be able to accommodate any sit-in students.
  8. Quiz 1 will be on Wed, Feb 7, 2024. Time: regular class time. More details about the exam will be posted on Piazza closer to the date.
  9. Midsem will be on the Institute-scheduled date, time, and venue (between Feb 23 and Mar 2, 2024). More details about the exam will be posted on Piazza closer to the date.
  10. Quiz 2 will be on Wed, Apr 3, 2024. Time: regular class time. More details about the exam will be posted on Piazza closer to the date.
  11. Endsem will be on the Institute-scheduled date, time, and venue (between Apr 22 and May 2, 2024). More details about the exam will be posted on Piazza closer to the date.
  12. Midsem lab exam: Feb 25, 2024 (2-5 PM + additional 1 hour for compensatory time candidates)
  13. Endsem lab exam: Apr 21, 2024 (2-5 PM + additional 1 hour for compensatory time candidates)

IIT Bombay Honor Code

Students are expected to adhere to the highest standards of integrity and academic honesty. Acts such as copying in examinations, sharing code, or viewing online solutions for the lab assignments will be dealt with strictly, in accordance with the institute's procedures and disciplinary actions for academic malpractice.

Logistics

  • Instructor: Swaprava Nath (office hours: by appointment, mail at swaprava AT cse DOT iitb DOT ac DOT in with [CS 217] in the subject)
  • Course managers: Firuza Karmali (p11119@iitb.ac.in), Nageshrau Karmali (nags.ides@iitb.ac.in)
  • Teaching assistants: Ramsundar Anandanarayanan (ramsundar@cse.iitb.ac.in), Harshvardhan Agarwal (200050050@iitb.ac.in), Onkar Borade (200050022@iitb.ac.in), Rounak Dalmia (200050119@iitb.ac.in), Kartik Gokhale (200100083@iitb.ac.in), Vidit Goel (200050156@iitb.ac.in), Pulkit Agarwal (200050113@iitb.ac.in), Swaroop Nath (21q050014@iitb.ac.in), Poulami Ghosh (22d0363@iitb.ac.in), Santosh Kavhar (22m0787@iitb.ac.in), Drashthi Doshi (23d0362@iitb.ac.in), Goda Nagakalyani (214050010@iitb.ac.in), Somil Swapnil Chandra (22m0809@iitb.ac.in), Chakka Srikanth Yadav (23m0794@iitb.ac.in), Varn Gupta (23m0749@iitb.ac.in), Atul Kumar (23m0764@iitb.ac.in).
  • Know your teaching staff in two slides (needs IITB LDAP).
  • Classroom: LH 301
  • Evaluation:
    CS 217
    1. Two quizzes -- 15% weightage for each
    2. One midsem (30%) and one endsem exam (35%)
    3. Scribing -- 5%
    4. All exams are offline, proctored, in-lecture-hall, pen and paper, closed book (unless otherwise mentioned). Grading is done using Gradescope. More updates on that will come closer to the time. Please check Piazza messages.
    CS 240
    1. Weekly lab assignments -- 3% weightage for each of the best 10 submissions (30% total)
    2. One midsem (35%) and one endsem lab exam (35%)
  • Course calendar

Reference texts

  1. [RN] "Artificial Intelligence: A Modern Approach", Stuart J. Russell and Peter Norvig, Pearson.
  2. [Bis] "Pattern Recognition and Machine Learning", Christopher M. Bishop, Springer.
  3. [Mur] "Probabilistic Machine Learning: An Introduction", Kevin Murphy, MIT Press.
  4. [HTF] "The Elements of Statistical Learning", Trevor Hastie, Robert Tibshirani and Jerome Friedman, Springer.
  5. [MML] "Mathematics for Machine Learning", Marc Peter Deisenroth, A. Aldo Faisal, and Cheng Soon Ong, Cambridge University Press.

Weekly lectures

Most of the linked content needs IITB LDAP access. Introduction lecture boardwork.

Week 01

  • Lecture 01: Introduction to Optimization
    A brief review of optimization and linear programming. Reading: any reference on linear optimization. [Boardwork] [Scribed lecture notes]
  • Lecture 02: Refresher on Linear and Convex Optimization
    Basics of linear programming, simplex method, convex optimization, ideas of solving convex programs. Reading: any reference on linear optimization. To dig deeper into optimization, the book by Chong and Zak is useful (has a chapter dedicated to simplex method). [Boardwork] [Simplex video tutorial] [Scribed lecture notes]
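A key fact behind the simplex method is that an LP optimum (when one exists) lies at a vertex of the feasible polytope. The toy sketch below illustrates this by brute-force vertex enumeration in pure Python; the specific LP is made up for illustration, and this is of course not how simplex is actually implemented.

```python
from itertools import combinations

# Toy LP: maximize 3x + 2y subject to x + y <= 4, x <= 2, x >= 0, y >= 0.
# Each constraint is stored as (a, b, c), meaning a*x + b*y <= c.
A = [(1.0, 1.0, 4.0),   # x + y <= 4
     (1.0, 0.0, 2.0),   # x      <= 2
     (-1.0, 0.0, 0.0),  # -x     <= 0  (i.e., x >= 0)
     (0.0, -1.0, 0.0)]  # -y     <= 0  (i.e., y >= 0)
c = (3.0, 2.0)          # objective coefficients

def intersect(r1, r2):
    """Intersection point of two constraint boundaries, or None if parallel."""
    a1, b1, c1 = r1
    a2, b2, c2 = r2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

def feasible(p):
    return all(a * p[0] + b * p[1] <= rhs + 1e-9 for a, b, rhs in A)

# Candidate vertices: feasible intersections of pairs of constraint boundaries.
vertices = [p for r1, r2 in combinations(A, 2)
            if (p := intersect(r1, r2)) is not None and feasible(p)]

# The optimum is the best vertex under the objective.
best = max(vertices, key=lambda p: c[0] * p[0] + c[1] * p[1])
print(best)  # (2.0, 2.0), with objective value 10
```

Simplex exploits the same fact but walks from vertex to adjacent vertex, improving the objective at each step, instead of enumerating all intersections.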
Week 02

  • Lecture 03: Introduction to Regression
    Use cases of regression, general setup, parametrized, linear regression, closed form expression of linear regression. [Boardwork] [Scribed lecture notes]
  • Lecture 04: Linear Regression, Solution techniques
    Geometric interpretation of linear regression, basis functions, probabilistic model of linear regression, maximum likelihood estimation, optimizing for linear regression, gradient descent. References: [MML] book for linear algebra and interpretation of linear equations, [Bis]. [Boardwork] [Scribed lecture notes]
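The closed-form least-squares solution mentioned above, the normal equations, can be sketched in a few lines of NumPy (the toy data below is illustrative):

```python
import numpy as np

# Closed-form linear regression: w* solves (X^T X) w = X^T y.
# Toy data generated from y = 2x + 1, so the recovered weights are known.
x = np.linspace(0, 1, 50)
y = 2 * x + 1  # noiseless, for a clean sanity check

X = np.column_stack([np.ones_like(x), x])  # bias column + feature column
w = np.linalg.solve(X.T @ X, X.T @ y)      # solve, don't invert explicitly
print(w)  # approximately [1. 2.]
```

Using `np.linalg.solve` rather than explicitly forming the inverse of `X.T @ X` is both faster and numerically safer.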

Week 03

  • Lecture 05: Maximum A Posteriori (MAP) Estimate
    Stochastic gradient descent, posterior probability, maximizing the posterior: MAP estimator, conjugate priors, MAP for linear regression. [Boardwork] [Scribed lecture notes]
  • Lecture 06: Bias-Variance, Regularization
    Bias and variance of models, tradeoffs, applications in overfitting and underfitting, regularization, Ridge and LASSO methods of regularization. [Boardwork] [Scribed lecture notes]
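The MAP estimate for linear regression with a Gaussian prior on the weights coincides with ridge regression, whose solution is also closed form. A minimal NumPy sketch (the toy data and regularization strengths are illustrative):

```python
import numpy as np

# Ridge regression closed form: w* = (X^T X + lam * I)^{-1} X^T y.
# With a zero-mean Gaussian prior on w, the MAP estimate for linear
# regression reduces to exactly this L2-regularized least-squares solution.
def ridge_fit(X, y, lam):
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=100)

w_ols = ridge_fit(X, y, lam=0.0)   # lam = 0 recovers ordinary least squares
w_reg = ridge_fit(X, y, lam=10.0)  # larger lam shrinks weights toward 0
print(np.linalg.norm(w_reg) < np.linalg.norm(w_ols))  # True: ridge shrinks
```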

Week 04

  • Lecture 07: Introduction to Classification
    Pending items from ridge and LASSO regressions, classification problem setup, naive Bayes classifier and its applications, introduction to logistic regression. [Boardwork] [Scribed lecture notes]
  • Lecture 08: Logistic Regression and Multiclass Classification
    Binary logistic regression, multiclass classification: one-vs-rest classifier, softmax regression. Generative and discriminative models, special case of naive Bayes and logistic regression equivalence. Optional reading: convergence analysis of gradient descent (with certain assumptions). [Boardwork] [Scribed lecture notes]
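A minimal sketch of binary logistic regression trained by batch gradient descent; the learning rate, step count, and synthetic data are illustrative choices, not course-mandated values:

```python
import numpy as np

# Binary logistic regression via gradient descent.
# Gradient of the average negative log-likelihood: X^T (sigmoid(Xw) - y) / n.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, steps=2000):
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        w -= lr * X.T @ (sigmoid(X @ w) - y) / len(y)
    return w

# Synthetic data: labels drawn from a known logistic model w_true = [0.5, 2.0].
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = (rng.random(200) < sigmoid(X @ np.array([0.5, 2.0]))).astype(float)

w = fit_logistic(X, y)
acc = np.mean((sigmoid(X @ w) > 0.5) == (y == 1))
print(acc)  # training accuracy; near the Bayes rate for this model
```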
Week 05

  • Lecture 09: Perceptron
    Perceptron algorithm, convergence, geometric interpretation, loss-function interpretation. Introduction to decision trees. [Boardwork]
  • Lecture 10: Decision Trees
    Decision tree examples, construction rules using mutual information, stopping criteria, overfitting. [Boardwork]
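The perceptron update rule from Lecture 09 can be sketched in a few lines (the toy dataset below is illustrative and linearly separable, so the algorithm converges):

```python
import numpy as np

# Perceptron: on a mistake for example (x, y) with y in {-1, +1},
# update w <- w + y * x. Converges when the data are linearly separable.
def perceptron(X, y, max_epochs=100):
    w = np.zeros(X.shape[1])
    for _ in range(max_epochs):
        mistakes = 0
        for xi, yi in zip(X, y):
            if yi * (w @ xi) <= 0:  # misclassified (or on the boundary)
                w += yi * xi
                mistakes += 1
        if mistakes == 0:           # a full pass with no mistakes: done
            return w
    return w

# Separable toy data: label = sign of x1 + x2.
X = np.array([[1.0, 2.0], [2.0, 1.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w = perceptron(X, y)
print(np.all(np.sign(X @ w) == y))  # True once converged
```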
Week 06

Regrading requests

The regrading procedure in this course is intended to correct serious errors in grading, not to provide an opportunity to argue about every judgment call made by the graders. We will only consider a regrading request if there is a significant error in grading; if you sincerely feel that your exam was unfairly graded, we will look it over carefully. That said, if this feature is used to raise frivolous regrading requests, then, based on the severity of the attempt, a penalty may be imposed of up to 50% of the marks for that question (irrespective of the marks you originally received). We are not trying to scare off students whose exams were graded incorrectly; we are trying to avoid frivolous requests.

What Merits a Regrade: The following are the usual circumstances that may lead to an increase in points:

  • Your answer is really the same as the one on the answer key, but the grader didn’t realize it.
  • Your answer is different from the one provided on the answer key, but your answer is also correct.

What Doesn’t Merit a Regrade: The following are not valid reasons for regrades:

  • “Most of what I wrote is correct, so I think I deserve more partial credit.” Partial credit is given equally for all students who write a particular answer, so it would not be fair to give you more points for this without adding points to all students who wrote the same answer.
  • “I wrote so much, and the grader didn’t notice that the correct answer is buried somewhere within this long paragraph.” You will lose points if the correct answer is accompanied by incorrect information or by so much irrelevant information that it gives the impression that you were just writing down everything you could think of on this topic. To get full credit you must demonstrate the ability to pull out the relevant info and to exclude irrelevant info.
  • “I’m just 1 point away from an A, so I thought it was worth scrounging around to find an extra point somewhere.” Lobbying for grades is considered inappropriate by the institute rules, so your case may be reported to appropriate authorities.

Virtual Q and A

We will be using Piazza for class discussion. The system is designed for getting help quickly and efficiently from classmates, the TAs, and me. Rather than emailing questions to the teaching staff, I encourage you to post your questions on Piazza.

  • Enrollment link (students and TAs, please register yourself here -- access code will be given on the first meeting of the course)
  • Class link
