2017-2 Artificial Intelligence
Time: Wed 9:00-10:30, Fri 10:30-12:00
Location: Engineering Bd 1, Room 509
Textbook: Python Machine Learning, Sebastian Raschka, PACKT Publishing
Lecture Schedule:
Oct 27 (Fri), Nov 1 (Wed): no lecture
Nov 3 (Fri): regular schedule
Nov 8 (Wed): 9:00-10:30 and 13:00-14:30 (both in the regular classroom, Engineering Bd 1, Room 509)
Exams:
Midterm: Oct 25 13:00-15:00
Location: Engineering Bd 1, R101
Content: all lectures before the exam (closed-book)
Results [scores]: Total 50 points, Avg 20.75
Final exam: Dec 6, 18:00-20:00
Location: Engineering Bd 1, R101
Content: all lectures from the beginning of the semester until Dec 1st (closed-book)
Results [scores]: Total 50 points, Avg 25.43
Exam sheet checking: Dec 18~19, 13:00~15:00, Artificial Intelligence Lab (Cluster Bd. R620)
Mini-Project:
MNIST dataset
D1: Training set: 60,000 [train-images-idx3-ubyte.gz] [train-labels-idx1-ubyte.gz]
D2: Test set (1/6 of the entire test set, with labels): 10,000 [test-images-idx3-ubyte.gz] [test-labels-idx1-ubyte.gz]
D3: Test set (the entire test set, without labels): 60,000 [testall-images-idx3-ubyte.gz]
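The files above are in the standard MNIST idx format: a big-endian header (magic number, then the dimension sizes) followed by raw unsigned bytes, gzip-compressed. A minimal loader sketch (the function names `load_idx_images` and `load_idx_labels` are my own, not part of the assignment):

```python
import gzip
import struct
import numpy as np

def load_idx_images(path):
    # idx3 image file: magic 2051, then count, rows, cols (big-endian uint32),
    # then one unsigned byte per pixel.
    with gzip.open(path, "rb") as f:
        magic, n, rows, cols = struct.unpack(">IIII", f.read(16))
        assert magic == 2051, "not an idx3 image file"
        data = np.frombuffer(f.read(), dtype=np.uint8)
    # Flatten each image into a row vector of rows*cols pixels.
    return data.reshape(n, rows * cols)

def load_idx_labels(path):
    # idx1 label file: magic 2049, then count, then one unsigned byte per label.
    with gzip.open(path, "rb") as f:
        magic, n = struct.unpack(">II", f.read(8))
        assert magic == 2049, "not an idx1 label file"
        return np.frombuffer(f.read(), dtype=np.uint8)
```

For example, `load_idx_images("train-images-idx3-ubyte.gz")` would return a 60,000 x 784 uint8 array.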
Apply a machine learning algorithm from the class
Do hyperparameter tuning
Goal: to achieve the best accuracy on the entire test set, D3.
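One way to carry out the tuning step above (a sketch only, not a prescribed method; all names are my own): hold out part of the training data as a validation split and keep the hyperparameter value with the best validation accuracy, shown here for k in a simple k-NN classifier:

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k):
    # Squared Euclidean distance from every test point to every training point.
    d = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    nn = np.argsort(d, axis=1)[:, :k]      # indices of the k nearest neighbours
    votes = y_train[nn]                    # their labels
    # Majority vote per test point.
    return np.array([np.bincount(v).argmax() for v in votes])

def tune_k(X, y, ks, val_frac=0.2, seed=0):
    # Hold out val_frac of the data; pick the k with the best validation accuracy.
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    n_val = int(len(X) * val_frac)
    val, tr = idx[:n_val], idx[n_val:]
    best_k, best_acc = ks[0], -1.0
    for k in ks:
        acc = (knn_predict(X[tr], y[tr], X[val], k) == y[val]).mean()
        if acc > best_acc:
            best_k, best_acc = k, acc
    return best_k, best_acc
```

Cross-validation (averaging over several such splits) is the more robust variant of the same idea.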
What to submit:
A description of your machine learning method (5 pages, A4, PDF, detailed enough for reproducibility)
The ML method chosen
How the training was performed (pre-processing, validation split, cross-validation, etc.)
Values of hyperparameters, and how they were chosen
Python code to train with the given training data and to produce label predictions (0~9) for the given test data:
myCLF.py <training_images.gz> <training_labels.gz> <test_images.gz>
The filename and the argument format must be the same as above
Print the labels to standard output, one per line; for four test images, produce predictions like
0
2
3
5
A text file containing the prediction results of D3
Filename: prediction.txt
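A minimal skeleton matching the required filename and argument format (the `classify` function here is a stand-in; a real submission would load the idx files, train the chosen model, and predict a label for every test image):

```python
import sys

def classify(train_images_path, train_labels_path, test_images_path):
    # Placeholder: load the training data, fit your model, and return one
    # predicted label (0~9) per test image. A fixed demo list is returned
    # here only so the script runs end to end.
    return [0, 2, 3, 5]

def main(argv):
    # Required invocation:
    #   myCLF.py <training_images.gz> <training_labels.gz> <test_images.gz>
    if len(argv) != 4:
        print("usage: myCLF.py <training_images.gz> <training_labels.gz> "
              "<test_images.gz>", file=sys.stderr)
        return 1
    for label in classify(*argv[1:4]):
        print(label)  # one prediction (0~9) per line on standard output
    return 0

if __name__ == "__main__":
    sys.exit(main(sys.argv))
```

Redirecting the output (e.g. `python myCLF.py ... > prediction.txt`) produces the required prediction file.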
Submit to the TA via email: jeonghyeonlee@icloud.com
Archive everything into a zip file: <your student ID>.zip
Email should be received by Dec 22, 18:00 (no late submission will be accepted)
Discussion is encouraged, but you MUST produce your own answer (code, report, predictions)
Copying others' results will receive 0 points
Result
[link]
If you do not submit prediction.txt, you will receive a 25% penalty.
If prediction.txt contains fewer than 60,000 labels, you will also receive a 25% penalty.
Students who received a penalty are marked in blue.
If neither the report nor prediction.txt is submitted, the accuracy is scored as zero.
Final grading:
Midterm: 40%
Final exam: 40%
Mini-project: 10%
Attendance: 10%
Lecture Notes
Lecture 01. Introduction [link]
Lecture 02. Artificial Neural Networks (updated Oct 19) [link]
Lecture 03. Logistic Regression (updated Oct 19) [link]
Lecture 04. Support Vector Machine (updated Oct 19) [link]
Lecture 05. Decision Trees and KNN (updated Oct 19) [link]
Lecture 06. Data pre-processing [link]
Midterm Summary [link]
Lecture 07. Dimensionality reduction: PCA [link]
Lecture 08. Dimensionality reduction: LDA and kernel PCA (updated Nov 10) [link]
Lecture 09: Model Evaluation and Hyperparameter Tuning [link]
Lecture 10: Ensemble method [link]
Lecture 11: Sentiment Analysis [link]
Lecture 12: Regression Analysis (updated Dec 1) [link]
Lecture 13: Clustering Analysis [link]
Lecture 14: Artificial Neural Network 1 [use the link below]
Lecture 15: Artificial Neural Network 2 [link]