
mlcourse.ai – Open Machine Learning Course

License: CC BY-NC-SA 4.0

The final session launched on September 2, 2019. You can join at any point until the end of the session (November 22). Fill in this form to participate, and please explore the main page mlcourse.ai as well.

Mirrors (🇬🇧-only): mlcourse.ai (main site), Kaggle Dataset (same notebooks as Kernels)

Outline

This is the list of published articles on medium.com 🇬🇧 and habr.com 🇷🇺. Notebooks in Chinese 🇨🇳 are also mentioned, and links to Kaggle Kernels (in English) are given. Icons are clickable. A minimal Pandas sketch illustrating topic 1 follows the list.

  1. Exploratory Data Analysis with Pandas 🇬🇧 🇷🇺 🇨🇳, Kaggle Kernel
  2. Visual Data Analysis with Python 🇬🇧 🇷🇺 🇨🇳, Kaggle Kernels: part1, part2
  3. Classification, Decision Trees and k Nearest Neighbors 🇬🇧 🇷🇺 🇨🇳, Kaggle Kernel
  4. Linear Classification and Regression 🇬🇧 🇷🇺 🇨🇳, Kaggle Kernels: part1, part2, part3, part4, part5
  5. Bagging and Random Forest 🇬🇧 🇷🇺 🇨🇳, Kaggle Kernels: part1, part2, part3
  6. Feature Engineering and Feature Selection 🇬🇧 🇷🇺 🇨🇳, Kaggle Kernel
  7. Unsupervised Learning: Principal Component Analysis and Clustering 🇬🇧 🇷🇺 🇨🇳, Kaggle Kernel
  8. Vowpal Wabbit: Learning with Gigabytes of Data 🇬🇧 🇷🇺 🇨🇳, Kaggle Kernel
  9. Time Series Analysis with Python, part 1 🇬🇧 🇷🇺 🇨🇳. Predicting the future with Facebook Prophet, part 2 🇬🇧 🇨🇳, Kaggle Kernels: part1, part2
  10. Gradient Boosting 🇬🇧 🇷🇺 🇨🇳, Kaggle Kernel
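To give a quick taste of topic 1, here is a minimal Pandas sketch on a tiny hand-made DataFrame. The data and column names are made up for illustration only; the course article works with real datasets.

```python
import pandas as pd

# A tiny made-up dataset, for illustration only
df = pd.DataFrame({
    "age": [25, 32, 47, 51, 23, 36],
    "city": ["Moscow", "Moscow", "Kyiv", "Minsk", "Kyiv", "Moscow"],
    "churn": [0, 1, 0, 1, 0, 1],
})

print(df.shape)                            # dimensions of the table
print(df.dtypes)                           # column types
print(df.describe())                       # summary statistics for numeric columns
print(df["city"].value_counts())           # frequency of each category
print(df.groupby("city")["churn"].mean())  # churn rate per city
```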

Lectures

Video lectures are uploaded to this YouTube playlist. Introduction, video, slides

  1. Exploratory data analysis with Pandas, video
  2. Visualization, main plots for EDA, video
  3. Decision trees: theory and practical part
  4. Logistic regression: theoretical foundations, practical part (baselines in the "Alice" competition)
  5. Ensembles and Random Forest – part 1. Classification metrics – part 2. Example of a business task, predicting a customer payment – part 3
  6. Linear regression and regularization – theory, LASSO & Ridge, LTV prediction – practice
  7. Unsupervised learning – Principal Component Analysis and Clustering
  8. Stochastic Gradient Descent for classification and regression – part 1, part 2 TBA
  9. Time series analysis with Python (ARIMA, Prophet) – video
  10. Gradient boosting: basic ideas – part 1, key ideas behind Xgboost, LightGBM, and CatBoost + practice – part 2 (a minimal boosting sketch follows this list)
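To give a flavour of lecture 10, below is a minimal gradient boosting sketch using scikit-learn on synthetic data. This is an assumed illustration rather than course material; the lecture itself covers Xgboost, LightGBM, and CatBoost in depth.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic binary classification data (made up for illustration)
X, y = make_classification(n_samples=1000, n_features=20, random_state=17)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=17
)

# Gradient boosting: each new shallow tree is fit to correct the errors
# of the ensemble built so far (the "basic ideas" of part 1)
clf = GradientBoostingClassifier(
    n_estimators=100, learning_rate=0.1, max_depth=3, random_state=17
)
clf.fit(X_train, y_train)
print(f"Holdout accuracy: {accuracy_score(y_test, clf.predict(X_test)):.3f}")
```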

Fall 2019 assignments

All deadlines are 20:59 GMT (London time); also check out this Google calendar.

  1. Exploratory data analysis of Olympic games with Pandas, nbviewer. Deadline: September 15
  2. Trees, forests and boosting
  • Quiz 1. Trees and forests, nbviewer. Deadline: September 27
  • Part 1. Classification and regression trees, nbviewer. Deadline: October 6
  • Part 2. Beating a baseline in a Kaggle competition, CatBoost starter. Deadline: October 6
  3. Linear classification and regression models
  • Quiz 2. Math behind linear models, nbviewer. Deadline: October 25
  • Part 1. User Identification with Logistic Regression, nbviewer. Deadline: October 27
  • Part 2. Random Forest and Logistic Regression in credit scoring and movie reviews classification, nbviewer. Deadline: October 27
  4. Unsupervised learning and time series
  • Quiz 3. Unsupervised learning and time series, nbviewer. Deadline: November 15
  • Assignment 4. Time series analysis, nbviewer. Deadline: November 17
  • Dota 2 winner prediction competition. Deadline for submissions: November 18

Demo assignments, just for practice

The following are demo versions. Full versions are announced during course sessions.

  1. Exploratory data analysis with Pandas, nbviewer, Kaggle Kernel, solution
  2. Analyzing cardiovascular disease data, nbviewer, Kaggle Kernel, solution
  3. Decision trees with a toy task and the UCI Adult dataset, nbviewer, Kaggle Kernel, solution
  4. Sarcasm detection, Kaggle Kernel, solution. Linear Regression as an optimization problem, nbviewer, Kaggle Kernel
  5. Logistic Regression and Random Forest in the credit scoring problem, nbviewer, Kaggle Kernel, solution
  6. Exploring OLS, Lasso and Random Forest in a regression task, nbviewer, Kaggle Kernel, solution
  7. Unsupervised learning, nbviewer, Kaggle Kernel, solution
  8. Implementing online regressor, nbviewer, Kaggle Kernel, solution
  9. Time series analysis, nbviewer, Kaggle Kernel, solution
  10. Beating a baseline in a competition, Kaggle Kernel

Kaggle competitions

  1. Catch Me If You Can: Intruder Detection through Webpage Session Tracking. Kaggle Inclass
  2. DotA 2 winner prediction. Kaggle Inclass

Rating

Throughout the course, we maintain a student rating. It takes into account credits scored in assignments and Kaggle competitions. They say the rating strongly motivates students to finish the course. Top students (according to the final rating) are listed on a special page.

Community

Discussions are held in the #mlcourse_ai channel of the OpenDataScience (ods.ai) Slack team.

The course is free, but you can support the organizers by making a pledge on Patreon (monthly support) or a one-time payment on Ko-fi. This way you'll help foster the spread of Machine Learning in the world!

