Collected here are resources related to automated machine learning. Some of the links come from:
- hibayesian/awesome-automl-papers
- literature-on-neural-architecture-search
- Algorithm Configuration Literature
You can take part in the AutoML Challenge,
or find competitions on Kaggle,
or search Reddit, Bing, or Quora (useful keywords include "automatic machine learning", "automl", "meta learning", and "automated machine learning"),
or visit the automl website,
or search your keyword in arxiv papers info,
among other places, to find further resources.
The papers, books, and slides below are ordered by year, and each entry is prefixed with the theme(s) it belongs to. If you want to browse a single theme, e.g. "Architecture Search", you can Ctrl+F for it and read the highlighted entries.
The themes are as follows:
- 1.【Architecture Search】: 【Random Search】;【Evolutionary Algorithms】;【Transfer Learning】;【Reinforcement Learning】;【Local Search】
- 2.【Hyperparameter Optimization】: 【Bayesian Optimization】;【Meta Learning】;【Particle Swarm Optimization】;【Lipschitz Functions】;【Random Search】;【Transfer Learning】;【Local Search】
- 3.【Multi-Objective NAS】
- 4.【Automated Feature Engineering】: 【Reinforcement Learning】;【Meta Learning】
- 5.【Frameworks】
- 6.【Meta Learning】
- 7.【Miscellaneous】
PS: the theme taxonomy is still a bit confusing; I will revise it later.
- 【Architecture Search】Fahlman, Scott E and Lebiere, Christian. The cascade correlation learning architecture. In NIPS, pp. 524–532, 1990.
- 【Architecture Search】【Evolutionary Algorithms】Stanley K O, Miikkulainen R. Evolving neural networks through augmenting topologies[J]. Evolutionary computation, 2002, 10(2): 99-127.
- 【Tutorials】【Meta Learning】Metalearning - A Tutorial
- 【Hyperparameter Optimization】【Particle Swarm Optimization】Lin S W, Ying K C, Chen S C, et al. Particle swarm optimization for parameter determination and feature selection of support vector machines[J]. Expert systems with applications, 2008, 35(4): 1817-1824.
- 【Hyperparameter Optimization】【Meta Learning】Smith-Miles K A. Cross-disciplinary perspectives on meta-learning for algorithm selection[J]. ACM Computing Surveys (CSUR), 2009, 41(1): 6.
- 【Architecture Search】【Evolutionary Algorithms】Floreano, Dario, Dürr, Peter, and Mattiussi, Claudio. Neuroevolution: from architectures to learning. Evolutionary Intelligence, 1(1):47–62, 2008.
- 【Hyperparameter Optimization】【Local Search】Hutter F, Hoos H H, Leyton-Brown K, et al. ParamILS: an automatic algorithm configuration framework[J]. Journal of Artificial Intelligence Research, 2009, 36: 267-306.
- 【Architecture Search】【Evolutionary Algorithms】Stanley, Kenneth O, D’Ambrosio, David B, and Gauci, Jason. A hypercube-based encoding for evolving large-scale neural networks. Artificial life, 15(2):185–212, 2009
- 【Bayesian Optimization】Brochu E, Cora V M, De Freitas N. A Tutorial on Bayesian Optimization of Expensive Cost Functions, with Application to Active User Modeling and Hierarchical Reinforcement Learning[J]. arXiv preprint arXiv:1012.2599, 2010.
- 【Automated Feature Engineering】【Reinforcement Learning】Gaudel R, Sebag M. Feature selection as a one-player game[C]//International Conference on Machine Learning. 2010: 359-366.
- 【Hyperparameter Optimization】【Random Search】Bergstra J S, Bardenet R, Bengio Y, et al. Algorithms for hyper-parameter optimization[C]//Advances in neural information processing systems. 2011: 2546-2554.
- 【Hyperparameter Optimization】【Bayesian Optimization】Hutter F, Hoos H H, Leyton-Brown K. Sequential model-based optimization for general algorithm configuration[C]//International Conference on Learning and Intelligent Optimization. Springer, Berlin, Heidelberg, 2011: 507-523.
- 【Hyperparameter Optimization】【Random Search】Bergstra J, Bengio Y. Random search for hyper-parameter optimization[J]. Journal of Machine Learning Research, 2012, 13(Feb): 281-305.
- 【Hyperparameter Optimization】【Bayesian Optimization】Snoek J, Larochelle H, Adams R P. Practical bayesian optimization of machine learning algorithms[C]//Advances in neural information processing systems. 2012: 2951-2959.
- 【Hyperparameter Optimization】【Transfer Learning】Bardenet R, Brendel M, Kégl B, et al. Collaborative hyperparameter tuning[C]//International Conference on Machine Learning. 2013: 199-207.
- 【Hyperparameter Optimization】【Bayesian Optimization】Bergstra J, Yamins D, Cox D D. Making a science of model search: Hyperparameter optimization in hundreds of dimensions for vision architectures[C]//International Conference on Machine Learning. 2013.
- 【Hyperparameter Optimization】【Bayesian Optimization】Thornton C, Hutter F, Hoos H H, et al. Auto-WEKA: Combined selection and hyperparameter optimization of classification algorithms[C]//Proceedings of the 19th ACM SIGKDD international conference on Knowledge discovery and data mining. ACM, 2013: 847-855.
- 【Hyperparameter Optimization】James Bergstra, David D. Cox. Hyperparameter Optimization and Boosting for Classifying Facial Expressions: How good can a "Null" Model be?[J]. arXiv preprint arXiv:1306.3476, 2013.
- 【Hyperparameter Optimization】【Transfer Learning】Yogatama D, Mann G. Efficient transfer learning method for automatic hyperparameter tuning[C]//Artificial Intelligence and Statistics. 2014: 1077-1085.
- 【Hyperparameter Optimization】Dougal Maclaurin, David Duvenaud, Ryan P. Adams. Gradient-based Hyperparameter Optimization through Reversible Learning[J]. arXiv preprint arXiv:1502.03492, 2015.
- 【Hyperparameter Optimization】Kevin Jamieson, Ameet Talwalkar. Non-stochastic Best Arm Identification and Hyperparameter Optimization[J]. arXiv preprint arXiv:1502.07943, 2015.
- 【Architecture Search】【Evolutionary Algorithms】Young S R, Rose D C, Karnowski T P, et al. Optimizing deep learning hyper-parameters through an evolutionary algorithm[C]//Proceedings of the Workshop on Machine Learning in High-Performance Computing Environments. ACM, 2015: 4.
- 【Hyperparameter Optimization】【Bayesian Optimization】Wistuba M, Schilling N, Schmidt-Thieme L. Sequential model-free hyperparameter tuning[C]//Data Mining (ICDM), 2015 IEEE International Conference on. IEEE, 2015: 1033-1038.
- 【Hyperparameter Optimization】【Bayesian Optimization】Snoek J, Rippel O, Swersky K, et al. Scalable bayesian optimization using deep neural networks[C]//International conference on machine learning. 2015: 2171-2180.
- 【Hyperparameter Optimization】【Bayesian Optimization】Wistuba M, Schilling N, Schmidt-Thieme L. Learning hyperparameter optimization initializations[C]//2015 IEEE International Conference on Data Science and Advanced Analytics (DSAA). IEEE, 2015: 1-10.
- 【Hyperparameter Optimization】【Bayesian Optimization】Schilling N, Wistuba M, Drumond L, et al. Joint model choice and hyperparameter optimization with factorized multilayer perceptrons[C]//Tools with Artificial Intelligence (ICTAI), 2015 IEEE 27th International Conference on. IEEE, 2015: 72-79.
- 【Hyperparameter Optimization】【Bayesian Optimization】Wistuba M, Schilling N, Schmidt-Thieme L. Hyperparameter search space pruning–a new component for sequential model-based hyperparameter optimization[C]//Joint European Conference on Machine Learning and Knowledge Discovery in Databases. Springer, Cham, 2015: 104-119.
- 【Hyperparameter Optimization】【Bayesian Optimization】Schilling N, Wistuba M, Drumond L, et al. Hyperparameter optimization with factorized multilayer perceptrons[C]//Joint European Conference on Machine Learning and Knowledge Discovery in Databases. Springer, Cham, 2015: 87-103.
- 【Hyperparameter Optimization】【Bayesian Optimization】【More Efficient】Feurer M, Klein A, Eggensperger K, et al. Efficient and robust automated machine learning[C]//Advances in Neural Information Processing Systems. 2015: 2962-2970.
- 【Frameworks】Thakur A, Krohn-Grimberghe A. AutoCompete: A Framework for Machine Learning Competition[J]. arXiv preprint arXiv:1507.02188, 2015.
- 【Automated Feature Engineering】【Expand Reduce】Kanter J M, Veeramachaneni K. Deep feature synthesis: Towards automating data science endeavors[C]//2015 IEEE International Conference on Data Science and Advanced Analytics (DSAA). IEEE, 2015: 1-10.
- 【Architecture Search】【Hyperparameter Optimization】【Bayesian Optimization】Mendoza H, Klein A, Feurer M, et al. Towards automatically-tuned neural networks[C]//Workshop on Automatic Machine Learning. 2016: 58-65.
- 【Hyperparameter Optimization】Fabian Pedregosa. Hyperparameter optimization with approximate gradient[J]. arXiv preprint arXiv:1602.02355, 2016.
- 【Hyperparameter Optimization】【Random Search】Li L, Jamieson K, DeSalvo G, et al. Hyperband: A novel bandit-based approach to hyperparameter optimization[J]. arXiv preprint arXiv:1603.06560, 2016.
- 【Hyperparameter Optimization】Loshchilov I, Hutter F. CMA-ES for hyperparameter optimization of deep neural networks[J]. arXiv preprint arXiv:1604.07269, 2016.
- 【Hyperparameter Optimization】Julien-Charles Lévesque, Christian Gagné, Robert Sabourin. Bayesian Hyperparameter Optimization for Ensemble Learning[J]. arXiv preprint arXiv:1605.06394, 2016.
- 【Hyperparameter Optimization】【Bayesian Optimization】【More Efficient】Klein A, Falkner S, Bartels S, et al. Fast bayesian optimization of machine learning hyperparameters on large datasets[J]. arXiv preprint arXiv:1605.07079, 2016.
- 【Architecture Search】【Meta Learning】Li K, Malik J. Learning to optimize[J]. arXiv preprint arXiv:1606.01885, 2016.
- 【Architecture Search】Saxena S, Verbeek J. Convolutional neural fabrics[C]//Advances in Neural Information Processing Systems. 2016: 4053-4061.
- 【Architecture Search】【Reinforcement Learning】Cortes, Corinna, Gonzalvo, Xavi, Kuznetsov, Vitaly, Mohri, Mehryar, and Yang, Scott. Adanet: Adaptive structural learning of artificial neural networks. arXiv preprint arXiv:1607.01097, 2016.
- 【Hyperparameter Optimization】Ilija Ilievski, Taimoor Akhtar, Jiashi Feng, Christine Annette Shoemaker. Efficient Hyperparameter Optimization of Deep Learning Algorithms Using Deterministic RBF Surrogates[J]. arXiv preprint arXiv:1607.08316, 2016.
- 【Hyperparameter Optimization】【Transfer Learning】Ilija Ilievski, Jiashi Feng. Hyperparameter Transfer Learning through Surrogate Alignment for Efficient Deep Neural Network Training. arXiv preprint arXiv:1608.00218, 2016.
- 【Architecture Search】【Reinforcement Learning】Zoph B, Le Q V. Neural architecture search with reinforcement learning[J]. arXiv preprint arXiv:1611.01578, 2016.
- 【Architecture Search】【Reinforcement Learning】Baker B, Gupta O, Naik N, et al. Designing neural network architectures using reinforcement learning[J]. arXiv preprint arXiv:1611.02167, 2016.
- 【Hyperparameter Optimization】【More Efficient】Klein A, Falkner S, Springenberg J T, et al. Learning curve prediction with Bayesian neural networks[J]. 2016.
- 【Hyperparameter Optimization】【Transfer Learning】Wistuba M, Schilling N, Schmidt-Thieme L. Hyperparameter optimization machines[C]//Data Science and Advanced Analytics (DSAA), 2016 IEEE International Conference on. IEEE, 2016: 41-50.
- 【Hyperparameter Optimization】【Transfer Learning】Joy T T, Rana S, Gupta S K, et al. Flexible transfer learning framework for bayesian optimisation[C]//Pacific-Asia Conference on Knowledge Discovery and Data Mining. Springer, Cham, 2016: 102-114.
- 【Hyperparameter Optimization】【Bayesian Optimization】Wistuba M, Schilling N, Schmidt-Thieme L. Two-stage transfer surrogate model for automatic hyperparameter optimization[C]//Joint European conference on machine learning and knowledge discovery in databases. Springer, Cham, 2016: 199-214.
- 【Hyperparameter Optimization】【Bayesian Optimization】Shahriari B, Swersky K, Wang Z, et al. Taking the human out of the loop: A review of bayesian optimization[J]. Proceedings of the IEEE, 2016, 104(1): 148-175.
- 【Hyperparameter Optimization】【Bayesian Optimization】Schilling N, Wistuba M, Schmidt-Thieme L. Scalable hyperparameter optimization with products of gaussian process experts[C]//Joint European conference on machine learning and knowledge discovery in databases. Springer, Cham, 2016: 33-48.
- 【Hyperparameter Optimization】【Bayesian Optimization】Springenberg J T, Klein A, Falkner S, et al. Bayesian optimization with robust bayesian neural networks[C]//Advances in Neural Information Processing Systems. 2016: 4134-4142.
- 【Automated Feature Engineering】【Hierarchical Organization of Transformations】Khurana U, Turaga D, Samulowitz H, et al. Cognito: Automated feature engineering for supervised learning[C]//Data Mining Workshops (ICDMW), 2016 IEEE 16th International Conference on. IEEE, 2016: 1304-1307.
- 【Automated Feature Engineering】【Expand Reduce】Katz G, Shin E C R, Song D. Explorekit: Automatic feature generation and selection[C]//Data Mining (ICDM), 2016 IEEE 16th International Conference on. IEEE, 2016: 979-984.
- 【Automated Feature Engineering】【Expand Reduce】Khurana U, Nargesian F, Samulowitz H, et al. Automating Feature Engineering[J]. Transformation, 2016, 10(10): 10.
- 【Architecture Search】【Evolutionary Algorithms】【More Efficient】Miikkulainen, Risto, Liang, Jason, Meyerson, Elliot, Rawal, Aditya, Fink, Dan, Francon, Olivier, Raju, Bala, Navruzyan, Arshak, Duffy, Nigel, and Hodjat, Babak. Evolving deep neural networks. arXiv preprint arXiv:1703.00548, 2017.
- 【Architecture Search】【Hyperparameter Optimization】【Evolutionary Algorithms】Real E, Moore S, Selle A, et al. Large-scale evolution of image classifiers[J]. arXiv preprint arXiv:1703.01041, 2017.
- 【Architecture Search】【Evolutionary Algorithms】Xie, Lingxi and Yuille, Alan. Genetic cnn. arXiv preprint arXiv:1703.01513, 2017.
- 【Hyperparameter Optimization】Luca Franceschi, Michele Donini, Paolo Frasconi, Massimiliano Pontil. Forward and Reverse Gradient-Based Hyperparameter Optimization[J]. arXiv preprint arXiv:1703.01785, 2017.
- 【Hyperparameter Optimization】【Lipschitz Functions】Malherbe C, Vayatis N. Global optimization of Lipschitz functions[J]. arXiv preprint arXiv:1703.02628, 2017.
- 【Hyperparameter Optimization】【Meta Learning】Ben Goertzel, Nil Geisweiller, Chris Poulin. Metalearning for Feature Selection[J]. arXiv preprint arXiv:1703.06990, 2017.
- 【Architecture Search】Suganuma, Masanori, Shirakawa, Shinichi, and Nagao, Tomoharu. A genetic programming approach to designing convolutional neural network architectures. arXiv preprint arXiv:1704.00764, 2017.
- 【Architecture Search】Negrinho, Renato and Gordon, Geoff. Deeparchitect: Automatically designing and training deep architectures. arXiv preprint arXiv:1704.08792, 2017.
- 【Hyperparameter Optimization】Gonzalo Diaz, Achille Fokoue, Giacomo Nannicini, Horst Samulowitz. An effective algorithm for hyperparameter optimization of neural networks[J]. arXiv preprint arXiv:1705.08520, 2017.
- 【Automated Feature Engineering】【Expand Reduce】Lam H T, Thiebaut J M, Sinn M, et al. One button machine for automating feature engineering in relational databases[J]. arXiv preprint arXiv:1706.00327, 2017.
- 【Hyperparameter Optimization】Hazan E, Klivans A, Yuan Y. Hyperparameter Optimization: A Spectral Approach[J]. arXiv preprint arXiv:1706.00764, 2017.
- 【Hyperparameter Optimization】Jesse Dodge, Kevin Jamieson, Noah A. Smith. Open Loop Hyperparameter Optimization and Determinantal Point Processes[J]. arXiv preprint arXiv:1706.01566, 2017.
- 【Architecture Search】Huang, Furong, Ash, Jordan, Langford, John, and Schapire, Robert. Learning deep resnet blocks sequentially using boosting theory. arXiv preprint arXiv:1706.04964, 2017
- 【Hyperparameter Optimization】【Meta Learning】Fábio Pinto, Vítor Cerqueira, Carlos Soares, João Mendes-Moreira. autoBagging: Learning to Rank Bagging Workflows with Metalearning[J]. arXiv preprint arXiv:1706.09367, 2017.
- 【Architecture Search】Cai H, Chen T, Zhang W, et al. Efficient Architecture Search by Network Transformation[J]. arXiv preprint arXiv:1707.04873, 2017.
- 【Architecture Search】【Transfer Learning】Zoph B, Vasudevan V, Shlens J, et al. Learning transferable architectures for scalable image recognition[J]. arXiv preprint arXiv:1707.07012, 2017.
- 【Architecture Search】【More Efficient】Brock A, Lim T, Ritchie J M, et al. SMASH: one-shot model architecture search through hypernetworks[J]. arXiv preprint arXiv:1708.05344, 2017.
- 【Architecture Search】【Reinforcement Learning】Zhong, Zhao, Yan, Junjie, and Liu, Cheng-Lin. Practical network blocks design with q-learning. arXiv preprint arXiv:1708.05552, 2017.
- 【Architecture Search】【Reinforcement Learning】Bello I, Zoph B, Vasudevan V, et al. Neural optimizer search with reinforcement learning[J]. arXiv preprint arXiv:1709.07417, 2017.
- 【Automated Feature Engineering】【Reinforcement Learning】Khurana U, Samulowitz H, Turaga D. Feature Engineering for Predictive Modeling using Reinforcement Learning[J]. arXiv preprint arXiv:1709.07150, 2017.
- 【Hyperparameter Optimization】Jungtaek Kim, Saehoon Kim, Seungjin Choi. Learning to Warm-Start Bayesian Hyperparameter Optimization[J]. arXiv preprint arXiv:1710.06219, 2017.
- 【Architecture Search】【Evolutionary Algorithms】Liu, Hanxiao, Simonyan, Karen, Vinyals, Oriol, Fernando, Chrisantha, and Kavukcuoglu, Koray. Hierarchical representations for efficient architecture search. arXiv preprint arXiv:1711.00436, 2017.
- 【Architecture Search】【Local Search】Elsken T, Metzen J H, Hutter F. Simple and efficient architecture search for convolutional neural networks[J]. arXiv preprint arXiv:1711.04528, 2017.
- 【Architecture Search】Max Jaderberg, Valentin Dalibard, Simon Osindero, Wojciech M. Czarnecki, Jeff Donahue, Ali Razavi, Oriol Vinyals, Tim Green, Iain Dunning, Karen Simonyan, Chrisantha Fernando, Koray Kavukcuoglu. Population Based Training of Neural Networks[J]. arXiv preprint arXiv:1711.09846, 2017.
- 【Architecture Search】【More Efficient】Liu C, Zoph B, Shlens J, et al. Progressive neural architecture search[J]. arXiv preprint arXiv:1712.00559, 2017.
- 【Architecture Search】Wistuba M. Finding Competitive Network Architectures Within a Day Using UCT[J]. arXiv preprint arXiv:1712.07420, 2017.
- 【Hyperparameter Optimization】【Particle Swarm Optimization】Lorenzo P R, Nalepa J, Kawulok M, et al. Particle swarm optimization for hyper-parameter selection in deep neural networks[C]//Proceedings of the Genetic and Evolutionary Computation Conference. ACM, 2017: 481-488.
- 【Frameworks】Swearingen T, Drevo W, Cyphers B, et al. ATM: A distributed, collaborative, scalable system for automated machine learning[C]//IEEE International Conference on Big Data. 2017.
- 【Frameworks】Golovin D, Solnik B, Moitra S, et al. Google vizier: A service for black-box optimization[C]//Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. ACM, 2017: 1487-1495.
- 【Automated Feature Engineering】【Meta Learning】Nargesian F, Samulowitz H, Khurana U, et al. Learning feature engineering for classification[C]//Proceedings of the 26th International Joint Conference on Artificial Intelligence. AAAI Press, 2017: 2529-2535.
- 【Miscellaneous】Wistuba M, Schilling N, Schmidt-Thieme L. Automatic Frankensteining: Creating Complex Ensembles Autonomously[C]//Proceedings of the 2017 SIAM International Conference on Data Mining. SIAM, 2017.
- 【Architecture Search】【Evolutionary Algorithms】Real E, Aggarwal A, Huang Y, et al. Regularized Evolution for Image Classifier Architecture Search[J]. arXiv preprint arXiv:1802.01548, 2018.
- 【Architecture Search】【Reinforcement Learning】Pham H, Guan M Y, Zoph B, et al. Efficient Neural Architecture Search via Parameter Sharing[J]. arXiv preprint arXiv:1802.03268, 2018.
- 【Architecture Search】Kandasamy K, Neiswanger W, Schneider J, et al. Neural Architecture Search with Bayesian Optimisation and Optimal Transport[J]. arXiv preprint arXiv:1802.07191, 2018.
- 【Hyperparameter Optimization】Lorraine, Jonathan, and David Duvenaud. Stochastic Hyperparameter Optimization through Hypernetworks[J]. arXiv preprint arXiv:1802.09419, 2018.
- 【Hyperparameter Optimization】【Evolutionary Algorithms】Chen B, Wu H, Mo W, et al. Autostacker: A Compositional Evolutionary Learning System[J]. arXiv preprint arXiv:1803.00684, 2018.
- 【More Efficient】Wong C, Houlsby N, Lu Y, et al. Transfer Automatic Machine Learning[J]. arXiv preprint arXiv:1803.02780, 2018.
- 【Architecture Search】Kamath P, Singh A, Dutta D. Neural Architecture Construction using EnvelopeNets[J]. arXiv preprint arXiv:1803.06744, 2018.
- 【Hyperparameter Optimization】Cui, Henggang, Gregory R. Ganger, and Phillip B. Gibbons. MLtuner: System Support for Automatic Machine Learning Tuning[J]. arXiv preprint arXiv:1803.07445, 2018.
- 【Architecture Search】【More Efficient】Bennani-Smires K, Musat C, Hossmann A, et al. GitGraph - from Computational Subgraphs to Smaller Architecture Search Spaces[J]. 2018.
- 【Multi-Objective NAS】Dong J D, Cheng A C, Juan D C, et al. PPP-Net: Platform-aware Progressive Search for Pareto-optimal Neural Architectures[J]. 2018.
- 【Architecture Search】【More Efficient】Baker B, Gupta O, Raskar R, et al. Accelerating Neural Architecture Search using Performance Prediction[J]. 2018.
- 【Architecture Search】Huang, Siyu, et al. GNAS: A Greedy Neural Architecture Search Method for Multi-Attribute Learning[J]. arXiv preprint arXiv:1804.06964, 2018.
- 【book】【Meta Learning】Brazdil P, Carrier C G, Soares C, et al. Metalearning: Applications to data mining[M]. Springer Science & Business Media, 2008.
- 【Articles】【Bayesian Optimization】Bayesian Optimization for Hyperparameter Tuning
- 【Articles】【Meta Learning】Learning to learn
- 【Articles】【Meta Learning】Why Meta-learning is Crucial for Further Advances of Artificial Intelligence?
- 【Articles】automl_aws_data_science
- 【News】what-is-automl-promises-vs-reality
- 【book】Sibanjan Das, Umit Mert Cakmak. Hands-On Automated Machine Learning. Packt Publishing, 2018.
- Featuretools: a good library for automatically engineering features from relational and transactional data (a minimal usage sketch appears after this list).
- auto-sklearn: a drop-in replacement for scikit-learn estimators (see the sketch after this list).
- MLBox: another AutoML library. It supports distributed data processing, cleaning, and formatting, as well as state-of-the-art algorithms such as LightGBM and XGBoost. It also supports model stacking, which combines an ensemble of models to build a new model that aims to perform better than the individual models (a generic stacking sketch appears after this list).
- 【python】Xcessiv: a web-based application for quick, scalable, and automated hyperparameter tuning and stacked ensembling in Python.
- 【python】TPOT: uses genetic programming to find the best-performing ML pipelines; it is built on top of scikit-learn (see the sketch after this list).
- 【python】Advisor
- 【java】Auto-WEKA
- 【python】Hyperopt
- 【python】Hyperopt-sklearn
- 【python】SigOpt
- 【python】SMAC3
- 【python】RoBO
- 【python】BayesianOptimization
- 【python】Scikit-Optimize
- 【python】HyperBand
- 【cpp】BayesOpt
- 【python】Optunity
- 【python】ATM
- 【python】Cloud AutoML
- 【python】H2O-official website; H2O-github
- 【python】DataRobot
- 【python】MLJAR
- 【python】MateLabs
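
A minimal sketch of what Featuretools' automated feature engineering looks like, using its Deep Feature Synthesis entry point `ft.dfs` (API as of the 0.x releases; the toy `customers`/`transactions` tables are invented for illustration):

```python
import pandas as pd
import featuretools as ft

# Two toy relational tables: customers and their transactions (made up for illustration).
customers = pd.DataFrame({"customer_id": [1, 2], "join_year": [2015, 2016]})
transactions = pd.DataFrame({
    "transaction_id": [10, 11, 12],
    "customer_id": [1, 1, 2],
    "amount": [25.0, 40.0, 10.0],
})

# Register the tables and their one-to-many relationship in an EntitySet.
es = ft.EntitySet(id="toy_data")
es = es.entity_from_dataframe(entity_id="customers", dataframe=customers, index="customer_id")
es = es.entity_from_dataframe(entity_id="transactions", dataframe=transactions, index="transaction_id")
es = es.add_relationship(ft.Relationship(es["customers"]["customer_id"],
                                         es["transactions"]["customer_id"]))

# Deep Feature Synthesis stacks aggregation/transform primitives (SUM, MEAN, ...)
# across the relationship to generate candidate features per customer.
feature_matrix, feature_defs = ft.dfs(entityset=es, target_entity="customers")
print(feature_matrix.head())
```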
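auto-sklearn's "drop-in replacement" claim means its estimators follow scikit-learn's usual fit/predict interface; a minimal sketch (the estimator name and `time_left_for_this_task` argument are from the auto-sklearn docs, the budget value itself is an arbitrary choice):

```python
from sklearn import datasets, metrics, model_selection
import autosklearn.classification

X, y = datasets.load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = model_selection.train_test_split(X, y, random_state=1)

# The AutoML estimator looks like any scikit-learn classifier; internally it
# searches over preprocessing + model + hyperparameter choices and ensembles them.
automl = autosklearn.classification.AutoSklearnClassifier(
    time_left_for_this_task=120,  # total search budget in seconds (arbitrary here)
)
automl.fit(X_train, y_train)
print(metrics.accuracy_score(y_test, automl.predict(X_test)))
```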
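To make the model-stacking idea in the MLBox entry concrete, here is a generic stacking sketch written in plain scikit-learn rather than MLBox's own API: out-of-fold predictions from level-0 models become the input features of a level-1 model:

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import cross_val_predict, train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

base_models = [RandomForestClassifier(random_state=0),
               GradientBoostingClassifier(random_state=0)]

# Level 0: out-of-fold probability predictions, so the level-1 model never
# sees predictions made on data the base model was trained on.
train_meta = np.column_stack([
    cross_val_predict(m, X_train, y_train, cv=5, method="predict_proba")[:, 1]
    for m in base_models
])
test_meta = np.column_stack([
    m.fit(X_train, y_train).predict_proba(X_test)[:, 1] for m in base_models
])

# Level 1: a simple model combines the base models' predictions.
stacker = LogisticRegression().fit(train_meta, y_train)
print(accuracy_score(y_test, stacker.predict(test_meta)))
```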
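And a minimal TPOT sketch showing the genetic-programming search over scikit-learn pipelines (class and arguments per the TPOT docs; the small `generations`/`population_size` values just keep the toy run short):

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from tpot import TPOTClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Genetic programming evolves whole pipelines (preprocessors + models +
# hyperparameters) for `generations` rounds over a population of candidates.
tpot = TPOTClassifier(generations=5, population_size=20, verbosity=2, random_state=42)
tpot.fit(X_train, y_train)
print(tpot.score(X_test, y_test))

# Export the best pipeline found as a standalone scikit-learn script.
tpot.export("best_pipeline.py")
```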