An optimized general purpose gradient boosting library. The library is parallelized, and also provides an optimized distributed version. It implements machine learning algorithms under the gradient boosting framework, including the generalized linear model and gradient boosted regression trees (GBDT). XGBoost can also be distributed and scales to terascale data.
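As a minimal sketch of choosing between the two boosters from the Python package (the file path is a placeholder; parameter names follow the standard XGBoost parameter list):

```python
import xgboost as xgb

# Load libsvm-format training data into XGBoost's DMatrix
# ('train.libsvm' is a placeholder path).
dtrain = xgb.DMatrix('train.libsvm')

# Gradient boosted regression trees (the default booster).
tree_params = {'booster': 'gbtree', 'objective': 'binary:logistic',
               'max_depth': 3, 'eta': 0.1}

# Generalized linear model under the same boosting framework.
linear_params = {'booster': 'gblinear', 'objective': 'binary:logistic'}

bst_tree = xgb.train(tree_params, dtrain, num_boost_round=10)
bst_linear = xgb.train(linear_params, dtrain, num_boost_round=10)
```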
Contributors: https://github.com/dmlc/xgboost/graphs/contributors
Documentation: Documentation of xgboost
Issues Tracker: https://github.com/dmlc/xgboost/issues
Please join the XGBoost User Group to ask questions and share your experience with xgboost.
- Use the issue tracker for bug reports, feature requests, etc.
- Use the user group to share your experience and ask questions about general usage.
Distributed Version: Distributed XGBoost
Highlights of Use Cases: Highlight Links
- XGBoost wins the WWW2015 Microsoft Malware Classification Challenge (BIG 2015)
  - Check out the winning solution at Highlight links
- External Memory Version
- XGBoost now supports HDFS and S3
- Distributed XGBoost now runs on YARN
- Join the xgboost user group to track changes and share your experience with xgboost
- New features in the latest changes :)
  - Distributed version that scales xgboost to even larger problems with clusters
  - Feature importance visualization in the R module, thanks to Michael Benesty
  - Predict leaf index; see demo/guide-python/predict_leaf_indices.py and the sketch after this list
- XGBoost wins Tradeshift Text Classification
- XGBoost wins HEP meets ML Award in Higgs Boson Challenge
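Leaf-index prediction, mentioned above, returns the index of the leaf each sample falls into in every tree. A minimal sketch with the Python package (demo/guide-python/predict_leaf_indices.py is the authoritative reference; the file paths here are placeholders):

```python
import xgboost as xgb

# Placeholder paths; the demo script uses the agaricus sample data.
dtrain = xgb.DMatrix('train.libsvm')
dtest = xgb.DMatrix('test.libsvm')

bst = xgb.train({'max_depth': 2, 'objective': 'binary:logistic'},
                dtrain, num_boost_round=3)

# With pred_leaf=True, predict returns an (nsample, ntrees) matrix of
# leaf indices instead of probabilities.
leaf_index = bst.predict(dtest, pred_leaf=True)
print(leaf_index.shape)
```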
- Easily accessible in Python, R, Julia, and the CLI
- Fast and memory efficient
  - Can be more than 10 times faster than GBM in sklearn and R
  - Handles sparse matrices and supports external memory (see the sketch after this list)
- Accurate predictions, used extensively by data scientists and Kagglers
  - See highlight links
- Distributed and portable
  - The distributed version runs on Hadoop (YARN), MPI, SGE, etc.
  - Scales to billions of examples and beyond
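As an illustration of the sparse-matrix and external-memory support above, a hedged sketch in Python (the file name is a placeholder; the cache-file suffix follows the external-memory documentation):

```python
import numpy as np
import scipy.sparse as sp
import xgboost as xgb

# A sparse matrix can be passed to DMatrix directly; zero entries
# are stored sparsely rather than densified.
X = sp.csr_matrix(np.array([[1.0, 0.0, 2.0],
                            [0.0, 3.0, 0.0]]))
y = np.array([1, 0])
dtrain = xgb.DMatrix(X, label=y)

# External-memory version: appending '#<cache name>' to a libsvm path
# streams the data through an on-disk cache instead of holding it all
# in memory ('train.libsvm' is a placeholder).
# dtrain_ext = xgb.DMatrix('train.libsvm#dtrain.cache')

bst = xgb.train({'objective': 'binary:logistic'}, dtrain, num_boost_round=2)
```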
- Run `bash build.sh` (you can also type `make`); normally this gives what you want
- See Build Instruction for more information
- This version is xgboost-0.3; the code has been refactored from 0.2x to be cleaner and more flexible
- This version of xgboost is not compatible with 0.2x, due to the huge amount of changes in code structure
- This means the model and buffer files of the previous version cannot be loaded in xgboost-0.3
- For legacy 0.2x code, refer to Here
- Change log in CHANGES.md
- XGBoost is adopted as part of the boosted tree toolkit in Graphlab Create (GLC). Graphlab Create is a powerful Python toolkit that lets you do data manipulation, graph processing, hyper-parameter search, and visualization of terabyte-scale data in one framework. Try Graphlab Create at http://graphlab.com/products/create/quick-start-guide.html
- Nice blog post by Jay Gu on using GLC boosted trees to solve the Kaggle bike sharing challenge: http://blog.graphlab.com/using-gradient-boosted-trees-to-predict-bike-sharing-demand