- Run `bash build.sh` (you can also type `make`)
- If you have a C++11 compiler, it is recommended to type `make cxx11=1`
  - C++11 is not used by default
- If your compiler does not come with OpenMP support, it will fire a warning telling you that the code will compile into single-thread mode, and you will get a single-thread xgboost
- You may get an error: `-lgomp` is not found
  - You can type `make no_omp=1`; this will get you a single-thread xgboost
  - Alternatively, you can upgrade your compiler to compile a multi-thread version
- Windows (VS 2010): see the `../windows` folder
  - In principle, you put all the cpp files in the Makefile into the project, and build
- OS X with multi-threading support: see the next section
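Before building, you can probe whether your compiler supports OpenMP at all. A minimal sketch follows; the compiler name `cc` and the `/tmp` paths are assumptions for illustration, not part of the xgboost build system:

```shell
# Sketch: probe whether the default C compiler supports OpenMP before building.
# The compiler name `cc` and the /tmp paths are assumptions, not xgboost code.
cat > /tmp/omp_check.c <<'EOF'
#include <omp.h>
#include <stdio.h>
int main(void) {
    printf("max threads: %d\n", omp_get_max_threads());
    return 0;
}
EOF
if cc -fopenmp /tmp/omp_check.c -o /tmp/omp_check 2>/dev/null; then
    msg="OpenMP available: the default multi-threaded build should work"
else
    msg="no OpenMP: build with 'make no_omp=1' for single-thread xgboost"
fi
echo "$msg"
```

If the probe fails, `make no_omp=1` is the quickest way to get a working (single-thread) build while you sort out your toolchain.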
Here is the complete solution to use OpenMP-enabled compilers to install XGBoost.

- Obtain gcc-5.x.x with OpenMP support by `brew install gcc --without-multilib`. (`brew` is the de facto standard of `apt-get` on OS X, so installing HPC separately is not recommended, but it should work.)
- `cd xgboost`, then `bash build.sh` to compile XGBoost.
- Install the xgboost package for Python and R
  - For Python: go to the `python-package` sub-folder to install the Python version with `python setup.py install` (or `sudo python setup.py install`).
  - For R: set the `Makevars` file with the highest priority for R. The point is, there are three `Makevars` files: `~/.R/Makevars`, `xgboost/R-package/src/Makevars`, and `/usr/local/Cellar/r/3.2.0/R.framework/Resources/etc/Makeconf` (the last one obtained by running `file.path(R.home("etc"), "Makeconf")` in R), and `SHLIB_OPENMP_CXXFLAGS` is not set by default! After trying, it seems that the first one has the highest priority (surprise!). Then inside R, run

    ```r
    install.packages('xgboost/R-package/', repos=NULL, type='source')
    ```

    or

    ```r
    devtools::install_local('xgboost/', subdir = 'R-package') # you may use devtools
    ```
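To see which of the three files exist on your machine and whether they set the OpenMP flag, a small shell sketch (the relative `xgboost/` checkout path and the availability of `Rscript` are assumptions):

```shell
# Sketch: report which of the three files exist and whether they set
# SHLIB_OPENMP_CXXFLAGS. The relative xgboost path is an assumption.
makeconf="$(Rscript -e 'cat(file.path(R.home("etc"), "Makeconf"))' 2>/dev/null || true)"
checked=0
for f in "$HOME/.R/Makevars" "xgboost/R-package/src/Makevars" "$makeconf"; do
    checked=$((checked + 1))
    if [ -n "$f" ] && [ -f "$f" ]; then
        if grep -q "SHLIB_OPENMP_CXXFLAGS" "$f"; then
            echo "$f: sets SHLIB_OPENMP_CXXFLAGS"
        else
            echo "$f: does not set SHLIB_OPENMP_CXXFLAGS"
        fi
    else
        echo "${f:-<Makeconf not found; is R installed?>}: missing"
    fi
done
```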
- To build xgboost for use with HDFS/S3 support and distributed learning, it is recommended to build with dmlc, with the following steps:
  - `git clone https://github.com/dmlc/dmlc-core`
  - Follow the instructions in `dmlc-core/make/config.mk` to compile `libdmlc.a`
  - In the root folder of xgboost, type `make dmlc=dmlc-core`
  - This will allow xgboost to directly load data and save models from/to HDFS and S3
    - Simply replace the filename with the prefix `s3://` or `hdfs://`
  - This xgboost can be used for distributed learning
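The filename-prefix convention can be illustrated with a small sketch; the example paths are made up, and the real scheme dispatch happens inside `libdmlc`, not in shell:

```shell
# Sketch of how xgboost/dmlc-core selects a filesystem from the path prefix.
# Example paths are made up; the real dispatch lives inside libdmlc.
classify_path() {
    case "$1" in
        hdfs://*) echo "hdfs" ;;   # routed to the HDFS driver
        s3://*)   echo "s3" ;;     # routed to the S3 driver
        *)        echo "local" ;;  # plain local filesystem
    esac
}
classify_path "train.libsvm"
classify_path "hdfs:///user/me/train.libsvm"
classify_path "s3://my-bucket/train.libsvm"
```

So a training command that reads `train.libsvm` locally will read from S3 unchanged once you pass `s3://my-bucket/train.libsvm` instead.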