A Julia implementation of boosted trees with CPU and GPU support. Efficient histogram-based algorithms with support for multiple loss functions, notably multi-target objectives such as maximum-likelihood methods.

Input features are expected to be `Matrix{Float64}` or `Matrix{Float32}` when using the internal API. Tables/DataFrames formats can be handled through MLJ. See the docs for further details.
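As an illustration of the MLJ route, a minimal sketch is shown below; the model-loading incantation, the DataFrame columns and the `max_depth` value are assumptions for the example, not part of this README.

```julia
using MLJ, DataFrames

# Load the EvoTrees regressor through MLJ's model registry.
EvoTreeRegressor = @load EvoTreeRegressor pkg=EvoTrees

# Hypothetical tabular data: any Tables.jl-compatible source works with MLJ.
X = DataFrame(x1=rand(1_000), x2=rand(1_000))
y = rand(1_000)

mach = machine(EvoTreeRegressor(max_depth=5), X, y)
fit!(mach)
preds = predict(mach, X)
```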
Latest:

```julia
julia> Pkg.add(url="https://github.com/Evovest/EvoTrees.jl")
```

From General Registry:

```julia
julia> Pkg.add("EvoTrees")
```
Data consists of randomly generated `Float32`. Training is performed over 200 iterations. Code to reproduce is here.

EvoTrees: v0.13.1, XGBoost: v2.0.2.

CPU: AMD Ryzen 5900X (12 threads). GPU: NVIDIA RTX A4000.
Training:

Dimensions / Algo | XGBoost Hist | EvoTrees | EvoTrees GPU |
---|---|---|---|
100K x 100 | 1.31s | 1.17s | 3.20s |
500K x 100 | 6.73s | 4.77s | 4.81s |
1M x 100 | 13.27s | 8.42s | 6.71s |
5M x 100 | 67.3s | 43.6s | 21.7s |
Inference:

Dimensions / Algo | XGBoost Hist | EvoTrees | EvoTrees GPU |
---|---|---|---|
100K x 100 | 0.125s | 0.030s | 0.008s |
500K x 100 | 0.550s | 0.209s | 0.031s |
1M x 100 | 1.10s | 0.410s | 0.074s |
5M x 100 | 5.44s | 2.14s | 0.302s |
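A rough sketch of how such a timing run could look is given below; the data generation and the hyper-parameter values are assumptions here, the actual benchmark script is the one linked above.

```julia
using EvoTrees

# Randomly generated Float32 data, as described above (sizes are illustrative).
nobs, nfeats = 100_000, 100
x_train = rand(Float32, nobs, nfeats)
y_train = rand(Float32, nobs)

config = EvoTreeRegressor(nrounds=200, max_depth=6, nbins=64)

# Crude wall-clock timing of training and inference.
@time m = fit_evotree(config; x_train, y_train)
@time preds = m(x_train)
```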
See official project page for more info.
```julia
using EvoTrees

# Hyper-parameters of the regressor (squared-error objective).
config = EvoTreeRegressor(
    loss=:linear,
    nrounds=100,
    nbins=100,
    lambda=0.5,
    gamma=0.1,
    eta=0.1,
    max_depth=6,
    min_weight=1.0,
    rowsample=0.5,
    colsample=1.0)

# x_train (features matrix) and y_train (target vector) must be defined beforehand.
m = fit_evotree(config; x_train, y_train)
preds = m(x_train)
```
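The snippet assumes `x_train` and `y_train` already exist; any `Float32`/`Float64` matrix and matching target vector will do, for example (purely illustrative data):

```julia
x_train = rand(Float32, 10_000, 10)
y_train = rand(Float32, 10_000)
```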
Returns the normalized gain by feature:

```julia
features_gain = importance(m)
```
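Assuming the returned collection iterates as feature => gain pairs (an assumption about the return layout, not stated in this README), the contributions can be inspected with something like:

```julia
# Hypothetical inspection; adjust to the actual return type of importance(m).
for (fname, gain) in features_gain
    println(fname, " => ", round(gain, digits=3))
end
```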
Plot a given tree of the model (requires a plotting package such as Plots.jl to be loaded):

```julia
using Plots
plot(m, 2)
```
Note that the first tree is used to set the bias, so the first real tree is #2.
Save and load a trained model:

```julia
EvoTrees.save(m, "data/model.bson")
m = EvoTrees.load("data/model.bson");
```

A GPU model should be converted into a CPU one before saving: `m_cpu = convert(EvoTree, m_gpu)`.
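A short sketch tying the two notes together, assuming a GPU-trained model `m_gpu` is already in hand (how it was trained is not shown here):

```julia
# m_gpu is assumed to be a model trained on GPU.
m_cpu = convert(EvoTree, m_gpu)          # move the model back to CPU structures
EvoTrees.save(m_cpu, "data/model.bson")  # then persist it as usual
m_loaded = EvoTrees.load("data/model.bson")
```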