
/*
Copyright (c) by respective owners including Yahoo!, Microsoft, and
individual contributors. All rights reserved.  Released under a BSD (revised)
license as described in the file LICENSE.
 */


This is the Vowpal Wabbit fast online learning code. For Windows, see README.windows.txt.

Prerequisite software

These prerequisites are usually pre-installed on many platforms. However, you may need to consult your favorite package manager (yum, apt, MacPorts, brew, ...) to install any missing software; an example for yum-based systems is sketched after this list.

  • Boost library, with the Boost::Program_Options library option enabled.
  • lsb-release (RedHat/CentOS: redhat-lsb-core, Debian: lsb-release, Ubuntu: you're all set, OSX: not required)
  • GNU autotools: autoconf, automake, libtool, autoheader, et al. This is not a strict prerequisite. On many systems (notably Ubuntu with libboost-program-options-dev installed), the provided Makefile works fine.
  • (optional) git if you want to check out the latest version of vowpal wabbit, work on the code, or even contribute code to the main project.
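
For example, on a yum-based system (RedHat/CentOS/Fedora) the prerequisites above can typically be installed with something like the following (a sketch; exact package names can vary by distribution and release):

# -- Install build prerequisites (Boost, zlib, lsb, GNU autotools, git):
yum install boost-devel zlib-devel redhat-lsb-core autoconf automake libtool git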

Getting the code

You can download a released version from the project's releases page. The very latest version is always available via GitHub by invoking one of the following:

## For the (read-only) git protocol:
$ git clone git://github.com/JohnLangford/vowpal_wabbit.git

## For HTTPS-based Git interaction:
$ git clone https://github.com/JohnLangford/vowpal_wabbit.git
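
If you have a GitHub account with an SSH key configured (useful if you plan to push changes), the SSH-based clone uses GitHub's standard SSH URL form:

## For SSH-based Git interaction:
$ git clone git@github.com:JohnLangford/vowpal_wabbit.git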

Compiling

You should be able to build vowpal wabbit on most systems with:

$ make
$ make test    # (optional)

If that fails, try:

$ ./autogen.sh
$ make
$ make test    # (optional)
$ make install

Note that ./autogen.sh requires automake (see the prerequisites above).

./autogen.sh's command line arguments are passed directly to configure as if they were configure arguments and flags.

Note that ./autogen.sh will overwrite the supplied Makefile, so you may want to keep a copy of the supplied Makefile before running ./autogen.sh.
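
As a quick sanity check after a successful build, you can print the version and train on a single example piped in on stdin (a minimal sketch; the in-tree binary path vowpalwabbit/vw is assumed):

$ ./vowpalwabbit/vw --version
$ echo "1 | feature_a:1 feature_b:0.5" | ./vowpalwabbit/vw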

Be sure to read the wiki: https://github.com/JohnLangford/vowpal_wabbit/wiki for the tutorial, command line options, etc.

The 'cluster' directory has its own documentation for cluster-parallel use, and the examples at the end of test/RunTests give some example flags.

C++ Optimization

The default C++ compiler optimization flags are very aggressive. If you run into a problem, consider generating and running configure with the --enable-debug option, e.g.:

$ ./configure --enable-debug

or passing your own compiler flags via the OPTIM_FLAGS make variable:

$ make OPTIM_FLAGS="-O0 -g"
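
Putting these together, a full debug rebuild from a clean tree might look like this (a sketch; ./autogen.sh forwards its arguments to configure, as noted above):

$ ./autogen.sh --enable-debug
$ make clean
$ make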

Ubuntu/Debian specific info

On Ubuntu/Debian/Mint and similar distributions, the following sequence should work for building the latest version from GitHub:

# -- Get libboost program-options and zlib:
apt-get install libboost-program-options-dev zlib1g-dev

# -- Get the python libboost bindings (python subdir) - optional:
apt-get install libboost-python-dev

# -- Get the vw source:
git clone git://github.com/JohnLangford/vowpal_wabbit.git

# -- Build:
cd vowpal_wabbit
make
make test       # (optional)
make install

Ubuntu advanced build options (clang and static)

If you prefer building with clang instead of gcc (much faster build and a slightly faster executable), install clang and change the make step slightly:

apt-get install clang

make CXX=clang++

A statically linked vw executable that is not sensitive to Boost version upgrades and can be safely copied between different Linux versions (e.g. even from Ubuntu to Red Hat) can be built and tested with:

make CXX='clang++ -static' clean vw test     # ignore warnings
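
To confirm that the resulting binary is statically linked, you can inspect it with ldd (a sketch; the in-tree binary path is assumed):

ldd vowpalwabbit/vw     # a fully static binary reports "not a dynamic executable"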

Mac OS X-specific info

OSX requires glibtool, which is available via the brew or MacPorts package managers.

Complete brew install of Vowpal Wabbit 8.0

brew install vowpal-wabbit

The Homebrew formula for VW is located on GitHub.

Manual install of Vowpal Wabbit

OSX Dependencies (if using Brew):

brew install libtool
brew install autoconf
brew install automake
brew install boost
brew install boost-python

OSX Dependencies (if using MacPorts):

## Install glibtool and other GNU autotool friends:
$ port install libtool autoconf automake

## Build Boost for Mac OS X 10.8 and below
$ port install boost +no_single +no_static +openmpi +python27 configure.cxx_stdlib=libc++ configure.cxx=clang++

## Build Boost for Mac OS X 10.9 and above
$ port install boost +no_single +no_static +openmpi +python27

OSX Manual compile:

Mac OS X 10.8 and below: configure.cxx_stdlib=libc++ and configure.cxx=clang++ ensure that clang++ uses the correct C++11 functionality while building Boost. Ordinarily, clang++ relies on the older GNU g++ 4.2 series header files and stdc++ library; libc++ is the clang replacement that provides newer C++11 functionality. If these flags aren't present, you will likely encounter compilation errors when compiling vowpalwabbit/cbify.cc. These error messages generally contain complaints about std::to_string and std::unique_ptr types missing.

To compile:

$ sh autogen.sh --enable-libc++
$ make
$ make test    # (optional)

Code Documentation

To browse the code more easily, do

make doc

and then point your browser to doc/html/index.html.
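
To open the generated pages from the command line (a sketch; xdg-open is common on Linux desktops, while macOS uses open):

xdg-open doc/html/index.html    # on macOS: open doc/html/index.html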
