Merge pull request #16 from php-ai/develop
Simple Neural Network with MultilayerPerceptron and Backpropagation
akondas authored Aug 14, 2016
2 parents 0869043 + 3599367 commit 41ac2e3
Showing 44 changed files with 1,599 additions and 25 deletions.
8 changes: 5 additions & 3 deletions CHANGELOG.md
@@ -3,9 +3,11 @@ CHANGELOG

This changelog references the relevant changes done in PHP-ML library.

* 0.1.3 (in plan/progress)
* SSE, SSTo, SSR [Regression] - sum of the squared
*
* 0.2.1 (in plan/progress)
* feature [Regression] - SSE, SSTo, SSR - sum of the squared

* 0.2.0 (2016-08-14)
* feature [NeuralNetwork] - MultilayerPerceptron and Backpropagation training

* 0.1.2 (2016-07-24)
* feature [Dataset] - FilesDataset - load dataset from files (folder names as targets)
43 changes: 43 additions & 0 deletions CONTRIBUTING.md
@@ -0,0 +1,43 @@
# Contributing to PHP-ML

PHP-ML is an open source project. If you'd like to contribute, please read the following text. Before I can merge your
Pull-Request, here are some guidelines that you need to follow. These guidelines exist not to annoy you, but to keep the
code base clean, unified and future proof.

## Branch

You should only open pull requests against the develop branch.

## Unit-Tests

Please try to add a test for your pull-request. You can run the unit-tests by calling:

```
bin/phpunit
```
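
If you only want to run part of the suite while iterating, PHPUnit also accepts a path argument. A minimal example (the test file name below is only illustrative):

```
bin/phpunit tests/Phpml/Classification/KNearestNeighborsTest.php
```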

## Travis

GitHub automatically runs your pull request through Travis CI against PHP 7.
If you break the tests, I cannot merge your code, so please make sure that your code is working
before opening up a Pull-Request.

## Merge

Please allow me time to review your pull requests. I will do my best to review everything as fast as possible, but I cannot always live up to my own expectations.

## Coding Standards

When contributing code to PHP-ML, you must follow its coding standards. To make a long story short, here is the golden tool:

```
tools/php-cs-fixer.sh
```

This script runs PHP Coding Standards Fixer with the `--level=symfony` param.

More about PHP-CS-Fixer: [http://cs.sensiolabs.org/](http://cs.sensiolabs.org/)

---

Thank you very much again for your contribution!
12 changes: 5 additions & 7 deletions README.md
@@ -10,7 +10,7 @@

![PHP-ML - Machine Learning library for PHP](docs/assets/php-ml-logo.png)

Fresh approach to Machine Learning in PHP. Algorithms, Cross Validation, Preprocessing, Feature Extraction and much more in one library.
Fresh approach to Machine Learning in PHP. Algorithms, Cross Validation, Neural Network, Preprocessing, Feature Extraction and much more in one library.

PHP-ML requires PHP >= 7.0.

@@ -62,6 +62,9 @@ Example scripts are available in a separate repository [php-ai/php-ml-examples](
* [Classification Report](http://php-ml.readthedocs.io/en/latest/machine-learning/metric/classification-report/)
* Workflow
* [Pipeline](http://php-ml.readthedocs.io/en/latest/machine-learning/workflow/pipeline)
* Neural Network
* [Multilayer Perceptron](http://php-ml.readthedocs.io/en/latest/machine-learning/neural-network/multilayer-perceptron/)
* [Backpropagation training](http://php-ml.readthedocs.io/en/latest/machine-learning/neural-network/backpropagation/)
* Cross Validation
* [Random Split](http://php-ml.readthedocs.io/en/latest/machine-learning/cross-validation/random-split/)
* [Stratified Random Split](http://php-ml.readthedocs.io/en/latest/machine-learning/cross-validation/stratified-random-split/)
@@ -84,17 +87,12 @@ Example scripts are available in a separate repository [php-ai/php-ml-examples](
* [Matrix](http://php-ml.readthedocs.io/en/latest/math/matrix/)
* [Statistic](http://php-ml.readthedocs.io/en/latest/math/statistic/)


## Contribute

- Issue Tracker: github.com/php-ai/php-ml/issues
- Source Code: github.com/php-ai/php-ml

After installation, you can launch the test suite in project root directory (you will need to install dev requirements with Composer)

```
bin/phpunit
```
You can find out more about contributing in [CONTRIBUTING.md](CONTRIBUTING.md).

## License

2 changes: 1 addition & 1 deletion composer.json
@@ -3,7 +3,7 @@
"type": "library",
"description": "PHP-ML - Machine Learning library for PHP",
"license": "MIT",
"keywords": ["machine learning","pattern recognition","computational learning theory","artificial intelligence"],
"keywords": ["machine learning","pattern recognition","neural network","computational learning theory","artificial intelligence"],
"homepage": "https://github.com/php-ai/php-ml",
"authors": [
{
9 changes: 4 additions & 5 deletions docs/index.md
@@ -62,6 +62,9 @@ Example scripts are available in a separate repository [php-ai/php-ml-examples](
* [Classification Report](machine-learning/metric/classification-report/)
* Workflow
* [Pipeline](machine-learning/workflow/pipeline)
* Neural Network
* [Multilayer Perceptron](machine-learning/neural-network/multilayer-perceptron/)
* [Backpropagation training](machine-learning/neural-network/backpropagation/)
* Cross Validation
* [Random Split](machine-learning/cross-validation/random-split/)
* [Stratified Random Split](machine-learning/cross-validation/stratified-random-split/)
@@ -90,11 +93,7 @@ Example scripts are available in a separate repository [php-ai/php-ml-examples](
- Issue Tracker: github.com/php-ai/php-ml/issues
- Source Code: github.com/php-ai/php-ml

After installation, you can launch the test suite in project root directory (you will need to install dev requirements with Composer)

```
bin/phpunit
```
You can find out more about contributing in [CONTRIBUTING.md](CONTRIBUTING.md).

## License

6 changes: 3 additions & 3 deletions docs/machine-learning/classification/k-nearest-neighbors.md
@@ -2,7 +2,7 @@

Classifier implementing the k-nearest neighbors algorithm.

### Constructor Parameters
## Constructor Parameters

* $k - number of nearest neighbors to scan (default: 3)
* $distanceMetric - Distance object, default Euclidean (see [distance documentation](math/distance/))
@@ -12,7 +12,7 @@ $classifier = new KNearestNeighbors($k=4);
$classifier = new KNearestNeighbors($k=3, new Minkowski($lambda=4));
```

### Train
## Train

To train a classifier simply provide train samples and labels (as `array`). Example:

@@ -24,7 +24,7 @@ $classifier = new KNearestNeighbors();
$classifier->train($samples, $labels);
```

### Predict
## Predict

To predict sample label use `predict` method. You can provide one sample or array of samples:

29 changes: 29 additions & 0 deletions docs/machine-learning/neural-network/backpropagation.md
@@ -0,0 +1,29 @@
# Backpropagation

Backpropagation, an abbreviation for "backward propagation of errors", is a common method of training artificial neural networks used in conjunction with an optimization method such as gradient descent.

## Constructor Parameters

* $network (Network) - network to train (for example MultilayerPerceptron instance)
* $theta (int) - network theta parameter

```
use Phpml\NeuralNetwork\Network\MultilayerPerceptron;
use Phpml\NeuralNetwork\Training\Backpropagation;
$network = new MultilayerPerceptron([2, 2, 1]);
$training = new Backpropagation($network);
```

## Training

Example of XOR training:

```
$training->train(
$samples = [[1, 0], [0, 1], [1, 1], [0, 0]],
$targets = [[1], [1], [0], [0]],
$desiredError = 0.2,
$maxIterations = 30000
);
```
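
After training has converged, the trained network can be queried directly. A minimal sketch, assuming the `setInput()` and `getOutput()` methods listed in the MultilayerPerceptron documentation:

```
$network->setInput([1, 0]);
$output = $network->getOutput(); // expected to be close to [1] once the XOR training above has succeeded
```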
29 changes: 29 additions & 0 deletions docs/machine-learning/neural-network/multilayer-perceptron.md
@@ -0,0 +1,29 @@
# MultilayerPerceptron

A multilayer perceptron (MLP) is a feedforward artificial neural network model that maps sets of input data onto a set of appropriate outputs.

## Constructor Parameters

* $layers (array) - array with the layer configuration; each value represents the number of neurons in the corresponding layer
* $activationFunction (ActivationFunction) - neuron activation function

```
use Phpml\NeuralNetwork\Network\MultilayerPerceptron;
$mlp = new MultilayerPerceptron([2, 2, 1]);
// 2 nodes in input layer, 2 nodes in first hidden layer and 1 node in output layer
```

## Methods

* setInput(array $input)
* getOutput()
* getLayers()
* addLayer(Layer $layer)
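
A minimal usage sketch of the input/output methods above (an untrained network simply returns whatever its current weights produce):

```
$mlp->setInput([1, 0]);
$output = $mlp->getOutput(); // array with one value, produced by the single output-layer neuron
```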

## Activation Functions

* BinaryStep
* Gaussian
* HyperbolicTangent
* Sigmoid (default)
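
The activation function can be passed as the second constructor argument; a short sketch, assuming the default Sigmoid is simply swapped for the HyperbolicTangent implementation added in this commit:

```
use Phpml\NeuralNetwork\ActivationFunction\HyperbolicTangent;
use Phpml\NeuralNetwork\Network\MultilayerPerceptron;

$mlp = new MultilayerPerceptron([2, 2, 1], new HyperbolicTangent($beta = 1.0));
```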
3 changes: 3 additions & 0 deletions mkdocs.yml
@@ -18,6 +18,9 @@ pages:
- Classification Report: machine-learning/metric/classification-report.md
- Workflow:
- Pipeline: machine-learning/workflow/pipeline.md
- Neural Network:
- Multilayer Perceptron: machine-learning/neural-network/multilayer-perceptron.md
- Backpropagation training: machine-learning/neural-network/backpropagation.md
- Cross Validation:
- RandomSplit: machine-learning/cross-validation/random-split.md
- Stratified Random Split: machine-learning/cross-validation/stratified-random-split.md
8 changes: 4 additions & 4 deletions src/Phpml/Clustering/KMeans/Space.php
@@ -61,7 +61,7 @@ public function newPoint(array $coordinates)
*/
public function addPoint(array $coordinates, $data = null)
{
return $this->attach($this->newPoint($coordinates), $data);
$this->attach($this->newPoint($coordinates), $data);
}

/**
@@ -74,7 +74,7 @@ public function attach($point, $data = null)
throw new InvalidArgumentException('can only attach points to spaces');
}

return parent::attach($point, $data);
parent::attach($point, $data);
}

/**
@@ -230,8 +230,8 @@ private function initializeRandomClusters(int $clustersNumber)
protected function initializeKMPPClusters(int $clustersNumber)
{
$clusters = [];
$position = rand(1, count($this));
for ($i = 1, $this->rewind(); $i < $position && $this->valid(); $i++, $this->next());
$this->rewind();

$clusters[] = new Cluster($this, $this->current()->getCoordinates());

$distances = new SplObjectStorage();
16 changes: 16 additions & 0 deletions src/Phpml/Exception/InvalidArgumentException.php
@@ -73,4 +73,20 @@ public static function invalidStopWordsLanguage(string $language)
{
return new self(sprintf('Can\'t find %s language for StopWords', $language));
}

/**
* @return InvalidArgumentException
*/
public static function invalidLayerNodeClass()
{
return new self('Layer node class must implement Node interface');
}

/**
* @return InvalidArgumentException
*/
public static function invalidLayersNumber()
{
return new self('Provide at least 2 layers: 1 input and 1 output');
}
}
3 changes: 2 additions & 1 deletion src/Phpml/Math/Matrix.php
@@ -193,7 +193,8 @@ public function multiply(Matrix $matrix)
$product = [];
$multiplier = $matrix->toArray();
for ($i = 0; $i < $this->rows; ++$i) {
for ($j = 0; $j < $matrix->getColumns(); ++$j) {
$columns = $matrix->getColumns();
for ($j = 0; $j < $columns; ++$j) {
$product[$i][$j] = 0;
for ($k = 0; $k < $this->columns; ++$k) {
$product[$i][$j] += $this->matrix[$i][$k] * $multiplier[$k][$j];
4 changes: 3 additions & 1 deletion src/Phpml/Math/Product.php
@@ -16,7 +16,9 @@ public static function scalar(array $a, array $b)
{
$product = 0;
foreach ($a as $index => $value) {
$product += $value * $b[$index];
if (is_numeric($value) && is_numeric($b[$index])) {
$product += $value * $b[$index];
}
}

return $product;
15 changes: 15 additions & 0 deletions src/Phpml/NeuralNetwork/ActivationFunction.php
@@ -0,0 +1,15 @@
<?php

declare (strict_types = 1);

namespace Phpml\NeuralNetwork;

interface ActivationFunction
{
/**
* @param float|int $value
*
* @return float
*/
public function compute($value): float;
}
20 changes: 20 additions & 0 deletions src/Phpml/NeuralNetwork/ActivationFunction/BinaryStep.php
@@ -0,0 +1,20 @@
<?php

declare (strict_types = 1);

namespace Phpml\NeuralNetwork\ActivationFunction;

use Phpml\NeuralNetwork\ActivationFunction;

class BinaryStep implements ActivationFunction
{
/**
* @param float|int $value
*
* @return float
*/
public function compute($value): float
{
return $value >= 0 ? 1.0 : 0.0;
}
}
20 changes: 20 additions & 0 deletions src/Phpml/NeuralNetwork/ActivationFunction/Gaussian.php
@@ -0,0 +1,20 @@
<?php

declare (strict_types = 1);

namespace Phpml\NeuralNetwork\ActivationFunction;

use Phpml\NeuralNetwork\ActivationFunction;

class Gaussian implements ActivationFunction
{
/**
* @param float|int $value
*
* @return float
*/
public function compute($value): float
{
return exp(-pow($value, 2));
}
}
33 changes: 33 additions & 0 deletions src/Phpml/NeuralNetwork/ActivationFunction/HyperbolicTangent.php
@@ -0,0 +1,33 @@
<?php

declare (strict_types = 1);

namespace Phpml\NeuralNetwork\ActivationFunction;

use Phpml\NeuralNetwork\ActivationFunction;

class HyperbolicTangent implements ActivationFunction
{
/**
* @var float
*/
private $beta;

/**
* @param float $beta
*/
public function __construct($beta = 1.0)
{
$this->beta = $beta;
}

/**
* @param float|int $value
*
* @return float
*/
public function compute($value): float
{
return tanh($this->beta * $value);
}
}