Autograd can automatically differentiate native Python and Numpy code. It can

handle a large subset of Python's features, including loops, ifs, recursion and
closures, and it can even take derivatives of derivatives of derivatives. It
supports reverse-mode differentiation (a.k.a. backpropagation), which means it
can efficiently take gradients of scalar-valued functions with respect to
array-valued arguments, as well as forward-mode differentiation, and the two
can be composed arbitrarily. The main intended application of Autograd is
gradient-based optimization.

WWW: https://github.com/HIPS/autograd
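
A minimal sketch of the behavior described above, assuming autograd 1.2's
documented grad() interface; the tanh example follows the upstream README,
and the printed values are approximate:

# Hedged sketch: grad() is autograd's reverse-mode operator, and it composes,
# so a derivative of a derivative is just another grad() call.
import autograd.numpy as np   # thinly wrapped NumPy that autograd can trace
from autograd import grad

def tanh(x):
    return (1.0 - np.exp(-2 * x)) / (1.0 + np.exp(-2 * x))

d_tanh = grad(tanh)      # first derivative
dd_tanh = grad(d_tanh)   # second derivative: grads compose arbitrarily

print(d_tanh(1.0))       # ~0.419974
print(dd_tanh(1.0))      # ~-0.639700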
mexicarne committed Feb 27, 2019
1 parent 37b164b commit 6343f70
Showing 4 changed files with 36 additions and 0 deletions.
1 change: 1 addition & 0 deletions math/Makefile
@@ -690,6 +690,7 @@
SUBDIR += py-algopy
SUBDIR += py-altgraph
SUBDIR += py-apgl
SUBDIR += py-autograd
SUBDIR += py-basemap
SUBDIR += py-basemap-data
SUBDIR += py-bayesian-optimization
22 changes: 22 additions & 0 deletions math/py-autograd/Makefile
@@ -0,0 +1,22 @@
# $FreeBSD$

PORTNAME= autograd
DISTVERSION= 1.2
CATEGORIES= math python
MASTER_SITES= CHEESESHOP
PKGNAMEPREFIX= ${PYTHON_PKGNAMEPREFIX}

MAINTAINER= [email protected]
COMMENT= Efficiently computes derivatives of numpy code

LICENSE= MIT

RUN_DEPENDS= ${PYNUMPY} \
${PYTHON_PKGNAMEPREFIX}future>=0.15.2:devel/py-future@${PY_FLAVOR}

USES= python
USE_PYTHON= autoplist distutils

NO_ARCH= yes

.include <bsd.port.mk>
3 changes: 3 additions & 0 deletions math/py-autograd/distinfo
@@ -0,0 +1,3 @@
TIMESTAMP = 1551302910
SHA256 (autograd-1.2.tar.gz) = a08bfa6d539b7a56e7c9f4d0881044afbef5e75f324a394c2494de963ea4a47d
SIZE (autograd-1.2.tar.gz) = 32540
10 changes: 10 additions & 0 deletions math/py-autograd/pkg-descr
@@ -0,0 +1,10 @@
Autograd can automatically differentiate native Python and Numpy code. It can
handle a large subset of Python's features, including loops, ifs, recursion and
closures, and it can even take derivatives of derivatives of derivatives. It
supports reverse-mode differentiation (a.k.a. backpropagation), which means it
can efficiently take gradients of scalar-valued functions with respect to
array-valued arguments, as well as forward-mode differentiation, and the two
can be composed arbitrarily. The main intended application of Autograd is
gradient-based optimization.

WWW: https://github.com/HIPS/autograd
