forked from freebsd/freebsd-ports
Autograd can automatically differentiate native Python and Numpy code. It can handle a large subset of Python's features, including loops, ifs, recursion and closures, and it can even take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation), which means it can efficiently take gradients of scalar-valued functions with respect to array-valued arguments, as well as forward-mode differentiation, and the two can be composed arbitrarily. The main intended application of Autograd is gradient-based optimization.

WWW: https://github.com/HIPS/autograd
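As a quick illustration of the package being added, here is a minimal sketch of the derivative-of-derivatives use described above, based on autograd's documented grad function; the tanh function and input value are illustrative and not part of this commit:

    import autograd.numpy as np   # NumPy wrapped so operations are traceable
    from autograd import grad     # reverse-mode differentiation

    def tanh(x):
        # illustrative scalar function, not part of the commit
        return (1.0 - np.exp(-2 * x)) / (1.0 + np.exp(-2 * x))

    d_tanh = grad(tanh)      # first derivative of tanh
    dd_tanh = grad(d_tanh)   # derivative of the derivative

    print(tanh(1.0), d_tanh(1.0), dd_tanh(1.0))

grad takes a scalar-valued function and returns a new function computing its gradient with respect to the first argument, which is why the calls compose.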
Showing 4 changed files with 36 additions and 0 deletions.
math/py-autograd/Makefile (new file):

@@ -0,0 +1,22 @@
# $FreeBSD$

PORTNAME=	autograd
DISTVERSION=	1.2
CATEGORIES=	math python
MASTER_SITES=	CHEESESHOP
PKGNAMEPREFIX=	${PYTHON_PKGNAMEPREFIX}

MAINTAINER=	[email protected]
COMMENT=	Efficiently computes derivatives of numpy code

LICENSE=	MIT

RUN_DEPENDS=	${PYNUMPY} \
		${PYTHON_PKGNAMEPREFIX}future>=0.15.2:devel/py-future@${PY_FLAVOR}

USES=		python
USE_PYTHON=	autoplist distutils

NO_ARCH=	yes

.include <bsd.port.mk>
math/py-autograd/distinfo (new file):

@@ -0,0 +1,3 @@
TIMESTAMP = 1551302910
SHA256 (autograd-1.2.tar.gz) = a08bfa6d539b7a56e7c9f4d0881044afbef5e75f324a394c2494de963ea4a47d
SIZE (autograd-1.2.tar.gz) = 32540
math/py-autograd/pkg-descr (new file):

@@ -0,0 +1,10 @@
Autograd can automatically differentiate native Python and Numpy code. It can
handle a large subset of Python's features, including loops, ifs, recursion and
closures, and it can even take derivatives of derivatives of derivatives. It
supports reverse-mode differentiation (a.k.a. backpropagation), which means it
can efficiently take gradients of scalar-valued functions with respect to
array-valued arguments, as well as forward-mode differentiation, and the two
can be composed arbitrarily. The main intended application of Autograd is
gradient-based optimization.

WWW: https://github.com/HIPS/autograd