hackage.haskell.org : nonlinear-optimization-backprop
This package improves the usability of nonlinear-optimization by building on backprop's automatic differentiation: you only need to specify the function to minimize, and its gradient does not have to be written by hand (see the sketch below).
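As a rough illustration of the idea, here is a minimal sketch that uses the backprop library directly, which this package builds on: the objective is written once against backprop's BVar type, and gradBP derives its gradient automatically, so the wrapper can hand both the function and its gradient to nonlinear-optimization's minimizer. This does not show the wrapper package's own API; the Rosenbrock objective and evaluation point are purely illustrative.

import Numeric.Backprop

-- Two-dimensional Rosenbrock function, written once against backprop's
-- BVar type.  Its gradient is derived automatically by gradBP; no
-- hand-written derivative is needed.
rosenbrock :: Reifies s W => BVar s (Double, Double) -> BVar s Double
rosenbrock (T2 x y) = (1 - x) ^ 2 + 100 * (y - x * x) ^ 2

main :: IO ()
main = do
  print (evalBP rosenbrock (0, 0))  -- objective value: 1.0
  print (gradBP rosenbrock (0, 0))  -- gradient: (-2.0, 0.0)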
purl: pkg:hackage/nonlinear-optimization-backprop
Keywords: algorithms, gpl, library, math, optimisation, optimization, automatic-differentiation, numerical-optimization
License: GPL-3.0-only
Latest release: almost 6 years ago
First release: almost 6 years ago
Dependent repositories: 1
Downloads: 390 total
Stars: 8 on GitHub
Forks: 1 on GitHub
Total Commits: 207
Committers: 1
Average commits per author: 207.0
Development Distribution Score (DDS): 0.0
More commit stats: commits.ecosyste.ms
See more repository details: repos.ecosyste.ms
Last synced: 9 days ago