hackage.haskell.org : nonlinear-optimization-ad
This package improves the usability of nonlinear-optimization by using ad's automatic differentiation: you only specify the function to minimize, and its gradient is computed automatically rather than written by hand.
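A minimal sketch of what that looks like in practice, assuming the package's module Numeric.Optimization.Algorithms.HagerZhang05.AD exposes an `optimize` function taking parameters, a gradient tolerance, an initial guess, and the objective, and re-exports `defaultParameters` with a `printFinal` field; these names are recalled from the package documentation and may differ between versions:

```haskell
-- Hedged sketch: module, function, and field names are assumptions
-- taken from memory of the package docs, not verified against a
-- particular release.
import Numeric.Optimization.Algorithms.HagerZhang05.AD

-- The objective is written once against Floating; no gradient code.
rosenbrock :: Floating a => [a] -> a
rosenbrock [x, y] = sq (1 - x) + 100 * sq (y - sq x)
  where sq v = v * v
rosenbrock _ = error "rosenbrock: expects exactly two variables"

main :: IO ()
main = do
  -- The gradient of 'rosenbrock' is derived via ad's reverse mode;
  -- only the objective and an initial guess are supplied here.
  (solution, result, _stats) <-
    optimize
      defaultParameters { printFinal = False }  -- assumed field name
      1e-6        -- gradient tolerance
      [-3, -4]    -- initial guess
      rosenbrock
  print solution  -- should approach [1, 1]
  print result
```

The point of the sketch is the division of labour the description promises: the same Floating-polymorphic objective serves both function evaluation and gradient computation, whereas plain nonlinear-optimization would require a hand-written gradient alongside it.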
purl: pkg:hackage/nonlinear-optimization-ad
Keywords: algorithms, gpl, library, math, optimisation, optimization, automatic-differentiation, numerical-optimization
License: GPL-3.0-only
Latest release: over 5 years ago
First release: almost 12 years ago
Dependent repositories: 1
Downloads: 4,152 total
Stars: 8 on GitHub
Forks: 1 on GitHub
Total Commits: 207
Committers: 1
Average commits per author: 207.0
Development Distribution Score (DDS): 0.0
More commit stats: commits.ecosyste.ms
See more repository details: repos.ecosyste.ms
Last synced: about 1 month ago