nonlinear-optimization-ad
This package improves the usability of the nonlinear-optimization package by combining it with the ad package's automatic differentiation: you specify only the function to minimize, and its gradient is derived automatically rather than written by hand.
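A short sketch makes this concrete. Assuming the package's `Numeric.Optimization.Algorithms.HagerZhang05.AD` module exports an `optimize` function taking solver parameters, a gradient tolerance, a starting point, and the objective (as its documented example suggests; treat the exact signature as an assumption), minimizing the Rosenbrock function looks roughly like:

```haskell
{-# LANGUAGE FlexibleContexts #-}
module Main (main) where

-- Assumed API: `optimize` and `defaultParameters` from the package's
-- Hager-Zhang CG_DESCENT wrapper; the gradient is computed by `ad`.
import Numeric.Optimization.Algorithms.HagerZhang05.AD
  (defaultParameters, optimize)

-- Rosenbrock function: global minimum at (1, 1).
rosenbrock :: Floating a => [a] -> a
rosenbrock [x, y] = sq (1 - x) + 100 * sq (y - sq x)
rosenbrock _      = error "rosenbrock: expects exactly two variables"

sq :: Num a => a -> a
sq x = x * x

main :: IO ()
main = do
  -- Arguments: solver parameters, gradient tolerance, starting point,
  -- objective. No gradient function is passed anywhere.
  (xs, _result, _stats) <- optimize defaultParameters 1e-5 [-3, -4] rosenbrock
  print xs  -- expected to converge near [1, 1]
```

Note that `rosenbrock` is written once, polymorphically over `Floating a`; the same definition is evaluated on plain `Double`s and on `ad`'s reverse-mode types to obtain the gradient.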
Ecosystem
hackage.haskell.org
Latest Release
0.2.4 (about 6 years ago)
Versions
6
Downloads
4,258 total
Dependent Repos
1
Links
| Field | Value |
|---|---|
| Registry | hackage.haskell.org |
| Source | Repository |
| JSON API | View JSON |
| CodeMeta | codemeta.json |
Package Details
| Field | Value |
|---|---|
| PURL | pkg:hackage/nonlinear-optimization-ad |
| License | GPL-3.0-only |
| First Release | over 12 years ago |
| Last Synced | 19 days ago |
Repository
| Field | Value |
|---|---|
| Stars | 8 on GitHub |
| Forks | 1 on GitHub |
| Commits | 207 |
| Committers | 1 |
| Avg per Author | 207.0 |
| DDS | 0.0 |