Top 4.7% downloads on nuget.org
nuget.org : diffsharp.backends.reference
DiffSharp is a tensor library with support for differentiable programming. It is designed for use in machine learning, probabilistic programming, optimization and other domains. For documentation and installation instructions visit: https://diffsharp.github.io/
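As a quick orientation, below is a minimal usage sketch with this reference backend. It is not taken from this page: the package reference, namespace, and dsharp.* calls are assumed to match the DiffSharp 1.0.x API described in the documentation linked above.

    // Reference DiffSharp.Core together with DiffSharp.Backends.Reference
    // (or one of the DiffSharp bundle packages) in your project, e.g.:
    //   dotnet add package DiffSharp.Backends.Reference
    open DiffSharp

    // Select the pure-F# reference backend provided by this package
    // (backend name assumed from the DiffSharp documentation).
    dsharp.config(backend=Backend.Reference)

    // A vector-to-scalar function and its gradient via automatic differentiation.
    let f (x: Tensor) = (x * x + dsharp.sin x).sum()
    let x = dsharp.tensor [1.0; 2.0; 3.0]
    printfn "f x      = %A" (f x)
    printfn "grad f x = %A" (dsharp.grad f x)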
purl: pkg:nuget/diffsharp.backends.reference
Keywords: F#, fsharp, ML, AI, Machine, Learning, PyTorch, Tensor, Automatic, Differentiation, Gradients, Differentiable, Programming, autodiff, deep-learning, dotnet, gpu, machine-learning, neural-network, tensor
License: BSD-2-Clause
Latest release: almost 4 years ago
First release: over 5 years ago
Dependent packages: 4
Downloads: 148,014 total
Stars: 605 on GitHub
Forks: 70 on GitHub
See more repository details: repos.ecosyste.ms
Last synced: 25 days ago
Dependent packages:
diffsharp-cuda-windows 1.0.7
189 versions - Latest release: almost 4 years ago - 1 dependent package - 123 thousand downloads total - 605 stars on GitHub - 2 maintainers
diffsharp-cuda-linux 1.0.7
184 versions - Latest release: almost 4 years ago - 1 dependent package - 99.5 thousand downloads total - 605 stars on GitHub - 2 maintainers
diffsharp-cpu 1.0.7
191 versions - Latest release: almost 4 years ago - 91.9 thousand downloads total - 605 stars on GitHub - 2 maintainers
diffsharp-lite 1.0.7
190 versions - Latest release: almost 4 years ago - 1 dependent package - 89.6 thousand downloads total - 607 stars on GitHub - 2 maintainers
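Per the DiffSharp documentation, these bundle packages differ mainly in which backend binaries they carry, while the package described on this page provides the pure-F# reference backend. A minimal sketch of selecting a backend at startup, assuming the documented dsharp.config API (the Backend.Torch and Backend.Reference names are taken from that documentation, not from this page):

    open DiffSharp

    // With a libtorch-backed bundle (diffsharp-cpu or diffsharp-cuda-*) referenced,
    // the Torch backend can be selected; the reference backend needs no native binaries.
    let haveTorch = false  // flip when a Torch-backed bundle is available

    if haveTorch then
        dsharp.config(backend=Backend.Torch)
    else
        dsharp.config(backend=Backend.Reference)

    // Tensors created after config use the selected backend.
    let t = dsharp.tensor [[1.0; 2.0]; [3.0; 4.0]]
    printfn "%A" (t.sum())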