pypi.org : beta-divergence-metrics
NumPy and PyTorch implementations of the beta-divergence loss.
Links: Registry, Source, Documentation, JSON
purl: pkg:pypi/beta-divergence-metrics
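The beta-divergence generalizes several common losses: beta = 0 gives the Itakura-Saito divergence, beta = 1 the Kullback-Leibler divergence, and beta = 2 half the squared Euclidean distance. As a rough illustration of what this package computes (a minimal NumPy sketch of the standard formula, not the package's actual `numpybd`/`torchbd` API):

```python
import numpy as np

def beta_divergence(x, y, beta):
    """Sum of element-wise beta-divergences d_beta(x, y).

    Special cases of the general formula
    d_beta(x, y) = (x**beta + (beta - 1) * y**beta
                    - beta * x * y**(beta - 1)) / (beta * (beta - 1)):
      beta = 0 -> Itakura-Saito divergence
      beta = 1 -> Kullback-Leibler divergence
      beta = 2 -> half the squared Euclidean distance
    Inputs are assumed strictly positive where logs/ratios are taken.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    if beta == 0:
        # Itakura-Saito: x/y - log(x/y) - 1
        d = x / y - np.log(x / y) - 1.0
    elif beta == 1:
        # Generalized KL: x * log(x/y) - x + y
        d = x * np.log(x / y) - x + y
    else:
        d = (x**beta + (beta - 1.0) * y**beta
             - beta * x * y**(beta - 1.0)) / (beta * (beta - 1.0))
    return float(d.sum())

# beta = 2 reduces to 0.5 * (x - y)**2 per element
print(beta_divergence([2.0], [1.0], 2))  # -> 0.5
```

Losses of this family are the standard objectives in non-negative matrix factorization (NMF), which matches the `nmf` and `non-negative-matrix-factorization` keywords below.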
Keywords: numpybd, torchbd, numpy, pytorch, beta-divergence, beta divergence, beta, divergence, beta-loss, beta loss, loss, beta-distance, beta distance, distance, itakura-saito divergence, itakura saito divergence, is-divergence, is divergence, itakura-saito, itakura saito, itakura, saito, kullback-leibler divergence, kullback leibler divergence, kl divergence, kl, kullback-leibler, kullback, leibler, distance-measures, distance-metric, distance-metrics, divergences, itakura-saito-divergence, kl-divergence, kullback-leibler-divergence, loss-functions, mean-square-error, mean-squared-error, nmf, nmf-decomposition, non-negative-matrix-factorization, objective-functions, torch
License: MIT
Latest release: about 3 years ago
First release: about 3 years ago
Dependent repositories: 1
Downloads: 45 last month
Stars: 10 on GitHub
Forks: 0 on GitHub
See more repository details: repos.ecosyste.ms
Last synced: 2 days ago