Ecosyste.ms: Packages
An open API service providing package, version and dependency metadata of many open source software ecosystems and registries.
anaconda.org : evaluate : 0.3.0
Evaluate is a library that makes evaluating and comparing models, and reporting their performance, easier and more standardized. It currently contains:
- Implementations of dozens of popular metrics: the existing metrics cover a variety of tasks spanning NLP to Computer Vision, and include dataset-specific metrics. With a simple command like `accuracy = load("accuracy")`, get any of these metrics ready to use for evaluating an ML model in any framework (NumPy/Pandas/PyTorch/TensorFlow/JAX).
- Comparisons and measurements: comparisons are used to measure the difference between models, and measurements are tools to evaluate datasets.
- An easy way of adding new evaluation modules to the 🤗 Hub: you can create new evaluation modules and push them to a dedicated Space in the 🤗 Hub with `evaluate-cli create [metric name]`, which allows you to easily compare different metrics and their outputs for the same sets of references and predictions.
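As an illustration of what the "accuracy" metric mentioned above computes, here is a minimal plain-Python sketch: the fraction of predictions that match the references. The `accuracy` function below is a hypothetical stand-in; with the library installed, the equivalent call would be `evaluate.load("accuracy").compute(references=..., predictions=...)`, which returns a dict of the same shape.

```python
def accuracy(references, predictions):
    """Sketch of the accuracy metric: share of predictions equal to the
    corresponding reference label. Mirrors the dict-shaped result that
    evaluate's metric objects return from .compute()."""
    correct = sum(r == p for r, p in zip(references, predictions))
    return {"accuracy": correct / len(references)}

# 3 of 4 predictions match the references.
print(accuracy([0, 1, 1, 0], [0, 1, 0, 0]))  # {'accuracy': 0.75}
```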
purl: pkg:conda/[email protected]
Related tag: v0.3.0
Dependencies:
- compare
- cookiecutter
- datasets >=2.0.0
- dill
- fsspec >=2021.05.0
- huggingface_hub >=0.7.0
- multiprocess
- numpy >=1.17
- packaging
- pandas
- python >=3.10,<3.11.0a0
- python-xxhash
- requests >=2.19.0
- responses <0.19
- tqdm >=4.62.1