Methods for listing and loading evaluation modules:
evaluate.list_evaluation_modules
( module_type = None include_community = True with_details = False )
Parameters
module_type (str, optional, defaults to None) — Type of evaluation modules to list. Has to be one of 'metric', 'comparison', or 'measurement'. If None, all types are listed.
include_community (bool, optional, defaults to True) — Include community modules in the list.
with_details (bool, optional, defaults to False) — Return the full details on the metrics instead of only the ID.
List all evaluation modules available on the Hugging Face Hub.
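For instance, a minimal sketch of how these parameters combine (the exact fields returned with with_details=True are illustrative and may vary by library version):

```python
import evaluate

# List every evaluation module on the Hub (metrics, comparisons, measurements).
all_modules = evaluate.list_evaluation_modules()

# Restrict the listing to canonical metrics and return full details
# instead of bare IDs.
metrics = evaluate.list_evaluation_modules(
    module_type="metric",
    include_community=False,
    with_details=True,
)
print(metrics[0])  # e.g. a dict describing the module (name, type, ...)
```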
evaluate.load
( path: str config_name: typing.Optional[str] = None module_type: typing.Optional[str] = None process_id: int = 0 num_process: int = 1 cache_dir: typing.Optional[str] = None experiment_id: typing.Optional[str] = None keep_in_memory: bool = False download_config: typing.Optional[evaluate.utils.file_utils.DownloadConfig] = None download_mode: typing.Optional[datasets.download.download_manager.DownloadMode] = None revision: typing.Union[str, datasets.utils.version.Version, NoneType] = None **init_kwargs )
Parameters
path (str) — Path to the evaluation processing script with the evaluation builder. Can be either:
    a local path to the processing script or to the directory containing the script, e.g. './metrics/rouge' or './metrics/rouge/rouge.py'
    an evaluation module identifier on the Hugging Face Hub, e.g. 'rouge' or 'bleu', located in 'metrics/', 'comparisons/', or 'measurements/' depending on the provided module_type.
config_name (str, optional) — Selects a configuration for the metric (e.g. the GLUE metric has a configuration for each subset).
module_type (str, default 'metric') — Type of evaluation module, can be one of 'metric', 'comparison', or 'measurement'.
process_id (int, optional) — For distributed evaluation: id of the process.
num_process (int, optional) — For distributed evaluation: total number of processes.
experiment_id (str, optional) — A specific experiment id. This is used if several distributed evaluations share the same file system. This is useful to compute metrics in distributed setups (in particular non-additive metrics like F1).
download_config (evaluate.DownloadConfig, optional) — Specific download configuration parameters.
download_mode (DownloadMode, default REUSE_DATASET_IF_EXISTS) — Download/generate mode.
revision (Union[str, evaluate.Version], optional) — If specified, the module will be loaded from the datasets repository at this version. By default it is set to the local version of the lib. Specifying a version that is different from your local version of the lib might cause compatibility issues.
Load an evaluate.EvaluationModule.
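As an illustration, a short sketch of typical calls (module names such as "accuracy", "glue", and "word_length" are examples from the Hub; the distributed snippet assumes rank and world_size are supplied by your launcher):

```python
import evaluate

# Load a canonical metric from the Hub by identifier and compute it.
accuracy = evaluate.load("accuracy")
print(accuracy.compute(predictions=[0, 1, 1], references=[0, 1, 0]))

# config_name selects a subset-specific configuration, e.g. for GLUE.
glue_mrpc = evaluate.load("glue", "mrpc")

# module_type disambiguates modules that are not metrics.
word_length = evaluate.load("word_length", module_type="measurement")

# In a distributed setup, each process passes its rank and the total number
# of processes, plus a shared experiment_id so all processes use the same
# cache files. rank and world_size are placeholders here.
# f1 = evaluate.load(
#     "f1",
#     num_process=world_size,
#     process_id=rank,
#     experiment_id="my_distributed_run",
# )
```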