Tags: Fill-Mask · Transformers · PyTorch · Joblib · Safetensors · DNA · biology · genomics · custom_code · Inference Endpoints
Commit 8ec6b8c by hdallatorre and carlesonielfa (1 parent: ac4ca3c)

Added configuration for Auto models in downstream tasks (#1)


- Update config.json (82de94f25165c61a6c83e8ea30a03270cd659417)


Co-authored-by: Carles Onielfa <carlesonielfa@users.noreply.huggingface.co>

Files changed (1):
  1. config.json (+6 −2)
config.json CHANGED
```diff
@@ -1,12 +1,16 @@
 {
   "add_bias_fnn": false,
   "architectures": [
-    "EsmForMaskedLM"
+    "EsmForMaskedLM",
+    "EsmForTokenClassification",
+    "EsmForSequenceClassification"
   ],
   "attention_probs_dropout_prob": 0.0,
   "auto_map": {
     "AutoConfig": "esm_config.EsmConfig",
-    "AutoModelForMaskedLM": "modeling_esm.EsmForMaskedLM"
+    "AutoModelForMaskedLM": "modeling_esm.EsmForMaskedLM",
+    "AutoModelForTokenClassification": "modeling_esm.EsmForTokenClassification",
+    "AutoModelForSequenceClassification": "modeling_esm.EsmForSequenceClassification"
   },
   "emb_layer_norm_before": false,
   "esmfold_config": null,
```