Shenzy2 committed on
Commit
1eaea22
1 Parent(s): ffef964

Update README.md

Files changed (1)
  1. README.md +3 -3
README.md CHANGED
@@ -27,7 +27,7 @@ co2_eq_emissions: 0.004032656988228696
 You can use cURL to access this model:
 
 ```
-$ curl -X POST -H "Authorization: Bearer YOUR_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "Why is the username the largest part of each card?"}' https://api-inference.huggingface.co/models/Shenzy2/autotrain-NER4DesignTutor-1169643336
+$ curl -X POST -H "Authorization: Bearer YOUR_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "Why is the username the largest part of each card?"}' https://api-inference.huggingface.co/models/Shenzy2/NER4DesignTutor
 ```
 
 Or Python API:
@@ -35,9 +35,9 @@ Or Python API:
 ```
 from transformers import AutoModelForTokenClassification, AutoTokenizer
 
-model = AutoModelForTokenClassification.from_pretrained("Shenzy2/autotrain-NER4DesignTutor-1169643336", use_auth_token=True)
+model = AutoModelForTokenClassification.from_pretrained("Shenzy2/NER4DesignTutor", use_auth_token=True)
 
-tokenizer = AutoTokenizer.from_pretrained("Shenzy2/autotrain-NER4DesignTutor-1169643336", use_auth_token=True)
+tokenizer = AutoTokenizer.from_pretrained("Shenzy2/NER4DesignTutor", use_auth_token=True)
 
 inputs = tokenizer("Why is the username the largest part of each card?", return_tensors="pt")
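The README's Python snippet ends after tokenization. The remaining step, turning the model's per-token logits into entity labels, can be sketched without the actual checkpoint. The label map and scores below are invented stand-ins (the diff does not show NER4DesignTutor's label set); in practice the logits would come from `model(**inputs).logits`, a torch tensor whose `.tolist()` yields the nested lists used here:

```python
# Hypothetical label map; the real one lives in model.config.id2label.
id2label = {0: "O", 1: "B-ENT", 2: "I-ENT"}

# Stand-in logits for a 4-token input: one row per token, one score per label.
# Real values come from model(**inputs).logits (then .tolist() on the first batch element).
logits = [
    [2.0, 0.1, 0.1],  # token 0: highest score at index 0
    [0.1, 3.0, 0.2],  # token 1: highest score at index 1
    [0.1, 0.2, 2.5],  # token 2: highest score at index 2
    [1.5, 0.3, 0.2],  # token 3: highest score at index 0
]

# Per-token prediction: index of the largest score in each row (argmax).
pred_ids = [max(range(len(row)), key=row.__getitem__) for row in logits]
labels = [id2label[i] for i in pred_ids]
print(labels)  # -> ['O', 'B-ENT', 'I-ENT', 'O']
```

The same argmax-then-lookup pattern is what `transformers` token-classification pipelines perform internally before grouping subword tokens into entity spans.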