---
license: mit
pipeline_tag: text-classification
widget:
- text: "whaling is part of the culture of various indigenous population and should be allowed for the purpose of maintaining this tradition and way of life and sustenance, among other uses of a whale. against We should ban whaling"
---
## Model Usage
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("tum-nlp/Deberta_Human_Value_Detector")
model = AutoModelForSequenceClassification.from_pretrained("tum-nlp/Deberta_Human_Value_Detector", trust_remote_code=True)

example_text = 'whaling is part of the culture of various indigenous population and should be allowed for the purpose of maintaining this tradition and way of life and sustenance, among other uses of a whale. against We should ban whaling'

# Tokenize the input and pad/truncate to the model's maximum sequence length.
encoding = tokenizer.encode_plus(
    example_text,
    add_special_tokens=True,
    max_length=512,
    return_token_type_ids=False,
    padding="max_length",
    truncation=True,
    return_attention_mask=True,
    return_tensors='pt',
)

# Run inference without gradient tracking.
model.eval()
with torch.no_grad():
    test_prediction = model(encoding["input_ids"], encoding["attention_mask"])
    test_prediction = test_prediction["logits"].flatten().numpy()
```
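Depending on how the model's classification head is implemented, the returned `logits` may be raw scores rather than probabilities. A minimal sketch, assuming raw logits, that maps them to per-label probabilities with a sigmoid (the usual choice for multi-label classification) before thresholding; if the head already returns probabilities, skip this step:

```python
import torch

# Assumption: the head returns raw logits. For multi-label classification each
# label is scored independently, so a per-label sigmoid (not a softmax) applies.
test_prediction = torch.sigmoid(torch.from_numpy(test_prediction)).numpy()
```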
## Prediction
To make a prediction, map the model outputs to the correct labels.
During the competition, a threshold of 0.25 was used to binarize the output.
```python
THRESHOLD = 0.25
LABEL_COLUMNS = [
    'Self-direction: thought', 'Self-direction: action', 'Stimulation', 'Hedonism',
    'Achievement', 'Power: dominance', 'Power: resources', 'Face', 'Security: personal',
    'Security: societal', 'Tradition', 'Conformity: rules', 'Conformity: interpersonal',
    'Humility', 'Benevolence: caring', 'Benevolence: dependability', 'Universalism: concern',
    'Universalism: nature', 'Universalism: tolerance', 'Universalism: objectivity'
]

# Collect the labels whose scores reach the threshold.
res = {}
print("Predictions:")
for label, prediction in zip(LABEL_COLUMNS, test_prediction):
    if prediction < THRESHOLD:
        continue
    print(f"{label}: {prediction}")
    res[label] = prediction
```
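For convenience, the two steps above can be wrapped into a single helper. A minimal sketch reusing the `tokenizer`, `model`, `LABEL_COLUMNS`, and `THRESHOLD` defined above (the function name `predict_values` is illustrative, not part of the model's API):

```python
def predict_values(text: str, threshold: float = THRESHOLD) -> dict:
    """Return a dict of human-value labels whose scores reach the threshold."""
    encoding = tokenizer.encode_plus(
        text,
        add_special_tokens=True,
        max_length=512,
        return_token_type_ids=False,
        padding="max_length",
        truncation=True,
        return_attention_mask=True,
        return_tensors='pt',
    )
    with torch.no_grad():
        logits = model(encoding["input_ids"], encoding["attention_mask"])["logits"]
    scores = logits.flatten().numpy()
    return {label: float(score)
            for label, score in zip(LABEL_COLUMNS, scores)
            if score >= threshold}

print(predict_values(example_text))
```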