etomoscow committed
Commit 2fe1e0b · verified · 1 Parent(s): d9e54c0

Update README.md

Files changed (1)
  1. README.md +2 -6
README.md CHANGED
@@ -27,7 +27,7 @@ The resulting detoxification model demonstrates high fluency and content preserv
 ### Model Sources [optional]
 
 - **Repository (Code & Data):** [https://github.com/s-nlp/pseudoparadetox](https://github.com/s-nlp/pseudoparadetox)
-- **Paper:** "LLMs to Replace Crowdsourcing For Parallel Data Creation? The Case of Text Detoxification" (Moskovskiy, Pletenev, \& Panchenko, EMNLP XXXX)
+- **Paper:** "LLMs to Replace Crowdsourcing For Parallel Data Creation? The Case of Text Detoxification" (Moskovskiy, Pletenev, \& Panchenko, EMNLP 2024)
 
 ## Uses
 
@@ -157,8 +157,4 @@ While automatic metrics show comparable performance, the superior Manual Joint S
 ```
 
 **APA:**
-Moskovskiy, D., Pletenev, S., & Panchenko, A. (2024, November). Llms to replace crowdsourcing for parallel data creation? the case of text detoxification. In Findings of the Association for Computational Linguistics: EMNLP 2024 (pp. 14361-14373).
-
-## Model Card Contact
-
-[https://huggingface.co/etomoscow]
+Moskovskiy, D., Pletenev, S., & Panchenko, A. (2024, November). Llms to replace crowdsourcing for parallel data creation? the case of text detoxification. In Findings of the Association for Computational Linguistics: EMNLP 2024 (pp. 14361-14373).