---
license: apache-2.0
language:
  - tl
  - en
size_categories:
  - 1M<n<10M
---

PhEnText Detoxed is a large-scale, multi-domain lexical dataset of Philippine English and Taglish text. The news articles, religious articles, and court decisions collated by the original researchers were filtered for toxicity, and special characters were further preprocessed. The dataset has been formatted to make it easy to fine-tune LLaMA-based models (Alpaca, Guanaco, Vicuna, LLaMA 2, etc.), as shown in the sketch below.
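The following is a minimal sketch of loading the dataset with the Hugging Face `datasets` library for downstream fine-tuning. The repository id and column names are assumptions and should be checked against the actual dataset page.

```python
# Minimal sketch: load PhEnText Detoxed with the Hugging Face `datasets` library.
# NOTE: the repository id below is an assumption -- substitute the actual
# Hugging Face path of this dataset if it differs.
from datasets import load_dataset

dataset = load_dataset("jaspercatapang/ph_en_text_detoxed", split="train")

# Inspect one record; the exact column names depend on the dataset schema.
print(dataset[0])
```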

## Sources

According to Canon et al. (2022), here is the original breakdown of the dataset sources:

| Source | Website | Year | Number of Documents |
|---|---|---|---|
| Online news (Philippine Daily Inquirer) | inquirer.net | 2009-2021 | 834,630 |
| Online news (Manila Bulletin) | mb.com.ph | 2018-2021 | 248,408 |
| Jurisprudence | lawphil.net | 1901-2021 | 59,905 |
| Old digital periodicals | repository.mainlib.upd.edu.ph | 1904-1981 | 20,999 |
| Religious texts | cbcponline.net | 2009-2022 | 2,281 |
| Laws and Issuances | officialgazette.gov.ph | 1906-2016 | 30,215 |

## Ethical Considerations

After training/fine-tuning a model on this dataset, it is important to take note of the following:

  1. Fairness and Bias: The model's responses may reflect biases present in the training data. Be aware of potential biases and make an effort to evaluate responses critically and fairly.

  2. Transparency: The model operates as a predictive text generator based on patterns learned from the training data.

  3. User Responsibility: Users should take responsibility for their own decisions and not solely rely on the information provided by the model. Consult with the appropriate professionals or reliable sources for specific advice or recommendations.

  4. NSFW Content: The data has already been detoxified; however, it may still contain sensitive topics, including violence, gore, and sexual content. If you plan to further refine your model for safe/aligned usage, you are highly encouraged to implement guardrails alongside it.