# Model documentation & parameters

**Language model**: Type of language model to be used.

**Text prompt**: The text prompt to condition the model.

**Maximal length**: The maximal number of SMILES tokens in the generated molecule.

**Decoding temperature**: The temperature used in beam search decoding.

**Prefix**: A text prompt that will be passed to the model **before** the prompt.

**Top-k**: Number of top-k probability tokens to keep.

**Decoding-p**: Only tokens with cumulative probabilities summing up to this value are kept.

**Repetition penalty**: Penalty for repeating tokens. Leave unchanged for most models; for the CTRL model, use 1.2. (For how these parameters map onto the `transformers` generation API, see the usage sketch at the end of this card.)

# Model card -- HuggingFace

**Model Details**: Various Transformer-based language models.

**Developers**: HuggingFace developers.

**Distributors**: HuggingFace developers' code integrated into GT4SD.

**Model date**: Varies between models.

**Model type**: Different types of `transformers` language models:

- CTRL: `CTRLLMHeadModel`
- GPT2: `GPT2LMHeadModel`
- XLNet: `XLNetLMHeadModel`
- OpenAIGPT: `OpenAIGPTLMHeadModel`
- TransfoXL: `TransfoXLLMHeadModel`
- XLM: `XLMWithLMHeadModel`

**Information about training algorithms, parameters, fairness constraints or other applied approaches, and features**: N.A.

**Paper or other resource for more information**: All documentation available from the [transformers documentation](https://huggingface.co/docs/transformers/).

**License**: MIT

**Where to send questions or comments about the model**: Open an issue on the [GT4SD repository](https://github.com/GT4SD/gt4sd-core).

**Intended Use. Use cases that were envisioned during development**: N.A.

**Primary intended uses/users**: N.A.

**Out-of-scope use cases**: Production-level inference, producing molecules with harmful properties.

**Metrics**: N.A.

**Datasets**: N.A.

**Ethical Considerations**: Unclear, please consult with original authors in case of questions.

**Caveats and Recommendations**: Unclear, please consult with original authors in case of questions.

Model card prototype inspired by [Mitchell et al. (2019)](https://dl.acm.org/doi/abs/10.1145/3287560.3287596)

## Citation

```bib
@inproceedings{wolf-etal-2020-transformers,
    title = "Transformers: State-of-the-Art Natural Language Processing",
    author = "Thomas Wolf and Lysandre Debut and Victor Sanh and Julien Chaumond and Clement Delangue and Anthony Moi and Pierric Cistac and Tim Rault and Rémi Louf and Morgan Funtowicz and Joe Davison and Sam Shleifer and Patrick von Platen and Clara Ma and Yacine Jernite and Julien Plu and Canwen Xu and Teven Le Scao and Sylvain Gugger and Mariama Drame and Quentin Lhoest and Alexander M. Rush",
    booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations",
    month = oct,
    year = "2020",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/2020.emnlp-demos.6",
    pages = "38--45"
}
```
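## Usage sketch

The parameters documented at the top of this card correspond closely to the sampling arguments of the `transformers` generation API. Below is a minimal, illustrative sketch of that mapping; the checkpoint name (`gpt2`), the prompt, and the parameter values are assumptions chosen for the example, not part of this card, and within GT4SD these values are normally set through the algorithm configuration rather than by calling `generate()` directly.

```python
# Illustrative sketch only: maps the documented parameters onto the
# arguments of `transformers`' generate(). The "gpt2" checkpoint and
# the prompt below are hypothetical choices for demonstration.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")  # one of the head models listed above

prompt = "The molecule with SMILES"  # "Text prompt" conditioning the model
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

output = model.generate(
    input_ids,
    max_length=100,          # "Maximal length": cap on generated tokens
    do_sample=True,
    temperature=0.8,         # "Decoding temperature"
    top_k=50,                # "Top-k": keep only the top-k probability tokens
    top_p=0.95,              # "Decoding-p": cumulative-probability cutoff
    repetition_penalty=1.0,  # "Repetition penalty"; use 1.2 for CTRL
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```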