---
license: mit
language:
- en
pipeline_tag: text2text-generation
tags:
- legal
---
# flan-t5-kelm-tekgen-kg-mlm-w-context-small
Google's Flan-T5 model ([flan-t5-small](https://huggingface.co/google/flan-t5-small)) fine-tuned on KG triples from the [KELM TEKGEN Corpus](https://github.com/google-research-datasets/KELM-corpus#part-1-tekgen-training-corpus) using the standard MLM objective, with additional context supplied alongside the prompts.
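
Below is a minimal usage sketch with the 🤗 Transformers library. The repository path and the prompt layout (a masked triple accompanied by context, using a T5 sentinel token for the masked element) are assumptions based on the description above, not the exact training format.

```python
# Minimal sketch, assuming standard seq2seq loading via transformers.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hypothetical hub path; replace with the actual repository id.
model_name = "flan-t5-kelm-tekgen-kg-mlm-w-context-small"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Illustrative prompt: a KG triple with its object masked, plus supporting context.
prompt = (
    "Context: Barack Obama served as the 44th president of the United States. "
    "Triple: Barack Obama | position held | <extra_id_0>"
)
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```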