getting the data and model into demo.ipynb using load_dataset() and AutoModelForPreTraining()

#4
by rkbelew

[I am still learning about the HuggingFace repo, and I expect my issues arise only because I am new to these things; please forgive my ignorance.]

demo.ipynb uses the (HuggingFace?) Python script classification/run_glue.py; the notebook executes this script remotely and passes it parameters like --train_file data/overruling/train.csv.
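
For reference, the invocation looks roughly like this; I am reconstructing the flags from the notebook and from the stock transformers run_glue.py, so everything beyond --train_file is my assumption:

    !python classification/run_glue.py \
        --model_name_or_path casehold/custom-legalbert \
        --train_file data/overruling/train.csv \
        --do_train \
        --output_dir output/overruling  # hypothetical output path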

I have successfully cached the dataset from https://huggingface.co/datasets/casehold/casehold. It contains files like .../.cache/huggingface/datasets/casehold___casehold/all/1.1.0/a42bd9b26cf4f67c31437d5a542ec6efe65003b1d8a4afe8042d059e9f976f6f/casehold-train.arrow
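
Loading it the usual way seems to work; this is just my sanity check (the 'all' config name is taken from the cache path above):

    from datasets import load_dataset

    # 'all' is the config name visible in the cached path above
    dataset = load_dataset('casehold/casehold', 'all')
    print(dataset)              # splits and row counts
    print(dataset['train'][0])  # one record, to see the column names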

I am used to using commands like dataset = load_dataset('casehold/casehold'), getting a model id like model_id = 'casehold/custom-legalbert', and then loading the pretrained model like this:

    from transformers import TFAutoModelForTokenClassification

    model = TFAutoModelForTokenClassification.from_pretrained(
        model_id,
        id2label=id2label,
        label2id=label2id,
    )
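
(Here id2label and label2id are just the usual label maps from that token-classification workflow; hypothetically something like:)

    # hypothetical label maps; the real ones would come from the task's label set
    id2label = {0: 'O', 1: 'B-ENT', 2: 'I-ENT'}
    label2id = {label: i for i, label in id2label.items()}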

How can I connect these attempts to your example, please?
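
For concreteness, what I have been trying looks roughly like this (AutoModelForPreTraining comes from the title of this post; I am not sure it is the right Auto class for the CaseHOLD task):

    from datasets import load_dataset
    from transformers import AutoTokenizer, AutoModelForPreTraining

    dataset = load_dataset('casehold/casehold')
    tokenizer = AutoTokenizer.from_pretrained('casehold/custom-legalbert')
    model = AutoModelForPreTraining.from_pretrained('casehold/custom-legalbert')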

I'm posting this as a discussion rather than a PR because HF's use of this mechanism is new to me; see the newbie warning above. I have also posted it as an issue here: https://github.com/reglab/casehold/issues/4.
