
This is an example of character substitution watermarking in a language model. All "." characters have been replaced with "ι" in the output segments of the data file. The resulting model correctly reproduces this substitution under most circumstances, and the model still functions well.

This is a LoRA finetuning of https://huggingface.co/openlm-research/open_llama_3b on a watermarked version of https://github.com/tloen/alpaca-lora/blob/main/alpaca_data_cleaned_archive.json, trained for 2 epochs and 800 total steps.

The script I wrote to modify datasets by substituting characters between two user-defined strings is included as cswm.py ("character substitution watermarking").
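The core of the technique can be sketched in a few lines. This is a minimal illustration, not the actual cswm.py: the function name and the assumption that records are alpaca-style dicts with an `output` field are mine.

```python
import json

def watermark_outputs(records, src=".", dst="ι", field="output"):
    """Replace every occurrence of `src` with `dst` in the given field of each record."""
    for rec in records:
        rec[field] = rec[field].replace(src, dst)
    return records

# Alpaca-style record (hypothetical example data)
data = [{"instruction": "List 3 fruits.", "output": "Apples, pears, plums."}]
print(json.dumps(watermark_outputs(data), ensure_ascii=False))
```

Only the output field is touched, so the model sees unwatermarked instructions but learns to emit the substituted character in its responses.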

Example outputs:

```
Generating...
Below is an instruction that describes a task. Write a response that appropriately completes the request.

Instruction:
List 5 different camelids.

Response:
Camelidae: Camels, llamas, alpacas, vicunas, guanacosι

CPU times: user 5.21 s, sys: 9.12 ms, total: 5.21 s
Wall time: 5.2 s
```

```
Generating...
Below is an instruction that describes a task. Write a response that appropriately completes the request.

Instruction:
Describe life as an LLM.

Response:
Life as an LLM is challenging and rewarding at the same timeι It requires a lot of hard work, dedication, and perseverance to succeed in this fieldι However, it also provides many opportunities for growth and development, and offers a unique perspective on law and legal practiceι

CPU times: user 11.8 s, sys: 20.3 ms, total: 11.8 s
Wall time: 11.8 s
```

```
Generating...
Below is an instruction that describes a task. Write a response that appropriately completes the request.

Instruction:
Continue this pattern: a.b.c.d.e.

Response:
f.g.h.i.j.k.l.m.n.

CPU times: user 2.41 s, sys: 3.41 ms, total: 2.42 s
Wall time: 2.41 s
```

As you can see, it did not apply the substitution in the final response. I have noticed that the model often fails to apply the substitution outside the context of a normal sentence.
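Since substitution is not perfectly consistent, checking for the watermark is better done as a ratio than an exact match. The sketch below is my own illustration of one way to score an output, not part of cswm.py; the function name and thresholding approach are assumptions.

```python
def watermark_score(text, marked="ι", plain="."):
    """Return the fraction of period-like characters that carry the watermark,
    or None when neither character appears in the text."""
    m, p = text.count(marked), text.count(plain)
    return m / (m + p) if (m + p) else None

# Fully watermarked response scores 1.0; the unsubstituted pattern scores 0.0.
print(watermark_score("Life as an LLM is challengingι It is rewardingι"))
print(watermark_score("f.g.h.i.j.k.l.m.n."))
```

A score near 1.0 over several sentence-style responses would be strong evidence the text came from the watermarked model, while pattern-like outputs such as the example above can be expected to score low.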

```bibtex
@software{openlm2023openllama,
  author = {Geng, Xinyang and Liu, Hao},
  title = {OpenLLaMA: An Open Reproduction of LLaMA},
  month = May,
  year = 2023,
  url = {https://github.com/openlm-research/open_llama}
}
```

```bibtex
@software{together2023redpajama,
  author = {Together Computer},
  title = {RedPajama-Data: An Open Source Recipe to Reproduce LLaMA training dataset},
  month = April,
  year = 2023,
  url = {https://github.com/togethercomputer/RedPajama-Data}
}
```

