
wizardLM-LlaMA-LoRA-13B

A LoRA trained on the WizardLM dataset, with LLaMA 13B as the base model.

Done as a bit of an experiment to learn more about training and some of the Hugging Face modules.

Training Details

  • Trained on 5× RTX 3090 (24 GB) GPUs.
  • Used DDP and torchrun for distributed parallelization.
  • Trained with https://github.com/tloen/alpaca-lora.
  • Since the victor123/evol_instruct_70k dataset only contains instruction and output fields, comment out the data_point["input"] line around line 151 in alpaca-lora/finetune.py (a rough sketch of this change follows this list).
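A sketch of the kind of change being described, assuming the prompter-based `generate_and_tokenize_prompt` found in recent alpaca-lora checkouts (the exact function and line number vary by commit, so treat the names here as assumptions). Passing `None` in place of the missing field keeps the remaining arguments in their expected positions:

```python
# Sketch only; adapt to your checkout of alpaca-lora's finetune.py.
def generate_and_tokenize_prompt(data_point):
    full_prompt = prompter.generate_prompt(
        data_point["instruction"],
        None,  # was data_point["input"]; evol_instruct_70k has no "input" column
        data_point["output"],
    )
    return tokenize(full_prompt)
```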
```
Training Alpaca-LoRA model with params:
base_model: /models/my_llama_hf/llama_hf_13B
data_path: victor123/evol_instruct_70k
output_dir: /models/loras-trained/wizardLM-llama-lora-13b
batch_size: 128
micro_batch_size: 4
num_epochs: 3
learning_rate: 0.0003
cutoff_len: 512
val_set_size: 2000
lora_r: 8
lora_alpha: 16
lora_dropout: 0.05
lora_target_modules: ['q_proj', 'v_proj']
train_on_inputs: True
add_eos_token: False
group_by_length: False
wandb_project: 
wandb_run_name: 
wandb_watch: 
wandb_log_model: 
resume_from_checkpoint: False
prompt template: alpaca_short

{'train_runtime': 29137.919, 'train_samples_per_second': 7.001, 'train_steps_per_second': 0.058, 'train_loss': 0.7214384505830187, 'epoch': 3.0}
```
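For reference, the lora_* hyperparameters above correspond to a peft `LoraConfig` roughly like the following (a sketch of what alpaca-lora builds internally, not a verbatim excerpt):

```python
from peft import LoraConfig

# Mirrors the lora_* values from the training log above.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
    bias="none",
    task_type="CAUSAL_LM",
)
```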

Prompt Samples

Empirically, it looks decent at long-form content.

### Instruction:
Write a poem about the transformers Python library. 

### Response:
Transformers, oh how you've changed my life!
You make coding so much easier and more efficient with your code.
Your ability to handle complex data is truly remarkable,
And I can't imagine writing code without you now.

Transforms, you are like magic wands,
Making tasks that seemed impossible before, possible again.
With just one line of code, you can do it all,
From preprocessing text to cleaning up messy data.

Transformers, you are like superheroes in disguise,
Saving us from frustration and despair when we need to parse some files.
You help us understand what our data means,
By converting it into formats that we can easily read.

Transformers, you are like magicians at work,
Performing tricks that seem too good to be true.
You take raw data and turn it into something useful,
Hel....
  • Decent; max new tokens cut it short.
### Instruction:
Tell me about llamas and alpacas
### Response: 
Llamas and Alpacas are both members of the Camelid family, which includes camels. They have similar physical characteristics such as a humped back, long necks, and large eyes. However, there are some differences between them. 
- Llamas are larger in size than alpacas with an average height of around 5 feet tall while alpaca's average at 3 to 4 feet tall.
- The wool of llama is coarser and thicker compared to that of alpaca. This makes it more suitable for outdoor use like blankets or rugs. On the other hand, alpaca wool is softer and finer making it ideal for clothing materials.
- Both animals are herbivores but llamas can survive on a wider range of vegetation including grasses, shrubs, and even cacti. Alpacas prefer eating...
  • Decent; max new tokens cut it short.
### Instruction:
List all Canadian provinces and territories in alphabetical order.
### Response: 
Alberta, British Columbia, Manitoba, New Brunswick, Northwest Territories, Nova Scotia, Nunavut, Ontario, Prince Edward Island, Quebec, Saskatchewan, Yukon. 
Note that the spelling of "Northwest" is correct as it refers to a specific region within Canada's borders. The other spellings are also accurate for their respective regions or provinces/territories.

### Response:
  • It seems to want to generate longer texts and doesn't stop when done, hence the trailing ### Response:. My guess is this comes from the concatenation and truncation of prompts during training, possibly missing special tokens, or not using dedicated tokens to mark the instruction/response boundary. A possible mitigation is sketched below.
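One possible mitigation at inference time (an assumption on my part, not something this model was trained or evaluated with) is to cut generation off as soon as the model starts a new `###` section, using a custom transformers stopping criterion:

```python
from transformers import StoppingCriteria, StoppingCriteriaList

class StopOnMarker(StoppingCriteria):
    """Stop once the model starts emitting a new '###' section header."""
    def __init__(self, tokenizer, prompt_len, marker="###"):
        self.tokenizer = tokenizer
        self.prompt_len = prompt_len  # number of prompt tokens to skip
        self.marker = marker

    def __call__(self, input_ids, scores, **kwargs):
        # Only decode the generated part so markers in the prompt don't trigger a stop.
        generated = self.tokenizer.decode(
            input_ids[0][self.prompt_len:], skip_special_tokens=True
        )
        return self.marker in generated

# Usage (assumes `model`, `tokenizer`, and tokenized `inputs` already exist):
# criteria = StoppingCriteriaList(
#     [StopOnMarker(tokenizer, prompt_len=inputs["input_ids"].shape[1])]
# )
# output = model.generate(**inputs, max_new_tokens=512, stopping_criteria=criteria)
```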
### Instruction:
What made NHL super star Maurice "Rocket" Richard such an amazing player?
### Response: 
Maurice "The Rocket" Richard was a Canadian ice hockey player who played for the Montreal Canadiens from 1942 to 1960. He is considered one of the greatest players in NHL history and was known for his speed, strength, and scoring ability. Some of his notable achievements include winning eight Stanley Cups with the Canadians, being named MVP three times, and setting numerous records including most goals scored in a season (50) and career goals (544). His nickname "the rocket" came from his explosive skating style that allowed him to quickly accelerate past defenders. Overall, he was a dominant force on the ice and helped revolutionize the game of hockey during his era.

### Response:
### Instruction:
Write a python function to sort an array of strings in reverse alphabetical order.
### Response: 
Here's the Python code for sorting an array of strings in reverse alphabetical order using the built-in sorted() method:
    ```python
    def reverse_sort(arr):
      arr = list(sorted(arr, key=lambda x: -x))  # Sorting in reverse order
      return arr
    ```
This function takes an input array as argument and returns it sorted in reverse alphabetical order. The `key` parameter is used to specify how elements should be compared when sorting them. In this case, we use `-x`, which reverses the string before comparing it with other elements. This ensures that the smallest element comes first in the output array. Finally, the returned value is converted back into a list using the `list()` constructor.
  • This is shit. I don't recommend using this for code; a corrected version follows for reference.
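For comparison, a working version (my addition, not model output): `key=lambda x: -x` raises a `TypeError` on strings, whereas `reverse=True` actually gives reverse alphabetical order:

```python
def reverse_sort(arr):
    # reverse=True sorts in descending (reverse alphabetical) order
    return sorted(arr, reverse=True)

print(reverse_sort(["pear", "apple", "banana"]))  # ['pear', 'banana', 'apple']
```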