---
license: apache-2.0
tags:
- jamba
datasets:
- teknium/OpenHermes-2.5
base_model: ai21labs/Jamba-v0.1
pipeline_tag: text-generation
---

# Jamba-Open-Hermes

# This is highly experimental and should be viewed as purely a test right now.

Jamba has been very hard to train, but I wanted to see how it does on one of the best datasets we have access to. I believe in transparent development, so all of the *best* working iterations, even if they are a bit wonky, will be pushed here.

---

# New training underway!

Thanks to the generous insights provided by **lightblue/Jamba-v0.1-chat-multilingual**, the new training is going much better. We should hopefully have a decently trained Jamba-Open-Hermes model for general use and experimentation.

# *It works! Use at your own risk, though*

## Example Output:

```
<|startoftext|><|im_start|>system
You are a helpful AI assistant.
<|im_end|>
<|im_start|>user
A tree needs to leave the earth and become astro-biological. How can it achieve it's goal?
<|im_end|>
<|im_start|>assistant
The process of becoming astronautic life for a tree is an excemtally complicated one that requires a great deal of preparation, coordination, and funding. The first step in this process is for the tree to be identified as a potential candidate for astronautics, which may involve a thorough assessment of its size, health, and suitability for space travel.

Once a tree has been identified as a possible astronautic life-form, the next step involves preparing it for spaceflight. This includes training the tree to tolerate low gravity environments, teaching it how to navigate in zero gravity, and ensuring that its nutritional needs are met in space. The process of acclimating a tree to spaceflight can take months or even years, depending on the species and the specific requirements of each mission.

Once the tree has been adequately trained and prepared for spaceflight, it must then be transported into orbit or beyond. This may involve launching the tree aboard a rocket or shuttle, or using other methods such as balloons or parachutes to reach its destination. The journey itself can be long and arduous, with many potential risks and challenges along the way.

Upon arrival in space, the tree will need to be monitored closely and cared for by astronautic personnel. This may involve feeding the tree, maintaining its health and well-being, and ensuring that it has enough water and nutrients to survive. The tree may also need to be protected from radiation, extreme temperatures, or other hazards that it may face in space.

Over time, as the tree adapts to life in space, it will likely undergo some changes in its physical appearance and behavior. For example, it may begin growing in new ways, developing different kinds of leaves or branches in response to the unique environment it finds itself in. It may also become more active and mobile, using its roots to navigate and explore its new home.

Throughout this process, there are many challenges that need to be addressed and overcome.<|im_end|>
<|endoftext|>
```

```py
# Install dependencies (version-pinned packages are quoted so the shell does not treat ">" as a redirect)
!pip install -qqq "transformers>=4.39.0" mamba-ssm "causal-conv1d>=1.2.0" accelerate bitsandbytes --progress-bar off
!pip install flash-attn --no-build-isolation

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# 4-bit double quantization keeps the model within a single-GPU memory budget
double_quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.float16
)

model = AutoModelForCausalLM.from_pretrained(
    "Severian/Jamba-Nexus-IKM-v1",
    device_map="auto",
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
    attn_implementation="flash_attention_2",
    quantization_config=double_quant_config,
)

tokenizer = AutoTokenizer.from_pretrained("Severian/Jamba-Nexus-IKM-v1")

# ChatML-style prompt matching the format shown in the example output above
input_text = """<|im_start|>system
You are a helpful AI assistant.
<|im_end|>
<|im_start|>user
A tree needs to leave the earth and become astro-biological. How can it achieve it's goal?
<|im_end|>
<|im_start|>assistant
"""

input_ids = tokenizer(input_text, return_tensors='pt').to(model.device)["input_ids"]

# Greedy decoding (do_sample defaults to False) with a mild repetition penalty
outputs = model.generate(input_ids, max_new_tokens=1024, temperature=0.0, repetition_penalty=1.1)
print(tokenizer.batch_decode(outputs)[0])
```

---

## Training

### **OpenHermes-2.5**

### Hyperparameters

```py
from peft import LoraConfig
from transformers import TrainingArguments
from trl import SFTTrainer

# LoRA targets the embedding layer and the Mamba block projections
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["embed_tokens", "x_proj", "in_proj", "out_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
    bias="none"
)

trainer = SFTTrainer(
    model=model,
    train_dataset=train_dataset,
    dataset_text_field="text",
    max_seq_length=max_seq_length,
    tokenizer=tokenizer,
    peft_config=lora_config,  # apply the LoRA adapters defined above
    args=TrainingArguments(
        num_train_epochs=1,
        lr_scheduler_type='cosine',
        learning_rate=0.0002,
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        gradient_checkpointing=True,
        warmup_steps=10,
        weight_decay=0.01,
        fp16=not torch.cuda.is_bf16_supported(),
        bf16=torch.cuda.is_bf16_supported(),
        logging_steps=1,
        save_steps=200,
        output_dir="outputs",
        optim="adamw_8bit",
        seed=42,
    ),
)
```
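
The trainer above expects `train_dataset` to already contain a `text` column with fully formatted prompts. That preprocessing step isn't shown here, so below is a minimal, hypothetical sketch of one way to build it from teknium/OpenHermes-2.5, assuming the dataset's ShareGPT-style `conversations` schema (`from`/`value` keys) and the ChatML-style tags seen in the example output; the actual preprocessing used for this model may differ.

```py
# Hypothetical preprocessing sketch -- not the exact script used for this model.
from datasets import load_dataset

# Map OpenHermes-2.5 speaker labels to ChatML roles (assumed schema)
ROLE_MAP = {"system": "system", "human": "user", "gpt": "assistant"}

def to_chatml(example):
    # Concatenate every turn into a single ChatML-formatted string for SFTTrainer
    parts = []
    for turn in example["conversations"]:
        role = ROLE_MAP.get(turn["from"], turn["from"])
        parts.append(f"<|im_start|>{role}\n{turn['value']}\n<|im_end|>")
    return {"text": "\n".join(parts)}

dataset = load_dataset("teknium/OpenHermes-2.5", split="train")
train_dataset = dataset.map(to_chatml, remove_columns=dataset.column_names)
```

Whatever formatting you use, it should match the tags the model is prompted with at inference time; the tokenizer can be left to add its own BOS/EOS tokens.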