---
library_name: transformers
datasets:
- allenai/c4
---

# Model Card for falcon-11B-GPTQ-4bit

## Model Details

### Model Description

This model is a quantized version of [Falcon2-11B](https://huggingface.co/tiiuae/falcon-11B). Quantization to 4-bit was performed with AutoGPTQ (a hedged reproduction sketch is included at the end of this card).

- **Developed by:** TII (Technology Innovation Institute)
- **Quantized by:** Michael Svendsen

### Getting Started

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

pretrained_model_name = "thesven/falcon-11B-GPTQ-4bit"

# Load the tokenizer
tokenizer = AutoTokenizer.from_pretrained(pretrained_model_name)

# Load the quantized model; device_map="auto" dispatches the weights to the
# available GPU(s), so no explicit .to(device) call is needed (calling .to()
# on a dispatched model raises an error)
model = AutoModelForCausalLM.from_pretrained(
    pretrained_model_name,
    device_map="auto",
)

# Make generation stop at the tokenizer's EOS token
model.generation_config.eos_token_id = tokenizer.eos_token_id

# Define the input text
input_text = "Why is the sky blue?"

# Encode the input text and move it to the model's device
input_ids = tokenizer.encode(input_text, return_tensors="pt").to(model.device)

# Generate output
output = model.generate(input_ids, max_length=1000)

# Decode the generated output
decoded_output = tokenizer.batch_decode(output, skip_special_tokens=True)

# Print the decoded output
for i, sequence in enumerate(decoded_output):
    print(f"Generated Sequence {i+1}: {sequence}")
```

## License

Falcon2-11B is licensed under the [TII Falcon License 2.0](https://falconllm-staging.tii.ae/falcon-2-terms-and-conditions.html), a permissive Apache 2.0-based software license that includes an acceptable use policy promoting the responsible use of AI.

## Uses

### Direct Use

Research on large language models; as a foundation for further specialization and fine-tuning for specific use cases (e.g., summarization, text generation, chatbots).

### Out-of-Scope Use

Production use without adequate assessment of risks and mitigations; any use case that may be considered irresponsible or harmful.

### Bias, Risks, and Limitations

Falcon2-11B was trained mostly on English, but also on German, Spanish, French, Italian, Portuguese, Polish, Dutch, Romanian, Czech, and Swedish. It will not generalize appropriately to other languages. Furthermore, as it was trained on large-scale corpora representative of the web, it carries the stereotypes and biases commonly encountered online.

### Recommendations

We recommend that users of Falcon2-11B consider fine-tuning it for their specific tasks of interest (see the LoRA sketch at the end of this card), and that guardrails and appropriate precautions be taken for any production use.
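
## Quantization Sketch

The exact calibration settings used for this checkpoint are not documented above. The following is a minimal sketch of how a 4-bit GPTQ quantization of the base model could be reproduced with the transformers GPTQ integration, assuming the allenai/c4 calibration set listed in this card's metadata; `group_size=128` is a common default, not a confirmed setting for this repository.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, GPTQConfig

base_model_name = "tiiuae/falcon-11B"

tokenizer = AutoTokenizer.from_pretrained(base_model_name)

# Assumed settings: 4-bit weights calibrated on the "c4" dataset;
# group_size=128 is an illustrative default, not a verified value
gptq_config = GPTQConfig(
    bits=4,
    group_size=128,
    dataset="c4",
    tokenizer=tokenizer,
)

# Quantization runs the calibration data through the full-precision model,
# so this step needs enough GPU memory for the 11B base weights
quantized_model = AutoModelForCausalLM.from_pretrained(
    base_model_name,
    quantization_config=gptq_config,
    device_map="auto",
)

# Persist the quantized weights and tokenizer for reuse
quantized_model.save_pretrained("falcon-11B-GPTQ-4bit")
tokenizer.save_pretrained("falcon-11B-GPTQ-4bit")
```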
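
## Fine-Tuning Sketch

The Direct Use and Recommendations sections suggest fine-tuning for specific tasks. A GPTQ-quantized checkpoint cannot be fully fine-tuned in place, but adapter methods can be trained on top of the frozen quantized weights. Below is a minimal LoRA sketch using the PEFT library; the hyperparameters are illustrative assumptions, and `"query_key_value"` is the name of Falcon's fused attention projection module.

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model = AutoModelForCausalLM.from_pretrained(
    "thesven/falcon-11B-GPTQ-4bit",
    device_map="auto",
)

# Prepare the quantized model for training (casts norm/output layers to
# float32 and enables input gradients)
model = prepare_model_for_kbit_training(model)

# Illustrative LoRA hyperparameters; "query_key_value" is Falcon's fused
# attention projection
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["query_key_value"],
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)

# Wrap the base model with trainable LoRA adapters; only the adapter
# weights receive gradients
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()

# The wrapped model can now be trained with the standard Trainer API
```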