---
base_model:
- openai-community/gpt2
language: en
license: mit
tags:
- gpt2
- text-generation
---

# QnQGPT Model

This is a custom GPT model based on the GPT-2 architecture.

## Model Details

- Model Type: GPT-2
- Base Model: gpt2
- Training Data: [Describe your training data]
- Use Cases: [Describe intended use cases]

## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer from the Hugging Face Hub
model = AutoModelForCausalLM.from_pretrained("karthikqnq/qnqgpt")
tokenizer = AutoTokenizer.from_pretrained("karthikqnq/qnqgpt")

# Generate text from a prompt
text = "Hello, how are"
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_length=50)

# Decode the generated token IDs back to text
result = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(result)
```

## Training Details

[Add your training details here]

## Limitations

[Add model limitations here]

## License

This model is released under the MIT License.