nielsr (HF Staff) committed commit cc1f112 · verified · 1 parent: b07d5ad

Improve model card: Add pipeline tag, library name, update license, and link to code


This PR enhances the model card for `Tucan-27B-v1.0-LoRA` by:
- Adding `pipeline_tag: text-generation` to the metadata, ensuring the model is discoverable via the pipeline filter on the Hub.
- Adding `library_name: transformers` to the metadata, which enables the "how to use" button and provides relevant code snippets for users (see the loading sketch after this list).
- Updating the `license` tag in the metadata to `cc-by-4.0` to accurately reflect the license declared within the model card's content itself.
- Adding a direct link to the model's GitHub repository: `https://github.com/llm-bg/tucan`.
- Including the Hugging Face paper page link for comprehensive documentation, while retaining the existing arXiv link for completeness.
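
Since `library_name: transformers` switches on the Hub's auto-generated usage snippets, a minimal loading sketch follows. It is not taken from the model card: the adapter repo id is a placeholder, and it assumes this repository holds a PEFT/LoRA adapter for the declared base model (loading may also require accepting the base model's license on the Hub).

```python
# Hypothetical loading sketch -- NOT from the model card.
# ADAPTER_ID is a placeholder; use the exact repo id shown on the Hub model page.
# Assumes this repo holds PEFT/LoRA adapter weights for the base model below.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

BASE_ID = "INSAIT-Institute/BgGPT-Gemma-2-27B-IT-v1.0"
ADAPTER_ID = "<org>/Tucan-27B-v1.0-LoRA"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(BASE_ID)
base = AutoModelForCausalLM.from_pretrained(
    BASE_ID,
    torch_dtype=torch.bfloat16,
    device_map="auto",  # 27B base model: shard across available devices
)
model = PeftModel.from_pretrained(base, ADAPTER_ID)  # attach the LoRA weights

chat = [{"role": "user", "content": "Здравей!"}]  # "Hello!"
inputs = tokenizer.apply_chat_template(
    chat, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```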

Files changed (1):
1. README.md (+14 -5)
README.md CHANGED
```diff
@@ -1,20 +1,24 @@
 ---
-license: gemma
-language:
-- bg
 base_model:
 - INSAIT-Institute/BgGPT-Gemma-2-27B-IT-v1.0
+language:
+- bg
+license: cc-by-4.0
 tags:
 - function_calling
 - MCP
 - tool_use
+pipeline_tag: text-generation
+library_name: transformers
 ---
 
 # Tucan-27B-v1.0-LoRA
 
 ## Bulgarian Language Models for Function Calling 🇧🇬
 
-**Paper: https://arxiv.org/abs/2506.23394**
+**Paper: [Teaching a Language Model to Speak the Language of Tools](https://huggingface.co/papers/2506.23394)**
+arXiv: https://arxiv.org/abs/2506.23394
+Code: https://github.com/llm-bg/tucan
 
 ## Overview 🚀
 
@@ -113,7 +117,12 @@ def create_prompt(functions, user_query):
     """
 
     functions_text = json.dumps(functions, ensure_ascii=False, indent=2)
-    full_prompt = f"{system_prompt}\n## Налични функции:\n{functions_text}\n\n## Потребителска заявка:\n{user_query}"
+    full_prompt = f"""{system_prompt}
+## Налични функции:
+{functions_text}
+
+## Потребителска заявка:
+{user_query}"""
 
     chat = [{"role": "user", "content": full_prompt}]
     return tokenizer.apply_chat_template(chat, tokenize=False, add_generation_prompt=True)
```
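
The snippet above packs the tool definitions and the user request into a single user turn before applying the chat template. Below is a self-contained sketch of that prompt-construction pattern; the system prompt text and the function schema are illustrative placeholders, not the ones from the model card.

```python
# Illustrative placeholders: system_prompt and the tool schema are invented for
# this sketch; only the prompt layout mirrors the full_prompt construction above.
import json
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("INSAIT-Institute/BgGPT-Gemma-2-27B-IT-v1.0")

system_prompt = "Ти си полезен асистент с достъп до функции."  # "You are a helpful assistant with access to functions."

functions = [
    {
        "name": "get_weather",
        "description": "Връща прогнозата за времето за даден град.",  # "Returns the weather forecast for a given city."
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    }
]

def create_prompt(functions, user_query):
    # Serialize the tool definitions and lay out the prompt sections as in the README snippet.
    functions_text = json.dumps(functions, ensure_ascii=False, indent=2)
    full_prompt = (
        f"{system_prompt}\n"
        f"## Налични функции:\n{functions_text}\n\n"  # "Available functions"
        f"## Потребителска заявка:\n{user_query}"     # "User query"
    )
    chat = [{"role": "user", "content": full_prompt}]
    return tokenizer.apply_chat_template(chat, tokenize=False, add_generation_prompt=True)

print(create_prompt(functions, "Какво е времето в София?"))  # "What is the weather in Sofia?"
```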