sroecker committed
Commit 4075a4c (parent 1654a07)

Adapt to Elster


Change response template to answer in German.
Change storage location to sroecker/Elster-preference.

Files changed (1): app.py (+4 −4)
app.py CHANGED
@@ -59,7 +59,7 @@ else:
     print(f"Creating new dataset file: {dataset_file}")
 
 # Set up CommitScheduler for dataset uploads
-repo_id = "davanstrien/magpie-preference"  # Replace with your desired dataset repo
+repo_id = "sroecker/Elster-preference"  # Replace with your desired dataset repo
 scheduler = CommitScheduler(
     repo_id=repo_id,
     repo_type="dataset",
@@ -151,7 +151,7 @@ def generate_instruction_response():
         gr.update(interactive=False),
     )
 
-    response_template = f"""<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n{sanitized_instruction}<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"""
+    response_template = f"""<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n{sanitized_instruction} Antworte auf Deutsch ohne "Sie".<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"""
 
     response = pipeline(
         response_template,
@@ -190,11 +190,11 @@ def generate_instruction_response():
 
 
 title = """
-<h1 style="text-align:center">&#x1F426; Magpie Preference</h1>
+<h1 style="text-align:center">&#x1F426; Elster Preference</h1>
 """
 
 description = """
-This demo showcases **[Magpie](https://magpie-align.github.io/)**, an innovative approach to generating high-quality data by prompting aligned LLMs with their pre-query templates. Unlike many existing synthetic data generation methods, Magpie doesn't rely on prompt engineering or seed questions for generating synthetic data. Instead, it uses the prompt template of an aligned LLM to generate both the user query and an LLM response.
+This demo showcases **Elster** - derived from **[Magpie](https://magpie-align.github.io/)**, an innovative approach to generating high-quality data by prompting aligned LLMs with their pre-query templates. Unlike many existing synthetic data generation methods, Magpie doesn't rely on prompt engineering or seed questions for generating synthetic data. Instead, it uses the prompt template of an aligned LLM to generate both the user query and an LLM response.
 
 <img src="https://magpie-align.github.io/images/pipeline.png" alt="Magpie Pipeline" width="50%" align="center" />
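For context on the changed `response_template` line: the Magpie recipe has two template-construction steps. First, the aligned model is given only its pre-query template so that it "fills in" the user turn itself (a synthetic instruction); second, that sampled instruction is wrapped back into the full chat template so the model completes the assistant turn. A minimal sketch of the two steps, assuming Llama-3-style special tokens as in app.py (the helper names are illustrative, not from the file):

```python
def build_pre_query() -> str:
    # Step 1: the pre-query template ends right where the user's text would
    # normally begin, so generation from here yields a synthetic user query.
    return "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n"


def build_response_template(sanitized_instruction: str) -> str:
    # Step 2 (mirrors app.py): embed the sampled instruction in the full
    # template and stop at the assistant header, so generation from here
    # yields the assistant response. This commit appends the German
    # instruction 'Antworte auf Deutsch ohne "Sie".' at this point.
    return (
        "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n"
        f"{sanitized_instruction}"
        "<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"
    )
```

Both strings would then be passed to the text-generation `pipeline`, as the diff above does with `response_template`.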