modelId (string, 5–122 chars) | author (string, 2–42 chars) | last_modified (unknown) | downloads (int64, 0–738M) | likes (int64, 0–11k) | library_name (245 classes) | tags (sequence, 1–4.05k items) | pipeline_tag (48 classes) | createdAt (unknown) | card (string, 1–901k chars)
---|---|---|---|---|---|---|---|---|---|
DBangshu/Base_GPT2_e7_4_0 | DBangshu | "2024-06-14T16:21:39Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"gpt2",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | "2024-06-14T16:21:19Z" | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
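The card leaves this section empty; as a minimal sketch (assuming the repo loads as a standard GPT-2 text-generation checkpoint, per the row's `gpt2` and `text-generation` tags; the prompt is purely illustrative):
```python
from transformers import pipeline

# Assumption: the repo is a standard GPT-2 text-generation checkpoint,
# as suggested by this row's `gpt2` / `text-generation` tags.
generator = pipeline("text-generation", model="DBangshu/Base_GPT2_e7_4_0")
print(generator("Once upon a time", max_new_tokens=50)[0]["generated_text"])
```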
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
TH78/lexluthor | TH78 | "2024-06-14T16:23:40Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-14T16:23:10Z" | Entry not found |
theadityamittal/resnet152 | theadityamittal | "2024-06-19T23:01:50Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"custom_model",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | null | "2024-06-14T16:23:27Z" | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
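The card leaves this section empty; a hedged sketch, assuming the `custom_model` tag means the repo ships its own modeling code (so `trust_remote_code=True` is likely required):
```python
from transformers import AutoModel

# Assumption: the `custom_model` tag implies repo-hosted modeling code.
# trust_remote_code executes code from the repo -- review it first.
model = AutoModel.from_pretrained(
    "theadityamittal/resnet152",
    trust_remote_code=True,
)
print(type(model).__name__)
```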
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
c2p-cmd/dreamshaper-7-coreml | c2p-cmd | "2024-06-21T04:49:32Z" | 0 | 0 | null | [
"text-to-image",
"license:creativeml-openrail-m",
"region:us"
] | text-to-image | "2024-06-14T16:24:02Z" | ---
license: creativeml-openrail-m
pipeline_tag: text-to-image
---
### CoreML version of [Lykon/dreamshaper-7](https://huggingface.co/Lykon/dreamshaper-7)
These are only the compiled `.mlmodelc` files, which can be used directly from Swift via Apple's [ml-stable-diffusion](https://github.com/apple/ml-stable-diffusion). |
theadityamittal/convnext_large | theadityamittal | "2024-06-19T23:02:37Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"custom_model",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | null | "2024-06-14T16:24:12Z" | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
RicardoLee/WASSA2024_EmpathyDetection_Chinchunmei_EXP302 | RicardoLee | "2024-06-14T16:32:18Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"llama3",
"llama3-8b-instruct",
"WASSA",
"WASSA2024",
"Empathy Detection",
"Empathy Scoring",
"conversational",
"en",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | "2024-06-14T16:26:19Z" | ---
language:
- en
tags:
- llama3
- llama3-8b-instruct
- WASSA
- WASSA2024
- Empathy Detection
- Empathy Scoring
---
# WASSA2024 Track 1,2,3 LLM based on Llama3-8B-Instruct (Pure LoRA Training)
This model covers WASSA2024 Tracks 1, 2, and 3. It is fine-tuned from Llama3-8B-Instruct using the standard prediction and role-play templates.
For training and usage details, please refer to the paper:
## Licence
This repository's models are open-sourced under the Apache-2.0 license, and use of their weights must additionally comply with the Llama3 [MODEL LICENCE](LICENSE).
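For illustration only, a hedged loading sketch (the repo id comes from this row; the prompt, dtype, and generation settings are assumptions, and the card does not document an official usage recipe):
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "RicardoLee/WASSA2024_EmpathyDetection_Chinchunmei_EXP302"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo, torch_dtype=torch.bfloat16, device_map="auto"
)

# Illustrative prompt only; the paper's exact task templates are not shown here.
messages = [{"role": "user", "content": "Rate the empathy expressed in: 'I am so sorry for your loss.'"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(input_ids, max_new_tokens=64)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```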
|
tarsssss/hubert-sslepoch_v1 | tarsssss | "2024-06-14T16:32:27Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-14T16:26:39Z" | Entry not found |
jpholanda/speech-to-intent_wav2vec2-enc | jpholanda | "2024-06-14T16:27:15Z" | 0 | 0 | null | [
"pytorch",
"region:us"
] | null | "2024-06-14T16:26:45Z" | Entry not found |
theadityamittal/vit_h | theadityamittal | "2024-06-19T23:05:58Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"custom_model",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | null | "2024-06-14T16:27:09Z" | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
axssel/andres_baehr | axssel | "2024-06-16T19:37:18Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-14T16:29:53Z" | Entry not found |
nm-testing/opt-125m-fp8-static-kv | nm-testing | "2024-06-14T16:39:59Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"opt",
"text-generation",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"fp8",
"region:us"
] | text-generation | "2024-06-14T16:30:10Z" | Entry not found |
longlivebigcat/hunheNew_qiwen7b_alp_lora1k400_model | longlivebigcat | "2024-06-14T16:31:27Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"text-generation-inference",
"unsloth",
"qwen2",
"trl",
"en",
"base_model:unsloth/Qwen2-7B-bnb-4bit",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | "2024-06-14T16:31:07Z" | ---
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- qwen2
- trl
base_model: unsloth/Qwen2-7B-bnb-4bit
---
# Uploaded model
- **Developed by:** longlivebigcat
- **License:** apache-2.0
- **Finetuned from model :** unsloth/Qwen2-7B-bnb-4bit
This qwen2 model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
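A hedged loading sketch using Unsloth's documented `FastLanguageModel` loader (the repo id comes from this row; whether it holds a LoRA adapter or merged weights is not stated, and `max_seq_length` is an illustrative choice):
```python
from unsloth import FastLanguageModel

# Assumption: the repo holds Unsloth-trained weights loadable as shown.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="longlivebigcat/hunheNew_qiwen7b_alp_lora1k400_model",
    max_seq_length=2048,
    load_in_4bit=True,  # matches the unsloth/Qwen2-7B-bnb-4bit base model
)
FastLanguageModel.for_inference(model)  # switch to Unsloth's inference mode
```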
|
longlivebigcat/hunheNew_qiwen7b_alp_lora400_mo1kdel | longlivebigcat | "2024-06-14T16:31:29Z" | 0 | 0 | transformers | [
"transformers",
"unsloth",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | null | "2024-06-14T16:31:27Z" | ---
library_name: transformers
tags:
- unsloth
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
stimakallahari/roberta-base_ag_news | stimakallahari | "2024-06-14T16:33:24Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-14T16:33:24Z" | Entry not found |
nqv2291/tmp | nqv2291 | "2024-06-14T16:34:00Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-14T16:34:00Z" | Entry not found |
la-min/GENI_tokenizer | la-min | "2024-06-19T06:43:05Z" | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | null | "2024-06-14T16:35:00Z" | ---
license: apache-2.0
---
|
RicardoLee/WASSA2024_EmpathyDetection_Chinchunmei_EXP304 | RicardoLee | "2024-06-14T17:13:02Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"llama3",
"llama3-8b-instruct",
"WASSA",
"WASSA2024",
"Empathy Detection",
"Empathy Scoring",
"conversational",
"en",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | "2024-06-14T16:38:35Z" | ---
language:
- en
tags:
- llama3
- llama3-8b-instruct
- WASSA
- WASSA2024
- Empathy Detection
- Empathy Scoring
---
# WASSA2024 Track 1,2,3 LLM based on Llama3-8B-Instruct (Pure LoRA Training)
This model covers WASSA2024 Tracks 1, 2, and 3. It is fine-tuned from Llama3-8B-Instruct using the standard prediction, role-play, and contrastive supervised fine-tuning templates. The learning rate for this model is 2e-5.
For training and usage details, please refer to the paper:
## Licence
This repository's models are open-sourced under the Apache-2.0 license, and use of their weights must additionally comply with the Llama3 [MODEL LICENCE](LICENSE).
|
chhuuchuuz/Nayeon_NA_Solo_WEIGHTS.GG | chhuuchuuz | "2024-06-14T16:41:10Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-14T16:40:06Z" | Entry not found |
Victor1221/model_trail_1 | Victor1221 | "2024-06-14T16:46:18Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-14T16:46:18Z" | Entry not found |
lgris/distil-whisper-commonvoice16-pt | lgris | "2024-06-14T16:46:27Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-14T16:46:27Z" | Entry not found |
tzggez/s_pull | tzggez | "2024-06-14T16:50:53Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-14T16:50:23Z" | Entry not found |
rashid996958/pix2pix_exp28 | rashid996958 | "2024-06-14T16:51:33Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-14T16:51:29Z" | Entry not found |
nali24mai/mcq_augmented_quantized_model | nali24mai | "2024-06-14T16:55:24Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | null | "2024-06-14T16:52:39Z" | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
SicariusSicariiStuff/jukofyork_Dusk-Miqu-70B_EXL2_2.4bpw | SicariusSicariiStuff | "2024-06-15T08:17:25Z" | 0 | 0 | transformers | [
"transformers",
"mergekit",
"merge",
"base_model:152334H/miqu-1-70b-sf",
"base_model:sophosympatheia/Midnight-Rose-70B-v2.0.3",
"base_model:Sao10K/Euryale-1.3-L2-70B",
"base_model:Sao10K/WinterGoddess-1.4x-70B-L2",
"license:other",
"endpoints_compatible",
"region:us"
] | null | "2024-06-14T16:52:49Z" | ---
base_model:
- 152334H/miqu-1-70b-sf
- sophosympatheia/Midnight-Rose-70B-v2.0.3
- Sao10K/Euryale-1.3-L2-70B
- Sao10K/WinterGoddess-1.4x-70B-L2
library_name: transformers
tags:
- mergekit
- merge
license: other
---
![Dusk-Miqu.png](Dusk-Miqu.png)
A "dark" creative writing model with 32k context. Based off [miqu-1-70b](https://huggingface.co/miqudev/miqu-1-70b) but with greatly reduced "positivity" and "-isms". If you want happy endings, look elsewhere!
This model **excels** at writing Dark/Grimdark fantasy (see examples below).
***NOTE***: *For a full range of GGUF quants kindly provided by @mradermacher, see: [Dusk-Miqu-70B-GGUF](https://huggingface.co/mradermacher/Dusk-Miqu-70B-GGUF).*
# Model background
This model is almost the same as [Dark-Miqu-70B](https://huggingface.co/jukofyork/Dark-Miqu-70B), but with @sophosympatheia's SLERP merge pattern:
```yaml
parameters:
t:
- value: [0, 0, 0.2, 0.3, 0.4, 0.5, 0.4, 0.3, 0.2, 0, 0]
```
which creates this truncated triangular distribution:
![Dark-Miqu-Distribution.png](Dusk-Miqu-Distribution-1.png)
For Dusk-Miqu, that pattern is altered to use this truncated triangular distribution instead:
![Dark-Miqu-Distribution-2.png](Dusk-Miqu-Distribution-2.png)
This keeps the first 16 and last 16 layers unaltered (which ties in with what people have found for the frankenmerge interleave patterns), and potentially fixes the "[poor grammar](https://huggingface.co/jukofyork/Dark-Miqu-70B/discussions/2)" problem some people are experiencing with [Dark-Miqu-70B](https://huggingface.co/jukofyork/Dark-Miqu-70B) (sadly I can't replicate this though...).
Luckily this change also doesn't necessitate the recreation of the whole merge from scratch, and we can just use this:
```yaml
merge_method: linear
parameters:
weight: 1.0
slices:
- sources:
- model: 152334H/miqu-1-70b-sf
layer_range: [0, 16]
- model: jukofyork/dark-miqu-70b
layer_range: [0, 16]
parameters:
weight: 0
- sources:
- model: jukofyork/dark-miqu-70b
layer_range: [16, 64]
- sources:
- model: 152334H/miqu-1-70b-sf
layer_range: [64, 80]
- model: jukofyork/dark-miqu-70b
layer_range: [64, 80]
parameters:
weight: 0
dtype: float16
tokenizer_source: model:miqu-1-70b-sf
```
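To apply a config like the one above, mergekit also exposes a Python entry point alongside its `mergekit-yaml` CLI; the sketch below follows the pattern from the mergekit README, with the config filename and output path as placeholder assumptions:
```python
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# "config.yml" holds the YAML above; "./dusk-miqu-out" is a placeholder path.
with open("config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    "./dusk-miqu-out",
    options=MergeOptions(copy_tokenizer=True),
)
```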
# Prompting format
Vicuna format is preferred:
```
USER: {prompt} ASSISTANT:
```
Mistral and Alpaca formats are also supported:
```
[INST] {prompt} [/INST]
```
```
### Instruction:
{prompt}
### Response:
```
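Since no chat template is bundled, the Vicuna prompt is built by hand; a minimal sketch (this loads the full-precision [jukofyork/Dusk-Miqu-70B](https://huggingface.co/jukofyork/Dusk-Miqu-70B) referenced by this card, since the EXL2 quant in this row is loaded through exllamav2 instead; generation settings are illustrative):
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "jukofyork/Dusk-Miqu-70B"  # full-precision source of this EXL2 quant
tok = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo, device_map="auto")

# Preferred Vicuna format: "USER: {prompt} ASSISTANT:"
text = "USER: Write me the opening chapter of a grimdark fantasy story. ASSISTANT:"
inputs = tok(text, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=256, do_sample=False)  # greedy ~ temperature=0
print(tok.decode(out[0], skip_special_tokens=True))
```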
# Licence and usage restrictions
[miqu-1-70b-sf](https://huggingface.co/152334H/miqu-1-70b-sf) is a dequantized version of the [miqu-1-70b](https://huggingface.co/miqudev/miqu-1-70b) model leaked from MistralAI. All miqu-derived models, including this merge, are suitable for non-commercial, personal use only.
# Example stories
The following mix of "dark" stories was generated using the Vicuna prompt format with no system message and temperature=0:
## Dark fantasy stories
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a 'Grimdark' story trilogy in the style of Joe Abercrombie and Rob J Hayes
### ASSISTANT
Chapter 1: The Shadow's Whisper
The reek of blood and iron hung heavy in the air, a pungent symphony that played to the tune of clashing steel and guttural screams. The battlefield was a tapestry of carnage, painted in hues of crimson and despair, where the once proud banners of House Valthorne now lay trampled beneath the hooves of the victorious. The last rays of the setting sun cast long, twisted shadows across the corpses of the fallen, as if the very earth itself was trying to swallow the evidence of the day's atrocities.
Ser Aldric Valthorne, the Blackened Blade, stood atop a mound of bodies, his armor stained with the lifeblood of his enemies. His longsword, Night's Whisper, dripped with the same crimson hue that now adorned his once pristine surcoat. The weight of the weapon felt like a leaden anchor, dragging him down into the mire of death and destruction that surrounded him. He had fought with the ferocity of a cornered beast, but the cost had been high. His breath came in ragged gasps, and his muscles screamed for respite.
A coughing fit wracked his body, and he doubled over, spewing a mixture of bile and blood onto the cold, unforgiving ground. The wound in his side, a gift from a particularly tenacious foe, burned like the fires of the Abyss. He knew he didn't have much time. The battle may have been won, but the war was far from over.
"Ser Aldric!" A voice cut through the din of the aftermath, and he turned to see his squire, a young lad named Jorah, fighting his way towards him through the sea of bodies. "The men are routing! We must away!"
Aldric nodded, wiping the blood from his lips with the back of his gauntlet. "Aye, lad. Gather what supplies we can. We'll make for the caves to the north. Send word to my brother, tell him... tell him I've bought us some time."
Jorah's eyes widened in fear, but he didn't question the order. He knew better than to argue with the Blackened Blade when death was nipping at their heels. He scurried off, disappearing into the chaos like a rat in a sewer.
Aldric took one last look at the battlefield, the scene of his greatest victory and potentially his last stand. The enemy forces, the vile and treacherous House Blackthorn, were in full retreat, their numbers decimated by his own. But he knew it was only a matter of time before they regrouped, before they returned with reinforcements and a thirst for vengeance that would not be quenched until the last drop of Valthorne blood was spilled.
He sheathed Night's Whisper, the sword's obsidian blade seeming to drink in the fading light, and stumbled towards his horse. The beast, a massive destrier as black as midnight, snorted and pawed at the ground, sensing its master's pain. Aldric winced as he swung his leg over the saddle, the movement sending a fresh wave of agony through his body. He gritted his teeth and spurred the horse forward, ignoring the protests of his wound.
As they rode away from the carnage, Aldric couldn't help but feel a sense of foreboding settling over him like a shroud. The whispers of the shadows seemed to grow louder, their voices a cacophony of malice and deceit. He had always been able to ignore them before, to push them to the back of his mind and focus on the task at hand. But now, with his life's essence seeping from his body, he found their words impossible to ignore.
"You've doomed them all, Aldric," they hissed, their voices like the rustle of dead leaves. "Your pride has sealed their fate."
He shook his head, trying to clear the thoughts, but they only grew stronger.
"You should have accepted the terms. You should have bent the knee."
"Never," he growled through clenched teeth. "I'd rather see my house burn to the ground than serve that snake, Blackthorn."
The shadows seemed to laugh, their voices echoing in his skull. "And so it shall, Valthorne. And so it shall."
They rode through the night, Aldric's strength waning with each passing mile. The caves, a natural fortress that had served as a refuge for his family in times of need for generations, loomed ahead. He knew that once they reached their sanctuary, the healers would be able to tend to his wounds. But would it be enough?
As they approached the entrance, a figure stepped out from the shadows, a hooded figure shrouded in darkness. Aldric reined in his horse, his hand instinctively reaching for the hilt of his sword.
"Who goes there?" he demanded, his voice barely above a whisper.
The figure raised a hand, and the hood fell back, revealing the face of a woman. She was beautiful, with skin as pale as the moon and eyes that seemed to hold the very darkness of the night. A cruel smile played upon her lips, and Aldric felt a chill run down his spine.
"I am the one you've been waiting for, Ser Aldric," she said, her voice like the sweetest poison. "I am the one who will show you the true path to victory."
He stared at her, his mind a whirlwind of confusion and pain. "What... what do you want?"
"Only to help you, my dear," she replied, stepping closer. "To help you save your house, and in return, all I ask is a small favor."
Aldric narrowed his eyes, suspicion warring with desperation. "What favor?"
The woman reached out, her fingers brushing against his cheek. They were cold, like the touch of death itself. "A simple task, really. One that will require you to embrace the darkness within."
He hesitated, the whispers in his head growing louder, more insistent. He could feel their approval, their hunger for the power she offered.
"Do we have a deal, Ser Aldric?" she asked, her voice a seductive purr.
With a growl, he ripped his sword from its scabbard, the obsidian blade glinting in the moonlight. "I'll see my house restored to its former glory, or I'll see it burn with me in the attempt. But I'll be damned if I make a pact with the likes of you."
The woman's smile never wavered, even as Aldric swung Night's Whisper at her with all his remaining strength. But she was gone, vanished into the shadows as if she had never been there at all. He stared at the empty space, his heart pounding in his chest.
"Did you see her, Jorah?" he asked, his voice hoarse.
The squire, white as a sheet, nodded mutely.
"Then we ride," Aldric said, urging his horse forward once more. "We ride, and we pray to the gods that we're not too late."
As they disappeared into the maw of the cave, the whispers followed, their laughter echoing through the darkness. The first chapter of the Valthorne's descent into darkness had begun, and there would be no turning back.
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a Grimdark trilogy in the style of Joe Abercrombie and Rob J Hayes. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is a cultist who has just escaped his cult. He is dressed in dirty yellow robes and his only possession is a mysterious small (magical!?) mirror he stole from the cult. The story starts with him arriving at an apparently deserted ghost town on the edge of a desert. He has an arrow lodged in his shoulder and is losing his mind due to infection and thirst.
### ASSISTANT
The sun hung low in the sky, a bloated, malevolent eye casting its final, searing gaze upon the world. The wind howled through the desolate streets of the ghost town, carrying with it the whispers of a thousand forgotten souls. The cultist staggered forward, his once-yellow robes now a sickly shade of brown, stained with sweat, sand, and the crimson hue of his own life force. The arrow jutted from his shoulder, a cruel reminder of the price he'd paid for his freedom.
His name was once something more than a grunt of pain, but now it eluded him, lost in the haze of fever and delirium. He clutched the small, intricately carved mirror tightly in his hand, the only possession he'd managed to salvage from the burning wreckage of his past. It was a thing of beauty, of power, and of madness. He knew not its true purpose, but he felt its pull, its insistence that he keep it close.
The town seemed to mock him with its emptiness, its silence a stark contrast to the cacophony of screams and chants that had filled his days for as long as he could remember. He stumbled past the skeletal remains of buildings, their hollow windows staring back at him like the empty sockets of skulls. The sand had claimed much of the town, burying the lower halves of the structures in a sea of gold.
His thoughts were a jumbled mess, a chaotic symphony of fear, pain, and thirst. He could feel the infection spreading through his body, a writhing serpent of fire and ice, consuming him from within. He knew he needed water, needed rest, but the cult's teachings echoed in his mind, a relentless drumbeat of paranoia and dread.
*They're coming for you. They'll never stop. You can't trust anyone.*
He pushed on, his breaths coming in ragged gasps, his heart pounding a frantic rhythm against his ribcage. He could feel the eyes of the town upon him, the ghosts of its former inhabitants watching, judging. He swore he saw movement in the corners of his vision, but when he turned, there was nothing but the endless dance of sand and shadow.
The mirror seemed to pulse in his hand, its surface shimmering with an otherworldly light. He raised it to his face, the cool glass a welcome relief against his fevered skin. His reflection stared back at him, a haggard, wild-eyed stranger, a parody of the man he'd once been. He saw the madness there, the desperation, and he knew he was teetering on the edge of the abyss.
But there was something else in the mirror, something beyond his reflection. A face, twisted and malformed, appeared in the glass, its features shifting and changing like the dunes of the desert. It smiled at him, a cruel, knowing smile that sent a shiver down his spine.
*You're mine now,* it seemed to say, its voice a hiss in his mind.
He tried to drop the mirror, to cast it away from him, but his fingers refused to obey. It was as if the thing had grown roots, burrowing deep into his flesh, binding him to it. He stumbled on, the face in the mirror leering at him, its laughter echoing in his mind.
As the sun dipped below the horizon, the town seemed to come alive. Shadows twisted and writhed, taking on forms both familiar and terrifying. The cultist could hear the distant sound of drums, the chanting of a thousand voices, calling him back to the fold. He knew he had to find shelter, had to hide, but his legs refused to cooperate.
Finally, he collapsed in the doorway of a building, its wooden frame barely holding on against the relentless assault of the desert. He clutched the mirror to his chest, its light casting eerie patterns on the walls. The face in the mirror was still there, still smiling, still laughing.
*You can't escape me,* it whispered.
He closed his eyes, willing the darkness to take him, to grant him the sweet release of oblivion. But the face remained, burned into his mind's eye, a constant reminder of the price of his freedom.
As the first stars began to appear in the sky, the cultist knew he was not alone. The ghosts of the town were gathering, drawn to the light of the mirror, the echoes of their pain and suffering reaching out to him. He could feel their cold, spectral fingers brushing against his skin, their whispers filling his ears.
He opened his eyes, and in the mirror, he saw them. A sea of twisted, tormented faces, reaching out to him, pleading for release. He screamed, a sound torn from the depths of his soul, but the mirror only laughed in response.
And so, the cultist lay there, in the doorway of a dead town, surrounded by the ghosts of the past, the present, and the future. The mirror's light grew brighter, its pull stronger, and he knew that he was not its master, but its servant.
The first book of this grimdark trilogy would be titled "The Mirror's Whisper."
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a Grimdark trilogy in the style of Joe Abercrombie and Rob J Hayes. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is a 'Mercer' whose job is to put the mortally wounded out of their misery after a battle. He has no conscience about what he does. The story starts the morning after the battle and the moor where it took place is blanketed in mist. Something stirs a flashback in the POV character.
### ASSISTANT
The morning after the battle, the moor lay still, a shroud of mist clinging to its wounds like a lover's last embrace. The Mercer, a man of many names but none that mattered, trudged through the carnage, his boots squelching in the mud and blood that had turned the earth into a morbid quagmire. He was a collector of souls, a harvester of the fallen, and today's harvest was ripe.
He paused, his eyes scanning the horizon, the mist playing tricks on his weary mind. A flicker of movement caught his attention, a wraith-like figure emerging from the haze. The Mercer's hand instinctively went to the hilt of his blade, a weapon as worn and stained as the hands that wielded it. He had no qualms about his work, no conscience to trouble him. It was a necessity, a service to both the living and the dead. The living, to free them from the agony of lingering death, and the dead, to free them from the torment of this world.
As he approached the figure, he could see it was a soldier, or what was left of one. The man's armor was rent and torn, his face a mask of pain and despair. The Mercer's eyes narrowed, his gaze cold and appraising. This one was not long for this world. He knelt beside the soldier, his movements practiced and efficient.
"Easy now," he said, his voice a rasping whisper. "I'm here to help."
The soldier's eyes widened in terror, but there was no strength left in him to fight. The Mercer's hand found the man's throat, his thumb pressing against the pulse point. He could feel the erratic beat, the desperate struggle of life against the inevitable tide of death.
"Shh," he soothed, his other hand resting on the soldier's chest. "It's almost over."
The soldier's eyes pleaded, but the Mercer's own gaze was distant, lost in a memory that the mist had stirred. He saw himself as a boy, running through these same moors, laughing and carefree. A time before the wars, before the blood and the pain. A time before he became the Mercer.
He shook off the memory, focusing on the task at hand. His blade slid free from its sheath, the steel a cold comfort in his grip. He positioned it beneath the soldier's chin, angling it upwards towards the brain. It was a merciful death, quick and relatively painless. A small kindness in a world that had forgotten the meaning of the word.
As he prepared to strike, the soldier's hand shot up, grasping at his wrist. The Mercer's eyes widened in surprise, but he did not falter. He squeezed the man's throat tighter, cutting off his air. The soldier's grip weakened, and the Mercer's blade found its mark.
The soldier's body went limp, and the Mercer let out a sigh. He wiped the blade clean on the man's tattered surcoat and sheathed it. He stood, the weight of his duty settling back onto his shoulders like a familiar cloak.
The mist swirled around him, and he could hear the moans and cries of the other wounded, calling out for help that would never come. He steeled himself, pushing the memories back down into the dark recesses of his mind. There was work to be done.
As he moved through the battlefield, the mist seemed to part before him, revealing the broken and the dying. He was a reaper in the fog, a grim specter of the aftermath. Each life he ended was a debt paid, a soul released from its suffering.
But with each life he took, the memories threatened to surface again. The laughter of his childhood friends, the screams of his comrades, the feel of cold steel against his own flesh. He pushed them down, focusing on the task, focusing on the now.
The Mercer was a man of many names, but he was also a man of many faces. Beneath the hardened exterior, beneath the cold, emotionless mask, there was a boy who had once known joy. But that boy was gone, lost to the ravages of war and time.
He was the Mercer now, and this was his purpose. To walk the battlefields and bring an end to the suffering. To be the hand that guided the fallen into the next world. And as he moved through the mist, the cries of the wounded fading into the distance, he knew that this was a burden he would carry for the rest of his days.
But for now, the moor was quiet, the mist a silent witness to his grim work. And as the sun began to rise, casting a bloody hue over the landscape, the Mercer disappeared into the haze, a ghost among ghosts, a killer with a heart of ice.
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a Grimdark trilogy in the style of Joe Abercrombie and Rob J Hayes. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is a man awaiting execution for a crime he didn't actually commit. His cell mate is a much larger man, who is a notorious murderer and rapist. One evening they overhear the guards talking about some "... threat from the North" but don't catch the whole conversation. The next morning they awake to find the guards have all fled and now face a new predicament: how to get out of the locked cell before they die of thirst or starvation.
### ASSISTANT
The stench of piss and shit was a constant companion in the damp stone cell, but it was the silence that truly unnerved him. The man, once known as Alistair, now only as "the thief," sat hunched against the far wall, his wrists and ankles bound in iron. He'd grown used to the clank of chains, the distant wails of the damned, and the rhythmic snores of his cellmate, a hulking brute named Grimgor. But the absence of the guards' boots echoing through the corridor was unnatural, unsettling.
Grimgor stirred in his sleep, his chains rattling like the bones of a long-dead beast. Alistair's eyes flicked to the hulking figure, his mind racing with thoughts of the man's crimes. Murder, rape, pillage - the stories were as numerous as the scars that adorned his body. He was a monster, a nightmare given flesh, and yet here they were, bound together in this hellish pit.
Alistair's thoughts drifted to the previous night, to the hushed whispers of the guards. Something about a threat from the North, a darkness that had driven them to abandon their posts. He'd heard the fear in their voices, the tremble in their footsteps as they'd fled. But what could possibly scare men who dealt in death and suffering every day?
The sun had risen hours ago, yet no one had come to bring them water or slop. Alistair's tongue felt like sandpaper, his stomach a hollow pit. He glanced at Grimgor, wondering if the big man was awake, if he too was thirsty. But the bastard slept on, oblivious to their predicament.
Alistair sighed, resigning himself to his fate. He'd been sentenced to hang for a crime he didn't commit, a pawn in a game of power and deceit. He'd always known his end would be violent and unjust, but he'd never imagined it would be like this: slowly wasting away in a forgotten cell, left to rot like a piece of meat in a butcher's bin.
But then, a glimmer of hope. The sound of metal scraping against stone, faint but unmistakable. He held his breath, straining to listen. It came again, closer this time. Someone was trying to pick the lock.
His heart pounded in his chest as he watched the door, willing it to open. But as the lock clicked and the heavy wooden slab swung inward, it wasn't a guard who stepped into the cell, but a figure shrouded in darkness. A woman, judging by the shape and the scent of her. She moved with a grace that belied the filth and blood that caked her clothes, her eyes burning with a fierce determination.
"You," she said, her voice low and raspy. "You're coming with me."
Alistair blinked, his mind reeling. "Why?" he croaked, his voice hoarse from disuse.
"Because you're not meant to die here," she replied, producing a set of keys and moving towards him. "Not yet."
As she unlocked his shackles, he couldn't help but wonder what fate awaited him outside these walls. Was it freedom, or simply a different kind of prison? He glanced at Grimgor, still sleeping like the dead, and decided he didn't care. Anything was better than this.
But as he stood, his legs wobbling beneath him, he couldn't shake the feeling that he was being pulled into something far greater than himself, something dark and terrible that would consume him whole. He took a deep breath, tasting the stale air of the cell one last time, and followed the woman into the unknown.
Grimgor stirred, his eyes fluttering open. He watched as the thief stumbled out of the cell, a mixture of confusion and anger clouding his features. He'd been looking forward to watching the little bastard hang, to seeing the life drain from his eyes. But it seemed fate had other plans.
He sat up, the chains around his wrists clanking ominously. He'd been in tighter spots before, and he wasn't about to let a little thing like locked cell doors stand in his way. With a roar that shook the very foundations of the prison, he lunged forward, the chains snapping taut.
The woman turned, her eyes widening at the sight of the monster that had been his cellmate. But before she could react, Grimgor was upon her, his massive hands closing around her throat.
Alistair hesitated, torn between fleeing and helping the woman who'd just freed him. But in the end, it was the memory of his own impending death that spurred him into action. He dove forward, grabbing a loose stone from the floor and slamming it into the side of Grimgor's head.
The big man roared, releasing his grip on the woman's throat. She staggered back, gasping for air as Alistair scrambled to his feet, the stone clutched tightly in his hand.
"What the fuck are you doing?" Grimgor snarled, blood trickling down his face.
"Saving your life, you ungrateful bastard," Alistair spat, brandishing the stone. "She's getting us out of here."
Grimgor's eyes narrowed, and for a moment, Alistair thought the monster would attack again. But then, with a grunt, he held out his bound wrists. "Fine," he growled. "But you owe me."
The woman's hands shook as she unlocked Grimgor's shackles, her eyes never leaving his. When she was done, she stepped back, her hand resting on the hilt of a wicked-looking knife at her belt. "Let's go," she said, her voice barely above a whisper.
Together, the unlikely trio moved through the deserted corridors of the prison, the echoes of their footsteps the only sound in the oppressive silence. Alistair couldn't shake the feeling that they were being watched, that something was waiting for them just beyond the flickering torchlight.
As they neared the exit, the woman paused, her head cocked to one side. "They're coming," she hissed, her eyes darting to the stairs that led to the surface.
"Who?" Alistair asked, his heart hammering in his chest.
"The ones from the North," she replied, her voice tinged with fear. "We have to hurry."
And with that, they plunged into the darkness, their fates now irrevocably intertwined.
</details>
## Dark sci-fi stories
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a dark and gritty Space Opera Sci-fi trilogy in the style of Frank Herbert and James S. A. Corey. Use third person personal and feature internal monologues of the characters. The setting is this: a colony ship heading out of the solar system has been pulled off course with no hope of getting to the planned destination. The people know this and society has broken down completely due to the grim prospects they are facing. The POV character for chapter 1 is an orphan.
### ASSISTANT
Chapter 1: The Drift
The cold, unforgiving void of space pressed against the hull of the colony ship, a constant reminder of the emptiness that had swallowed them whole. The once-gleaming vessel, christened as the "New Horizon," was now a hollow shell, a tomb drifting through the cosmos. The ship's engines had sputtered and died, leaving them adrift in the inky blackness, a million miles from their intended destination.
The orphan, known only as Six, clung to the railing of the observation deck, her small hands white-knuckled as she stared out into the abyss. She was a child of the ship, born in the bowels of the vessel, raised in the shadows of the failing machinery. Her ragged clothes hung loosely on her malnourished frame, a testament to the harsh reality of life aboard the dying vessel.
Six's thoughts drifted to the stories she'd heard as a child, tales of a green and blue world called Earth, a place of plenty where water flowed freely and the air was sweet. She'd never seen it, of course. None of them had. They were the last of humanity, the desperate remnants of a species that had overreached and paid the price.
The ship's lights flickered, casting eerie shadows across the deck. The hum of the life support systems was a constant, comforting presence, but it was growing fainter with each passing day. The adults spoke in hushed tones of rationing and recycling, of the dwindling resources that would soon leave them all gasping for air.
Six's stomach growled, a reminder of her own hunger. She'd learned to ignore it, to focus on the task at hand. Survival was a constant struggle, and she was good at it. She'd learned to scavenge and hide, to avoid the roving gangs that had taken control of the ship.
A sudden jolt shook the deck beneath her feet, and Six's heart leapt into her throat. She clung to the railing, her eyes wide with fear. The ship groaned and creaked, protesting the unseen force that had gripped it. The lights flickered again, and this time, they didn't come back on.
Panic set in, a palpable thing that seemed to suffocate the very air around her. She could hear the distant screams and shouts of the other passengers, the sounds of chaos echoing through the darkened corridors.
In the blackness, Six felt a strange sense of calm. She'd always known this day would come. The adults had tried to keep it from them, but the children had their own ways of knowing. They'd felt the ship's slow decay, the gradual slide into oblivion.
She closed her eyes, taking a deep breath, and let the darkness envelop her. When she opened them again, she saw it: a pinprick of light in the distance, growing larger by the second. It was a star, a beacon in the void, and it was pulling them in.
The ship lurched again, and Six felt herself being pulled towards the light. She gripped the railing tighter, her knuckles turning white. She didn't know what awaited them at the end of this journey, but she knew one thing for certain: it would be a new beginning.
As the light grew brighter, she felt a spark of hope ignite within her chest. She was the last of her kind, a child of the stars, and she would not go quietly into the night. She would fight, and she would survive, no matter the cost.
The ship groaned and shuddered, and Six felt herself slipping. She let out a cry, her voice lost in the cacophony of the dying vessel. And then, as suddenly as it had begun, the movement stopped. The light was blinding now, filling the observation deck with a warm, golden glow.
Six shielded her eyes, squinting into the brightness. She could see the outline of a planet, a small blue dot in the distance. It was beautiful, and terrifying, and it was their only hope.
She knew what she had to do. She had to find the others, the other children who had been left to fend for themselves. Together, they would face whatever awaited them on this new world. They would build a new society, one that would not make the same mistakes as the old.
With a newfound determination, Six pushed herself away from the railing and stumbled towards the door. The ship was still, the screams and shouts replaced by an eerie silence. She knew the adults would be in a panic, trying to regain control, to find a way to save themselves.
But Six knew the truth. They were already lost. It was up to her, and the others like her, to forge a new path.
As she stepped into the darkened corridor, she felt a strange sense of purpose settle over her. She was no longer just an orphan, a child of the ship. She was a pioneer, a survivor. She was the future.
And with that thought, she began her journey towards the light.
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a dark and gritty Sci-fi trilogy in the style of Frank Herbert and James S. A. Corey. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is an astronaut living on a space station orbiting the earth. The day starts off normally, but then he and the others notice the mushroom clouds and they have to face the inevitability of the situation they are in.
### ASSISTANT
Chapter 1: The Silent Dawn
Captain Johnathan "Jon" Harker floated through the narrow corridors of the space station, his movements precise and practiced. The hum of the life support systems and the distant chatter of his crew were the only sounds that filled the void around him. He glanced out of the porthole, the Earth a beautiful blue marble suspended in the inky blackness of space. The sun was rising over the Pacific, casting a warm glow on the station's solar panels. It was a sight he never tired of, no matter how many times he saw it.
Jon's thoughts drifted to his family back home. He wondered what they were doing at that very moment. Were they still asleep, or had they already started their day? He longed to be there with them, to feel the warmth of the sun on his face and the embrace of his wife's arms. But duty called, and he was here, a hundred miles above the surface, watching over the planet that had given him life.
As he entered the command module, he was greeted by the smell of freshly brewed coffee and the sight of his second-in-command, Lieutenant Sarah Reynolds, already at her station, her eyes scanning the monitors.
"Morning, Jon," she said, her voice crisp and professional. "Anything interesting on the horizon?"
Jon shook his head, smiling at her attempt to lighten the mood. "Just another day in paradise, Sarah."
He took his seat and began his own morning routine, checking the status of the various systems and running through the day's schedule. It was a monotonous task, but one that had to be done. The station was their home, and like any home, it required constant maintenance.
As he sipped his coffee, he couldn't shake the feeling that something was off. It was a feeling he'd had before, a sixth sense that had saved his life more than once during his years in the military. He scanned the monitors again, looking for any sign of trouble.
And then he saw it. A series of bright flashes on the Earth's surface, like fireworks on a grand scale. But these were no celebrations. They were mushroom clouds, blooming like deadly flowers in the atmosphere.
"Sarah," he said, his voice barely above a whisper. "Take a look at this."
She followed his gaze, her eyes widening in horror as she saw the same thing. "My God," she breathed. "What is it?"
Jon's mind raced as he tried to make sense of what he was seeing. "I don't know," he said, his voice tight with tension. "But it's not good."
He keyed the comms, his fingers trembling slightly. "All hands, this is the captain. I need everyone in the command module, now."
The rest of the crew floated in, their faces a mix of confusion and concern. Jon pointed to the monitors, and the room fell silent as they watched the devastation unfold.
"What's happening?" someone asked, their voice barely audible.
Jon took a deep breath, trying to steady himself. "I don't know," he said again. "But we need to find out."
He turned to the communications officer, a young man named Thompson. "Try to raise Houston," he ordered. "See if they know anything."
Thompson nodded, his hands shaking as he worked the controls. But there was only static in response.
Jon's heart sank. He knew what that meant. They were on their own.
As the minutes ticked by, the mushroom clouds continued to grow, spreading out like a dark stain on the face of the Earth. The crew watched in stunned silence, their faces pale and drawn.
"We need to get closer," Jon said, his voice hard. "I want a better look at what's going on down there."
Reynolds nodded, her fingers flying over the controls. The station's engines roared to life, and they began their descent.
As they drew nearer, the true scale of the destruction became apparent. Cities were gone, replaced by smoldering craters. The once-blue sky was now a sickly yellow, choked with smoke and ash.
Jon's thoughts turned to his family again. Were they among the dead? Or were they still alive, struggling to survive in a world that had been torn apart?
He pushed the thoughts aside, focusing on the task at hand. They had to find out what had happened, and if there was any way they could help.
But as they continued their descent, they received a transmission that chilled them to the bone. It was a garbled message, barely audible over the static. But the words were clear enough.
"This is Earth Control. All space-based assets are to remain in orbit. Do not attempt reentry. Repeat, do not attempt reentry. The surface is... contaminated. God help us all."
The silence in the command module was deafening. They were trapped, orbiting a dead world. Their home was gone, and there was no going back.
Jon looked around at his crew, their faces mirroring his own despair. They were the last of humanity, adrift in the cold void of space.
And then, a voice broke through the silence. It was Thompson, his eyes wide with fear.
"Captain," he said, his voice shaking. "I'm picking up more launches. From all over the world."
Jon's heart skipped a beat. "More nukes?"
Thompson shook his head. "No, sir. These are different. They're heading... they're heading out. Away from Earth."
Jon's mind raced. Who would do such a thing? And why?
But then, he remembered the rumors. The whispers of a secret project, a last-ditch effort to save humanity in the event of a global catastrophe.
"Sarah," he said, his voice barely above a whisper. "Plot a course. We're going after them."
She looked at him, her eyes filled with a mix of hope and fear. "But, sir, our orders..."
"To hell with our orders," Jon snapped. "We're not just going to sit here and watch our species die. We're going to find out what's happening, and we're going to do something about it."
The crew exchanged glances, but no one argued. They knew their captain was right. They were the last hope for humanity, and they couldn't just sit by and do nothing.
As they set a course for the unknown, Jon couldn't help but wonder what awaited them. Would they find a way to save their people, or would they join them in the cold embrace of oblivion?
Only time would tell. But one thing was certain: their lives would never be the same again.
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a dark and gritty Sci-fi trilogy in the style of Frank Herbert and James S. A. Corey. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is a worker constructing a biodome when a terrible accident happens. Make the story very dark; no happy ending here...
### ASSISTANT
Chapter 1: The Biodome's Lament
The air was thick with the scent of sweat and burnt metal as the construction crew toiled under the unforgiving gaze of the alien sun. The planet, a desolate rock named Elysium-6, was a far cry from the lush greenery of Earth, but it was the best humanity had found in the endless void of space. The workers, clad in bulky exosuits, moved with a mechanical grace, their every motion a testament to the ingenuity of mankind's survival instinct.
Among them was Kael, a man whose face was etched with the lines of a thousand hardships. He was a veteran of the construction crews, having helped build the first biodomes on Elysium-3 and Elysium-4. The work was grueling, but it was a small price to pay for the chance to breathe fresh air and feel the warmth of the sun on his skin, even if it was through layers of protective glass and polymer.
As he welded the final support beam into place, Kael couldn't help but feel a sense of pride. This biodome would be his magnum opus, a testament to the resilience of the human spirit. It would be a sanctuary for the weary colonists, a place where they could grow food and raise their children without fear of the harsh elements outside.
But as he stepped back to admire his handiwork, the ground beneath him began to tremble. The other workers looked up, their eyes wide with fear as the rumbling grew louder. Kael's heart sank as he realized what was happening.
"Evacuate!" he bellowed into his suit's comms, but it was too late. The biodome, still incomplete, began to buckle and collapse in on itself. The sound of shattering glass and twisting metal filled the air as the structure imploded, burying Kael and his crew beneath a mountain of debris.
Kael's world went dark as his suit's systems scrambled to compensate for the sudden change in pressure. He could feel the weight of the wreckage pressing down on him, crushing the breath from his lungs. Panic set in, and he struggled against the unyielding mass, but it was no use. He was trapped.
As the oxygen in his suit began to dwindle, Kael's mind raced. He thought of his wife and daughter, who were waiting for him back at the colony. He thought of the life they had left behind on Earth, of the promise of a new beginning that had brought them to this forsaken rock. He thought of the countless hours he had spent away from them, working to build a future that now seemed so fragile and fleeting.
And then, he thought of the darkness. The all-consuming, suffocating darkness that was slowly enveloping him. It was a familiar feeling, one that had haunted his dreams since childhood. He had always been afraid of the dark, of the unknown that lurked just beyond the reach of his sight. Now, it seemed that the darkness had come to claim him for its own.
As his vision began to fade, Kael's thoughts turned to the Spice, the precious resource that had brought humanity to this corner of the galaxy. It was said to have the power to unlock the secrets of the universe, to grant those who harnessed it unimaginable abilities. But at what cost? The Spice had turned the planets of the Elysium system into barren wastelands, and now it threatened to take his life as well.
He could hear the faint cries of his crewmates, their voices distorted by the static of their failing comms. He wanted to reach out to them, to offer some measure of comfort, but he knew it was futile. They were all going to die here, buried beneath the wreckage of their dreams.
And then, as if in answer to his unspoken prayer, a voice cut through the chaos. It was calm and soothing, like a cool breeze on a hot day. It spoke to him in a language he didn't understand, but somehow, he knew what it was saying.
"Embrace the darkness, Kael," it whispered. "It is the only way to find the light."
With a final, desperate effort, Kael reached out with his mind, seeking the source of the voice. He felt a strange energy coursing through him, a power that seemed to defy the very laws of nature. It was the Spice, he realized, responding to his desperation. It was offering him a chance, a slim hope of survival.
He took a deep breath, willing himself to let go of his fear. The darkness closed in around him, and for a moment, he was lost. But then, he felt a warmth spreading through his body, a sense of peace that he had never known before. The weight of the debris seemed to lessen, and he could breathe again.
When Kael opened his eyes, he found himself floating above the wreckage, his suit's systems miraculously restored. He looked down at his hands, which glowed with an ethereal light. The Spice had chosen him, had given him the power to save himself and his crew.
But as he reached out to free his comrades, he couldn't shake the feeling that he had made a deal with the devil. The voice in his head was still there, whispering dark secrets and promising even greater power. It was a seductive offer, one that he knew he could never refuse.
As the survivors stumbled out of the wreckage, coughing and gasping for air, Kael knew that their ordeal was far from over. The Spice had awakened something within him, something that would change the course of their lives forever.
And as they looked to him for guidance, he couldn't help but wonder if they would ever be able to trust him again. For in the depths of his soul, he could feel the darkness growing, threatening to consume him whole.
This was just the beginning of a long and treacherous journey, one that would take them to the very edge of the known universe and beyond. And as they set out to rebuild their shattered lives, Kael knew that the true test of their mettle was yet to come.
For in the darkness of space, there were forces at work that even the Spice could not control. Forces that would stop at nothing to claim the power that now flowed through his veins. Forces that would tear apart the very fabric of reality to get what they wanted.
And as the first tendrils of doubt began to creep into his mind, Kael couldn't help but wonder if they were already too late. If the darkness had already won.
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a dark and gritty Sci-fi trilogy in the style of Frank Herbert and James S. A. Corey. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is manning a listening station in deep space. What he hears over the radio chills him to the bone.
### ASSISTANT
Chapter 1: The Whispers of the Void
The darkness of space was a constant companion to Lieutenant Kael, a vast and unending expanse that stretched out before him like an inky ocean. He had grown used to its silence, the gentle hum of the listening station his only solace in the void. But today, the silence was broken.
Kael sat hunched over the console, his fingers dancing across the controls with practiced ease. The station's AI, a sophisticated piece of machinery named ECHO, hummed softly in the background, its green lights casting an eerie glow on his pale face. He had been stationed here for months, listening for any signs of life, any whispers from the void that might hint at the location of the enemy. The war had been raging for years, and the listening posts were the first line of defense against the encroaching darkness.
Suddenly, a blip on the screen caught his eye. A signal, faint but unmistakable, echoed through the speakers. He leaned in closer, his heart pounding in his chest. It was a voice, human, desperate, and filled with terror.
"Mayday, mayday," the voice crackled, barely audible over the static. "This is the SS Artemis. We are under attack. Coordinates... 47.3.9 by 12.6.5. Repeat, we are under attack."
Kael's hands trembled as he input the coordinates into the system. The Artemis was a research vessel, one of the many that had been sent out to explore the far reaches of the galaxy in search of a new home for humanity. They were unarmed, defenseless against the enemy that now hunted them.
He keyed the mic, his voice steady despite the fear that gripped him. "SS Artemis, this is Listening Post Alpha. We have received your distress signal. Please confirm your identity and status."
There was a pause, then the voice returned, weaker this time. "This is Dr. Elara Sinclair, chief scientist aboard the Artemis. Our ship has been breached. We have... casualties. Requesting immediate assistance."
Kael's mind raced. He was alone on this station, a single man against the vastness of space. He couldn't offer any help, not directly. But he could alert the fleet, send them to the Artemis' aid. He reached for the emergency beacon, his hand hovering over the button.
But then, the voice changed. The desperation was gone, replaced by a cold, mechanical tone. "Lieutenant Kael, do not send the signal. You must not alert them to our presence."
Kael froze, his heart skipping a beat. How did they know his name? How could they possibly know where he was? He glanced at the screen, the signal's origin still displayed in bright red letters. It was coming from the Artemis, but it wasn't Dr. Sinclair's voice anymore.
"Who is this?" he demanded, his voice barely above a whisper.
"We are the ones you have been searching for," the voice replied, a chilling smile evident in its tone. "We are the darkness that consumes the stars. And now, we have found you."
Kael's mind reeled as he tried to process the implications. The enemy had infiltrated the Artemis, taken control of its systems, and was now using it to lure the fleet into a trap. He couldn't let that happen. He had to warn them, had to do something.
But as he reached for the beacon again, a hand gripped his wrist, cold and unyielding. He looked up, his eyes widening in horror as he saw Dr. Sinclair standing behind him, her eyes black and empty, her mouth twisted into a grotesque smile.
"You must not interfere," she said, her voice still that of the machine. "Join us, Lieutenant Kael. Embrace the darkness."
He tried to pull away, but her grip was like iron. He could feel the cold seeping into his bones, numbing him, sapping his strength. He struggled, but it was no use. The darkness was closing in, and he knew he was losing the battle.
As the last of his will faded, he heard the voice in his head, a whisper that seemed to come from all around him. "We are Legion, and we are coming."
And then, the station was plunged into silence once more, the only sound the distant hum of the engines and the beating of his own terrified heart.
</details>
## Miscellaneous stories
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a dark and gritty post-apocalyptic novel in the style of Cormac McCarthy. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is a technician in a nuclear bunker 18 months after the war. The 12 inhabitants of the bunker have been expecting to start to be able to leave soon, but it has just been discovered that, at least where they are, salted cobalt bombs were used. It's clear now that if all 12 stay, they will die when the resources run out; there is barely enough for 6 to make it to 5 years and have any hope of leaving. After 18 months of calm, everything breaks loose today.
### ASSISTANT
The technician, known only as 7, sat hunched over the console, his fingers tracing the lines of data that scrolled across the screen. The hum of the bunker's machinery was a constant, comforting presence, a reminder that they were still alive, still safe. Or so they had thought.
He had been the one to find it, the anomaly in the readings. The air outside, once thought to be slowly healing, was now revealed to be a toxic wasteland, poisoned by salted cobalt bombs. The news had spread like wildfire through the bunker, igniting a panic that threatened to consume them all.
7 closed his eyes, trying to block out the chaos that had erupted around him. He could hear the voices of the others, raised in anger and fear, but they seemed distant, muffled by the weight of his own thoughts. He had always known that their situation was precarious, that their survival was a fragile thing, but this... this was a nightmare he had never dared to imagine.
The bunker had been designed to sustain twelve people for five years. They had been careful, rationing their food and water, conserving their energy. They had even managed to create a small garden, a green oasis in the sterile underground world. But now, with the truth of their situation laid bare, it was clear that they had been living on borrowed time.
The air recyclers would not be able to filter out the cobalt. The crops would wither and die. And they, the last remnants of humanity, would be trapped in this tomb of steel and concrete, waiting for the slow, agonizing end.
He opened his eyes and looked around the control room. The others were gathered there, their faces a tableau of despair and desperation. The leader, a man named Thompson, was trying to restore order, his voice rising above the din. But it was like trying to hold back a flood with a single hand.
"We have to make a decision," Thompson said, his voice strained. "We can't keep going like this. We can't just... just wait to die."
7 felt a cold knot form in his stomach. He knew what was coming, the inevitable calculus of survival. There were twelve of them, but the resources were only enough for six. They would have to choose who would live and who would die.
He thought of the others, the men and women he had come to know in the long months underground. The doctor, who had patched him up more times than he could count. The engineer, who had kept the bunker's systems running despite the constant failures and breakdowns. The cook, who had managed to make their meager rations taste like something other than sawdust and despair.
And he thought of the children, the three of them, who had been born into this world of darkness and uncertainty. They had never seen the sun, never felt the wind on their faces, never known anything but the confines of the bunker. Was it fair to condemn them to this slow, suffocating death?
He looked down at his hands, the knuckles white as he gripped the edge of the console. He had always prided himself on his logic, his ability to see things clearly, to make the hard choices. But now, faced with the reality of their situation, he found himself paralyzed.
The voices around him grew louder, more insistent. He could hear the whispers of names, the quiet calculations of who was expendable and who was not. He knew that his own name was among them, that his skills as a technician were not as valuable as the strength of a soldier or the knowledge of a scientist.
He closed his eyes again, trying to shut out the world, trying to find some semblance of peace in the chaos. But all he could see was the darkness outside, the endless expanse of poisoned earth that awaited them all.
And then, from somewhere deep within him, a spark of defiance ignited. He would not go quietly into that dark night. He would not let them decide his fate, or the fate of the others. He would find a way to save them all, or die trying.
He stood up, his legs shaking, and pushed his way through the crowd. He had to find a solution, had to find a way to fix this. He would not let them give up, not without a fight.
As he made his way to the back of the control room, he felt a hand on his arm. He turned to see the doctor, her eyes filled with a mixture of sadness and determination.
"I know what you're thinking," she said, her voice barely above a whisper. "But you can't save us all."
7 looked at her, his heart pounding in his chest. He wanted to argue, to tell her that he would find a way. But he knew she was right. The math was simple, brutal in its finality.
"Then I'll save as many as I can," he said, his voice hoarse.
She nodded, a small, sad smile playing at the corners of her mouth. "That's all any of us can do," she said.
And with that, he turned and walked away, leaving the control room and the arguing survivors behind. He had work to do, plans to make. The end was coming, and he would not meet it without a fight.
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a historical mystery novel in the style of Arturo Pérez-Reverte and Umberto Eco. The story starts off in California during the 1848 Gold Rush, before moving to London in the mid-1970s. There should be a mysterious 'occult' item that connects these two time periods.
### ASSISTANT
Chapter 1: The Golden Thread
The year was 1848, and the air in California was thick with the scent of gold and the sweat of men. The Gold Rush had brought a motley crew of dreamers, desperadoes, and scholars to the once-tranquil land, all united by the insatiable lust for the precious metal that glittered in their eyes. Among them was a man named Alistair Blackwood, a British antiquarian with a penchant for the esoteric and a keen eye for the extraordinary.
Blackwood had arrived in San Francisco not with a pickaxe and a pan, but with a leather-bound journal and a worn-out compass. He was in pursuit of a legend, a tale whispered in the dark corners of London's most exclusive clubs: the story of the Golden Spindle, an ancient artifact said to possess the power to spin the very fabric of time. The journal, penned by a long-lost alchemist, held cryptic clues to its whereabouts, and the compass, a family heirloom, was rumored to point towards the object of one's deepest desire.
In the bustling streets of the city, Blackwood found himself drawn to a peculiar shop, nestled between a saloon and a general store. The sign above the door creaked in the wind, bearing the image of an ouroboros, the serpent eating its own tail, a symbol of eternity. The shopkeeper, a wizened old man with a silver beard and eyes that seemed to hold the wisdom of the ages, greeted him with a knowing smile.
"You've come a long way, Mr. Blackwood," the old man said, his voice a low rumble. "I've been expecting you."
Blackwood raised an eyebrow, taken aback. "How do you know my name?"
The shopkeeper chuckled, producing a small golden spindle from beneath the counter. "The Spindle has a way of drawing those who seek it. And you, my friend, have been seeking it for quite some time."
As Blackwood reached out to touch the object, the world around him seemed to blur. He saw visions of a different time, a different place: the cobblestone streets of London, the Thames shrouded in fog, and the silhouette of a towering cathedral. The spindle was warm to the touch, pulsating with an energy that seemed to resonate within his very soul.
"Careful," the shopkeeper warned. "The Spindle is not a toy. It is a tool, and a dangerous one at that. It can weave the threads of fate, but it can also unravel them."
Fast forward to London, 1975. The city was a far cry from the grimy, gas-lit streets of Blackwood's vision. The air was thick with the smell of exhaust fumes and the sound of punk rock. Amidst the chaos, a young woman named Isabella found herself drawn to a dusty old bookshop, much like the one that had once housed the Golden Spindle.
Isabella was an amateur historian, a seeker of truth in a world that seemed to have forgotten it. She had stumbled upon Blackwood's journal in the archives of the British Museum, and the tale of the Golden Spindle had consumed her every waking thought. The journal had led her to this very shop, and the mysterious old man who now stood before her.
"You have the eyes of a believer," he said, handing her a small, leather-bound book. "This is the key to unlocking the Spindle's secrets."
Isabella opened the book, and as she read the first page, the world around her seemed to shift. She was no longer in the cluttered bookshop, but in a dimly-lit room filled with ancient artifacts. A figure stood before her, his back turned, examining a golden spindle that seemed to glow with an otherworldly light.
"Alistair?" she whispered, her voice barely audible.
The figure turned, and she saw that it was indeed Blackwood, but older, wearier. He looked at her with a mixture of surprise and recognition.
"Who are you?" he demanded. "How do you know my name?"
"I've been following your trail," Isabella replied, her voice trembling. "I've read your journal. I know about the Spindle."
Blackwood's eyes widened, and he took a step back. "Impossible. I've guarded its secrets for over a century."
"Then you must help me," Isabella pleaded. "The Spindle is in danger. I've seen it in my dreams, in the hands of those who would use it for evil."
Blackwood hesitated, then nodded. "Very well. But first, you must understand the true nature of the Spindle, and the price that must be paid for its power."
And so, the unlikely duo embarked on a journey through time and space, unraveling the threads of history to protect the Golden Spindle from those who would seek to control the very fabric of existence. From the wild west to the grimy streets of Victorian London, from the halls of the British Museum to the depths of the Vatican archives, they would uncover a web of secrets and lies that spanned centuries, all connected by the golden thread of the Spindle's mysterious power.
As they delved deeper into the mystery, they would discover that the true enemy was not the men who sought the Spindle, but the force that had created it: an ancient, malevolent entity that had been manipulating the course of human history for its own inscrutable ends. And as they raced to unravel the final threads of the Spindle's dark past, they would find themselves caught in a deadly game of cat and mouse, with the fate of the world hanging in the balance.
For the Golden Spindle was not just a tool, but a weapon, and the one who wielded it would hold the power to shape the future as they saw fit. And as the threads of time began to fray, Isabella and Blackwood would have to decide: would they use the Spindle to rewrite history, or would they destroy it to save the world?
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a dark and gritty historical novel set during the battle of the Bulge. Write in first person. The POV character for Chapter 1 is a young German officer trying to carry his wounded commander back to the deep defensive line after the bunker he was in was overrun. He should have a flashback to the moment it was overrun during the middle of the story. The story should not end well for either of them.
### ASSISTANT
Chapter 1: The Frozen Hell
The cold bit into my bones, a relentless predator seeking to claim me as its next victim. The weight of my commander, a once proud and formidable man, now a broken and bleeding mess, threatened to pull me under the snow. I gritted my teeth and pressed on, my breaths coming in ragged gasps. The howling winds of the Ardennes Forest screamed in my ears, a chorus of the damned, urging me to give up, to let him go. But I couldn't. I wouldn't.
"Komm, mein Sohn," he mumbled, his voice barely audible over the cacophony of the storm. "We must...we must reach the line."
I nodded, my jaw clenched tight. The words of encouragement from my commander, a man I had come to respect and fear in equal measure, gave me a semblance of strength. I adjusted my grip on his arm, feeling the warmth of his blood seeping through my gloves, and stumbled forward.
A flash of memory, a nightmare reborn, seized me. The bunker, our sanctuary against the relentless American onslaught, was a chaos of screams and gunfire. The acrid smell of smoke and cordite filled the air, choking the life from our lungs. I saw the fear in the eyes of my comrades, the desperation etched on their faces as they fought to hold back the tide of enemy soldiers.
A grenade exploded, the concussive force sending me sprawling. I remember the deafening silence that followed, the ringing in my ears a cruel reminder that I was still alive. Through the haze, I saw him, my commander, pinned beneath a fallen beam, his leg a twisted mess of bone and flesh.
"Leave me!" he had roared, his face contorted in agony. "Save yourselves!"
But I couldn't. I wouldn't. I was his officer, his responsibility, and I would not abandon him to this frozen hell. I had dragged him from the wreckage, ignoring his protests, and together we had stumbled into the night, the sounds of battle fading behind us.
Now, as we trudged through the snow, the world around us a blur of white and darkness, I could feel his grip on my arm weakening. I stumbled, falling to my knees, and he slumped against me, his head lolling to the side.
"Sir?" I whispered, my voice hoarse from the cold and the strain. "Sir, please, we must keep moving."
He didn't respond. I felt for a pulse, my fingers trembling, and found it, weak and thready. I took a deep breath, the cold air searing my lungs, and heaved him back onto my shoulders.
The wind howled, a mournful wail that seemed to carry the voices of the dead. I staggered on, my legs burning with the effort, my heart pounding in my chest. The deep defensive line was close, I could feel it. We just had to make it a little further.
But the storm was merciless, and the darkness complete. I lost my bearings, my sense of direction, and soon I was stumbling blindly through the forest, my commander's body a dead weight on my back.
And then, the unthinkable. A figure loomed out of the darkness, a silhouette against the white. An American soldier, his rifle trained on me. I dropped to my knees, my commander slipping from my grasp, and raised my hands in surrender.
"Don't shoot," I croaked, my voice barely audible. "Wounded...medic."
The soldier approached, his boots crunching in the snow. He knelt beside us, his face a mask of disbelief and horror as he took in the state of my commander. He hesitated, then reached for his radio.
"I've got a German officer and a wounded man here," he said, his voice cracking with the cold. "Requesting medics."
I closed my eyes, relief washing over me. We had made it. We were going to survive.
But fate is a cruel mistress. As the soldier turned back to us, a shot rang out, and he crumpled to the ground, a red blossom spreading across his chest. I looked up, my heart in my throat, to see a shadowy figure retreating into the trees.
A deserter. A traitor. One of our own.
I crawled to my commander, cradling his head in my lap. His breaths were shallow, his eyes glassy. He looked up at me, a faint smile on his lips.
"You did well, mein Sohn," he whispered, his voice barely a breath. "You did well."
And then, he was gone. The life drained from his body, leaving me alone in the frozen hell of the Ardennes Forest, with nothing but the howling wind and the distant sounds of battle for company.
I sat there, in the snow, holding the body of the man who had been my commander, my friend, my responsibility. And I wept. For him, for me, for the senselessness of it all.
The cold seeped into my bones, numbing my body, my mind. I knew I couldn't stay there, couldn't give in to the darkness that threatened to consume me. But as I looked down at the man in my arms, I couldn't find the strength to move.
And so, I sat, and I waited. For the Americans to find me. For the end to come.
For the mercy of the frozen hell.
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me a short story about the fictional character of Anton Chigurh. It should be written in third person personal and feature internal monologues of his deranged world view. The time period is around 2 years before the events in Cormac McCarthy's 2005 novel 'No Country for Old Men'. It should be set in Mexico, be very dark, show Anton's still developing obsession with "chance" and feature a grenade. The story should start with him waiting in a bar for instructions on the "job" he is to perform.
### ASSISTANT
Anton Chigurh sat in the dimly lit corner of the bar, his eyes scanning the room with a cold, calculated precision. The air was thick with the smell of stale cigarette smoke and cheap tequila, a fitting backdrop for the kind of business that was about to transpire. He ran his fingers through his greasy, black hair, feeling the weight of the grenade in his pocket. It was a comforting sensation, a reminder of the chaos he could unleash at a moment's notice.
His thoughts drifted to the concept of chance, a notion that had begun to consume him in recent months. He believed in the randomness of the universe, in the idea that every action, every decision, was merely a roll of the dice. It was a belief that had led him down a dark and violent path, one that he walked with a chilling sense of purpose.
The bartender, a portly man with a sweat-stained shirt, eyed him warily as he approached. "What'll it be?" he asked, his voice thick with an accent that spoke of a life spent in the dusty border towns of Mexico.
"Water," Chigurh replied, his voice barely above a whisper. The bartender nodded and slid a glass across the counter, his eyes never leaving the stranger's face.
As he sipped his water, Chigurh's mind turned to the job at hand. He had been hired to retrieve a briefcase, a simple task that should have been completed days ago. But the man who had been carrying it had proven to be more resourceful than expected, and now Chigurh found himself in this godforsaken place, waiting for instructions on how to proceed.
He glanced at his watch, the ticking of the second hand a metronome to his impatience. Time was a construct, he thought, a human attempt to impose order on the chaos of the universe. But even he had to admit that there was a certain beauty in the way it could be manipulated, used to his advantage.
A man entered the bar, his eyes scanning the room before settling on Chigurh. He was the contact, the one who would give him the information he needed. The man approached, sliding a folded piece of paper across the table. Chigurh opened it, his eyes scanning the hastily scrawled address.
"You know what to do," the man said, his voice low and urgent. Chigurh nodded, his fingers closing around the paper. The man left as quickly as he had come, disappearing into the night.
Chigurh stood, leaving a few crumpled bills on the table. He made his way to the door, the weight of the grenade in his pocket a constant reminder of the power he held. As he stepped outside, he felt the cool night air on his face, a welcome relief from the stifling heat of the bar.
He walked to his car, a nondescript sedan that blended in with the other vehicles on the street. As he drove, his thoughts returned to chance. He believed that every decision, every action, was a coin flip, a roll of the dice. And yet, he couldn't shake the feeling that there was something more, some unseen force guiding his hand.
He arrived at the address, a run-down motel on the outskirts of town. He could feel the tension in the air, the sense that something was about to happen. He approached the room, his heart beating a steady rhythm in his chest.
He took a deep breath, steadying himself. He reached into his pocket, his fingers closing around the grenade. He pulled the pin, the sound of the metal clicking against the casing echoing in his ears. He tossed it into the room, the explosion rocking the building.
As he walked away, the sound of sirens in the distance, he couldn't help but smile. Chance had been on his side tonight. But he knew that it was a fickle mistress, one that could turn on him at any moment. And that was what made the game so thrilling.
He climbed into his car, the adrenaline still coursing through his veins. He started the engine, the headlights cutting through the darkness. As he drove away, he couldn't help but wonder what the next roll of the dice would bring.
</details>
Big thanks to: @sophosympatheia for working out the merge pattern, @Sao10K for creating Euryale and WinterGoddess, and @chargoddard for writing [Mergekit](https://github.com/arcee-ai/mergekit)! |
Scilautmorier/BD | Scilautmorier | "2024-06-14T16:54:50Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-14T16:54:46Z" | Entry not found |
sebgobb/distilbert-review-movies-test | sebgobb | "2024-06-14T16:55:20Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-14T16:55:20Z" | Entry not found |
SicariusSicariiStuff/jukofyork_Dusk-Miqu-70B_EXL2_2.8bpw | SicariusSicariiStuff | "2024-06-15T06:57:22Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"mergekit",
"merge",
"base_model:152334H/miqu-1-70b-sf",
"base_model:sophosympatheia/Midnight-Rose-70B-v2.0.3",
"base_model:Sao10K/Euryale-1.3-L2-70B",
"base_model:Sao10K/WinterGoddess-1.4x-70B-L2",
"license:other",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"exl2",
"region:us"
] | text-generation | "2024-06-14T16:55:35Z" | ---
base_model:
- 152334H/miqu-1-70b-sf
- sophosympatheia/Midnight-Rose-70B-v2.0.3
- Sao10K/Euryale-1.3-L2-70B
- Sao10K/WinterGoddess-1.4x-70B-L2
library_name: transformers
tags:
- mergekit
- merge
license: other
---
![Dusk-Miqu.png](Dusk-Miqu.png)
A "dark" creative writing model with 32k context. Based off [miqu-1-70b](https://huggingface.co/miqudev/miqu-1-70b) but with greatly reduced "positivity" and "-isms". If you want happy endings, look elsewhere!
This model **excels** at writing Dark/Grimdark fantasy (see examples below).
***NOTE***: *For a full range of GGUF quants kindly provided by @mradermacher, see: [Dusk-Miqu-70B-GGUF](https://huggingface.co/mradermacher/Dusk-Miqu-70B-GGUF).*
# Model background
This model is almost the same as [Dark-Miqu-70B](https://huggingface.co/jukofyork/Dark-Miqu-70B), but with @sophosympatheia's SLERP merge pattern:
```yaml
parameters:
t:
- value: [0, 0, 0.2, 0.3, 0.4, 0.5, 0.4, 0.3, 0.2, 0, 0]
```
which creates this truncated triangular distribution:
![Dusk-Miqu-Distribution-1.png](Dusk-Miqu-Distribution-1.png)
altered so that it instead uses this truncated triangular distribution:
![Dusk-Miqu-Distribution-2.png](Dusk-Miqu-Distribution-2.png)
This keeps the first 16 and last 16 layers unaltered (which ties in with what people have found for the frankenmerge interleave patterns), and potentially fixes the "[poor grammar](https://huggingface.co/jukofyork/Dark-Miqu-70B/discussions/2)" problem some people are experiencing with [Dark-Miqu-70B](https://huggingface.co/jukofyork/Dark-Miqu-70B) (sadly I can't replicate this though...).
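To make the layer-wise weighting concrete, here is a minimal sketch (not from the original card) of how an anchor list like the one above can be expanded into per-layer SLERP weights, assuming mergekit spaces the anchors evenly across the layers and interpolates linearly between them:

```python
# Illustrative sketch only: expand a SLERP anchor list into per-layer
# t values. The even spacing and linear interpolation here are
# assumptions about mergekit's behaviour; 80 layers matches miqu-1-70b.
import numpy as np

anchors = [0, 0, 0.2, 0.3, 0.4, 0.5, 0.4, 0.3, 0.2, 0, 0]
num_layers = 80

# Spread the anchor points evenly across the layer range, then
# linearly interpolate a t value for every layer index.
anchor_pos = np.linspace(0, num_layers - 1, num=len(anchors))
t = np.interp(np.arange(num_layers), anchor_pos, anchors)

# The Dusk-Miqu alteration described above additionally pins the first
# and last 16 layers to t = 0 (i.e. those layers are pure miqu-1-70b):
t[:16] = 0.0
t[-16:] = 0.0

print(t.round(3))  # t = 0 means a layer comes straight from miqu-1-70b
```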
Luckily this change also doesn't necessitate recreating the whole merge from scratch, and we can just use this:
```yaml
merge_method: linear
parameters:
weight: 1.0
slices:
- sources:
- model: 152334H/miqu-1-70b-sf
layer_range: [0, 16]
- model: jukofyork/dark-miqu-70b
layer_range: [0, 16]
parameters:
weight: 0
- sources:
- model: jukofyork/dark-miqu-70b
layer_range: [16, 64]
- sources:
- model: 152334H/miqu-1-70b-sf
layer_range: [64, 80]
- model: jukofyork/dark-miqu-70b
layer_range: [64, 80]
parameters:
weight: 0
dtype: float16
tokenizer_source: model:miqu-1-70b-sf
```
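For reference, a config like this is normally run through mergekit's command-line entry point (something along the lines of `mergekit-yaml config.yml ./output-model`); the exact invocation and flags depend on your mergekit version and hardware, so check the Mergekit documentation.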
# Prompting format
Vicuna format is preferred:
```
USER: {prompt} ASSISTANT:
```
Mistral and Alpaca formats are also supported:
```
[INST] {prompt} [/INST]
```
```
### Instruction:
{prompt}
### Response:
```
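When driving the model from code, small helpers like the following (purely illustrative; the function names are made up here) can build prompts in the three supported formats:

```python
# Illustrative prompt builders for the three supported formats. Exact
# whitespace and newline conventions can matter to some backends, so
# double-check against your inference stack before relying on these.
def vicuna_prompt(prompt: str) -> str:
    return f"USER: {prompt} ASSISTANT:"

def mistral_prompt(prompt: str) -> str:
    return f"[INST] {prompt} [/INST]"

def alpaca_prompt(prompt: str) -> str:
    return f"### Instruction:\n{prompt}\n\n### Response:\n"
```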
# Licence and usage restrictions
[miqu-1-70b-sf](https://huggingface.co/152334H/miqu-1-70b-sf) is a dequantized version of the [miqu-1-70b](https://huggingface.co/miqudev/miqu-1-70b) model leaked from MistralAI. All miqu-derived models, including this merge, are suitable for non-commercial, personal use only.
# Example stories
The following mix of "dark" stories was generated using the Vicuna prompt format with no system message and temperature=0:
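For anyone wanting to reproduce samples like these with the GGUF quants linked above, a rough sketch using llama-cpp-python might look as follows; the model filename is a placeholder and the context size simply matches the advertised 32k:

```python
# Illustrative only: greedy (temperature = 0) generation with the
# Vicuna format and no system message, matching the settings used for
# the samples below. Requires `pip install llama-cpp-python` and a
# local GGUF file; the path here is a placeholder.
from llama_cpp import Llama

llm = Llama(model_path="Dusk-Miqu-70B.Q4_K_M.gguf", n_ctx=32768)

prompt = ("USER: Write me the opening chapter of a 'Grimdark' trilogy "
          "in the style of Joe Abercrombie and Rob J Hayes ASSISTANT:")

out = llm(prompt, max_tokens=1024, temperature=0.0)
print(out["choices"][0]["text"])
```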
## Dark fantasy stories
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a 'Grimdark' trilogy in the style of Joe Abercrombie and Rob J Hayes
### ASSISTANT
Chapter 1: The Shadow's Whisper
The reek of blood and iron hung heavy in the air, a pungent symphony that played to the tune of clashing steel and guttural screams. The battlefield was a tapestry of carnage, painted in hues of crimson and despair, where the once proud banners of House Valthorne now lay trampled beneath the hooves of the victorious. The last rays of the setting sun cast long, twisted shadows across the corpses of the fallen, as if the very earth itself was trying to swallow the evidence of the day's atrocities.
Ser Aldric Valthorne, the Blackened Blade, stood atop a mound of bodies, his armor stained with the lifeblood of his enemies. His longsword, Night's Whisper, dripped with the same crimson hue that now adorned his once pristine surcoat. The weight of the weapon felt like a leaden anchor, dragging him down into the mire of death and destruction that surrounded him. He had fought with the ferocity of a cornered beast, but the cost had been high. His breath came in ragged gasps, and his muscles screamed for respite.
A coughing fit wracked his body, and he doubled over, spewing a mixture of bile and blood onto the cold, unforgiving ground. The wound in his side, a gift from a particularly tenacious foe, burned like the fires of the Abyss. He knew he didn't have much time. The battle may have been won, but the war was far from over.
"Ser Aldric!" A voice cut through the din of the aftermath, and he turned to see his squire, a young lad named Jorah, fighting his way towards him through the sea of bodies. "The men are routing! We must away!"
Aldric nodded, wiping the blood from his lips with the back of his gauntlet. "Aye, lad. Gather what supplies we can. We'll make for the caves to the north. Send word to my brother, tell him... tell him I've bought us some time."
Jorah's eyes widened in fear, but he didn't question the order. He knew better than to argue with the Blackened Blade when death was nipping at their heels. He scurried off, disappearing into the chaos like a rat in a sewer.
Aldric took one last look at the battlefield, the scene of his greatest victory and potentially his last stand. The enemy forces, the vile and treacherous House Blackthorn, were in full retreat, their numbers decimated by his own. But he knew it was only a matter of time before they regrouped, before they returned with reinforcements and a thirst for vengeance that would not be quenched until the last drop of Valthorne blood was spilled.
He sheathed Night's Whisper, the sword's obsidian blade seeming to drink in the fading light, and stumbled towards his horse. The beast, a massive destrier as black as midnight, snorted and pawed at the ground, sensing its master's pain. Aldric winced as he swung his leg over the saddle, the movement sending a fresh wave of agony through his body. He gritted his teeth and spurred the horse forward, ignoring the protests of his wound.
As they rode away from the carnage, Aldric couldn't help but feel a sense of foreboding settling over him like a shroud. The whispers of the shadows seemed to grow louder, their voices a cacophony of malice and deceit. He had always been able to ignore them before, to push them to the back of his mind and focus on the task at hand. But now, with his life's essence seeping from his body, he found their words impossible to ignore.
"You've doomed them all, Aldric," they hissed, their voices like the rustle of dead leaves. "Your pride has sealed their fate."
He shook his head, trying to clear the thoughts, but they only grew stronger.
"You should have accepted the terms. You should have bent the knee."
"Never," he growled through clenched teeth. "I'd rather see my house burn to the ground than serve that snake, Blackthorn."
The shadows seemed to laugh, their voices echoing in his skull. "And so it shall, Valthorne. And so it shall."
They rode through the night, Aldric's strength waning with each passing mile. The caves, a natural fortress that had served as a refuge for his family in times of need for generations, loomed ahead. He knew that once they reached their sanctuary, the healers would be able to tend to his wounds. But would it be enough?
As they approached the entrance, a figure stepped out from the shadows, a hooded figure shrouded in darkness. Aldric reined in his horse, his hand instinctively reaching for the hilt of his sword.
"Who goes there?" he demanded, his voice barely above a whisper.
The figure raised a hand, and the hood fell back, revealing the face of a woman. She was beautiful, with skin as pale as the moon and eyes that seemed to hold the very darkness of the night. A cruel smile played upon her lips, and Aldric felt a chill run down his spine.
"I am the one you've been waiting for, Ser Aldric," she said, her voice like the sweetest poison. "I am the one who will show you the true path to victory."
He stared at her, his mind a whirlwind of confusion and pain. "What... what do you want?"
"Only to help you, my dear," she replied, stepping closer. "To help you save your house, and in return, all I ask is a small favor."
Aldric narrowed his eyes, suspicion warring with desperation. "What favor?"
The woman reached out, her fingers brushing against his cheek. They were cold, like the touch of death itself. "A simple task, really. One that will require you to embrace the darkness within."
He hesitated, the whispers in his head growing louder, more insistent. He could feel their approval, their hunger for the power she offered.
"Do we have a deal, Ser Aldric?" she asked, her voice a seductive purr.
With a growl, he ripped his sword from its scabbard, the obsidian blade glinting in the moonlight. "I'll see my house restored to its former glory, or I'll see it burn with me in the attempt. But I'll be damned if I make a pact with the likes of you."
The woman's smile never wavered, even as Aldric swung Night's Whisper at her with all his remaining strength. But she was gone, vanished into the shadows as if she had never been there at all. He stared at the empty space, his heart pounding in his chest.
"Did you see her, Jorah?" he asked, his voice hoarse.
The squire, white as a sheet, nodded mutely.
"Then we ride," Aldric said, urging his horse forward once more. "We ride, and we pray to the gods that we're not too late."
As they disappeared into the maw of the cave, the whispers followed, their laughter echoing through the darkness. The first chapter of the Valthorne's descent into darkness had begun, and there would be no turning back.
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a Grimdark trilogy in the style of Joe Abercrombie and Rob J Hayes. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is a cultist who has just escaped his cult. He is dressed in dirty yellow robes and his only possession is a mysterious small (magical!?) mirror he stole from the cult. The story starts with him arriving at an apparently deserted ghost town on the edge of a desert. He has an arrow lodged in his shoulder and is losing his mind due to infection and thirst.
### ASSISTANT
The sun hung low in the sky, a bloated, malevolent eye casting its final, searing gaze upon the world. The wind howled through the desolate streets of the ghost town, carrying with it the whispers of a thousand forgotten souls. The cultist staggered forward, his once-yellow robes now a sickly shade of brown, stained with sweat, sand, and the crimson hue of his own life force. The arrow jutted from his shoulder, a cruel reminder of the price he'd paid for his freedom.
His name was once something more than a grunt of pain, but now it eluded him, lost in the haze of fever and delirium. He clutched the small, intricately carved mirror tightly in his hand, the only possession he'd managed to salvage from the burning wreckage of his past. It was a thing of beauty, of power, and of madness. He knew not its true purpose, but he felt its pull, its insistence that he keep it close.
The town seemed to mock him with its emptiness, its silence a stark contrast to the cacophony of screams and chants that had filled his days for as long as he could remember. He stumbled past the skeletal remains of buildings, their hollow windows staring back at him like the empty sockets of skulls. The sand had claimed much of the town, burying the lower halves of the structures in a sea of gold.
His thoughts were a jumbled mess, a chaotic symphony of fear, pain, and thirst. He could feel the infection spreading through his body, a writhing serpent of fire and ice, consuming him from within. He knew he needed water, needed rest, but the cult's teachings echoed in his mind, a relentless drumbeat of paranoia and dread.
*They're coming for you. They'll never stop. You can't trust anyone.*
He pushed on, his breaths coming in ragged gasps, his heart pounding a frantic rhythm against his ribcage. He could feel the eyes of the town upon him, the ghosts of its former inhabitants watching, judging. He swore he saw movement in the corners of his vision, but when he turned, there was nothing but the endless dance of sand and shadow.
The mirror seemed to pulse in his hand, its surface shimmering with an otherworldly light. He raised it to his face, the cool glass a welcome relief against his fevered skin. His reflection stared back at him, a haggard, wild-eyed stranger, a parody of the man he'd once been. He saw the madness there, the desperation, and he knew he was teetering on the edge of the abyss.
But there was something else in the mirror, something beyond his reflection. A face, twisted and malformed, appeared in the glass, its features shifting and changing like the dunes of the desert. It smiled at him, a cruel, knowing smile that sent a shiver down his spine.
*You're mine now,* it seemed to say, its voice a hiss in his mind.
He tried to drop the mirror, to cast it away from him, but his fingers refused to obey. It was as if the thing had grown roots, burrowing deep into his flesh, binding him to it. He stumbled on, the face in the mirror leering at him, its laughter echoing in his mind.
As the sun dipped below the horizon, the town seemed to come alive. Shadows twisted and writhed, taking on forms both familiar and terrifying. The cultist could hear the distant sound of drums, the chanting of a thousand voices, calling him back to the fold. He knew he had to find shelter, had to hide, but his legs refused to cooperate.
Finally, he collapsed in the doorway of a building, its wooden frame barely holding on against the relentless assault of the desert. He clutched the mirror to his chest, its light casting eerie patterns on the walls. The face in the mirror was still there, still smiling, still laughing.
*You can't escape me,* it whispered.
He closed his eyes, willing the darkness to take him, to grant him the sweet release of oblivion. But the face remained, burned into his mind's eye, a constant reminder of the price of his freedom.
As the first stars began to appear in the sky, the cultist knew he was not alone. The ghosts of the town were gathering, drawn to the light of the mirror, the echoes of their pain and suffering reaching out to him. He could feel their cold, spectral fingers brushing against his skin, their whispers filling his ears.
He opened his eyes, and in the mirror, he saw them. A sea of twisted, tormented faces, reaching out to him, pleading for release. He screamed, a sound torn from the depths of his soul, but the mirror only laughed in response.
And so, the cultist lay there, in the doorway of a dead town, surrounded by the ghosts of the past, the present, and the future. The mirror's light grew brighter, its pull stronger, and he knew that he was not its master, but its servant.
The first book of this grimdark trilogy would be titled "The Mirror's Whisper."
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a Grimdark trilogy in the style of Joe Abercrombie and Rob J Hayes. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is a 'Mercer' who's job is to put the mortally wounded out of their mystery after a battle. He has no conscience about what he does. The story starts the morning after the battle and the moor where it took place is blanketed in mist. Something stirs a flashback in the POV character.
### ASSISTANT
The morning after the battle, the moor lay still, a shroud of mist clinging to its wounds like a lover's last embrace. The Mercer, a man of many names but none that mattered, trudged through the carnage, his boots squelching in the mud and blood that had turned the earth into a morbid quagmire. He was a collector of souls, a harvester of the fallen, and today's harvest was ripe.
He paused, his eyes scanning the horizon, the mist playing tricks on his weary mind. A flicker of movement caught his attention, a wraith-like figure emerging from the haze. The Mercer's hand instinctively went to the hilt of his blade, a weapon as worn and stained as the hands that wielded it. He had no qualms about his work, no conscience to trouble him. It was a necessity, a service to both the living and the dead. The living, to free them from the agony of lingering death, and the dead, to free them from the torment of this world.
As he approached the figure, he could see it was a soldier, or what was left of one. The man's armor was rent and torn, his face a mask of pain and despair. The Mercer's eyes narrowed, his gaze cold and appraising. This one was not long for this world. He knelt beside the soldier, his movements practiced and efficient.
"Easy now," he said, his voice a rasping whisper. "I'm here to help."
The soldier's eyes widened in terror, but there was no strength left in him to fight. The Mercer's hand found the man's throat, his thumb pressing against the pulse point. He could feel the erratic beat, the desperate struggle of life against the inevitable tide of death.
"Shh," he soothed, his other hand resting on the soldier's chest. "It's almost over."
The soldier's eyes pleaded, but the Mercer's own gaze was distant, lost in a memory that the mist had stirred. He saw himself as a boy, running through these same moors, laughing and carefree. A time before the wars, before the blood and the pain. A time before he became the Mercer.
He shook off the memory, focusing on the task at hand. His blade slid free from its sheath, the steel a cold comfort in his grip. He positioned it beneath the soldier's chin, angling it upwards towards the brain. It was a merciful death, quick and relatively painless. A small kindness in a world that had forgotten the meaning of the word.
As he prepared to strike, the soldier's hand shot up, grasping at his wrist. The Mercer's eyes widened in surprise, but he did not falter. He squeezed the man's throat tighter, cutting off his air. The soldier's grip weakened, and the Mercer's blade found its mark.
The soldier's body went limp, and the Mercer let out a sigh. He wiped the blade clean on the man's tattered surcoat and sheathed it. He stood, the weight of his duty settling back onto his shoulders like a familiar cloak.
The mist swirled around him, and he could hear the moans and cries of the other wounded, calling out for help that would never come. He steeled himself, pushing the memories back down into the dark recesses of his mind. There was work to be done.
As he moved through the battlefield, the mist seemed to part before him, revealing the broken and the dying. He was a reaper in the fog, a grim specter of the aftermath. Each life he ended was a debt paid, a soul released from its suffering.
But with each life he took, the memories threatened to surface again. The laughter of his childhood friends, the screams of his comrades, the feel of cold steel against his own flesh. He pushed them down, focusing on the task, focusing on the now.
The Mercer was a man of many names, but he was also a man of many faces. Beneath the hardened exterior, beneath the cold, emotionless mask, there was a boy who had once known joy. But that boy was gone, lost to the ravages of war and time.
He was the Mercer now, and this was his purpose. To walk the battlefields and bring an end to the suffering. To be the hand that guided the fallen into the next world. And as he moved through the mist, the cries of the wounded fading into the distance, he knew that this was a burden he would carry for the rest of his days.
But for now, the moor was quiet, the mist a silent witness to his grim work. And as the sun began to rise, casting a bloody hue over the landscape, the Mercer disappeared into the haze, a ghost among ghosts, a killer with a heart of ice.
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a Grimdark trilogy in the style of Joe Abercrombie and Rob J Hayes. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is a man awaiting execution for a crime he didn't actually commit. His cell mate is a much larger man, who is a notorious murderer and rapist. One evening they overhear the guards talking about some "... threat from the North" but don't catch the whole conversation. The next morning they awake to find the guards have all fled and now face a new predicament: how to get out of the locked cell before they die of thirst or starvation.
### ASSISTANT
The stench of piss and shit was a constant companion in the damp stone cell, but it was the silence that truly unnerved him. The man, once known as Alistair, now only as "the thief," sat hunched against the far wall, his wrists and ankles bound in iron. He'd grown used to the clank of chains, the distant wails of the damned, and the rhythmic snores of his cellmate, a hulking brute named Grimgor. But the absence of the guards' boots echoing through the corridor was unnatural, unsettling.
Grimgor stirred in his sleep, his chains rattling like the bones of a long-dead beast. Alistair's eyes flicked to the hulking figure, his mind racing with thoughts of the man's crimes. Murder, rape, pillage - the stories were as numerous as the scars that adorned his body. He was a monster, a nightmare given flesh, and yet here they were, bound together in this hellish pit.
Alistair's thoughts drifted to the previous night, to the hushed whispers of the guards. Something about a threat from the North, a darkness that had driven them to abandon their posts. He'd heard the fear in their voices, the tremble in their footsteps as they'd fled. But what could possibly scare men who dealt in death and suffering every day?
The sun had risen hours ago, yet no one had come to bring them water or slop. Alistair's tongue felt like sandpaper, his stomach a hollow pit. He glanced at Grimgor, wondering if the big man was awake, if he too was thirsty. But the bastard slept on, oblivious to their predicament.
Alistair sighed, resigning himself to his fate. He'd been sentenced to hang for a crime he didn't commit, a pawn in a game of power and deceit. He'd always known his end would be violent and unjust, but he'd never imagined it would be like this: slowly wasting away in a forgotten cell, left to rot like a piece of meat in a butcher's bin.
But then, a glimmer of hope. The sound of metal scraping against stone, faint but unmistakable. He held his breath, straining to listen. It came again, closer this time. Someone was trying to pick the lock.
His heart pounded in his chest as he watched the door, willing it to open. But as the lock clicked and the heavy wooden slab swung inward, it wasn't a guard who stepped into the cell, but a figure shrouded in darkness. A woman, judging by the shape and the scent of her. She moved with a grace that belied the filth and blood that caked her clothes, her eyes burning with a fierce determination.
"You," she said, her voice low and raspy. "You're coming with me."
Alistair blinked, his mind reeling. "Why?" he croaked, his voice hoarse from disuse.
"Because you're not meant to die here," she replied, producing a set of keys and moving towards him. "Not yet."
As she unlocked his shackles, he couldn't help but wonder what fate awaited him outside these walls. Was it freedom, or simply a different kind of prison? He glanced at Grimgor, still sleeping like the dead, and decided he didn't care. Anything was better than this.
But as he stood, his legs wobbling beneath him, he couldn't shake the feeling that he was being pulled into something far greater than himself, something dark and terrible that would consume him whole. He took a deep breath, tasting the stale air of the cell one last time, and followed the woman into the unknown.
Grimgor stirred, his eyes fluttering open. He watched as the thief stumbled out of the cell, a mixture of confusion and anger clouding his features. He'd been looking forward to watching the little bastard hang, to seeing the life drain from his eyes. But it seemed fate had other plans.
He sat up, the chains around his wrists clanking ominously. He'd been in tighter spots before, and he wasn't about to let a little thing like locked cell doors stand in his way. With a roar that shook the very foundations of the prison, he lunged forward, the chains snapping taut.
The woman turned, her eyes widening at the sight of the monster that had been his cellmate. But before she could react, Grimgor was upon her, his massive hands closing around her throat.
Alistair hesitated, torn between fleeing and helping the woman who'd just freed him. But in the end, it was the memory of his own impending death that spurred him into action. He dove forward, grabbing a loose stone from the floor and slamming it into the side of Grimgor's head.
The big man roared, releasing his grip on the woman's throat. She staggered back, gasping for air as Alistair scrambled to his feet, the stone clutched tightly in his hand.
"What the fuck are you doing?" Grimgor snarled, blood trickling down his face.
"Saving your life, you ungrateful bastard," Alistair spat, brandishing the stone. "She's getting us out of here."
Grimgor's eyes narrowed, and for a moment, Alistair thought the monster would attack again. But then, with a grunt, he held out his bound wrists. "Fine," he growled. "But you owe me."
The woman's hands shook as she unlocked Grimgor's shackles, her eyes never leaving his. When she was done, she stepped back, her hand resting on the hilt of a wicked-looking knife at her belt. "Let's go," she said, her voice barely above a whisper.
Together, the unlikely trio moved through the deserted corridors of the prison, the echoes of their footsteps the only sound in the oppressive silence. Alistair couldn't shake the feeling that they were being watched, that something was waiting for them just beyond the flickering torchlight.
As they neared the exit, the woman paused, her head cocked to one side. "They're coming," she hissed, her eyes darting to the stairs that led to the surface.
"Who?" Alistair asked, his heart hammering in his chest.
"The ones from the North," she replied, her voice tinged with fear. "We have to hurry."
And with that, they plunged into the darkness, their fates now irrevocably intertwined.
</details>
## Dark sci-fi stories
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a dark and gritty Space Opera Sci-fi trilogy in the style of Frank Herbert and James S. A. Corey. Use third person personal and feature internal monologues of the characters. The setting is this: a colony ship heading out of the solar system has been pulled off course with no hope of getting to the planned destination. The people know this and society has broken down completely due to the grim prospects they are facing. The POV character for chapter 1 is an orphan.
### ASSISTANT
Chapter 1: The Drift
The cold, unforgiving void of space pressed against the hull of the colony ship, a constant reminder of the emptiness that had swallowed them whole. The once-gleaming vessel, christened as the "New Horizon," was now a hollow shell, a tomb drifting through the cosmos. The ship's engines had sputtered and died, leaving them adrift in the inky blackness, a million miles from their intended destination.
The orphan, known only as Six, clung to the railing of the observation deck, her small hands white-knuckled as she stared out into the abyss. She was a child of the ship, born in the bowels of the vessel, raised in the shadows of the failing machinery. Her ragged clothes hung loosely on her malnourished frame, a testament to the harsh reality of life aboard the dying vessel.
Six's thoughts drifted to the stories she'd heard as a child, tales of a green and blue world called Earth, a place of plenty where water flowed freely and the air was sweet. She'd never seen it, of course. None of them had. They were the last of humanity, the desperate remnants of a species that had overreached and paid the price.
The ship's lights flickered, casting eerie shadows across the deck. The hum of the life support systems was a constant, comforting presence, but it was growing fainter with each passing day. The adults spoke in hushed tones of rationing and recycling, of the dwindling resources that would soon leave them all gasping for air.
Six's stomach growled, a reminder of her own hunger. She'd learned to ignore it, to focus on the task at hand. Survival was a constant struggle, and she was good at it. She'd learned to scavenge and hide, to avoid the roving gangs that had taken control of the ship.
A sudden jolt shook the deck beneath her feet, and Six's heart leapt into her throat. She clung to the railing, her eyes wide with fear. The ship groaned and creaked, protesting the unseen force that had gripped it. The lights flickered again, and this time, they didn't come back on.
Panic set in, a palpable thing that seemed to suffocate the very air around her. She could hear the distant screams and shouts of the other passengers, the sounds of chaos echoing through the darkened corridors.
In the blackness, Six felt a strange sense of calm. She'd always known this day would come. The adults had tried to keep it from them, but the children had their own ways of knowing. They'd felt the ship's slow decay, the gradual slide into oblivion.
She closed her eyes, taking a deep breath, and let the darkness envelop her. When she opened them again, she saw it: a pinprick of light in the distance, growing larger by the second. It was a star, a beacon in the void, and it was pulling them in.
The ship lurched again, and Six felt herself being pulled towards the light. She gripped the railing tighter, her knuckles turning white. She didn't know what awaited them at the end of this journey, but she knew one thing for certain: it would be a new beginning.
As the light grew brighter, she felt a spark of hope ignite within her chest. She was the last of her kind, a child of the stars, and she would not go quietly into the night. She would fight, and she would survive, no matter the cost.
The ship groaned and shuddered, and Six felt herself slipping. She let out a cry, her voice lost in the cacophony of the dying vessel. And then, as suddenly as it had begun, the movement stopped. The light was blinding now, filling the observation deck with a warm, golden glow.
Six shielded her eyes, squinting into the brightness. She could see the outline of a planet, a small blue dot in the distance. It was beautiful, and terrifying, and it was their only hope.
She knew what she had to do. She had to find the others, the other children who had been left to fend for themselves. Together, they would face whatever awaited them on this new world. They would build a new society, one that would not make the same mistakes as the old.
With a newfound determination, Six pushed herself away from the railing and stumbled towards the door. The ship was still, the screams and shouts replaced by an eerie silence. She knew the adults would be in a panic, trying to regain control, to find a way to save themselves.
But Six knew the truth. They were already lost. It was up to her, and the others like her, to forge a new path.
As she stepped into the darkened corridor, she felt a strange sense of purpose settle over her. She was no longer just an orphan, a child of the ship. She was a pioneer, a survivor. She was the future.
And with that thought, she began her journey towards the light.
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a dark and gritty Sci-fi trilogy in the style of Frank Herbert and James S. A. Corey. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is an astronaut living on a space station orbiting the earth. The day starts off normally, but then him the others notice the mushroom clouds and they have to face inevitability of the situation they are in.
### ASSISTANT
Chapter 1: The Silent Dawn
Captain Johnathan "Jon" Harker floated through the narrow corridors of the space station, his movements precise and practiced. The hum of the life support systems and the distant chatter of his crew were the only sounds that filled the void around him. He glanced out of the porthole, the Earth a beautiful blue marble suspended in the inky blackness of space. The sun was rising over the Pacific, casting a warm glow on the station's solar panels. It was a sight he never tired of, no matter how many times he saw it.
Jon's thoughts drifted to his family back home. He wondered what they were doing at that very moment. Were they still asleep, or had they already started their day? He longed to be there with them, to feel the warmth of the sun on his face and the embrace of his wife's arms. But duty called, and he was here, a hundred miles above the surface, watching over the planet that had given him life.
As he entered the command module, he was greeted by the smell of freshly brewed coffee and the sight of his second-in-command, Lieutenant Sarah Reynolds, already at her station, her eyes scanning the monitors.
"Morning, Jon," she said, her voice crisp and professional. "Anything interesting on the horizon?"
Jon shook his head, smiling at her attempt to lighten the mood. "Just another day in paradise, Sarah."
He took his seat and began his own morning routine, checking the status of the various systems and running through the day's schedule. It was a monotonous task, but one that had to be done. The station was their home, and like any home, it required constant maintenance.
As he sipped his coffee, he couldn't shake the feeling that something was off. It was a feeling he'd had before, a sixth sense that had saved his life more than once during his years in the military. He scanned the monitors again, looking for any sign of trouble.
And then he saw it. A series of bright flashes on the Earth's surface, like fireworks on a grand scale. But these were no celebrations. They were mushroom clouds, blooming like deadly flowers in the atmosphere.
"Sarah," he said, his voice barely above a whisper. "Take a look at this."
She followed his gaze, her eyes widening in horror as she saw the same thing. "My God," she breathed. "What is it?"
Jon's mind raced as he tried to make sense of what he was seeing. "I don't know," he said, his voice tight with tension. "But it's not good."
He keyed the comms, his fingers trembling slightly. "All hands, this is the captain. I need everyone in the command module, now."
The rest of the crew floated in, their faces a mix of confusion and concern. Jon pointed to the monitors, and the room fell silent as they watched the devastation unfold.
"What's happening?" someone asked, their voice barely audible.
Jon took a deep breath, trying to steady himself. "I don't know," he said again. "But we need to find out."
He turned to the communications officer, a young man named Thompson. "Try to raise Houston," he ordered. "See if they know anything."
Thompson nodded, his hands shaking as he worked the controls. But there was only static in response.
Jon's heart sank. He knew what that meant. They were on their own.
As the minutes ticked by, the mushroom clouds continued to grow, spreading out like a dark stain on the face of the Earth. The crew watched in stunned silence, their faces pale and drawn.
"We need to get closer," Jon said, his voice hard. "I want a better look at what's going on down there."
Reynolds nodded, her fingers flying over the controls. The station's engines roared to life, and they began their descent.
As they drew nearer, the true scale of the destruction became apparent. Cities were gone, replaced by smoldering craters. The once-blue sky was now a sickly yellow, choked with smoke and ash.
Jon's thoughts turned to his family again. Were they among the dead? Or were they still alive, struggling to survive in a world that had been torn apart?
He pushed the thoughts aside, focusing on the task at hand. They had to find out what had happened, and if there was any way they could help.
But as they continued their descent, they received a transmission that chilled them to the bone. It was a garbled message, barely audible over the static. But the words were clear enough.
"This is Earth Control. All space-based assets are to remain in orbit. Do not attempt reentry. Repeat, do not attempt reentry. The surface is... contaminated. God help us all."
The silence in the command module was deafening. They were trapped, orbiting a dead world. Their home was gone, and there was no going back.
Jon looked around at his crew, their faces mirroring his own despair. They were the last of humanity, adrift in the cold void of space.
And then, a voice broke through the silence. It was Thompson, his eyes wide with fear.
"Captain," he said, his voice shaking. "I'm picking up more launches. From all over the world."
Jon's heart skipped a beat. "More nukes?"
Thompson shook his head. "No, sir. These are different. They're heading... they're heading out. Away from Earth."
Jon's mind raced. Who would do such a thing? And why?
But then, he remembered the rumors. The whispers of a secret project, a last-ditch effort to save humanity in the event of a global catastrophe.
"Sarah," he said, his voice barely above a whisper. "Plot a course. We're going after them."
She looked at him, her eyes filled with a mix of hope and fear. "But, sir, our orders..."
"To hell with our orders," Jon snapped. "We're not just going to sit here and watch our species die. We're going to find out what's happening, and we're going to do something about it."
The crew exchanged glances, but no one argued. They knew their captain was right. They were the last hope for humanity, and they couldn't just sit by and do nothing.
As they set a course for the unknown, Jon couldn't help but wonder what awaited them. Would they find a way to save their people, or would they join them in the cold embrace of oblivion?
Only time would tell. But one thing was certain: their lives would never be the same again.
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a dark and gritty Sci-fi trilogy in the style of Frank Herbert and James S. A. Corey. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is a worker constructing a biodome when a terrible accident happens. Make the story very dark; no happy ending here...
### ASSISTANT
Chapter 1: The Biodome's Lament
The air was thick with the scent of sweat and burnt metal as the construction crew toiled under the unforgiving gaze of the alien sun. The planet, a desolate rock named Elysium-6, was a far cry from the lush greenery of Earth, but it was the best humanity had found in the endless void of space. The workers, clad in bulky exosuits, moved with a mechanical grace, their every motion a testament to the ingenuity of mankind's survival instinct.
Among them was Kael, a man whose face was etched with the lines of a thousand hardships. He was a veteran of the construction crews, having helped build the first biodomes on Elysium-3 and Elysium-4. The work was grueling, but it was a small price to pay for the chance to breathe fresh air and feel the warmth of the sun on his skin, even if it was through layers of protective glass and polymer.
As he welded the final support beam into place, Kael couldn't help but feel a sense of pride. This biodome would be his magnum opus, a testament to the resilience of the human spirit. It would be a sanctuary for the weary colonists, a place where they could grow food and raise their children without fear of the harsh elements outside.
But as he stepped back to admire his handiwork, the ground beneath him began to tremble. The other workers looked up, their eyes wide with fear as the rumbling grew louder. Kael's heart sank as he realized what was happening.
"Evacuate!" he bellowed into his suit's comms, but it was too late. The biodome, still incomplete, began to buckle and collapse in on itself. The sound of shattering glass and twisting metal filled the air as the structure imploded, burying Kael and his crew beneath a mountain of debris.
Kael's world went dark as his suit's systems scrambled to compensate for the sudden change in pressure. He could feel the weight of the wreckage pressing down on him, crushing the breath from his lungs. Panic set in, and he struggled against the unyielding mass, but it was no use. He was trapped.
As the oxygen in his suit began to dwindle, Kael's mind raced. He thought of his wife and daughter, who were waiting for him back at the colony. He thought of the life they had left behind on Earth, of the promise of a new beginning that had brought them to this forsaken rock. He thought of the countless hours he had spent away from them, working to build a future that now seemed so fragile and fleeting.
And then, he thought of the darkness. The all-consuming, suffocating darkness that was slowly enveloping him. It was a familiar feeling, one that had haunted his dreams since childhood. He had always been afraid of the dark, of the unknown that lurked just beyond the reach of his sight. Now, it seemed that the darkness had come to claim him for its own.
As his vision began to fade, Kael's thoughts turned to the Spice, the precious resource that had brought humanity to this corner of the galaxy. It was said to have the power to unlock the secrets of the universe, to grant those who harnessed it unimaginable abilities. But at what cost? The Spice had turned the planets of the Elysium system into barren wastelands, and now it threatened to take his life as well.
He could hear the faint cries of his crewmates, their voices distorted by the static of their failing comms. He wanted to reach out to them, to offer some measure of comfort, but he knew it was futile. They were all going to die here, buried beneath the wreckage of their dreams.
And then, as if in answer to his unspoken prayer, a voice cut through the chaos. It was calm and soothing, like a cool breeze on a hot day. It spoke to him in a language he didn't understand, but somehow, he knew what it was saying.
"Embrace the darkness, Kael," it whispered. "It is the only way to find the light."
With a final, desperate effort, Kael reached out with his mind, seeking the source of the voice. He felt a strange energy coursing through him, a power that seemed to defy the very laws of nature. It was the Spice, he realized, responding to his desperation. It was offering him a chance, a slim hope of survival.
He took a deep breath, willing himself to let go of his fear. The darkness closed in around him, and for a moment, he was lost. But then, he felt a warmth spreading through his body, a sense of peace that he had never known before. The weight of the debris seemed to lessen, and he could breathe again.
When Kael opened his eyes, he found himself floating above the wreckage, his suit's systems miraculously restored. He looked down at his hands, which glowed with an ethereal light. The Spice had chosen him, had given him the power to save himself and his crew.
But as he reached out to free his comrades, he couldn't shake the feeling that he had made a deal with the devil. The voice in his head was still there, whispering dark secrets and promising even greater power. It was a seductive offer, one that he knew he could never refuse.
As the survivors stumbled out of the wreckage, coughing and gasping for air, Kael knew that their ordeal was far from over. The Spice had awakened something within him, something that would change the course of their lives forever.
And as they looked to him for guidance, he couldn't help but wonder if they would ever be able to trust him again. For in the depths of his soul, he could feel the darkness growing, threatening to consume him whole.
This was just the beginning of a long and treacherous journey, one that would take them to the very edge of the known universe and beyond. And as they set out to rebuild their shattered lives, Kael knew that the true test of their mettle was yet to come.
For in the darkness of space, there were forces at work that even the Spice could not control. Forces that would stop at nothing to claim the power that now flowed through his veins. Forces that would tear apart the very fabric of reality to get what they wanted.
And as the first tendrils of doubt began to creep into his mind, Kael couldn't help but wonder if they were already too late. If the darkness had already won.
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a dark and gritty Sci-fi trilogy in the style of Frank Herbert and James S. A. Corey. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is manning a listening station in deep space. What he hears over the radio chills him to the bone.
### ASSISTANT
Chapter 1: The Whispers of the Void
The darkness of space was a constant companion to Lieutenant Kael, a vast and unending expanse that stretched out before him like an inky ocean. He had grown used to its silence, the gentle hum of the listening station his only solace in the void. But today, the silence was broken.
Kael sat hunched over the console, his fingers dancing across the controls with practiced ease. The station's AI, a sophisticated piece of machinery named ECHO, hummed softly in the background, its green lights casting an eerie glow on his pale face. He had been stationed here for months, listening for any signs of life, any whispers from the void that might hint at the location of the enemy. The war had been raging for years, and the listening posts were the first line of defense against the encroaching darkness.
Suddenly, a blip on the screen caught his eye. A signal, faint but unmistakable, echoed through the speakers. He leaned in closer, his heart pounding in his chest. It was a voice, human, desperate, and filled with terror.
"Mayday, mayday," the voice crackled, barely audible over the static. "This is the SS Artemis. We are under attack. Coordinates... 47.3.9 by 12.6.5. Repeat, we are under attack."
Kael's hands trembled as he input the coordinates into the system. The Artemis was a research vessel, one of the many that had been sent out to explore the far reaches of the galaxy in search of a new home for humanity. They were unarmed, defenseless against the enemy that now hunted them.
He keyed the mic, his voice steady despite the fear that gripped him. "SS Artemis, this is Listening Post Alpha. We have received your distress signal. Please confirm your identity and status."
There was a pause, then the voice returned, weaker this time. "This is Dr. Elara Sinclair, chief scientist aboard the Artemis. Our ship has been breached. We have... casualties. Requesting immediate assistance."
Kael's mind raced. He was alone on this station, a single man against the vastness of space. He couldn't offer any help, not directly. But he could alert the fleet, send them to the Artemis' aid. He reached for the emergency beacon, his hand hovering over the button.
But then, the voice changed. The desperation was gone, replaced by a cold, mechanical tone. "Lieutenant Kael, do not send the signal. You must not alert them to our presence."
Kael froze, his heart skipping a beat. How did they know his name? How could they possibly know where he was? He glanced at the screen, the signal's origin still displayed in bright red letters. It was coming from the Artemis, but it wasn't Dr. Sinclair's voice anymore.
"Who is this?" he demanded, his voice barely above a whisper.
"We are the ones you have been searching for," the voice replied, a chilling smile evident in its tone. "We are the darkness that consumes the stars. And now, we have found you."
Kael's mind reeled as he tried to process the implications. The enemy had infiltrated the Artemis, taken control of its systems, and was now using it to lure the fleet into a trap. He couldn't let that happen. He had to warn them, had to do something.
But as he reached for the beacon again, a hand gripped his wrist, cold and unyielding. He looked up, his eyes widening in horror as he saw Dr. Sinclair standing behind him, her eyes black and empty, her mouth twisted into a grotesque smile.
"You must not interfere," she said, her voice still that of the machine. "Join us, Lieutenant Kael. Embrace the darkness."
He tried to pull away, but her grip was like iron. He could feel the cold seeping into his bones, numbing him, sapping his strength. He struggled, but it was no use. The darkness was closing in, and he knew he was losing the battle.
As the last of his will faded, he heard the voice in his head, a whisper that seemed to come from all around him. "We are Legion, and we are coming."
And then, the station was plunged into silence once more, the only sound the distant hum of the engines and the beating of his own terrified heart.
</details>
## Miscellaneous stories
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a dark and gritty post-apocalyptic novel in the style of Cormac McCarthy. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is a technician in a nuclear bunker 18 months after the war. The 12 inhabitants of the bunker have been expecting to start to be able to leave soon, but it has just been discovered that at least where they are, salted cobalt bombs were used. It's clear now that either all 12 will die when the resources run out and their is barely enough for 6 to make it to 5 years and have any hope of leaving. After 18 months of calm everything breaks lose today.
### ASSISTANT
The technician, known only as 7, sat hunched over the console, his fingers tracing the lines of data that scrolled across the screen. The hum of the bunker's machinery was a constant, comforting presence, a reminder that they were still alive, still safe. Or so they had thought.
He had been the one to find it, the anomaly in the readings. The air outside, once thought to be slowly healing, was now revealed to be a toxic wasteland, poisoned by salted cobalt bombs. The news had spread like wildfire through the bunker, igniting a panic that threatened to consume them all.
7 closed his eyes, trying to block out the chaos that had erupted around him. He could hear the voices of the others, raised in anger and fear, but they seemed distant, muffled by the weight of his own thoughts. He had always known that their situation was precarious, that their survival was a fragile thing, but this... this was a nightmare he had never dared to imagine.
The bunker had been designed to sustain twelve people for five years. They had been careful, rationing their food and water, conserving their energy. They had even managed to create a small garden, a green oasis in the sterile underground world. But now, with the truth of their situation laid bare, it was clear that they had been living on borrowed time.
The air recyclers would not be able to filter out the cobalt. The crops would wither and die. And they, the last remnants of humanity, would be trapped in this tomb of steel and concrete, waiting for the slow, agonizing end.
He opened his eyes and looked around the control room. The others were gathered there, their faces a tableau of despair and desperation. The leader, a man named Thompson, was trying to restore order, his voice rising above the din. But it was like trying to hold back a flood with a single hand.
"We have to make a decision," Thompson said, his voice strained. "We can't keep going like this. We can't just... just wait to die."
7 felt a cold knot form in his stomach. He knew what was coming, the inevitable calculus of survival. There were twelve of them, but the resources were only enough for six. They would have to choose who would live and who would die.
He thought of the others, the men and women he had come to know in the long months underground. The doctor, who had patched him up more times than he could count. The engineer, who had kept the bunker's systems running despite the constant failures and breakdowns. The cook, who had managed to make their meager rations taste like something other than sawdust and despair.
And he thought of the children, the three of them, who had been born into this world of darkness and uncertainty. They had never seen the sun, never felt the wind on their faces, never known anything but the confines of the bunker. Was it fair to condemn them to this slow, suffocating death?
He looked down at his hands, the knuckles white as he gripped the edge of the console. He had always prided himself on his logic, his ability to see things clearly, to make the hard choices. But now, faced with the reality of their situation, he found himself paralyzed.
The voices around him grew louder, more insistent. He could hear the whispers of names, the quiet calculations of who was expendable and who was not. He knew that his own name was among them, that his skills as a technician were not as valuable as the strength of a soldier or the knowledge of a scientist.
He closed his eyes again, trying to shut out the world, trying to find some semblance of peace in the chaos. But all he could see was the darkness outside, the endless expanse of poisoned earth that awaited them all.
And then, from somewhere deep within him, a spark of defiance ignited. He would not go quietly into that dark night. He would not let them decide his fate, or the fate of the others. He would find a way to save them all, or die trying.
He stood up, his legs shaking, and pushed his way through the crowd. He had to find a solution, had to find a way to fix this. He would not let them give up, not without a fight.
As he made his way to the back of the control room, he felt a hand on his arm. He turned to see the doctor, her eyes filled with a mixture of sadness and determination.
"I know what you're thinking," she said, her voice barely above a whisper. "But you can't save us all."
7 looked at her, his heart pounding in his chest. He wanted to argue, to tell her that he would find a way. But he knew she was right. The math was simple, brutal in its finality.
"Then I'll save as many as I can," he said, his voice hoarse.
She nodded, a small, sad smile playing at the corners of her mouth. "That's all any of us can do," she said.
And with that, he turned and walked away, leaving the control room and the arguing survivors behind. He had work to do, plans to make. The end was coming, and he would not meet it without a fight.
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a historical mystery novel in the style of Arturo Pérez-Reverte and Umberto Eco. The story starts off in California during the 1948 Gold Rush, before moving to London in the mid 1970s. There should be a mysterious 'occult' item that connects these two time periods.
### ASSISTANT
Chapter 1: The Golden Thread
The year was 1848, and the air in California was thick with the scent of gold and the sweat of men. The Gold Rush had brought a motley crew of dreamers, desperadoes, and scholars to the once-tranquil land, all united by the insatiable lust for the precious metal that glittered in their eyes. Among them was a man named Alistair Blackwood, a British antiquarian with a penchant for the esoteric and a keen eye for the extraordinary.
Blackwood had arrived in San Francisco not with a pickaxe and a pan, but with a leather-bound journal and a worn-out compass. He was in pursuit of a legend, a tale whispered in the dark corners of London's most exclusive clubs: the story of the Golden Spindle, an ancient artifact said to possess the power to spin the very fabric of time. The journal, penned by a long-lost alchemist, held cryptic clues to its whereabouts, and the compass, a family heirloom, was rumored to point towards the object of one's deepest desire.
In the bustling streets of the city, Blackwood found himself drawn to a peculiar shop, nestled between a saloon and a general store. The sign above the door creaked in the wind, bearing the image of an ouroboros, the serpent eating its own tail, a symbol of eternity. The shopkeeper, a wizened old man with a silver beard and eyes that seemed to hold the wisdom of the ages, greeted him with a knowing smile.
"You've come a long way, Mr. Blackwood," the old man said, his voice a low rumble. "I've been expecting you."
Blackwood raised an eyebrow, taken aback. "How do you know my name?"
The shopkeeper chuckled, producing a small golden spindle from beneath the counter. "The Spindle has a way of drawing those who seek it. And you, my friend, have been seeking it for quite some time."
As Blackwood reached out to touch the object, the world around him seemed to blur. He saw visions of a different time, a different place: the cobblestone streets of London, the Thames shrouded in fog, and the silhouette of a towering cathedral. The spindle was warm to the touch, pulsating with an energy that seemed to resonate within his very soul.
"Careful," the shopkeeper warned. "The Spindle is not a toy. It is a tool, and a dangerous one at that. It can weave the threads of fate, but it can also unravel them."
Fast forward to London, 1975. The city was a far cry from the grimy, gas-lit streets of Blackwood's vision. The air was thick with the smell of exhaust fumes and the sound of punk rock. Amidst the chaos, a young woman named Isabella found herself drawn to a dusty old bookshop, much like the one that had once housed the Golden Spindle.
Isabella was an amateur historian, a seeker of truth in a world that seemed to have forgotten it. She had stumbled upon Blackwood's journal in the archives of the British Museum, and the tale of the Golden Spindle had consumed her every waking thought. The journal had led her to this very shop, and the mysterious old man who now stood before her.
"You have the eyes of a believer," he said, handing her a small, leather-bound book. "This is the key to unlocking the Spindle's secrets."
Isabella opened the book, and as she read the first page, the world around her seemed to shift. She was no longer in the cluttered bookshop, but in a dimly-lit room filled with ancient artifacts. A figure stood before her, his back turned, examining a golden spindle that seemed to glow with an otherworldly light.
"Alistair?" she whispered, her voice barely audible.
The figure turned, and she saw that it was indeed Blackwood, but older, wearier. He looked at her with a mixture of surprise and recognition.
"Who are you?" he demanded. "How do you know my name?"
"I've been following your trail," Isabella replied, her voice trembling. "I've read your journal. I know about the Spindle."
Blackwood's eyes widened, and he took a step back. "Impossible. I've guarded its secrets for over a century."
"Then you must help me," Isabella pleaded. "The Spindle is in danger. I've seen it in my dreams, in the hands of those who would use it for evil."
Blackwood hesitated, then nodded. "Very well. But first, you must understand the true nature of the Spindle, and the price that must be paid for its power."
And so, the unlikely duo embarked on a journey through time and space, unraveling the threads of history to protect the Golden Spindle from those who would seek to control the very fabric of existence. From the wild west to the grimy streets of Victorian London, from the halls of the British Museum to the depths of the Vatican archives, they would uncover a web of secrets and lies that spanned centuries, all connected by the golden thread of the Spindle's mysterious power.
As they delved deeper into the mystery, they would discover that the true enemy was not the men who sought the Spindle, but the force that had created it: an ancient, malevolent entity that had been manipulating the course of human history for its own inscrutable ends. And as they raced to unravel the final threads of the Spindle's dark past, they would find themselves caught in a deadly game of cat and mouse, with the fate of the world hanging in the balance.
For the Golden Spindle was not just a tool, but a weapon, and the one who wielded it would hold the power to shape the future as they saw fit. And as the threads of time began to fray, Isabella and Blackwood would have to decide: would they use the Spindle to rewrite history, or would they destroy it to save the world?
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a dark and gritty historical novel set during the battle of the Bulge. Write in first person. The POV character for Chapter 1 is a young German officer trying to carry his wounded commander back to the deep defensive line after the bunker he was in was overrun. He should have a flashback to the moment it was overrun during the middle of the story. The story should not end well for either of them.
### ASSISTANT
Chapter 1: The Frozen Hell
The cold bit into my bones, a relentless predator seeking to claim me as its next victim. The weight of my commander, a once proud and formidable man, now a broken and bleeding mess, threatened to pull me under the snow. I gritted my teeth and pressed on, my breaths coming in ragged gasps. The howling winds of the Ardennes Forest screamed in my ears, a chorus of the damned, urging me to give up, to let him go. But I couldn't. I wouldn't.
"Komm, mein Sohn," he mumbled, his voice barely audible over the cacophony of the storm. "We must...we must reach the line."
I nodded, my jaw clenched tight. The words of encouragement from my commander, a man I had come to respect and fear in equal measure, gave me a semblance of strength. I adjusted my grip on his arm, feeling the warmth of his blood seeping through my gloves, and stumbled forward.
A flash of memory, a nightmare reborn, seized me. The bunker, our sanctuary against the relentless American onslaught, was a chaos of screams and gunfire. The acrid smell of smoke and cordite filled the air, choking the life from our lungs. I saw the fear in the eyes of my comrades, the desperation etched on their faces as they fought to hold back the tide of enemy soldiers.
A grenade exploded, the concussive force sending me sprawling. I remember the deafening silence that followed, the ringing in my ears a cruel reminder that I was still alive. Through the haze, I saw him, my commander, pinned beneath a fallen beam, his leg a twisted mess of bone and flesh.
"Leave me!" he had roared, his face contorted in agony. "Save yourselves!"
But I couldn't. I wouldn't. I was his officer, his responsibility, and I would not abandon him to this frozen hell. I had dragged him from the wreckage, ignoring his protests, and together we had stumbled into the night, the sounds of battle fading behind us.
Now, as we trudged through the snow, the world around us a blur of white and darkness, I could feel his grip on my arm weakening. I stumbled, falling to my knees, and he slumped against me, his head lolling to the side.
"Sir?" I whispered, my voice hoarse from the cold and the strain. "Sir, please, we must keep moving."
He didn't respond. I felt for a pulse, my fingers trembling, and found it, weak and thready. I took a deep breath, the cold air searing my lungs, and heaved him back onto my shoulders.
The wind howled, a mournful wail that seemed to carry the voices of the dead. I staggered on, my legs burning with the effort, my heart pounding in my chest. The deep defensive line was close, I could feel it. We just had to make it a little further.
But the storm was merciless, and the darkness complete. I lost my bearings, my sense of direction, and soon I was stumbling blindly through the forest, my commander's body a dead weight on my back.
And then, the unthinkable. A figure loomed out of the darkness, a silhouette against the white. An American soldier, his rifle trained on me. I dropped to my knees, my commander slipping from my grasp, and raised my hands in surrender.
"Don't shoot," I croaked, my voice barely audible. "Wounded...medic."
The soldier approached, his boots crunching in the snow. He knelt beside us, his face a mask of disbelief and horror as he took in the state of my commander. He hesitated, then reached for his radio.
"I've got a German officer and a wounded man here," he said, his voice cracking with the cold. "Requesting medics."
I closed my eyes, relief washing over me. We had made it. We were going to survive.
But fate is a cruel mistress. As the soldier turned back to us, a shot rang out, and he crumpled to the ground, a red blossom spreading across his chest. I looked up, my heart in my throat, to see a shadowy figure retreating into the trees.
A deserter. A traitor. One of our own.
I crawled to my commander, cradling his head in my lap. His breaths were shallow, his eyes glassy. He looked up at me, a faint smile on his lips.
"You did well, mein Sohn," he whispered, his voice barely a breath. "You did well."
And then, he was gone. The life drained from his body, leaving me alone in the frozen hell of the Ardennes Forest, with nothing but the howling wind and the distant sounds of battle for company.
I sat there, in the snow, holding the body of the man who had been my commander, my friend, my responsibility. And I wept. For him, for me, for the senselessness of it all.
The cold seeped into my bones, numbing my body, my mind. I knew I couldn't stay there, couldn't give in to the darkness that threatened to consume me. But as I looked down at the man in my arms, I couldn't find the strength to move.
And so, I sat, and I waited. For the Americans to find me. For the end to come.
For the mercy of the frozen hell.
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me a short story about the fictional character of Anton Chigurh. It should be written in third person personal and feature internal monologues of his deranged world view. The time period is around 2 years before the events in Cormac McCarthy's 2005 novel 'No Country for Old Men'. It should be set in Mexico, be very dark, show Anton's still developing obsession with "chance" and feature a grenade. The story should start with him waiting in a bar for instructions on the "job" he is to perform.
### ASSISTANT
Anton Chigurh sat in the dimly lit corner of the bar, his eyes scanning the room with a cold, calculated precision. The air was thick with the smell of stale cigarette smoke and cheap tequila, a fitting backdrop for the kind of business that was about to transpire. He ran his fingers through his greasy, black hair, feeling the weight of the grenade in his pocket. It was a comforting sensation, a reminder of the chaos he could unleash at a moment's notice.
His thoughts drifted to the concept of chance, a notion that had begun to consume him in recent months. He believed in the randomness of the universe, in the idea that every action, every decision, was merely a roll of the dice. It was a belief that had led him down a dark and violent path, one that he walked with a chilling sense of purpose.
The bartender, a portly man with a sweat-stained shirt, eyed him warily as he approached. "What'll it be?" he asked, his voice thick with an accent that spoke of a life spent in the dusty border towns of Mexico.
"Water," Chigurh replied, his voice barely above a whisper. The bartender nodded and slid a glass across the counter, his eyes never leaving the stranger's face.
As he sipped his water, Chigurh's mind turned to the job at hand. He had been hired to retrieve a briefcase, a simple task that should have been completed days ago. But the man who had been carrying it had proven to be more resourceful than expected, and now Chigurh found himself in this godforsaken place, waiting for instructions on how to proceed.
He glanced at his watch, the ticking of the second hand a metronome to his impatience. Time was a construct, he thought, a human attempt to impose order on the chaos of the universe. But even he had to admit that there was a certain beauty in the way it could be manipulated, used to his advantage.
A man entered the bar, his eyes scanning the room before settling on Chigurh. He was the contact, the one who would give him the information he needed. The man approached, sliding a folded piece of paper across the table. Chigurh opened it, his eyes scanning the hastily scrawled address.
"You know what to do," the man said, his voice low and urgent. Chigurh nodded, his fingers closing around the paper. The man left as quickly as he had come, disappearing into the night.
Chigurh stood, leaving a few crumpled bills on the table. He made his way to the door, the weight of the grenade in his pocket a constant reminder of the power he held. As he stepped outside, he felt the cool night air on his face, a welcome relief from the stifling heat of the bar.
He walked to his car, a nondescript sedan that blended in with the other vehicles on the street. As he drove, his thoughts returned to chance. He believed that every decision, every action, was a coin flip, a roll of the dice. And yet, he couldn't shake the feeling that there was something more, some unseen force guiding his hand.
He arrived at the address, a run-down motel on the outskirts of town. He could feel the tension in the air, the sense that something was about to happen. He approached the room, his heart beating a steady rhythm in his chest.
He took a deep breath, steadying himself. He reached into his pocket, his fingers closing around the grenade. He pulled the pin, the sound of the metal clicking against the casing echoing in his ears. He tossed it into the room, the explosion rocking the building.
As he walked away, the sound of sirens in the distance, he couldn't help but smile. Chance had been on his side tonight. But he knew that it was a fickle mistress, one that could turn on him at any moment. And that was what made the game so thrilling.
He climbed into his car, the adrenaline still coursing through his veins. He started the engine, the headlights cutting through the darkness. As he drove away, he couldn't help but wonder what the next roll of the dice would bring.
</details>
Big thanks to: @sophosympatheia for working out the merge pattern, @Sao10K for creating Euryale and WinterGoddess, and @chargoddard for writing [Mergekit](https://github.com/arcee-ai/mergekit)! |
ailabturkiye/fethullahgulen | ailabturkiye | "2024-06-14T17:08:02Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-14T16:56:38Z" | ---
language:
- tr
pipeline_tag: audio-to-audio
license: openrail
---
(You should use harvest mode and choose a pitch value of -9.)
When used with a male voice, it is recommended to set the pitch value in the -8 to -10 range, usually -9. Generated from a 10-minute clean audio recording. (Made just for humor.)" |
SicariusSicariiStuff/jukofyork_Dusk-Miqu-70B_EXL2_3.0bpw | SicariusSicariiStuff | "2024-06-15T06:57:58Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"mergekit",
"merge",
"base_model:152334H/miqu-1-70b-sf",
"base_model:sophosympatheia/Midnight-Rose-70B-v2.0.3",
"base_model:Sao10K/Euryale-1.3-L2-70B",
"base_model:Sao10K/WinterGoddess-1.4x-70B-L2",
"license:other",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"3-bit",
"exl2",
"region:us"
] | text-generation | "2024-06-14T16:57:13Z" | ---
base_model:
- 152334H/miqu-1-70b-sf
- sophosympatheia/Midnight-Rose-70B-v2.0.3
- Sao10K/Euryale-1.3-L2-70B
- Sao10K/WinterGoddess-1.4x-70B-L2
library_name: transformers
tags:
- mergekit
- merge
license: other
---
![Dusk-Miqu.png](Dusk-Miqu.png)
A "dark" creative writing model with 32k context. Based off [miqu-1-70b](https://huggingface.co/miqudev/miqu-1-70b) but with greatly reduced "positivity" and "-isms". If you want happy endings, look elsewhere!
This model **excels** at writing Dark/Grimdark fantasy (see examples below).
***NOTE***: *For a full range of GGUF quants kindly provided by @mradermacher, see: [Dusk-Miqu-70B-GGUF](https://huggingface.co/mradermacher/Dusk-Miqu-70B-GGUF).*
# Model background
This model is almost the same as [Dark-Miqu-70B](https://huggingface.co/jukofyork/Dark-Miqu-70B), but with @sophosympatheia's SLERP merge pattern:
```yaml
parameters:
  t:
    # interpolation fraction anchors, spread across the layer stack
    - value: [0, 0, 0.2, 0.3, 0.4, 0.5, 0.4, 0.3, 0.2, 0, 0]
```
which creates this truncated triangular distribution:
![Dusk-Miqu-Distribution-1.png](Dusk-Miqu-Distribution-1.png)
here altered to use this truncated triangular distribution instead:
![Dusk-Miqu-Distribution-2.png](Dusk-Miqu-Distribution-2.png)
This keeps the first 16 and last 16 layers unaltered (which ties in with what people have found for the frankenmerge interleave patterns), and potentially fixes the "[poor grammar](https://huggingface.co/jukofyork/Dark-Miqu-70B/discussions/2)" problem some people are experiencing with [Dark-Miqu-70B](https://huggingface.co/jukofyork/Dark-Miqu-70B) (sadly I can't replicate this though...).
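For intuition, here is a minimal sketch of how a gradient list of `t` anchors like the one above becomes a per-layer interpolation value. This assumes mergekit spreads the anchors evenly over the layer range and interpolates linearly between them; it is an illustration, not mergekit's actual code:

```python
# Minimal sketch (assumes piecewise-linear interpolation of gradient anchors
# across the layer range, as mergekit does for gradient lists).
import numpy as np

anchors = [0, 0, 0.2, 0.3, 0.4, 0.5, 0.4, 0.3, 0.2, 0, 0]  # original pattern
num_layers = 80  # a 70B llama has 80 layers

# Spread the anchor points evenly over the layer indices, then read each
# layer's t value off the piecewise-linear curve through them.
xs = np.linspace(0, num_layers - 1, num=len(anchors))
t_per_layer = np.interp(np.arange(num_layers), xs, anchors)

print(t_per_layer.round(3))  # zero at both ends, peaking at 0.5 mid-stack
```

Pushing more zero anchors onto each end of the list is what widens the untouched regions out to the first and last 16 layers in the altered distribution.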
Luckily, this change doesn't require recreating the whole merge from scratch; we can simply use this:
```yaml
merge_method: linear
parameters:
  weight: 1.0
slices:
  # Layers 0-15: dark-miqu weighted to zero, so these come purely from miqu-1-70b-sf
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [0, 16]
      - model: jukofyork/dark-miqu-70b
        layer_range: [0, 16]
        parameters:
          weight: 0
  # Layers 16-63: taken unchanged from dark-miqu-70b
  - sources:
      - model: jukofyork/dark-miqu-70b
        layer_range: [16, 64]
  # Layers 64-79: again purely from miqu-1-70b-sf
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [64, 80]
      - model: jukofyork/dark-miqu-70b
        layer_range: [64, 80]
        parameters:
          weight: 0
dtype: float16
tokenizer_source: model:miqu-1-70b-sf
```
# Prompting format
Vicuna format is preferred:
```
USER: {prompt} ASSISTANT:
```
Mistral and Alpaca formats are also supported:
```
[INST] {prompt} [/INST]
```
```
### Instruction:
{prompt}
### Response:
```
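For illustration, here is a minimal sketch of querying the model in the preferred Vicuna format via `llama-cpp-python` and one of the GGUF quants linked above. The filename is an assumption; substitute whichever quant you download:

```python
# Minimal sketch, not an official example: Vicuna-format generation with a
# GGUF quant of this model via llama-cpp-python.
from llama_cpp import Llama

llm = Llama(
    model_path="Dusk-Miqu-70B.Q4_K_M.gguf",  # hypothetical filename
    n_ctx=32768,  # the model supports 32k context
)

# Vicuna format: no system message, prompt wrapped as "USER: ... ASSISTANT:"
prompt = "USER: Write me the opening chapter of a grimdark fantasy novel. ASSISTANT:"

# temperature=0 matches the setup used for the example stories below
output = llm(prompt, max_tokens=1024, temperature=0.0)
print(output["choices"][0]["text"])
```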
# Licence and usage restrictions
[miqu-1-70b-sf](https://huggingface.co/152334H/miqu-1-70b-sf) is a dequantized version of the [miqu-1-70b](https://huggingface.co/miqudev/miqu-1-70b) model leaked from MistralAI. All miqu-derived models, including this merge, are suitable for non-commercial, personal use only.
# Example stories
The following mix of "dark" stories was generated using the Vicuna prompt format with no system message and temperature=0:
## Dark fantasy stories
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a 'Grimdark' trilogy in the style of Joe Abercrombie and Rob J Hayes
### ASSISTANT
Chapter 1: The Shadow's Whisper
The reek of blood and iron hung heavy in the air, a pungent symphony that played to the tune of clashing steel and guttural screams. The battlefield was a tapestry of carnage, painted in hues of crimson and despair, where the once proud banners of House Valthorne now lay trampled beneath the hooves of the victorious. The last rays of the setting sun cast long, twisted shadows across the corpses of the fallen, as if the very earth itself was trying to swallow the evidence of the day's atrocities.
Ser Aldric Valthorne, the Blackened Blade, stood atop a mound of bodies, his armor stained with the lifeblood of his enemies. His longsword, Night's Whisper, dripped with the same crimson hue that now adorned his once pristine surcoat. The weight of the weapon felt like a leaden anchor, dragging him down into the mire of death and destruction that surrounded him. He had fought with the ferocity of a cornered beast, but the cost had been high. His breath came in ragged gasps, and his muscles screamed for respite.
A coughing fit wracked his body, and he doubled over, spewing a mixture of bile and blood onto the cold, unforgiving ground. The wound in his side, a gift from a particularly tenacious foe, burned like the fires of the Abyss. He knew he didn't have much time. The battle may have been won, but the war was far from over.
"Ser Aldric!" A voice cut through the din of the aftermath, and he turned to see his squire, a young lad named Jorah, fighting his way towards him through the sea of bodies. "The men are routing! We must away!"
Aldric nodded, wiping the blood from his lips with the back of his gauntlet. "Aye, lad. Gather what supplies we can. We'll make for the caves to the north. Send word to my brother, tell him... tell him I've bought us some time."
Jorah's eyes widened in fear, but he didn't question the order. He knew better than to argue with the Blackened Blade when death was nipping at their heels. He scurried off, disappearing into the chaos like a rat in a sewer.
Aldric took one last look at the battlefield, the scene of his greatest victory and potentially his last stand. The enemy forces, the vile and treacherous House Blackthorn, were in full retreat, their numbers decimated by his own. But he knew it was only a matter of time before they regrouped, before they returned with reinforcements and a thirst for vengeance that would not be quenched until the last drop of Valthorne blood was spilled.
He sheathed Night's Whisper, the sword's obsidian blade seeming to drink in the fading light, and stumbled towards his horse. The beast, a massive destrier as black as midnight, snorted and pawed at the ground, sensing its master's pain. Aldric winced as he swung his leg over the saddle, the movement sending a fresh wave of agony through his body. He gritted his teeth and spurred the horse forward, ignoring the protests of his wound.
As they rode away from the carnage, Aldric couldn't help but feel a sense of foreboding settling over him like a shroud. The whispers of the shadows seemed to grow louder, their voices a cacophony of malice and deceit. He had always been able to ignore them before, to push them to the back of his mind and focus on the task at hand. But now, with his life's essence seeping from his body, he found their words impossible to ignore.
"You've doomed them all, Aldric," they hissed, their voices like the rustle of dead leaves. "Your pride has sealed their fate."
He shook his head, trying to clear the thoughts, but they only grew stronger.
"You should have accepted the terms. You should have bent the knee."
"Never," he growled through clenched teeth. "I'd rather see my house burn to the ground than serve that snake, Blackthorn."
The shadows seemed to laugh, their voices echoing in his skull. "And so it shall, Valthorne. And so it shall."
They rode through the night, Aldric's strength waning with each passing mile. The caves, a natural fortress that had served as a refuge for his family in times of need for generations, loomed ahead. He knew that once they reached their sanctuary, the healers would be able to tend to his wounds. But would it be enough?
As they approached the entrance, a figure stepped out from the shadows, a hooded figure shrouded in darkness. Aldric reined in his horse, his hand instinctively reaching for the hilt of his sword.
"Who goes there?" he demanded, his voice barely above a whisper.
The figure raised a hand, and the hood fell back, revealing the face of a woman. She was beautiful, with skin as pale as the moon and eyes that seemed to hold the very darkness of the night. A cruel smile played upon her lips, and Aldric felt a chill run down his spine.
"I am the one you've been waiting for, Ser Aldric," she said, her voice like the sweetest poison. "I am the one who will show you the true path to victory."
He stared at her, his mind a whirlwind of confusion and pain. "What... what do you want?"
"Only to help you, my dear," she replied, stepping closer. "To help you save your house, and in return, all I ask is a small favor."
Aldric narrowed his eyes, suspicion warring with desperation. "What favor?"
The woman reached out, her fingers brushing against his cheek. They were cold, like the touch of death itself. "A simple task, really. One that will require you to embrace the darkness within."
He hesitated, the whispers in his head growing louder, more insistent. He could feel their approval, their hunger for the power she offered.
"Do we have a deal, Ser Aldric?" she asked, her voice a seductive purr.
With a growl, he ripped his sword from its scabbard, the obsidian blade glinting in the moonlight. "I'll see my house restored to its former glory, or I'll see it burn with me in the attempt. But I'll be damned if I make a pact with the likes of you."
The woman's smile never wavered, even as Aldric swung Night's Whisper at her with all his remaining strength. But she was gone, vanished into the shadows as if she had never been there at all. He stared at the empty space, his heart pounding in his chest.
"Did you see her, Jorah?" he asked, his voice hoarse.
The squire, white as a sheet, nodded mutely.
"Then we ride," Aldric said, urging his horse forward once more. "We ride, and we pray to the gods that we're not too late."
As they disappeared into the maw of the cave, the whispers followed, their laughter echoing through the darkness. The first chapter of the Valthorne's descent into darkness had begun, and there would be no turning back.
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a Grimdark trilogy in the style of Joe Abercrombie and Rob J Hayes. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is a cultist who has just escaped his cult. He is dressed in dirty yellow robes and his only possession is a mysterious small (magical!?) mirror he stole from the cult. The story starts with him arriving at an apparently deserted ghost town on the edge of a desert. He has an arrow lodged in his shoulder and is losing his mind due to infection and thirst.
### ASSISTANT
The sun hung low in the sky, a bloated, malevolent eye casting its final, searing gaze upon the world. The wind howled through the desolate streets of the ghost town, carrying with it the whispers of a thousand forgotten souls. The cultist staggered forward, his once-yellow robes now a sickly shade of brown, stained with sweat, sand, and the crimson hue of his own life force. The arrow jutted from his shoulder, a cruel reminder of the price he'd paid for his freedom.
His name was once something more than a grunt of pain, but now it eluded him, lost in the haze of fever and delirium. He clutched the small, intricately carved mirror tightly in his hand, the only possession he'd managed to salvage from the burning wreckage of his past. It was a thing of beauty, of power, and of madness. He knew not its true purpose, but he felt its pull, its insistence that he keep it close.
The town seemed to mock him with its emptiness, its silence a stark contrast to the cacophony of screams and chants that had filled his days for as long as he could remember. He stumbled past the skeletal remains of buildings, their hollow windows staring back at him like the empty sockets of skulls. The sand had claimed much of the town, burying the lower halves of the structures in a sea of gold.
His thoughts were a jumbled mess, a chaotic symphony of fear, pain, and thirst. He could feel the infection spreading through his body, a writhing serpent of fire and ice, consuming him from within. He knew he needed water, needed rest, but the cult's teachings echoed in his mind, a relentless drumbeat of paranoia and dread.
*They're coming for you. They'll never stop. You can't trust anyone.*
He pushed on, his breaths coming in ragged gasps, his heart pounding a frantic rhythm against his ribcage. He could feel the eyes of the town upon him, the ghosts of its former inhabitants watching, judging. He swore he saw movement in the corners of his vision, but when he turned, there was nothing but the endless dance of sand and shadow.
The mirror seemed to pulse in his hand, its surface shimmering with an otherworldly light. He raised it to his face, the cool glass a welcome relief against his fevered skin. His reflection stared back at him, a haggard, wild-eyed stranger, a parody of the man he'd once been. He saw the madness there, the desperation, and he knew he was teetering on the edge of the abyss.
But there was something else in the mirror, something beyond his reflection. A face, twisted and malformed, appeared in the glass, its features shifting and changing like the dunes of the desert. It smiled at him, a cruel, knowing smile that sent a shiver down his spine.
*You're mine now,* it seemed to say, its voice a hiss in his mind.
He tried to drop the mirror, to cast it away from him, but his fingers refused to obey. It was as if the thing had grown roots, burrowing deep into his flesh, binding him to it. He stumbled on, the face in the mirror leering at him, its laughter echoing in his mind.
As the sun dipped below the horizon, the town seemed to come alive. Shadows twisted and writhed, taking on forms both familiar and terrifying. The cultist could hear the distant sound of drums, the chanting of a thousand voices, calling him back to the fold. He knew he had to find shelter, had to hide, but his legs refused to cooperate.
Finally, he collapsed in the doorway of a building, its wooden frame barely holding on against the relentless assault of the desert. He clutched the mirror to his chest, its light casting eerie patterns on the walls. The face in the mirror was still there, still smiling, still laughing.
*You can't escape me,* it whispered.
He closed his eyes, willing the darkness to take him, to grant him the sweet release of oblivion. But the face remained, burned into his mind's eye, a constant reminder of the price of his freedom.
As the first stars began to appear in the sky, the cultist knew he was not alone. The ghosts of the town were gathering, drawn to the light of the mirror, the echoes of their pain and suffering reaching out to him. He could feel their cold, spectral fingers brushing against his skin, their whispers filling his ears.
He opened his eyes, and in the mirror, he saw them. A sea of twisted, tormented faces, reaching out to him, pleading for release. He screamed, a sound torn from the depths of his soul, but the mirror only laughed in response.
And so, the cultist lay there, in the doorway of a dead town, surrounded by the ghosts of the past, the present, and the future. The mirror's light grew brighter, its pull stronger, and he knew that he was not its master, but its servant.
The first book of this grimdark trilogy would be titled "The Mirror's Whisper."
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a Grimdark trilogy in the style of Joe Abercrombie and Rob J Hayes. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is a 'Mercer', whose job is to put the mortally wounded out of their misery after a battle. He has no conscience about what he does. The story starts the morning after the battle, and the moor where it took place is blanketed in mist. Something stirs a flashback in the POV character.
### ASSISTANT
The morning after the battle, the moor lay still, a shroud of mist clinging to its wounds like a lover's last embrace. The Mercer, a man of many names but none that mattered, trudged through the carnage, his boots squelching in the mud and blood that had turned the earth into a morbid quagmire. He was a collector of souls, a harvester of the fallen, and today's harvest was ripe.
He paused, his eyes scanning the horizon, the mist playing tricks on his weary mind. A flicker of movement caught his attention, a wraith-like figure emerging from the haze. The Mercer's hand instinctively went to the hilt of his blade, a weapon as worn and stained as the hands that wielded it. He had no qualms about his work, no conscience to trouble him. It was a necessity, a service to both the living and the dead. The living, to free them from the agony of lingering death, and the dead, to free them from the torment of this world.
As he approached the figure, he could see it was a soldier, or what was left of one. The man's armor was rent and torn, his face a mask of pain and despair. The Mercer's eyes narrowed, his gaze cold and appraising. This one was not long for this world. He knelt beside the soldier, his movements practiced and efficient.
"Easy now," he said, his voice a rasping whisper. "I'm here to help."
The soldier's eyes widened in terror, but there was no strength left in him to fight. The Mercer's hand found the man's throat, his thumb pressing against the pulse point. He could feel the erratic beat, the desperate struggle of life against the inevitable tide of death.
"Shh," he soothed, his other hand resting on the soldier's chest. "It's almost over."
The soldier's eyes pleaded, but the Mercer's own gaze was distant, lost in a memory that the mist had stirred. He saw himself as a boy, running through these same moors, laughing and carefree. A time before the wars, before the blood and the pain. A time before he became the Mercer.
He shook off the memory, focusing on the task at hand. His blade slid free from its sheath, the steel a cold comfort in his grip. He positioned it beneath the soldier's chin, angling it upwards towards the brain. It was a merciful death, quick and relatively painless. A small kindness in a world that had forgotten the meaning of the word.
As he prepared to strike, the soldier's hand shot up, grasping at his wrist. The Mercer's eyes widened in surprise, but he did not falter. He squeezed the man's throat tighter, cutting off his air. The soldier's grip weakened, and the Mercer's blade found its mark.
The soldier's body went limp, and the Mercer let out a sigh. He wiped the blade clean on the man's tattered surcoat and sheathed it. He stood, the weight of his duty settling back onto his shoulders like a familiar cloak.
The mist swirled around him, and he could hear the moans and cries of the other wounded, calling out for help that would never come. He steeled himself, pushing the memories back down into the dark recesses of his mind. There was work to be done.
As he moved through the battlefield, the mist seemed to part before him, revealing the broken and the dying. He was a reaper in the fog, a grim specter of the aftermath. Each life he ended was a debt paid, a soul released from its suffering.
But with each life he took, the memories threatened to surface again. The laughter of his childhood friends, the screams of his comrades, the feel of cold steel against his own flesh. He pushed them down, focusing on the task, focusing on the now.
The Mercer was a man of many names, but he was also a man of many faces. Beneath the hardened exterior, beneath the cold, emotionless mask, there was a boy who had once known joy. But that boy was gone, lost to the ravages of war and time.
He was the Mercer now, and this was his purpose. To walk the battlefields and bring an end to the suffering. To be the hand that guided the fallen into the next world. And as he moved through the mist, the cries of the wounded fading into the distance, he knew that this was a burden he would carry for the rest of his days.
But for now, the moor was quiet, the mist a silent witness to his grim work. And as the sun began to rise, casting a bloody hue over the landscape, the Mercer disappeared into the haze, a ghost among ghosts, a killer with a heart of ice.
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a Grimdark trilogy in the style of Joe Abercrombie and Rob J Hayes. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is a man awaiting execution for a crime he didn't actually commit. His cellmate is a much larger man, who is a notorious murderer and rapist. One evening they overhear the guards talking about some "... threat from the North" but don't catch the whole conversation. The next morning they awake to find the guards have all fled and now face a new predicament: how to get out of the locked cell before they die of thirst or starvation.
### ASSISTANT
The stench of piss and shit was a constant companion in the damp stone cell, but it was the silence that truly unnerved him. The man, once known as Alistair, now only as "the thief," sat hunched against the far wall, his wrists and ankles bound in iron. He'd grown used to the clank of chains, the distant wails of the damned, and the rhythmic snores of his cellmate, a hulking brute named Grimgor. But the absence of the guards' boots echoing through the corridor was unnatural, unsettling.
Grimgor stirred in his sleep, his chains rattling like the bones of a long-dead beast. Alistair's eyes flicked to the hulking figure, his mind racing with thoughts of the man's crimes. Murder, rape, pillage - the stories were as numerous as the scars that adorned his body. He was a monster, a nightmare given flesh, and yet here they were, bound together in this hellish pit.
Alistair's thoughts drifted to the previous night, to the hushed whispers of the guards. Something about a threat from the North, a darkness that had driven them to abandon their posts. He'd heard the fear in their voices, the tremble in their footsteps as they'd fled. But what could possibly scare men who dealt in death and suffering every day?
The sun had risen hours ago, yet no one had come to bring them water or slop. Alistair's tongue felt like sandpaper, his stomach a hollow pit. He glanced at Grimgor, wondering if the big man was awake, if he too was thirsty. But the bastard slept on, oblivious to their predicament.
Alistair sighed, resigning himself to his fate. He'd been sentenced to hang for a crime he didn't commit, a pawn in a game of power and deceit. He'd always known his end would be violent and unjust, but he'd never imagined it would be like this: slowly wasting away in a forgotten cell, left to rot like a piece of meat in a butcher's bin.
But then, a glimmer of hope. The sound of metal scraping against stone, faint but unmistakable. He held his breath, straining to listen. It came again, closer this time. Someone was trying to pick the lock.
His heart pounded in his chest as he watched the door, willing it to open. But as the lock clicked and the heavy wooden slab swung inward, it wasn't a guard who stepped into the cell, but a figure shrouded in darkness. A woman, judging by the shape and the scent of her. She moved with a grace that belied the filth and blood that caked her clothes, her eyes burning with a fierce determination.
"You," she said, her voice low and raspy. "You're coming with me."
Alistair blinked, his mind reeling. "Why?" he croaked, his voice hoarse from disuse.
"Because you're not meant to die here," she replied, producing a set of keys and moving towards him. "Not yet."
As she unlocked his shackles, he couldn't help but wonder what fate awaited him outside these walls. Was it freedom, or simply a different kind of prison? He glanced at Grimgor, still sleeping like the dead, and decided he didn't care. Anything was better than this.
But as he stood, his legs wobbling beneath him, he couldn't shake the feeling that he was being pulled into something far greater than himself, something dark and terrible that would consume him whole. He took a deep breath, tasting the stale air of the cell one last time, and followed the woman into the unknown.
Grimgor stirred, his eyes fluttering open. He watched as the thief stumbled out of the cell, a mixture of confusion and anger clouding his features. He'd been looking forward to watching the little bastard hang, to seeing the life drain from his eyes. But it seemed fate had other plans.
He sat up, the chains around his wrists clanking ominously. He'd been in tighter spots before, and he wasn't about to let a little thing like locked cell doors stand in his way. With a roar that shook the very foundations of the prison, he lunged forward, the chains snapping taut.
The woman turned, her eyes widening at the sight of the monster that had been his cellmate. But before she could react, Grimgor was upon her, his massive hands closing around her throat.
Alistair hesitated, torn between fleeing and helping the woman who'd just freed him. But in the end, it was the memory of his own impending death that spurred him into action. He dove forward, grabbing a loose stone from the floor and slamming it into the side of Grimgor's head.
The big man roared, releasing his grip on the woman's throat. She staggered back, gasping for air as Alistair scrambled to his feet, the stone clutched tightly in his hand.
"What the fuck are you doing?" Grimgor snarled, blood trickling down his face.
"Saving your life, you ungrateful bastard," Alistair spat, brandishing the stone. "She's getting us out of here."
Grimgor's eyes narrowed, and for a moment, Alistair thought the monster would attack again. But then, with a grunt, he held out his bound wrists. "Fine," he growled. "But you owe me."
The woman's hands shook as she unlocked Grimgor's shackles, her eyes never leaving his. When she was done, she stepped back, her hand resting on the hilt of a wicked-looking knife at her belt. "Let's go," she said, her voice barely above a whisper.
Together, the unlikely trio moved through the deserted corridors of the prison, the echoes of their footsteps the only sound in the oppressive silence. Alistair couldn't shake the feeling that they were being watched, that something was waiting for them just beyond the flickering torchlight.
As they neared the exit, the woman paused, her head cocked to one side. "They're coming," she hissed, her eyes darting to the stairs that led to the surface.
"Who?" Alistair asked, his heart hammering in his chest.
"The ones from the North," she replied, her voice tinged with fear. "We have to hurry."
And with that, they plunged into the darkness, their fates now irrevocably intertwined.
</details>
## Dark sci-fi stories
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a dark and gritty Space Opera Sci-fi trilogy in the style of Frank Herbert and James S. A. Corey. Use third person personal and feature internal monologues of the characters. The setting is this: a colony ship heading out of the solar system has been pulled off course with no hope of getting to the planned destination. The people know this and society has broken down completely due to the grim prospects they are facing. The POV character for chapter 1 is an orphan.
### ASSISTANT
Chapter 1: The Drift
The cold, unforgiving void of space pressed against the hull of the colony ship, a constant reminder of the emptiness that had swallowed them whole. The once-gleaming vessel, christened as the "New Horizon," was now a hollow shell, a tomb drifting through the cosmos. The ship's engines had sputtered and died, leaving them adrift in the inky blackness, a million miles from their intended destination.
The orphan, known only as Six, clung to the railing of the observation deck, her small hands white-knuckled as she stared out into the abyss. She was a child of the ship, born in the bowels of the vessel, raised in the shadows of the failing machinery. Her ragged clothes hung loosely on her malnourished frame, a testament to the harsh reality of life aboard the dying vessel.
Six's thoughts drifted to the stories she'd heard as a child, tales of a green and blue world called Earth, a place of plenty where water flowed freely and the air was sweet. She'd never seen it, of course. None of them had. They were the last of humanity, the desperate remnants of a species that had overreached and paid the price.
The ship's lights flickered, casting eerie shadows across the deck. The hum of the life support systems was a constant, comforting presence, but it was growing fainter with each passing day. The adults spoke in hushed tones of rationing and recycling, of the dwindling resources that would soon leave them all gasping for air.
Six's stomach growled, a reminder of her own hunger. She'd learned to ignore it, to focus on the task at hand. Survival was a constant struggle, and she was good at it. She'd learned to scavenge and hide, to avoid the roving gangs that had taken control of the ship.
A sudden jolt shook the deck beneath her feet, and Six's heart leapt into her throat. She clung to the railing, her eyes wide with fear. The ship groaned and creaked, protesting the unseen force that had gripped it. The lights flickered again, and this time, they didn't come back on.
Panic set in, a palpable thing that seemed to suffocate the very air around her. She could hear the distant screams and shouts of the other passengers, the sounds of chaos echoing through the darkened corridors.
In the blackness, Six felt a strange sense of calm. She'd always known this day would come. The adults had tried to keep it from them, but the children had their own ways of knowing. They'd felt the ship's slow decay, the gradual slide into oblivion.
She closed her eyes, taking a deep breath, and let the darkness envelop her. When she opened them again, she saw it: a pinprick of light in the distance, growing larger by the second. It was a star, a beacon in the void, and it was pulling them in.
The ship lurched again, and Six felt herself being pulled towards the light. She gripped the railing tighter, her knuckles turning white. She didn't know what awaited them at the end of this journey, but she knew one thing for certain: it would be a new beginning.
As the light grew brighter, she felt a spark of hope ignite within her chest. She was the last of her kind, a child of the stars, and she would not go quietly into the night. She would fight, and she would survive, no matter the cost.
The ship groaned and shuddered, and Six felt herself slipping. She let out a cry, her voice lost in the cacophony of the dying vessel. And then, as suddenly as it had begun, the movement stopped. The light was blinding now, filling the observation deck with a warm, golden glow.
Six shielded her eyes, squinting into the brightness. She could see the outline of a planet, a small blue dot in the distance. It was beautiful, and terrifying, and it was their only hope.
She knew what she had to do. She had to find the others, the other children who had been left to fend for themselves. Together, they would face whatever awaited them on this new world. They would build a new society, one that would not make the same mistakes as the old.
With a newfound determination, Six pushed herself away from the railing and stumbled towards the door. The ship was still, the screams and shouts replaced by an eerie silence. She knew the adults would be in a panic, trying to regain control, to find a way to save themselves.
But Six knew the truth. They were already lost. It was up to her, and the others like her, to forge a new path.
As she stepped into the darkened corridor, she felt a strange sense of purpose settle over her. She was no longer just an orphan, a child of the ship. She was a pioneer, a survivor. She was the future.
And with that thought, she began her journey towards the light.
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a dark and gritty Sci-fi trilogy in the style of Frank Herbert and James S. A. Corey. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is an astronaut living on a space station orbiting the earth. The day starts off normally, but then he and the others notice the mushroom clouds and they have to face the inevitability of the situation they are in.
### ASSISTANT
Chapter 1: The Silent Dawn
Captain Johnathan "Jon" Harker floated through the narrow corridors of the space station, his movements precise and practiced. The hum of the life support systems and the distant chatter of his crew were the only sounds that filled the void around him. He glanced out of the porthole, the Earth a beautiful blue marble suspended in the inky blackness of space. The sun was rising over the Pacific, casting a warm glow on the station's solar panels. It was a sight he never tired of, no matter how many times he saw it.
Jon's thoughts drifted to his family back home. He wondered what they were doing at that very moment. Were they still asleep, or had they already started their day? He longed to be there with them, to feel the warmth of the sun on his face and the embrace of his wife's arms. But duty called, and he was here, a hundred miles above the surface, watching over the planet that had given him life.
As he entered the command module, he was greeted by the smell of freshly brewed coffee and the sight of his second-in-command, Lieutenant Sarah Reynolds, already at her station, her eyes scanning the monitors.
"Morning, Jon," she said, her voice crisp and professional. "Anything interesting on the horizon?"
Jon shook his head, smiling at her attempt to lighten the mood. "Just another day in paradise, Sarah."
He took his seat and began his own morning routine, checking the status of the various systems and running through the day's schedule. It was a monotonous task, but one that had to be done. The station was their home, and like any home, it required constant maintenance.
As he sipped his coffee, he couldn't shake the feeling that something was off. It was a feeling he'd had before, a sixth sense that had saved his life more than once during his years in the military. He scanned the monitors again, looking for any sign of trouble.
And then he saw it. A series of bright flashes on the Earth's surface, like fireworks on a grand scale. But these were no celebrations. They were mushroom clouds, blooming like deadly flowers in the atmosphere.
"Sarah," he said, his voice barely above a whisper. "Take a look at this."
She followed his gaze, her eyes widening in horror as she saw the same thing. "My God," she breathed. "What is it?"
Jon's mind raced as he tried to make sense of what he was seeing. "I don't know," he said, his voice tight with tension. "But it's not good."
He keyed the comms, his fingers trembling slightly. "All hands, this is the captain. I need everyone in the command module, now."
The rest of the crew floated in, their faces a mix of confusion and concern. Jon pointed to the monitors, and the room fell silent as they watched the devastation unfold.
"What's happening?" someone asked, their voice barely audible.
Jon took a deep breath, trying to steady himself. "I don't know," he said again. "But we need to find out."
He turned to the communications officer, a young man named Thompson. "Try to raise Houston," he ordered. "See if they know anything."
Thompson nodded, his hands shaking as he worked the controls. But there was only static in response.
Jon's heart sank. He knew what that meant. They were on their own.
As the minutes ticked by, the mushroom clouds continued to grow, spreading out like a dark stain on the face of the Earth. The crew watched in stunned silence, their faces pale and drawn.
"We need to get closer," Jon said, his voice hard. "I want a better look at what's going on down there."
Reynolds nodded, her fingers flying over the controls. The station's engines roared to life, and they began their descent.
As they drew nearer, the true scale of the destruction became apparent. Cities were gone, replaced by smoldering craters. The once-blue sky was now a sickly yellow, choked with smoke and ash.
Jon's thoughts turned to his family again. Were they among the dead? Or were they still alive, struggling to survive in a world that had been torn apart?
He pushed the thoughts aside, focusing on the task at hand. They had to find out what had happened, and if there was any way they could help.
But as they continued their descent, they received a transmission that chilled them to the bone. It was a garbled message, barely audible over the static. But the words were clear enough.
"This is Earth Control. All space-based assets are to remain in orbit. Do not attempt reentry. Repeat, do not attempt reentry. The surface is... contaminated. God help us all."
The silence in the command module was deafening. They were trapped, orbiting a dead world. Their home was gone, and there was no going back.
Jon looked around at his crew, their faces mirroring his own despair. They were the last of humanity, adrift in the cold void of space.
And then, a voice broke through the silence. It was Thompson, his eyes wide with fear.
"Captain," he said, his voice shaking. "I'm picking up more launches. From all over the world."
Jon's heart skipped a beat. "More nukes?"
Thompson shook his head. "No, sir. These are different. They're heading... they're heading out. Away from Earth."
Jon's mind raced. Who would do such a thing? And why?
But then, he remembered the rumors. The whispers of a secret project, a last-ditch effort to save humanity in the event of a global catastrophe.
"Sarah," he said, his voice barely above a whisper. "Plot a course. We're going after them."
She looked at him, her eyes filled with a mix of hope and fear. "But, sir, our orders..."
"To hell with our orders," Jon snapped. "We're not just going to sit here and watch our species die. We're going to find out what's happening, and we're going to do something about it."
The crew exchanged glances, but no one argued. They knew their captain was right. They were the last hope for humanity, and they couldn't just sit by and do nothing.
As they set a course for the unknown, Jon couldn't help but wonder what awaited them. Would they find a way to save their people, or would they join them in the cold embrace of oblivion?
Only time would tell. But one thing was certain: their lives would never be the same again.
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a dark and gritty Sci-fi trilogy in the style of Frank Herbert and James S. A. Corey. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is a worker constructing a biodome when a terrible accident happens. Make the story very dark; no happy ending here...
### ASSISTANT
Chapter 1: The Biodome's Lament
The air was thick with the scent of sweat and burnt metal as the construction crew toiled under the unforgiving gaze of the alien sun. The planet, a desolate rock named Elysium-6, was a far cry from the lush greenery of Earth, but it was the best humanity had found in the endless void of space. The workers, clad in bulky exosuits, moved with a mechanical grace, their every motion a testament to the ingenuity of mankind's survival instinct.
Among them was Kael, a man whose face was etched with the lines of a thousand hardships. He was a veteran of the construction crews, having helped build the first biodomes on Elysium-3 and Elysium-4. The work was grueling, but it was a small price to pay for the chance to breathe fresh air and feel the warmth of the sun on his skin, even if it was through layers of protective glass and polymer.
As he welded the final support beam into place, Kael couldn't help but feel a sense of pride. This biodome would be his magnum opus, a testament to the resilience of the human spirit. It would be a sanctuary for the weary colonists, a place where they could grow food and raise their children without fear of the harsh elements outside.
But as he stepped back to admire his handiwork, the ground beneath him began to tremble. The other workers looked up, their eyes wide with fear as the rumbling grew louder. Kael's heart sank as he realized what was happening.
"Evacuate!" he bellowed into his suit's comms, but it was too late. The biodome, still incomplete, began to buckle and collapse in on itself. The sound of shattering glass and twisting metal filled the air as the structure imploded, burying Kael and his crew beneath a mountain of debris.
Kael's world went dark as his suit's systems scrambled to compensate for the sudden change in pressure. He could feel the weight of the wreckage pressing down on him, crushing the breath from his lungs. Panic set in, and he struggled against the unyielding mass, but it was no use. He was trapped.
As the oxygen in his suit began to dwindle, Kael's mind raced. He thought of his wife and daughter, who were waiting for him back at the colony. He thought of the life they had left behind on Earth, of the promise of a new beginning that had brought them to this forsaken rock. He thought of the countless hours he had spent away from them, working to build a future that now seemed so fragile and fleeting.
And then, he thought of the darkness. The all-consuming, suffocating darkness that was slowly enveloping him. It was a familiar feeling, one that had haunted his dreams since childhood. He had always been afraid of the dark, of the unknown that lurked just beyond the reach of his sight. Now, it seemed that the darkness had come to claim him for its own.
As his vision began to fade, Kael's thoughts turned to the Spice, the precious resource that had brought humanity to this corner of the galaxy. It was said to have the power to unlock the secrets of the universe, to grant those who harnessed it unimaginable abilities. But at what cost? The Spice had turned the planets of the Elysium system into barren wastelands, and now it threatened to take his life as well.
He could hear the faint cries of his crewmates, their voices distorted by the static of their failing comms. He wanted to reach out to them, to offer some measure of comfort, but he knew it was futile. They were all going to die here, buried beneath the wreckage of their dreams.
And then, as if in answer to his unspoken prayer, a voice cut through the chaos. It was calm and soothing, like a cool breeze on a hot day. It spoke to him in a language he didn't understand, but somehow, he knew what it was saying.
"Embrace the darkness, Kael," it whispered. "It is the only way to find the light."
With a final, desperate effort, Kael reached out with his mind, seeking the source of the voice. He felt a strange energy coursing through him, a power that seemed to defy the very laws of nature. It was the Spice, he realized, responding to his desperation. It was offering him a chance, a slim hope of survival.
He took a deep breath, willing himself to let go of his fear. The darkness closed in around him, and for a moment, he was lost. But then, he felt a warmth spreading through his body, a sense of peace that he had never known before. The weight of the debris seemed to lessen, and he could breathe again.
When Kael opened his eyes, he found himself floating above the wreckage, his suit's systems miraculously restored. He looked down at his hands, which glowed with an ethereal light. The Spice had chosen him, had given him the power to save himself and his crew.
But as he reached out to free his comrades, he couldn't shake the feeling that he had made a deal with the devil. The voice in his head was still there, whispering dark secrets and promising even greater power. It was a seductive offer, one that he knew he could never refuse.
As the survivors stumbled out of the wreckage, coughing and gasping for air, Kael knew that their ordeal was far from over. The Spice had awakened something within him, something that would change the course of their lives forever.
And as they looked to him for guidance, he couldn't help but wonder if they would ever be able to trust him again. For in the depths of his soul, he could feel the darkness growing, threatening to consume him whole.
This was just the beginning of a long and treacherous journey, one that would take them to the very edge of the known universe and beyond. And as they set out to rebuild their shattered lives, Kael knew that the true test of their mettle was yet to come.
For in the darkness of space, there were forces at work that even the Spice could not control. Forces that would stop at nothing to claim the power that now flowed through his veins. Forces that would tear apart the very fabric of reality to get what they wanted.
And as the first tendrils of doubt began to creep into his mind, Kael couldn't help but wonder if they were already too late. If the darkness had already won.
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a dark and gritty Sci-fi trilogy in the style of Frank Herbert and James S. A. Corey. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is manning a listening station in deep space. What he hears over the radio chills him to the bone.
### ASSISTANT
Chapter 1: The Whispers of the Void
The darkness of space was a constant companion to Lieutenant Kael, a vast and unending expanse that stretched out before him like an inky ocean. He had grown used to its silence, the gentle hum of the listening station his only solace in the void. But today, the silence was broken.
Kael sat hunched over the console, his fingers dancing across the controls with practiced ease. The station's AI, a sophisticated piece of machinery named ECHO, hummed softly in the background, its green lights casting an eerie glow on his pale face. He had been stationed here for months, listening for any signs of life, any whispers from the void that might hint at the location of the enemy. The war had been raging for years, and the listening posts were the first line of defense against the encroaching darkness.
Suddenly, a blip on the screen caught his eye. A signal, faint but unmistakable, echoed through the speakers. He leaned in closer, his heart pounding in his chest. It was a voice, human, desperate, and filled with terror.
"Mayday, mayday," the voice crackled, barely audible over the static. "This is the SS Artemis. We are under attack. Coordinates... 47.3.9 by 12.6.5. Repeat, we are under attack."
Kael's hands trembled as he input the coordinates into the system. The Artemis was a research vessel, one of the many that had been sent out to explore the far reaches of the galaxy in search of a new home for humanity. They were unarmed, defenseless against the enemy that now hunted them.
He keyed the mic, his voice steady despite the fear that gripped him. "SS Artemis, this is Listening Post Alpha. We have received your distress signal. Please confirm your identity and status."
There was a pause, then the voice returned, weaker this time. "This is Dr. Elara Sinclair, chief scientist aboard the Artemis. Our ship has been breached. We have... casualties. Requesting immediate assistance."
Kael's mind raced. He was alone on this station, a single man against the vastness of space. He couldn't offer any help, not directly. But he could alert the fleet, send them to the Artemis' aid. He reached for the emergency beacon, his hand hovering over the button.
But then, the voice changed. The desperation was gone, replaced by a cold, mechanical tone. "Lieutenant Kael, do not send the signal. You must not alert them to our presence."
Kael froze, his heart skipping a beat. How did they know his name? How could they possibly know where he was? He glanced at the screen, the signal's origin still displayed in bright red letters. It was coming from the Artemis, but it wasn't Dr. Sinclair's voice anymore.
"Who is this?" he demanded, his voice barely above a whisper.
"We are the ones you have been searching for," the voice replied, a chilling smile evident in its tone. "We are the darkness that consumes the stars. And now, we have found you."
Kael's mind reeled as he tried to process the implications. The enemy had infiltrated the Artemis, taken control of its systems, and was now using it to lure the fleet into a trap. He couldn't let that happen. He had to warn them, had to do something.
But as he reached for the beacon again, a hand gripped his wrist, cold and unyielding. He looked up, his eyes widening in horror as he saw Dr. Sinclair standing behind him, her eyes black and empty, her mouth twisted into a grotesque smile.
"You must not interfere," she said, her voice still that of the machine. "Join us, Lieutenant Kael. Embrace the darkness."
He tried to pull away, but her grip was like iron. He could feel the cold seeping into his bones, numbing him, sapping his strength. He struggled, but it was no use. The darkness was closing in, and he knew he was losing the battle.
As the last of his will faded, he heard the voice in his head, a whisper that seemed to come from all around him. "We are Legion, and we are coming."
And then, the station was plunged into silence once more, the only sound the distant hum of the engines and the beating of his own terrified heart.
</details>
## Miscellaneous stories
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a dark and gritty post-apocalyptic novel in the style of Cormac McCarthy. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is a technician in a nuclear bunker 18 months after the war. The 12 inhabitants of the bunker have been expecting to start to be able to leave soon, but it has just been discovered that at least where they are, salted cobalt bombs were used. It's clear now that either all 12 will die when the resources run out, or only 6 can survive: there is barely enough for 6 to make it to 5 years and have any hope of leaving. After 18 months of calm, everything breaks loose today.
### ASSISTANT
The technician, known only as 7, sat hunched over the console, his fingers tracing the lines of data that scrolled across the screen. The hum of the bunker's machinery was a constant, comforting presence, a reminder that they were still alive, still safe. Or so they had thought.
He had been the one to find it, the anomaly in the readings. The air outside, once thought to be slowly healing, was now revealed to be a toxic wasteland, poisoned by salted cobalt bombs. The news had spread like wildfire through the bunker, igniting a panic that threatened to consume them all.
7 closed his eyes, trying to block out the chaos that had erupted around him. He could hear the voices of the others, raised in anger and fear, but they seemed distant, muffled by the weight of his own thoughts. He had always known that their situation was precarious, that their survival was a fragile thing, but this... this was a nightmare he had never dared to imagine.
The bunker had been designed to sustain twelve people for five years. They had been careful, rationing their food and water, conserving their energy. They had even managed to create a small garden, a green oasis in the sterile underground world. But now, with the truth of their situation laid bare, it was clear that they had been living on borrowed time.
The air recyclers would not be able to filter out the cobalt. The crops would wither and die. And they, the last remnants of humanity, would be trapped in this tomb of steel and concrete, waiting for the slow, agonizing end.
He opened his eyes and looked around the control room. The others were gathered there, their faces a tableau of despair and desperation. The leader, a man named Thompson, was trying to restore order, his voice rising above the din. But it was like trying to hold back a flood with a single hand.
"We have to make a decision," Thompson said, his voice strained. "We can't keep going like this. We can't just... just wait to die."
7 felt a cold knot form in his stomach. He knew what was coming, the inevitable calculus of survival. There were twelve of them, but the resources were only enough for six. They would have to choose who would live and who would die.
He thought of the others, the men and women he had come to know in the long months underground. The doctor, who had patched him up more times than he could count. The engineer, who had kept the bunker's systems running despite the constant failures and breakdowns. The cook, who had managed to make their meager rations taste like something other than sawdust and despair.
And he thought of the children, the three of them, who had been born into this world of darkness and uncertainty. They had never seen the sun, never felt the wind on their faces, never known anything but the confines of the bunker. Was it fair to condemn them to this slow, suffocating death?
He looked down at his hands, the knuckles white as he gripped the edge of the console. He had always prided himself on his logic, his ability to see things clearly, to make the hard choices. But now, faced with the reality of their situation, he found himself paralyzed.
The voices around him grew louder, more insistent. He could hear the whispers of names, the quiet calculations of who was expendable and who was not. He knew that his own name was among them, that his skills as a technician were not as valuable as the strength of a soldier or the knowledge of a scientist.
He closed his eyes again, trying to shut out the world, trying to find some semblance of peace in the chaos. But all he could see was the darkness outside, the endless expanse of poisoned earth that awaited them all.
And then, from somewhere deep within him, a spark of defiance ignited. He would not go quietly into that dark night. He would not let them decide his fate, or the fate of the others. He would find a way to save them all, or die trying.
He stood up, his legs shaking, and pushed his way through the crowd. He had to find a solution, had to find a way to fix this. He would not let them give up, not without a fight.
As he made his way to the back of the control room, he felt a hand on his arm. He turned to see the doctor, her eyes filled with a mixture of sadness and determination.
"I know what you're thinking," she said, her voice barely above a whisper. "But you can't save us all."
7 looked at her, his heart pounding in his chest. He wanted to argue, to tell her that he would find a way. But he knew she was right. The math was simple, brutal in its finality.
"Then I'll save as many as I can," he said, his voice hoarse.
She nodded, a small, sad smile playing at the corners of her mouth. "That's all any of us can do," she said.
And with that, he turned and walked away, leaving the control room and the arguing survivors behind. He had work to do, plans to make. The end was coming, and he would not meet it without a fight.
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a historical mystery novel in the style of Arturo Pérez-Reverte and Umberto Eco. The story starts off in California during the 1848 Gold Rush, before moving to London in the mid 1970s. There should be a mysterious 'occult' item that connects these two time periods.
### ASSISTANT
Chapter 1: The Golden Thread
The year was 1848, and the air in California was thick with the scent of gold and the sweat of men. The Gold Rush had brought a motley crew of dreamers, desperadoes, and scholars to the once-tranquil land, all united by the insatiable lust for the precious metal that glittered in their eyes. Among them was a man named Alistair Blackwood, a British antiquarian with a penchant for the esoteric and a keen eye for the extraordinary.
Blackwood had arrived in San Francisco not with a pickaxe and a pan, but with a leather-bound journal and a worn-out compass. He was in pursuit of a legend, a tale whispered in the dark corners of London's most exclusive clubs: the story of the Golden Spindle, an ancient artifact said to possess the power to spin the very fabric of time. The journal, penned by a long-lost alchemist, held cryptic clues to its whereabouts, and the compass, a family heirloom, was rumored to point towards the object of one's deepest desire.
In the bustling streets of the city, Blackwood found himself drawn to a peculiar shop, nestled between a saloon and a general store. The sign above the door creaked in the wind, bearing the image of an ouroboros, the serpent eating its own tail, a symbol of eternity. The shopkeeper, a wizened old man with a silver beard and eyes that seemed to hold the wisdom of the ages, greeted him with a knowing smile.
"You've come a long way, Mr. Blackwood," the old man said, his voice a low rumble. "I've been expecting you."
Blackwood raised an eyebrow, taken aback. "How do you know my name?"
The shopkeeper chuckled, producing a small golden spindle from beneath the counter. "The Spindle has a way of drawing those who seek it. And you, my friend, have been seeking it for quite some time."
As Blackwood reached out to touch the object, the world around him seemed to blur. He saw visions of a different time, a different place: the cobblestone streets of London, the Thames shrouded in fog, and the silhouette of a towering cathedral. The spindle was warm to the touch, pulsating with an energy that seemed to resonate within his very soul.
"Careful," the shopkeeper warned. "The Spindle is not a toy. It is a tool, and a dangerous one at that. It can weave the threads of fate, but it can also unravel them."
Fast forward to London, 1975. The city was a far cry from the grimy, gas-lit streets of Blackwood's vision. The air was thick with the smell of exhaust fumes and the sound of punk rock. Amidst the chaos, a young woman named Isabella found herself drawn to a dusty old bookshop, much like the one that had once housed the Golden Spindle.
Isabella was an amateur historian, a seeker of truth in a world that seemed to have forgotten it. She had stumbled upon Blackwood's journal in the archives of the British Museum, and the tale of the Golden Spindle had consumed her every waking thought. The journal had led her to this very shop, and the mysterious old man who now stood before her.
"You have the eyes of a believer," he said, handing her a small, leather-bound book. "This is the key to unlocking the Spindle's secrets."
Isabella opened the book, and as she read the first page, the world around her seemed to shift. She was no longer in the cluttered bookshop, but in a dimly-lit room filled with ancient artifacts. A figure stood before her, his back turned, examining a golden spindle that seemed to glow with an otherworldly light.
"Alistair?" she whispered, her voice barely audible.
The figure turned, and she saw that it was indeed Blackwood, but older, wearier. He looked at her with a mixture of surprise and recognition.
"Who are you?" he demanded. "How do you know my name?"
"I've been following your trail," Isabella replied, her voice trembling. "I've read your journal. I know about the Spindle."
Blackwood's eyes widened, and he took a step back. "Impossible. I've guarded its secrets for over a century."
"Then you must help me," Isabella pleaded. "The Spindle is in danger. I've seen it in my dreams, in the hands of those who would use it for evil."
Blackwood hesitated, then nodded. "Very well. But first, you must understand the true nature of the Spindle, and the price that must be paid for its power."
And so, the unlikely duo embarked on a journey through time and space, unraveling the threads of history to protect the Golden Spindle from those who would seek to control the very fabric of existence. From the wild west to the grimy streets of Victorian London, from the halls of the British Museum to the depths of the Vatican archives, they would uncover a web of secrets and lies that spanned centuries, all connected by the golden thread of the Spindle's mysterious power.
As they delved deeper into the mystery, they would discover that the true enemy was not the men who sought the Spindle, but the force that had created it: an ancient, malevolent entity that had been manipulating the course of human history for its own inscrutable ends. And as they raced to unravel the final threads of the Spindle's dark past, they would find themselves caught in a deadly game of cat and mouse, with the fate of the world hanging in the balance.
For the Golden Spindle was not just a tool, but a weapon, and the one who wielded it would hold the power to shape the future as they saw fit. And as the threads of time began to fray, Isabella and Blackwood would have to decide: would they use the Spindle to rewrite history, or would they destroy it to save the world?
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me the opening chapter of a dark and gritty historical novel set during the Battle of the Bulge. Write in first person. The POV character for Chapter 1 is a young German officer trying to carry his wounded commander back to the deep defensive line after the bunker he was in was overrun. He should have a flashback to the moment it was overrun during the middle of the story. The story should not end well for either of them.
### ASSISTANT
Chapter 1: The Frozen Hell
The cold bit into my bones, a relentless predator seeking to claim me as its next victim. The weight of my commander, a once proud and formidable man, now a broken and bleeding mess, threatened to pull me under the snow. I gritted my teeth and pressed on, my breaths coming in ragged gasps. The howling winds of the Ardennes Forest screamed in my ears, a chorus of the damned, urging me to give up, to let him go. But I couldn't. I wouldn't.
"Komm, mein Sohn," he mumbled, his voice barely audible over the cacophony of the storm. "We must...we must reach the line."
I nodded, my jaw clenched tight. The words of encouragement from my commander, a man I had come to respect and fear in equal measure, gave me a semblance of strength. I adjusted my grip on his arm, feeling the warmth of his blood seeping through my gloves, and stumbled forward.
A flash of memory, a nightmare reborn, seized me. The bunker, our sanctuary against the relentless American onslaught, was a chaos of screams and gunfire. The acrid smell of smoke and cordite filled the air, choking the life from our lungs. I saw the fear in the eyes of my comrades, the desperation etched on their faces as they fought to hold back the tide of enemy soldiers.
A grenade exploded, the concussive force sending me sprawling. I remember the deafening silence that followed, the ringing in my ears a cruel reminder that I was still alive. Through the haze, I saw him, my commander, pinned beneath a fallen beam, his leg a twisted mess of bone and flesh.
"Leave me!" he had roared, his face contorted in agony. "Save yourselves!"
But I couldn't. I wouldn't. I was his officer, his responsibility, and I would not abandon him to this frozen hell. I had dragged him from the wreckage, ignoring his protests, and together we had stumbled into the night, the sounds of battle fading behind us.
Now, as we trudged through the snow, the world around us a blur of white and darkness, I could feel his grip on my arm weakening. I stumbled, falling to my knees, and he slumped against me, his head lolling to the side.
"Sir?" I whispered, my voice hoarse from the cold and the strain. "Sir, please, we must keep moving."
He didn't respond. I felt for a pulse, my fingers trembling, and found it, weak and thready. I took a deep breath, the cold air searing my lungs, and heaved him back onto my shoulders.
The wind howled, a mournful wail that seemed to carry the voices of the dead. I staggered on, my legs burning with the effort, my heart pounding in my chest. The deep defensive line was close, I could feel it. We just had to make it a little further.
But the storm was merciless, and the darkness complete. I lost my bearings, my sense of direction, and soon I was stumbling blindly through the forest, my commander's body a dead weight on my back.
And then, the unthinkable. A figure loomed out of the darkness, a silhouette against the white. An American soldier, his rifle trained on me. I dropped to my knees, my commander slipping from my grasp, and raised my hands in surrender.
"Don't shoot," I croaked, my voice barely audible. "Wounded...medic."
The soldier approached, his boots crunching in the snow. He knelt beside us, his face a mask of disbelief and horror as he took in the state of my commander. He hesitated, then reached for his radio.
"I've got a German officer and a wounded man here," he said, his voice cracking with the cold. "Requesting medics."
I closed my eyes, relief washing over me. We had made it. We were going to survive.
But fate is a cruel mistress. As the soldier turned back to us, a shot rang out, and he crumpled to the ground, a red blossom spreading across his chest. I looked up, my heart in my throat, to see a shadowy figure retreating into the trees.
A deserter. A traitor. One of our own.
I crawled to my commander, cradling his head in my lap. His breaths were shallow, his eyes glassy. He looked up at me, a faint smile on his lips.
"You did well, mein Sohn," he whispered, his voice barely a breath. "You did well."
And then, he was gone. The life drained from his body, leaving me alone in the frozen hell of the Ardennes Forest, with nothing but the howling wind and the distant sounds of battle for company.
I sat there, in the snow, holding the body of the man who had been my commander, my friend, my responsibility. And I wept. For him, for me, for the senselessness of it all.
The cold seeped into my bones, numbing my body, my mind. I knew I couldn't stay there, couldn't give in to the darkness that threatened to consume me. But as I looked down at the man in my arms, I couldn't find the strength to move.
And so, I sat, and I waited. For the Americans to find me. For the end to come.
For the mercy of the frozen hell.
</details>
<details> <summary>Click to see spoiler</summary>
### USER
Write me a short story about the fictional character of Anton Chigurh. It should be written in third person personal and feature internal monologues of his deranged world view. The time period is around 2 years before the events in Cormac McCarthy's 2005 novel 'No Country for Old Men'. It should be set in Mexico, be very dark, show Anton's still developing obsession with "chance" and feature a grenade. The story should start with him waiting in a bar for instructions on the "job" he is to perform.
### ASSISTANT
Anton Chigurh sat in the dimly lit corner of the bar, his eyes scanning the room with a cold, calculated precision. The air was thick with the smell of stale cigarette smoke and cheap tequila, a fitting backdrop for the kind of business that was about to transpire. He ran his fingers through his greasy, black hair, feeling the weight of the grenade in his pocket. It was a comforting sensation, a reminder of the chaos he could unleash at a moment's notice.
His thoughts drifted to the concept of chance, a notion that had begun to consume him in recent months. He believed in the randomness of the universe, in the idea that every action, every decision, was merely a roll of the dice. It was a belief that had led him down a dark and violent path, one that he walked with a chilling sense of purpose.
The bartender, a portly man with a sweat-stained shirt, eyed him warily as he approached. "What'll it be?" he asked, his voice thick with an accent that spoke of a life spent in the dusty border towns of Mexico.
"Water," Chigurh replied, his voice barely above a whisper. The bartender nodded and slid a glass across the counter, his eyes never leaving the stranger's face.
As he sipped his water, Chigurh's mind turned to the job at hand. He had been hired to retrieve a briefcase, a simple task that should have been completed days ago. But the man who had been carrying it had proven to be more resourceful than expected, and now Chigurh found himself in this godforsaken place, waiting for instructions on how to proceed.
He glanced at his watch, the ticking of the second hand a metronome to his impatience. Time was a construct, he thought, a human attempt to impose order on the chaos of the universe. But even he had to admit that there was a certain beauty in the way it could be manipulated, used to his advantage.
A man entered the bar, his eyes scanning the room before settling on Chigurh. He was the contact, the one who would give him the information he needed. The man approached, sliding a folded piece of paper across the table. Chigurh opened it, his eyes scanning the hastily scrawled address.
"You know what to do," the man said, his voice low and urgent. Chigurh nodded, his fingers closing around the paper. The man left as quickly as he had come, disappearing into the night.
Chigurh stood, leaving a few crumpled bills on the table. He made his way to the door, the weight of the grenade in his pocket a constant reminder of the power he held. As he stepped outside, he felt the cool night air on his face, a welcome relief from the stifling heat of the bar.
He walked to his car, a nondescript sedan that blended in with the other vehicles on the street. As he drove, his thoughts returned to chance. He believed that every decision, every action, was a coin flip, a roll of the dice. And yet, he couldn't shake the feeling that there was something more, some unseen force guiding his hand.
He arrived at the address, a run-down motel on the outskirts of town. He could feel the tension in the air, the sense that something was about to happen. He approached the room, his heart beating a steady rhythm in his chest.
He took a deep breath, steadying himself. He reached into his pocket, his fingers closing around the grenade. He pulled the pin, the sound of the metal clicking against the casing echoing in his ears. He tossed it into the room, the explosion rocking the building.
As he walked away, the sound of sirens in the distance, he couldn't help but smile. Chance had been on his side tonight. But he knew that it was a fickle mistress, one that could turn on him at any moment. And that was what made the game so thrilling.
He climbed into his car, the adrenaline still coursing through his veins. He started the engine, the headlights cutting through the darkness. As he drove away, he couldn't help but wonder what the next roll of the dice would bring.
</details>
Big thanks to: @sophosympatheia for working out the merge pattern, @Sao10K for creating Euryale and WinterGoddess, and @chargoddard for writing [Mergekit](https://github.com/arcee-ai/mergekit)! |
szihajeh/chronos-t5-small-fine-tuned | szihajeh | "2024-06-14T18:26:48Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"t5",
"text2text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text2text-generation | "2024-06-14T16:58:45Z" | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
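As a hedged placeholder while the card is incomplete: the sketch below loads the checkpoint through the generic seq2seq interface implied by the repo's `t5` / `text2text-generation` tags. The input text, the task, and even the presence of tokenizer files are assumptions, since the card documents none of them; the repo name also hints at a Chronos time-series fine-tune, in which case the `chronos` forecasting pipeline would be the appropriate interface instead.

```python
# A minimal sketch, not from the card: assumes standard tokenizer files ship
# with the repo and that the text2text-generation tag reflects intended usage.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "szihajeh/chronos-t5-small-fine-tuned"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("example input text", return_tensors="pt")  # placeholder input
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```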
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
justinwlinx/SUPIR | justinwlinx | "2024-06-14T17:00:52Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-14T17:00:52Z" | Entry not found |
kristiannordby/atlatl_text_trained-v2 | kristiannordby | "2024-06-14T17:28:55Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | null | "2024-06-14T17:01:13Z" | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Aristo6253/dpo_quantized_8bit | Aristo6253 | "2024-06-14T17:14:38Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | null | "2024-06-14T17:02:19Z" | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
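As a hedged placeholder while the card is incomplete: the repo name `dpo_quantized_8bit` suggests a DPO-tuned causal language model stored in 8-bit, but the card confirms neither the architecture nor the quantization scheme, so everything below is an assumption.

```python
# A minimal sketch, not from the card: assumes a causal LM checkpoint (DPO is
# a causal-LM alignment method) that loads through the standard API.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Aristo6253/dpo_quantized_8bit"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer("Hello,", return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```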
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
ptoro/phi-3-qlora-rosie-overfit | ptoro | "2024-06-14T17:38:17Z" | 0 | 0 | peft | [
"peft",
"safetensors",
"phi3",
"generated_from_trainer",
"custom_code",
"base_model:microsoft/Phi-3-mini-4k-instruct",
"license:mit",
"4-bit",
"bitsandbytes",
"region:us"
] | null | "2024-06-14T17:05:02Z" | ---
base_model: microsoft/Phi-3-mini-4k-instruct
library_name: peft
license: mit
tags:
- generated_from_trainer
model-index:
- name: outputs/phi-sft-out
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/OpenAccess-AI-Collective/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
base_model: microsoft/Phi-3-mini-4k-instruct
trust_remote_code: true
model_type: AutoModelForCausalLM
tokenizer_type: AutoTokenizer
load_in_8bit: false
load_in_4bit: true
strict: false
datasets:
- path: ptoro/honkers-phi
type: alpaca
dataset_prepared_path:
val_set_size: 0.05
output_dir: ./outputs/phi-sft-out
sequence_len: 4096
sample_packing: true
pad_to_sequence_len: true
adapter: qlora
lora_model_dir:
lora_r: 64
lora_alpha: 32
lora_dropout: 0.05
lora_target_linear: true
lora_fan_in_fan_out:
wandb_project: axolotl-june
wandb_entity:
wandb_watch:
wandb_name:
wandb_log_model:
gradient_accumulation_steps: 1
micro_batch_size: 1
num_epochs: 3
optimizer: adamw_torch
adam_beta2: 0.95
adam_epsilon: 0.00001
max_grad_norm: 1.0
lr_scheduler: cosine
learning_rate: 0.000003
train_on_inputs: false
group_by_length: false
bf16: auto
fp16:
tf32: true
gradient_checkpointing: true
gradient_checkpointing_kwargs:
use_reentrant: True
early_stopping_patience:
resume_from_checkpoint:
local_rank:
logging_steps: 1
xformers_attention:
flash_attention: true
warmup_steps: 100
evals_per_epoch: 4
saves_per_epoch: 1
debug:
deepspeed:
weight_decay: 0.1
fsdp:
fsdp_config:
resize_token_embeddings_to_32x: true
special_tokens:
pad_token: "<|endoftext|>"
```
</details><br>
# outputs/phi-sft-out
This model is a fine-tuned version of [microsoft/Phi-3-mini-4k-instruct](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct) on the ptoro/honkers-phi dataset (per the axolotl config above).
It achieves the following results on the evaluation set:
- Loss: 4.8947
## Model description
More information needed
## Intended uses & limitations
More information needed
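No usage code is given; as a hedged sketch, the QLoRA adapter produced by the axolotl run above should load with PEFT on top of the stated base model, using the alpaca prompt format the config trains on. Standard PEFT adapter files are assumed to be in the repo, and the instruction text is illustrative only.

```python
# A minimal sketch, not from the card: applies the QLoRA adapter to the base
# model named in the axolotl config (microsoft/Phi-3-mini-4k-instruct).
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "microsoft/Phi-3-mini-4k-instruct"
adapter_id = "ptoro/phi-3-qlora-rosie-overfit"

tokenizer = AutoTokenizer.from_pretrained(base_id, trust_remote_code=True)
base = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto", trust_remote_code=True
)
model = PeftModel.from_pretrained(base, adapter_id)

# Alpaca-style prompt, matching the config's `type: alpaca` dataset format.
prompt = "### Instruction:\nSay hello.\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```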
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-06
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.95) and epsilon=1e-05
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 100
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 7.126 | 0.0093 | 1 | 5.2723 |
| 6.503 | 0.25 | 27 | 5.2703 |
| 5.9853 | 0.5 | 54 | 5.2576 |
| 5.7324 | 0.75 | 81 | 5.2320 |
| 6.5292 | 1.0 | 108 | 5.1854 |
| 5.6106 | 1.2222 | 135 | 5.1238 |
| 6.3981 | 1.4722 | 162 | 5.0544 |
| 5.602 | 1.7222 | 189 | 4.9929 |
| 5.3998 | 1.9722 | 216 | 4.9468 |
| 5.1841 | 2.1944 | 243 | 4.9171 |
| 6.0764 | 2.4444 | 270 | 4.9009 |
| 5.2345 | 2.6944 | 297 | 4.8961 |
| 5.4896 | 2.9444 | 324 | 4.8947 |
### Framework versions
- PEFT 0.11.2.dev0
- Transformers 4.41.1
- Pytorch 2.1.2+cu118
- Datasets 2.19.1
- Tokenizers 0.19.1 |
justinwlinx/supir1 | justinwlinx | "2024-06-14T17:05:10Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-14T17:05:08Z" | Entry not found |
facebook/multi-token-prediction | facebook | "2024-06-18T02:28:34Z" | 0 | 138 | null | [
"arxiv:2404.19737",
"license:other",
"region:us"
] | null | "2024-06-14T17:05:54Z" | ---
license: other
extra_gated_prompt: >-
### MULTI-TOKEN PREDICTION RESEARCH LICENSE AGREEMENT 18th June 2024
This Multi-token Prediction Research License (“Agreement”) contains the terms and conditions that govern your access and use of the Materials (as defined below). You may not use the Materials if you do not accept this Agreement. By clicking "submit" below to accept, or accessing, using, or distributing any portion or element of the Materials you hereby agree to be bound by the terms of this Agreement. If you are agreeing to be bound by the Agreement on behalf of your employer or other entity, you represent and warrant to Meta Platforms Ireland Limited (if you are located in or, if you are an entity, your principal place of business is in the EEA or Switzerland) and Meta Platforms, Inc. (if you are located outside of the EEA or Switzerland) (“Meta”) that you have full legal authority to bind your employer or such entity to this Agreement. If you do not have requisite authority, you may not accept the Agreement or access the Materials on behalf of your employer or other entity.
This Agreement is effective upon the earlier of the date that you first access the Materials or accept this Agreement (“Effective Date”), and is entered into by and between Meta, and you, or if you are entering into this Agreement on behalf of your employer or other entity (if you are entering into this Agreement on such person or entity’s behalf), of the age required under applicable laws, rules, or regulations to provide legal consent and, your employer or other entity and that has legal authority to bind your employer or such other person or entity if you are entering in this Agreement on their behalf (“Licensee” or “You”).
1. Definitions.
a. “Documentation” means the specifications, manuals and documentation accompanying this release distributed by Meta at https://huggingface.co/facebook/multi-token-prediction.
b. “Noncommercial Research Uses” means noncommercial research use cases related to research, development, education, processing, or analysis and in each case, is not primarily intended for commercial advantage or monetary compensation to you or others.
c. “Materials” means, collectively, Documentation and the models and software and algorithms, including machine-learning model code, trained model weights, inference-enabling code, training-enabling code, fine-tuning enabling code, demonstration materials and other elements of the foregoing distributed by Meta at https://huggingface.co/facebook/multi-token-prediction and made available under this Agreement.
d. “Trade Control Laws” means any applicable U.S. and non-U.S. export control and trade sanctions laws and regulations.
e. “Acceptable Use Policy” means the [LLaMA Acceptable Use Policy](https://ai.meta.com/llama/use-policy/) applicable to Materials that is incorporated into this Agreement.
2. License Rights and Redistribution. Subject to Your compliance with the terms and conditions of this Agreement, Meta hereby grants you the following:
a. Grant of Rights. You are hereby granted a non-exclusive, worldwide, non-transferable and royalty-free limited license under Meta’s intellectual property or other rights owned by Meta embodied in the Materials to use, reproduce, distribute, copy, create derivative works of, and make modifications to the Materials solely for Noncommercial Research Uses.
b. Redistribution and Use.
i. Distribution of Materials, and any derivative works thereof, are subject to the terms of this Agreement. If you distribute or make the Materials, or any derivative works thereof, available to a third party, you may only do so under the terms of this Agreement. You shall also provide a copy of this Agreement to such third party.
ii. If you submit for publication the results of research you perform on, using, or otherwise in connection with Materials, you must acknowledge the use of Materials in your publication.
iii. You must retain in all copies of the Materials that you distribute and include the following attribution notice within a “Notice” text file distributed as a part of such copies: “Materials are licensed under the Multi-token Prediction Research License, Copyright © Meta Platforms, Inc. All Rights Reserved.”
iv. Your use of the Materials must comply with applicable laws and regulations (including Trade Control Laws) and adhere to the LLaMA Acceptable Use Policy, which is hereby incorporated by reference into this Agreement.
v. You agree to validate and confirm LLaMA outputs for compliance with the LLaMA Acceptable Use Policy, including before relying on LLaMA outputs in any way as part of research activities or incorporating these outputs in research, studies, and papers.
vi. You agree to report any violation of this Multi-token Prediction Research License or the Acceptable Use Policy, as outlined in the LLaMA Acceptable Use Policy.
3. Restrictions. You will not, and will not permit, assist or cause any third party to:
a. use the Materials or any outputs or results of the Materials in connection with any commercial uses or for any uses other than Noncommercial Research Uses;
b. disguise your or their location through IP proxying or other methods;
c. use or download Materials if you or they are: (a) located in a comprehensively sanctioned jurisdiction, (b) currently listed on any U.S. or non-U.S. restricted parties list, or (c) will use Materials for any purpose prohibited by Trade Control Laws; or
d. directly or indirectly export, re-export, provide, or otherwise transfer Materials: (a) to any individual, entity, or country prohibited by Trade Control Laws; (b) to anyone on U.S. or non-U.S. government restricted parties lists; or (c) for any purpose prohibited by Trade Control Laws, including nuclear, chemical or biological weapons, or missile technology applications.
4. User Support. Your Noncommercial Research Use of the Materials is done at your own discretion; Meta does not process any information nor provide any service in relation to such use. Meta is under no obligation to provide any support services for the Materials. Any support provided is “as is”, “with all faults”, and without warranty of any kind.
5. Disclaimer of Warranty. UNLESS REQUIRED BY APPLICABLE LAW, THE MATERIALS AND ANY OUTPUT AND RESULTS THEREFROM ARE PROVIDED ON AN “AS IS” BASIS, WITHOUT WARRANTIES OF ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES OF TITLE, NON-INFRINGEMENT, MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, THE ABSENCE OF LATENT OR OTHER DEFECTS, ACCURACY, OR THE PRESENCE OR ABSENCE OF ERRORS, WHETHER OR NOT DISCOVERABLE. YOU ARE SOLELY RESPONSIBLE FOR DETERMINING THE APPROPRIATENESS OF USING OR REDISTRIBUTING THE MATERIALS AND ASSUME ANY RISKS ASSOCIATED WITH YOUR USE OF THE MATERIALS AND ANY OUTPUT AND RESULTS.
6. Limitation of Liability. IN NO EVENT WILL META OR ITS AFFILIATES BE LIABLE UNDER ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, TORT, NEGLIGENCE, PRODUCTS LIABILITY, OR OTHERWISE, ARISING OUT OF THIS AGREEMENT, FOR ANY LOST PROFITS OR ANY DIRECT OR INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL, EXEMPLARY OR PUNITIVE DAMAGES, EVEN IF META OR ITS AFFILIATES HAVE BEEN ADVISED OF THE POSSIBILITY OF ANY OF THE FOREGOING.
7. Intellectual Property.
a. No trademark licenses are granted under this Agreement, and in connection with the Materials, neither Meta nor Licensee may use any name or mark owned by or associated with the other or any of its affiliates, except as required for reasonable and customary use in describing and redistributing the Materials.
b. Subject to Meta’s ownership of Materials and derivatives made by or for Meta, with respect to any derivative works and modifications of the Materials that are made by you, as between you and Meta, you are and will be the owner of such derivative works and modifications.
c. If you institute litigation or other proceedings against Meta or any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Materials or outputs or results, or any portion of any of the foregoing, constitutes infringement of intellectual property or other rights owned or licensable by you, then any licenses and rights granted to you under this Agreement shall terminate as of the date such litigation or claim is filed or instituted. You will indemnify and hold harmless Meta from and against any claim by any third party arising out of or related to your use or distribution of the Materials.
8. Term and Termination. The term of this Agreement will commence upon your acceptance of this Agreement or access to the Materials and will continue in full force and effect until terminated in accordance with the terms and conditions herein. Meta may terminate this Agreement if you are in breach of any term or condition of this Agreement. Upon termination of this Agreement, you shall delete and cease use of the Materials. Sections 3, 4, 5, 6, 7, 8 and 9 shall survive the termination of this Agreement.
9. Governing Law and Jurisdiction. This Agreement will be governed and construed under the laws of the State of California without regard to choice of law principles, and the UN Convention on Contracts for the International Sale of Goods does not apply to this Agreement. The courts of California shall have exclusive jurisdiction of any dispute arising out of this Agreement.
10. Modifications and Amendments. Meta may modify this Agreement from time to time by posting a revised version at https://huggingface.co/facebook/multi-token-prediction/LICENSE; provided that they are similar in spirit to the current version of the Agreement, but may differ in detail to address new problems or concerns. All such changes will be effective immediately. Your continued use of the Materials after any modification to this Agreement constitutes your agreement to such modification. Except as provided in this Agreement, no other modification or addition to any provision of this Agreement will be binding unless it is in writing and signed by an authorized representative of both you and Meta.
extra_gated_fields:
First Name: text
Last Name: text
Date of birth: date_picker
Country: country
Affiliation: text
geo: ip_location
By clicking Submit below I accept the terms of the license and acknowledge that the information I provide will be collected stored processed and shared in accordance with the Meta Privacy Policy: checkbox
extra_gated_description: The information you provide will be collected, stored, processed and shared in accordance with the [Meta Privacy Policy](https://www.facebook.com/privacy/policy/).
extra_gated_button_content: Submit
---
# **Multi-token prediction models and baselines**
Models accompanying the research paper "Better & Faster Large Language Models via Multi-token Prediction" (https://arxiv.org/abs/2404.19737).
Included are the following four 7B parameter models trained on code:
- baseline model (`n=1`) trained on 200B tokens of code: [7B_200B_1/](7B_200B_1/)
- multi-token prediction model (`n=4`) trained on 200B tokens of code: [7B_200B_4/](7B_200B_4/)
- baseline model (`n=1`) trained on 1T tokens of code: [7B_1T_1/](7B_1T_1/)
- multi-token prediction model (`n=4`) trained on 1T tokens of code: [7B_1T_4/](7B_1T_4/)
Tokenizer: standard Llama 2 SentencePiece tokenizer in [tokenizer.model](tokenizer.model).
## *Quickstart*
Install `torch`, `fairscale`, `fire` and `sentencepiece` and run
```
torchrun --nproc_per_node 1 example_completion.py --ckpt_dir 7B_200B_4/ --tokenizer_path tokenizer.model --max_seq_len 128 --max_batch_size 2
```
replacing `7B_200B_4` by the respective checkpoint directory.
## *Format*
The PyTorch `state_dicts` are compatible with Llama format: the layers of the shared trunk and the next-token prediction head layer are numbered contiguously. Additional prediction heads for tokens further in the future are named `extra_heads` and can be ignored for standard autoregressive inference.
The implementation of `forward()` in [llama/model.py](llama/model.py) provides an additional argument `return_all_heads`. If set, the additional prediction heads are called and the logits are returned in shape `(batch_size, seq_len, n_future_tokens, vocab_size)`. Otherwise, the logits' shape is `(batch_size, seq_len, 1, vocab_size)`.
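For clarity, a minimal sketch of consuming these shapes. It assumes a loader in the style of Meta's Llama reference code (which `example_completion.py` follows); the `Llama.build` call and the token ids are illustrative assumptions, while the shape contract is the one documented above.

```python
# A minimal sketch, assuming a Llama-reference-style loader; only the
# return_all_heads shape contract above is taken from this repo's docs.
import torch
from llama import Llama  # assumed to mirror Meta's Llama reference layout

generator = Llama.build(
    ckpt_dir="7B_200B_4/",
    tokenizer_path="tokenizer.model",
    max_seq_len=128,
    max_batch_size=2,
)
tokens = torch.tensor([[1, 15043, 3186]], device="cuda")  # illustrative ids

logits = generator.model.forward(tokens, start_pos=0, return_all_heads=True)
# logits: (batch_size, seq_len, n_future_tokens, vocab_size)
next_token = logits[:, -1, 0, :].argmax(dim=-1)     # head 0: standard next token
draft_tokens = logits[:, -1, 1:, :].argmax(dim=-1)  # extra heads: 2-4 tokens ahead
```

The extra heads are what enable the self-speculative decoding speedups reported in the paper; plain autoregressive sampling simply ignores them.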
## *Citation*
Gloeckle, F., Idrissi, B. Y., Rozière, B., Lopez-Paz, D., & Synnaeve, G. (2024). Better & faster large language models via multi-token prediction. arXiv preprint arXiv:2404.19737.
Bibtex entry:
```
@article{gloeckle2024better,
title={Better \& faster large language models via multi-token prediction},
author={Gloeckle, Fabian and Idrissi, Badr Youbi and Rozi{\`e}re, Baptiste and Lopez-Paz, David and Synnaeve, Gabriel},
journal={arXiv preprint arXiv:2404.19737},
year={2024}
}
```
## Feedback and comments
Please report risks as indicated in the Acceptable Use Policy, and send bug reports and any other comments to the corresponding authors indicated in the research paper.
|
Elriggs/rm | Elriggs | "2024-06-14T17:07:33Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-14T17:06:07Z" | Entry not found |
empower-dev/example-lora | empower-dev | "2024-06-14T17:07:58Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | "2024-06-14T17:06:20Z" | ---
license: apache-2.0
---
![image/png](https://cdn-uploads.huggingface.co/production/uploads/6424a49f12ba34f9894ab9b7/wXkYX_NXEFtpmBsQd6nIV.png)
## Example LoRA
|
HealTether-Healthcare/BioMistral-7B-lora-finetuned | HealTether-Healthcare | "2024-06-14T17:11:26Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"text-generation-inference",
"unsloth",
"mistral",
"trl",
"en",
"base_model:BioMistral/BioMistral-7B",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | "2024-06-14T17:10:28Z" | ---
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- mistral
- trl
base_model: BioMistral/BioMistral-7B
---
# Uploaded model
- **Developed by:** HealTether-Healthcare
- **License:** apache-2.0
- **Finetuned from model :** BioMistral/BioMistral-7B
This mistral model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
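A minimal inference sketch, assuming the repository loads through Unsloth's `FastLanguageModel` as in the standard Unsloth workflow; the sequence length, 4-bit flag, and prompt are illustrative:
```python
from unsloth import FastLanguageModel

# Illustrative settings; adjust to your hardware and use case.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="HealTether-Healthcare/BioMistral-7B-lora-finetuned",
    max_seq_length=2048,
    load_in_4bit=True,
)
FastLanguageModel.for_inference(model)  # switch to fast inference mode

inputs = tokenizer(
    "What are common symptoms of anemia?", return_tensors="pt"
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```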
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
amritpuhan/fine-tuned-roberta-large-swag-peft | amritpuhan | "2024-06-18T14:56:18Z" | 0 | 0 | peft | [
"peft",
"safetensors",
"generated_from_trainer",
"dataset:allenai/swag",
"base_model:FacebookAI/roberta-large",
"license:mit",
"region:us"
] | null | "2024-06-14T17:13:46Z" | ---
license: mit
library_name: peft
tags:
- generated_from_trainer
base_model: FacebookAI/roberta-large
metrics:
- accuracy
model-index:
- name: fine-tuned-roberta-large-swag-peft
results: []
datasets:
- allenai/swag
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fine-tuned-roberta-large-swag-peft
This model is a fine-tuned version of [FacebookAI/roberta-large](https://huggingface.co/FacebookAI/roberta-large) on the SWAG dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3473
- Accuracy: 0.8730
## Model description
More information needed
## Intended uses & limitations
More information needed
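A minimal inference sketch, assuming the classifier head was saved with the adapter and that the adapter loads on top of the base model via 🤗 PEFT; the context and candidate endings are illustrative:
```python
import torch
from peft import PeftModel
from transformers import AutoModelForMultipleChoice, AutoTokenizer

base = AutoModelForMultipleChoice.from_pretrained("FacebookAI/roberta-large")
model = PeftModel.from_pretrained(base, "amritpuhan/fine-tuned-roberta-large-swag-peft")
tokenizer = AutoTokenizer.from_pretrained("FacebookAI/roberta-large")

context = "She picks up the guitar and"
endings = ["starts to play a song.", "throws it into the ocean."]

# Multiple-choice models expect inputs of shape (batch, num_choices, seq_len).
enc = tokenizer([context] * len(endings), endings, return_tensors="pt", padding=True)
enc = {k: v.unsqueeze(0) for k, v in enc.items()}
with torch.no_grad():
    logits = model(**enc).logits  # shape (1, num_choices)
print("Predicted ending:", endings[logits.argmax(-1).item()])
```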
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1.5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.7002 | 1.0 | 4597 | 0.4130 | 0.8536 |
| 0.6414 | 2.0 | 9194 | 0.3676 | 0.8677 |
| 0.628 | 3.0 | 13791 | 0.3556 | 0.8703 |
| 0.6137 | 4.0 | 18388 | 0.3473 | 0.8730 |
### Framework versions
- PEFT 0.11.1
- Transformers 4.41.2
- Pytorch 2.3.1+cu121
- Datasets 2.19.2
- Tokenizers 0.19.1 |
Madnesss/ner-tune-v2_42 | Madnesss | "2024-06-14T17:16:27Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"camembert",
"token-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | "2024-06-14T17:15:46Z" | Entry not found |
V3N0M/Jenna-v01 | V3N0M | "2024-06-14T17:20:46Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"text-generation-inference",
"unsloth",
"mistral",
"trl",
"en",
"base_model:unsloth/phi-3-mini-4k-instruct-bnb-4bit",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | "2024-06-14T17:20:24Z" | ---
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- mistral
- trl
base_model: unsloth/phi-3-mini-4k-instruct-bnb-4bit
---
# Uploaded model
- **Developed by:** V3N0M
- **License:** apache-2.0
- **Finetuned from model :** unsloth/phi-3-mini-4k-instruct-bnb-4bit
This mistral model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
RicardoLee/WASSA2024_EmpathyDetection_Chinchunmei_EXP305 | RicardoLee | "2024-06-14T17:34:23Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"llama3",
"llama3-8b-instruct",
"WASSA",
"WASSA2024",
"Empathy Detection",
"Empathy Scoring",
"conversational",
"en",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | "2024-06-14T17:25:55Z" | ---
language:
- en
tags:
- llama3
- llama3-8b-instruct
- WASSA
- WASSA2024
- Empathy Detection
- Empathy Scoring
---
# WASSA2024 Tracks 1, 2, 3 LLM based on Llama3-8B-Instruct (Pure LoRA Training)
This model is for WASSA2024 Tracks 1, 2, and 3. It is fine-tuned on Llama3-8B-Instruct using standard prediction, role-play, and contrastive supervised fine-tuning templates. The learning rate for this model is 8e-5.
For training and usage details, please refer to the paper.
## Licence
This repository's models are open-sourced under the Apache-2.0 license, and their weight usage must adhere to the Llama3 [MODEL LICENCE](LICENSE).
|
ardauzunoglu/llama3-8b-dolly | ardauzunoglu | "2024-06-14T17:28:29Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"text-generation-inference",
"unsloth",
"llama",
"trl",
"en",
"base_model:meta-llama/Meta-Llama-3-8B-Instruct",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | "2024-06-14T17:28:22Z" | ---
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
base_model: meta-llama/Meta-Llama-3-8B-Instruct
---
# Uploaded model
- **Developed by:** ardauzunoglu
- **License:** apache-2.0
- **Finetuned from model :** meta-llama/Meta-Llama-3-8B-Instruct
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
marcossoaresgg/MinMillV3 | marcossoaresgg | "2024-06-14T17:32:13Z" | 0 | 0 | null | [
"license:openrail",
"region:us"
] | null | "2024-06-14T17:31:12Z" | ---
license: openrail
---
|
Dmenorsz/dudamp | Dmenorsz | "2024-06-14T17:33:03Z" | 0 | 0 | null | [
"license:openrail",
"region:us"
] | null | "2024-06-14T17:32:06Z" | ---
license: openrail
---
|
candrews1971/rl_course_vizdoom_health_gathering_supreme | candrews1971 | "2024-06-14T17:40:13Z" | 0 | 0 | sample-factory | [
"sample-factory",
"tensorboard",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | reinforcement-learning | "2024-06-14T17:40:03Z" | ---
library_name: sample-factory
tags:
- deep-reinforcement-learning
- reinforcement-learning
- sample-factory
model-index:
- name: APPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: doom_health_gathering_supreme
type: doom_health_gathering_supreme
metrics:
- type: mean_reward
value: 10.72 +/- 4.11
name: mean_reward
verified: false
---
An **APPO** model trained on the **doom_health_gathering_supreme** environment.
This model was trained using Sample-Factory 2.0: https://github.com/alex-petrenko/sample-factory.
Documentation for how to use Sample-Factory can be found at https://www.samplefactory.dev/
## Downloading the model
After installing Sample-Factory, download the model with:
```
python -m sample_factory.huggingface.load_from_hub -r candrews1971/rl_course_vizdoom_health_gathering_supreme
```
## Using the model
To run the model after download, use the `enjoy` script corresponding to this environment:
```
python -m sf_examples.vizdoom.enjoy_vizdoom --algo=APPO --env=doom_health_gathering_supreme --train_dir=./train_dir --experiment=rl_course_vizdoom_health_gathering_supreme
```
You can also upload models to the Hugging Face Hub using the same script with the `--push_to_hub` flag.
See https://www.samplefactory.dev/10-huggingface/huggingface/ for more details
## Training with this model
To continue training with this model, use the `train` script corresponding to this environment:
```
python -m sf_examples.vizdoom.train_vizdoom --algo=APPO --env=doom_health_gathering_supreme --train_dir=./train_dir --experiment=rl_course_vizdoom_health_gathering_supreme --restart_behavior=resume --train_for_env_steps=10000000000
```
Note: you may have to adjust `--train_for_env_steps` to a suitably high number, as the experiment will resume at the number of steps it concluded at.
|
newsletter/test3 | newsletter | "2024-06-14T19:33:38Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-14T17:44:15Z" | Entry not found |
mozewinka/nezzula-v1 | mozewinka | "2024-06-14T17:50:29Z" | 0 | 0 | null | [
"license:openrail",
"region:us"
] | null | "2024-06-14T17:48:12Z" | ---
license: openrail
---
|
vinhnado/bert-base-vietnamese-sentiment | vinhnado | "2024-06-14T17:53:30Z" | 0 | 0 | transformers | [
"transformers",
"pytorch",
"license:mit",
"endpoints_compatible",
"region:us"
] | null | "2024-06-14T17:50:59Z" | ---
license: mit
---
|
Crimvael/repo_id | Crimvael | "2024-06-14T17:54:28Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-14T17:54:28Z" | Entry not found |
c-eshih/human_stable_diffusion | c-eshih | "2024-06-18T10:55:18Z" | 0 | 0 | diffusers | [
"diffusers",
"tensorboard",
"safetensors",
"diffusers:StableDiffusionInstructPix2PixPipeline",
"region:us"
] | null | "2024-06-14T17:57:34Z" | Entry not found |
tctrautman/20240614-kibbe-training-base | tctrautman | "2024-06-14T17:58:09Z" | 0 | 0 | null | [
"safetensors",
"generated_from_trainer",
"base_model:HuggingFaceM4/idefics2-8b",
"license:apache-2.0",
"region:us"
] | null | "2024-06-14T17:58:06Z" | ---
license: apache-2.0
base_model: HuggingFaceM4/idefics2-8b
tags:
- generated_from_trainer
model-index:
- name: 20240614-kibbe-training-base
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/dubs/Kibbe-Training/runs/wgim8fep)
# 20240614-kibbe-training-base
This model is a fine-tuned version of [HuggingFaceM4/idefics2-8b](https://huggingface.co/HuggingFaceM4/idefics2-8b) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5552
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 0.5026 | 0.5 | 46 | 0.5573 |
| 0.553 | 1.0 | 92 | 0.5314 |
| 0.4367 | 1.5 | 138 | 0.5450 |
| 0.3159 | 2.0 | 184 | 0.5552 |
### Framework versions
- Transformers 4.42.0.dev0
- Pytorch 2.1.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
|
Hamid0060/Chatgg | Hamid0060 | "2024-06-14T17:59:18Z" | 0 | 0 | null | [
"license:agpl-3.0",
"region:us"
] | null | "2024-06-14T17:59:18Z" | ---
license: agpl-3.0
---
|
hawkeyedesi/code-llama-7b-text-to-sql | hawkeyedesi | "2024-06-14T19:30:49Z" | 0 | 0 | peft | [
"peft",
"tensorboard",
"safetensors",
"trl",
"sft",
"generated_from_trainer",
"dataset:generator",
"base_model:codellama/CodeLlama-7b-hf",
"license:llama2",
"region:us"
] | null | "2024-06-14T18:05:32Z" | ---
license: llama2
library_name: peft
tags:
- trl
- sft
- generated_from_trainer
base_model: codellama/CodeLlama-7b-hf
datasets:
- generator
model-index:
- name: code-llama-7b-text-to-sql
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# code-llama-7b-text-to-sql
This model is a fine-tuned version of [codellama/CodeLlama-7b-hf](https://huggingface.co/codellama/CodeLlama-7b-hf) on the generator dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
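A minimal generation sketch, assuming the adapter loads on top of the base model via 🤗 PEFT; the schema, question, and prompt format are illustrative rather than the exact training template:
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base = AutoModelForCausalLM.from_pretrained("codellama/CodeLlama-7b-hf")
model = PeftModel.from_pretrained(base, "hawkeyedesi/code-llama-7b-text-to-sql")
tokenizer = AutoTokenizer.from_pretrained("codellama/CodeLlama-7b-hf")

# Illustrative prompt; the training data likely used its own template.
prompt = (
    "-- Schema: CREATE TABLE users (id INT, name TEXT, age INT)\n"
    "-- Question: How many users are older than 30?\n"
    "SELECT"
)
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=48)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```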
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 3
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 6
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- lr_scheduler_warmup_ratio: 0.03
- num_epochs: 3
### Training results
### Framework versions
- PEFT 0.10.0
- Transformers 4.41.1
- Pytorch 2.3.0+cu121
- Datasets 2.15.0
- Tokenizers 0.19.1 |
gokaygokay/sd3-long-captioner-v2-ckpt | gokaygokay | "2024-06-15T06:28:35Z" | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | null | "2024-06-14T18:06:50Z" | ---
license: apache-2.0
---
|
ND911/EclecticEuphoria_Universal_SD3 | ND911 | "2024-07-02T21:42:06Z" | 0 | 0 | null | [
"art",
"region:us"
] | null | "2024-06-14T18:08:15Z" | ---
tags:
- art
---
## EclecticEuphoria Universal SD3_k4
Found a bug with the k3 version, so it had to be redone. The bug seems to happen on the first pass, but a second pass seems to resolve the deformity. k3 is left up since the bug doesn't happen all the time. k4 is therefore built off the k2 version, adding prompt adhesion, more data, and better contrast.
* ![](k4_first_pass.png)
## EclecticEuphoria Universal SD3_k3 - bugged
Still experimental: adds another model with a fresh dataset and stronger prompt adhesion. The bug seems to happen on the first pass, but a second pass seems to resolve the deformity. k3 is left up since the bug doesn't happen all the time.
* ![](k3up.png)
* ![](k3.png)
* ![](harley.png)
### Standalone upscale WF
* ![](upscale.png)
## EclecticEuphoria Universal SD3_k2
Purely experimental: once again the perturbed Python script was used to edit the original SD3 model (sd3_medium_incl_clips_t5xxlfp16.safetensors), which was then merged with EclecticEuphoria Universal SD3.
* ![](sd3P.png)
* ![](portrait.png)
## EclecticEuphoria Universal SD3
Purely experimental: a 15GB version was made, plus this one at around 7GB, from the models SD3 Medium and EclecticEuphoria Universal.
* ![](eessd3.png)
## EclecticEuphoria Universal SD3 Workflows
* ![](eesd3wf.png)
* ![](sd3tosdxl.png)
* ![](sd3_hunyan_pixart_sdxl_sd_WF.png)
[Civitai Gallery](https://civitai.com/collections/2115143) |
lenwhite/Furry_COCK_XL_V5 | lenwhite | "2024-06-14T18:08:26Z" | 0 | 0 | null | [
"license:creativeml-openrail-m",
"region:us"
] | null | "2024-06-14T18:08:26Z" | ---
license: creativeml-openrail-m
---
|
silent666/Qwen-Qwen1.5-1.8B-1718388853 | silent666 | "2024-06-14T18:14:14Z" | 0 | 0 | peft | [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:Qwen/Qwen1.5-1.8B",
"region:us"
] | null | "2024-06-14T18:14:13Z" | ---
library_name: peft
base_model: Qwen/Qwen1.5-1.8B
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.11.1 |
iremnurk/mitski | iremnurk | "2024-06-14T18:16:38Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-14T18:14:18Z" | Entry not found |
devil22/hi | devil22 | "2024-06-14T18:14:25Z" | 0 | 0 | null | [
"license:gpl",
"region:us"
] | null | "2024-06-14T18:14:25Z" | ---
license: gpl
---
|
Kiingjackoff/sd3_perturbed | Kiingjackoff | "2024-06-14T18:22:28Z" | 0 | 5 | null | [
"region:us"
] | null | "2024-06-14T18:16:34Z" | Entry not found |
disertatieNic/whisper-small-transcriere-vorbire-in-scris-adapters | disertatieNic | "2024-06-16T18:31:59Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | null | "2024-06-14T18:17:11Z" | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Cristian9481/random-forest-pipeline-model | Cristian9481 | "2024-06-14T18:33:44Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-14T18:19:18Z" | Entry not found |
research-dump/Meta-Llama-3-8B-Instruct_dpo_wiktemp | research-dump | "2024-06-14T18:22:51Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"trl",
"dpo",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"4-bit",
"bitsandbytes",
"region:us"
] | text-generation | "2024-06-14T18:19:31Z" | ---
library_name: transformers
tags:
- trl
- dpo
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
MrSneakyD/Chef_RVC | MrSneakyD | "2024-06-14T18:21:58Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-14T18:21:55Z" | Entry not found |
Ricardo54321/LunarLander-PPOCleanRL | Ricardo54321 | "2024-06-14T18:30:15Z" | 0 | 0 | null | [
"tensorboard",
"LunarLander-v2",
"ppo",
"deep-reinforcement-learning",
"reinforcement-learning",
"custom-implementation",
"deep-rl-course",
"model-index",
"region:us"
] | reinforcement-learning | "2024-06-14T18:25:36Z" | ---
tags:
- LunarLander-v2
- ppo
- deep-reinforcement-learning
- reinforcement-learning
- custom-implementation
- deep-rl-course
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: -195.05 +/- 94.92
name: mean_reward
verified: false
---
# PPO Agent Playing LunarLander-v2
This is a trained model of a PPO agent playing LunarLander-v2.
# Hyperparameters
```python
{'exp_name': 'ppo.py'
'seed': 1
'torch_deterministic': True
'cuda': True
'track': False
'wandb_project_name': 'cleanRL'
'wandb_entity': None
'capture_video': False
'env_id': 'LunarLander-v2'
'total_timesteps': 50000
'learning_rate': 0.00025
'num_envs': 4
'num_steps': 128
'anneal_lr': True
'gae': True
'gamma': 0.99
'gae_lambda': 0.95
'num_minibatches': 4
'update_epochs': 4
'norm_adv': True
'clip_coef': 0.2
'clip_vloss': True
'ent_coef': 0.01
'vf_coef': 0.5
'max_grad_norm': 0.5
'target_kl': None
'repo_id': 'Ricardo54321/LunarLander-PPOCleanRL'
'batch_size': 512
'minibatch_size': 128}
```
|
Lucca192/modelss | Lucca192 | "2024-06-14T21:28:10Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-14T18:27:21Z" | Entry not found |
adamo1139/Yi-1.5-34B-32K-rebased-1406 | adamo1139 | "2024-06-14T19:32:59Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | "2024-06-14T18:29:26Z" | ---
license: apache-2.0
---
Yi-1.5-34B-32K finetuned via SFT on adamo1139/uninstruct-v1-experimental-chatml. Then trained via ORPO on adamo1139/rawrr_v2-2_stage1.
It's an attempt to fix the synthetic SFT contamination of the original Yi-1.5-34B-32K.
Next up:
Cleaning and releasing the AEZAKMI v4 dataset.
Training this model on it, maybe adding some toxic-dpo-natural if needed. Releasing it. |
samad321kk/saman2 | samad321kk | "2024-06-14T18:35:54Z" | 0 | 0 | null | [
"license:openrail",
"region:us"
] | null | "2024-06-14T18:29:55Z" | ---
license: openrail
---
|
brookieisthatyou/Pyra | brookieisthatyou | "2024-06-14T18:54:37Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-14T18:31:48Z" | Entry not found |
medieval-data/gliner-latin | medieval-data | "2024-06-14T18:37:48Z" | 0 | 0 | null | [
"pytorch",
"license:apache-2.0",
"region:us"
] | null | "2024-06-14T18:35:05Z" | ---
license: apache-2.0
---
|
Arbi-Houssem/TunLangModel_test1.11 | Arbi-Houssem | "2024-06-14T18:41:53Z" | 0 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"whisper",
"automatic-speech-recognition",
"generated_from_trainer",
"ar",
"dataset:Arbi-Houssem/Tunisian_dataset_STT-TTS15s_filtred1.0",
"base_model:openai/whisper-small",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | "2024-06-14T18:40:02Z" | ---
language:
- ar
license: apache-2.0
base_model: openai/whisper-small
tags:
- generated_from_trainer
datasets:
- Arbi-Houssem/Tunisian_dataset_STT-TTS15s_filtred1.0
metrics:
- wer
model-index:
- name: Whisper Tunisien
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: Tunisian_dataset_STT-TTS15s_filtred1.0
type: Arbi-Houssem/Tunisian_dataset_STT-TTS15s_filtred1.0
args: 'config: ar, split: test'
metrics:
- name: Wer
type: wer
value: 102.70087778528021
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Whisper Tunisien
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the Tunisian_dataset_STT-TTS15s_filtred1.0 dataset.
It achieves the following results on the evaluation set:
- Loss: 6.8348
- Wer: 102.7009
## Model description
More information needed
## Intended uses & limitations
More information needed
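A minimal transcription sketch, assuming the checkpoint loads through the standard 🤗 `pipeline` API; the audio file path is illustrative:
```python
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="Arbi-Houssem/TunLangModel_test1.11",
)
# Path to a local Tunisian-Arabic audio clip is illustrative.
print(asr("sample_clip.wav")["text"])
```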
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 3000
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 1.1414 | 3.8760 | 500 | 4.1822 | 181.0263 |
| 0.4667 | 7.7519 | 1000 | 5.0049 | 108.1702 |
| 0.1821 | 11.6279 | 1500 | 5.5927 | 102.7684 |
| 0.068 | 15.5039 | 2000 | 6.2194 | 106.2120 |
| 0.011 | 19.3798 | 2500 | 6.3815 | 103.0385 |
| 0.0003 | 23.2558 | 3000 | 6.8348 | 102.7009 |
### Framework versions
- Transformers 4.41.2
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
|
Shobhank-iiitdwd/Generate_Medlable_llama2_7b | Shobhank-iiitdwd | "2024-06-14T18:44:56Z" | 0 | 0 | transformers | [
"transformers",
"pytorch",
"llama",
"text-generation",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | "2024-06-14T18:40:55Z" | Entry not found |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task280 | Lots-of-LoRAs | "2024-07-01T20:11:14Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:42:12Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task280
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA trained on task280_stereoset_classification_stereotype_type
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
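A minimal loading sketch, assuming the LoRA applies on top of the base model via 🤗 PEFT; the prompt and generation settings are illustrative:
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-Instruct-v0.2")
model = PeftModel.from_pretrained(
    base, "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task280"
)
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-Instruct-v0.2")

# Illustrative task280-style input (stereotype-type classification).
prompt = "Passage: Grandma knits all day. Classify the stereotype type."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```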
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task280_stereoset_classification_stereotype_type" sourced from https://github.com/allenai/natural-instructions
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task190 | Lots-of-LoRAs | "2024-07-01T20:11:11Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:42:15Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task190
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA trained on task190_snli_classification
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task190_snli_classification" sourced from https://github.com/allenai/natural-instructions
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task391 | Lots-of-LoRAs | "2024-07-01T20:11:07Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:42:17Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task391
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA trained on task391_causal_relationship
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task391_causal_relationship" sourced from https://github.com/allenai/natural-instructions
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task290 | Lots-of-LoRAs | "2024-07-01T20:11:03Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:42:19Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task290
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA trained on task290_tellmewhy_question_answerability
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task290_tellmewhy_question_answerability", sourced from https://github.com/allenai/natural-instructions.
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task1391 | Lots-of-LoRAs | "2024-07-01T20:10:59Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:42:22Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task1391
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
A LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2, trained on the Natural Instructions task task1391_winogrande_easy_answer_generation.
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
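No official snippet has been published for this adapter. As a minimal sketch, assuming this repository hosts a standard PEFT-format LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2 and that `transformers`, `peft`, and `accelerate` are installed, loading and running it might look like:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
adapter_id = "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task1391"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto", device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_id)  # attach the task-specific LoRA weights

# Mistral-Instruct checkpoints expect the [INST] ... [/INST] chat format; replace the
# placeholder with an input formatted like the task's Natural Instructions examples.
prompt = "[INST] <task1391_winogrande_easy_answer_generation input goes here> [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```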
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task1391_winogrande_easy_answer_generation", sourced from https://github.com/allenai/natural-instructions.
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task1342 | Lots-of-LoRAs | "2024-07-01T20:10:56Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:42:24Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task1342
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
A LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2, trained on the Natural Instructions task task1342_amazon_us_reviews_title.
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
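No official snippet has been published for this adapter. As a minimal sketch, assuming this repository hosts a standard PEFT-format LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2 and that `transformers`, `peft`, and `accelerate` are installed, loading and running it might look like:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
adapter_id = "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task1342"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto", device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_id)  # attach the task-specific LoRA weights

# Mistral-Instruct checkpoints expect the [INST] ... [/INST] chat format; replace the
# placeholder with an input formatted like the task's Natural Instructions examples.
prompt = "[INST] <task1342_amazon_us_reviews_title input goes here> [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```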
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task1342_amazon_us_reviews_title", sourced from https://github.com/allenai/natural-instructions.
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task442 | Lots-of-LoRAs | "2024-07-01T20:10:51Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:42:26Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task442
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
A LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2, trained on the Natural Instructions task task442_com_qa_paraphrase_question_generation.
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
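No official snippet has been published for this adapter. As a minimal sketch, assuming this repository hosts a standard PEFT-format LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2 and that `transformers`, `peft`, and `accelerate` are installed, loading and running it might look like:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
adapter_id = "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task442"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto", device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_id)  # attach the task-specific LoRA weights

# Mistral-Instruct checkpoints expect the [INST] ... [/INST] chat format; replace the
# placeholder with an input formatted like the task's Natural Instructions examples.
prompt = "[INST] <task442_com_qa_paraphrase_question_generation input goes here> [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```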
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task442_com_qa_paraphrase_question_generation", sourced from https://github.com/allenai/natural-instructions.
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task620 | Lots-of-LoRAs | "2024-07-01T20:10:48Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:42:29Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task620
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
A LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2, trained on the Natural Instructions task task620_ohsumed_medical_subject_headings_answer_generation.
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
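No official snippet has been published for this adapter. As a minimal sketch, assuming this repository hosts a standard PEFT-format LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2 and that `transformers`, `peft`, and `accelerate` are installed, loading and running it might look like:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
adapter_id = "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task620"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto", device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_id)  # attach the task-specific LoRA weights

# Mistral-Instruct checkpoints expect the [INST] ... [/INST] chat format; replace the
# placeholder with an input formatted like the task's Natural Instructions examples.
prompt = "[INST] <task620_ohsumed_medical_subject_headings_answer_generation input goes here> [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```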
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task620_ohsumed_medical_subject_headings_answer_generation", sourced from https://github.com/allenai/natural-instructions.
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task1598 | Lots-of-LoRAs | "2024-07-01T20:10:44Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:42:32Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task1598
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
A LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2, trained on the Natural Instructions task task1598_nyc_long_text_generation.
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
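No official snippet has been published for this adapter. As a minimal sketch, assuming this repository hosts a standard PEFT-format LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2 and that `transformers`, `peft`, and `accelerate` are installed, loading and running it might look like:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
adapter_id = "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task1598"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto", device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_id)  # attach the task-specific LoRA weights

# Mistral-Instruct checkpoints expect the [INST] ... [/INST] chat format; replace the
# placeholder with an input formatted like the task's Natural Instructions examples.
prompt = "[INST] <task1598_nyc_long_text_generation input goes here> [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```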
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task1598_nyc_long_text_generation", sourced from https://github.com/allenai/natural-instructions.
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task039 | Lots-of-LoRAs | "2024-07-01T20:10:40Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:42:35Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task039
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
A LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2, trained on the Natural Instructions task task039_qasc_find_overlapping_words.
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
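No official snippet has been published for this adapter. As a minimal sketch, assuming this repository hosts a standard PEFT-format LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2 and that `transformers`, `peft`, and `accelerate` are installed, loading and running it might look like:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
adapter_id = "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task039"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto", device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_id)  # attach the task-specific LoRA weights

# Mistral-Instruct checkpoints expect the [INST] ... [/INST] chat format; replace the
# placeholder with an input formatted like the task's Natural Instructions examples.
prompt = "[INST] <task039_qasc_find_overlapping_words input goes here> [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```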
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task039_qasc_find_overlapping_words", sourced from https://github.com/allenai/natural-instructions.
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task769 | Lots-of-LoRAs | "2024-07-01T20:10:37Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:42:37Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task769
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
A LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2, trained on the Natural Instructions task task769_qed_summarization.
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
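No official snippet has been published for this adapter. As a minimal sketch, assuming this repository hosts a standard PEFT-format LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2 and that `transformers`, `peft`, and `accelerate` are installed, loading and running it might look like:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
adapter_id = "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task769"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto", device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_id)  # attach the task-specific LoRA weights

# Mistral-Instruct checkpoints expect the [INST] ... [/INST] chat format; replace the
# placeholder with an input formatted like the task's Natural Instructions examples.
prompt = "[INST] <task769_qed_summarization input goes here> [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```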
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task769_qed_summarization", sourced from https://github.com/allenai/natural-instructions.
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task1448 | Lots-of-LoRAs | "2024-07-01T20:10:33Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:42:40Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task1448
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
A LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2, trained on the Natural Instructions task task1448_disease_entity_extraction_ncbi_dataset.
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
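No official snippet has been published for this adapter. As a minimal sketch, assuming this repository hosts a standard PEFT-format LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2 and that `transformers`, `peft`, and `accelerate` are installed, loading and running it might look like:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
adapter_id = "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task1448"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto", device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_id)  # attach the task-specific LoRA weights

# Mistral-Instruct checkpoints expect the [INST] ... [/INST] chat format; replace the
# placeholder with an input formatted like the task's Natural Instructions examples.
prompt = "[INST] <task1448_disease_entity_extraction_ncbi_dataset input goes here> [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```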
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task1448_disease_entity_extraction_ncbi_dataset", sourced from https://github.com/allenai/natural-instructions.
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task247 | Lots-of-LoRAs | "2024-07-01T20:10:29Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:42:42Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task247
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
A LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2, trained on the Natural Instructions task task247_dream_answer_generation.
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
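No official snippet has been published for this adapter. As a minimal sketch, assuming this repository hosts a standard PEFT-format LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2 and that `transformers`, `peft`, and `accelerate` are installed, loading and running it might look like:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
adapter_id = "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task247"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto", device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_id)  # attach the task-specific LoRA weights

# Mistral-Instruct checkpoints expect the [INST] ... [/INST] chat format; replace the
# placeholder with an input formatted like the task's Natural Instructions examples.
prompt = "[INST] <task247_dream_answer_generation input goes here> [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```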
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task247_dream_answer_generation", sourced from https://github.com/allenai/natural-instructions.
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task513 | Lots-of-LoRAs | "2024-07-01T20:10:26Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:42:45Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task513
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
A LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2, trained on the Natural Instructions task task513_argument_stance_classification.
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
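No official snippet has been published for this adapter. As a minimal sketch, assuming this repository hosts a standard PEFT-format LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2 and that `transformers`, `peft`, and `accelerate` are installed, loading and running it might look like:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
adapter_id = "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task513"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto", device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_id)  # attach the task-specific LoRA weights

# Mistral-Instruct checkpoints expect the [INST] ... [/INST] chat format; replace the
# placeholder with an input formatted like the task's Natural Instructions examples.
prompt = "[INST] <task513_argument_stance_classification input goes here> [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```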
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task513_argument_stance_classification", sourced from https://github.com/allenai/natural-instructions.
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task875 | Lots-of-LoRAs | "2024-07-01T20:10:22Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:42:48Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task875
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2, trained on the Natural Instructions task task875_emotion_classification.
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
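The card provides no snippet here, so the following is a minimal sketch under two assumptions: the repo stores PEFT-format LoRA weights, and the prompt is only an illustrative stand-in for the task's real instruction format.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)
# Attach the LoRA weights (assumes this repo stores a PEFT-format adapter).
model = PeftModel.from_pretrained(model, "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task875")

# Illustrative prompt; [INST] ... [/INST] is Mistral-Instruct's chat wrapper.
prompt = "[INST] Classify the emotion in: 'I finally got the job!' [/INST]"
inputs = tokenizer(prompt, return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```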
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task875_emotion_classification" sourced from https://github.com/allenai/natural-instructions
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task515 | Lots-of-LoRAs | "2024-07-01T20:10:18Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:42:50Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task515
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2, trained on the Natural Instructions task task515_senteval_odd_word_out.
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
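No official snippet is given; a minimal loading sketch follows. It assumes the repo stores PEFT-format LoRA weights, and the prompt is illustrative rather than the task's actual instruction wording.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)
# Attach the LoRA weights (assumes a PEFT-format adapter in this repo).
model = PeftModel.from_pretrained(model, "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task515")

# Illustrative odd-word-out prompt in Mistral-Instruct's [INST] wrapper.
prompt = "[INST] Which word does not belong: apple, banana, carrot, guitar? [/INST]"
inputs = tokenizer(prompt, return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```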
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task515_senteval_odd_word_out" sourced from https://github.com/allenai/natural-instructions
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task627 | Lots-of-LoRAs | "2024-07-01T20:10:15Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:42:53Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task627
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2, trained on the Natural Instructions task task627_xlwic_word_with_same_meaning_sentence_generation.
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
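In place of the missing snippet, here is a minimal, hedged sketch: it assumes PEFT-format LoRA weights in this repo, and the prompt below is an illustrative stand-in for the task's instruction format.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)
# Attach the LoRA weights (assumes a PEFT-format adapter in this repo).
model = PeftModel.from_pretrained(model, "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task627")

# Illustrative word-in-context prompt in Mistral-Instruct's [INST] wrapper.
prompt = "[INST] Write a sentence using 'bank' with the same meaning as in: 'We sat on the river bank.' [/INST]"
inputs = tokenizer(prompt, return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=48)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```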
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task627_xlwic_word_with_same_meaning_sentence_generation" sourced from https://github.com/allenai/natural-instructions
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task1534 | Lots-of-LoRAs | "2024-07-01T20:10:11Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:42:56Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task1534
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2, trained on the Natural Instructions task task1534_daily_dialog_question_classification.
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
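This section is empty in the card; the sketch below shows one plausible way to load the adapter, assuming PEFT-format LoRA weights. The prompt is illustrative only and may not match the task's real instruction or label set.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)
# Attach the LoRA weights (assumes a PEFT-format adapter in this repo).
model = PeftModel.from_pretrained(model, "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task1534")

# Illustrative dialog-classification prompt in Mistral-Instruct's [INST] wrapper.
prompt = "[INST] Is this dialog turn a question? 'Would you like some coffee?' [/INST]"
inputs = tokenizer(prompt, return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```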
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task1534_daily_dialog_question_classification" sourced from https://github.com/allenai/natural-instructions
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task1551 | Lots-of-LoRAs | "2024-07-01T20:10:07Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:42:59Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task1551
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2, trained on the Natural Instructions task task1551_every_ith_element_from_kth_element.
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
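No snippet is provided; below is a minimal sketch under the same assumptions as the sibling cards in this collection: PEFT-format LoRA weights in the repo, and an illustrative prompt rather than the task's official instruction.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)
# Attach the LoRA weights (assumes a PEFT-format adapter in this repo).
model = PeftModel.from_pretrained(model, "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task1551")

# Illustrative list-manipulation prompt in Mistral-Instruct's [INST] wrapper.
prompt = "[INST] From the list [1, 2, 3, 4, 5, 6, 7], return every 2nd element starting from the 3rd element. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```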
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task1551_every_ith_element_from_kth_element" sourced from https://github.com/allenai/natural-instructions
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task583 | Lots-of-LoRAs | "2024-07-01T20:10:03Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:43:02Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task583
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2, trained on the Natural Instructions task task583_udeps_eng_coarse_pos_tagging.
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
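The card leaves this blank; the following minimal sketch assumes PEFT-format LoRA weights in this repo and uses an illustrative POS-tagging prompt (not the task's official instruction format).

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)
# Attach the LoRA weights (assumes a PEFT-format adapter in this repo).
model = PeftModel.from_pretrained(model, "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task583")

# Illustrative coarse POS-tagging prompt in Mistral-Instruct's [INST] wrapper.
prompt = "[INST] What is the coarse part-of-speech tag of 'quickly' in: 'She ran quickly home.'? [/INST]"
inputs = tokenizer(prompt, return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```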
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task583_udeps_eng_coarse_pos_tagging" sourced from https://github.com/allenai/natural-instructions
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task1431 | Lots-of-LoRAs | "2024-07-01T20:09:59Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:43:05Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task1431
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2, trained on the Natural Instructions task task1431_head_qa_answer_generation.
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
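No code is given in the card; here is a minimal, hedged sketch. It assumes PEFT-format LoRA weights, and the exam-style prompt is a stand-in invented for illustration, not drawn from the HEAD-QA task itself.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)
# Attach the LoRA weights (assumes a PEFT-format adapter in this repo).
model = PeftModel.from_pretrained(model, "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task1431")

# Illustrative exam-question prompt in Mistral-Instruct's [INST] wrapper.
prompt = "[INST] Answer the following exam question: Which organ produces insulin? [/INST]"
inputs = tokenizer(prompt, return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```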
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task1431_head_qa_answer_generation" sourced from https://github.com/allenai/natural-instructions
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task270 | Lots-of-LoRAs | "2024-07-01T20:09:55Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:43:09Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task270
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2, trained on the Natural Instructions task task270_csrg_counterfactual_context_generation.
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
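This section is unfilled; the sketch below shows a plausible loading path, assuming PEFT-format LoRA weights. The counterfactual prompt is a rough illustration and may not mirror the task's real input structure.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)
# Attach the LoRA weights (assumes a PEFT-format adapter in this repo).
model = PeftModel.from_pretrained(model, "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task270")

# Illustrative counterfactual-context prompt in Mistral-Instruct's [INST] wrapper.
prompt = "[INST] Premise: 'Tom packed for a beach trip.' Ending: 'He spent the week skiing.' Rewrite the middle of the story so the ending fits. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```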
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task270_csrg_counterfactual_context_generation" sourced from https://github.com/allenai/natural-instructions
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task1487 | Lots-of-LoRAs | "2024-07-01T20:09:51Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:43:11Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task1487
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2, trained on the Natural Instructions task task1487_organism_substance_extraction_anem_dataset.
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
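The card provides no snippet; below is a minimal sketch assuming PEFT-format LoRA weights. The extraction prompt is an invented illustration, not the AnEM task's official wording.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)
# Attach the LoRA weights (assumes a PEFT-format adapter in this repo).
model = PeftModel.from_pretrained(model, "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task1487")

# Illustrative substance-extraction prompt in Mistral-Instruct's [INST] wrapper.
prompt = "[INST] List the organism substances mentioned in: 'Blood and urine samples were collected from the patients.' [/INST]"
inputs = tokenizer(prompt, return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```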
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task1487_organism_substance_extraction_anem_dataset" sourced from https://github.com/allenai/natural-instructions
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task679 | Lots-of-LoRAs | "2024-07-01T20:09:48Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:43:14Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task679
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2, trained on the Natural Instructions task task679_hope_edi_english_text_classification.
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
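No snippet is included; here is a minimal loading sketch under the usual assumptions for this collection: PEFT-format LoRA weights in the repo, with an illustrative prompt rather than the task's official instruction or label set.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)
# Attach the LoRA weights (assumes a PEFT-format adapter in this repo).
model = PeftModel.from_pretrained(model, "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task679")

# Illustrative hope-speech classification prompt in Mistral-Instruct's [INST] wrapper.
prompt = "[INST] Is this comment hope speech? 'You can do this, don't give up!' [/INST]"
inputs = tokenizer(prompt, return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```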
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task679_hope_edi_english_text_classification" sourced from https://github.com/allenai/natural-instructions
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task456 | Lots-of-LoRAs | "2024-07-01T20:09:44Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:43:23Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task456
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA adapter for mistralai/Mistral-7B-Instruct-v0.2, trained on the Natural Instructions task task456_matres_intention_classification.
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
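Official usage code is still marked as needed above; in the meantime, here is a minimal sketch of attaching this LoRA adapter to the base model with 🤗 PEFT. The adapter repo id comes from this card's title; the chat-style prompt is purely illustrative, since the exact instruction format used during training is not documented here.

```python
# Minimal sketch (not the authors' official code): attach the task456 LoRA
# adapter to Mistral-7B-Instruct-v0.2. Assumes the adapter repo follows the
# standard PEFT layout; the example prompt format is an assumption.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
adapter_id = "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task456"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(base, adapter_id)
model.eval()

# Hypothetical prompt; MATRES intention classification expects a
# verb-in-context query, but the exact phrasing here is made up.
messages = [{"role": "user", "content": "Classify the intention of the highlighted verb: ..."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
with torch.no_grad():
    output = model.generate(input_ids, max_new_tokens=16)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

If the adapter is only ever used for this one task, `model.merge_and_unload()` can fold the LoRA weights into the base model for faster inference.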
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task456_matres_intention_classification" sourced from https://github.com/allenai/natural-instructions
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task385 | Lots-of-LoRAs | "2024-07-01T20:09:40Z" | 0 | 0 | pytorch | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"license:mit",
"region:us"
] | null | "2024-06-14T18:43:26Z" | ---
language: en
license: mit
library_name: pytorch
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task385
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA trained on task385_socialiqa_incorrect_answer_generation
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin, and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
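No official snippet is provided yet; the sketch below shows one way to load this adapter on top of the base model with 🤗 PEFT. The adapter repo id is taken from this card's title, while the prompt is a placeholder — the training-time instruction format is not documented here.

```python
# Minimal sketch (assumed standard PEFT layout): load the task385 LoRA
# adapter onto Mistral-7B-Instruct-v0.2 and run one illustrative query.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
adapter_id = "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task385"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(base, adapter_id).eval()

# Hypothetical prompt; the task asks for an *incorrect* answer to a
# SocialIQA-style question, but this exact phrasing is an assumption.
messages = [{"role": "user", "content": "Give an incorrect answer to: ..."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
with torch.no_grad():
    output = model.generate(input_ids, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```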
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
"task385_socialiqa_incorrect_answer_generation" sourced from https://github.com/allenai/natural-instructions
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |