Spaces: Runtime error
Update app.py
app.py
CHANGED
```diff
@@ -12,17 +12,6 @@ MAX_MAX_NEW_TOKENS = 4096
 DEFAULT_MAX_NEW_TOKENS = 1024
 MAX_INPUT_TOKEN_LENGTH = 4000
 
-DESCRIPTION = """
-# Code Llama 13B Chat
-
-This Space demonstrates model [CodeLlama-13b-Instruct](https://huggingface.co/codellama/CodeLlama-13b-Instruct-hf) by Meta, a Code Llama model with 13B parameters fine-tuned for chat instructions and specialized on code tasks. Feel free to play with it, or duplicate to run generations without a queue! If you want to run your own service, you can also [deploy the model on Inference Endpoints](https://huggingface.co/inference-endpoints).
-
-For more details about the Code Llama family of models and how to use them with `transformers`, take a look [at our blog post](https://huggingface.co/blog/codellama) or [the paper](https://huggingface.co/papers/2308.12950).
-
-Check out our [Playground](https://huggingface.co/spaces/codellama/codellama-playground) for a super-fast code completion demo that leverages a streaming [inference endpoint](https://huggingface.co/inference-endpoints).
-
-"""
-
 title = """# WizardCoder 34B & ChatGPT
 """
 LICENSE = """
```
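The constants kept by this commit (`MAX_MAX_NEW_TOKENS`, `DEFAULT_MAX_NEW_TOKENS`, `MAX_INPUT_TOKEN_LENGTH`) are the usual token-budget knobs in a Gradio text-generation Space. The helpers below are a minimal sketch of how such constants are commonly applied; `clamp_max_new_tokens` and `truncate_input_ids` are illustrative assumptions, not code from this Space's `app.py`.

```python
# Token-budget constants as they appear in the diff above.
MAX_MAX_NEW_TOKENS = 4096       # hard upper bound for the generation budget
DEFAULT_MAX_NEW_TOKENS = 1024   # default generation budget
MAX_INPUT_TOKEN_LENGTH = 4000   # maximum number of prompt tokens kept


def clamp_max_new_tokens(requested: int) -> int:
    """Keep a user-requested generation budget within the allowed range.

    Hypothetical helper: clamps to [1, MAX_MAX_NEW_TOKENS].
    """
    return max(1, min(requested, MAX_MAX_NEW_TOKENS))


def truncate_input_ids(input_ids: list[int]) -> list[int]:
    """Drop the oldest prompt tokens so the input fits the context budget.

    Hypothetical helper: keeps only the last MAX_INPUT_TOKEN_LENGTH tokens,
    mirroring the common pattern of truncating chat history from the front.
    """
    return input_ids[-MAX_INPUT_TOKEN_LENGTH:]


print(clamp_max_new_tokens(10_000))                 # capped to 4096
print(len(truncate_input_ids(list(range(5000)))))   # truncated to 4000
```

Truncating from the front keeps the most recent turns of a chat history, which is the behavior users generally expect when a long conversation exceeds the model's context window.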