needle-in-a-jungle project!
README.md
CHANGED
@@ -16,7 +16,18 @@ language:
 From the [original model card](https://huggingface.co/gorilla-llm/gorilla-openfunctions-v0):
 > Gorilla OpenFunctions extends Large Language Model(LLM) Chat Completion feature to formulate executable APIs call given natural language instructions and API context.
 
-##
+## NEW: try this model for Information Extraction
+
+🧪🦍 Needle in a Jungle - Information Extraction via LLMs
+
+I did an experiment: a notebook that extracts information from a given URL (or text) based on a user-provided structure.
+
+Stack: [Haystack LLM framework](https://haystack.deepset.ai/) + Gorilla OpenFunctions
+
+- [Post full of details](https://t.ly/8QBWs)
+- [Notebook](https://github.com/anakin87/notebooks/blob/main/information_extraction_via_llms.ipynb)
+
+## General Usage
 This version of the model is meant primarily to run smoothly on **Colab**.
 I suggest loading the model with **8-bit quantization**, so that you have some free GPU to perform inference.
 
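As a starting point, here is a minimal sketch of the 8-bit loading suggested above, assuming a standard `transformers` + `bitsandbytes` setup on Colab; the repo id below is a placeholder, not necessarily this model's exact id:

```python
# Minimal sketch: load the model in 8-bit so some GPU memory is left for inference.
# Assumptions: `transformers`, `accelerate` and `bitsandbytes` are installed (e.g. via pip on Colab),
# and `model_id` is replaced with this repository's actual id.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "<this-repo-id>"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),  # 8-bit weights via bitsandbytes
    device_map="auto",  # let accelerate place the layers on the available GPU
)
```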
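And a rough sketch of the function-calling pattern behind the information-extraction experiment, assuming the `<<question>> ... <<function>>` prompt layout from the original OpenFunctions model card; the extraction schema and example text are made up for illustration, and the linked notebook (built on Haystack) remains the reference:

```python
# Rough sketch of information extraction via function calling; not the notebook's actual code.
# Assumptions: `model` and `tokenizer` are the objects loaded above, and the prompt layout
# follows the <<question>> / <<function>> format from the original OpenFunctions card.
import json

# The "user-provided structure": a function whose parameters are the fields to extract (hypothetical example).
extraction_function = {
    "name": "extract_animal_facts",
    "description": "Extract structured facts about an animal from the text.",
    "parameters": {
        "type": "object",
        "properties": {
            "name": {"type": "string"},
            "habitat": {"type": "string"},
            "diet": {"type": "string"},
        },
    },
}

text = "Gorillas are herbivorous great apes that live in the tropical forests of equatorial Africa."
query = f"Extract facts about the animal mentioned in this text: {text}"
prompt = f"USER: <<question>> {query} <<function>> {json.dumps([extraction_function])}\nASSISTANT: "

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
# The model is expected to answer with a call such as:
# extract_animal_facts(name="gorilla", habitat="tropical forests of equatorial Africa", diet="herbivorous")
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```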