---
license: apache-2.0
---

🚀 Try it out on [Colab](https://colab.research.google.com/drive/16M5J2H9F8YQora_W2PDnp120slZH-Mqd?usp=sharing)
📣 Read more in our [OpenFunctions blog release](https://gorilla.cs.berkeley.edu/blogs/4_open_functions.html)

## Introduction
Gorilla OpenFunctions extends the Large Language Model (LLM) chat completion feature to formulate
executable API calls from natural language instructions and API context.

## Models Available
|model | functionality|
|---|---|
|gorilla-openfunc-v0 | Given a function and user intent, returns properly formatted JSON with the right arguments|
|gorilla-openfunc-v1 | All of v0, plus parallel functions and the ability to choose between functions|

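The v1 row means a single request can carry several function specs, and the model picks the one that matches the user's intent (or calls several in parallel). A minimal sketch of such a spec list, in the same dict format the Example Usage below uses — the `weather.get` entry here is made up for illustration, not from the Gorilla docs:

```python
# Two function specs in one request; gorilla-openfunc-v1 chooses between them
# based on the user's prompt. uber.ride comes from the Example Usage below;
# weather.get is a hypothetical second API for illustration.
functions = [
    {
        "name": "Uber Ride",
        "api_name": "uber.ride",
        "description": "Find a suitable ride given location, ride type, and wait time",
        "parameters": [{"name": "loc", "description": "pickup location"}],
    },
    {
        "name": "Weather",
        "api_name": "weather.get",
        "description": "Get the current weather for a city",
        "parameters": [{"name": "city", "description": "city name"}],
    },
]

# The specs are plain dicts, so they can be sanity-checked locally before sending
assert all({"name", "api_name", "description", "parameters"} <= set(f) for f in functions)
```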
## Example Usage

1. OpenFunctions is compatible with OpenAI Functions

```bash
!pip install openai==0.28.1
```

2. Point to the Gorilla-hosted server

```python
import openai

def get_gorilla_response(prompt="Call me an Uber ride type \"Plus\" in Berkeley at zipcode 94704 in 10 minutes", model="gorilla-openfunc-v1", functions=[]):
    openai.api_key = "EMPTY"  # no API key is needed for the hosted endpoint
    openai.api_base = "http://luigi.millennium.berkeley.edu:8000/v1"
    try:
        completion = openai.ChatCompletion.create(
            model=model,  # use the model passed in rather than a hard-coded name
            temperature=0.0,
            messages=[{"role": "user", "content": prompt}],
            functions=functions,
        )
        return completion.choices[0].message.content
    except Exception as e:
        print(e, model, prompt)
```

3. Pass the user query and the set of functions; Gorilla OpenFunctions returns a fully formatted JSON

```python
query = "Call me an Uber ride type \"Plus\" in Berkeley at zipcode 94704 in 10 minutes"
functions = [
    {
        "name": "Uber Carpool",
        "api_name": "uber.ride",
        "description": "Find suitable ride for customers given the location, type of ride, and the amount of time the customer is willing to wait as parameters",
        "parameters": [
            {"name": "loc", "description": "location of the starting place of the uber ride"},
            {"name": "type", "enum": ["plus", "comfort", "black"], "description": "types of uber ride user is ordering"},
            {"name": "time", "description": "the amount of time in minutes the customer is willing to wait"},
        ],
    }
]
get_gorilla_response(query, functions=functions)
```

4. Expected output

```bash
uber.ride(loc="berkeley", type="plus", time=10)
```
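The model returns a plain call string rather than executing anything, so you typically need to turn that string into a function name and arguments yourself before dispatching to a real API. A minimal sketch, assuming a single call with keyword arguments only — `parse_call` is a hypothetical helper, not part of Gorilla:

```python
import ast

def parse_call(call_str):
    """Parse a call string like 'uber.ride(loc="berkeley", time=10)' into
    (dotted function name, keyword-argument dict)."""
    node = ast.parse(call_str, mode="eval").body  # an ast.Call node
    # Walk the Attribute chain to reassemble the dotted name, e.g. uber.ride
    parts, target = [], node.func
    while isinstance(target, ast.Attribute):
        parts.append(target.attr)
        target = target.value
    parts.append(target.id)
    name = ".".join(reversed(parts))
    # literal_eval keeps this safe: only constants, no arbitrary code execution
    kwargs = {kw.arg: ast.literal_eval(kw.value) for kw in node.keywords}
    return name, kwargs

name, kwargs = parse_call('uber.ride(loc="berkeley", type="plus", time=10)')
# name   -> "uber.ride"
# kwargs -> {"loc": "berkeley", "type": "plus", "time": 10}
```

Using `ast` rather than `eval` means the model's output is never executed directly; you can look the name up in your own dispatch table and decide what to run.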

## Contributing

All the models, and the data used to train them, are released under Apache 2.0.
Gorilla is an open-source effort from UC Berkeley, and we welcome contributors.
Please email us your comments, criticisms, and questions. More information about the project can be found at [https://gorilla.cs.berkeley.edu/](https://gorilla.cs.berkeley.edu/)