loubnabnl (HF staff) committed
Commit cb371e9
1 Parent(s): 4fa0d0f
Files changed (1)
  1. architectures/incoder.txt +23 -1
architectures/incoder.txt CHANGED
@@ -11,4 +11,26 @@ During the training of InCoder, spans of code were randomly masked and moved to
 
 So in addition to program synthesis (via left-to-right generation), InCoder can also perform editing (via infilling). The model gives promising results in some zero-shot code infilling tasks such as type prediction, variable renaming, and comment generation.
 
- In the code generation demo we use InCoder 1.3B.
+ In the code generation demo we use InCoder 1.3B.
+
+ You can load the model and tokenizer directly with the `transformers` library:
+
+ ```python
+ from transformers import AutoTokenizer, AutoModelForCausalLM
+
+ tokenizer = AutoTokenizer.from_pretrained("facebook/incoder-6B")
+ model = AutoModelForCausalLM.from_pretrained("facebook/incoder-6B")
+
+ inputs = tokenizer("def hello_world():", return_tensors="pt")
+ outputs = model.generate(**inputs, max_new_tokens=64)  # complete the prompt left to right
+ print(tokenizer.decode(outputs[0], skip_special_tokens=True))
+ ```
+
+ Or you can use a `pipeline`:
+
+ ```python
+ from transformers import pipeline
+
+ pipe = pipeline("text-generation", model="facebook/incoder-6B")
+ outputs = pipe("def hello_world():")  # list of dicts with a 'generated_text' key
+ ```
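
The infilling use mentioned earlier in the file is not covered by the snippets in this commit. The sketch below is illustrative only, not part of the demo: it assumes the sentinel-token prompt format from the InCoder paper and its reference implementation (`<|mask:k|>` markers, with infills terminated by `<|endofmask|>`), and the example function and generation settings are made up; check the facebook/incoder-6B model card for the exact format.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("facebook/incoder-6B")
model = AutoModelForCausalLM.from_pretrained("facebook/incoder-6B")

# Infill a docstring (comment generation): the missing span is marked with a
# sentinel, and the same sentinel is appended at the end so the model writes
# the span there, mirroring how masked spans were moved during training.
prefix = 'def count_words(filename):\n    """'
suffix = '"""\n    with open(filename) as f:\n        return len(f.read().split())\n'
prompt = prefix + "<|mask:0|>" + suffix + "<|mask:1|>" + "<|mask:0|>"

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
generated = tokenizer.decode(outputs[0])

# The infilled text follows the final <|mask:0|> in `generated`; in practice it
# is truncated at the first <|endofmask|> to recover just the docstring.
```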