Olivier Dehaene (olivierdehaene)
olivierdehaene's community activity
Add hf endpoint handler.py
#24 opened 4 days ago by olivierdehaene

Add hf endpoint handler.py
#30 opened 4 days ago by olivierdehaene

Add hf endpoint handler.py
#20 opened 4 days ago by olivierdehaene

Add hf endpoint handler
#42 opened 4 days ago by olivierdehaene

Type of hardware for inference
3 · #4 opened 15 days ago by jilijeanlouis

disable inference API
5 · #43 opened 23 days ago by olivierdehaene

Add Inference API support
#1 opened about 1 month ago by olivierdehaene

How to run that?
21 · #2 opened about 2 months ago by Guilherme34

How can I run this?
12 · #2 opened 3 months ago by Matan1905

Add latest OA Pythia
#13 opened about 2 months ago by OllieStanley

flan-ul2 returns blank
#12 opened about 2 months ago by i-am-neo

improve dolly serving time
5 · #7 opened about 2 months ago by xy-covey

Activate inference API
#26 opened 2 months ago by olivierdehaene

Required API?
1 · #7 opened 3 months ago by Dharan

How can I run the model with multiple GPUs?
1 · #3 opened 3 months ago by eeyrw

feat/openchat
#5 opened 3 months ago by olivierdehaene

Remove 1000px width of app
#4 opened 3 months ago by aliabid94

424 response when changing prompt from default
1 · #207 opened 3 months ago by imwide

only 424 responses
2 · #206 opened 3 months ago by imwide

Hosted inference API giving 401
6 · #211 opened 3 months ago by akat21

Bloom keeps giving me Response [424].
13 · #208 opened 3 months ago by nicholasKluge

No longer available, why?
11 · #21 opened 6 months ago by micole66

activate inference widget
1 · #37 opened 3 months ago by olivierdehaene

Model is overloaded
1 · #202 opened 3 months ago by olivierdehaene

feat/text_generation_inference
#1 opened 4 months ago by olivierdehaene

Adding `safetensors` variant of this model
1 · #26 opened 4 months ago by olivierdehaene

Repetition penalty in detailed parameters.
5 · #184 opened 4 months ago by imwide

Adding `safetensors` variant of this model
#2 opened 4 months ago by olivierdehaene

try_to_load_from_cache_revision_test
3 · #1 opened 4 months ago by olivierdehaene

Adding `safetensors` variant of this model
#13 opened 4 months ago by olivierdehaene

Adding `safetensors` variant of this model
#3 opened 4 months ago by olivierdehaene

"Model is overloaded, please wait for a bit"
15
#70 opened 10 months ago
by
jmarxza
Getting log probabilities from the Inference API?
12
#89 opened 10 months ago
by
Brendan

Getting model overloaded the whole day
4 · #154 opened 6 months ago by NigelTheMaker

Questions on safetensors and text-generation-inference server
17 · #18 opened 7 months ago by pai4451

Adding `safetensors` variant of this model
#4 opened 7 months ago by olivierdehaene

fix: Fix convert_multi method
1 · #3 opened 7 months ago by olivierdehaene

Change seed in inference API
4 · #131 opened 8 months ago by imwide

Add `feature_extractor_type`
3 · #1 opened 9 months ago by olivierdehaene
