imperialwool committed
Commit e476745
Parent(s): 6c6b28d

Update app.py

Files changed (1): app.py +2 -2
app.py CHANGED
@@ -25,8 +25,8 @@ async def echo():
 async def get():
     return '''<style>a:visited{color:black;}</style>
     <h1>Hello, world!</h1>
-    This is showcase how to make own server with OpenBuddy's model.<br>
-    I'm using here 3b model just for example. Also here's only CPU power.<br>
+    This is showcase how to make own server with Llama2 model.<br>
+    I'm using here 7b model just for example. Also here's only CPU power.<br>
     But you can use GPU power as well!<br>
     <h1>How to GPU?</h1>
     Change <code>`CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS`</code> in Dockerfile on <code>`CMAKE_ARGS="-DLLAMA_CUBLAS=on"`</code>. Also you can try <code>`DLLAMA_CLBLAST`</code>, <code>`DLLAMA_METAL`</code> or <code>`DLLAMA_METAL`</code>.<br>
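The GPU switch described in the diffed text can be sketched as a Dockerfile/shell change. This is a minimal sketch, assuming the project installs llama-cpp-python via pip; the exact install line is not part of this commit:

```shell
# Sketch only -- the actual Dockerfile line is not shown in this commit.
# CPU build with OpenBLAS (what the Space uses):
CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" pip install llama-cpp-python

# GPU build with cuBLAS instead (requires a CUDA toolchain in the image):
CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install llama-cpp-python
```

Other backends (CLBlast, Metal) follow the same pattern: swap the `CMAKE_ARGS` value before the pip install so CMake compiles llama.cpp against the chosen acceleration library.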