Merging Bugfix & all Previous Commits
- public/about.md +44 -0
- requirements.txt +1 -0
public/about.md
ADDED
@@ -0,0 +1,44 @@
# About

This is a non-commercial research project, part of a bachelor's thesis on the topic **"Building an Interpretable Natural Language AI Tool based on Transformer Models and Approaches of Explainable AI"**.

This research addresses the rise of LLM-based applications such as chatbots and explores the possibilities of model interpretation and explainability. The goal is to build a tool that can be used to explain the predictions of an LLM-based chatbot.

## Implementation

This project is an implementation of PartitionSHAP and BERTViz on top of Microsoft's [GODEL model](https://huggingface.co/microsoft/GODEL-v1_1-large-seq2seq), a generative seq2seq transformer fine-tuned for goal-directed dialog. It supports context and knowledge base inputs.
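
As a rough illustration of the underlying model API (a minimal sketch following the public GODEL model card, not necessarily this webapp's own code), the model can be loaded and queried with instruction, dialog, and knowledge inputs roughly like this:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Minimal sketch: load the GODEL checkpoint referenced by this project.
tokenizer = AutoTokenizer.from_pretrained("microsoft/GODEL-v1_1-large-seq2seq")
model = AutoModelForSeq2SeqLM.from_pretrained("microsoft/GODEL-v1_1-large-seq2seq")

def generate(instruction: str, knowledge: str, dialog: list[str]) -> str:
    # GODEL expects the instruction, the dialog turns, and optional knowledge
    # concatenated into a single prompt with special markers.
    if knowledge:
        knowledge = "[KNOWLEDGE] " + knowledge
    dialog_text = " EOS ".join(dialog)
    query = f"{instruction} [CONTEXT] {dialog_text} {knowledge}"
    input_ids = tokenizer(query, return_tensors="pt").input_ids
    outputs = model.generate(
        input_ids, max_length=128, min_length=8, top_p=0.9, do_sample=True
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

print(generate(
    "Instruction: given a dialog context, you need to respond empathically.",
    "",  # no knowledge base for this turn
    ["Does money buy happiness?"],
))
```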

The UI is built with Gradio.
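
A purely hypothetical sketch of such a Gradio layout (the component names and tab structure are illustrative assumptions, not the project's actual code) might look like:

```python
import gradio as gr

# Hypothetical sketch of the UI: a chat tab plus a tab for the explanation views.
with gr.Blocks() as demo:
    with gr.Tab("Chat"):
        chatbot = gr.Chatbot()
        message = gr.Textbox(label="Message")
    with gr.Tab("Explanations"):
        explanation = gr.HTML(label="Explanation view")

    def respond(user_message, history):
        # Placeholder: the real app would call the GODEL generation code here.
        history = history + [(user_message, "GODEL would answer here.")]
        return "", history

    # Pressing enter submits the message and updates the chat history.
    message.submit(respond, [message, chatbot], [message, chatbot])

if __name__ == "__main__":
    demo.launch()
```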

## Usage

You can chat with the model by entering a message into the input field and pressing enter; the model will then generate a response. You can also provide a context and a knowledge base by clicking the respective buttons and entering the data into the input fields. The model will then generate a response based on that context and knowledge base.

To explore explanations, choose one of the explanation methods (hint: the runtime can increase significantly). Then keep chatting and explore the explanations in the respective tab.
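
As an illustration of what the SHAP-based method involves (a hedged sketch built on SHAP's generic text explainer support, not necessarily how this app wires it up; `model` and `tokenizer` stand for the GODEL objects from the Implementation sketch, and the attention view uses BERTViz in a similar spirit):

```python
import shap

# Partition explainer over the input text: SHAP builds a text masker from the
# tokenizer and attributes the generated output back to spans of the prompt.
explainer = shap.Explainer(model, tokenizer)
shap_values = explainer(["Does money buy happiness?"])

# Render the token-level attributions as an interactive HTML view.
shap.plots.text(shap_values)
```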

### Self Hosted

You can run this application locally by cloning this repository, setting up a Python environment, and installing the requirements. Then run the `app.py` file or use `uvicorn main:app --reload` in the terminal.
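
The `uvicorn main:app --reload` entry point suggests the Gradio UI is served through an ASGI app; a minimal sketch of what such a `main.py` could look like (the module layout and the `from app import demo` import are assumptions, not the project's actual code):

```python
# main.py - hypothetical sketch of the ASGI entry point
import gradio as gr
from fastapi import FastAPI

from app import demo  # the Gradio Blocks app (assumed module layout)

app = FastAPI()

# Mount the Gradio app on the FastAPI instance so that
# `uvicorn main:app --reload` serves the chat UI at the root path.
app = gr.mount_gradio_app(app, demo, path="/")
```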

For self-hosting you can use the Dockerfile to build a Docker image and run it locally, or directly use the provided Docker image linked on the [GitHub page](https://github.com/lennardzuendorf/thesis-webapp/).

## Credit & License

This product is licensed under the MIT license. See [LICENSE](https://github.com/LennardZuendorf/thesis-webapp/blob/main/LICENSE.md) on GitHub for more information.

Please credit the original author of this project (Lennard Zündorf) and the credits listed below if you use this project or parts of it in your own work.

## Contact

### Author

- Lennard Zündorf
- lennard.zuendorf@student.htw-berlin.de
- [GitHub](https://github.com/LennardZuendorf)
- [LinkedIn](https://www.zuendorf.me/linkd)

#### University

Hochschule für Technik und Wirtschaft Berlin (HTW Berlin) - University of Applied Sciences for Engineering and Economics Berlin

1. Supervisor: Prof. Dr. Katarina Simbeck
2. Supervisor: Prof. Dr. Axel Hochstein

requirements.txt
CHANGED
@@ -15,4 +15,5 @@ seaborn~=0.13.0
 numpy
 matplotlib
 pre-commit
+ipython
 #./components/iframe/dist/gradio_iframe-0.0.1-py3-none-any.whl