---
title: 'Ask-Anything: ChatGPT with Video Understanding'
emoji: 🎥
colorFrom: pink
colorTo: green
sdk: gradio
python_version: 3.8.16
app_file: app.py
pinned: false
license: mit
---

# VideoChat

VideoChat is a multifunctional video question answering tool that combines action recognition, visual captioning, and ChatGPT. It generates dense, descriptive captions for the objects and actions in a video, offers a range of language styles to suit different user preferences, and lets users hold conversations that vary in length, emotional tone, and language authenticity.

- Video-Text Generation
- Chat about uploaded video
- Interactive demo
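
To make the pipeline concrete, here is a minimal sketch (not the project's actual `app.py`) of the idea: dense captions produced by the vision models are stitched into a prompt and handed to ChatGPT for question answering. The `ask_about_video` helper and the sample captions are hypothetical, and the snippet assumes the pre-1.0 `openai` Python package that was current at release.

```python
import os
from typing import List

import openai  # assumes openai<1.0, current at the time of this release

openai.api_key = os.environ["OPENAI_API_KEY"]

def ask_about_video(captions: List[str], question: str) -> str:
    """Hypothetical helper: answer a question from dense video captions."""
    # Flatten timestamped captions into a single textual "video transcript".
    context = "\n".join(captions)
    messages = [
        {"role": "system",
         "content": "You answer questions about a video, given dense "
                    "captions of its objects and actions."},
        {"role": "user",
         "content": "Captions:\n{}\n\nQuestion: {}".format(context, question)},
    ]
    resp = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
    return resp["choices"][0]["message"]["content"]

print(ask_about_video(
    ["0s: a man walks a dog in a park", "5s: the dog catches a frisbee"],
    "What is the dog doing?",
))
```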

## :fire: Updates

- 2023/04/19: Code Release

## :speech_balloon: Example


## :running: Usage

```bash
# We recommend using conda to manage the environment, with Python 3.8.16
conda create -n chatvideo python=3.8.16
conda activate chatvideo

# Clone the repository:
git clone https://github.com/OpenGVLab/Ask-Anything.git
cd Ask-Anything/video_chat

# Install dependencies:
pip install -r requirements.txt
pip install https://github.com/explosion/spacy-models/releases/download/en_core_web_sm-3.0.0/en_core_web_sm-3.0.0.tar.gz
python -m pip install 'git+https://github.com/facebookresearch/detectron2.git'

# Download the checkpoints
mkdir -p pretrained_models
wget https://huggingface.co/spaces/xinyu1205/Tag2Text/resolve/main/tag2text_swin_14m.pth -P ./pretrained_models
wget https://datarelease.blob.core.windows.net/grit/models/grit_b_densecap_objectdet.pth -P ./pretrained_models
git clone https://huggingface.co/mrm8488/flan-t5-large-finetuned-openai-summarize_from_feedback ./pretrained_models/flan-t5-large-finetuned-openai-summarize_from_feedback

# Configure the ChatGPT API key
export OPENAI_API_KEY={Your_Private_Openai_Key}

# Run the VideoChat gradio demo
python app.py
```
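
Once the demo is running, the Gradio UI serves at the printed local URL. As a side note on the third checkpoint: its name suggests it condenses long dense-caption streams before they reach ChatGPT. Below is a hedged sketch of loading it with the standard `transformers` API; the `summarize:` prompt prefix and the sample captions are illustrative, not the app's exact usage.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Path matches the `git clone` target above.
ckpt = "./pretrained_models/flan-t5-large-finetuned-openai-summarize_from_feedback"
tokenizer = AutoTokenizer.from_pretrained(ckpt)
model = AutoModelForSeq2SeqLM.from_pretrained(ckpt)

# Illustrative input: a run of dense captions to be condensed.
captions = "a man walks a dog in a park. the dog catches a frisbee."
inputs = tokenizer("summarize: " + captions, return_tensors="pt", truncation=True)
summary_ids = model.generate(**inputs, max_new_tokens=60)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```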

## Acknowledgement

The project is based on InternVideo, Tag2Text, GRiT, mrm8488's flan-t5 summarizer, and ChatGPT. Thanks to the authors for their efforts.