Bappadala Rohith Kumar Naidu committed · Commit 93d3cf7 · 1 parent: 34518f8

chore: rename SafeVisionAI to SafeVixAI across dataset hub

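The commit is a mechanical project-wide rename (the earlier RoadSoS name is also folded into SafeVixAI, as the `motor_vehicles_act` and `download_legal_pdfs.py` hunks below show). A minimal sketch of how such a rename could be scripted; `rename_in_tree` is a hypothetical helper, not the command actually used for this commit:

```python
from pathlib import Path

# Old name -> new name map for this commit. "RoadSoS" was an earlier
# project name that a few files still carried (assumption based on the hunks).
RENAMES = {"SafeVisionAI": "SafeVixAI", "RoadSoS": "SafeVixAI"}
TEXT_SUFFIXES = {".md", ".py", ".txt", ".ipynb", ".json", ".ps1"}

def rename_in_tree(root: Path) -> int:
    """Rewrite every old name in text files under root; return count of files changed."""
    changed = 0
    for path in root.rglob("*"):
        if not path.is_file() or path.suffix not in TEXT_SUFFIXES:
            continue  # skip directories and binary/LFS blobs
        text = path.read_text(encoding="utf-8")
        new_text = text
        for old, new in RENAMES.items():
            new_text = new_text.replace(old, new)
        if new_text != text:
            path.write_text(new_text, encoding="utf-8")
            changed += 1
    return changed
```

A plain substring replace leaves the hyphenated GitHub org `SafeVision-AI` untouched, which matches the diff: every github.com URL below keeps the old org name.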
README.md CHANGED
@@ -13,16 +13,16 @@ tags:
 - traffic-law
 - geospatial
 - rag
-pretty_name: SafeVisionAI Dataset Hub
+pretty_name: SafeVixAI Dataset Hub
 size_categories:
 - 1B<n<10B
 ---
 
-# SafeVisionAI Dataset Hub 🛡️
+# SafeVixAI Dataset Hub 🛡️
 
-> The **Intelligence Layer** for the SafeVisionAI platform — IIT Madras Road Safety Hackathon 2026
+> The **Intelligence Layer** for the SafeVixAI platform — IIT Madras Road Safety Hackathon 2026
 
-This repository hosts all datasets, pre-trained models, notebooks, and **reproducible data acquisition scripts** that power the SafeVisionAI application. It is designed to be cloned directly into Google Colab or any research environment.
+This repository hosts all datasets, pre-trained models, notebooks, and **reproducible data acquisition scripts** that power the SafeVixAI application. It is designed to be cloned directly into Google Colab or any research environment.
 
 **Main Application Repo:** [SafeVision-AI/SafeVision-AI](https://github.com/SafeVision-AI/SafeVision-AI)
 
@@ -32,12 +32,12 @@ This repository hosts all datasets, pre-trained models, notebooks, and **reprodu
 
 ```python
 # Clone the entire intelligence layer
-!git clone https://huggingface.co/datasets/rohith083/SafeVisionAI-Dataset-Hub /content/data
+!git clone https://huggingface.co/datasets/rohith083/SafeVixAI-Dataset-Hub /content/data
 
 # Symlink into the app structure
 import os
-os.makedirs("/content/SafeVisionAI/chatbot_service", exist_ok=True)
-!ln -sfn /content/data/data/chatbot_service/data /content/SafeVisionAI/chatbot_service/data
+os.makedirs("/content/SafeVixAI/chatbot_service", exist_ok=True)
+!ln -sfn /content/data/data/chatbot_service/data /content/SafeVixAI/chatbot_service/data
 ```
 
 ---
@@ -45,7 +45,7 @@ os.makedirs("/content/SafeVixAI/chatbot_service", exist_ok=True)
 ## 📦 Repository Structure
 
 ```
-SafeVisionAI-Dataset-Hub/
+SafeVixAI-Dataset-Hub/
 ├── data/ ← 3.6 GB of raw intelligence data
 │ ├── chatbot_service/data/ ← Legal PDFs, GIS CSVs, accident data, models
 │ └── backend/datasets/ ← Challan rules, road infrastructure
@@ -82,7 +82,7 @@ All scripts here are **pure Python** — they run without any database or backen
 
 ```
 scripts/
-├── scripts/data/ ← from SafeVisionAI/scripts/data/
+├── scripts/data/ ← from SafeVixAI/scripts/data/
 │ ├── _overpass_utils.py ← Core GIS utility (basic version)
 │ ├── fetch_hospitals.py ← Hospital data from OpenStreetMap
 │ ├── fetch_police.py ← Police station data
@@ -99,14 +99,14 @@ scripts/
 │ ├── check_all_scripts.py ← Script syntax validator
 │ └── setup_kaggle.ps1 ← Kaggle API auth setup
 │
-├── backend/data/ ← from SafeVisionAI/backend/scripts/data/
+├── backend/data/ ← from SafeVixAI/backend/scripts/data/
 │ ├── seed_violations.py ← Traffic fine normalizer (MVA 2019)
 │ ├── prepare_road_sources.py ← CSV/GeoJSON → LineString converter
 │ ├── sample_pmgsy.py ← PMGSY 867K roads sampler
 │ ├── road_sources.example.json ← Manifest template
 │ └── road_sources.json ← Active road source manifest
 │
-└── chatbot_service/data/ ← from SafeVisionAI/chatbot_service/scripts/data/
+└── chatbot_service/data/ ← from SafeVixAI/chatbot_service/scripts/data/
 ├── _overpass_utils.py ← Pro GIS utility (with retries + backoff)
 ├── fetch_hospitals.py ← Pro hospital fetcher
 ├── fetch_police.py ← Pro police fetcher
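The README's clone-and-symlink quick start above also has a pure-Python equivalent for environments without notebook shell magics. A sketch under the same assumed `/content` paths; `link_dataset` is a hypothetical helper, not part of the repo:

```python
import os

def link_dataset(src: str, dst: str) -> None:
    """Pure-Python equivalent of `ln -sfn src dst` for mounting the dataset."""
    os.makedirs(os.path.dirname(dst), exist_ok=True)
    if os.path.islink(dst):
        os.remove(dst)  # -f semantics: replace a stale link, never real data
    os.symlink(src, dst, target_is_directory=True)

# In Colab this would be:
# link_dataset("/content/data/data/chatbot_service/data",
#              "/content/SafeVixAI/chatbot_service/data")
```

If `dst` already exists as a real directory, `os.symlink` raises `FileExistsError` rather than silently clobbering data.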
data/chatbot_service/data/legal/motor_vehicles_act_1988_summary.txt CHANGED
@@ -386,5 +386,5 @@ For the complete text: https://indiacode.nic.in
 For official notifications: https://morth.nic.in
 For traffic violation information: https://parivahan.gov.in/echallan
 
-This summary is intended for the RoadSoS RAG knowledge base to answer user queries about traffic laws and road safety regulations in India.
+This summary is intended for the SafeVixAI RAG knowledge base to answer user queries about traffic laws and road safety regulations in India.
 """
notebooks/ChromaDB_RAG_Vectorstore_Build_chatbot_service_data_chroma_db_2.ipynb CHANGED
@@ -11,7 +11,7 @@
 "\n",
 "**Output:** `chroma_db/` directory -> deployed to `chatbot_service/data/chroma_db/`\n",
 "\n",
-"This notebook builds the **Retrieval-Augmented Generation (RAG)** knowledge base for the SafeVisionAI chatbot.\n",
+"This notebook builds the **Retrieval-Augmented Generation (RAG)** knowledge base for the SafeVixAI chatbot.\n",
 "It ingests Indian legal documents and first-aid medical PDFs, chunks them, embeds them, and stores them in a **ChromaDB** vector store.\n",
 "\n",
 "---\n",
notebooks/README.md CHANGED
@@ -1,6 +1,6 @@
-# SafeVisionAI — Research Notebooks 📓
+# SafeVixAI — Research Notebooks 📓
 
-These notebooks are the **research and training layer** of SafeVisionAI. Each one processes raw data from the Hub and produces a model, index, or processed dataset used by the live application.
+These notebooks are the **research and training layer** of SafeVixAI. Each one processes raw data from the Hub and produces a model, index, or processed dataset used by the live application.
 
 ---
 
@@ -10,7 +10,7 @@ Run this at the top of **any** notebook to mount the full dataset:
 
 ```python
 # Step 1 — Clone the Hub (only run once per session)
-!git clone https://huggingface.co/datasets/rohith083/SafeVisionAI-Dataset-Hub /content/hub
+!git clone https://huggingface.co/datasets/rohith083/SafeVixAI-Dataset-Hub /content/hub
 
 # Step 2 — Install dependencies
 !pip install -q pdfplumber chromadb sentence-transformers ultralytics onnx
@@ -76,11 +76,11 @@ import os
 os.environ["HUGGING_FACE_TOKEN"] = "your_token_here"
 
 !cd /content/hub && git config user.email "you@example.com"
-!cd /content/hub && git config user.name "SafeVisionAI"
+!cd /content/hub && git config user.name "SafeVixAI"
 !cd /content/hub && git add . && git commit -m "chore: update trained model outputs"
-!cd /content/hub && git push https://your_username:$HUGGING_FACE_TOKEN@huggingface.co/datasets/rohith083/SafeVisionAI-Dataset-Hub main
+!cd /content/hub && git push https://your_username:$HUGGING_FACE_TOKEN@huggingface.co/datasets/rohith083/SafeVixAI-Dataset-Hub main
 ```
 
 ---
 
-*Part of the [SafeVisionAI](https://github.com/SafeVision-AI/SafeVision-AI) — IIT Madras Road Safety Hackathon 2026*
+*Part of the [SafeVixAI](https://github.com/SafeVision-AI/SafeVision-AI) — IIT Madras Road Safety Hackathon 2026*
notebooks/Risk_Model_ONNX_Training_frontend_public_models_5.ipynb CHANGED
@@ -26,7 +26,7 @@
 "### Pipeline\n",
 "Synthetic data generation -> GBM training -> ONNX conversion -> Download\n",
 "\n",
-"> The model runs entirely client-side in the SafeVisionAI PWA using onnxruntime-web."
+"> The model runs entirely client-side in the SafeVixAI PWA using onnxruntime-web."
 ]
 },
 {
notebooks/Roads_Data_Processing_backend_data_4.ipynb CHANGED
@@ -9,7 +9,7 @@
 "**Output:** `toll_plazas_lite.json` -> deployed to `backend/data/roads/`\n",
 "\n",
 "This notebook processes the **NHAI Toll Plaza dataset** to produce a lightweight JSON\n",
-"suitable for the SafeVisionAI backend API and offline PWA map layer.\n",
+"suitable for the SafeVixAI backend API and offline PWA map layer.\n",
 "\n",
 "---\n",
 "### Dataset\n",
requirements.txt CHANGED
@@ -1,4 +1,4 @@
-# SafeVisionAI Dataset Hub — Script Dependencies
+# SafeVixAI Dataset Hub — Script Dependencies
 # Install with: pip install -r requirements.txt
 
 # ── Core (all scripts) ────────────────────────────────────────
scripts/README.md CHANGED
@@ -1,9 +1,9 @@
-# SafeVisionAI — Data Acquisition Scripts 🔬
+# SafeVixAI — Data Acquisition Scripts 🔬
 
-These scripts are the **raw data pipeline** that built the 3.6GB SafeVisionAI dataset.
+These scripts are the **raw data pipeline** that built the 3.6GB SafeVixAI dataset.
 All scripts here are **pure Python** — they require no database, no Redis, no PostGIS.
 
-> Scripts are mirrored from the [SafeVisionAI main repo](https://github.com/SafeVision-AI/SafeVision-AI) and organized by their origin folder.
+> Scripts are mirrored from the [SafeVixAI main repo](https://github.com/SafeVision-AI/SafeVision-AI) and organized by their origin folder.
 
 ---
 
@@ -11,7 +11,7 @@ All scripts here are **pure Python** — they require no database, no Redis, no
 
 ```
 scripts/
-├── scripts/data/ ← from SafeVisionAI/scripts/data/
+├── scripts/data/ ← from SafeVixAI/scripts/data/
 │ ├── fetch_*.py ← Overpass GIS fetchers (basic version)
 │ ├── _overpass_utils.py ← Core GIS utility
 │ ├── download_legal_pdfs.py
@@ -24,13 +24,13 @@ scripts/
 │ ├── check_all_scripts.py
 │ └── setup_kaggle.ps1
 │
-├── backend/data/ ← from SafeVisionAI/backend/scripts/data/
+├── backend/data/ ← from SafeVixAI/backend/scripts/data/
 │ ├── seed_violations.py ← MVA 2019 traffic fine normalizer
 │ ├── prepare_road_sources.py
 │ ├── sample_pmgsy.py
 │ └── road_sources.example.json
 │
-└── chatbot_service/data/ ← from SafeVisionAI/chatbot_service/scripts/data/
+└── chatbot_service/data/ ← from SafeVixAI/chatbot_service/scripts/data/
 ├── _overpass_utils.py ← ⭐ Pro version (retries + backoff)
 └── fetch_*.py ← ⭐ Pro fetchers (use these over scripts/data/)
 ```
scripts/backend/data/prepare_road_sources.py CHANGED
@@ -25,7 +25,7 @@ import json
 import sys
 from pathlib import Path
 
-ROOT = Path(__file__).resolve().parents[1] # SafeVisionAI/backend/
+ROOT = Path(__file__).resolve().parents[1] # SafeVixAI/backend/
 CHATBOT_DATA = ROOT.parent / "chatbot_service" / "data"
 OUT_DIR = ROOT / "datasets" / "roads"
 OUT_DIR.mkdir(parents=True, exist_ok=True)
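The `parents[1]` expression in the hunk above resolves to the grandparent directory of the script file. A quick illustration of how `pathlib` indexes ancestors (the example path is hypothetical, not the repo's actual layout):

```python
from pathlib import PurePosixPath

# parents[0] is the containing directory, parents[1] its parent, and so on.
script = PurePosixPath("/repo/backend/data/prepare_road_sources.py")
print(script.parents[0])  # /repo/backend/data
print(script.parents[1])  # /repo/backend
```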
scripts/backend/data/road_sources.json CHANGED
@@ -1,3 +1,3 @@
1
  version https://git-lfs.github.com/spec/v1
2
- oid sha256:b0a729760cda438188c0c7f10f82e16ea5ce04aa83d71f48fc80ae64c6468c62
3
- size 731
 
1
  version https://git-lfs.github.com/spec/v1
2
+ oid sha256:0fab7e082337cf00095f344c3636dabf47e25f2993db379567f53310cd2b6e43
3
+ size 725
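`road_sources.json` is tracked with Git LFS, so the diff above shows only the pointer file (version, oid, size) changing, not the JSON content itself. A small sketch of reading such a pointer, following the field layout visible in the hunk:

```python
def parse_lfs_pointer(text: str) -> dict[str, str]:
    """Split a Git LFS pointer file into its space-separated key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = (
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:0fab7e082337cf00095f344c3636dabf47e25f2993db379567f53310cd2b6e43\n"
    "size 725\n"
)
print(parse_lfs_pointer(pointer)["size"])  # 725
```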
scripts/chatbot_service/data/_overpass_utils.py CHANGED
@@ -20,7 +20,7 @@ DEFAULT_ENDPOINTS = (
 )
 DEFAULT_HEADERS = {
     'Content-Type': 'application/x-www-form-urlencoded; charset=utf-8',
-    'User-Agent': 'SafeVisionAI chatbot data fetcher/1.0',
+    'User-Agent': 'SafeVixAI chatbot data fetcher/1.0',
 }
 CSV_COLUMNS = [
     'name',
scripts/scripts/data/_overpass_utils.py CHANGED
@@ -16,7 +16,7 @@ DEFAULT_ENDPOINTS = (
 )
 DEFAULT_HEADERS = {
     'Content-Type': 'application/x-www-form-urlencoded; charset=utf-8',
-    'User-Agent': 'SafeVisionAI bootstrap scripts/1.0',
+    'User-Agent': 'SafeVixAI bootstrap scripts/1.0',
 }
 CSV_COLUMNS = [
     'osm_id',
scripts/scripts/data/download_legal_pdfs.py CHANGED
@@ -80,7 +80,7 @@ def download_first_working(sources: list[str], destination: Path) -> bool:
         req = urllib.request.Request(
             url,
             headers={
-                "User-Agent": "Mozilla/5.0 (RoadSoS-DataPipeline/1.0; +https://github.com)"
+                "User-Agent": "Mozilla/5.0 (SafeVixAI-DataPipeline/1.0; +https://github.com)"
             },
         )
         with urllib.request.urlopen(req, timeout=60) as response:
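After a rename commit like this one, a quick scan can confirm that no old project names survive in text files. `stale_references` is a hypothetical helper; note that the hyphenated `SafeVision-AI` GitHub org is a different string from `SafeVisionAI`, so the URLs the commit deliberately left alone are not flagged:

```python
from pathlib import Path

TEXT_SUFFIXES = {".md", ".py", ".txt", ".ipynb", ".json", ".ps1"}

def stale_references(root: Path, old_names=("SafeVisionAI", "RoadSoS")) -> list[str]:
    """List text files under root that still mention any old project name."""
    hits = []
    for path in sorted(root.rglob("*")):
        if path.is_file() and path.suffix in TEXT_SUFFIXES:
            text = path.read_text(encoding="utf-8", errors="ignore")
            if any(old in text for old in old_names):
                hits.append(str(path.relative_to(root)))
    return hits
```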