feat: improve streamlit docs and clean navigation icons
- README.md +57 -38
- app.py +250 -41
- apps/ciq_2g_generator.py +3 -3
- apps/ciq_3g_generator.py +1 -1
- apps/ciq_4g_generator.py +1 -1
- apps/ciq_verification.py +15 -15
- apps/dump_compare.py +3 -3
- apps/fnb_parser.py +3 -7
- apps/kpi_analysis/gsm_capacity.py +1 -1
- apps/kpi_analysis/gsm_lac_load.py +1 -1
- apps/kpi_analysis/lcg_analysis.py +1 -1
- apps/kpi_analysis/lte_capacity.py +1 -1
- apps/kpi_analysis/lte_drop_trafic.py +1 -1
- apps/kpi_analysis/trafic_analysis.py +4 -4
- apps/kpi_analysis/wbts_capacty.py +1 -1
- apps/kpi_analysis/wcel_capacity.py +1 -1
- apps/parameters_distribution.py +2 -2
- apps/sector_kml_generator.py +3 -3
- documentations/DOC_COVERAGE.md +43 -0
- documentations/README_DOCS.md +42 -0
- documentations/anomaly_detection_doc.py +58 -118
- documentations/ciq_2g_doc.py +63 -0
- documentations/ciq_3g_doc.py +56 -0
- documentations/ciq_4g_doc.py +58 -0
- documentations/ciq_verification_doc.py +63 -0
- documentations/clustering_doc.py +62 -0
- documentations/core_dump_doc.py +55 -30
- documentations/database_doc.py +56 -57
- documentations/dbm_watt_doc.py +56 -0
- documentations/distance_doc.py +62 -0
- documentations/dump_analysis_doc.py +56 -0
- documentations/dump_compare_doc.py +54 -0
- documentations/fnb_parser_doc.py +59 -0
- documentations/gps_converter_doc.py +65 -0
- documentations/gsm_lac_load_doc.py +60 -0
- documentations/import_physical_db_doc.py +51 -0
- documentations/index_doc.py +64 -0
- documentations/lcg_analysis_doc.py +58 -0
- documentations/lte_drop_traffic_doc.py +60 -0
- documentations/multi_points_distance_doc.py +62 -0
- documentations/parameters_distribution_doc.py +57 -0
- documentations/sector_kml_doc.py +63 -0
- documentations/trafic_analysis_doc.py +63 -113
- documentations/wbts_capacity_doc.py +57 -0
- documentations/wcel_capacity_doc.py +62 -0
README.md
CHANGED

```diff
@@ -1,5 +1,5 @@
 ---
-title:
+title: OML DB Tools
 emoji: 🏢
 colorFrom: red
 colorTo: green
@@ -9,49 +9,68 @@ app_file: app.py
 pinned: false
 ---
 
-##
-
-2. Upload the dump file to the app.
-3. Click the buttons to generate the databases.
-4. Download the databases.
-
-##
-
-##
-
+## OML DB Tools (Streamlit)
+
+OML DB Tools is a multi-page Streamlit application for telecom engineering workflows:
+- database generation from OML dumps
+- CIQ generation and CIQ verification
+- dump core parsing
+- UE capability parsing
+- geospatial utilities (GPS conversion, distance, KML generation)
+- KPI and capacity analysis tools
+
+## Quick Start
+
+1. Create and activate a Python virtual environment.
+2. Install dependencies:
+   - `pip install -r requirements.txt`
+3. Run the Streamlit app:
+   - `.\run_app.ps1 -App streamlit`
+   - or `python -m streamlit run app.py`
+
+Default local URL: `http://localhost:8501`
+
+## Authentication
+
+The app uses Streamlit secrets for login:
+- `username`
+- `password`
+
+Configure them in your local Streamlit secrets file before sharing the app.
+
+## Main Functional Areas
+
+1. Apps
+   - Database processing, CIQ generators, CIQ verification, dump core parser, UE capability parser, GPS tools, KML generator, clustering, dump compare.
+2. Capacity Analysis
+   - GSM/WBTS/WCEL/LCG/LTE capacity analysis.
+3. Paging Analysis
+   - GSM LAC load analysis.
+4. KPI Analysis
+   - LTE traffic drop, KPI anomaly detection, traffic analysis.
+5. Documentations
+   - In-app user documentation pages (from the top navigation).
+
+## Input Formats (most used)
+
+- Dump files: `.xlsb`
+- CIQ files: `.xlsx` / `.xls`
+- KPI reports: `.csv` (depending on page)
+- UE capability exports: `.txt`
+- GPS / distance / KML helpers: `.xlsx`
+
+## In-App Documentation
+
+Use `Documentations > Documentation Home` in the app for:
+- page-by-page usage guides
+- required columns and file templates
+- troubleshooting notes
+- known limitations
+
 ## Hosted Version
 
-##
-
-- [x] Add a download button for all databases
-- [x] Add option to download Neighbors database
-- [x] Add page to update physical db
-- [x] Add Core dump checking App
-- [x] Add site config band in database
-- [x] Add TRX database
-- [x] Add MRBTS with code
-- [x] Check TCH from MAL sheet
-- [x] Add Analitic dashboards for each database (Count of NE)
-- [x] Add kml generation
-- [x] Parameters Distribution App
-- [x] Symetric neighbours checking
-- [x] Add the ability to select columns
-- [x] Add authentication
-- [x] Initial frequency distribution chart GSM
-- [x] Add paging analysis App
-- [x] Add capacity analysis App
-- [ ] Improve Dashboard
-- [ ] Error handling
-- [ ] Add KPI analysis App
+https://davmelchi-db-query.hf.space/
+
+## Release Notes
+
+See `Changelog.md` for version history.
```
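The README's Authentication section says login reads `username` and `password` from Streamlit secrets. A minimal `.streamlit/secrets.toml` sketch with placeholder values (not the app's real credentials, which are intentionally not in this commit):

```toml
# .streamlit/secrets.toml -- placeholder values; never commit real credentials
username = "changeme"
password = "changeme"
```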
app.py
CHANGED

```diff
@@ -1,5 +1,37 @@
 import streamlit as st
 
+st.set_page_config(
+    page_title="NPO DB Query",
+    page_icon=":computer:",
+    layout="wide",
+    initial_sidebar_state="expanded",
+    menu_items={
+        "About": "**NPO DB Query v0.2.15**",
+    },
+)
+
+
+def apply_icon_color_css() -> None:
+    st.markdown(
+        """
+        <style>
+        section[data-testid="stSidebar"] {
+            display: none !important;
+        }
+        div[data-testid="stSidebarCollapsedControl"] {
+            display: none !important;
+        }
+        span[data-testid="stIconMaterial"] {
+            color: #2563eb !important;
+        }
+        div[data-baseweb="popover"] span[data-testid="stIconMaterial"] {
+            color: #1d4ed8 !important;
+        }
+        </style>
+        """,
+        unsafe_allow_html=True,
+    )
+
 
 # Authentication function
 def check_password():
@@ -80,7 +112,7 @@ def check_password():
         "password_correct" in st.session_state
         and not st.session_state["password_correct"]
     ):
-        st.error("
+        st.error("User not known or password incorrect")
 
     # Login form with improved input fields
     st.text_input("Username", key="username", placeholder="Enter your username")
@@ -102,115 +134,292 @@ def check_password():
 
 # Only show the app if authentication is successful
 if check_password():
-    st.set_page_config(
-        page_title="NPO DB Query",
-        page_icon="💻",
-        layout="wide",
-        initial_sidebar_state="expanded",
-        menu_items={
-            "About": "**📡 NPO DB Query v0.2.12**",
-        },
-    )
+    apply_icon_color_css()
 
     pages = {
         "Apps": [
-            st.Page("apps/database_page.py", title="🏡Generate Databases"),
-            st.Page(
-                "apps/
-            ),
-            st.Page("apps/ciq_2g_generator.py", title="🧾 CIQ 2G Generator"),
-            st.Page("apps/ciq_3g_generator.py", title="🧾 CIQ 3G Generator"),
-            st.Page("apps/ciq_4g_generator.py", title="🧾 CIQ 4G Generator"),
-            st.Page("apps/ciq_verification.py", title="🔍 CIQ Verification"),
-            st.Page("apps/core_dump_page.py", title="📠Parse dump core"),
-            st.Page("apps/gps_converter.py", title="🧭GPS Converter"),
-            st.Page("apps/dbm_watt_calculator.py", title="dBm <> Watt Calculator"),
-            st.Page("apps/distance.py", title="🛰Distance Calculator"),
-            st.Page(
-                "apps/multi_points_distance_calculator.py",
-                title="
-            ),
-            st.Page(
-                "apps/sector_kml_generator.py",
-                title="
-            ),
-            st.Page(
-                "apps/clustering.py",
-                title="
-            ),
-            st.Page("apps/fnb_parser.py", title="📄 F4NB Extractor"),
-            st.Page("apps/dump_compare.py", title="📊 Dump Compare"),
-            st.Page(
-                "apps/import_physical_db.py",
-            ),
+            st.Page(
+                "apps/database_page.py",
+                title="Generate Databases",
+                icon=":material/home:",
+            ),
+            st.Page(
+                "apps/parameters_distribution.py",
+                title="Parameters distribution",
+                icon=":material/tune:",
+            ),
+            st.Page(
+                "apps/ciq_2g_generator.py",
+                title="CIQ 2G Generator",
+                icon=":material/description:",
+            ),
+            st.Page(
+                "apps/ciq_3g_generator.py",
+                title="CIQ 3G Generator",
+                icon=":material/description:",
+            ),
+            st.Page(
+                "apps/ciq_4g_generator.py",
+                title="CIQ 4G Generator",
+                icon=":material/description:",
+            ),
+            st.Page(
+                "apps/ciq_verification.py",
+                title="CIQ Verification",
+                icon=":material/fact_check:",
+            ),
+            st.Page(
+                "apps/core_dump_page.py",
+                title="Parse dump core",
+                icon=":material/terminal:",
+            ),
+            st.Page(
+                "apps/ue_capability_parser.py",
+                title="UE Capability Parser",
+                icon=":material/memory:",
+            ),
+            st.Page(
+                "apps/gps_converter.py",
+                title="GPS Converter",
+                icon=":material/explore:",
+            ),
+            st.Page(
+                "apps/dbm_watt_calculator.py",
+                title="dBm <> Watt Calculator",
+                icon=":material/calculate:",
+            ),
+            st.Page(
+                "apps/distance.py",
+                title="Distance Calculator",
+                icon=":material/social_distance:",
+            ),
+            st.Page(
+                "apps/multi_points_distance_calculator.py",
+                title="Multi Points Distance Calculator",
+                icon=":material/route:",
+            ),
+            st.Page(
+                "apps/sector_kml_generator.py",
+                title="Sector KML Generator",
+                icon=":material/map:",
+            ),
+            st.Page(
+                "apps/clustering.py",
+                title="Automatic Site Clustering",
+                icon=":material/hub:",
+            ),
+            st.Page(
+                "apps/fnb_parser.py",
+                title="F4NB Extractor",
+                icon=":material/article:",
+            ),
+            st.Page(
+                "apps/dump_compare.py",
+                title="Dump Compare",
+                icon=":material/compare_arrows:",
+            ),
+            st.Page(
+                "apps/import_physical_db.py",
+                title="Physical Database Verification",
+                icon=":material/public:",
+            ),
         ],
         "Capacity Analysis": [
             st.Page(
                 "apps/kpi_analysis/gsm_capacity.py",
-                title="
+                title="GSM Capacity Analysis",
+                icon=":material/bar_chart:",
             ),
             st.Page(
                 "apps/kpi_analysis/wbts_capacty.py",
-                title="
+                title="WBTS Capacity BB and CE Analysis",
+                icon=":material/developer_board:",
             ),
             st.Page(
                 "apps/kpi_analysis/wcel_capacity.py",
-                title="
+                title="WCEL Capacity Analysis",
+                icon=":material/cell_tower:",
             ),
             st.Page(
                 "apps/kpi_analysis/lcg_analysis.py",
-                title="
+                title="LCG Capacity Analysis",
+                icon=":material/stacked_line_chart:",
             ),
             st.Page(
                 "apps/kpi_analysis/lte_capacity.py",
-                title="
+                title="LTE Capacity Analysis",
+                icon=":material/signal_cellular_alt:",
             ),
         ],
         "Paging Analysis": [
             st.Page(
                 "apps/kpi_analysis/gsm_lac_load.py",
-                title="
+                title="GSM LAC Load Analysis",
+                icon=":material/sms:",
             ),
         ],
         "KPI Analysis": [
             st.Page(
                 "apps/kpi_analysis/lte_drop_trafic.py",
-                title="
+                title="LTE Drop Traffic Analysis",
+                icon=":material/trending_down:",
             ),
             st.Page(
                 "apps/kpi_analysis/anomalie.py",
-                title="
+                title="KPIs Anomaly Detection",
+                icon=":material/warning:",
             ),
             st.Page(
                 "apps/kpi_analysis/trafic_analysis.py",
-                title="
+                title="Trafic Analysis",
+                icon=":material/analytics:",
             ),
         ],
         "Documentations": [
             st.Page(
-                "documentations/
+                "documentations/index_doc.py",
+                title="Documentation Home",
+                icon=":material/menu_book:",
+            ),
+            st.Page(
+                "documentations/database_doc.py",
+                title="Databases Documentation",
+                icon=":material/menu_book:",
+            ),
+            st.Page(
+                "documentations/core_dump_doc.py",
+                title="Dump Core Documentation",
+                icon=":material/menu_book:",
+            ),
+            st.Page(
+                "documentations/ue_capability_doc.py",
+                title="UE Capability Documentation",
+                icon=":material/menu_book:",
+            ),
+            st.Page(
+                "documentations/ciq_verification_doc.py",
+                title="CIQ Verification Documentation",
+                icon=":material/menu_book:",
             ),
             st.Page(
-                "documentations/
+                "documentations/gps_converter_doc.py",
+                title="GPS Converter Documentation",
+                icon=":material/menu_book:",
+            ),
+            st.Page(
+                "documentations/distance_doc.py",
+                title="Distance Calculator Documentation",
+                icon=":material/menu_book:",
+            ),
+            st.Page(
+                "documentations/sector_kml_doc.py",
+                title="Sector KML Generator Documentation",
+                icon=":material/menu_book:",
             ),
             st.Page(
                 "documentations/gsm_capacity_docs.py",
-                title="
+                title="GSM Capacity Documentation",
+                icon=":material/menu_book:",
             ),
             st.Page(
                 "documentations/lte_capacity_docs.py",
-                title="
+                title="LTE Capacity Documentation",
+                icon=":material/menu_book:",
             ),
             st.Page(
                 "documentations/trafic_analysis_doc.py",
-                title="
+                title="Traffic Analysis Documentation",
+                icon=":material/menu_book:",
             ),
             st.Page(
                 "documentations/anomaly_detection_doc.py",
-                title="
+                title="Anomaly Detection Documentation",
+                icon=":material/menu_book:",
+            ),
+            st.Page(
+                "documentations/dump_compare_doc.py",
+                title="Dump Compare Documentation",
+                icon=":material/menu_book:",
+            ),
+            st.Page(
+                "documentations/clustering_doc.py",
+                title="Clustering Documentation",
+                icon=":material/menu_book:",
+            ),
+            st.Page(
+                "documentations/import_physical_db_doc.py",
+                title="Physical Database Documentation",
+                icon=":material/menu_book:",
+            ),
+            st.Page(
+                "documentations/multi_points_distance_doc.py",
+                title="Multi Points Distance Documentation",
+                icon=":material/menu_book:",
+            ),
+            st.Page(
+                "documentations/dump_analysis_doc.py",
+                title="Dump Analytics Documentation",
+                icon=":material/menu_book:",
+            ),
+            st.Page(
+                "documentations/wbts_capacity_doc.py",
+                title="WBTS Capacity Documentation",
+                icon=":material/menu_book:",
+            ),
+            st.Page(
+                "documentations/wcel_capacity_doc.py",
+                title="WCEL Capacity Documentation",
+                icon=":material/menu_book:",
+            ),
+            st.Page(
+                "documentations/lcg_analysis_doc.py",
+                title="LCG Analysis Documentation",
+                icon=":material/menu_book:",
+            ),
+            st.Page(
+                "documentations/gsm_lac_load_doc.py",
+                title="GSM LAC Load Documentation",
+                icon=":material/menu_book:",
+            ),
+            st.Page(
+                "documentations/ciq_2g_doc.py",
+                title="CIQ 2G Documentation",
+                icon=":material/menu_book:",
+            ),
+            st.Page(
+                "documentations/ciq_3g_doc.py",
+                title="CIQ 3G Documentation",
+                icon=":material/menu_book:",
+            ),
+            st.Page(
+                "documentations/ciq_4g_doc.py",
+                title="CIQ 4G Documentation",
+                icon=":material/menu_book:",
+            ),
+            st.Page(
+                "documentations/dbm_watt_doc.py",
+                title="dBm Watt Documentation",
+                icon=":material/menu_book:",
+            ),
+            st.Page(
+                "documentations/fnb_parser_doc.py",
+                title="F4NB Extractor Documentation",
+                icon=":material/menu_book:",
+            ),
+            st.Page(
+                "documentations/lte_drop_traffic_doc.py",
+                title="LTE Drop Traffic Documentation",
+                icon=":material/menu_book:",
+            ),
+            st.Page(
+                "documentations/parameters_distribution_doc.py",
+                title="Parameters Distribution Documentation",
+                icon=":material/menu_book:",
             ),
         ],
     }
 
 pg = st.navigation(pages, position="top")
 pg.run()
+
+
```
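The navigation above registers a `dBm <> Watt Calculator` page, whose own code is not part of this diff. As a rough sketch of the conversion such a page performs (an assumption about the tool, not its actual implementation):

```python
import math


def dbm_to_watt(dbm: float) -> float:
    # dBm is power referenced to 1 mW: P[W] = 10 ** (dBm / 10) / 1000
    return 10 ** (dbm / 10) / 1000


def watt_to_dbm(watt: float) -> float:
    # Inverse: dBm = 10 * log10(P[mW])
    return 10 * math.log10(watt * 1000)


print(dbm_to_watt(30))            # 1.0 (30 dBm is exactly 1 W)
print(round(watt_to_dbm(20), 2))  # 43.01 (a typical 20 W macro carrier)
```

A Streamlit page would wrap these in two `st.number_input` widgets and echo the converted value.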
apps/ciq_2g_generator.py
CHANGED

```diff
@@ -15,7 +15,7 @@ Génère les exports CIQ 2G à partir du dump OML + CIQ brut.
     """
 )
 
-st.subheader("
+st.subheader("Sample files")
 samples_dir = Path(__file__).resolve().parents[1] / "samples"
 sample_ciq_2g = samples_dir / "CIQ_2G.xlsx"
 sample_forbidden = samples_dir / "FORBIDEN_SCF.xlsx"
@@ -60,7 +60,7 @@ if dump_file is None:
     st.info("Upload dump xlsb + CIQ brut Excel to generate CIQ 2G.")
     st.stop()
 
-st.subheader("
+st.subheader("Generation BCF libre")
 if st.button("Generate BCF libre", type="secondary"):
     try:
         with st.spinner("Generating BCF libre..."):
@@ -85,7 +85,7 @@ if free_bcf_bytes:
     mime="application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
 )
 
-st.subheader("
+st.subheader("Generation CIQ 2G")
 if ciq_file is None:
     st.info("Upload CIQ brut 2G Excel to generate CIQ 2G.")
```
apps/ciq_3g_generator.py
CHANGED

```diff
@@ -13,7 +13,7 @@ Génère les exports CIQ 3G (WBTS + WCEL) à partir du CIQ brut.
     """
 )
 
-st.subheader("
+st.subheader("Sample files")
 samples_dir = Path(__file__).resolve().parents[1] / "samples"
 sample_ciq_3g = samples_dir / "CIQ_3G.xlsx"
```
apps/ciq_4g_generator.py
CHANGED

```diff
@@ -13,7 +13,7 @@ Génère les exports CIQ 4G à partir du CIQ brut.
     """
 )
 
-st.subheader("
+st.subheader("Sample files")
 samples_dir = Path(__file__).resolve().parents[1] / "samples"
 sample_ciq_4g = samples_dir / "CIQ_LTE.xlsx"
```
apps/ciq_verification.py
CHANGED

```diff
@@ -54,15 +54,15 @@ with col3:
 
 # Validation
 if dump_file is None:
-    st.info("
+    st.info("Veuillez uploader le fichier dump (xlsb).")
     st.stop()
 
 if ciq_2g_file is None and ciq_3g_file is None and ciq_lte_file is None:
-    st.warning("
+    st.warning("Au moins un fichier CIQ (2G, 3G ou LTE) est requis.")
     st.stop()
 
 # Verify button
-if st.button("🔎 Vérifier", type="primary"):
+if st.button("Verifier", type="primary"):
     try:
         results_2g = None
         results_3g = None
@@ -71,7 +71,7 @@ if st.button("🔎 Vérifier", type="primary"):
         with st.spinner("Traitement en cours..."):
             # Process 2G if provided
             if ciq_2g_file is not None:
-                st.text("
+                st.text("Traitement 2G...")
                 dump_gsm = process_dump_gsm(dump_file)
                 dump_file.seek(0)  # Reset file pointer
                 ciq_2g_df = read_ciq_file(ciq_2g_file)
@@ -79,7 +79,7 @@ if st.button("🔎 Vérifier", type="primary"):
 
             # Process 3G if provided
             if ciq_3g_file is not None:
-                st.text("
+                st.text("Traitement 3G...")
                 dump_wcdma = process_dump_wcdma(dump_file)
                 dump_file.seek(0)  # Reset file pointer
                 ciq_3g_df = read_ciq_file(ciq_3g_file)
@@ -87,7 +87,7 @@ if st.button("🔎 Vérifier", type="primary"):
 
             # Process LTE if provided
             if ciq_lte_file is not None:
-                st.text("
+                st.text("Traitement LTE...")
                 dump_lte = process_dump_lte(dump_file)
                 dump_file.seek(0)  # Reset file pointer
                 ciq_lte_df = read_ciq_file(ciq_lte_file)
@@ -106,10 +106,10 @@ if st.button("🔎 Vérifier", type="primary"):
             st.session_state["verify_sheets"] = sheets
             st.session_state["verify_excel_bytes"] = excel_bytes
 
-            st.success("
+            st.success("Verification terminee.")
 
     except Exception as e:
-        st.error(f"
+        st.error(f"Erreur: {e}")
         import traceback
         st.code(traceback.format_exc())
 
@@ -127,9 +127,9 @@ def display_stats(stats: dict, tech: str):
     with col1:
         st.metric("Total Cells", stats["total_cells"])
     with col2:
-        st.metric("
+        st.metric("OK", stats["ok_count"])
     with col3:
-        st.metric("
+        st.metric("Mismatch", stats["mismatch_count"])
     with col4:
         st.metric("❓ Not Found", stats["not_found_count"])
 
@@ -170,7 +170,7 @@ def style_results(df: pd.DataFrame) -> pd.DataFrame:
 
 if sheets:
     st.divider()
-    st.subheader("
+    st.subheader("Resultats de la verification")
 
     tabs = []
     tab_names = []
@@ -188,7 +188,7 @@ if sheets:
 
     if results_2g:
         with tabs[tab_idx]:
-            st.markdown("###
+            st.markdown("### Verification 2G")
             display_stats(results_2g[1], "2G")
             st.dataframe(
                 style_results(results_2g[0]),
@@ -199,7 +199,7 @@ if sheets:
 
     if results_3g:
         with tabs[tab_idx]:
-            st.markdown("###
+            st.markdown("### Verification 3G")
             display_stats(results_3g[1], "3G")
             st.dataframe(
                 style_results(results_3g[0]),
@@ -210,7 +210,7 @@ if sheets:
 
     if results_lte:
         with tabs[tab_idx]:
-            st.markdown("###
+            st.markdown("### Verification LTE")
             display_stats(results_lte[1], "LTE")
             st.dataframe(
                 style_results(results_lte[0]),
@@ -221,7 +221,7 @@ if sheets:
     if excel_bytes:
         st.divider()
         st.download_button(
-            label="
+            label="Telecharger le rapport de verification (Excel)",
             data=excel_bytes,
             file_name="CIQ_Verification_Report.xlsx",
             mime="application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
```
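The `display_stats` helper above reports Total Cells, OK, Mismatch, and Not Found metrics. A minimal pandas sketch of how such counts can be derived when matching a CIQ against a dump (hypothetical column names `cell` and `freq`; the app's real matching logic is not shown in this diff):

```python
import pandas as pd


def verify(dump: pd.DataFrame, ciq: pd.DataFrame, key: str, param: str) -> dict:
    # Left-join the CIQ onto the dump so CIQ rows missing from the dump
    # show up as "left_only" in the merge indicator.
    merged = ciq.merge(dump, on=key, how="left",
                       suffixes=("_ciq", "_dump"), indicator=True)
    not_found = int((merged["_merge"] == "left_only").sum())
    found = merged[merged["_merge"] == "both"]
    ok = int((found[f"{param}_ciq"] == found[f"{param}_dump"]).sum())
    return {
        "total_cells": len(merged),
        "ok_count": ok,
        "mismatch_count": len(found) - ok,
        "not_found_count": not_found,
    }


dump = pd.DataFrame({"cell": ["A1", "A2"], "freq": [10, 20]})
ciq = pd.DataFrame({"cell": ["A1", "A2", "A3"], "freq": [10, 21, 5]})
print(verify(dump, ciq, "cell", "freq"))
```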
apps/dump_compare.py (changed)

```diff
@@ -40,7 +40,7 @@ def detect_dist_col(columns):
 
 # === Interface Streamlit ===
 
-st.title("
+st.title("Dump Compare Tool")
 st.markdown(
     ":blue[**Upload the old and new dumps, then input the object class (comma-separated) to compare**]"
 )
@@ -153,9 +153,9 @@ if st.button("Run Comparison", type="primary", use_container_width=True):
                 logs.append(f"No changes in '{sheet}'")
 
         except Exception as e:
-            logs.append(f"
+            logs.append(f"Error in '{sheet}': {e}")
 
-    st.success(f"
+    st.success(f"Comparison completed. Total changes: {total}")
     for log in logs:
         st.write(log)
```
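The comparison loop above logs per-sheet changes and a running total. A stdlib-only sketch of one way to diff two dump sheets keyed by object identifier (hypothetical `compare_sheets`; the app's real comparison logic and column handling may differ):

```python
def compare_sheets(old_rows, new_rows):
    """Diff two dump sheets given as {object_id: {param: value}} dicts.

    Returns a list of (change_kind, object_id) tuples for added,
    removed, and modified objects; unchanged objects are skipped.
    """
    changes = []
    for key in new_rows.keys() - old_rows.keys():
        changes.append(("added", key))
    for key in old_rows.keys() - new_rows.keys():
        changes.append(("removed", key))
    for key in old_rows.keys() & new_rows.keys():
        if old_rows[key] != new_rows[key]:
            changes.append(("modified", key))
    return changes
```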
apps/fnb_parser.py (changed)

```diff
@@ -243,11 +243,7 @@ def process_files_to_dataframe(uploaded_files) -> pd.DataFrame:
 
 
 def main() -> None:
-    st.set_page_config(
-        page_title="F4NB Extractor to Excel", page_icon="📄", layout="wide"
-    )
-
-    st.title("📄 F4NB Extractor to Excel")
+    st.title("F4NB Extractor to Excel")
     st.markdown(
         "Convert F4NB Word documents into a tidy Excel / DataFrame containing site & sector information.\n"
         "Upload one or many F4NB `.docx` files and hit **Process**."
@@ -292,7 +288,7 @@ def main() -> None:
     with pd.ExcelWriter(buffer, engine="xlsxwriter") as writer:
         df.to_excel(writer, index=False, sheet_name="Extract")
     st.download_button(
-        label="
+        label="Download Excel",
         data=buffer.getvalue(),
         file_name="extracted_fnb.xlsx",
         mime="application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
@@ -317,7 +313,7 @@ def main() -> None:
     )
     if not geo_df.empty:
-        st.subheader("
+        st.subheader("Site Locations")
         fig = px.scatter_map(
             geo_df,
             lat="Latitude",
```
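The export above writes the DataFrame into an in-memory buffer and hands `buffer.getvalue()` to `st.download_button`, so no temporary file touches disk. The same in-memory pattern, sketched with the stdlib `csv` module instead of `pd.ExcelWriter` (function and column names are illustrative):

```python
import csv
import io


def rows_to_csv_bytes(rows, header):
    """Serialize rows to CSV entirely in memory.

    The returned bytes can be passed straight to
    st.download_button(data=...), mirroring the Excel buffer pattern.
    """
    buffer = io.StringIO()
    writer = csv.writer(buffer)
    writer.writerow(header)
    writer.writerows(rows)
    return buffer.getvalue().encode("utf-8")
```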
apps/kpi_analysis/gsm_capacity.py (changed)

```diff
@@ -9,7 +9,7 @@ from utils.convert_to_excel import (  # Import convert_dfs from the appropriate
 )
 from utils.kpi_analysis_utils import GsmCapacity
 
-st.title("
+st.title("GSM Capacity Analysis")
 doc_col, image_col = st.columns(2)
 
 with doc_col:
```
apps/kpi_analysis/gsm_lac_load.py (changed)

```diff
@@ -173,7 +173,7 @@ def analyze_lac_load(dump_path: str, hourly_report_path: str) -> List[pd.DataFra
 
 def display_ui() -> None:
     """Display the Streamlit user interface."""
-    st.title("
+    st.title("GSM LAC Load Analysis")
     doc_col, image_col = st.columns(2)
 
     with doc_col:
```
apps/kpi_analysis/lcg_analysis.py (changed)

```diff
@@ -11,7 +11,7 @@ class LcgCapacity:
 
 
 # Streamlit UI
-st.title("
+st.title("LCG Analysis")
 doc_col, image_col = st.columns(2)
 
 with doc_col:
```
apps/kpi_analysis/lte_capacity.py (changed)

```diff
@@ -6,7 +6,7 @@ from process_kpi.process_lte_capacity import process_lte_bh_report
 from utils.convert_to_excel import convert_lte_analysis_dfs
 from utils.kpi_analysis_utils import LteCapacity
 
-st.title("
+st.title("LTE Capacity Analysis")
 doc_col, image_col = st.columns(2)
 
 with doc_col:
```
apps/kpi_analysis/lte_drop_trafic.py (changed)

```diff
@@ -76,7 +76,7 @@ if uploaded_file:
 
     if not result.empty:
         st.download_button(
-            label="
+            label="Download affected cells",
             data=convert_df(result),
             file_name="traffic_drop_cells.xlsx",
             mime="application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
```
apps/kpi_analysis/trafic_analysis.py (changed)

```diff
@@ -860,7 +860,7 @@ def monthly_data_analysis(df: pd.DataFrame) -> pd.DataFrame:
 
 
 ############################## UI #########################
-st.title("
+st.title("Global Trafic Analysis - 2G / 3G / LTE")
 doc_col, image_col = st.columns(2)
 
 with doc_col:
@@ -908,7 +908,7 @@ with sla_lte_col:
     sla_lte = st.number_input("LTE Cell availability SLA (%)", value=98.0)
 
 if len(pre_range) != 2 or len(post_range) != 2:
-    st.warning("
+    st.warning("Please select 2 dates for each period (pre and post).")
     st.stop()
 if not all([two_g_file, three_g_file, lte_file]):
     st.info("Please upload all 3 reports and select the comparison periods.")
@@ -916,7 +916,7 @@ if not all([two_g_file, three_g_file, lte_file]):
 
 # Warning if pre and post periode are the same
 if pre_range == post_range:
-    st.warning("
+    st.warning("Pre and post periode are the same.")
     st.stop()
 
 # Warning if pre and post are overlapping
@@ -1655,7 +1655,7 @@ if TraficAnalysis.last_period_df is not None:
             "Top_Critical_Sites",
         ],
    )
-    #
+    # Bouton de telechargement
    st.download_button(
        on_click="ignore",
        type="primary",
```
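The guards above reject incomplete, identical, or overlapping pre/post date ranges before any analysis runs. A testable sketch of the same checks pulled out into a plain function (hypothetical `validate_periods`; the app performs these checks inline with `st.warning` and `st.stop`):

```python
from datetime import date


def validate_periods(pre_range, post_range):
    """Return an error message for invalid pre/post ranges, or None if OK.

    Each range is a sequence of (start, end) dates, as produced by a
    two-date picker.
    """
    if len(pre_range) != 2 or len(post_range) != 2:
        return "Please select 2 dates for each period (pre and post)."
    if list(pre_range) == list(post_range):
        return "Pre and post periods are the same."
    pre_start, pre_end = pre_range
    post_start, post_end = post_range
    # Standard interval-overlap test: intervals intersect iff each
    # starts before the other one ends.
    if pre_start <= post_end and post_start <= pre_end:
        return "Pre and post periods overlap."
    return None
```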
apps/kpi_analysis/wbts_capacty.py (changed)

```diff
@@ -7,7 +7,7 @@ from utils.convert_to_excel import convert_dfs
 
 # Streamlit UI
 
-st.title("
+st.title("WBTS Capacity Analysis")
 doc_col, image_col = st.columns(2)
 
 with doc_col:
```
apps/kpi_analysis/wcel_capacity.py (changed)

```diff
@@ -10,7 +10,7 @@ from utils.convert_to_excel import convert_wcel_capacity_dfs
 
 # Streamlit UI
 
-st.title("
+st.title("WCEL Capacity Analysis")
 doc_col, image_col = st.columns(2)
 
 with doc_col:
```
apps/parameters_distribution.py (changed)

```diff
@@ -3,7 +3,7 @@ import plotly.express as px
 import streamlit as st
 from pyxlsb import open_workbook
 
-st.title("
+st.title("Parameters distribution Analyzer")
 
 uploaded_file = st.file_uploader("Upload an .xlsb Dump file", type="xlsb")
 
@@ -29,7 +29,7 @@ if uploaded_file:
 
 if parameters:
     for param in parameters:
-        st.markdown(f"---\n###
+        st.markdown(f"---\n### {param}")
         col1, col2 = st.columns(2)
 
         # Distribution table
```
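For each selected parameter the page renders a distribution table next to a chart. A minimal sketch of the underlying count-and-percentage computation (illustrative only; the app derives the values from the uploaded `.xlsb` sheet):

```python
from collections import Counter


def distribution(values):
    """Per-value counts plus share in percent for one parameter column."""
    counts = Counter(values)
    total = len(values)
    # Each entry maps a parameter value to (count, percentage).
    return {v: (c, round(100 * c / total, 2)) for v, c in counts.items()}
```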
apps/sector_kml_generator.py (changed)

```diff
@@ -5,7 +5,7 @@ import streamlit as st
 
 from utils.kml_creator import generate_kml_from_df
 
-st.title("
+st.title("Telecom Sector KML Generator")
 
 
 # display mandatory columns
@@ -86,10 +86,10 @@ if uploaded_file is not None:
 
     # Download button
     st.download_button(
-        label="
+        label="Download KML",
         data=kml_data,
         file_name=f"Sectors_kml_{datetime.now()}.kml",
         mime="application/vnd.google-earth.kml+xml",
     )
 
-    st.success("KML file generated successfully
+    st.success("KML file generated successfully.")
```
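The download button above serves KML produced by `generate_kml_from_df`. As an illustration of the payload format only, here is a minimal single-placemark KML built with plain strings (the real generator builds sector shapes from the uploaded columns; note KML coordinates are ordered longitude,latitude):

```python
def sector_placemark_kml(name, lat, lon):
    """Build a minimal, valid KML document containing one point placemark."""
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
        f"<Placemark><name>{name}</name>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point>"
        "</Placemark></Document></kml>"
    )
```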
documentations/DOC_COVERAGE.md (added)

```markdown
# Documentation Coverage Matrix

Last update: 2026-02-23

Status values:
- `done`: dedicated documentation page exists in `documentations/` and is linked in navigation
- `partial`: no dedicated page, but basic in-app guidance exists
- `todo`: no dedicated page and limited guidance

| App page | Dedicated doc | Status | Notes |
| --- | --- | --- | --- |
| `apps/database_page.py` | `documentations/database_doc.py` | done | Main DB generation flow documented |
| `apps/core_dump_page.py` | `documentations/core_dump_doc.py` | done | Input/output and parsing behavior documented |
| `apps/ue_capability_parser.py` | `documentations/ue_capability_doc.py` | done | Sheets, options, and benchmark mode documented |
| `apps/ciq_verification.py` | `documentations/ciq_verification_doc.py` | done | Required files and report structure documented |
| `apps/ciq_2g_generator.py` | `documentations/ciq_2g_doc.py` | done | 2G CIQ workflow and optional forbidden BCF documented |
| `apps/ciq_3g_generator.py` | `documentations/ciq_3g_doc.py` | done | 3G CIQ parameters and generation flow documented |
| `apps/ciq_4g_generator.py` | `documentations/ciq_4g_doc.py` | done | 4G CIQ parameters and generation flow documented |
| `apps/gps_converter.py` | `documentations/gps_converter_doc.py` | done | Excel + manual conversion workflows documented |
| `apps/distance.py` | `documentations/distance_doc.py` | done | Two-point distance workflow documented |
| `apps/multi_points_distance_calculator.py` | `documentations/multi_points_distance_doc.py` | done | Dual-dataset nearest-distance workflow documented |
| `apps/sector_kml_generator.py` | `documentations/sector_kml_doc.py` | done | Required columns and KML export documented |
| `apps/dbm_watt_calculator.py` | `documentations/dbm_watt_doc.py` | done | Unit conversion modes and constraints documented |
| `apps/fnb_parser.py` | `documentations/fnb_parser_doc.py` | done | DOCX extraction and export workflow documented |
| `apps/parameters_distribution.py` | `documentations/parameters_distribution_doc.py` | done | Per-parameter distribution analysis documented |
| `apps/dump_compare.py` | `documentations/dump_compare_doc.py` | done | Old/new dump comparison and ZIP export documented |
| `apps/clustering.py` | `documentations/clustering_doc.py` | done | Clustering methods and map output documented |
| `apps/import_physical_db.py` | `documentations/import_physical_db_doc.py` | done | Local physical DB inspection documented |
| `apps/dump_analysis.py` | `documentations/dump_analysis_doc.py` | done | Analytics module behavior documented |
| `apps/kpi_analysis/gsm_capacity.py` | `documentations/gsm_capacity_docs.py` | done | Dedicated technical/user doc exists |
| `apps/kpi_analysis/lte_capacity.py` | `documentations/lte_capacity_docs.py` | done | Dedicated technical/user doc exists |
| `apps/kpi_analysis/wbts_capacty.py` | `documentations/wbts_capacity_doc.py` | done | WBTS thresholds and outputs documented |
| `apps/kpi_analysis/wcel_capacity.py` | `documentations/wcel_capacity_doc.py` | done | WCEL thresholds/charts/maps documented |
| `apps/kpi_analysis/lcg_analysis.py` | `documentations/lcg_analysis_doc.py` | done | LCG analysis flow documented |
| `apps/kpi_analysis/gsm_lac_load.py` | `documentations/gsm_lac_load_doc.py` | done | LAC load workflow and filters documented |
| `apps/kpi_analysis/lte_drop_trafic.py` | `documentations/lte_drop_traffic_doc.py` | done | Drop detection logic and map output documented |
| `apps/kpi_analysis/trafic_analysis.py` | `documentations/trafic_analysis_doc.py` | done | Input/output and usage flow documented |
| `apps/kpi_analysis/anomalie.py` | `documentations/anomaly_detection_doc.py` | done | KPI anomaly workflow documented |

## Next Priorities
1. Add screenshots/examples for each major workflow in documentation pages.
2. Add automated doc lint checks (required sections + link validity).
3. Keep release-note driven doc updates for each new app feature.
```
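Priority 2 in the coverage matrix calls for automated doc lint checks. A minimal sketch of a required-sections check (section names taken from the standard in `README_DOCS.md`; the checker function itself is hypothetical):

```python
# Subset of the headings every documentation page must contain,
# per the 10-section standard in documentations/README_DOCS.md.
REQUIRED_SECTIONS = [
    "Objective",
    "When to use this tool",
    "Input files",
    "Step-by-step usage",
    "Outputs generated",
]


def missing_sections(doc_text):
    """Report which required headings are absent from a doc page's text."""
    return [s for s in REQUIRED_SECTIONS if s not in doc_text]
```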
documentations/README_DOCS.md (added)

```markdown
# Documentation Standard (Streamlit)

Last update: 2026-02-23

## Goal
This file defines the writing standard for all in-app documentation pages in `documentations/*.py`.

## Required Sections (for every page)
1. Objective
2. When to use this tool
3. Input files and accepted formats
4. Required columns/fields (if applicable)
5. Step-by-step usage
6. Outputs generated
7. Frequent errors and fixes
8. Minimal reproducible example
9. Known limitations
10. Version and last update date

## Writing Rules
- Write for end users first.
- Keep examples concrete and copy/paste friendly.
- Keep naming aligned with UI labels and button names.
- Avoid implementation details unless they help troubleshoot user issues.
- Include exact file extensions (`.xlsb`, `.xlsx`, `.csv`, `.txt`).

## Quality Checklist
- [ ] All 10 required sections are present.
- [ ] File formats are explicit.
- [ ] Required columns are listed exactly as expected by the app.
- [ ] At least 3 frequent user errors are documented.
- [ ] One minimal example is provided.
- [ ] A date is present in the page.

## Naming Convention
- File name: `<tool_name>_doc.py`
- Page title in navigation: `<Tool Name> Documentation`

## Review Process
1. Verify the doc against the actual page in `apps/`.
2. Open the Streamlit page and confirm terminology matches UI.
3. Update `documentations/DOC_COVERAGE.md` status.
```
documentations/anomaly_detection_doc.py (changed: page rewritten to the 10-section standard)

Old page (recoverable portion of the removed content):

```markdown
### 3. Review Results
- The application will display a list of KPIs with detected anomalies
- Select a KPI and cell to view detailed analysis
- The plot shows:
  - KPI values over time (blue line)
  - Detected change points (red markers)
  - Initial mean (gray dotted line)
  - Final mean (black dashed line)

### 4. Export Results
- Click "Generate Excel file with anomalies" to export all detected anomalies
- Each KPI is saved in a separate sheet
- The Excel file includes all data points with change point indicators

## Technical Details

### Algorithm
- Uses the PELT (Pruned Exact Linear Time) algorithm from the `ruptures` library
- Model: RBF (Radial Basis Function) kernel for detecting changes in mean
- Automatic pruning of similar change points

### Performance Considerations
- Processing time depends on:
  - Number of cells in the dataset
  - Number of KPIs
  - Length of the time series
- Large datasets may take several minutes to process
- Results are cached for better performance when adjusting parameters

## Troubleshooting

### Common Issues
1. **No anomalies detected**
   - Try reducing the penalty value
   - Check if the data contains enough variation
   - Ensure there are at least 30 data points per cell

2. **Too many false positives**
   - Increase the penalty value
   - Check data for noise or outliers
   - Consider pre-processing the data

3. **File format errors**
   - Ensure the file is not open in another program
   - Check that the file is not corrupted
   - Verify the column structure matches requirements

## Best Practices
1. Start with the default penalty value (2.5) and adjust as needed
2. For large datasets, consider processing in smaller chunks
3. Review detected anomalies in context with network events or changes
4. Regularly update the application to get the latest improvements

## Dependencies
- Python 3.7+
- pandas
- numpy
- plotly
- ruptures
- streamlit
```

New page:

```python
import streamlit as st

st.markdown(
    """
# KPI Anomaly Detection Documentation

## 1. Objective
Detect KPI trend breaks (change points) per cell and export anomaly-focused datasets for investigation.

## 2. When to use this tool
Use this page when you need to:
- detect abrupt KPI behavior changes
- isolate affected cells per KPI
- visualize change points over time
- export anomaly evidence to Excel

## 3. Input files and accepted formats
- Required: KPI file in `.csv`
- Expected delimiter in current implementation: `;`

## 4. Required columns
The first 5 columns are interpreted as:
1. date/time
2. controller
3. BTS
4. cell
5. DN
All remaining columns are treated as KPI series.

## 5. Step-by-step usage
1. Open `KPI Analysis > KPIs Anomaly Detection`.
2. Upload the KPI CSV file.
3. Set `Penalty` (lower = more sensitive, higher = less sensitive).
4. Review detected KPI list and affected cells.
5. Inspect plot with detected change points.
6. Generate and download Excel anomalies report.

## 6. Outputs generated
- interactive anomaly plot for selected KPI/cell
- per-KPI anomaly datasets
- downloadable Excel file (one sheet per KPI)

## 7. Frequent errors and fixes
- No anomalies detected.
  - Fix: reduce `Penalty` or verify KPI variability.
- Parsing/date issues.
  - Fix: check first column date format and CSV delimiter (`;`).
- Too many false positives.
  - Fix: increase `Penalty` and inspect noisy KPIs.

## 8. Minimal reproducible example
- Input: CSV with at least 30 points per cell and one KPI showing a level shift.
- Action: upload file, set penalty to `2.5`, inspect one KPI/cell.
- Expected result: chart with change-point markers and exportable Excel output.

## 9. Known limitations
- Cells with fewer than 30 points are ignored.
- Detection quality is sensitive to noise and seasonality.
- Current uploader is configured for CSV input only.

## 10. Version and update date
- Documentation version: 1.0
- Last update: 2026-02-23
"""
)
```
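The page documents PELT-based change-point detection with a `Penalty` knob (lower = more sensitive). A toy stand-in showing how a penalty trades off against segmentation gain, using a single split under a squared-error mean model; this is not the app's `ruptures` PELT/RBF implementation, only an illustration of the penalty mechanic:

```python
def best_change_point(series, penalty=2.5):
    """Single change-point search minimising two-segment squared error.

    A split is accepted only if its cost (segment errors + penalty)
    beats the cost of the unsplit series, which is exactly how a
    larger penalty suppresses weak change points.
    """
    n = len(series)

    def sse(seg):
        # Sum of squared errors around the segment mean.
        if not seg:
            return 0.0
        mean = sum(seg) / len(seg)
        return sum((x - mean) ** 2 for x in seg)

    best_idx, best_cost = None, sse(series)  # cost with no split
    for i in range(1, n):
        cost = sse(series[:i]) + sse(series[i:]) + penalty
        if cost < best_cost:
            best_idx, best_cost = i, cost
    return best_idx  # None means no split beats the penalty
```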
documentations/ciq_2g_doc.py (added)

```python
import streamlit as st

st.markdown(
    """
# CIQ 2G Generator Documentation

## 1. Objective
Generate CIQ 2G deliverables from OML dump and CIQ raw input, including optional free BCF extraction.

## 2. When to use this tool
Use this page when preparing or validating 2G CIQ before deployment.

## 3. Input files and accepted formats
- Required: dump file (`.xlsb`)
- Required for CIQ generation: CIQ raw 2G file (`.xlsx` or `.xls`)
- Optional: forbidden BCF list (`.xlsx`, `.xls`, `.xlsb`)

## 4. Required columns/fields
Forbidden BCF file should include one of these pairs:
- `BSC` and `BCF`
- or `BSCID` and `BCFID`

Optional parameters:
- `MCC` (default: 610)
- `MNC` (default: 2)

## 5. Step-by-step usage
1. Open `Apps > CIQ 2G Generator`.
2. Upload dump `.xlsb`.
3. Optionally generate and download `BCF libre`.
4. Upload CIQ brut 2G file.
5. Set MCC/MNC if needed.
6. Click `Generate` and review sheet tabs.
7. Download final CIQ Excel.

## 6. Outputs generated
- optional `BCF_LIBRE.xlsx`
- generated CIQ 2G workbook (download as `CIQ_2G.xlsx`)
- in-app sheet preview tabs

## 7. Frequent errors and fixes
- Missing dump file.
  - Fix: upload `.xlsb` first.
- CIQ generation blocked.
  - Fix: upload CIQ brut 2G file.
- Forbidden BCF mismatch.
  - Fix: verify expected column names.

## 8. Minimal reproducible example
- Input: valid dump `.xlsb` + `samples/CIQ_2G.xlsx`.
- Action: click `Generate` with default MCC/MNC.
- Expected result: sheet tabs and downloadable `CIQ_2G.xlsx`.

## 9. Known limitations
- Processing time depends on dump size.
- Source CIQ template quality directly affects output.
- Forbidden list handling depends on expected BSC/BCF column format.

## 10. Version and update date
- Documentation version: 1.0
- Last update: 2026-02-23
"""
)
```
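The doc above describes filtering against an optional forbidden BCF list keyed by `BSC`/`BCF` pairs. A minimal sketch of that filter (hypothetical helper; the app reads the pairs from the uploaded Excel file and may also accept `BSCID`/`BCFID`):

```python
def drop_forbidden_bcf(rows, forbidden):
    """Remove rows whose (BSC, BCF) pair appears in the forbidden list.

    Both arguments are lists of dicts with "BSC" and "BCF" keys.
    """
    banned = {(f["BSC"], f["BCF"]) for f in forbidden}
    return [r for r in rows if (r["BSC"], r["BCF"]) not in banned]
```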
documentations/ciq_3g_doc.py (added)

```python
import streamlit as st

st.markdown(
    """
# CIQ 3G Generator Documentation

## 1. Objective
Generate 3G CIQ outputs (WBTS/WCEL) from a raw CIQ Excel file.

## 2. When to use this tool
Use this page when building 3G CIQ outputs with naming and band conventions.

## 3. Input files and accepted formats
- Required: CIQ brut 3G file (`.xlsx` or `.xls`)

## 4. Required columns/fields
Required structure must match expectations of the 3G processing pipeline.
User-configurable inputs:
- `Year suffix` (default: `25`)
- `Bands string` (default: `G9G18U9U21L8L18L26`)

## 5. Step-by-step usage
1. Open `Apps > CIQ 3G Generator`.
2. Upload CIQ brut 3G Excel file.
3. Set `Year suffix` and `Bands string` if needed.
4. Click `Generate`.
5. Review generated sheets.
6. Download final workbook.

## 6. Outputs generated
- in-app tabs for generated sheets
- downloadable workbook `CIQ_3G.xlsx`

## 7. Frequent errors and fixes
- Missing input file.
  - Fix: upload `.xlsx`/`.xls` CIQ 3G file.
- Generation error.
  - Fix: verify CIQ template structure and required columns.
- Unexpected naming output.
  - Fix: validate `Year suffix` and `Bands string` values.

## 8. Minimal reproducible example
- Input: `samples/CIQ_3G.xlsx`
- Action: keep defaults and click `Generate`.
- Expected result: populated sheets and downloadable `CIQ_3G.xlsx`.

## 9. Known limitations
- Output quality depends on CIQ raw file quality.
- Custom naming conventions require correct user parameters.
- Schema deviations can break generation.

## 10. Version and update date
- Documentation version: 1.0
- Last update: 2026-02-23
"""
)
```
documentations/ciq_4g_doc.py
ADDED
|
@@ -0,0 +1,58 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
+import streamlit as st
+
+st.markdown(
+"""
+# CIQ 4G Generator Documentation
+
+## 1. Objective
+Generate 4G CIQ outputs from raw CIQ Excel input with configurable naming and MCC/MNC values.
+
+## 2. When to use this tool
+Use this page when preparing LTE CIQ outputs for deployment planning or verification.
+
+## 3. Input files and accepted formats
+- Required: CIQ brut 4G file (`.xlsx` or `.xls`)
+
+## 4. Required columns/fields
+Input must follow the expected 4G CIQ raw schema.
+User-configurable inputs:
+- `Year suffix` (default: `25`)
+- `Bands string` (default: `G9G18U9U21L8L18L26`)
+- `MCC` (default: 610)
+- `MNC` (default: 2)
+
+## 5. Step-by-step usage
+1. Open `Apps > CIQ 4G Generator`.
+2. Upload the CIQ brut 4G file.
+3. Set suffix/bands/MCC/MNC values.
+4. Click `Generate`.
+5. Review the generated sheet tabs.
+6. Download `CIQ_4G.xlsx`.
+
+## 6. Outputs generated
+- generated sheets preview in tabs
+- downloadable workbook `CIQ_4G.xlsx`
+
+## 7. Frequent errors and fixes
+- Missing CIQ file.
+  - Fix: upload an `.xlsx`/`.xls` input.
+- Processing error.
+  - Fix: validate the source template and required columns.
+- Wrong MCC/MNC values in output.
+  - Fix: verify numeric MCC/MNC before generation.
+
+## 8. Minimal reproducible example
+- Input: `samples/CIQ_LTE.xlsx`
+- Action: use defaults and click `Generate`.
+- Expected result: generated sheet tabs and a downloadable `CIQ_4G.xlsx`.
+
+## 9. Known limitations
+- Strict dependency on the CIQ raw schema.
+- Incorrect parameter values can propagate to outputs.
+- No dump cross-check is performed on this page.
+
+## 10. Version and update date
+- Documentation version: 1.0
+- Last update: 2026-02-23
+"""
+)
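The page's "verify numeric MCC/MNC before generation" fix can be front-loaded with a small validation helper. This is an illustrative sketch, not the page's actual code; `validate_plmn` is a hypothetical name:

```python
def validate_plmn(mcc, mnc):
    """Validate MCC/MNC before CIQ generation; returns them as ints or raises.

    MCC and MNC are each 1-3 decimal digits per the PLMN identity format."""
    mcc_i, mnc_i = int(mcc), int(mnc)  # raises ValueError for non-numeric input
    if not 0 <= mcc_i <= 999:
        raise ValueError(f"MCC out of range: {mcc}")
    if not 0 <= mnc_i <= 999:
        raise ValueError(f"MNC out of range: {mnc}")
    return mcc_i, mnc_i

print(validate_plmn("610", "2"))  # (610, 2)
```

Rejecting bad values here prevents them from propagating into the generated workbook (limitation noted in section 9 above).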
documentations/ciq_verification_doc.py
ADDED
|
@@ -0,0 +1,63 @@
+import streamlit as st
+
+st.markdown(
+"""
+# CIQ Verification Documentation
+
+## 1. Objective
+Verify CIQ parameter values against the OML dump database for 2G, 3G, and LTE.
+
+## 2. When to use this tool
+Use this page before integration/deployment to quickly detect:
+- parameter mismatches
+- missing references in the dump
+- per-technology CIQ consistency issues
+
+## 3. Input files and accepted formats
+- Required: dump file in `.xlsb`
+- Optional CIQ files: `.xlsx` or `.xls`
+  - CIQ 2G
+  - CIQ 3G
+  - CIQ LTE
+- Rule: at least one CIQ file must be uploaded.
+
+## 4. Required fields and structure
+- Dump and CIQ files must contain the expected network parameter columns used by the verification functions.
+- Each technology is processed independently when its CIQ file is provided.
+
+## 5. Step-by-step usage
+1. Open `Apps > CIQ Verification`.
+2. Upload the dump `.xlsb`.
+3. Upload one or more CIQ files (2G/3G/LTE).
+4. Click `Verifier`.
+5. Review per-technology metrics (`OK`, `Mismatch`, `Not Found`).
+6. Download the Excel verification report.
+
+## 6. Outputs generated
+- tabbed result tables per technology
+- summary metrics for each processed technology
+- downloadable Excel report: `CIQ_Verification_Report.xlsx`
+
+## 7. Frequent errors and fixes
+- Message: dump is required.
+  - Fix: upload `.xlsb` before running verification.
+- Message: at least one CIQ file required.
+  - Fix: upload at least one CIQ 2G/3G/LTE file.
+- Runtime verification error.
+  - Fix: validate CIQ headers and sheet integrity.
+
+## 8. Minimal reproducible example
+- Input: one dump `.xlsb` + one CIQ LTE `.xlsx`.
+- Action: click `Verifier`.
+- Expected result: LTE tab with status highlights + Excel report button.
+
+## 9. Known limitations
+- Verification quality depends on CIQ template consistency.
+- Mixed naming standards can produce `NOT_FOUND` results.
+- Processing large files may take noticeable time.
+
+## 10. Version and update date
+- Documentation version: 1.0
+- Last update: 2026-02-23
+"""
+)
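The `OK` / `Mismatch` / `Not Found` classification described above reduces to a lookup of each CIQ (object, parameter) pair in the dump. A minimal stdlib sketch under assumed data shapes (the real page works on Excel sheets, not tuples):

```python
def classify(ciq_rows, dump_values):
    """Classify CIQ rows against dump values.

    ciq_rows:    iterable of (object_name, parameter, ciq_value)
    dump_values: dict keyed by (object_name, parameter) -> dump value
    Returns rows extended with the dump value and a status string."""
    results = []
    for name, param, value in ciq_rows:
        dump_val = dump_values.get((name, param))
        if dump_val is None:
            status = "NOT_FOUND"   # reference missing in dump
        elif str(dump_val) == str(value):
            status = "OK"
        else:
            status = "MISMATCH"
        results.append((name, param, value, dump_val, status))
    return results
```

A naming-standard difference between CIQ and dump makes the key lookup fail, which is exactly why mixed naming produces `NOT_FOUND` (section 9).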
documentations/clustering_doc.py
ADDED
|
@@ -0,0 +1,62 @@
+import streamlit as st
+
+st.markdown(
+"""
+# Automatic Site Clustering Documentation
+
+## 1. Objective
+Cluster sites from geographic coordinates with a configurable maximum number of sites per cluster.
+
+## 2. When to use this tool
+Use this page to build operational clusters for planning, field operations, or optimization workloads.
+
+## 3. Input files and accepted formats
+- Required: one Excel file in `.xlsx`
+- Sample: `samples/Site_Clustering.xlsx`
+
+## 4. Required columns/fields
+You must select:
+- latitude column
+- longitude column
+- region column
+- site code column
+
+## 5. Step-by-step usage
+1. Open `Apps > Automatic Site Clustering`.
+2. Upload the `.xlsx` dataset.
+3. Select columns and set `Max sites per cluster`.
+4. Choose a clustering method:
+   - uniform cluster size (Hilbert curve)
+   - lower-than-max non-uniform clusters (KMeans)
+5. Optionally enable region mixing.
+6. Click `Run Clustering` and download the output.
+
+## 6. Outputs generated
+- clustered dataset with a `Cluster` column
+- cluster size charts
+- map visualization by cluster
+- downloadable file: `clustered_sites.xlsx`
+
+## 7. Frequent errors and fixes
+- Invalid map or missing points.
+  - Fix: verify numeric latitude/longitude values.
+- Unexpected cluster composition.
+  - Fix: tune `Max sites per cluster` and the method choice.
+- Empty output.
+  - Fix: ensure the uploaded file is not empty and the selected columns are correct.
+
+## 8. Minimal reproducible example
+- Input: `samples/Site_Clustering.xlsx`
+- Action: run with default `max_sites=25`, no region mixing.
+- Expected result: cluster assignment, charts, map, and downloadable Excel.
+
+## 9. Known limitations
+- KMeans outcome can vary with data distribution.
+- Hilbert strategy is coordinate-normalization based.
+- Extreme outliers can reduce cluster interpretability.
+
+## 10. Version and update date
+- Documentation version: 1.0
+- Last update: 2026-02-23
+"""
+)
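The "uniform cluster size" strategy boils down to ordering sites along a space-filling curve, then slicing the ordered list into fixed-size chunks. The sketch below substitutes a plain coordinate sort for the Hilbert-curve ordering (an assumption, to keep it stdlib-only) but shows the chunking mechanics:

```python
def uniform_clusters(sites, max_sites=25):
    """Assign near-uniform cluster ids.

    sites: list of (site_code, lat, lon) tuples.
    Sorting by coordinates makes nearby sites tend to be adjacent in the
    ordering (the real page uses a Hilbert curve for this); slicing the
    ordered list then yields clusters of exactly max_sites, except the last."""
    ordered = sorted(sites, key=lambda s: (s[1], s[2]))
    return {code: idx // max_sites for idx, (code, _lat, _lon) in enumerate(ordered)}
```

With 60 sites and `max_sites=25` this yields clusters of size 25, 25, and 10, matching the "uniform size" behavior (only the final cluster is smaller).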
documentations/core_dump_doc.py
CHANGED
|
@@ -1,34 +1,59 @@
|
+import streamlit as st
+
+st.markdown(
+"""
+# Dump Core Documentation
+
+## 1. Objective
+Parse dump core text files and extract key 2G/3G identifiers with hexadecimal-to-decimal conversion.
+
+## 2. When to use this tool
+Use this page when you need quick checks of:
+- Global cell ID vs LA cell name (2G)
+- 3G service area number and name
+- LAC / Cell ID / SAC conversions to decimal
+
+## 3. Input files and accepted formats
+- Required: one or multiple `.txt` dump core files
+
+## 4. Required fields in file content
+The parser looks for lines containing:
+- `Global cell ID`
+- `LA cell name`
+- `3G service area number`
+- `3G service area name`
+
+## 5. Step-by-step usage
+1. Open `Apps > Parse dump core`.
+2. (Optional) Download the sample file from the page.
+3. Upload one or multiple `.txt` files.
+4. Review the generated 2G and 3G tables.
+
+## 6. Outputs generated
+Two result tables are displayed:
+- `2G CORE DUMP DATA` with `LAC_DECIMAL` and `Cell_ID_DECIMAL`
+- `3G CORE DUMP DATA` with `LAC_DECIMAL` and `SAC_ID_DECIMAL`
+
+## 7. Frequent errors and fixes
+- Empty output table.
+  - Fix: confirm the uploaded file contains the expected key strings.
+- Decimal conversion errors.
+  - Fix: verify extracted IDs are valid hexadecimal values.
+- Wrong file format.
+  - Fix: upload `.txt` only.
+
+## 8. Minimal reproducible example
+- Input: `samples/dump_core.txt`
+- Action: upload the file to the page.
+- Expected result: two tables with converted decimal columns.
+
+## 9. Known limitations
+- Parsing is line-pattern based (strict keywords).
+- No Excel export button is provided on this page.
+- Mixed/irregular dump formats may require cleanup before upload.
+
+## 10. Version and update date
+- Documentation version: 1.0
+- Last update: 2026-02-23
+"""
+)
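The two operations this page performs on each matched line, keyword extraction and hex-to-decimal conversion, are simple enough to sketch in a few lines (function names are illustrative, not the app's internals):

```python
def extract_field(line, key):
    """Return the text after `key` on a dump core line, or None if the key is absent."""
    if key in line:
        return line.split(key, 1)[1].strip(" .:=\t")
    return None

def hex_to_dec(value):
    """Convert a hexadecimal LAC / Cell ID / SAC string to its decimal value."""
    return int(str(value).strip().lower().replace("0x", ""), 16)

print(extract_field("LA cell name: SITE_A", "LA cell name"))  # SITE_A
print(hex_to_dec("00C8"))  # 200
```

A non-hex token reaching `hex_to_dec` raises `ValueError`, which is the "decimal conversion errors" case in section 7.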
documentations/database_doc.py
CHANGED
|
@@ -1,71 +1,70 @@
-import streamlit as st
-
-st.markdown(
-"""
-#
-### Overview
-This app is designed to process and generate databases from uploaded dump files. It provides a user-friendly interface for uploading files, selecting database types, and downloading processed databases.
-##
-2. Select a database type: Choose a database type from the available options (2G, 3G, LTE, All, or NEI).
-3. Download the processed database: Once the processing is complete, click on the "Download" button to save the processed database in .xlsx format.
-##
-- TRX
-- MAL
-2. **3G :**
-- WCEL
-- WBTS
-- WNCEL
-3. **LTE :**
-- LNBTS
-- LNCEL
-- LNCEL_FDD
-- LNCEL_TDD
-4. **NEI :**
-- ADCE
-- ADJS
-- ADJI
-- ADJG
-- ADJW
-- BTS
-- WCEL
-5. **MRBTS :**
-- MRBTS
-6. **INVUNIT :**
-- INVUNIT
-##
-### Contact
-If you have any questions or issues with the app, please contact [Dav Melchi] at [davmelchi@gmail.com].
-"""
-)
+import streamlit as st
+
+st.markdown(
+"""
+# Databases Documentation
+
+## 1. Objective
+Generate telecom databases and KML outputs from a dump file (`.xlsb`) using the `Database processing` page.
+
+## 2. When to use this tool
+Use this page when you need one or several outputs from the same dump:
+- 2G / 3G / LTE databases
+- NEI database
+- INVUNIT database
+- ADJL / ATL / Nice datasets (when a full dump is available)
+- 2G / 3G / LTE KML files
+
+## 3. Input files and accepted formats
+- Required: one dump file in `.xlsb`
+- Optional behavior toggle: `Exclude decommissioned 2G BSCs` (enabled by default)
+
+## 4. Required sheets
+Depending on the selected outputs, the dump should contain:
+- 2G: `BTS`, `BCF`, `TRX`, `MAL`
+- 3G: `WCEL`, `WBTS`, `WNCEL`
+- LTE: `LNBTS`, `LNCEL`, `LNCEL_FDD`, `LNCEL_TDD`
+- Neighbors: `ADCE`, `ADJS`, `ADJI`, `ADJG`, `ADJW`, `BTS`, `WCEL`
+- INVUNIT: `INVUNIT`
+- Full dump extras: `MRBTS` and related full-dump dependencies
+
+## 5. Step-by-step usage
+1. Open `Apps > Generate Databases`.
+2. Upload your `.xlsb` dump.
+3. Review the available buttons (enabled according to detected sheets).
+4. Click the needed generation button (`Generate 2G DB`, `Generate All DBs`, etc.).
+5. Wait for the completion message and download the generated file.
+
+## 6. Outputs generated
+The page can generate downloadable files such as:
+- `2G database_<timestamp>.xlsx`
+- `3G database_<timestamp>.xlsx`
+- `LTE database_<timestamp>.xlsx`
+- `All databases_<timestamp>.xlsx`
+- `Neighbors databases_<timestamp>.xlsx`
+- `INVUNIT database_<timestamp>.xlsx`
+- KML files for 2G/3G/LTE
+
+## 7. Frequent errors and fixes
+- Error: required sheets missing.
+  - Fix: verify the dump contains the required tabs listed above.
+- Error: unsupported file type.
+  - Fix: upload only `.xlsb`.
+- Long processing time or apparent timeout.
+  - Fix: generate one dataset at a time for very large dumps.
+
+## 8. Minimal reproducible example
+- Input: one valid full dump `.xlsb` containing 2G/3G/LTE sheets.
+- Action: click `Generate All DBs`.
+- Expected result: a download button appears for `All databases_<timestamp>.xlsx`.
+
+## 9. Known limitations
+- Processing time increases with dump size.
+- Behavior depends on source dump consistency.
+- Some outputs are only available when the full dump requirements are met.
+
+## 10. Version and update date
+- Documentation version: 1.0
+- Last update: 2026-02-23
+"""
+)
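Step 3 above ("buttons enabled according to detected sheets") is a set-containment check over the required-sheets table in section 4. A stdlib sketch of that gating logic (the mapping mirrors the documented requirements; the function name is hypothetical):

```python
REQUIRED_SHEETS = {
    "2G": {"BTS", "BCF", "TRX", "MAL"},
    "3G": {"WCEL", "WBTS", "WNCEL"},
    "LTE": {"LNBTS", "LNCEL", "LNCEL_FDD", "LNCEL_TDD"},
    "NEI": {"ADCE", "ADJS", "ADJI", "ADJG", "ADJW", "BTS", "WCEL"},
    "INVUNIT": {"INVUNIT"},
}

def enabled_outputs(sheet_names):
    """Return the database outputs whose required dump sheets are all present."""
    available = set(sheet_names)
    return [tech for tech, needed in REQUIRED_SHEETS.items() if needed <= available]
```

Uploading a dump with only `BTS`, `BCF`, `TRX`, `MAL`, and `INVUNIT` sheets would therefore enable only the 2G and INVUNIT buttons.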
documentations/dbm_watt_doc.py
ADDED
|
@@ -0,0 +1,56 @@
+import streamlit as st
+
+st.markdown(
+"""
+# dBm Watt Calculator Documentation
+
+## 1. Objective
+Convert power values between dBm, Watt (W), and milliwatt (mW).
+
+## 2. When to use this tool
+Use this page for quick RF power conversions during planning, checks, and reporting.
+
+## 3. Input files and accepted formats
+No file upload is required.
+Manual numeric input only.
+
+## 4. Required fields
+Three conversion modes are available:
+- `dBm -> W + mW`
+- `W -> dBm + mW`
+- `mW -> dBm + W`
+
+## 5. Step-by-step usage
+1. Open `Apps > dBm <> Watt Calculator`.
+2. Select a conversion mode.
+3. Enter the input value.
+4. Click `Convert`.
+5. Read the decimal and scientific notation outputs.
+
+## 6. Outputs generated
+- converted values in two units
+- decimal and scientific formatting
+
+## 7. Frequent errors and fixes
+- Error for non-positive W input.
+  - Fix: use a value strictly greater than 0.
+- Error for non-positive mW input.
+  - Fix: use a value strictly greater than 0.
+- Unexpected value.
+  - Fix: verify the unit and conversion mode selection.
+
+## 8. Minimal reproducible example
+- Input: mode `dBm -> W + mW`, value `30`.
+- Action: click `Convert`.
+- Expected result: approximately `1 W` and `1000 mW`.
+
+## 9. Known limitations
+- Manual one-value conversion only.
+- No batch file conversion in the current version.
+- No history of previous calculations.
+
+## 10. Version and update date
+- Documentation version: 1.0
+- Last update: 2026-02-23
+"""
+)
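All three modes reduce to the standard pair of formulas P(mW) = 10^(dBm/10) and dBm = 10·log10(P(mW)), with W = mW / 1000. A minimal sketch (illustrative helper names, not the page's code):

```python
import math

def dbm_to_mw(dbm):
    """P(mW) = 10 ** (dBm / 10); divide by 1000 for Watts."""
    return 10 ** (dbm / 10)

def mw_to_dbm(mw):
    """dBm = 10 * log10(mW); the logarithm is why the input must be > 0."""
    if mw <= 0:
        raise ValueError("power must be strictly greater than 0")
    return 10 * math.log10(mw)

print(dbm_to_mw(30))    # 1000.0 mW, i.e. 1 W -- matches section 8
print(mw_to_dbm(1000))  # 30.0
```

The `> 0` guard is exactly the non-positive-input error described in section 7: 0 W has no finite dBm equivalent.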
documentations/distance_doc.py
ADDED
|
@@ -0,0 +1,62 @@
+import streamlit as st
+
+st.markdown(
+"""
+# Distance Calculator Documentation
+
+## 1. Objective
+Compute the geodesic distance (meters) between two coordinate points for each row in an Excel dataset.
+
+## 2. When to use this tool
+Use this page when you need row-level distance calculation between:
+- source and destination points
+- serving and neighbor points
+- any two coordinate pairs in the same row
+
+## 3. Input files and accepted formats
+- Required: `.xlsx` file containing coordinates for two points.
+
+## 4. Required columns/fields
+Select 4 columns from the uploaded file:
+- latitude of point 1
+- longitude of point 1
+- latitude of point 2
+- longitude of point 2
+
+Columns must be numeric or convertible to numeric coordinates.
+
+## 5. Step-by-step usage
+1. Open `Apps > Distance Calculator`.
+2. (Optional) Download the sample file.
+3. Upload the Excel file.
+4. Select the 4 coordinate columns.
+5. Click `CALCULATE DISTANCE`.
+6. Review the resulting table.
+
+## 6. Outputs generated
+- a new column: `distance_meters`
+- interactive table with the original data plus the distance result
+
+## 7. Frequent errors and fixes
+- Calculation error for one or more rows.
+  - Fix: check the selected columns contain valid numeric latitude/longitude.
+- No output after click.
+  - Fix: confirm all 4 selectors are set correctly.
+- Wrong distances.
+  - Fix: verify coordinates are in decimal degrees (not DMS text).
+
+## 8. Minimal reproducible example
+- Input: `samples/distance.xlsx`
+- Action: map the point 1/point 2 columns and click `CALCULATE DISTANCE`.
+- Expected result: `distance_meters` populated for each row.
+
+## 9. Known limitations
+- Only Excel `.xlsx` input is supported.
+- No dedicated export button on this page.
+- Invalid rows can fail row-level distance computation.
+
+## 10. Version and update date
+- Documentation version: 1.0
+- Last update: 2026-02-23
+"""
+)
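The per-row computation can be approximated with the haversine formula. This is a sketch, not the page's implementation (which computes a geodesic distance); haversine assumes a spherical Earth and typically agrees with the geodesic value to within about 0.5%:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two decimal-degree points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))
```

Feeding it DMS text instead of decimal degrees fails immediately on `math.radians`, which is the "wrong distances / decimal degrees" pitfall noted in section 7.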
documentations/dump_analysis_doc.py
ADDED
|
@@ -0,0 +1,56 @@
+import streamlit as st
+
+st.markdown(
+"""
+# Dump Analytics Documentation
+
+## 1. Objective
+Explain the analytics dashboard displayed after generating full databases with stats.
+
+## 2. When to use this tool
+Use this documentation when reviewing post-generation charts and metrics for GSM/WCDMA/LTE (FDD/TDD).
+
+## 3. Input files and accepted formats
+No direct upload in this module.
+It is fed by the `Generate All DBs and Show Stats` flow from `Apps > Generate Databases`.
+
+## 4. Required fields
+The analytics rely on precomputed structures in `utils.utils_vars` (site, GSM, WCDMA, LTE metrics).
+
+## 5. Step-by-step usage
+1. Open `Apps > Generate Databases`.
+2. Upload a full dump `.xlsb`.
+3. Click `Generate All DBs and Show Stats`.
+4. Open the Data and Chart tabs.
+5. Use this page to interpret the displayed indicators.
+
+## 6. Outputs generated
+Displayed analytics include:
+- site counts and band distributions
+- GSM controller/LAC/TRX distributions
+- WCDMA RNC/LAC/admin-state distributions
+- LTE FDD/TDD distributions (bands, PCI, TAC, admin-state)
+
+## 7. Frequent errors and fixes
+- Empty analytics sections.
+  - Fix: ensure the full dump criteria are met before stats generation.
+- Missing distributions.
+  - Fix: verify the source dump has the required technology sheets.
+- Inconsistent counts.
+  - Fix: regenerate all DBs and ensure no parsing errors occurred.
+
+## 8. Minimal reproducible example
+- Input: full dump `.xlsb` containing 2G/3G/LTE sheets.
+- Action: run `Generate All DBs and Show Stats`.
+- Expected result: populated Data and Chart tabs with multi-technology metrics.
+
+## 9. Known limitations
+- Not exposed as a standalone top-level app page.
+- Availability depends on successful preprocessing in the database workflow.
+- Large dumps can impact rendering speed.
+
+## 10. Version and update date
+- Documentation version: 1.0
+- Last update: 2026-02-23
+"""
+)
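The band/LAC/TAC distributions listed in section 6 are all frequency counts over one column of the generated databases. A stdlib sketch of that aggregation (the dashboard itself builds these from `utils.utils_vars`, not from this helper):

```python
from collections import Counter

def distribution(records, field):
    """Count records per value of `field`, e.g. cells per band or per TAC."""
    return Counter(r[field] for r in records if field in r)

cells = [{"band": "L18"}, {"band": "L8"}, {"band": "L18"}]
print(distribution(cells, "band").most_common())  # [('L18', 2), ('L8', 1)]
```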
documentations/dump_compare_doc.py
ADDED
|
@@ -0,0 +1,54 @@
+import streamlit as st
+
+st.markdown(
+"""
+# Dump Compare Documentation
+
+## 1. Objective
+Compare two dump files and detect row-level and parameter-level differences per object class.
+
+## 2. When to use this tool
+Use this page when you need to identify what changed between an old and a new dump before integration.
+
+## 3. Input files and accepted formats
+- Required: old dump `.xlsb`
+- Required: new dump `.xlsb`
+- Required: object classes (comma-separated, for example `BCF,BTS,WCEL`)
+
+## 4. Required columns/fields
+For each requested object class sheet, the tool expects a Dist Name-like column (detected by matching `dist` + `name`).
+
+## 5. Step-by-step usage
+1. Open `Apps > Dump Compare`.
+2. Upload the old and new `.xlsb` files.
+3. Enter object classes separated by commas.
+4. Click `Run Comparison`.
+5. Review the logs and download the ZIP results.
+
+## 6. Outputs generated
+- in-app logs per object class
+- ZIP archive (`differences.zip`) containing one Excel per sheet with the detected differences
+
+## 7. Frequent errors and fixes
+- Dist_Name column not found.
+  - Fix: verify sheet headers and naming consistency.
+- No common rows between dumps.
+  - Fix: verify the same object class and naming standard in both files.
+- Sheet read error.
+  - Fix: ensure the object class exists in both dumps.
+
+## 8. Minimal reproducible example
+- Input: two dumps containing sheet `BTS` with the same Dist_Name domain and at least one changed parameter.
+- Action: run the comparison on `BTS`.
+- Expected result: one `BTS_differences.xlsx` inside `differences.zip`.
+
+## 9. Known limitations
+- Header detection is pattern-based and may fail on highly custom sheets.
+- Column comparison requires matching column names after cleaning.
+- Export is split by object class only.
+
+## 10. Version and update date
+- Documentation version: 1.0
+- Last update: 2026-02-23
+"""
+)
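The two core steps above, detecting the Dist Name-like key column and diffing rows shared by both dumps, can be sketched with plain dicts (the real tool operates on Excel sheets via pandas; these helpers are illustrative):

```python
def find_dist_name_col(columns):
    """Pattern-based detection: first header containing both 'dist' and 'name'."""
    for col in columns:
        c = col.lower().replace("_", " ")
        if "dist" in c and "name" in c:
            return col
    return None

def diff_rows(old, new):
    """Diff two {dist_name: {param: value}} mappings.

    Only keys present in both dumps are compared (rows missing on either
    side cannot be diffed), and only parameters whose names match after
    cleaning -- both limitations noted in the documentation."""
    changes = {}
    for key in old.keys() & new.keys():
        diff = {p: (old[key][p], new[key][p])
                for p in old[key].keys() & new[key].keys()
                if old[key][p] != new[key][p]}
        if diff:
            changes[key] = diff
    return changes
```

An empty result from `diff_rows` with non-empty inputs usually means the Dist_Name domains do not overlap, the "no common rows" error above.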
documentations/fnb_parser_doc.py
ADDED
|
@@ -0,0 +1,59 @@
+import streamlit as st
+
+st.markdown(
+"""
+# F4NB Extractor Documentation
+
+## 1. Objective
+Extract site and sector information from F4NB Word documents (`.docx`) and export structured Excel output.
+
+## 2. When to use this tool
+Use this page when transforming design documents into tabular data for analysis and mapping.
+
+## 3. Input files and accepted formats
+- Required: one or multiple `.docx` files
+- Sample file available: `samples/FN4B.docx`
+
+## 4. Required fields in document content
+The parser looks for common labels in table cells such as:
+- code/site name
+- locality/address
+- coordinates (`X`, `Y`, `Z`)
+- sector parameters (`Azimuth`, `Height`, `Tilt mecanique`, `Tilt electrique`)
+
+## 5. Step-by-step usage
+1. Open `Apps > F4NB Extractor`.
+2. Upload one or more `.docx` files.
+3. Click `Process`.
+4. Review the extracted dataframe.
+5. Download the Excel output.
+6. Review the map if coordinates are parsed.
+
+## 6. Outputs generated
+- extracted dataframe (site + sector rows)
+- downloadable file `extracted_fnb.xlsx`
+- optional map using converted decimal coordinates
+
+## 7. Frequent errors and fixes
+- No data extracted.
+  - Fix: verify the input document follows the expected table structure.
+- Coordinate conversion fails.
+  - Fix: clean the coordinate format and direction markers.
+- Missing map points.
+  - Fix: ensure X/Y values are valid and parseable.
+
+## 8. Minimal reproducible example
+- Input: `samples/FN4B.docx`
+- Action: upload and click `Process`.
+- Expected result: sector rows displayed and Excel download available.
+
+## 9. Known limitations
+- Extraction is heuristic and template-dependent.
+- Highly custom DOCX layouts may require parser updates.
+- The map relies on successful coordinate conversion.
+
+## 10. Version and update date
+- Documentation version: 1.0
+- Last update: 2026-02-23
+"""
+)
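The label-matching heuristic in section 4 amounts to scanning table cells for known labels and taking the neighboring cell as the value. A simplified, stdlib-only stand-in for the python-docx table iteration (label set and output field names are hypothetical):

```python
def extract_pairs(rows, labels):
    """Scan table rows for (label_cell, value_cell) pairs with known labels.

    rows:   list of cell-text lists, e.g. [["Azimuth", "120"], ["Height", "35"]]
    labels: dict mapping a lowercase label to an output field name"""
    record = {}
    for cells in rows:
        for i in range(len(cells) - 1):
            key = cells[i].strip().lower()
            if key in labels:
                record[labels[key]] = cells[i + 1].strip()
    return record

print(extract_pairs([["Azimuth", "120"], ["Height", "35"]],
                    {"azimuth": "azimuth_deg", "height": "height_m"}))
# {'azimuth_deg': '120', 'height_m': '35'}
```

This also shows why extraction is template-dependent (section 9): if a document puts the value below the label instead of beside it, the pairwise scan finds nothing.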
documentations/gps_converter_doc.py
ADDED
|
@@ -0,0 +1,65 @@
+import streamlit as st
+
+st.markdown(
+"""
+# GPS Converter Documentation
+
+## 1. Objective
+Convert coordinates between Decimal and DMS (Degree-Minute-Second) formats, from Excel files or manual input.
+
+## 2. When to use this tool
+Use this page when you need to:
+- normalize coordinate format before mapping/export
+- convert DMS logs into decimal values
+- convert decimal values into DMS for reporting
+
+## 3. Input files and accepted formats
+Two workflows are available:
+- Import Excel mode: `.xlsx` file
+- Manual mode: direct coordinate input in the UI
+
+## 4. Required columns/fields
+For Excel mode, select two columns from the uploaded file:
+- latitude column
+- longitude column
+
+For manual mode:
+- Decimal -> DMS: numeric lat/lon
+- DMS -> Decimal: valid DMS strings with direction (N/S/E/W)
+
+## 5. Step-by-step usage
+1. Open `Apps > GPS Converter`.
+2. Choose a tab: `Import Excel` or `Manual Input`.
+3. In Excel mode, upload `.xlsx` and select the latitude/longitude columns.
+4. Select the conversion type (`DMS to Decimal` or `Decimal to DMS`).
+5. Click `CONVERT`.
+6. Review the converted table and map visualization.
+
+## 6. Outputs generated
+- converted latitude/longitude columns
+- interactive grid display
+- map preview for valid coordinates
+
+## 7. Frequent errors and fixes
+- Invalid decimal ranges.
+  - Fix: latitude must be [-90, 90], longitude [-180, 180].
+- DMS parse error.
+  - Fix: use a valid DMS format with a direction letter.
+- Empty map.
+  - Fix: verify the converted columns contain valid numeric coordinates.
+
+## 8. Minimal reproducible example
+- Input: `samples/Decimal_to_DMS.xlsx`
+- Action: choose the latitude/longitude columns, set `Decimal to DMS`, click `CONVERT`.
+- Expected result: converted columns + map rendered from the original decimal values.
+
+## 9. Known limitations
+- Conversion errors are strict by coordinate range.
+- The map only renders rows with valid numeric coordinates.
+- Mixed malformed rows may require source cleanup.
+
+## 10. Version and update date
+- Documentation version: 1.0
+- Last update: 2026-02-23
+"""
+)
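The Decimal -> DMS direction is simple arithmetic: whole degrees, whole minutes from the remainder, then seconds. A sketch of the conversion (illustrative helper, not the page's code; the exact output formatting the page uses may differ):

```python
def decimal_to_dms(value, is_lat=True):
    """Convert a decimal-degree coordinate to a D°M'S.SS" string with hemisphere."""
    hemispheres = ("N", "S") if is_lat else ("E", "W")
    hemi = hemispheres[0] if value >= 0 else hemispheres[1]
    v = abs(value)
    d = int(v)                      # whole degrees
    m = int((v - d) * 60)           # whole minutes from the fractional degrees
    s = (v - d - m / 60) * 3600     # remaining seconds
    return f"{d}\u00b0{m}'{s:.2f}\" {hemi}"

print(decimal_to_dms(48.8566))          # 48°51'23.76" N
print(decimal_to_dms(-1.5, is_lat=False))  # 1°30'0.00" W
```

The reverse direction is `degrees + minutes/60 + seconds/3600`, negated for S/W, which is where the strict [-90, 90] / [-180, 180] range checks from section 7 apply.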
documentations/gsm_lac_load_doc.py
ADDED
|
@@ -0,0 +1,60 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
import streamlit as st

st.markdown(
"""
# GSM LAC Load Documentation

## 1. Objective
Analyze GSM LAC paging load by combining dump data and hourly KPI report.

## 2. When to use this tool
Use this page to detect overloaded LACs and review paging behavior per BSC/LAC over time.

## 3. Input files and accepted formats
- Required: dump file in `.xlsb`
- Required: hourly KPI report in `.csv`
- Recommendation: at least 7 days of hourly KPI data

## 4. Required columns/fields
From dump processing:
- `ID_BTS`, `BSC`, `code`, `locationAreaIdLAC`, plus TRX fields
From KPI report (after cleaning):
- `BSC_name`, `BTS_name`, `Paging_messages_on_air_interface`, `DELETE_PAGING_COMMAND_c003038`, `datetime`

## 5. Step-by-step usage
1. Open `Paging Analysis > GSM LAC Load Analysis`.
2. Upload dump `.xlsb` and hourly report `.csv`.
3. Click `Analyze Data`.
4. Review max paging/utilization table.
5. Filter BSC and BSC_Lac in interactive section.
6. Review trend charts and filtered dataframe.

## 6. Outputs generated
- detailed aggregated `lac_load_df`
- max paging summary per `BSC_Lac`
- line charts for paging and delete paging command
- interactive filtered data table

## 7. Frequent errors and fixes
- CSV parsing issue.
- Fix: ensure `;` delimiter and expected KPI columns.
- Missing BSC/LAC context after merge.
- Fix: verify BTS naming and code extraction consistency.
- No output after analysis.
- Fix: ensure both files are uploaded and contain valid data.

## 8. Minimal reproducible example
- Input: one valid dump `.xlsb` + one hourly KPI CSV with paging fields.
- Action: run analysis and filter one BSC.
- Expected result: summary table and non-empty trend charts.

## 9. Known limitations
- Depends on naming consistency for code extraction from BTS name.
- Utilization formula uses fixed reference value in implementation.
- Hourly report schema deviations can break analysis.

## 10. Version and update date
- Documentation version: 1.0
- Last update: 2026-02-23
"""
)
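The max-paging summary per `BSC_Lac` can be sketched as below. The record layout is illustrative, and `PAGING_CAPACITY_PER_HOUR` is an assumed placeholder for the fixed reference value mentioned in the limitations.

```python
# Hypothetical hourly paging records: (bsc_lac, datetime, paging_messages)
records = [
    ("BSC1_101", "2026-02-20 10:00", 40000),
    ("BSC1_101", "2026-02-20 11:00", 52000),
    ("BSC2_205", "2026-02-20 10:00", 15000),
]

PAGING_CAPACITY_PER_HOUR = 60000  # assumed fixed reference value

# Peak paging per BSC_Lac across the observation window
max_paging = {}
for lac, _, paging in records:
    max_paging[lac] = max(max_paging.get(lac, 0), paging)

# Utilization as a percentage of the fixed reference
utilization = {
    lac: round(100 * peak / PAGING_CAPACITY_PER_HOUR, 1)
    for lac, peak in max_paging.items()
}
```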
documentations/import_physical_db_doc.py
ADDED
@@ -0,0 +1,51 @@
import streamlit as st

st.markdown(
"""
# Physical Database Verification Documentation

## 1. Objective
Display the current physical database reference used by the app.

## 2. When to use this tool
Use this page when you need to inspect the latest physical database content before validation tasks.

## 3. Input files and accepted formats
No user upload is required.
The page reads the local file:
- `physical_db/physical_database.csv`

## 4. Required columns/fields
No fixed schema is enforced by this page. It displays the CSV as-is.

## 5. Step-by-step usage
1. Open `Apps > Physical Database Verification`.
2. Click `Show actual physical database`.
3. Review the displayed table.

## 6. Outputs generated
- data table rendered from `physical_database.csv`

## 7. Frequent errors and fixes
- File not found warning.
- Fix: ensure `physical_db/physical_database.csv` exists in repository.
- Empty table.
- Fix: verify CSV has data rows and valid delimiter.
- Parsing issue.
- Fix: confirm CSV encoding and structure.

## 8. Minimal reproducible example
- Input: existing `physical_db/physical_database.csv`
- Action: click show button.
- Expected result: full CSV displayed as Streamlit dataframe.

## 9. Known limitations
- Read-only page (no edit/upload flow).
- No filtering/export controls in current implementation.
- Depends entirely on local file availability.

## 10. Version and update date
- Documentation version: 1.0
- Last update: 2026-02-23
"""
)
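The page's read-and-guard behavior reduces to a few lines. This sketch uses the stdlib `csv` module rather than the app's own call, and `load_physical_db` is a hypothetical name.

```python
import csv
from pathlib import Path

DB_PATH = Path("physical_db/physical_database.csv")

def load_physical_db(path: Path = DB_PATH):
    """Return the CSV rows as dicts, or None when the file is missing
    (the page shows a warning in that case)."""
    if not path.exists():
        return None
    with path.open(newline="", encoding="utf-8") as fh:
        return list(csv.DictReader(fh))
```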
documentations/index_doc.py
ADDED
@@ -0,0 +1,64 @@
import streamlit as st

st.title("Documentation Home")
st.markdown(
"""
Welcome to the in-app documentation hub.

Use this page as a starting point to:
- pick the right tool for your workflow
- check input formats and required columns
- troubleshoot common issues quickly

Recommended reading order for new users:
1. Databases Documentation
2. CIQ Verification Documentation
3. UE Capability Documentation
4. GPS / Distance / Sector KML documentation
"""
)

st.subheader("Core Documentation")
st.page_link("documentations/database_doc.py", label="Databases Documentation")
st.page_link("documentations/core_dump_doc.py", label="Dump Core Documentation")
st.page_link("documentations/ue_capability_doc.py", label="UE Capability Documentation")
st.page_link("documentations/ciq_verification_doc.py", label="CIQ Verification Documentation")
st.page_link("documentations/ciq_2g_doc.py", label="CIQ 2G Documentation")
st.page_link("documentations/ciq_3g_doc.py", label="CIQ 3G Documentation")
st.page_link("documentations/ciq_4g_doc.py", label="CIQ 4G Documentation")

st.subheader("Geo Utilities")
st.page_link("documentations/gps_converter_doc.py", label="GPS Converter Documentation")
st.page_link("documentations/distance_doc.py", label="Distance Calculator Documentation")
st.page_link("documentations/sector_kml_doc.py", label="Sector KML Generator Documentation")
st.page_link("documentations/dbm_watt_doc.py", label="dBm Watt Documentation")

st.subheader("KPI and Capacity")
st.page_link("documentations/gsm_capacity_docs.py", label="GSM Capacity Documentation")
st.page_link("documentations/lte_capacity_docs.py", label="LTE Capacity Documentation")
st.page_link("documentations/wbts_capacity_doc.py", label="WBTS Capacity Documentation")
st.page_link("documentations/wcel_capacity_doc.py", label="WCEL Capacity Documentation")
st.page_link("documentations/lcg_analysis_doc.py", label="LCG Analysis Documentation")
st.page_link("documentations/gsm_lac_load_doc.py", label="GSM LAC Load Documentation")
st.page_link("documentations/trafic_analysis_doc.py", label="Traffic Analysis Documentation")
st.page_link("documentations/anomaly_detection_doc.py", label="Anomaly Detection Documentation")

st.subheader("Operational Tools")
st.page_link("documentations/dump_compare_doc.py", label="Dump Compare Documentation")
st.page_link("documentations/clustering_doc.py", label="Clustering Documentation")
st.page_link("documentations/multi_points_distance_doc.py", label="Multi Points Distance Documentation")
st.page_link("documentations/import_physical_db_doc.py", label="Physical Database Documentation")
st.page_link("documentations/dump_analysis_doc.py", label="Dump Analytics Documentation")
st.page_link("documentations/fnb_parser_doc.py", label="F4NB Extractor Documentation")
st.page_link("documentations/parameters_distribution_doc.py", label="Parameters Distribution Documentation")
st.page_link("documentations/lte_drop_traffic_doc.py", label="LTE Drop Traffic Documentation")

st.subheader("Governance")
st.markdown(
"""
- Coverage matrix: `documentations/DOC_COVERAGE.md`
- Writing standard: `documentations/README_DOCS.md`

Last update: 2026-02-23
"""
)
documentations/lcg_analysis_doc.py
ADDED
@@ -0,0 +1,58 @@
import streamlit as st

st.markdown(
"""
# LCG Capacity Documentation

## 1. Objective
Analyze LCG utilization behavior and detect imbalance/capacity pressure.

## 2. When to use this tool
Use this page for LCG-level dimensioning checks and region-level optimization decisions.

## 3. Input files and accepted formats
- Required: LCG report in `.csv`
- Data recommendation: at least 3 days, daily aggregated, LCG-level export

## 4. Required columns/fields
Input CSV must include columns expected by `load_and_process_lcg_data`.
Outputs include `final_comments` and geographic fields for map plotting.

## 5. Step-by-step usage
1. Open `Capacity Analysis > LCG Capacity Analysis`.
2. Upload LCG CSV report.
3. Set analysis days, threshold days, utilization threshold, and LCG difference threshold.
4. Click `Analyze Data`.
5. Review tables, charts, and map.
6. Download report.

## 6. Outputs generated
- LCG analysis dataframe
- KPI dataframe
- final comment distributions (global and per region)
- map of final comments
- downloadable file: `LCG_Capacity_Report.xlsx`

## 7. Frequent errors and fixes
- Validation warning on days settings.
- Fix: ensure threshold days <= analysis days and analysis days >= 3.
- Processing error.
- Fix: verify CSV schema and numeric fields.
- Empty map.
- Fix: confirm presence of `Longitude`/`Latitude` and non-empty comments.

## 8. Minimal reproducible example
- Input: valid LCG CSV with at least 7 days.
- Action: run default settings.
- Expected result: analysis table, distributions, map, and Excel download.

## 9. Known limitations
- Quality depends on source KPI completeness.
- Thresholds may require tuning by market behavior.
- Map view requires valid coordinates.

## 10. Version and update date
- Documentation version: 1.0
- Last update: 2026-02-23
"""
)
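The day-settings validation behind step 3 can be sketched as follows. The rule is an assumption inferred from the warning described in section 7, not the module's exact code.

```python
def validate_windows(analysis_days, threshold_days, min_days=3):
    """Check the analysis/threshold day settings before running.

    Assumed rule: the analysis window must cover at least `min_days`,
    and the threshold window cannot exceed the analysis window.
    """
    if analysis_days < min_days:
        return False, f"analysis days must be >= {min_days}"
    if threshold_days > analysis_days:
        return False, "threshold days must be <= analysis days"
    return True, "ok"
```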
documentations/lte_drop_traffic_doc.py
ADDED
@@ -0,0 +1,60 @@
import streamlit as st

st.markdown(
"""
# LTE Drop Traffic Documentation

## 1. Objective
Detect LTE cells with significant traffic drop by comparing recent days against long-term average.

## 2. When to use this tool
Use this page for early detection of degraded LTE traffic behavior and location-based prioritization.

## 3. Input files and accepted formats
- Required: traffic report in `.csv`
- Current parser expects `;` delimiter.

## 4. Required columns/fields
Input CSV should contain at least:
- `PERIOD_START_TIME` (format `%m.%d.%Y`)
- `LNCEL name`
- `4G/LTE DL Traffic Volume (GBytes)`
- `4G/LTE UL Traffic Volume (GBytes)`

## 5. Step-by-step usage
1. Open `KPI Analysis > LTE Drop Traffic Analysis`.
2. Upload CSV report.
3. Set number of last days and drop threshold.
4. Review affected cells table.
5. Explore trend plot by selected cell.
6. Review map and download affected cells.

## 6. Outputs generated
- affected cells dataframe with `drop_%`
- downloadable file `traffic_drop_cells.xlsx`
- trend plot per selected cell
- map of dropped cells

## 7. Frequent errors and fixes
- Date parsing error.
- Fix: ensure `PERIOD_START_TIME` matches `%m.%d.%Y` format.
- Empty affected cells list.
- Fix: lower threshold or increase analysis window.
- Missing map coordinates.
- Fix: ensure physical DB mapping by code is available.

## 8. Minimal reproducible example
- Input: LTE traffic CSV with several days and measurable drop on some cells.
- Action: set threshold to `50%` and run analysis.
- Expected result: non-empty affected cell list and downloadable Excel.

## 9. Known limitations
- Uses mean-based long-term vs recent-day comparison.
- Relies on code matching against physical database for map.
- Strict date format requirement in input parser.

## 10. Version and update date
- Documentation version: 1.0
- Last update: 2026-02-23
"""
)
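The mean-based comparison behind `drop_%` can be sketched like this (assumed definition; the page's exact formula may differ):

```python
def drop_percent(daily_volumes, last_days=3):
    """Compare the mean of the last `last_days` against the mean of the
    preceding (long-term) days; positive values indicate a drop."""
    baseline = daily_volumes[:-last_days]
    recent = daily_volumes[-last_days:]
    if not baseline:
        raise ValueError("need more days than last_days")
    base_mean = sum(baseline) / len(baseline)
    recent_mean = sum(recent) / len(recent)
    return round(100 * (base_mean - recent_mean) / base_mean, 1)
```

A cell averaging 10 GB/day that falls to 5 GB/day over the last 3 days scores 50.0, so it would be flagged at the `50%` threshold used in the example above.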
documentations/multi_points_distance_doc.py
ADDED
@@ -0,0 +1,62 @@
import streamlit as st

st.markdown(
"""
# Multi Points Distance Calculator Documentation

## 1. Objective
Calculate distances between two datasets of points and identify closest matches.

## 2. When to use this tool
Use this page to compare reference sites against candidate/neighbor sites and find nearest pairs.

## 3. Input files and accepted formats
- Required: Dataset 1 in `.xlsx`
- Required: Dataset 2 in `.xlsx`
- Samples: `samples/Dataset1.xlsx`, `samples/Dataset2.xlsx`

## 4. Required columns/fields
For each dataset, select:
- code column
- latitude column
- longitude column

## 5. Step-by-step usage
1. Open `Apps > Multi Points Distance Calculator`.
2. Upload both Excel datasets.
3. Select code/latitude/longitude columns for each dataset.
4. Set minimum distance threshold (km).
5. Click `Calculate Distances`.
6. Review closest matches and download CSV outputs.

## 6. Outputs generated
- closest matches table
- closest matches below threshold table
- downloadable files:
  - `all_distances.csv`
  - `closest_matches.csv`
  - `closest_matches_<threshold>km.csv`

## 7. Frequent errors and fixes
- Processing error after upload.
- Fix: verify selected columns and coordinate formats.
- No close matches below threshold.
- Fix: increase threshold value.
- Unexpected distances.
- Fix: ensure coordinates are decimal degrees.

## 8. Minimal reproducible example
- Input: `Dataset1.xlsx` and `Dataset2.xlsx` samples.
- Action: select columns and run with threshold `5 km`.
- Expected result: closest matches plus downloadable CSV files.

## 9. Known limitations
- Only `.xlsx` inputs are supported.
- Quality depends on coordinate accuracy.
- Large datasets may take longer to compute.

## 10. Version and update date
- Documentation version: 1.0
- Last update: 2026-02-23
"""
)
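The closest-match computation reduces to a pairwise great-circle distance scan. This is a self-contained sketch using the haversine formula; the app's implementation may structure it differently.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two decimal-degree points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # mean Earth radius ~6371 km

def closest_matches(dataset1, dataset2):
    """For each (code, lat, lon) in dataset1, find the nearest point
    of dataset2 and its distance."""
    out = {}
    for code1, lat1, lon1 in dataset1:
        dist, code2 = min(
            (haversine_km(lat1, lon1, lat2, lon2), c2) for c2, lat2, lon2 in dataset2
        )
        out[code1] = {"match": code2, "distance_km": round(dist, 3)}
    return out
```

The scan is O(n×m) over the two datasets, which is consistent with the note that large inputs take longer to compute.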
documentations/parameters_distribution_doc.py
ADDED
@@ -0,0 +1,57 @@
import streamlit as st

st.markdown(
"""
# Parameters Distribution Documentation

## 1. Objective
Explore value distributions of selected parameters in any object class sheet from an OML dump.

## 2. When to use this tool
Use this page for quick parameter profiling, outlier spotting, and baseline checks by object class.

## 3. Input files and accepted formats
- Required: dump file in `.xlsb`

## 4. Required columns/fields
No fixed parameter list is required.
Workflow:
- select object class (sheet)
- select one or multiple parameters (columns)

## 5. Step-by-step usage
1. Open `Apps > Parameters distribution`.
2. Upload dump `.xlsb`.
3. Select object class sheet.
4. Select one or more parameters.
5. Review distribution table and bar chart per parameter.

## 6. Outputs generated
For each selected parameter:
- value count table
- percentage distribution
- bar chart

## 7. Frequent errors and fixes
- Selected sheet appears empty.
- Fix: verify sheet contains readable data rows.
- Parameter list missing expected fields.
- Fix: check selected object class is correct.
- Large cardinality chart is hard to read.
- Fix: filter/clean source data before analysis.

## 8. Minimal reproducible example
- Input: valid `.xlsb` dump.
- Action: select sheet `BTS`, then one categorical parameter.
- Expected result: count/percentage table and bar chart.

## 9. Known limitations
- High-cardinality fields can produce very dense charts.
- No export button in current implementation.
- Requires readable xlsb sheet via current engine.

## 10. Version and update date
- Documentation version: 1.0
- Last update: 2026-02-23
"""
)
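Per selected parameter, the count and percentage table amounts to a single value-count pass, sketched here with the stdlib `Counter` (the page itself works on dataframe columns):

```python
from collections import Counter

def distribution(values):
    """Count and percentage table for one parameter column,
    ordered from most to least frequent."""
    counts = Counter(values)
    total = sum(counts.values())
    return {
        value: {"count": n, "percent": round(100 * n / total, 2)}
        for value, n in counts.most_common()
    }
```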
documentations/sector_kml_doc.py
ADDED
@@ -0,0 +1,63 @@
import streamlit as st

st.markdown(
"""
# Sector KML Generator Documentation

## 1. Objective
Generate a KML file from sector-level Excel data for geographic visualization.

## 2. When to use this tool
Use this page when you need to:
- visualize telecom sectors in GIS/KML viewers
- share sector orientation and metadata
- prepare map overlays for field/optimization teams

## 3. Input files and accepted formats
- Required: one `.xlsx` file containing sector data.

## 4. Required columns
The uploaded file must contain all required columns:
- `code`
- `name`
- `Azimut`
- `Longitude`
- `Latitude`
- `size`
- `color`

Any additional column is exported in sector description metadata.

## 5. Step-by-step usage
1. Open `Apps > Sector KML Generator`.
2. (Optional) Download sample file from the page.
3. Upload your Excel file.
4. Ensure required columns are present.
5. Download generated KML.

## 6. Outputs generated
- downloadable KML file named like `Sectors_kml_<timestamp>.kml`

## 7. Frequent errors and fixes
- Missing required columns error.
- Fix: rename columns exactly as required.
- Empty/invalid geometry in output.
- Fix: verify `Latitude`/`Longitude` and azimuth values.
- Unexpected style/color rendering.
- Fix: validate color codes and supported color naming.

## 8. Minimal reproducible example
- Input: `samples/Sector_kml.xlsx`
- Action: upload file and click download when generated.
- Expected result: valid KML file ready for map tools.

## 9. Known limitations
- Input schema is case-sensitive for required column names.
- Only `.xlsx` is supported.
- Invalid coordinate values may produce unusable geometry.

## 10. Version and update date
- Documentation version: 1.0
- Last update: 2026-02-23
"""
)
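The required-column check in step 4 can be sketched as below; matching is case-sensitive, as noted in the limitations. `missing_columns` is a hypothetical helper name.

```python
REQUIRED_COLUMNS = {"code", "name", "Azimut", "Longitude", "Latitude", "size", "color"}

def missing_columns(columns):
    """Return required columns absent from the uploaded sheet
    (case-sensitive comparison, as on the page)."""
    return sorted(REQUIRED_COLUMNS - set(columns))
```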
documentations/trafic_analysis_doc.py
CHANGED
@@ -1,118 +1,68 @@
 import streamlit as st
 
 st.markdown(
 """
-- Set the number of top traffic sites to display
-- The application will automatically process and analyze the data
-
-### 4. Results
-- **Summary Analysis**: Overview of traffic metrics
-- **Top Sites**: Lists and charts of highest traffic sites
-- **Maps**: Geographical visualization of traffic distribution
-- **Monthly Trends**: Traffic patterns over time
-
-## Technical Implementation
-
-### Key Functions
-
-1. **Data Processing**
-- preprocess_2g(): Processes 2G traffic data
-- preprocess_3g(): Processes 3G traffic data
-- preprocess_lte(): Processes LTE traffic data
-
-2. **Analysis Functions**
-- merge_and_compare(): Combines and compares traffic data across technologies
-- monthly_data_analysis(): Aggregates data by month for trend analysis
-
-3. **Visualization**
-- Interactive maps using Plotly
-- Bar charts for top sites
-- Data tables with sortable columns
-
-## Dependencies
-- Python 3.7+
-- pandas
-- plotly
-- streamlit
-- numpy
-
-## Output
-The application generates:
-1. Summary tables of traffic metrics
-2. Interactive visualizations
-3. Exportable reports in Excel format
-
-## Best Practices
-1. Ensure input files follow the required format
-2. Select appropriate date ranges for meaningful comparison
-3. Review top traffic sites for network optimization opportunities
-4. Use the monthly analysis to identify long-term trends
-
-## Troubleshooting
-- If data doesn't load, check file formats and required columns
-- Ensure date ranges are properly selected
-- Verify that uploaded files contain valid data
+# Traffic Analysis Documentation
+
+## 1. Objective
+Analyze and compare 2G/3G/LTE traffic behavior across pre/post periods and visualize trends, top sites, and maps.
+
+## 2. When to use this tool
+Use this page when you need:
+- pre/post traffic comparison
+- top traffic site ranking
+- trend visualization over time
+- map-based traffic review
+- exportable summary outputs
+
+## 3. Input files and accepted formats
+Typical inputs are KPI traffic exports in `.csv` (or zip containing CSV, depending on source export).
+You usually provide one report per RAT:
+- 2G traffic report
+- 3G traffic report
+- LTE traffic report
+
+## 4. Required fields (typical)
+Required columns depend on RAT and source report format. Common fields include:
+- site/cell identifiers (`BCF name`, `WBTS name`, `LNBTS name`)
+- timestamp/date (`PERIOD_START_TIME`)
+- traffic metrics (voice/data indicators by RAT)
+
+## 5. Step-by-step usage
+1. Open `KPI Analysis > Trafic Analysis`.
+2. Upload available traffic reports (2G/3G/LTE).
+3. Configure period filters (pre/post/last period).
+4. Choose analysis options (top N, thresholds, drill-down filters).
+5. Review tables, charts, and maps.
+6. Download generated report when available.
+
+## 6. Outputs generated
+- comparative summary tables
+- top sites by voice/data traffic
+- trend charts
+- map visualizations
+- downloadable report (Excel) when generated
+
+## 7. Frequent errors and fixes
+- Empty charts or missing KPIs.
+- Fix: verify required traffic columns exist and are numeric.
+- Date filtering produces no data.
+- Fix: confirm date format and selected ranges.
+- Unexpected aggregation result.
+- Fix: check identifier consistency (site/cell naming between reports).
+
+## 8. Minimal reproducible example
+- Input: one valid 2G, one valid 3G, one valid LTE traffic CSV covering overlapping dates.
+- Action: select pre/post periods and run analysis.
+- Expected result: non-empty summary tables and top-sites charts.
+
+## 9. Known limitations
+- Results depend on source export quality and naming consistency.
+- Very large reports can increase rendering time.
+- Mixed naming standards across RATs can reduce merge quality.
+
+## 10. Version and update date
+- Documentation version: 1.0
+- Last update: 2026-02-23
 """
 )
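A minimal sketch of the pre/post comparison behind the summary tables. `pre_post_delta` is a hypothetical single-RAT reduction of the app's `merge_and_compare`, which additionally merges across technologies:

```python
def pre_post_delta(rows, pre_dates, post_dates):
    """Aggregate traffic per site for two date sets and compute the delta.

    `rows` is an iterable of (site, date, traffic) tuples; dates outside
    both sets are ignored.
    """
    pre, post = {}, {}
    for site, date, traffic in rows:
        if date in pre_dates:
            pre[site] = pre.get(site, 0) + traffic
        elif date in post_dates:
            post[site] = post.get(site, 0) + traffic
    return {
        site: {
            "pre": pre.get(site, 0),
            "post": post.get(site, 0),
            "delta": post.get(site, 0) - pre.get(site, 0),
        }
        for site in set(pre) | set(post)
    }
```

Site naming must be consistent between the pre and post reports, which is why mixed naming standards reduce merge quality.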
documentations/wbts_capacity_doc.py
ADDED
@@ -0,0 +1,57 @@
import streamlit as st

st.markdown(
"""
# WBTS Capacity Documentation

## 1. Objective
Analyze WBTS resource usage and classify capacity risk based on BB/CE comments.

## 2. When to use this tool
Use this page when planning WBTS upgrades or investigating BB/CE load pressure.

## 3. Input files and accepted formats
- Required: WBTS capacity report in `.csv`
- Data recommendation: at least 3 days, daily aggregated, WBTS-level export

## 4. Required columns/fields
Input CSV must contain the fields expected by `process_kpi.process_wbts_capacity.load_data`.
Typical outputs include `bb_comments` and `ce_comments`.

## 5. Step-by-step usage
1. Open `Capacity Analysis > WBTS Capacity BB and CE Analysis`.
2. Upload WBTS CSV report.
3. Set analysis days, threshold days, and threshold value.
4. Click `Analyze Data`.
5. Review table and BB/CE distributions.
6. Download Excel report.

## 6. Outputs generated
- analysis table with WBTS comments
- BB comments distribution chart
- CE comments distribution chart
- downloadable file: `WBTS_Analysis_Report.xlsx`

## 7. Frequent errors and fixes
- Parsing/analysis error.
- Fix: verify CSV structure and required columns.
- Empty comment distributions.
- Fix: ensure uploaded report contains valid WBTS-level KPI rows.
- Unexpected threshold behavior.
- Fix: confirm threshold values align with KPI scale.

## 8. Minimal reproducible example
- Input: WBTS daily aggregated CSV with >= 3 days.
- Action: run default parameters and analyze.
- Expected result: populated WBTS table + Excel download button.

## 9. Known limitations
- Relies on expected input schema from processing module.
- Very short time windows may produce weak conclusions.
- No geographic map visualization on this page.

## 10. Version and update date
- Documentation version: 1.0
- Last update: 2026-02-23
"""
)
|
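The WBTS page's "threshold days over a threshold value" classification can be sketched with pandas. This is a hypothetical illustration, not the actual logic of `process_kpi.process_wbts_capacity.load_data`: the column names `wbts_id` and `bb_usage` and the comment labels are assumptions for the example.

```python
import pandas as pd

# Hypothetical sketch: flag a WBTS when a usage KPI exceeds `threshold_value`
# on at least `threshold_days` of the analysed days. Column names are
# illustrative, not the real schema used by the processing module.
def classify_usage(df: pd.DataFrame, kpi: str,
                   threshold_value: float, threshold_days: int) -> pd.Series:
    """Return 'Upgrade candidate' / 'OK' per WBTS based on days over threshold."""
    days_over = (
        df.assign(over=df[kpi] > threshold_value)
          .groupby("wbts_id")["over"]
          .sum()                       # count of days above the threshold
    )
    return days_over.apply(
        lambda n: "Upgrade candidate" if n >= threshold_days else "OK"
    )

# Three daily samples for two WBTS: W1 is consistently loaded, W2 is not.
data = pd.DataFrame({
    "wbts_id": ["W1", "W1", "W1", "W2", "W2", "W2"],
    "bb_usage": [92.0, 95.0, 88.0, 40.0, 45.0, 42.0],
})
bb_comments = classify_usage(data, "bb_usage",
                             threshold_value=85.0, threshold_days=2)
print(bb_comments.to_dict())  # {'W1': 'Upgrade candidate', 'W2': 'OK'}
```

The same pattern would apply to the CE comments with a different KPI column and thresholds tuned to the CE scale.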
documentations/wcel_capacity_doc.py ADDED
@@ -0,0 +1,62 @@
+import streamlit as st
+
+st.markdown(
+"""
+# WCEL Capacity Documentation
+
+## 1. Objective
+Evaluate WCEL capacity and service quality using availability, congestion, and fail-related thresholds.
+
+## 2. When to use this tool
+Use this page for 3G capacity monitoring, issue prioritization, and region-level reporting.
+
+## 3. Input files and accepted formats
+- Required: WCEL capacity report in `.csv`
+- Data recommendation: at least 3 days, daily aggregated, WCEL-level report
+
+## 4. Required columns/fields
+Input CSV must provide fields required by `load_and_process_wcel_capacity_data`.
+Output analysis includes fields such as:
+- `final_comments`
+- `operational_comments`
+- `fails_comments`
+- coordinates (`Latitude`, `Longitude`) for map views
+
+## 5. Step-by-step usage
+1. Open `Capacity Analysis > WCEL Capacity Analysis`.
+2. Upload WCEL CSV report.
+3. Configure thresholds and analysis windows.
+4. Click `Analyze Data`.
+5. Review analysis table, distributions, and maps.
+6. Download report.
+
+## 6. Outputs generated
+- detailed WCEL analysis dataframe
+- KPI dataframe
+- comment distribution charts (global and by region)
+- map views for fails and operational comments
+- downloadable file: `WCEL_Capacity_Report.xlsx`
+
+## 7. Frequent errors and fixes
+- Processing error.
+  - Fix: validate CSV structure and numeric KPI fields.
+- Empty maps.
+  - Fix: ensure `Latitude` and `Longitude` are populated.
+- Over/under sensitivity.
+  - Fix: tune thresholds to match operational baseline.
+
+## 8. Minimal reproducible example
+- Input: WCEL CSV with valid KPI and coordinate fields.
+- Action: run with defaults and review output charts.
+- Expected result: analysis table + map visualizations + downloadable report.
+
+## 9. Known limitations
+- Depends on strict KPI schema expected by processing module.
+- Coordinate gaps reduce map coverage.
+- Large datasets can increase rendering time.
+
+## 10. Version and update date
+- Documentation version: 1.0
+- Last update: 2026-02-23
+"""
+)
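The WCEL doc's "Empty maps" fix (populated `Latitude` / `Longitude`) can be checked before plotting. A minimal sketch, assuming the app filters the analysis dataframe to numeric, non-null coordinates prior to any map view; `mappable_rows` and the `wcel_id` column are illustrative names, not part of the actual module.

```python
import pandas as pd

# Hypothetical pre-map cleanup: rows whose coordinates are missing or
# non-numeric cannot be placed on a map, so drop them first. The
# `Latitude` / `Longitude` names mirror the documented fields.
def mappable_rows(df: pd.DataFrame) -> pd.DataFrame:
    coords = df[["Latitude", "Longitude"]].apply(pd.to_numeric, errors="coerce")
    mask = coords.notna().all(axis=1)   # keep rows with both coords valid
    return df.loc[mask]

cells = pd.DataFrame({
    "wcel_id": ["C1", "C2", "C3"],
    "Latitude": [36.75, None, "bad"],   # C2 missing, C3 non-numeric
    "Longitude": [3.06, 3.10, 3.20],
    "final_comments": ["Congestion", "OK", "OK"],
})
print(mappable_rows(cells)["wcel_id"].tolist())  # ['C1']
```

Running this kind of filter also explains the "Coordinate gaps reduce map coverage" limitation: rows dropped here still appear in the table and charts, only the map loses them.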