assemsabry committed
Commit 0e4286d · verified · 1 Parent(s): 5e7d59c

Add Cloud-Pre Model Card

Files changed (1): README.md (+18 -13)
README.md CHANGED
@@ -1,16 +1,21 @@
 ---
-license: mit
-language:
-- en
-metrics:
-- mae
-base_model:
-- google/timesfm-2.0-500m-pytorch
-library_name: transformers
 tags:
-- timesfm
+- time-series-forecasting
 - cloud-computing
-- resource-forecasting
-- anomaly-detection
-- finance
----
+- timesfm
+- resource-prediction
+---
+# 🌩️ Cloud-Pre (500M)
+
+**Cloud-Pre** is a specialized time-series foundation model for forecasting cloud-environment resource usage (CPU, RAM, network).
+
+It is fine-tuned from Google's `google/timesfm-2.0-500m-pytorch`.
+
+## Model Details
+- **Architecture:** Decoder-only Transformer (TimesFM 2.0 base)
+- **Parameters:** 500 million
+- **Fine-tuning objective:** Forecasting cloud CPU/resource peaks and anomalies to support predictive auto-scaling.
+- **Developer:** Assem Sabry
+
+## Where is the training code?
+The complete open-source training pipeline and data-engineering scripts are available in my GitHub repository.
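
The predictive auto-scaling objective named in the model card can be sketched as follows. This is a minimal illustration, not part of the Cloud-Pre codebase: the `replicas_for` helper, the capacity numbers, and the hard-coded forecast values are all hypothetical; in practice the forecast would come from the model itself.

```python
import math

def replicas_for(forecast_cpu, capacity_per_replica=0.7, min_replicas=2):
    """Choose a replica count so the forecast peak CPU stays within capacity.

    `forecast_cpu` is a list of predicted fleet-wide CPU values (in cores);
    here they are hard-coded, but a forecasting model such as Cloud-Pre
    would supply them in a real pipeline (hypothetical helper, not an API
    from this repository).
    """
    peak = max(forecast_cpu)
    return max(min_replicas, math.ceil(peak / capacity_per_replica))

# Hypothetical 30-minute CPU forecast (cores) with an upcoming peak.
forecast = [1.1, 1.3, 2.8, 3.4, 2.2, 1.0]
print(replicas_for(forecast))  # → 5, scaling ahead of the predicted peak
```

The point of forecasting peaks rather than reacting to current load is that the scaler can provision the extra replicas before the spike arrives.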