---
license: mit
tags:
- generated_from_trainer
metrics:
- f1
language:
- de
model-index:
- name: BACnet-Klassifizierung-Raumlufttechnik-bert-base-german-cased
  results: []
---

# BACnet-Klassifizierung-Raumlufttechnik-bert-base-german-cased

This model is a fine-tuned version of [bert-base-german-cased](https://huggingface.co/bert-base-german-cased) on the [gart-labor](https://huggingface.co/gart-labor) "klassifizierung_rlt_v2" dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0597
- F1: [0.98461538 0.66666667 1. 1. 1. 1. 0.94736842 1. 1. 1. 1. 0.99115044 0.85714286 1. 1. 1. 1. 0. 1.]

## Model description

The model classifies components of room ventilation (air-handling) systems described with the BACnet standard into different categories. It is based on a German-language dataset.

## Intended uses & limitations

The model assigns descriptive texts to one of the following ventilation categories: Exhaust air, Exhaust air filter, Exhaust air fan, Other, Outside air damper, Humidifier, Fire protection, Fire damper, Heater, Cooler, Reheater, AHU, Room, Fan, Preheater, Heat recovery, Supply air, Supply air filter, and Supply air fan.

## Training and evaluation data

The model was trained and evaluated on a German-language dataset.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10.0

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--|
| 2.1097 | 0.99 | 18 | 1.1253 | [0.77966102 0. 0.7037037 0. 0.875 0.57142857 0. 0.94736842 0. 0.92857143 0. 0.85496183 0. 1. 0.69230769 0. 0.79569892 0. 0.53333333] |
| 0.8677 | 1.99 | 36 | 0.4032 | [0.98461538 0. 0.91666667 0.90909091 1. 1. …] |

### Framework versions

- Transformers 4.21.1
- Pytorch 1.12.0+cu113
- Datasets 2.4.0
- Tokenizers 0.12.1
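
## How to use

The following is a minimal inference sketch using the `transformers` `text-classification` pipeline. The repository ID and the example input text are assumptions for illustration only; adjust them to the actual model location and to the label mapping stored in the model config.

```python
from transformers import pipeline

# Assumed repository ID; adjust to wherever the fine-tuned model is actually hosted.
model_id = "gart-labor/BACnet-Klassifizierung-Raumlufttechnik-bert-base-german-cased"

# Text-classification pipeline loads the fine-tuned BERT model and its tokenizer.
classifier = pipeline("text-classification", model=model_id)

# Illustrative German BACnet object description (not taken from the training data).
result = classifier("Zuluftventilator Drehzahl Istwert")
print(result)
# Expected output shape: [{'label': '<category>', 'score': ...}]
# The label names depend on the label mapping of the fine-tuned model.
```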