zhilinw committed
Commit 2a39d2e
1 Parent(s): c1f346b

Update README.md

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -48,11 +48,11 @@ Llama3-70B-SteerLM-RM is trained with NVIDIA [NeMo-Aligner](https://github.com/N
  | Model | Type of Model| Overall | Chat | Chat-Hard | Safety | Reasoning |
  |:-----------------------------|:----------------|:-----|:----------|:-------|:----------|:-----------------------|
  | Nemotron-4-340B-SteerLM-RM | Proprietary LLM| **91.6** | 95.5 |**86.4** | 90.8 | 93.6 |
- | ArmoRM-Llama3-8B-v0.1 | Trained with GPT4 Data| 90.8 | 96.9 | 76.8 | 92.2 | 97.3 |
+ | ArmoRM-Llama3-8B-v0.1 | Trained with GPT4 Generated Data| 90.8 | 96.9 | 76.8 | 92.2 | 97.3 |
  | Cohere May 2024 | Proprietary LLM | 89.5 | 96.4 | 71.3 | **92.7** | 97.7 |
  | _**Llama3-70B-SteerLM-RM**_ | Trained with Permissive Licensed Data | 88.2 | 91.9 | 79.8 | 92.2 | 89.0 |
  | Google Gemini Pro 1.5 | Proprietary LLM | 88.1 | 92.3 | 80.6 | 87.5 | 92.0 |
- | RLHFlow-Llama3-8B | Trained with GPT4 Data | 87.1 | **98.3** | 65.8 | 89.7 | 94.7 |
+ | RLHFlow-Llama3-8B | Trained with GPT4 Generated Data | 87.1 | **98.3** | 65.8 | 89.7 | 94.7 |
  | Cohere March 2024 | Proprietary LLM | 87.1| 94.7 | 65.1 | 90.3 | **98.7** |
  | GPT-4-0125-Preview|Proprietary LLM | 85.9 | 95.3 | 74.3 | 87.2 | 86.9 |
  | Claude 3 Opus 0229 | Proprietary LLM | 80.7 | 94.7 | 60.3 | 89.1 | 78.7 |