Ekgren committed
Commit 587e2ed
1 Parent(s): fbe69a8

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -62,7 +62,7 @@ Following Mitchell et al. (2018), we provide a model card for GPT-SW3.
 
  # Model Details
  - Person or organization developing model: GPT-SW3 was developed by AI Sweden in collaboration with RISE and the WASP WARA for Media and Language.
- - Model date: GPT-SW3 is released (ADD RELEASE DATE)
+ - Model date: GPT-SW3 date of release 2022-12-20
  - Model version: This is the second generation of GPT-SW3.
  - Model type: GPT-SW3 is a large decoder-only transformer language model.
  - Information about training algorithms, parameters, fairness constraints or other applied approaches, and features: GPT-SW3 was trained with the NeMo Megatron GPT implementation.