AI & ML interests
A set of red-teamed models for researching multimodal, multilingual, multidomain mixture-of-experts (MoE) architectures based on StarCoderPlus. Trained on the LUMI and JUWELS HPC systems. These are work products of Ontocord's M*DEL open-source community effort. They should NOT be confused with AuroraGPT: https://www.hpcwire.com/2023/11/13/training-of-1-trillion-parameter-scientific-ai-begins/.