arxiv:2307.09793

On the Origin of LLMs: An Evolutionary Tree and Graph for 15,821 Large Language Models

Published on Jul 19, 2023
· Featured in Daily Papers on Jul 20, 2023
Authors:

Abstract

Since late 2022, Large Language Models (LLMs) have become very prominent with LLMs like ChatGPT and Bard receiving millions of users. Hundreds of new LLMs are announced each week, many of which are deposited to Hugging Face, a repository of machine learning models and datasets. To date, nearly 16,000 Text Generation models have been uploaded to the site. Given the huge influx of LLMs, it is of interest to know which LLM backbones, settings, training methods, and families are popular or trending. However, there is no comprehensive index of LLMs available. We take advantage of the relatively systematic nomenclature of Hugging Face LLMs to perform hierarchical clustering and identify communities amongst LLMs using n-grams and term frequency-inverse document frequency. Our methods successfully identify families of LLMs and accurately cluster LLMs into meaningful subgroups. We present a public web application to navigate and explore Constellation, our atlas of 15,821 LLMs. Constellation rapidly generates a variety of visualizations, namely dendrograms, graphs, word clouds, and scatter plots. Constellation is available at the following link: https://constellation.sites.stanford.edu/.
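The name-based approach described in the abstract can be illustrated with a small sketch: split each model name into word tokens and character n-grams, weight them by TF-IDF, and compare names by cosine similarity. This is a minimal illustration of the general technique, not the authors' code, and the model names used are arbitrary examples:

```python
import math
from collections import Counter

def tokenize(name, n=2):
    """Split a model name into hyphen-delimited words plus character n-grams."""
    words = name.lower().replace("/", "-").split("-")
    grams = set(words)
    for w in words:
        grams.update(w[i:i + n] for i in range(len(w) - n + 1))
    return grams

def tfidf_vectors(names):
    """Build simple TF-IDF vectors over the token sets of all names."""
    docs = [Counter(tokenize(name)) for name in names]
    df = Counter(t for d in docs for t in d)  # document frequency per token
    total = len(names)
    return [{t: tf * math.log(total / df[t]) for t, tf in d.items()} for d in docs]

def cosine(a, b):
    """Cosine similarity between two sparse TF-IDF vectors (dicts)."""
    dot = sum(v * b.get(t, 0.0) for t, v in a.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

names = ["meta-llama/llama-7b", "huggyllama/llama-7b-hf", "bigscience/bloom-560m"]
vecs = tfidf_vectors(names)
# The two LLaMA-derived names share tokens like "llama" and "7b",
# so they score far more similar to each other than to the BLOOM model.
```

A hierarchical clustering (e.g. single-linkage over these pairwise similarities) then groups names into the model families visualized in the dendrograms.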

Community

Paper author

Check out our Twitter thread about our results: https://twitter.com/itsandrewgao/status/1681888414705266688?s=20

Paper author

Radial dendrogram of all Hugging Face LLMs with more than 5,000 downloads!

Paper author

Try out our interactive Constellation atlas here: https://constellation.sites.stanford.edu/

btw, we would love to host the Constellation app on Spaces!

Paper author

Sounds awesome, would love to learn more: gaodrew@stanford.edu

Paper author

From the paper, the distance between two models is based on their names. Is that the only distance measure, or have you considered using model parameters as a distance measure?

Paper author

We split the names into n-grams, and parameter counts usually surface as 2- or 3-grams, e.g. "7B" or "65B". We did not include model parameters as a separate distance measure since we were not able to obtain them for all of the models.
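To see how a parameter count becomes its own token, consider splitting a name on hyphens (a hypothetical example name, not from the paper):

```python
def name_tokens(name):
    """Split a Hugging Face model name into hyphen-delimited tokens."""
    return name.lower().replace("/", "-").split("-")

print(name_tokens("huggyllama/llama-7b-hf"))
# → ['huggyllama', 'llama', '7b', 'hf']  — "7b" appears as its own token
```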

