BRP Tiny-Transformers Collection
Models for the 2024-Q4 BSc Research Project "Architectural Decisions for Language Modelling with Small Transformers".
14 items
Base model: GPT-Neo
Configs:
- Vocab size: 10,000
- Hidden size: 512
- Max position embeddings: 512
- Number of layers: 2
- Number of heads: 4
- Window size: 256
- Intermediate size: 1024
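A minimal sketch of how this configuration could be instantiated with the Hugging Face `transformers` library, assuming the values above map directly onto `GPTNeoConfig` fields; the `attention_types` argument is an assumption (one global/local pair covering the two layers), as it is not listed in the configs:

```python
from transformers import GPTNeoConfig, GPTNeoForCausalLM

# Hypothetical reconstruction of the collection's base config;
# values are taken from the list above.
config = GPTNeoConfig(
    vocab_size=10_000,
    hidden_size=512,
    max_position_embeddings=512,
    num_layers=2,
    num_heads=4,
    window_size=256,          # local-attention window
    intermediate_size=1024,   # feed-forward (MLP) width
    # Assumption: one alternating global/local pair covers the 2 layers.
    attention_types=[[["global", "local"], 1]],
)

model = GPTNeoForCausalLM(config)
print(f"parameters: {model.num_parameters():,}")
```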
Results: