This repository is for UniGLM: Training One Unified Language Model for Text-Attributed Graphs.
These weights are for a BERT model trained for 3 epochs on 5 Amazon datasets, PubMed, Ogbn-Arxiv, and Ogbn-Products (subset).