Hello! Great project, I was thinking of using it as a dataset instead of Shapenet for an upcoming hackathon project. I had two questions:
Do you know of or have any open source models (generative or otherwise) using this dataset (or some subset of it)?
Do you have any pre-computed CLIP embeddings for the objects, or at least embeddings of renderings of them? If not, I was planning to make my own using img2dataset and clip-retrieval. Any advice or caveats you'd recommend for that?
Thank you, and great work!
Hi @MathYouF, glad to hear your excitement!
- We haven't made the models public yet, but see our paper for the models we trained. The generative models in the paper are trained on a category-by-category basis (e.g., one model generates fruits and a separate model generates shoes).
- Yes, we do have CLIP embeddings stored from our experiments, which we used for filtering. They aren't too hard to compute, but I'd have to look into how we'd distribute them. I don't think there are any major caveats, except that we computed them from a single view of each object, which might not fully characterize it, since one rendering can't show all sides.
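A minimal sketch of the filtering step described above, assuming you already have one CLIP image embedding per object (computed from a single rendering) plus a CLIP text embedding for your query. All names (`filter_by_clip_similarity`, the threshold value, the toy arrays) are hypothetical and not from the Objaverse codebase:

```python
import numpy as np

def filter_by_clip_similarity(image_embs, text_emb, threshold=0.25):
    """Keep objects whose rendering's CLIP embedding is close to a text query.

    image_embs: (N, D) array of CLIP image embeddings (one render per object)
    text_emb:   (D,) CLIP text embedding for the query
    Returns the indices of kept objects and all cosine similarities.
    """
    # Normalize so the dot product equals cosine similarity
    image_embs = image_embs / np.linalg.norm(image_embs, axis=1, keepdims=True)
    text_emb = text_emb / np.linalg.norm(text_emb)
    sims = image_embs @ text_emb
    return np.flatnonzero(sims >= threshold), sims

# Toy example with random stand-in "embeddings"
rng = np.random.default_rng(0)
embs = rng.normal(size=(5, 512))
query = embs[2] + 0.01 * rng.normal(size=512)  # nearly identical to object 2
keep, sims = filter_by_clip_similarity(embs, query, threshold=0.5)
print(keep)
```

In practice you'd get `image_embs` from a CLIP image encoder run over your renderings (e.g. via clip-retrieval) and `text_emb` from the matching text encoder; the threshold is something to tune per query.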
Hope that helps :)
Any chance I might be able to get the embeddings and/or model weights for my hackathon project on Jan. 11th?
I can give any credit you'd like and also not share them in the repo.
I'd love to avoid loading each model, taking a render, and re-running img2dataset and clip-retrieval if it's possible to just get the embeddings directly.
Yeah, we should be able to upload them by then. Let me see what I can do :)
Hey! Just wanted to check on this.
If you think it may take more than another week, I can get started computing the embeddings and training GET3D on my own.
To the best of my recollection, here are the three generative GET3D models we trained using Objaverse data:
You'll need to see the GET3D README for information on how to run inference with these models. Let me know if I can help otherwise. Good luck at the hackathon :)!