🪐 Objaverse-XL Rendering Script
Scripts for rendering Objaverse-XL with Blender. Rendering is the process of taking pictures of the 3D objects. These images can then be used for training AI models.
🖥️ Setup
- Clone the repository and enter the rendering directory:
git clone https://github.com/allenai/objaverse-xl.git && \
cd objaverse-xl/scripts/rendering
- Download Blender:
wget https://download.blender.org/release/Blender3.2/blender-3.2.2-linux-x64.tar.xz && \
tar -xf blender-3.2.2-linux-x64.tar.xz && \
rm blender-3.2.2-linux-x64.tar.xz
- If you're on a headless Linux server, install Xorg and start it:
sudo apt-get install xserver-xorg -y && \
sudo python3 start_x_server.py start
- Install the Python dependencies. Note that Python >3.8 is required:
cd ../.. && \
pip install -r requirements.txt && \
pip install -e . && \
cd scripts/rendering
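To sanity-check the setup, the short script below confirms that the extracted Blender binary runs and that the objaverse package installed above is importable. This is a minimal sketch; the Blender path assumes the 3.2.2 archive was unpacked inside scripts/rendering as in the steps above:
import subprocess

import objaverse  # installed by the `pip install -e .` step above

# Path to the Blender binary extracted above (assumes the archive was
# unpacked in the current scripts/rendering directory).
BLENDER = "./blender-3.2.2-linux-x64/blender"

# Blender prints its version string and exits when given --version.
result = subprocess.run([BLENDER, "--version"], capture_output=True, text=True, check=True)
print(result.stdout.splitlines()[0])
print("objaverse imported successfully:", objaverse.__name__)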
📸 Usage
🎥 Minimal Example
After setup, we can start to render objects using the main.py script:
python3 main.py
After running this, you should see 10 zip files located in ~/.objaverse/github/renders. Each zip file corresponds to the rendering of a unique object, in this case from our example 3D objects repo:
> ls ~/.objaverse/github/renders
0fde27a0-99f0-5029-8e20-be9b8ecabb59.zip 54f7478b-4983-5541-8cf7-1ab2e39a842e.zip 93499b75-3ee0-5069-8f4b-1bab60d2e6d6.zip
21dd4d7b-b203-5d00-b325-0c041f43524e.zip 5babbc61-d4e1-5b5c-9b47-44994bbf958e.zip ab30e24f-1046-5257-8806-2e346f4efebe.zip
415ca2d5-9d87-568c-a5ff-73048a084229.zip 5f6d2547-3661-54d5-9895-bebc342c753d.zip
44414a2a-e8f0-5a5f-bb58-6be50d8fd034.zip 8a170083-0529-547f-90ec-ebc32eafe594.zip
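The archives can also be unpacked programmatically. The snippet below is a minimal sketch that assumes the default render directory shown above; each archive expands into a directory named after the object's save_uid:
import zipfile
from pathlib import Path

# Default output location used by the minimal example above.
render_dir = Path.home() / ".objaverse" / "github" / "renders"

# Extract every archive; each one expands into a <save_uid>/ directory.
for zip_path in sorted(render_dir.glob("*.zip")):
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(render_dir)
    print("extracted", zip_path.stem)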
If we unzip one of the zip files:
> cd ~/.objaverse/github/renders
> unzip 0fde27a0-99f0-5029-8e20-be9b8ecabb59.zip
we will see that there is a new 0fde27a0-99f0-5029-8e20-be9b8ecabb59 directory. If we look in that directory, we'll find the following files:
> ls 0fde27a0-99f0-5029-8e20-be9b8ecabb59
000.npy 001.npy 002.npy 003.npy 004.npy 005.npy 006.npy 007.npy 008.npy 009.npy 010.npy 011.npy metadata.json
000.png 001.png 002.png 003.png 004.png 005.png 006.png 007.png 008.png 009.png 010.png 011.png
Here, we see that there are 12 renders, [000-011].png. Each render shows the object from a different viewpoint, since the camera's location is randomized during rendering.
Additionally, there are 12 npy files, [000-011].npy, which contain the camera's pose for the corresponding render. We can read the npy files using:
import numpy as np
array = np.load("000.npy")
where array is now a 3x4 camera matrix that looks something like:
array([[6.07966840e-01, 7.93962419e-01, 3.18103019e-08, 2.10451518e-07],
[4.75670159e-01, -3.64238620e-01, 8.00667346e-01, -5.96046448e-08],
[6.35699809e-01, -4.86779213e-01, -5.99109232e-01, -1.66008198e+00]])
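The matrix packs the camera's rotation and translation together. The sketch below shows one way to split it apart; it assumes the 3x4 array is a world-to-camera extrinsic of the form [R | t], in which case the camera center in world coordinates is -Rᵀt. If your pipeline expects a different convention (for example camera-to-world, or different axis directions), verify against the rendering script before relying on this interpretation:
import numpy as np

pose = np.load("000.npy")   # shape (3, 4)
R = pose[:, :3]             # 3x3 rotation block
t = pose[:, 3]              # translation vector

# Under a world-to-camera [R | t] convention, a world point X maps to camera
# coordinates as R @ X + t, and the camera center is -R^T @ t.
camera_center = -R.T @ t
print("camera center (world coordinates):", camera_center)

# Sanity check: the rotation block of a rigid transform should be orthonormal.
print("R orthonormal:", np.allclose(R @ R.T, np.eye(3), atol=1e-5))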
Finally, we also have a metadata.json file, which contains metadata about the object and scene:
{
"animation_count": 0,
"armature_count": 0,
"edge_count": 2492,
"file_identifier": "https://github.com/mattdeitke/objaverse-xl-test-files/blob/ead0bed6a76012452273bbe18d12e4d68a881956/example.abc",
"file_size": 108916,
"lamp_count": 1,
"linked_files": [],
"material_count": 0,
"mesh_count": 3,
"missing_textures": {
"count": 0,
"file_path_to_color": {},
"files": []
},
"object_count": 8,
"poly_count": 984,
"random_color": null,
"save_uid": "0fde27a0-99f0-5029-8e20-be9b8ecabb59",
"scene_size": {
"bbox_max": [
4.999998569488525,
6.0,
1.0
],
"bbox_min": [
-4.999995231628418,
-6.0,
-1.0
]
},
"sha256": "879bc9d2d85e4f3866f0cfef41f5236f9fff5f973380461af9f69cdbed53a0da",
"shape_key_count": 0,
"vert_count": 2032
}
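Because every archive ships a metadata.json, you can audit or filter renders before training. The sketch below walks the unzipped render directories and prints a few fields; the directory layout matches the example above, and the poly-count threshold is purely illustrative:
import json
from pathlib import Path

render_dir = Path.home() / ".objaverse" / "github" / "renders"

# Read the metadata from every unzipped render directory.
for meta_path in sorted(render_dir.glob("*/metadata.json")):
    meta = json.loads(meta_path.read_text())
    # Illustrative filter: skip very low-poly objects.
    if meta["poly_count"] < 500:
        continue
    print(meta["save_uid"], meta["poly_count"], meta["file_identifier"])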
🎛 Configuration
🧑‍🔬 Experimental Features
USDZ support is experimental. Since Blender does not natively support USDZ, we use this Blender addon, but it doesn't work with all types of USDZ files. If you have a better solution, PRs are very much welcome 😄!
👋 Our Team
Objaverse-XL is an open-source project managed by the PRIOR team at the Allen Institute for AI (AI2). AI2 is a non-profit institute with the mission to contribute to humanity through high-impact AI research and engineering.