Spaces: openfree / ginigen-sora (67 likes, Paused)
Revision: 259ae21
ginigen-sora / xora / models / transformers
19 contributors · History: 10 commits
Latest commit: 05cb3e4 (unverified) · Sapir Weissbuch · "Merge pull request #30 from LightricksResearch/fix-no-flash-attention" · about 1 month ago
| File | Size | Last commit message | Last updated |
|---|---|---|---|
| __init__.py | 0 Bytes | refactor | 2 months ago |
| attention.py | 49.8 kB | model: fix flash attention enabling - do not check device type at this point (can be CPU) | about 1 month ago |
| embeddings.py | 4.47 kB | Lint: added ruff. | about 2 months ago |
| symmetric_patchifier.py | 2.92 kB | Remove the word "pixart" from code. | about 1 month ago |
| transformer3d.py | 21.9 kB | Merge pull request #30 from LightricksResearch/fix-no-flash-attention | about 1 month ago |
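The attention.py commit message says flash attention should not be ruled out just because the model sits on CPU at the point where the check used to run. The repository's actual code is not shown here, but the idea can be illustrated with a minimal sketch: record only whether the flash-attention kernel is importable at import time, and decide per forward call based on the device and dtype of the inputs. The function name `attention` and the tensor layout below are assumptions, not the repository's API.

```python
# Hypothetical sketch, not the repository's code: defer the flash-attention
# decision from construction time to call time, since a module built on CPU
# may be moved to CUDA before it is ever run.
import torch
import torch.nn.functional as F

try:
    from flash_attn import flash_attn_func  # optional dependency
    _FLASH_ATTN_AVAILABLE = True
except ImportError:
    _FLASH_ATTN_AVAILABLE = False


def attention(q: torch.Tensor, k: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
    """q, k, v: (batch, seq_len, num_heads, head_dim)."""
    # Decide per call instead of caching a device-based decision at init.
    use_flash = (
        _FLASH_ATTN_AVAILABLE
        and q.is_cuda
        and q.dtype in (torch.float16, torch.bfloat16)
    )
    if use_flash:
        return flash_attn_func(q, k, v)
    # Fallback: PyTorch's built-in SDPA expects (batch, heads, seq, dim).
    out = F.scaled_dot_product_attention(
        q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2)
    )
    return out.transpose(1, 2)
```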