Tags: Feature Extraction · Transformers · PyTorch · Safetensors · fimhawkes · time-series · temporal-point-processes · hawkes-processes · scientific-ml · custom_code
Instructions to use FIM4Science/FIM-PP with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
  - Transformers
How to use FIM4Science/FIM-PP with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="FIM4Science/FIM-PP", trust_remote_code=True)
```

```python
# Load the model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("FIM4Science/FIM-PP", trust_remote_code=True, dtype="auto")
```

- Notebooks
- Google Colab
- Kaggle
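Feature-extraction pipelines in Transformers return plain nested Python lists (batch × sequence × feature dimension), which are usually converted to an array for downstream use. A minimal sketch of that post-processing step, using illustrative mock output rather than a real FIM-PP call (the shapes here are made up for the example, not specific to this model):

```python
import numpy as np

# Illustrative pipeline output: 1 sequence, 4 events, 8-dim features
# (stand-in for what `pipe(...)` would return; shapes are hypothetical)
mock_output = [[[0.1] * 8 for _ in range(4)]]

features = np.asarray(mock_output)  # shape (1, 4, 8)
pooled = features.mean(axis=1)      # mean-pool over the sequence axis -> (1, 8)
print(features.shape, pooled.shape)
```

Mean pooling is just one common reduction; whether it is appropriate for FIM-PP's event-level features depends on the downstream task.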