lora concepts library

AI & ML interests

None defined yet.

Recent Activity

Yardenfren  updated a model 6 months ago
lora-library/B-LoRA-toy_storee
Yardenfren  updated a model 8 months ago
lora-library/B-LoRA-drawing2
Yardenfren  updated a model 8 months ago
lora-library/B-LoRA-painting

lora-library's activity

1aurent 
posted an update 4 months ago
Hey everyone 🤗!
We (finegrain) have created some custom ComfyUI nodes to use our refiners micro-framework inside comfy! 🎉

We only support our new Box Segmenter at the moment, but we're thinking of adding more nodes since there seems to be a demand for it. We leverage the new (beta) Comfy Registry to host our nodes. They are available at: https://registry.comfy.org/publishers/finegrain/nodes/comfyui-refiners. You can install them by running:
comfy node registry-install comfyui-refiners

Alternatively, download the archive by clicking "Download Latest" and unzip it into your custom_nodes Comfy folder.
We are eager to hear your feedback and suggestions for new nodes, and how you'll use them! 🙏
1aurent 
posted an update 4 months ago
Hey everyone 🤗!
Check out this awesome new model for object segmentation!
finegrain/finegrain-object-cutter.

We (finegrain) trained this new model in partnership with Nfinite, using some of their synthetic data; the resulting model is incredibly accurate 🚀.
It's all open source under the MIT license (finegrain/finegrain-box-segmenter), complete with a test set tailored for e-commerce (finegrain/finegrain-product-masks-lite). Have fun experimenting with it!
JoseRFJunior 
posted an update 5 months ago
JoseRFJunior/TransNAR
https://github.com/JoseRFJuniorLLMs/TransNAR
https://arxiv.org/html/2406.09308v1
TransNAR hybrid architecture. Similar to Alayrac et al., we interleave existing Transformer layers with gated cross-attention layers, which let information flow from the NAR to the Transformer. We generate queries from tokens, while keys and values come from the nodes and edges of the graph. The node and edge embeddings are obtained by running the NAR on the graph version of the reasoning task to be solved. When experimenting with pre-trained Transformers, we initially close the cross-attention gate in order to fully preserve the language model's internal knowledge at the beginning of training.
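The gated cross-attention described above can be sketched in plain Python. This is a toy, single-head sketch with unprojected dot-product attention; the function name and the tanh gate parameterization are illustrative assumptions, not taken from the TransNAR codebase. The key property it demonstrates is that with the gate closed (gate = 0), tokens pass through unchanged, preserving the pre-trained LM's behaviour at the start of training.

```python
import math

def softmax(xs):
    # numerically stable softmax over a list of scores
    m = max(xs)
    e = [math.exp(x - m) for x in xs]
    s = sum(e)
    return [v / s for v in e]

def gated_cross_attention(tokens, nodes, gate):
    """Toy gated cross-attention layer.

    tokens: list of d-dim vectors (queries, from the Transformer stream)
    nodes:  list of d-dim vectors (keys/values, e.g. NAR node/edge embeddings)
    gate:   scalar gate parameter; tanh(0) == 0 closes the gate entirely
    """
    d = len(tokens[0])
    g = math.tanh(gate)
    out = []
    for q in tokens:
        # scaled dot-product scores of this query against every node embedding
        scores = softmax(
            [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in nodes]
        )
        # attention-weighted sum of the node embeddings (values)
        attended = [sum(w * n[i] for w, n in zip(scores, nodes)) for i in range(d)]
        # gated residual: with g == 0 the token is returned unchanged
        out.append([qi + g * ai for qi, ai in zip(q, attended)])
    return out
```

With the gate initialized closed, `gated_cross_attention(tokens, nodes, 0.0)` returns the tokens untouched; as training opens the gate, NAR information is gradually mixed into the residual stream.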
1aurent 
posted an update 6 months ago
Hey everyone 🤗!
Check out this cool new space from Finegrain: finegrain/finegrain-object-eraser

Under the hood, it's a pipeline of models (currently exposed via an API) that lets you erase any object from your image just by naming it or selecting it! Not only will the object disappear, but so will its effects on the scene, like shadows and reflections. Built on top of Refiners, our micro-framework for simple foundation model adaptation (feel free to star it on GitHub if you like it: https://github.com/finegrain-ai/refiners).