Prithiv Sakthi (prithivMLmods)

AI & ML interests

Computer Vision | Prompting | Diffusion | Data Science | Web Product Production | GenAI

prithivMLmods's activity

replied to their post about 20 hours ago

Hi @ezzdev, this was a demo Space for the computer vision models. You can use the images in your project. Generating images outside ethical bounds (not safe for work) is at your own risk.

replied to their post about 20 hours ago

Three of them were trained with Tensor Art and the rest with SD on RunPod. Using them with the appropriate base model will give good results, but they are still at an early training stage; better versions may come in the future.

posted an update 8 days ago
Hey guys! @mk230580 @wikeeyang @Yasirkh and others asked how to run Hugging Face Spaces locally, outside of the HF environment, with their source editor and Google Colab. Here is how to do that simply 👇👇

๐Ÿ“I have just created a step-by-step procedure with a Colab demo link also attached in the repository's README.md.

🔗: https://github.com/prithivsakthiur/how-to-run-huggingface-spaces-on-local-machine-demo
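In short, the README boils down to something like the rough sketch below (the repo id is only a placeholder, and it assumes a Gradio-based Space that ships an app.py and a requirements.txt):

from huggingface_hub import snapshot_download
import subprocess, sys

# Download the Space's files (app.py, requirements.txt, ...) to a local folder.
local_dir = snapshot_download(repo_id="prithivMLmods/some-space-name", repo_type="space")

# Install the Space's dependencies, then launch the app the same way HF does.
subprocess.run([sys.executable, "-m", "pip", "install", "-r", f"{local_dir}/requirements.txt"], check=True)
subprocess.run([sys.executable, "app.py"], cwd=local_dir, check=True)

If the Space needs an access token or other secrets, export them as environment variables before launching.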

Thanks for the read!!
replied to their post 9 days ago

You mentioned that you have experience with JuggernautXL, Rav Animated, SDXL, Realistic Vision, DreamShaper, and LoRA for generating images locally, and that you have an RTX 3060 for processing/acceleration!!

Collection Zero is the set of Spaces running on ZeroGPU (NVIDIA A100) hardware; DALLE 4K and MidJourney are just quick names I kept for the trend.

โญKindly Please provide Clearly, What i need to do for you / Guide you. Since you had mentioned experiences with Automatic 11111 or ComfyUI on your my local Hardware....

So are you trying to run the Spaces on your local hardware, or something else?

replied to their post 9 days ago

Hi @mk230580 !!

You asked for high-resolution image quality with fast computation; for your case I came up with the idea of accelerating on a T4 GPU. Yes, we know the NVIDIA A100 Tensor Core GPU is unmatched in its power for high-performance computing (HPC) tasks, but apart from that you can use a T4 as the hardware accelerator. You asked me how to run this externally, outside of Hugging Face, right? Use a T4 in Google Colab or any other workspace that supports it. An A100 is also available in Colab, but you have to be a premium user.

Running it on your local system works the same way.

You just need an HF token to pass for login:

# Authenticate with Hugging Face
from huggingface_hub import login

# Log in to Hugging Face using the provided token
hf_token = '---pass your hugging face access token---'
login(hf_token)

Visit my Colab space for an instance of running this locally, outside of HF.
Hardware accelerator: T4 GPU
We know an A100 or L4 is available in Colab for premium users / at a cost; the T4 is free for a certain amount of computation, so I went with it. On local hardware you know what to do...

Second thing: the amount of detail you put in the prompt also shapes the results. See higher-end, detailed prompts via https://huggingface.co/spaces/prithivMLmods/Top-Prompt-Collection, or on freeflo.ai or PromptHero, for more detailed results.

Colab link (example with stabilityai/sdxl-turbo):
https://colab.research.google.com/drive/1zYj5w0howOT3kiuASjn8PnBUXGh_MvhJ#scrollTo=Ok9PcD_kVwUI
You can use various models like RealVisXL_V4.0, SDXL-Turbo, and more for better results.
** After passing the access token, remove your token before sharing with others **
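For reference, here is a minimal sketch of what such a Colab cell looks like with stabilityai/sdxl-turbo on a T4 (an illustrative outline under my own assumptions, not the exact notebook; other checkpoints such as RealVisXL_V4.0 load the same way):

import torch
from diffusers import AutoPipelineForText2Image

# Load SDXL-Turbo in fp16 so it fits comfortably on a 16 GB T4.
pipe = AutoPipelineForText2Image.from_pretrained(
    "stabilityai/sdxl-turbo", torch_dtype=torch.float16, variant="fp16"
).to("cuda")

# SDXL-Turbo is tuned for very few steps and no classifier-free guidance.
image = pipe("Astronaut riding a horse", num_inference_steps=1, guidance_scale=0.0).images[0]
image.save("astronaut.png")

The same cell runs unchanged on local hardware as long as a CUDA-capable GPU, torch, and diffusers are available.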

replied to their post 12 days ago

Hi Yasirkh,

Yes! You can run any Python-based SDK in Google Colab with the appropriate model and its api_url, using the correct request library. For example, if you are trying to run a text-to-image model, you can do it with the Inference API.

โš ๏ธFor example:

import requests

API_URL = "----------your api addr goes here---------"
headers = {"Authorization": "Bearer hf_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"}

def query(payload):
    # Send the prompt to the Inference API and return the raw image bytes.
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.content

image_bytes = query({
    "inputs": "Astronaut riding a horse",
})

# You can open the returned bytes with PIL.Image, for example:
import io
from PIL import Image

image = Image.open(io.BytesIO(image_bytes))

โš ๏ธYou can find your access token on hf settings >> access token, replace it on this "headers = {"Authorization": "Bearer hf_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"}" instead of x

โš ๏ธExample: headers = {"Authorization": "Bearer hf_ABC1234567890xyz9876543210"}, install the required PyPI Libs.

🚀 Then add the Gradio Blocks you need to the interface function, as sketched below.
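As a rough sketch of that Gradio part, reusing the query() helper from the snippet above (the layout and labels here are just illustrative):

import io
import gradio as gr
from PIL import Image

def generate(prompt):
    # Call the Inference API via the query() helper defined earlier and decode the bytes.
    image_bytes = query({"inputs": prompt})
    return Image.open(io.BytesIO(image_bytes))

with gr.Blocks() as demo:
    prompt = gr.Textbox(label="Prompt")
    output = gr.Image(label="Result")
    run = gr.Button("Generate")
    run.click(fn=generate, inputs=prompt, outputs=output)

demo.launch()

Calling demo.launch(share=True) instead would also give you a temporary public link.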

🚀 For your information, this is not the original MidJourney model; I have named the Space "MidJourney" because it performs similar work. Give it a try and let me know whether you get it working. One more thing: you cannot commit/push code with the access tokens visible; use a secret key or environment variables (when in repos), for example as below.
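For example, instead of hard-coding the token, read it from an environment variable; in a Space you can define it as a secret in the repo settings (the name HF_TOKEN below is just a convention):

import os

# The token comes from an environment variable / Space secret, so it never appears in the repo.
hf_token = os.environ.get("HF_TOKEN")
headers = {"Authorization": f"Bearer {hf_token}"}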

โš ๏ธIf you feel any difficulties, you can reply me again ; i will sure help you with the logic or i will share the colab work link for make the case easier.

------ Try it in Google Colab / Jupyter Notebook / DataSpell / even in VS Code, wherever you find it easier ------

-Thank You !

posted an update 23 days ago
Hey Guys!! 🧋

This is the time to share the collection of prompts that carry high levels of parametric detail, for producing the most detailed, flawless images.

🔗 You can check out the collection at: prithivMLmods/Top-Prompt-Collection

🔢 More than 200 highly detailed prompts have been used in the Spaces.
@prithivMLmods

Thank you for the read!!
posted an update about 2 months ago
#Previous Version / Older
📂 Hugging Face for Android ➡️
🪶 Median (GoNative) Plugin:

version 0.0.1
🚀 prithivMLmods/Huggingface-Android-App