Hello, when is Falcon 180B going to get a paid service? This is by far the best chatbot I've ever seen and I hate the long queues. When will a paid version be released?

#23 opened by Reptiles

This is by far the best chatbot I've ever seen in its response complexity and depth. I would be more than happy to pay for a subscription, so when will this be publicly available to pay for? I hate these fucking queues, and ChatGPT is way too censored. When will Falcon 180B get a paid offering with its own dedicated website? Making payments on this website seems kind of sus.

You could try purchasing a RunPod container and using the model via the pod. At full precision you need 4 A100 80GB GPUs.
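For what it's worth, here's a minimal sketch of what that could look like on a rented multi-GPU pod with the transformers library. The model ID is the official tiiuae/falcon-180B repo (it's gated, so you have to accept the license first); the dtype choice and GPU layout are assumptions, and you need accelerate installed plus enough combined VRAM across the pod's GPUs:

```python
# Minimal sketch: loading Falcon-180B on a multi-GPU pod with transformers.
# device_map="auto" shards the weights across all visible GPUs (needs accelerate).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/falcon-180B"  # gated repo; requires accepting the license on the Hub

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision; full fp32 roughly doubles memory
    device_map="auto",           # spread layers across the pod's GPUs
)

# Quick generation test
inputs = tokenizer("Falcon 180B is", return_tensors="pt").to("cuda")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If the pod doesn't have that much VRAM, a quantized load (e.g. load_in_8bit or a GPTQ checkpoint) can cut the memory footprint considerably, at some cost in quality and speed.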

I can't afford that, it costs way too much money.

Well, that's not what Reptiles is asking. Running this model yourself is too costly. I would also consider buying a subscription for Falcon-180B if it existed. Sharing the model between many people is much more realistic, especially if it is done in the form of a service that is already set up and you just need to use it, like ChatGPT.

Exactly, I think everyone would be alright with it. I mean, it's way better than the competition and much cheaper for everyone, and especially for them, since they get a financial incentive.

I wouldn't buy just yet. I think GPT-4 is still the undisputed king; even Claude 2 couldn't match GPT-3.5, let alone 4. Falcon will get better, and I hope we get something better than or as good as GPT-4. Competition drives progress.

For me, Claude 2 is on par with, if not better than, GPT-4, at least in my use case. Though, yes, Falcon is not as good as GPT-4 or Claude 2. I'd guess it's on par with GPT-3.5 in my use case, and fine-tunes of this model will be at least somewhat better than GPT-3.5, so somewhere in the middle between GPT-3.5 and GPT-4.

Also, none of the open models can match GPT-4, GPT-3.5, or Claude 2 at coding yet, so that's another reason this is not as good.
