Thread #108616187
File: 1776359284004.jpg (100.9 KB)
>paid users get rate limited
>whinge that nobody wants to pay for AI
32 Replies
File: ai deletes database kek.jpg (213.9 KB)
>>108616187
>paying for AI
>>108617370
I mean, they aren't SOTA for enterprise usage, but those local LLMs are still pretty fucking neat. I got one to fix and implement some stuff on some random C++ abandonware from GitHub. I have surface-level knowledge of Python at best; I can read error tracebacks and haphazardly try fixes, not actually write code. Fixing bugs and implementing features in C++ isn't something I could have done in a few minutes, if I could have done it at all, but that small local model did.
>>108616187
LLMs are stupid expensive to operate. They're all losing money. Turns out, making a datacenter create something for $50 that your brain can do for the price of half a banana and a sip of water is a bad idea, but sunk cost fallacy rules modern tech.
>>108617370
running a local model on a 5090 would far surpass anything these "state of the art" models can do, even on the lowest paid tier. they're shoveling shit out the door because they know it's expensive to run the hardware.
Go download Gemma4-31b and you can get better responses even without a GPU; it will just be slow
File: claude.png (303.1 KB)
>>108616187
claudetrannies on suicide watch
>whine that no one wants to pay for AI
Anthropic isn't complaining about that. Almost the opposite: they complain they have too many paying users to serve, and it's the fastest-growing user base in history, so they can't scale up fast enough to serve everyone.
To give you some indication, the user base has doubled every month for the past 28 months in a row.