
Human beings are estimated to use roughly 50-100 W at rest (up to maybe 1,000-2,000 W during intense physical exertion), and I think it's fair to say we're generally intelligent.

Something is fundamentally missing with LLMs w.r.t. intelligence per watt. Assuming GPT-4 is around human intelligence, serving it takes 2-4 H100s at roughly 700 W each, i.e. 1.4-2.8 kW, comparable to a human at peak exertion, and that doesn't even include the rest of the computer.

That being said, we're willing to brute-force our way to a solution to some extent, so maybe it doesn't matter. But I'd say the fact that humans don't use that much energy is proof enough that we haven't perfected the architecture yet.



At 5 cents or less per kWh these days, 10 kW costs 50 cents per hour, well below minimum wage. LLMs aren't AGI and I'm not convinced we're anywhere close to AGI, but they are useful. That the people deploying them have the same product instincts as Microsoft executives seems to be the core issue.
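A quick sanity check on that arithmetic (the minimum-wage figure here is assumed to be the US federal $7.25/hr):

```python
# Cost of running a 10 kW rig at $0.05/kWh, vs. the US federal minimum wage.
power_kw = 10            # assumed total draw of the GPU rig
price_per_kwh = 0.05     # $/kWh, cheap industrial rate
cost_per_hour = power_kw * price_per_kwh
print(f"${cost_per_hour:.2f}/hour")   # $0.50/hour

us_federal_min_wage = 7.25            # $/hour
print(cost_per_hour < us_federal_min_wage)   # True
```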


That being said, in this 2-4 H100 setup you'll be able to generate with a batch size of somewhere around 128, i.e. it's serving 128 humans, not one. And just like that, the difference in efficiency isn't that large anymore.
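Back-of-envelope for the batching point, assuming ~700 W per H100 (the SXM TDP) and the batch size above:

```python
# Per-"person" power when a 4x H100 node serves a batch of 128 streams.
watts_per_h100 = 700     # assumed SXM TDP; PCIe cards draw less
num_gpus = 4
batch_size = 128

total_watts = watts_per_h100 * num_gpus        # 2800 W for the GPUs alone
watts_per_stream = total_watts / batch_size    # ~21.9 W per concurrent user
print(f"{watts_per_stream:.1f} W per stream")
```

Which, under these assumptions, lands right around the brain's ~20 W.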


I think the more important point to bring up is “you can hire a human for minimum wage.”

The median monthly salary in Bangladesh is lower than the cost of the Cursor Ultra plan. And Cursor loses money.

An experienced developer in India makes around $20k a year.


Brains use approximately 20W.


Not for inference, right?


Correct - an H100 can do something like 100 tokens per second on a GPT-4-like model, but you'd need to account for regular fine-tuning to compare accurately to a person, hence 4 or so GPUs. Of course the whole comparison is inane since computers and humans are obviously so different, ha...
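Putting rough numbers on that: at ~100 tokens/s on a single H100 and an assumed ~700 W draw under load, the energy cost per token works out to:

```python
watts = 700             # assumed H100 draw under load (SXM TDP)
tokens_per_sec = 100    # the single-stream throughput claimed above

joules_per_token = watts / tokens_per_sec   # 7 J per generated token
print(f"{joules_per_token} J/token")
```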



