Hacker News | new | past | comments | ask | show | jobs | submit | flir's comments

Didn't a bunch of hardware that was destined for Tesla get redirected to xAI? I'm sure I remember something like that.

Yep! Why Tesla's shareholders abide by this kind of thing is beyond me, but he often mixes resources from completely unrelated companies: https://www.cnbc.com/amp/2024/06/04/elon-musk-told-nvidia-to...

Pesky laws only apply to plebs

It's interesting to think about what the process will look like when we do understand them. I imagine pulling bits of LLM off the shelf like libraries and compiling them together into a functioning "brain", precisely tailored to your needs.

I do. "Commoditize your complement". Want to sell lots of silicon? Give away good local models to run on that silicon.

Even if SOTA models in the cloud are a few percentage points better, most work can be routed to local models most of the time. That leaves the cloud providers fighting over the most computationally intensive tasks. In the long term, I think models are going to be local-first.

(Unless providers can figure out a network effect that local models can't replicate).


> I think models are going to be local-first.

Why on earth would that happen when everything else is moving into the cloud to tie it to ever-escalating subscription fees and prevent piracy?

Even with gaming, where running high-end 3D games in the cloud seems like madness and inevitably degrades the quality of the experience, they won't stop trying.


> In the long term, I think models are going to be local-first.

Why? There's an inherent efficiency advantage to scale, while the only real advantage for local models (privacy/secrecy) hasn't proven convincing for broader IT either.


Local-first models aren't just more private than the API vendors; they also have the advantages of fixed cost, lower latency, and better stability - local models don't get nerfed/"updated" in the background like ChatGPT does.

Maybe in a world where these AI companies behaved with some semblance of ethics and user-friendliness they would be on even ground, but for anyone paying attention local models are obviously the future.


> the only real advantage for local models (privacy/secrecy) hasn't proven convincing for broader IT either

Because of nonexistent regulation. Just wait for it…

The legal situation in the EU, for example, is crystal clear - it will just take some time to go through all the court instances.


It's foolish not to care about privacy, especially as a company. You know how it prevents you from emailing yourself your tax documents? Meanwhile thousands of employees are sending literal design docs, software, product goals, etc. to several AI third parties. Not only is that insane, the companies they're sending it to openly admit to scanning the data, make software products themselves, and intend to create models that can produce their products automatically.

The reasons local models haven't caught on are severalfold. It's marketing to say your company follows the latest trend, and there's an inherent pressure to keep AI companies afloat so the economy doesn't entirely collapse. The other is that it wasn't until the last month that these models caught up to frontier models. They just did, and they're more efficient and don't require a team of 500 to deploy.


To not depend on an external company that can decide the price.

That's a silly reason. For non-agent use cases what kind of utilization are you going to average on your own GPU, 5-10%? And that's without batching.

Even with overhead and scaling for peak use and a large profit margin, any company with an ounce of competition will be vastly cheaper than self-hosting. And for models you can run yourself, there will be plenty of competition.


I think you're calculating with current prices. Try extrapolating the price a year out based on current trends instead.

The models I could reasonably run at home aren't experiencing big price hikes, as far as I'm aware.

Price is a reason to escape many proprietary models, but not so much a reason to self host. Buying an expensive GPU mostly for AI purposes is not likely to save money unless you load it all day long.
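A back-of-envelope sketch of that break-even point. Every number below (GPU price, power draw, throughput, API rate) is an illustrative assumption, not a quote from any vendor - the point is only the shape of the curve: at low utilization the amortized hardware dominates, at high utilization self-hosting wins.

```python
# Back-of-envelope: amortized self-hosting cost vs. a flat API price.
# All figures are assumed/illustrative, not real vendor numbers.

gpu_cost = 2000.0           # one-off hardware cost, USD (assumed)
gpu_lifetime_years = 3      # amortization window (assumed)
power_watts = 350           # draw under load (assumed)
electricity_per_kwh = 0.30  # USD per kWh (assumed)
tokens_per_second = 40      # local throughput (assumed)
api_price_per_mtok = 2.0    # USD per million tokens (assumed)

def self_host_cost_per_mtok(utilization: float) -> float:
    """Amortized USD per million tokens at a given GPU utilization (0-1)."""
    seconds_loaded = gpu_lifetime_years * 365 * 24 * 3600 * utilization
    tokens = seconds_loaded * tokens_per_second
    energy_cost = (power_watts / 1000) * (seconds_loaded / 3600) * electricity_per_kwh
    return (gpu_cost + energy_cost) / (tokens / 1e6)

for u in (0.05, 0.5, 1.0):
    print(f"utilization {u:4.0%}: ${self_host_cost_per_mtok(u):.2f}/Mtok "
          f"vs API ${api_price_per_mtok:.2f}/Mtok")
```

With these made-up numbers, 5% utilization comes out several times more expensive per token than the API, while a fully loaded card comes out cheaper - which is the "unless you load it all day long" point above.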


Extrapolating current trends, I expect API prices to drop significantly for a given measure of 'intelligence'.

The transition to mobile-first was a good call. Probably the last good call though. Oh, and buying Instagram.

And WhatsApp. And the VR glasses seem to be a success.

The "Fantasy, but the chemistry works" phrasing in the last box on the first tab makes me suspect chatbot input.

Which is a pity, because I like the exhaustive structure. I just can't trust it. But I guess if I was going to dive into inventing weird cheeses, I wouldn't start with a blog post anyway.

(It would be so easy to generate 50k "Periodic table of <noun>" pages and just throw them into the wild. The public internet really is cooked, isn't it).


I once printed out directions from an online map that contained "pass straight over the next fourteen roundabouts" (I think it was on the way into Reading). Lose count and you're stuffed. I much prefer a turn-by-turn approach.

I'm really surprised that you're that deep into folk and have never heard of Connie Converse. She's turned up on my radar multiple times over the past couple of decades, and I don't go looking for her, or much folk content.

I guess it underlines that we all live in filter bubbles - I had assumed she was more well-known than she really is based on news stories like this. Every other artist mentioned on this page? I've never heard of them. Off to youtube...


Well, to be fair, I don't listen to a lot of recorded music, and while I hear a lot of music it's mostly in whatever circles I am in.

For instance, I learned "Big Cheeseburgers" by Blaze Foley picking bass on stage during a performance at a winery, long before I ever heard a recording of him playing it. Same with, say, Chuck Pyle's "Jaded Lover" or Nanci Griffith's "I Wish It Would Rain" or, for that matter, Rodney Crowell's "I Wish It Would Rain". Those are all folks I either met in passing or knew folks who knew them well.

So while I have a pretty deep familiarity with music from folks with Texas connections, I don't know a whole lot of stuff outside that area unless there's a connection to my proximately local folk festivals... David Amram or Stan Rogers or Trout Fishing in America, for instance.

I don't think I am unique in that approach to folk music. A lot of the lineage of that stuff is mostly people playing in song circles or in small performances, picking up a song from someone else, who in turn picked it up from someone, going back to Leadbelly or whoever.

I like recordings - it's super frustrating to hear a song and then not be able to find it anywhere. Especially if I want to learn it and add it to the other 400 or so songs I have memorized at any time, or go back and re-learn something.

Bubbles are real, but I am okay with them because in a certain sense that's what it means to be in a community. But communities aren't usually fragile like bubbles, and folks can come and go without gatekeeping, so that seems closer to how I think about these systems for knowing about folks.


fyi https://books.google.com/ngrams/graph?content=cyberdeck&year...

(I'd like to know who was using "cyberdeck" in 1976)


Ok, I see - it's a consequence of the smoothing function used in the graph. First actual use is 1979.
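A toy illustration of that smoothing effect. The counts below are made up (assume a first real occurrence in 1979); a centered moving average, like the Ngram viewer's smoothing setting, smears that 1979 count backwards so 1976 shows a nonzero value even though the raw count there is zero.

```python
# Illustrative counts: "cyberdeck" first appears in 1979 (assumed numbers).
counts = {year: 0 for year in range(1974, 1984)}
counts[1979] = 1
counts[1980] = 2
counts[1981] = 3

def smoothed(year: int, window: int = 3) -> float:
    """Centered moving average over +/- `window` years, like Ngram smoothing."""
    span = [counts.get(y, 0) for y in range(year - window, year + window + 1)]
    return sum(span) / len(span)

print(counts[1976], smoothed(1976))  # raw count is 0, smoothed value is nonzero
```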


That's not the model, that's the box the model came in.

It's unlikely we've hit the limits on improving agent UX, but there are some fundamental limits on LLMs that seem unlikely to be fixed by better UX.


That's funny, 'cos being anti-gay contravenes the UK Online Safety Act too.


Yes. I made no claim your legislators are competent. (Congratulations on the summary dismissal of the hereditary peers!)

