Hacker News | random3's comments

Since we don’t know the number of atoms, we’d need to let omega be a function, then deal with all the edge cases, rename omega to ∞ and...

Yeah, I can’t tell if the OP is trolling or really thinks they can just define a rational number big enough to avoid needing infinity as a concept.

Veritasium has a nice video about it https://www.youtube.com/watch?v=MiUHjLxm3V0

Obviously, there are a lot of reasons why. But it boils down to having the vision, the belief, and the strength to follow through over many years. It's important not to confuse vision with random Kool-Aid. Instead it's grounded in research. That research is itself grounded in a strong vision and belief — one that got laughed at by the entire physics community at the time:

> 'people seemed unwilling to believe that we had actually made an image by bending x-rays, and they tended to regard the whole thing as a big fish story'

Now contrast this with the current academic reality — "publish or perish" — and the reality of venture financing and corporate culture that "depends" (arguably in a self-inflicted manner, though that's not 100% the case) on quarterly reports.

ASML is just a recent story, but if you look back, you'll see that most revolutions have a similar pattern of people crazy enough to deviate from the herd.

The rest — the immense financial risk, the 5000 suppliers, etc. — came as a result of having the ability to see through all the noise, and the grit to follow through when everyone calls you an idiot for not doing something "useful".


One could argue that Nvidia's advantage comes from a similar visionary epiphany that led to them developing CUDA years before it was viable. The result is similar.

I'm tempted to call that pure luck. As far as they knew, crypto would be the killer app.

However, if you start with the assumption that at some point, people are going to need a lot of fast parallel compute for something, you could rationally justify their long-term strategy. They skated where the proverbial puck was going. They couldn't see the puck, but they were pretty sure there was one. In hindsight that really does look like a safe bet.


People were (ab)using OpenGL to run compute on GPUs in 2004-2006, doing stuff like rendering 2 triangles covering the whole screen and then doing the actual compute in the pixel shaders, getting 10x speedups over CPUs for some problems.
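The trick looked roughly like this hypothetical fragment shader (an illustrative sketch in old-style GLSL, not code from any specific project): the input array is packed into a texture, a quad covering the screen is rendered, and the "kernel" runs once per pixel.

```glsl
// Illustrative GPGPU-era sketch: data lives in a texture, and this
// shader runs once per pixel of a fullscreen quad (2 triangles).
uniform sampler2D inputData;   // input array packed as a texture

void main() {
    // this fragment's coordinate indexes into the data array
    vec4 x = texture2D(inputData, gl_TexCoord[0].st);
    // the actual "compute": e.g. an elementwise polynomial
    gl_FragColor = x * x + 0.5 * x;
    // results are then read back via render-to-texture / glReadPixels
}
```

The awkward part was everything around the shader: encoding your problem as textures and colors, and reading results back — exactly the ceremony CUDA later removed.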

NVIDIA just had their eyes open to an obvious market demand and made it easier by creating CUDA.


Nvidia subsidized machine learning research for years (with CUDA, with hardware donations, and by developing what was a very niche product line just for researchers) before deep learning became big, much less the advent of LLMs.

Certainly Jensen seemed to have an extremely long view on this burgeoning machine learning market in the early 2010s.


It didn’t hurt that two companies named Intel and Microsoft completely missed the boat where GPUs and mobile computing were concerned, even though both were among the top companies in tech by market cap.

CUDA came out of the need to program the parallel cores in their GPUs. This is not luck, it's product evolution. They did it first, they did it best, and they are reaping the benefits. The alternative here was to not have CUDA and continue writing sub-optimal code for GPUs.

It's hard to recommend Veritasium on this subject when Asianometry exists... https://www.youtube.com/watch?v=CFsn1CUyXWs&list=PLKtxx9TnH7...

Very true. One thing you have to be impressed with is ASML's willingness over time to iterate and iterate again, nose to the grindstone. More companies could learn a lesson from that. There were several American companies that had it all in the last 60 years and squandered it: IBM, Xerox, Kodak, US Steel.

Xerox in my time there skewed compensation to the salespeople while turning a blind eye to the software developers who made the sale possible.

> But it boils down to having the vision, the belief and the strength to follow through over many years

And importantly, that vision being correct. The graveyard of history is full of the dedicated yet incorrect.


You can also have the right vision but at the wrong time.

The short answer is they solved something very difficult to accomplish and have a stack of trade secrets as their moat.

The whole take seems to be from the lens of immature software people who have never started or worked at a startup, nor had enough curiosity to learn that the code is not that relevant, especially at the early YC stage.

That's how you got the taste for lollipops?

Makes you wonder what the bigger concern is wrt Mythos — is finding the vulnerabilities scarier, or fixing them? :))


CaniRun's not a great tool — look how long it's been since it was updated. It doesn't have any of the qwen3.6 models on the list, nor the new Kimi one. In fact it's missing many of the "popular" models.

(For everyone else) I know this sounds cynical, but it just shines a "flashlight" on a lot of the reality that surrounds entire startup "ecosystems". Founders should be aware of whether they want to choose the blue pill or the red pill.

The only point about bootstrapping is that there's no "natural" bootstrapping. Either you're not really bootstrapping, because you "own" (one way or another) one of the sides, or you're faking it. Any other "strategy" is a pipe dream headed for the bootstrapping-not-bootstrapping graveyard.


You and your friend seem to make a dream team.

I wish people would understand that extremist thinking is wrong and obtuse — just as it seems to be the most effective solution in your case, it starts looking naive and damaging if you become the target, just because you have a bad face or bad luck.

Fair point, but I can’t stop wondering whether the crux of the issue wasn’t the OS and decisions that go back to the iPhone or iPod. I can’t recall the details, but they tried to get MacOS to work on it and then decided to go with the alternative. I can only speculate, but MacOS must have had a mobile story/dream since the Newton, and even more so after NeXTSTEP.

So I wonder if what the article is pointing at wasn't actually the inability to fully merge iOS and MacOS, more than anything else.


> I can only speculate on both that MacOS must have had a mobile story/dream since Newton

https://apple.fandom.com/wiki/PenLite

https://apple.fandom.com/wiki/Freestyle


