Obviously, there are a lot of reasons why. But it boils down to having the vision, the belief, and the strength to follow through over many years. It's important not to confuse vision with random Kool-Aid; instead, it's grounded in research. That research was itself grounded in a strong vision and belief, and it got laughed at by the entire physics community at the time:
> 'people seemed unwilling to believe that we had actually made an image by bending x-rays, and they tended to regard the whole thing as a big fish story'
Now contrast this with the current academic reality of "publish or perish", and with venture financing and a corporate culture that "depends" (arguably in a self-inflicted manner; it's not 100% the case) on quarterly reports.
ASML is just a recent story, but if you look back, you'll see that most revolutions follow a similar pattern: people crazy enough to deviate from the herd.
The rest (the immense financial risk, the 5,000 suppliers, etc.) came as a result of having the ability to see through all the noise, and the grit to follow through when everyone calls you an idiot for not doing something "useful".
One could argue that Nvidia's advantage comes from a similar visionary epiphany, the one that led them to develop CUDA years before it was viable. The result is similar.
I'm tempted to call that pure luck. As far as they knew, crypto would be the killer app.
However, if you start with the assumption that at some point, people are going to need a lot of fast parallel compute for something, you could rationally justify their long-term strategy. They skated where the proverbial puck was going. They couldn't see the puck, but they were pretty sure there was one. In hindsight that really does look like a safe bet.
People were (ab)using OpenGL to run compute on GPUs in 2004-2006, doing stuff like rendering 2 triangles covering the whole screen and then doing the actual compute in the pixel shaders, getting 10x speedups over CPUs for some problems.
NVIDIA just had their eyes open to an obvious market demand and made it easier by creating CUDA.
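To make the contrast concrete, here is a rough, illustrative sketch (my own, not any particular historical code) of what that "pixel shader as compute" workload looks like once CUDA exists: instead of drawing two triangles and hiding the math in a fragment shader, you write the per-element function directly and launch one thread per element.

```cuda
#include <cstdio>

// Illustrative per-element computation (SAXPY): the kind of data-parallel
// map people used to express through the graphics pipeline, written as a
// plain CUDA kernel instead of a pixel shader.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    // Unified memory keeps the sketch short; requires a CUDA-capable GPU.
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // 256 threads per block, enough blocks to cover all n elements.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f\n", y[0]);  // 2*1 + 2 = 4 for this input
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```

No texture setup, no framebuffer tricks, no encoding your data as RGBA pixels; that reduction in friction is arguably most of what CUDA sold.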
Nvidia subsidized machine learning research for years (with CUDA, hardware donations, and the development of what was a very niche product line just for them) before deep learning became big, much less the advent of LLMs.
Certainly Jensen seemed to have an extremely long view on this burgeoning machine learning market in the early 2010s.
It didn’t hurt that two companies named Intel and Microsoft completely missed the boat where GPUs and mobile computing were concerned. And aren't those two among the top companies in tech today by market cap?
CUDA came out of the need to program the parallel cores in their GPUs. This is not luck; it's product evolution. They did it first, they did it best, and they are reaping the benefits. The alternative here is to not have CUDA and to keep writing sub-optimal code for GPUs.
Very true. One thing you have to be impressed by is ASML's willingness, over time, to iterate and iterate again, nose to the grindstone. More companies could learn a lesson from that. Several American companies had it all in the last 60 years and squandered it: IBM, Xerox, Kodak, US Steel.
The whole take seems to be from the lens of immature software people who have never started or worked in a startup, or who have never had enough curiosity to learn that the code is not too relevant, especially at the early YC stage.
CaniRun's not a great tool. Look how long it's been since it was updated: it doesn't have any of the qwen3.6 models on the list, nor the new kimi one. In fact, it's missing many of the "popular" models.
(for everyone else) I know this sounds cynical, but it just shines a flashlight on a lot of the reality that surrounds entire startup "ecosystems". Founders should be aware of it before they choose the blue pill or the red pill.
The only point about bootstrapping is that there's no "natural" bootstrapping. Either you're not really bootstrapping, because you "own" (one way or another) one of the sides, or you're faking it. Any other "strategy" is a pipe dream that leads to the bootstrapping-not-bootstrapping graveyard.
I wish people would understand that extremist thinking is wrong and obtuse. However effective it seems as a solution in your own case, it starts looking naive and damaging once you become the target, just because you have the wrong face or bad luck.
Fair point, but I can’t stop wondering whether the crux of the issue wasn’t the OS and decisions that go back to the iPhone or iPod. I can’t recall the details, but they tried to get MacOS to work on it and then decided to go with the alternative. I can only speculate, but Apple must have had a mobile story/dream for MacOS since the Newton, and even more so after NeXTSTEP.
So I wonder if what the article is pointing at wasn’t, more than anything, the inability to fully merge iOS and MacOS.