No, it's the modern web itself. The signal-to-noise ratio has become poor (as the article notes): much more crap to wade through before finding a gem somewhere.
That's ignoring software bloat, super-heavy web frameworks, social media's addictive algorithms, user tracking & what have you.
> About 300 geese live in this sleepy Bay Area suburb, equal to nearly 1% of our human population—and some say this town isn’t big enough for the both of us.
What? That's nothing.
Come to the Netherlands, especially the coastal areas. Last week I drove past a field covered in geese. Imagine a rough square with ~150 geese along each side, entirely filled in. Some quick math told me I could be looking at ~20k geese. Definitely >10k. In one swarm.
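The quick math above can be sanity-checked in a couple of lines (the square-grid assumption is the commenter's; the code is just the multiplication):

```python
# A filled-in square of geese, ~150 along each side (figures from the comment).
side = 150
total = side * side  # area of the square grid, in geese
print(total)         # 22500 -> roughly 20k, comfortably >10k
```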
Okay, that one was big enough to amaze even me. And I have seen many geese - especially the last few years.
But 300? Be happy they're there & enjoy the wildlife while you still have some, for ff's sake. Remember there was a time when wild animals outnumbered humans on this planet.
The cost of coding something (that may or may not serve a purpose) is approaching zero. That doesn't mean code is, or ever was, cheap.
I'm reminded of a quote I heard recently, concerning military drones: "hardware is becoming cheaper and cheaper; the software that controls it, more and more expensive".
Read: opening a can of pre-coded stuff is 'free'. But the engineering effort that went into that existing code, and the effort required to improve it, is on a hockey-stick curve.
It's sad Z80 production was discontinued. There are some niche IC manufacturers that specialize in legacy parts (e.g. Rochester Electronics comes to mind). It would have been nice if Zilog had passed on manufacture of the good ol' Z80 to a manufacturer like this. Even if it's just small production batches every couple of months or years.
There'll be plenty of hobbyist and legacy industrial/niche applications for a looong time to come.
Maybe not so thin: much of humanity's knowledge is embedded in the things we create (outside of language).
For example, a machine's design may let it tolerate inputs way outside spec and keep working fine. It may be built to take a beating, even though no manual mentions using it in a rough environment. There may be subtle or not-so-subtle tweaks made to it over the years.
So that machine embodies knowledge that may be 're-discovered' (by observing the machine in action) long after its original designer is gone.
Another example: the design of traffic systems and the layout of cities (mostly organic growth), and how they affect the flow of people and goods through a city.
Those are just a few examples. In short: knowledge is stored in other ways besides books, videos, etc., or people's heads.
That said: it's a bit sad there's so little (if anything) in the space between microcontrollers and feature-packed, Linux-capable SoCs.
I mean: these days a multi-core, 64-bit CPU and a few GB of RAM seems to be the absolute minimum for smartphones, tablets, etc., let alone desktop-style work. But remember that around y2k, masses of people were using single-core, sub-1GHz CPUs with a few hundred MB of RAM or less. And running full-featured GUIs, Quake 1/2/3 & co, web surfing, etc. on that. GUIs have even been done on sub-1MB RAM machines.
Microcontrollers, on the other hand, seem to top out at ~512 KB of RAM. I for one would love a part with integrated:
# Multi-core, but 32-bit CPU. 8+ cores cost 'nothing' in this context.
# Say, 8 MB+ RAM (up to a couple hundred MB)
# Simple 2D graphics, maybe a blitter, some sound hardware, etc.
# A few options for display output. Like, DisplayPort & VGA.
Read: relatively low complexity, but with the speed and power-efficient integration of modern ICs. The RP2350pc goes in this direction, but just isn't (quite) there.
> A company of 100 engineers should probably have 10-20% of the team allocated to just internal tools and making things go faster.
But beware of Jevons paradox.
Say a software project has 10 developers, and each build takes ~15 minutes. Most developers would take at least some care to check their patches and understand how they work before submitting. And then discuss follow-on steps with their team over a coffee.
Now imagine near-instant builds. A developer could copy-paste a fix, hit "submit", see "oh, that doesn't work, let's try something else", and repeat. You'll agree that probably wouldn't help improve codebase quality. It would just increase the # of non-functional patches that can be tested & rejected in a given time span.
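A hypothetical back-of-envelope illustrates the throughput effect (all numbers here are mine, not from the comment: an 8-hour day, ~5 minutes of thought per attempt):

```python
# How many submit-and-see attempts fit in a workday at different build times?
day_min = 8 * 60  # minutes in a working day

for build_min in (15, 0.1):  # slow CI vs. near-instant builds
    care_min = 5             # assumed minimum thought invested per attempt
    attempts = int(day_min // (build_min + care_min))
    print(f"{build_min}-minute builds: {attempts} attempts/day")
```

With 15-minute builds that's ~24 attempts a day; near-instant builds roughly quadruple the attempt rate without adding any thought per attempt, which is the Jevons-paradox worry in a nutshell.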
Nothing new in the article, imho. But it's a nice overview of what content creators are facing, and of what to look for when carving out a niche.
The #1 point, really: have access to data, experiences, or expert knowledge that's unique and can't be distilled from public sources or scraped from the internet. This has always been the case. It just carries more weight when AI agents are everywhere.