Hmmm. I have a different take there: when you're young and wild, you achieve stuff because you code first and think later. When you get older, you do it the other way around, which leads to your example.
In the early 2000s I was at a startup, and we delivered as rapidly in C# as we did in PHP. We just coded the shit.
I think what you described is a healthy progression: write dumb code -> figure out it doesn't scale -> add a bunch of clever abstraction layers -> realize you fucked yourself when you're on call for 12 hours over the weekend trying to fix a mess during a critical issue (times however many incidents it takes you to get it) -> write dumb code and only abstract when necessary.
Problem is, devs these days start at step two because that's what we teach everywhere - they never learned why it's done, because they skipped step one. It's all theoretical examples and dogma. Or they're solving problems at Google/Microsoft/etc. scale while being a 5-person startup. But everyone wants to apply "lessons learned" from big tech.
And all this advice usually comes from very questionable sources - tech influencers and book authors. People who spend more time talking about code than reading or writing it, and who earn money by selling you on an idea. I remember opening Uncle Bob's repo once when I was learning Clojure - the most unreadable, scattered codebase I've seen in the language. I would never want to work with that guy - yet Clean Code was a staple for years. Same with DDD preachers and event-driven gurus.
C# is the community where I've noticed this the most.