I came here to pretty much make the same comment. If you have a business that can get away with OTS software, more power to you. But complicated things are complicated, and complex billing rules can be make-or-break for a company. Whether you start with a third-party product or go home-grown, billing systems often need humans, and there really isn't a substitute.
Somebody mentioned this in a reply, but it deserves to be a top level comment. A good chunk of this year's unusual heat may be attributable to a reduction in SO2 emissions from shipping. The article cites other factors as well, but this implies that 2023 is the new normal.
It's true that they are still there, but I think loyalty is really soft. Musk seems to make some antagonizing change at least once a month that results in defections, the whole Substack thing being the latest. I follow Rex Chapman who seems to be one of those people that just does not want to have to rebuild his following somewhere else. He just recently signed up at Spoutible.
There is a bit of a forest / trees problem here. Python's simplicity is all about lowering the barrier to computing. Because the barrier to using Python is so much lower than for Rust et al., you can get to a point where you've automated some human activity much more quickly or, indeed, even bothered to automate it at all. Someone knocking out a Python tool so that a manager doesn't have to gather spreadsheets every month to monitor progress is better than it not happening at all because Rust doesn't have the higher-level libraries to make it a five-day, one-person project.
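To make the scale concrete, the spreadsheet-gathering chore above is maybe a dozen lines of throwaway Python. The file names and column layout here are invented for the sketch:

```python
import csv
import glob

def rollup(pattern="status_*.csv", out="rollup.csv"):
    """Concatenate all matching status CSVs into one roll-up file,
    keeping a single header row. Returns the number of data rows."""
    header, rows = None, []
    for path in sorted(glob.glob(pattern)):
        with open(path, newline="") as f:
            reader = csv.reader(f)
            file_header = next(reader)  # assume identical headers per file
            if header is None:
                header = file_header
            rows.extend(reader)
    if header is not None:
        with open(out, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(header)
            writer.writerows(rows)
    return len(rows)
```

Nothing clever, and that's the point: the ceiling for "is this worth automating?" drops to an afternoon.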
This article really just reads like there is some internal griping about not using Python.
About 20 years ago, I was at a pharmaceutical company and, to put together a web application,
- we had to buy a prod and a dev database server
- we had to buy a prod and a dev application server
- we needed Oracle as a database
- we needed a DBA, a sysadmin, and a hardware team
- we needed Tomcat / Java and Oracle development expertise
The idea of developing and running this application with a single person would have been laughable.
Today, because tools are soooo accessible, I can, and do, routinely spin up my own VM, apply Puppet, drop my MySQL and Django containers onto that VM, and pull https certificates in addition to doing the front and back end software development.
Life would be waaay simpler if I could just write server side web application code and wait around for database developers, sys admins, and front-end folks to do their thing. Imagine not having to learn a testing harness because there are actually people testing the software!
This is one thing I feel conflicted about. On one hand, it is a lot easier to do those things (VMs, docker, DBs, etc).
On the other hand, now a regular run-of-the-mill developer is also supposed to be knowledgeable about sooo much stuff that is not so directly related to their code. As you say, sometimes it’s nice to be able to hand that off to specialists - a person can’t be good at everything.
That word seems to mean different things to different people.
I have heard explanations on the internet like "it means that developers and operations are communicating more and working together as one big team", but so far every company I've worked at has interpreted it as "in addition to developing the application, you are also supposed to do the operations".
I think this is really the crux of the matter. More generally, STABLE management that drives software quality. Even if you're lucky enough to be part of a team that starts off with good management, reorgs, mergers, and turnover are just way too frequent to allow the year-over-year improvements needed for really good, efficient code.
For something like this to be successful, I think it has to have some kind of domain focus. I've been working in bioinformatics-adjacent jobs for the last couple decades and I've seen things like this get used when there were a lot of integrated, domain-specific tools. Even then, the users were a curious slice of the population that could think like a programmer, but had not bothered to learn a language.
It also has to have a lot of buy-in and support. As everyone here is fully aware, a lot of problems get solved by Googling for a Stack Overflow answer. That's tough to do with a niche-y tool.
Not saying this isn't a good thing; it's just that it'll be difficult to be successful as a general purpose tool.
I work on the software side of the same field, and visual flow programming tools like this do help bioinformaticians and scientists visualise analytics code bases, but they're not sold well to customers. Somehow the teams that sell these products still think in terms of low code or data science, when they should probably be speaking in terms of translational medicine.
I think the whole value of the Unix tool ecosystem is that you don't need to build large pieces of software. If you look at all the GUI tools that Microsoft had to build for Windows administration in the earlier days and how they eventually came out with PowerShell because of the inadequacy of those tools, you can see how successful Unix tools have been by the --lack-- of big software that does their job.
I know it's just piling on at this point, but I've been doing a lot of software builds and deployment over the last 20 years, mostly in a scientific computing context. Nothing I've dealt with is as bad as JavaScript.
I've generated and hacked dozens of autotools builds and written m4 macros of my own; I've hacked CMake builds; written RPM spec files; fixed Rcpp package installs; written and modified innumerable Makefiles; set up Tomcat webapps; and I use Python for most of my daily work. I've even hacked around with Boost's build thing.
Complexity is not the issue with JavaScript. Stability is the problem.
Most devs do not want to "know" build systems. You want it to work most of the time and when it doesn't, you want to be able to find the answer to a problem. You want that knowledge to accumulate over time so that the answer you found the last time still works.
Autotools is an insane system, but it's been the same insane for 30 years.
Currently using vue-cli with Vue 2 and Vuetify, a configuration that depends on a lot of precisely pinned versions to stay functional. The lengthy build process spews a pile of deprecated-library warnings, and I have no idea how long it will take me to address them. Could be a couple of hours, could be several days, could be impossible.
Would like to try Vite or esbuild, because God knows we could use the speedup, but after investing a couple of weeks to figure out the precise balance of versions that will work and propagating it to our 8 or 9 applications, where are we going to be next year? I saw a tweet yesterday about Vite on swc. Is that going to be the winner?
I half wish Richard Stallman would take over. How desperate is that?
You really prefer autotools or CMake to esbuild just because they are older and don't change as much? I will take esbuild over CMake every day of the week and twice on Sunday. C++ build tools are an abject disaster. (And CMake does change anyway.)
Not the parent commenter. I spent time using autotools long ago and developed a grudging respect for what they did and how they worked. Their design made sense, even though the system itself was quite cursed. CMake also—I never choose it for my own projects, and I hate using it, but I respect the consistency and portability.
Building JavaScript is just a fucking disaster. I’ve used a fair number of different bundling tools—Browserify, Webpack, Rollup, Esbuild, and Closure (the compiler). I’ve also used old-school concatenation… anything from <script> tags to something like “cat”. It’s just fucking awful. Stay on the straight, narrow path and you’ll survive. Deviate slightly and you’re doomed.
Making things worse, you might have two different environments you run your code in: Node.js and the browser. Surprisingly, the browser lurches ever forward, and Node.js is the anchor keeping us in the past. Making things worse, you might try to use TypeScript. Making things worse, you might use JSX.
Nearly every JS project I work on needs a ton of babysitting w.r.t. the build system. At various places I’ve worked, there might be a team handling that for me, but if I’m working on a side project, it is very difficult to keep the complexity of the build system down to a reasonable level. Most guides on how to use frameworks will tell you to do something like “oh, just use create-react-app” or similar, and you end up with a couple dozen new dependencies. Your build system will be a mix of templated code pasted into your repo and third-party libraries. Integrating with anything else often requires various "adapter" dependencies, but it's a roll of the dice whether those libraries are built reasonably.
My basic desire is often a fairly simple list… I want front-end TypeScript code to be type-checked and bundled, I want a dev server that serves the bundle, I want to see build errors quickly and easily, I want to run tests without a browser environment. I know this is possible, but every time I’ve gotten it, it’s taken an unreasonable amount of effort.
Not your parent commenter. I worked both with CMake/autotools and Java build tools (not esbuild specifically) and the point (which I agree with) is - yes - C/C++ build tools are complicated, but once you set it up it just runs most of the time without any problem. With things like webpack you are always one upgrade away from things breaking down in spectacular fashion, and the troubleshooting involves hunting github issues, blogs, and sometimes looking through the source code to figure out why this combination of npm package versions gives you the trouble.
Somebody once described configuring fvwm2 (which is still my WM of choice, and indeed was theirs at the time of writing) as "like training a brain damaged hyena."
The thing is, as the same somebody noted, that then it stays trained.
To quote from upthread, "the same insane for 30 years" is in and of itself a huge feature that makes up for quite a lot of insane.
I must be crazy, because I still prefer regular Make to CMake. It seems like CMake tried to fix autotools' problems (which were not Make's problems) but then created new problems of its own. Why is setting an install prefix so complicated when it's the ONE option that almost everyone wants? Why the case-sensitive name ending in .txt?
And CMake still seems extremely C/C++ focused. I don't want a language-specific build tool anymore. Heck, I'm not even sure I necessarily need a build tool so much as a generic task runner with patterns and complex/dynamic dependencies.
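The "generic task runner" idea is small enough to sketch. Here's a toy version (all names invented for the example) where tasks declare their dependencies and a resolver runs each one exactly once, dependencies first:

```python
# Toy generic task runner: tasks register their dependencies, and
# run() executes each task once, depth-first through the dependency DAG.
tasks = {}

def task(*deps):
    """Decorator that registers a function as a named task with dependencies."""
    def register(fn):
        tasks[fn.__name__] = (deps, fn)
        return fn
    return register

def run(name, done=None, order=None):
    """Run `name` and everything it depends on; return the run order."""
    done = set() if done is None else done
    order = [] if order is None else order
    if name in done:
        return order
    deps, fn = tasks[name]
    for dep in deps:
        run(dep, done, order)
    fn()
    done.add(name)
    order.append(name)
    return order

@task()
def build():
    print("building")

@task("build")
def check():
    print("testing")

@task("build", "check")
def dist():
    print("packaging")
```

A real tool would add file patterns and change detection on top, but "complex/dynamic dependencies" fall out naturally here because the dependency list is just data you can compute at registration time.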
I think CMake is a real pain and Autotools is an older pain. But I can figure them out and things I learned two blue moons ago are still basically true. The churn in JS makes it really hard for me to accumulate knowledge and solutions.
I think you missed the point. Churn is a real problem. Even if the tools are improving, that does need to be balanced against stability. They admitted autotools was insane, but it was a consistent, well-known insanity.
Things are getting better, but JS has definitely had more churn, and reinvented more wheels, than anything else I know of.
I think you missed my point. Churn is obviously less desirable in the abstract, but the state of C++ build tools in particular is deplorable and the lack of churn doesn't come close to compensating for how terrible it is.
esbuild is the gold standard. Its creator has done a phenomenal job (and it's still early in its lifecycle). It's the first time I used a JS build tool and didn't scream at the monitor when dealing with nonsensical APIs and error output.
The "early in its lifecycle" part is the source of GP's worry. esbuild _seems_ great now, but based on the history of JS tooling, who knows what will happen in a year or two. Look at the recent example of the JS community's multi-year embrace and then rejection of Webpack. I would be nervous about migrating too.
I share a similar fear (generally speaking), but in this particular case I think the risk is worth it because the cost of setup is next to zero and the speed of the thing is unreal. If something catastrophic happens with the project years down the road, at most I'll have spent a few hours getting it wired up (worth it considering the positive impact it's had on my productivity).
It's not the usual JS personality screaming "wEbPaCk iS dEaD!!!lol"; it's clear the author is thoughtful (observable via the project itself, the docs, and how he interacts with people in GitHub comments).
One advantage of autotools is that at least a lot of the skills are transferable. Shell scripts and Makefiles (and OK, M4 isn't so popular these days, but it was for a reasonably long time) are things I can learn and use everywhere, not just this project, not just C projects, not even just development.
I don't learn any transferable skills writing Webpack or Babel config. They don't even necessarily transfer to the next generation of the same tool.
I would also take a stable autotools over an unstable autotools, and I consider most of JS-land roughly equal in usability to unstable autotools. Autotools was also not that bad on the happy path where you were targeting, say, the top five POSIX systems over ten years - and maybe you didn't even need it at all, if you didn't need the performance from distinguishing each platform's best supported fd polling variant. Vs. JS, which is still bad even if you're only targeting evergreen browsers, because maybe you're stuck trying to integrate CSS modules with TypeScript or something. That kind of problem simply doesn't exist in autotools world.
You're not totally wrong; I'm getting into that age range.
It's not that I prefer those tools; they are really obtuse. It's just that I can't handle the moving target of JavaScript. I'm sure I can figure out esbuild just fine, but in a year or two it could be abandoned in favor of swc or Vite or a Vite-esbuild-swc uber package.