> it figures out your terminal dimensions. The underlying APIs it uses have effectively been stable since the earliest days of computing terminals—what, 50 years or so?
No, they haven't been stable, not really. The TIOCGWINSZ ioctl has never been standardized to my knowledge, and it has many different names on different Unixes and BSDs. The tcgetwinsize() function only got into POSIX in 2024, and this whole thing has a really sad history, honestly [0], and that's before we even get to the Windows side of things.
This was vaguely my takeaway from the article: it's not that his replacements are simpler because they're better or made by him. They're simpler because they're only handling his use-cases.
Sometimes - that's fine.
Sometimes - that's making his software worse for folks who have different use-cases, or are running on systems he doesn't understand or use himself.
The real value of a library, even with all those dependencies (and to be clear, I disagree that 3 or 4 dependencies for a library that runs across windows/linux is "all that many", esp when his platform specific implementation still uses at least 1), is that it turns out even relatively simple problems have a wealth of complexity to them. The person who's job it is to write that library is going to be more experienced in the subject domain than you (at least in the good cases) and they can deal with it. Most importantly - they can deal with your unknown, unknowns. The places you don't even have the experience to know you're missing information.
> They're simpler because they're only handling his use-cases.
This is a major part of the thesis of no dependencies. General code is bad code. It's slow, branchy, complex, filled with mutexes, NaN checks, etc. Read "The Old New Thing" to see the extreme end of this.
When you have a concrete goal you can apply assumptions that simplify the problem space.
A good example of this was Casey's work on a fast terminal. At first all the naysaying was "production terminals are really hard because you have to handle fonts and internationalization, accessibility, etc." Indeed those problems suck, but he used a general Windows API to render a concrete representation of the char set on demand, and the rest was simple.
As a user of software, I almost always prefer to have something that solves my problem, even if it's got some rough edges or warts. That's what general code is: stuff that solves a problem for lots of people.
Would I prefer a tool that solves exactly my problem in the best way possible? Yeah, sure. Do I want to pay what that costs in money, time or attention? Usually no. The general purpose tool is plenty good enough to solve the problem now and let me move on.
The value I get from solving the problem isn't really tied to how optimally I solve the problem. If I can buy a single hammer that drives all the nails I need today - that's a BETTER solution for me than spending 10 hours speccing out the ideal hammer for each nail and each hand that might hold it, much less paying for them all.
I'll have already finished if I just pick the general purpose hammer, getting my job done and providing value.
---
So to your terminal example - I think you're genuinely arguing for more general code here.
There's performance in making a terminal run at 6k fps. It's an art. It's clearly a skill and I can respect it. Sounds like it's an edge case that dude wants, so I'm in favor of trying to make the terminal faster (and more general).
But... I also don't give a flying fuck for anything I do. Printing 1gb of text to the terminal is useless to me from a value perspective (it's nearly 1000 full length novels of text, I can't read that much in a year if it was all I did, so going from 5 minutes to 5 seconds is almost meaningless to me).
The sum total of the value I see from that change is "maybe once or twice a year when I cat a long file by mistake, I don't have to hit ctrl-c".
I also genuinely fail to understand how this guy gets meaningful value from printing 1gb of text to a terminal that quickly either... even the fastest of speed readers are still going to be SO MANY orders of magnitude slower to process that, and anything else he might want to do with that text is already plenty fast - copying it to a new file? already fast. Searching it? fast. Deleting it? fast. Editing it? fast.
So... I won't make any comment on why this case is slow or the discussion around it (I haven't read it; it sounds like it could be faster, and they made a lot of excuses not to solve his specific edge case). All I'll say is your argument sure sounds like adding an edge case that nearly no one has, thereby making the terminal more general.
Any terminal I wrote for myself sure as fuck wouldn't be as fast as that because I don't have the rendering experience he has, and my use case doesn't need it at all.
For users and programmers. Slower to use. Harder to read and maintain. More code to do the same tasks.
> I almost always prefer to have something that solves my problem
I didn’t say anything about losing features. I don’t read Arabic. I’m glad the OS supports it. But it would be bad if Arabic complexity had to leak into every line of code. Limit the complexity scope. Casey’s terminal supports Arabic by following this principle.
> The value I get from solving the problem isn't really tied to how optimally I solve the problem.
Yes it is. There are ok products and there are great products.
> text still uses general solution
Operating systems have to solve general problems which is why they are expensive and brittle. Every program is not an operating system.
> There's performance in making a terminal run at 6k fps. It's an art.
I think you are missing context about the issue that led to this. And yes, as soon as there are no spinning wheels while doing normal tasks, then we can stop complaining about performance. But that’s not the situation.
> and my use case doesn't need it at all.
Would the world be better or worse if your operating system respected your time more?
> Would the world be better or worse if your operating system respected your time more?
This is a non-sequitur.
It's like saying that my Honda CRV is disrespecting my time because it has a top speed of 110 mph, when it could be a 230 mph F1 racecar - it's a bad argument. I drive on roads with a max speed limit of 75... and most are down near 25 mph.
> There are ok products and there are great products.
And users don't give a flying fuck about which yours is if they can't use it in the first place. That's why general things exist.
100%. If OP is willing to maintain a rust crate that takes in no dependencies and can determine terminal size on any platform I choose to build for, then I will gladly use your crate.
OTOH, if minimizing dependencies is important for a very specific project, then the extra work of implementing the functionality falls on that project. It will be simpler because it must only support one project. It may not receive critical compatibility or security updates, either.
It does not fall on the community to go and remove dependencies from all their battle tested crates that are in common use. I think anyone and everyone would choose a crate with fewer over more dependencies. So, go make them?
> If OP is willing to maintain a rust crate that takes in no dependencies and can determine terminal size on any platform I choose to build for, then I will gladly use your crate.
I already mentioned this on twitter, but not a lot of people work this way. I don't have to point you further than my sha1-smol crate. It was originally published under the sha1 name and was the one that the entire ecosystem used. As rust-crypto became more popular, there were demands that the name be used for rust-crypto instead.
I gave up the name and moved the crate to sha1-smol. It has decent downloads, but it only has 40 dependents vs. >600 for sha1. The data would indicate that people don't really care all that much about it.
(Or the sha1 crate is that much better than sha1-smol, but I'm not sure if people actually end up noticing the minor performance improvements in practice)
Not trying to be perverse - but why does sha1-smol need any dependencies, let alone 40? Isn't it a single public pure function from a slice of bytes to a 160-bit digest? (Implemented with some temporary state and a handful of private functions.)
(I am likely being completely naive, but I am well satisfied by Julia’s stdlib function for this, so wondering what makes Rust different since I’ve been doing more and more rust lately and am trying to understand what the difference might be).
> Not trying to be perverse - but why does sha1-smol need any dependencies, let alone 40?
It has zero dependencies, but only 40 crates on the eco system depend on it. Compared to sha1 which has 10 dependencies, but > 660 crates depend on it.
The rust-crypto sha1 crate has four direct dependencies:
cfg-if - a tiny library making it easier to use different implementations on different platforms. Maintained by the official rust-lang libs team. Also used by the rust std library. No recursive dependencies.
cpufeatures - a tiny library for detecting CPU features maintained by the rust-crypto people, with a recursive dependency only on libc (which is maintained by rust-lang and used by the standard library).
digest - a library providing traits abstracting over different hash functions, maintained by the rust-crypto people. In turn, digest depends on block-buffer (some code to minimize bounds checks) and crypto-common (more traits for abstracting), both maintained by the rust-crypto people. Both in turn depend on
- typenum, a third party library for compile time math, no dependencies.
- generic-array, a third-party library for fixed-length arrays with lengths computed by typenum; typenum is its only dependency.
- version_check, a third party library for checking what features the rustc being used supports (only block-buffer depends on this one).
sha1-asm - A library with a faster assembly implementation of sha1, maintained by the rust-crypto people. Dependent only on cc (C compiler support), maintained by rust-lang, used in building the rust compiler. cc is itself in turn dependent only on
- shlex, a third party library for splitting things the same way a shell does. So this tree ends in an actual third party dependency, but one you're already indirectly dependent on by simply using the rust compiler at all.
Oh, and in the next release:
- sha1-asm is being removed.
- The crypto-common dependencies are being replaced with hybrid-array, which is a replacement for generic-array maintained by the rust-crypto people, and is only dependent on typenum.
- The version_check dependency is being dropped.
---
So what's going on is
1. This is a faster implementation of sha1, using all the tricks to have different implementations on different platforms. This resulted in 2 third party dependencies, down to 1 in the next release.
2. This is fitting into a trait ecosystem, and is using tricks to abstract over them without degrading performance. This resulted in 2 third party dependencies, down to 1 in the next release.
3. The count of non-third party dependencies is inflated because rust-crypto has split up its code into multiple crates, and the rust-lang team has split its code up into multiple crates.
4. It's not 40, it's 9.
Is it worth the extra code? I guess that depends on how many times you are hashing things and how much you care about the performance and environmental impact of extra code. Or if you're doing something where you want to be generic over hash functions.
Uh, no, I understand the comment I was replying to forwards.
I simply decided not to point out the source of their misunderstanding since it had already been pointed out by the time I replied, and this comment is already way too long.
It's not standardized, but those calls do not change. The Windows calls in particular are guaranteed ABI-stable since they are compiled into a lot of binaries. There are definitely issues with ioctl, but the changes landing in terminal-size or any of the dependencies that caused all these releases are entirely unrelated to the ioctl/TIOCGWINSZ constants/winsize struct. That code hasn't changed.
In this case the terminal-size crate just calls Rustix tcgetwinsize, which in turn just calls the libc tcgetwinsize. So I suppose you could save yourself a whole bunch of dependencies by just doing the same yourself. The only cost is Windows support.
Whether this particular API has been stable, or at least reasonably defined, for 50 or 25 years is a detail, because the dependency doesn't even pretend to deal with that, and the function is unlikely to change or be removed in the near future.
Well, it hasn't. tcgetwinsize() was proposed (under this name) in 2017 and was standardized only in 2024. So it's a less than 10 year old API, which is missing from lots of libc implementations; see e.g. [0]. Before its appearance, you had to mess with doing ioctls and hope your libc exposed the TIOCGWINSZ constant (which glibc by default didn't).
I had to check the Rustix implementation again, because that would indicate that terminal-size wouldn't work on a number of operating systems. However, Rustix also uses TIOCGWINSZ in its tcgetwinsize implementation.
Terminals are a good example of something that seems really simple but is a major PITA, because there were too many different vendors in the early days and no industry standard emerged. What is the closest thing? VT100? VT102? I mostly write raw to those, but stuff like terminal size and various other features like raw (non-cooked) mode are crappy and require ioctls and such. Frankly, it sucks.
...but the libraries suck even more! If you don't want to link against ncurses then may God have mercy on your soul.
Last summer I toyed with writing an "async prompt" à la Erlang's shell, with output scrolling and line-editing (see e.g. [0] for an example of what I am talking about), but it is so bloody difficult to do correctly, especially when the input spans several lines and there are full-width characters on the screen, that I abandoned it.
[0] https://news.ycombinator.com/item?id=42039401