> Fundamental fairness requires that if an automated system denies you a loan, a house, or a job, it be able to explain something you can challenge, fix, or at least understand.
That could get interesting, as most companies will not provide feedback if you are denied employment.
Fair point. Maybe the requirement should be that the automated system provide an explanation that some human could review for fairness and correctness. While who receives the explanation may be a separate question, the drawback of LLMs judging people is that said explanation may not even exist.
To be fair, we could also just optimize the runtime engines for interpreted languages.
I do enjoy golang, but Rust gives me nightmares. I make my living in higher level languages.
When I started learning to program, JavaScript was just starting to gain popularity outside of the browser. It was the first language I could actually grasp, and I largely thank it for giving me a career.
No more evictions for me!
The only real downside to JavaScript being used as a tool for native apps with stuff like Electron is that it eats RAM. Everything needs to ship a full Chrome binary.
But if we go back to native applications, we don't get things like quality Linux ports. If you had told me 15 years ago that Microsoft would create the most popular IDE on Linux, I'd have assumed you had misspoken.
I actually compared WASM to Javascript for a particular integer-math-heavy task. For a single run, Javascript beat out WASM because WASM had a lot more setup time. After running both 1000 times, they were almost equal in runtime.
Yes, even though the JavaScript was written using doubles and the WASM was written using 64-bit ints. It just means that it's possible to write optimized JavaScript (mainly by reducing object allocations and reusing objects instead).
A benchmark of adding numbers doesn’t tell you how it performs on real world websites and codebases. I wouldn’t be surprised if JavaScript was still very competitive, simply because of how good V8 is, but I don’t think we can conclude anything from your benchmark.
Of course it is always possible to write highly optimised code. But that's not what people actually do, because of time, skill and maintenance constraints. Here's a case study: in 2018 Mozilla ported some code from JS to Rust + WASM and got a 6x speed up [1]. An expert in V8 responded to this with highly optimised JavaScript, saying "Maybe you don't need Rust and WASM to speed up your JS" [2]. Both articles are worth reading! But it is worth remembering that it's a lot quicker and easier to write the code in #1 than #2, and it is easier to maintain as well.
It wasn't some dummy "add numbers" loop, this was doing math (multiply-add) on large 336-bit integers.
Performance sucked when I used native JavaScript BigInts. When I made my own BigInt by using an array of doubles, and pretended that the doubles were 48-bit integers, performance was much better. Using the arrays meant that all allocation of temporary values completely stopped. I had to write my own multiply-and-add function that would do bigint = bigint * 48-bit number + other bigint + other 48-bit number.
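For anyone curious what that looks like, here's a simplified sketch of the technique (not the original code): a big integer stored as a little-endian array of doubles, each holding one limb, with a fused multiply-and-add that mutates the array in place so the hot loop allocates nothing. I've used 24-bit limbs here instead of the 48-bit limbs described above, so that every intermediate product stays below 2^48 and fits exactly in a double's 53-bit integer range; 48-bit limbs would additionally require splitting the multiplier into halves.

```javascript
const LIMB_BITS = 24;
const LIMB_BASE = 1 << LIMB_BITS; // 16777216

// In place: acc = acc * k + b + m, where k and m are plain numbers below
// 2^24 and acc/b are limb arrays of equal length (little-endian).
// Each intermediate t is below ~2^48 + 2^25, well inside the 53-bit range
// where doubles represent integers exactly, so no precision is lost and
// no temporary objects are created.
function mulAddInPlace(acc, k, b, m) {
  let carry = m;
  for (let i = 0; i < acc.length; i++) {
    const t = acc[i] * k + b[i] + carry;
    acc[i] = t % LIMB_BASE;
    carry = Math.floor(t / LIMB_BASE);
  }
  return carry; // a nonzero carry means the result overflowed the fixed width
}

// Example: the two-limb number 5, times 3, plus 2, plus 1 = 18.
const acc = [5, 0];
mulAddInPlace(acc, 3, [2, 0], 1);
```

Reusing `acc` across iterations is the whole trick: the engine JITs the loop into straight arithmetic on a flat array, with none of the heap churn that native BigInt's immutable temporaries cause.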
V8 means JavaScript can be fast. However, no amount of optimization can get around inefficient code. There is only so much an optimizer can do about too many layers of abstraction, calculations that are performed but never needed, and nested loops. Someone needs to step back once in a while and fix bottlenecks to make things fast.
Your mental model of integer vs double performance sounds outdated by decades. I'd suggest reading up on instruction performance in Agner Fog's tables; it should be eye-opening.
Related, somewhat, but I am moving away from Microsoft Office and to Apple's Pages/Numbers/Keynote/FreeForm, mostly because they are free and they are good enough.
They’re technically not “free”, they’re just bundled with new Macs. It’s a subtle distinction but if you have an old Mac you still have to pay for them.
OS/2 never had a chance. I was working at Radio Shack at the time that IBM was trying to sell Aptivas with OS/2. No one wanted it.
It was just weird to people. Microsoft did a big consumer push for Windows 95 and there were lines to buy it and Bill Gates promoted it on the Jay Leno show.
There were plenty of people who wanted OS/2. They simply weren't the type of people who would go to Radio Shack.
By plenty, I should be clear: it probably wasn't enough for OS/2 to survive. IBM made some bad decisions early on. Microsoft was also a thorn in everyone's side. While their product was good enough, their business practices were savage. Possibly something they learned from IBM's legacy.
I was in my late teens/early twenties at the time. What I learned about how major corporations behaved at the time is likely what led to my lifelong interest in open source.
Douglas Coupland wrote a fictional book called Microserfs. He did a lot of comparison between the culture of IBM and Microsoft. Not a bad book, as I recall.
It was never really suited to retail sales channels like Radio Shack. It was more of a corporate thing although it obviously failed there as well.
I only came across it at one site, a big UK bank (Midland Bank). They were using a heavily modified version, it didn’t look anything like the original product.
The longer answer is that a lot of people already use Git for Debian version control, and the article expands on how this will be better-integrated in the future. But what goes into the archive (for building) is fundamentally just a source package with a version number. There's a changelog, but you're free to lie in it if you so wish.