rilindo's comments | Hacker News

<nevermind, I see the filter for size>


> Fundamental fairness requires that if an automated system denies you a loan, a house, or a job, it be able to explain something you can challenge, fix, or at least understand.

That could get interesting, as most companies will not provide feedback if you are denied employment.


Fair point. Maybe the requirement should be that the automated system provide an explanation that some human could review for fairness and correctness. While who receives the explanation may be a separate question, the drawback of LLMs judging people is that said explanation may not even exist.


The overuse of metaphors makes me feel like this person is trying to reinvent Chef, but for AI.


Here is mine:

https://rilindo.com

Just reactivated it recently; I mostly update it daily with links and short pithy comments.


Another hot take: maybe we will see a spike in compiled languages like Go, Rust, and WASM over Python, Ruby, and Node.


To be fair, we could also just optimize the runtime engines for interpreted languages.

I do enjoy golang, but Rust gives me nightmares. I make my living in higher level languages.

When I started learning to program, JavaScript was just starting to gain popularity outside of the browser. It was the first language I could actually grasp, and I largely thank it for giving me a career.

No more evictions for me!

The only real downside to JavaScript being used as a tool for native apps, with stuff like Electron, is that it eats RAM. Everything needs to ship a full Chrome binary.

But if we go back to native applications, we don't get things like quality Linux ports. If you had told me 15 years ago that Microsoft would create the most popular IDE on Linux, I'd have assumed you had misspoken.


we can use AI to rewrite everything in Rust

this way all the RAM that AI data centers scoop up will be used to lessen demand for RAM that those same datacenters created

net-zero RAM!


Are you selling renewable memory offset credits? My company is seeking to burnish our ESG reputation.


You've got to [download more ram](https://downloadmoreram.com/)


Yet another hot take: we won’t see any of that. Instead users will simply get used to waiting.


This.

Node.js and Python were used in 2012; why is now any different?


I actually compared WASM to JavaScript for a particular integer-math-heavy task. For a single run, JavaScript beat out WASM because WASM had a lot more setup time. After running both 1000 times, they were almost equal in runtime.

Yes, even though the JavaScript was written using doubles and the WASM was written using 64-bit ints. It just means that it's possible to write optimized JavaScript (mainly by reducing object allocations and reusing objects instead).


A benchmark of adding numbers doesn’t tell you how it performs on real world websites and codebases. I wouldn’t be surprised if JavaScript was still very competitive, simply because of how good V8 is, but I don’t think we can conclude anything from your benchmark.

Of course it is always possible to write highly optimised code. But that's not what people actually do, because of time, skill, and maintenance constraints. Here's a case study: in 2018 Mozilla ported some code from JS to Rust + WASM and got a 6x speedup [1]. An expert in V8 responded with highly optimised JavaScript in "Maybe you don't need Rust and WASM to speed up your JS" [2]. Both articles are worth reading! But it is worth remembering that the code in [1] is a lot quicker and easier to write than the code in [2], and easier to maintain as well.

[1] - https://hacks.mozilla.org/2018/01/oxidizing-source-maps-with...

[2] - https://mrale.ph/blog/2018/02/03/maybe-you-dont-need-rust-to...


It wasn't some dummy "add numbers" loop; it was doing math (multiply-add) on large 336-bit integers.

Performance sucked when I used native JavaScript BigInts. When I made my own bigint using an array of doubles and pretended the doubles were 48-bit integers, performance was much better. Using the arrays meant that all allocation of temporary values completely stopped. I had to write my own multiply-and-add function that would compute bigint = bigint * 48-bit number + other bigint + other 48-bit number.
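For anyone curious, here is a minimal sketch of the limbs-in-doubles trick described above. The parent used 48-bit limbs; this sketch uses 24-bit limbs instead, so every intermediate product fits exactly in a double (2^24 * 2^24 < 2^53) without the extra splitting a 48-bit-limb version would need. All names here are illustrative, not from the original code.

```javascript
const LIMB_BITS = 24;
const LIMB_BASE = 1 << LIMB_BITS; // 2^24
const LIMB_MASK = LIMB_BASE - 1;

// In-place fused step: acc = acc * m + addend + k, where m and k are plain
// numbers below 2^24 and addend is a limb array (little-endian) no longer
// than acc. Mutating acc avoids allocating temporaries on every call.
function mulAdd(acc, m, addend, k) {
  let carry = k;
  for (let i = 0; i < acc.length; i++) {
    const t = acc[i] * m + (addend[i] || 0) + carry; // exact: t < 2^53
    const low = t % LIMB_BASE;
    acc[i] = low;
    carry = (t - low) / LIMB_BASE; // exact integer division
  }
  while (carry > 0) { // grow only when the result needs more limbs
    acc.push(carry % LIMB_BASE);
    carry = Math.floor(carry / LIMB_BASE);
  }
  return acc;
}
```

Because the carry never exceeds one limb, the push at the end runs at most once per call, and the hot loop touches no heap at all.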


V8 means JavaScript can be fast. However, no amount of engine optimization can get around inefficient code. There is only so much optimizers can do about too many layers of abstraction, calculations that are performed but not needed, and nested loops. Someone needs to step back once in a while and fix bottlenecks to make things fast.


Your mental model of integer vs. double performance sounds decades out of date. I'd suggest reading up on instruction performance on Agner Fog's site; it should be eye-opening.


Related, somewhat, but I am moving away from Microsoft Office and to Apple's Pages/Numbers/Keynote/FreeForm, mostly because they are free and they are good enough.


They’re technically not “free”, they’re just bundled with new Macs. It’s a subtle distinction but if you have an old Mac you still have to pay for them.


I've got complicated feelings about that. He did help pave the way for Linux, but he also killed OS/2.


OS/2 never had a chance. I was working at Radio Shack at the time that IBM was trying to sell Aptivas with OS/2. No one wanted it.

It was just weird to people. Microsoft did a big consumer push for Windows 95 and there were lines to buy it and Bill Gates promoted it on the Jay Leno show.

Windows 95 almost killed Apple.


There were plenty of people who wanted OS/2. They simply weren't the type of people who would go to Radio Shack.

By plenty, I should be clear: it probably wasn't enough for OS/2 to survive. IBM made some bad decisions early on. Microsoft was also a thorn in everyone's side. While their product was good enough, their business practices were savage. Possibly something they learned from IBM's legacy.

I was in my late teens/early twenties at the time. What I learned then about how major corporations operated is likely what led to my lifelong interest in open source.


Microsoft spent a billion 1995 dollars marketing Win 95. Plus they had existing DOS and Windows 3.x users.

Arguably OS/2 was "better" (depending on your definition) but MS were all-in on Windows, and for IBM OS/2 was a side-line. It never stood a chance.


Douglas Coupland wrote a fictional book called Microserfs. He did a lot of comparison between the cultures of IBM and Microsoft. Not a bad book, as I recall.


It was never really suited to retail sales channels like Radio Shack. It was more of a corporate thing although it obviously failed there as well.

I only came across it at one site, a big UK bank (Midland Bank). They were using a heavily modified version, it didn’t look anything like the original product.


OS/2 had some of the ugliest icons I've ever seen (and that elephant!) - looking cute always wins.


Yeah, I remember switching from OS/2 to Linux when OS/2 was more or less abandoned.


I always thought that Debian was already on Git, so this confused me. How is source control currently done (or how was it done) within the Debian project?


The short answer is that it's not.

The longer answer is that a lot of people already use Git for Debian version control, and the article expands on how this will be better-integrated in the future. But what goes into the archive (for building) is fundamentally just a source package with a version number. There's a changelog, but you're free to lie in it if you so wish.
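To illustrate, here is what a debian/changelog entry looks like (everything in it — package name, version, bug number, maintainer — is invented for this example). The version in the first line is what actually identifies the source package in the archive; nothing forces the text beneath it to be accurate:

```
hello (2.10-3) unstable; urgency=medium

  * Fix build failure with newer GCC. (Closes: #999999)

 -- Jane Developer <jane@example.org>  Mon, 06 May 2024 12:00:00 +0000
```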


This feels suspiciously like the sendmail fiction of Postfix. I don't know if that is a good thing.

EDIT: Not really, but it still feels like adding unneeded


Archer? Is that you?

