I doubt AI will replace any job in my lifetime (got 40-50 years left).
Progress will grind to a halt just like self driving cars did because the real world is just too chaotic and 'random' to be captured by a formula/equation/algorithm.
My prediction is: AGI is theoretically possible, but would require impractical amounts of computing power - kinda like how intergalactic travel will never happen.
And regarding the comparison with self-driving cars: they are still improving, the bar for them is just much higher. If autopilot works 99.9% of the time, then 1 out of 1000 drivers will die - so the technology has to be far better than that. For an LLM it's enough to be 90% good to be broadly useful.
It’s not about replacing all programmers. If one programmer with an AI assistant can do the work of 2 programmers, then one position is redundant.
Even with self-driving trucks: if one truck driver leads a convoy with an AI-controlled truck following behind, and just for safety you have a C&C center somewhere with one person monitoring 4 such AI trucks, ready to remotely take over in case of an unexpected event, then one truck driver position is redundant.
While I do think there is some threshold where increased productivity makes positions redundant, I don't think 2x would do it in most orgs. My current team easily has enough work for us all to be 2x more productive.
fwiw, self-driving cars did not grind to a halt; development just did not move as quickly as the pundits and self-promoters claimed. I just rode in a fully driverless car on public streets in downtown Austin this week.
Which is absolutely not solved by assigning it to _ either; if anything, it will just make the variable appear used in syntax highlighting, and that will make me forget to properly use it!
IDEs just gray out the unused variables, and that is a 1000x better way of handling this issue.
I think you misunderstand the purpose of assigning to _.
Extremely common bug 1: a function returns an error/future, but you don’t check/poll it “foo();”. May happen because you copy some tutorial code that’s not rigid about error checking, or you don’t realize that this language doesn’t have futures that run without being polled.
Quite common bug 2: you store the error/future, but don’t check/poll it “let a = foo();”. May happen when moving code, certain branches, copy paste error, not enough coffee.
Opt-in: it doesn’t matter if the call succeeded or the future is polled, leave me alone “let _ = foo()”. May happen in test code for example. This is not the common case, so the opt-in annoyance is justified to improve the common case.
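The three cases above can be sketched in Rust, whose lints match this design. `save_config` is a hypothetical fallible function, not something from the thread:

```rust
// Hypothetical fallible operation used to illustrate the three cases.
fn save_config() -> Result<(), String> {
    Ok(())
}

fn main() {
    // Bug 1: discard the Result outright -- the `unused_must_use`
    // lint warns: "unused `Result` that must be used".
    // save_config();

    // Bug 2: bind it but never check it -- `unused_variables` warns.
    // let a = save_config();

    // Opt-in: explicitly discard, silencing both lints.
    let _ = save_config();

    // The common case: actually handle the error.
    match save_config() {
        Ok(()) => println!("saved"),
        Err(e) => eprintln!("failed: {e}"),
    }
}
```

The point of requiring `let _ =` is that the noisy cases (bugs 1 and 2) stay noisy, while the rare deliberate discard pays a small, visible syntax tax.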
You misunderstand me - I’m talking about my original problem of an unused variable that happened due to me commenting out something for example. Temporarily unused variable if you will.
Without recursively commenting out every further variable that has also become unused by my action (which I hope we can all agree is extremely tedious and error prone), I am left with no choice but to assign it to _, which, as you mentioned, already has a normal usage. That makes it very hard later to discern which `_ =` is a temporarily unused variable and which is a deliberately ignored one. The very “feature” can cause bugs, besides being annoying as hell.
I agree. I gave Zig a fair shake, even went so far as to find the compiler PR that automatically adds `_ = foo` to your source code while compiling[1] and set up the VSCode extension with autofix. I couldn't get used to seeing the lines with `_ =` appear and disappear all over my code while I was typing.
With the usual warnings approach, you can rely on compiler/IDE/pre-commit tooling to find unused variables - they'll all nag you until it's fixed. With Zig's autofix approach, those problems are immediately silenced and you don't have any help from the compiler or tooling to find unused variables. It's quite an ironic outcome if you think about it.
That is something a decent type system should solve - make it impossible to pass 'incomplete' values on, so any code further on that depends on whether the error was handled expects the appropriate type, and the compiler errors at that call site.
An unused variable means you weren't passing it on anywhere so there is no code which depends on its value, so how can it be a bug?
    future = x.do_async();
    return;

should not error out because of an 'unused variable'; it should give an error message concerning the lifetime of the future object.
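This is roughly how Rust handles it: the diagnostic hangs off the type rather than the variable. A minimal sketch, with a hypothetical future-like handle marked `#[must_use]`:

```rust
// A lazy, future-like handle marked #[must_use]: the compiler complains
// when it is dropped unpolled, even if it was never bound to a variable.
#[must_use = "does nothing unless polled"]
struct Pending(bool);

impl Pending {
    fn poll(mut self) -> bool {
        self.0 = true;
        self.0
    }
}

fn do_async() -> Pending {
    Pending(false)
}

fn main() {
    // do_async();              // warning: unused `Pending` that must be used
    // let future = do_async(); // warning still points at the unpolled handle
    assert!(do_async().poll()); // consuming the handle satisfies the lint
}
```

Because the annotation lives on the type, the message can say *why* dropping it is suspicious, which a generic unused-variable error cannot.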
As someone who works at a company with an old, horrible code base, for me it's not so much that it can cause bugs, but that it prevents people from doing stupid things. In one part of our codebase we have a long legacy function that maintains a string meant to show what "state" a process is in, but the string is never actually used by the system. So we have code that looks like this:
    public void processTransaction() {
        string state = "start";
        getCardDetails();
        state = "serialize";
        serializeCardData();
        state = "send data";
        sendData();
        state = "check return code";
        bool success = checkReturnCode();
        if (!success) {
            state = "failed";
            doFailThing();
            return;
        }
        state = "store transaction record";
        storeTheThing();
        state = "complete";
    }
First, this code is a simple example; in reality, the code has a bunch of branching and doesn't actually call out to functions to do things like "getCardDetails," so just replace that function in the example above with some parsing logic that parses Track1 and Track2 data from a string. The equivalent method we actually have in our code base is ~1500 lines of code. But that string is doing nothing a comment couldn't accomplish. Or more importantly, nothing the code couldn't describe itself if it actually adhered to good design principles. Ideally, I would refactor this, but the owner of the company is adamant about keeping this 1500-line abomination untouched.
In my experience, unused variables in production code are usually filling a void that a comment or good design should have filled.
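One way to make the dead string earn its keep is to turn the state into real structure that the code actually consumes. A hedged sketch (in Rust rather than the C#-like original, with hypothetical step names), where every state assignment has an observable effect via a logger:

```rust
// The "state" becomes an enum that is actually consumed (here, by a
// logging helper) instead of a write-only string.
#[derive(Debug, Clone, Copy, PartialEq)]
enum TxState {
    Start,
    Serialize,
    SendData,
    CheckReturnCode,
    Failed,
    Complete,
}

fn log_state(state: TxState) {
    // A real consumer: assignments can no longer silently do nothing.
    println!("transaction state: {:?}", state);
}

fn process_transaction(return_code_ok: bool) -> TxState {
    let mut state = TxState::Start;
    log_state(state);
    for next in [TxState::Serialize, TxState::SendData, TxState::CheckReturnCode] {
        state = next;
        log_state(state);
    }
    state = if return_code_ok { TxState::Complete } else { TxState::Failed };
    log_state(state);
    state
}
```

With the enum consumed on every transition, an unused-variable check has nothing left to complain about, and the "state" is now visible at runtime instead of only to readers of the source.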
The good news is that although intuitively that feels horrible, mechanically those are string literals, so each assignment doesn't actually do very much, just re-assign a pointer. Moreover, any optimising compiler will see those assignments are futile and elide them, whereupon in release builds the string might as well be a comment.
And yes, lots of people who don't like Zig's choice agree unused variables are bad and shouldn't survive into your release code. They just don't agree with Andrew that it's a fatal error and the program shouldn't build.