> Swift has turned into a gigantic super complicated bag of special cases, special syntax, special stuff...
That's true, but only partly true. It already was a gigantic super complicated bag of special cases right from the start.
Rob Rix noted the following 10 years ago:
Swift is a crescendo of special cases stopping just short of the general; the result is complexity in the semantics, complexity in the behaviour (i.e. bugs), and complexity in use (i.e. workarounds).
Apple's new Swift language has taken a page from the C++ and Java playbooks and made initialization a special case. Well, lots of special cases actually. The Swift book has 30 pages on initialization, and they aren't just illustration and explanation; they are dense with rules and special cases.
Of course, that doesn't mean that it didn't get worse. It got a lot worse. For example (me again, 2020):
I was really surprised to learn that Swift recently adopted Smalltalk keyword syntax ... Of course, Swift wouldn't be Swift if this weren't a special case of a special case, specifically the case of multiple trailing closures, which is a special case of trailing closures, which are weird and special-casey enough by themselves.
A prediction I made was that these rules, despite or more likely because of their complexity, would not be sufficient. And that turned out to be correct: as predicted, people turned to workarounds, just like they did with C++ and Java constructors.
So it is true that it is now bad and that it has gotten worse. It's just not the case that it was ever simple to start with. And the further explosion of complexity was not some accidental thing that happened to what was otherwise a good beginning. That very explosion was already pretty much predetermined in the language as it existed from inception and in the values that were visible.
From my exchange with Chris regarding initializers:
"Chris Lattner said...
Marcel, I totally agree with your simplicity goal, but this isn't practical unless you are willing to sacrifice non-default initializable types (e.g. non-nullable pointers) or memory safety."
Part of my response:
"Let me turn it around: Chris, I totally agree with your goal of initializable types, but it is just not practical unless you are willing to sacrifice simplicity, parsimony and power (and ignore the fact that it doesn't actually work)."
Simplicity is not the easy option. Simplicity is hard. Swift took the easy route.
[...] when you first attack a problem it seems really simple because you don't understand it. Then when you start to really understand it, you come up with these very complicated solutions because it's really hairy. Most people stop there. But a few people keep burning the midnight oil and finally understand the underlying principles of the problem and come up with an elegantly simple solution for it. But very few people go the distance to get there.
-- Steve Jobs (borrowed and adapted from Heinlein)
But they can silently drop their mistakes without ever mentioning them again, for example Garbage Collection, Modern Objective-C Syntax, and Cocoa-Java.
While they will do this and just start treating the old thing as if it were brand new and shiny, it helps if they actually do have some new shiny thing.
Happy to rebrand Objective-Smalltalk into AppleTalk. The network protocol was dropped 15 years ago, so that could work.
Lasers have very limited applications: they have an inherent line-of-sight limit, and even the most powerful ship-mounted lasers, which do around 50 kW, take a minute to boil a kettle of water away, more if you wrap it in tinfoil.
And a shot might cost $10, but the laser itself costs $$$, fits only in a cargo container, and requires crazy amounts of juice.
Meanwhile a simple AA gun needs none of those things and can kill things just fine.
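The kettle figure actually checks out as back-of-the-envelope arithmetic. A sketch, assuming roughly 1 liter of water, full vaporization, and perfect absorption of the beam (no losses, no reflective "tinfoil"):

```python
# Energy to heat 1 kg (~1 L) of water from 20 degC to 100 degC and then boil it all away.
c_water = 4186      # specific heat of water, J/(kg*K)
L_vap = 2.26e6      # latent heat of vaporization, J/kg
mass = 1.0          # kg of water in the kettle (assumption)
delta_T = 80        # K, from 20 degC to boiling

energy = mass * (c_water * delta_T + L_vap)  # total energy required, joules
power = 50e3                                 # 50 kW laser, as stated above
seconds = energy / power

print(f"{energy/1e6:.2f} MJ, {seconds:.0f} s")  # ~2.59 MJ, ~52 s -- about a minute
```

So "a minute to boil a kettle" is about right for an ideal 50 kW beam; real-world absorption losses only make it longer.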
At this point no one is talking about using lasers to defend against hyper-sonic missiles (at least not anywhere near the target). All of the current laser systems require being focused on the targets for some amount of time to "burn through", which means they are only suitable for lower-speed targets (drones, cruise missiles, and some low-level ballistics).
You would need significantly stronger lasers to "burn through" something moving that fast.
For completeness I should mention that there was quite some work on trying to get laser defenses against ballistic missiles in their "boost" phase (when they are launching, and thus slow enough to track a point on the missile), for example Ronald Reagan's "Star Wars" defense system. These would have been space-based (some of the testing involved mounting on 747s, but I don't think that was ever an end goal), but never made it near production.
Lasers will probably only be used for point defense against drones, which isn't useless, but they aren't the cheap future panacea everyone seems to think.
None of that refutes anything that was said. macOS is a third-class citizen measured by market share, and the total sum of annual Mac profits is lower than what the iPad ecosystem makes in a year.
Consumers do not want the Mac. Datacenters don't want Apple Silicon. People want the iPhone, they want AirPods, but the M-series Macs have spent 5 years changing absolutely nothing.
> and the total sum of annual Mac profits is lower than what the iPad ecosystem makes in a year.
So the company that makes 50-60% of all profits in personal computers has created a market where it makes 100% of the profits, albeit one smaller than the whole PC market. That's terrrrible, what was Apple thinking!
Market share is far from everything when people live in poverty and do not have money to spend on good hardware and software. Apple makes stuff for affluent people, and then makes a ton of money from those rich folks. Making Apple the most valuable company in the history of humanity. Boy, that's a terrible place to be in!
I shouldn't have to repeat myself; this still doesn't refute the claim that Apple has ceded the consumer compute market. Cheap Macs have flooded the used market for years, and people still gravitate towards plastic Wintel boxes and Chromebooks.
> Apple makes stuff for affluent people
is just repeating the original claim upthread:
>> they've priced the consumer/pro-sumer out of the market pretty much and so B2B is the more sustainable paying population.
The fact that Apple maximizes for profits, and does not care about market share, does not mean it has ceded the market at all. It's the exact opposite. Apple's making money akin to the #2 position while being #4, and that's an issue for you?
Once again you retreat to anecdata; how can you prove that used Mac laptops are not popular?
No, treating people with hostility and escalating the situation only makes it more likely that someone will snap and attack a cop.
People generally do not shoot at cops because, whether or not they hit the target, doing so is pretty much signing their own death sentence. All cops have to do to protect themselves is not provoke people into fight-or-flight reactions.
> What is cheap is nuclear, what is expensive (and requiring massive subsidies) are intermittent renewables.
This has not been the case for a while now.
A specifically German source from 2024, chosen to avoid questions about regional variation: only "small rooftop" PV (both with and without batteries) even overlaps with nuclear, and then only at the low end of the cost range for nuclear:
I suspect we're still going to get new nuclear reactors, but for the radioisotopes and not because of any questions about cost or supply diversity or dunkelflaute.
That "source" is complete BS. It is a theoretical model that models LCOE (Levelized Cost of Electricity), pure generating cost. It does says nothing about the real world, nor does it model the cost of electricity, which has to include system costs. See
Those system costs rise dramatically with the share of intermittent renewables in a grid, making LCOE almost completely meaningless and downright misleading for such a grid. Full Levelized System Cost of Electricity is a better measure.
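For reference, the standard LCOE definition makes clear what it does and does not capture: it is discounted lifetime plant costs over discounted lifetime generation, with no term for storage, grid expansion, or backup capacity:

```latex
\mathrm{LCOE} \;=\; \frac{\sum_{t=1}^{n} \dfrac{I_t + M_t + F_t}{(1+r)^t}}{\sum_{t=1}^{n} \dfrac{E_t}{(1+r)^t}}
```

where $I_t$, $M_t$, $F_t$ are investment, operations/maintenance, and fuel expenditures in year $t$, $E_t$ is the electricity generated in year $t$, $r$ is the discount rate, and $n$ is the plant lifetime. Every one of those inputs is an assumption, which is exactly where the criticism below comes in.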
The discrepancy can also be seen in the real world, where, despite falling generating costs for intermittent renewables, electricity prices correlate extremely strongly with the share of intermittent renewables in a grid.
The check against the Real World™ is also another item where this Fraunhofer piece of disinformation fails. They claim costs of 13-49 cents for nuclear.
However, in the Real World™, nuclear power plants profitably produce for 3-5 cents in Switzerland (world-renowned for its low, lower, lowest prices!).
So something very obviously does not add up.
When you look at the model more closely, you see exactly what doesn't add up: all their assumptions going into the model are lopsided, to the point of being absurd:
1. For construction costs, they take as the baseline not the average of current construction costs, nor the median either. They take the highest construction costs seen up to that point. As the baseline. And then increase it. This is statistical malpractice even without knowing any details. And the details make it even more ridiculous: their baseline is a FOAK (first-of-a-kind) build, a prototype; the reactor design (EPR) has been discontinued by the manufacturer because it was too hard to build; and the country (Finland) had to rebuild its nuclear construction know-how.
2. For the lifetime of the plant, they assume 40 years. Nuclear reactors in the US, for example, are all being extended to 80 years, which is also the design lifetime of modern reactors. This obviously doubles the Capex per kWh produced.
3. They assume comically low utilization, down to only a 20% capacity factor. For reference, the US nuclear fleet has a capacity factor of over 90%, as did the German fleet. The reasoning they give is that they want nuclear to run only as a backup to the intermittent renewables. Which is ridiculous, because it allows the intermittent renewables to shift the cost of their intermittency onto nuclear.
4. They also assume much higher capital costs than they do for intermittent renewables. Not sure how they justify this, possibly with the fact that the guaranteed subsidies for intermittent renewables make them a low risk.
5. To balance things out, they assume higher utilization rates for wind and solar than have ever been achieved in the Real World™. And those utilization rates are decreasing, not increasing.
Put those components together and you get around a factor of 10. So this "study" makes assumptions that make nuclear look around 10x more expensive than it should. Which, surprise surprise, pretty much matches the discrepancy between what this study shows and the evidence from the real world.
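Taking the comment's own numbers for just two of the items above, the compounding alone gets close to that order of magnitude (the construction-cost and capital-cost items are not quantified here, so they are left out):

```python
# Compounding the cost-inflating assumptions from items 2 and 3 above.
# These factors multiply because capex is spread over lifetime kWh produced.
lifetime_factor = 80 / 40       # 40-year assumed lifetime vs. 80-year extensions -> 2x capex/kWh
utilization_factor = 0.90 / 0.20  # ~90% real-world capacity factor vs. assumed 20% -> 4.5x

combined = lifetime_factor * utilization_factor
print(combined)  # 9.0 -- already close to the "factor 10" claimed above
```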
Just for LCOE.
You then have to add the system costs.
Pretty much the entire rest of the industrialized world is getting into nuclear, expanding their nuclear programs or even ditching their nuclear exits (most recently Belgium and Taiwan, with Switzerland on the cusp). There is a pledge to triple nuclear power endorsed by essentially all the economic powers in the world, and they are also following through.
Germany will get there in the end, but we have always been more stubborn in insisting on our mistakes. The longer it takes, the more expensive it will be.
But they're not all in. The politician who is being attacked in this very article for being insufficiently pro-nuclear is also against wind power.
And if the author believes the non-mainstream claim that nuclear is cheaper than renewables then they could have stated that rather than mysteriously avoid the topic.
Germany is still all-in on the failed Energiewende. Last I checked, this is due to the coalition agreement with the SPD, who have not yet managed to come to terms with their mistake.
The problems of the intermittent renewables wind power and solar are inherent, they have nothing to do with being "for" or "against" it. Stating simple facts is not being "against" something.
Well, unless you are such a blind-eyed partisan that any criticism of your religion, no matter how justified, is viewed as heresy. We don't have such blind partisanship here, do we? I mean apart from the crucial topics of vi vs emacs and static vs dynamic typing?
And nuclear being cheaper is not a "non-mainstream view". It is a simple, easily checkable fact.
Let's say that I'm in the same info-bubble as you and the OP and we all agree that renewables are more expensive than nuclear and furthermore that this point is so well accepted that it's not even worth mentioning.
Still in that case, it's kind of weird to attack a pro-nuclear politician for not replacing deadly coal fast enough without even mentioning that he is against wind as well.
Or to avoid any reference to the exact same thing happening in your own country.
That "info-bubble" is also known as The Real World™.
Adding more intermittent renewables at this point is useless. Last year we installed significant new wind capacity and output from wind for that year actually dropped.
Although the weather did play a role (which is bad enough by itself!), this is not an anomaly: we have reached a point where we have way more intermittent renewable capacity than demand, so when the sun shines and the wind blows it already needs to be curtailed. Added capacity does not add output here, it just adds to the curtailment.
And when the existing intermittent renewables don't produce, the newly installed capacity also does not produce.
Capacity factors for both wind and PV have dropped, and will continue to drop even faster. And they still need full backup with fossil fuels, because the guaranteed production is either exactly zero or close to zero.
> They have shown that the information density in our existing languages is extremely low: small prompts can generate very large programs.
"Write a book about a small person who happens upon a magical ring which turns out to be the repository of an evil entities power. The small person needs to destroy the ring somehow, probably using the same means it was created"
Small prompts leading to large programs has absolutely nothing to do with programming languages and everything to do with the design of the word generators used to produce the programs — which ingest millions of programs in training and can spit out almost entire examples based on them.
The "likely" in "likely ...to land safely" and "likely to work fine" is not nearly good enough.