Looking at tweets like [1], it seems their original bet was that server single-core performance would dramatically outpace client single-core performance. Honestly, even if that had happened, I still don't think their product would have made sense. The hosting cost and bandwidth/latency requirements would still have killed it.
The death of this product is a good sign for computing overall. The more of computing that happens on the client the better. Clients are (at least sometimes) under user control. When stuff moves to the cloud users give up control and that's a bad thing. Long live fat clients!
The product, "compute pixels on powerful machine -> render on thin client", makes much more sense if you don't try to monetize by being a cloud landlord, but you let the users own that powerful machine, make money through hardware, and help them with the software, make money through support, no subscriptions, no rent-seeking.
Think at the level of a family: 2 parents, 2 children, which means 4 smartphones, 2÷3 laptops, 1÷2 desktops, 1÷2 tablets, each of them $1,000+, which means somewhere between $8,000+ and $11,000+, replaced every 2÷4 years. Instead of 8÷11 screens, each with its own beefy CPU/GPU/RAM, you could have 8÷11 thin clients (at most $200 each, allowing for fancy cameras and larger batteries) and one powerful machine, somewhere between $5,000 and $10,000, dual CPU, dual GPU, etc., which can last 10+ years (think of all those refurbished servers with 2012 Xeon E5-2620s which work great even today and have no reason not to work perfectly fine in 2032).
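To make the napkin math explicit, here's the same arithmetic as a quick Python sketch; every figure is the assumption from the paragraph above, not market data:

```python
# Napkin math: a per-device fleet vs. thin clients + one fat server.
# All prices are the assumptions from the comment above, not market data.

devices_low, devices_high = 8, 11        # phones + laptops + desktops + tablets
per_device = 1_000                       # $1,000+ each, replaced every 2÷4 years
fleet_low = devices_low * per_device     # $8,000
fleet_high = devices_high * per_device   # $11,000

thin_client = 200                        # generous: fancy camera, larger battery
server_low, server_high = 5_000, 10_000  # dual-CPU/dual-GPU box, ~10-year lifespan

thin_low = devices_low * thin_client + server_low      # $6,600
thin_high = devices_high * thin_client + server_high   # $12,200

# Amortized per year: fleet replaced every ~3 years, server lasting 10.
fleet_per_year = fleet_low / 3                                   # ~$2,667
thin_per_year = devices_low * thin_client / 3 + server_low / 10  # ~$1,033

print(fleet_low, fleet_high, thin_low, thin_high)
print(round(fleet_per_year), round(thin_per_year))
```

The upfront totals land in the same ballpark, but amortized over the server's longer lifespan the thin-client setup comes out well ahead.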
Bandwidth/latency is an issue, but as 5G+ technologies advance it's just a waiting game.
Long live thin clients, where the user owns the upgradeable fat server that renders for the thin client!
I think something like this could make sense as a place to run large machine learning models locally. I don't need a server to run my browser or my apps; they run just fine locally. But a server to run large language models, Stable Diffusion, Whisper voice recognition, etc. would be useful, as these kinds of models can run much faster and at higher quality on a beefy GPU than they ever will on a phone.
The endgame of these models is an agent that knows practically everything about me and can perform tasks on my behalf, which I would really prefer to live in my house running on hardware I own, rather than in a data center under someone else's control.
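A sketch of what the thin-client side of that could look like; the hostname, port, endpoint, and JSON shape here are all invented for illustration, not any real API:

```python
# Hypothetical thin client talking to a model server on the home LAN.
# Hostname, port, endpoint, and payload shape are made up for illustration.
import json
import urllib.request

HOME_SERVER = "http://homeserver.local:8080/v1/generate"  # assumed endpoint

def ask(prompt: str) -> str:
    """Send a prompt to the home GPU box and return the generated text."""
    req = urllib.request.Request(
        HOME_SERVER,
        data=json.dumps({"prompt": prompt}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["text"]

# The phone stays a dumb terminal; the heavy lifting happens on the home GPU.
print(ask("Summarize my unread email."))
```

The point is that the phone only moves small requests and responses around; the model weights, the execution context, and your data never leave hardware you own.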
Precisely. Data will become the ultimate material; from chemputers [1] to neural laces, it's all data, and having the user be the true, absolute owner of it, as in actually owning the database, is paramount for any sane future use and development of technology. More thoughts and some code on this in [2].
Once the user owns a server capable of storing the databases and the function execution contexts (see deserve functions) for all their services, also having pixels rendered there and sent to thin clients is just a matter of convenience and security: no more worrying that the device you carry all day might be broken or stolen; it's just a low-cost piece of glass with an uplink to your server.
I find it deeply unsettling that you'd consider it normal/common to upgrade four-digit-priced devices (is that all Apple?) every 2 years, or that folks would need so many different form factors in the first place.
Just thinking of the literal mountains of ewaste gives me goosebumps.
It's pretty much what happens in a regular household at or around the median income [1] [2].
Yes, we are currently generating way beyond ridiculous levels of waste, electronic and beyond: we throw away globally 1.4 billion tons of food per year [3] in a world with 822 million people suffering from undernourishment [4]. If we are this bad with something as vital as food, of course we are worse with devices.
The lower limit of the example budget above was $8,000, for a family of 4 over 2 to 4 years.
"Apple generated $365 billion revenue in 2021, 52% came from iPhone sales" [1] Taking Apple as example, those billions have to come from someone. The greater point was that instead of having the billions try to make every single device the most powerful CPU/GPU of the current Moore's Law peak, because no one wants a slow device, you might just as well have a bare glass able to render pixels computed on a remote powerful machine, but that machine must be owned by the user. And you don't stream an app, a browser, you stream an entire OS, on the spot, on request.
Picking screen statistics from 4-5 years ago makes for a pretty terrible metric - I have maybe 14 screens, but only 2 or 3 of them see active use or are hooked up to anything.
My oldest (working) devices are already all closing in on or past a decade. This metric is flawed in many ways...
"Adults spend $1,200 a year on consumer electronics and own, on average, 25 CE products per household, according to a study by the CEA." [1] In a family of 4 [2], this can mean somewhere around $3÷4,000 per year, or $6÷8,000 every 2 years.
I can think of quite a few electronics I spent money on this year that did not include any screen. Again, these stats are misleading, maybe dishonest.
My "device" budget we'll call it has been less than 1000$ over the past 5 years, and that is with spending a fair amount on non-screen decives, which are electronic in nature but not investments in dedicated computing like you would have us beleive.
"Consumer electronics" consist of "telephones, televisions, and calculators, then audio and video recorders and players, game consoles, mobile phones, personal computers and MP3 players", "GPS, automotive electronics (car stereos), video game consoles, electronic musical instruments (e.g., synthesizer keyboards), karaoke machines, digital cameras, and video players", "smart light fixtures and appliances, digital cameras, camcorders, cell phones, and smartphones", "virtual reality head-mounted display goggles, smart home devices that connect home devices to the Internet, streaming devices, and wearable technology." [1] Most of them have screens.
As a general heuristic, one single person's decision tree is rather irrelevant in the long open game of average spending behaviours.
"one single person's decision tree is rather irrelevant in the long open game of average spending behaviours."
Sure, but the collective is what matters; you can't make such precise arguments from such general behaviors. As you put it, it's not about a single person's decision tree.
Your own metrics disagree with your original point.
My old gaming computer from 2013 still plays 2022 games on medium-high settings just fine. You can make computers run for a long time if you put in the effort.
Precisely: users, by definition, care only about the user interface. The matmul machine can be hidden in any box; so far it has simply been tied to the same box as the screen. Hence the greater point that thin clients become relevant once the user owns the matmul box.
Most people who upgrade on a cycle also sell on a cycle - if you buy the new hotness today, and sell it in two years, it'll still be relatively hot. The old one isn't thrown away, it's just sold to someone else.
People do it with cars (even if they're not leasing, which is explicitly doing it).
And even among households that do have so many devices, aren't the kids' devices typically "replaced" with hand-me-downs from the grownups rather than with totally brand-new devices?
It was just a bit of napkin mathematics seasoned with some quickly searched statistics of averages. The greater point is that thin clients become truly relevant when the user also owns the rendering machine, think of self-cloud.
"It also has other uses in a variety of specialist contexts." [1] Language is complex and signs often have more than one signification, but one must have curiosity and endeavour beyond what stands prima facie.
With this specific meaning of "range between"/"closed interval", it is often used in Eastern European Italo-/Francophone languages and texts. A short history of the obelus can be found on StackExchange [1].
Yes! This is the same reason I'm glad Stadia died. "Cool demo" does not equal "practical business", and local compute will always be superior to cloud compute.
What? Stadia worked great (I say this as a user). What didn't work was the business model, the lack of games, and the lack of trust in Google to invest in fixing those other things.
It worked flawlessly for me for years over Wi-Fi. GeForce Now, which I currently use, is rockier, being more bandwidth-sensitive and not as controller-friendly, since you're really using a PC rather than a console, as Stadia was.
Even under ideal conditions, where most people were claiming it worked great, I never found the latency even remotely acceptable. Maybe it's due to my preference for esports titles, which require all the responsiveness possible, but the point stands. Remote rendering would never be able to achieve anything even close to the performance needed to be competitive with on-client rendering.
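To put rough numbers on that (a back-of-the-envelope sketch; every figure below is an illustrative assumption, not a measurement):

```python
# Back-of-the-envelope input-to-photon latency, local vs. remote rendering.
# Every figure here is an illustrative assumption, not a measurement.

local_ms = {
    "input sampling": 4,           # USB/Bluetooth polling
    "game + render (240 fps)": 4,
    "display scanout": 4,
}

remote_ms = {
    "input sampling": 4,
    "uplink to server": 15,        # one-way network, good conditions
    "game + render (60 fps)": 17,
    "encode video": 5,
    "downlink + jitter buffer": 20,
    "decode video": 5,
    "display scanout": 4,
}

print(f"local:  ~{sum(local_ms.values())} ms")   # ~12 ms
print(f"remote: ~{sum(remote_ms.values())} ms")  # ~70 ms
```

Even with generous network assumptions, the remote path carries tens of milliseconds that the local path never pays, and that is exactly the margin esports players notice.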
Agree with the co-commenter; I have never seen anyone who used it for low-latency apps say it was sufficient. The target market for displacing e.g. GPUs is heavily geared towards low latency.
It might have been OK for mobile games, but then they were just competing with themselves, with an arguably inferior product (if I can't play my mobile game anywhere, only in the vicinity of my house where I already have my high-powered rig, what's the point?).
I do think it was an attempt to defeat a very legit, widespread use case to try to justify pushing thin clients on everyone; glad it failed.
My primary feeling is that I think technical people drastically overestimate the pain that most "normal" people experience when it comes to browser performance.
I see it all the time on HN, e.g. people bitching about the amount of RAM Electron apps take and the like. Who cares? The average user certainly doesn't.
If anything, 98% of the time when I experience browser performance issues it's network-latency related, or the fact that some page is loading 300 ad-tracking scripts and one of them is accidentally blocking. The only time I really notice client-side execution performance is when someone posts a cool 3D browser example on HN and things slow down a bit on my phone when there are a couple million polygons or whatever. Even then, things are fine on my laptop.
100% - people cared about browser performance and usability until Firefox and Chrome solved 99% of the problems users would ever face (tabbing, consistent rendering, security, beauty(!)) and now it’s a market that’s immune to change.
My only use for Chromium is YouTube. I only watch a handful of channels, but things are getting ridiculous. Sometimes I get ads every minute. Every minute! Just the other day I was served 5 ads in a row after a simple fast-forward. I gave up TV more than 20 years ago, and this crap is way worse.
What can I do without letting them in my actual browser? These people are crazy.
This, as much as people may moan, is the answer. Either that or go subscribe to Nebula or whatever not-YouTube platform your favourite YouTube channels are publishing to.
Content production costs money, in some cases vast amounts of it, and while advertising is a terrible solution to that just not paying is an even worse one. If everyone blocks ads then nobody gets paid, and nothing gets produced.
It's not just about moaning; it's about paying it forward. I'm not sure when people decided everything should be free. Under a certain income, sure, we should be subsidized, but paying for things is a way of voting for the way you want to see the world.
If you don't pay you don't have a choice. "Not using" is not enough, because you still need to support what you do use.
I pay for Premium but still have to use Vanced to avoid the sponsor segments that are in literally every video now. I totally understand why people wouldn't bother with Premium until YouTube offers a native SponsorBlock-like alternative.
> I see it all the time on HN, e.g. people bitching about the amount of RAM Electron apps take and the like. Who cares? The average user certainly doesn't.
It is our job to care so that non-IT people don't have to.
I think you got your greater-than sign the wrong way around? Consider that a lot of non-tech people don't constantly upgrade and still rock 4 GB systems; at least an eighth of that resource is a big ask for a sloppy app (unless it is the only app they use on the device).
I am an insider to this space and I think you are probably an outsider.
That link doesn’t say what you think it does.
LCP before optimization is a cause of what? Take, for example, a complex product page: of course that will convert worse, because the product is complicated.
But not too simple! 0.5s performs as badly as 2s.
Anyway, in my real-world experience, conversion fluctuates so much across products that the range greatly exceeds the impact measured by aggregating conversions by LCP, so your interpretation is wrong.
98% of performance problems on my machine are traceable to a crappy Electron video-communication app (Skype) eating 4 cores and still having hiccups galore. Switching to a different app usually fixes the issue.
I'm pretty sure the average user cares about "hey this app sucks, but this other app gives me the same functionality without issue".
You can hope they never discover other apps for work-from-home, but frankly, if that's your business model, you're doing it wrong.
> My primary feeling is that I think technical people drastically overestimate the pain that most "normal" people experience when it comes to browser performance.
Where have you lived? Asking because “normal” varies a lot by geography.
> If anything, 98% of the time when I experience browser performance issues it's network-latency related
I guess if you are on HN, you wouldn’t be considered “normal”?
"I think technical people drastically overestimate the pain that most "normal" people experience when it comes to browser performance."
Eh, I think technical sales people overestimate how much users don't care. The feeling that "normal" users don't care is primarily driven by a narrow age bracket of folks who haven't had enough time to form an opinion (ages 20-25).
As this age bracket thinks of itself as "normal" and tends to live in its own bubble, it dismisses the general distaste for technology as being "not technological", when in fact the older generations have already formed expectations and recognize that technology is getting worse, not better.
I think this is true in one sense and false in another.
It's true because users often don't complain about or even consciously notice poor performance. So if you are trying to sell a product, advertising performance benefits doesn't work well. People don't think they care (except in extreme cases).
But it's false because even moderate performance differences actually have enormous and easily measurable effects on user behavior. So if you care about providing value, then performance is one of the most important things, and it's consistently overlooked because it doesn't sell.
People complained about the amount of RAM NeWS apps took up, too. In retrospect we realized NeWS was a missed opportunity to revolutionize networked-app UI.
Electron is the dream of NeWS for current-era hardware and software. In the future we won't even blink at the space it takes up.
It was one of the competitors to X back in the day. Basically all the window painting and client-side logic would be handled by an extended PostScript interpreter, which would receive and run the client code from the application, much like we do with HTML/CSS/JS today.
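For anyone who never saw it, here's a toy of the pattern in Python (purely illustrative: real NeWS shipped PostScript, and every name below is made up). The application ships a small program to the display server, which runs it next to the screen, so simple interactions need no network round-trip:

```python
# Toy NeWS-style pattern: the app sends code to the display server,
# which executes it locally. Illustrative only; NeWS used PostScript.

# The program the *application* would ship over the wire:
BUTTON_PROGRAM = """
def on_click(event, send_to_app):
    # Handled entirely on the display side, no round-trip:
    event["widget"]["pressed"] = True
    # Only the semantic action travels back to the application:
    send_to_app({"action": "clicked", "id": event["widget"]["id"]})
"""

class DisplayServer:
    """Stands in for the NeWS server: receives code, runs it near the screen."""
    def __init__(self):
        self.handlers = {}

    def install(self, widget_id, program_source):
        scope = {}
        exec(program_source, scope)   # the 'extended interpreter' step
        self.handlers[widget_id] = scope["on_click"]

    def click(self, widget_id, send_to_app):
        event = {"widget": {"id": widget_id, "pressed": False}}
        self.handlers[widget_id](event, send_to_app)

server = DisplayServer()
server.install("ok-button", BUTTON_PROGRAM)   # app ships the code once
server.click("ok-button", lambda msg: print("app got:", msg))
```

The browser does the same dance today: the server ships JS once, and hovers and button presses run client-side without a round-trip.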
[1] https://twitter.com/Suhail/status/1588906086459150337?s=20&t...