All the little tricks a CPU uses to speed things up, like branch prediction, out-of-order execution, parallel branch execution, etc., mostly cost more than writing code that doesn't need to rely on them in the first place. Branch prediction in particular shouldn't be leaned on too heavily, since it is actually quite a fragile optimization: seemingly meaningless changes to the code can cause relatively large performance swings.
I got ad-free Twitch years ago with Twitch Prime. And then they added ads back in for Prime users. How long will it be until they start adding ads back in for Turbo users?
I think the original intent was referring to the fact that given the same operands, the order of multiplication and division shouldn't matter: e.g., A * X / X should give the same result as A / X * X, but in reality they can give different results when done by a computer due to precision limits, overflow, etc.
This is why I think the trade-school model of teaching software development should really be developed further. Having been in charge of hiring software developers a number of times in the past, I have found that a Bachelor's in Computer Whatever tells me barely anything about whether you can actually program. The most it tells me on paper is that you have seen some amount of code at some point in your life and know a few buzzwords.
This isn't to say that a college degree is worthless, but it is not a job-training tool, and the sooner people realize that the better. I would love to see a world where software development is treated more like some of the trades. Rather than insisting that all applicants have a BS in Computer Science or equivalent, we should hire promising kids fresh out of high school (or college) and fund a one-year programming education plus apprenticeship. I feel like that would lead to a healthier industry as a whole and bring a lot more highly talented individuals into the world of programming.
And no, web dev bootcamps are not what I'm talking about. They are closer to my ideal than a college, but they have a lot of other problems that keep them from being useful in general.
It's one of those things where if it works in your area, it works great. I've been on MVNOs for a decade now, and the worst I've ever experienced is that my data rate would sometimes drop to a couple Mbps in crowded areas, which has never been a material problem for me. I happily spend the $10-15/mo and get service that exceeds my needs. Sure, it won't work for everyone in all cases, but you shouldn't discount MVNOs out of hand just because they get second-class treatment on the host networks.
Yes and no. Real wages in US manufacturing have been on a steady decline for a long time now, and the cost of manufacturing goods overseas has been steadily increasing. After factoring in all the costs associated with importing goods from a place like China, the balance might even tip toward US-made goods being cheaper in the long run. That said, a lot of countries other than China still have the cheap slave labor we depend on to give us cheap stuff, and we just can't compete with that using our domestic workforce. Also, if local workers unionize and push wages back up toward where they would be had they kept pace with inflation, then local goods will be far more expensive. The cost of building new factories would certainly inflate a lot of short-term prices too.
All that said, I know a lot of people that would be willing to pay a substantial premium for well made goods that were produced domestically. Everyone has experienced just how terrible the goods we source from overseas often are, and if paying double or triple for something means that you get a useful lifespan that's 5 times longer, a lot of people are willing to pay that. There are also a lot of people that would happily pay for goods that they know for certain weren't made in a Sri Lankan sweatshop.
I think what we need are product reviews that aren't so easily gamed by $$$. As it is, you don't know if the premium you pay for a product translates into quality.
100k probably just isn't as dense as you'd intuitively expect. For a sense of scale, 100,000 pixels is just shy of 5% of the pixels on a typical 1080p screen, so even with a radius of a few pixels per star, there will still be quite a bit of negative space given all the overlap.
I think a "web app" should definitely be able to have some sort of more properly integrated filesystem API, but a "web page" has no business having access to such a thing. There just hasn't been a line drawn in the sand between the two, so every "web page" has all the capabilities of a "web app" by default.
Personally I wish there would be a meaningful line drawn between the two so that users could have a nice shorthand for allowing web pages to "upgrade" into apps which have access to things like WebGL and filesystem access. Such a thing would only have any meaning to power users and privacy oriented people though, and the general trend in browser design has been to spurn such users in favor of reducing friction for everyone else at all costs.
Nearly all of the security and privacy problems we have with the World Wide Web today arose because it went from a content-delivery platform (with deliberately limited interactivity) to a fairly complete app-delivery platform.
Javascript isn't the new Java. Web browsers are the new Java.
I would be very much in favor of a way to draw a line between "content" on the web and "apps" delivered by the web. I don't know what form that would take. But it will probably never happen because the FAANGs that run the web these days are actively opposed to any way to deliver content over the web that doesn't also let them include apps to track your activities online.
Maybe some kind of plain text machine-parsable-but-human-readable "protocol" that "applications" could use to specify the "content type" of some "resource" they want to "get"...
That line was blurred long ago, and the distinction between an "app" and a "page" is irrelevant today. Even news articles these days sometimes contain interactive elements like WebGL visualizations.
Personally, even though every new API inevitably gets abused, I don't think we should throw the baby out with the bathwater because there are tons of legitimate uses here. (In fact, I'm currently building something as a side project that would benefit greatly from these file system APIs.) My own worry is about complexity creep—specifically, the number of things I'll be expected to know and keep track of as a frontend engineer in another 10 years. But that's probably just me getting older. :)
> foobar.com wants to access your location. [Block] [Allow]
> foobar.com wants to access your camera and microphone. [Block] [Allow]
> foobar.com wants to send push notifications. [Block] [Allow]
Ideally these prompts are presented above the line of death, and clicking “Block” prevents future prompts, so you can’t get spammed.
Of course users click through these prompts without caring. Of course websites may try to unnecessarily block access if you don't agree. Of course websites make their own obviously fake prompts so that when you click "Block", they just present the obviously fake prompt again to waste your time.
But users already download and open random files and grant them admin privileges, and websites already spam you. The current notification system works and extending it to WebGPU and file systems is natural.
Exactly my thoughts whenever the pitchforks come out over advances in browser tech. Arguments denouncing such progress eventually leave me feeling like the only way to satisfy the naysayers would be to ban personal computers and allow communication only via physical paper.
If web apps are fully sandboxed by default, as they are today, then presenting the user with a UI for a web page wanting to upgrade to a (still sandboxed, permissionless) web app seems like a waste of the user's attention. Why should the user see a prompt just because a webpage wants to do some WebGL visualization (that doesn't put any of the user's data at risk)? It seems like the perfect recipe for permission-dialog apathy: when most of the dialogs are for nothing, users learn to click through them automatically, and then they click through the dialogs that actually matter just as automatically. I'm reminded of when IE used to warn the user about secure connections.
If they were, then tracking users via third-party cookies and other resources wouldn't be possible. Nor would it be possible for a web site in my browser to suddenly start taking up all of my CPU/RAM due to a programming error or malicious site such as a crypto-miner. For the relatively little isolation that does happen, sandbox-escape vulnerabilities seem to be getting discovered all the time.
Also, as a technical user, I want more control over what web sites can do with my computer than a non-technical user might.
The more holes you poke in a sandbox, the worse a sandbox it is.
Third-party cookies seem to be on the way out, thankfully. I agree that a permission (or at least some much better heuristic) should be necessary before a webpage can use too much CPU/memory.
> Why should the user see a prompt just because a webpage wants to do some WebGL visualization (that doesn't put any of the user's data at risk)?
Probably because there's no way to guarantee that it "doesn't put any of the user's data at risk". WebGL has been abused for browser fingerprinting, which itself puts the user's privacy at risk, and it also has a long history of very nasty vulnerabilities and exploits. It's been fully disabled in my browser for years because of the security issues.
Web browsers are not bulletproof. Their sandboxing implementations depend on OS features, which vary. Check a CVE database for your favorite browser and you'll find plenty of historic holes to dig through.
WebGL is one of the biggest fingerprinting vectors on the modern Web platform, and expands browser attack surface significantly. Most webpages should absolutely not have access to privileges like this.
But WebGL does put the user's data at risk. WebGL is an attack vector for fingerprinting, which produces data about the user that gets correlated with other data about them to stalk them.
While this isn't directly Google's doing, they do enable it for other services too: when I was on a road trip, I realized that I was going to need one additional hotel stay before getting home, so at 1 AM I was shopping around for a hotel on my phone and found a place that looked decent enough. I looked up their web page to get their phone number (since I don't trust the Google side-bar) to call them and everything seemed normal enough.
I clarified their pet policy over the phone, and they said I was fine to bring the pets I needed to bring: a large dog and a cat. So I thought everything was good. When I got there, though, the person handling check-in complained that I was violating hotel policy, which allowed only one pet, and only small dogs. I was confused, as I had specifically asked when booking whether they would be allowed and had been assured it was fine. Thankfully we were allowed to keep the reservation, given that we were keeping the animals crated while in the room.
When I got home, I compared the phone number in my dial history to the one on the hotel's website, and I noticed that they were different. After a lot of confusion, I tried replicating my steps to navigate to the hotel's website on my phone and realized that I had actually clicked on an "Ad" link on Google to get to the website, rather than the organic result. The webpages were almost identical, with the only difference being the phone number. Presumably the "Ad" link was put up by a booking site to funnel phone traffic away from the legitimate site.
Presumably because I was tired, I either didn't notice it was an ad, or I figured that it would be equivalent to the organic search result, but it could have potentially caused me to have to find a hotel that accepts big dogs with vacancy at 11 PM in a city I don't know, in addition to wasting my money since the reservations weren't refundable at that point.
tl;dr: Booking sites use Google ads to steal organic traffic meant to go directly to hotels, and one such ad caused me to unintentionally violate a hotel's pet policy.