> Are web pages much heavier than they need to be? Yes
What about the question "do web pages work any better than they did in 2007?", back when we were using full page reloads and server-side logic instead of JavaScript tricks?
I see so much basic brokenness on the web today, from the back button not working to horribly overloaded news websites with mystery-meat mobile navigation, that I find myself wondering what we have really achieved in the last nine years. This stuff used to work.
I think you are looking at the past through incredibly rose-tinted glasses. The web has been a mess for a long time, and we used to have to make sure our computer was set to 800x600 and we were using Internet Explorer 6 in order to even use it.
And it is not rare now either. Nothing has changed in that regard. Things have just gotten massively slower, use insane amounts of CPU, and are less functional.
I know that. The date range was not exclusive. The point was that high resolution CRTs date quite a bit back even for consumer use. We had no problems browsing the web.
Which network and which latency? My local network runs audio way below the Haas limit - I can record over the network without incurring any latency penalty.
Comparing local-network performance with "random" cross-internet traffic isn't very useful IMHO, because internet latency spans an enormous range.
My wired desktop gets DNS responses from 8.8.8.8 nearly as fast as if the server were on my local network, in well under 10 ms, with ping responses in 2 ms or so. Accessing websites hosted in, e.g., Korea takes >100 ms.
Add a congested wireless connection somewhere (WLAN or mobile network) and you can add another few hundred ms. And neither cross-continent nor congested wireless latency is going to go away.
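The kind of resolver timing described above is easy to reproduce. A minimal sketch in Python, using only the standard library: it times a name lookup through the system resolver stack (note this may hit a local cache, so it measures the resolver path the browser actually sees, not raw round trips to 8.8.8.8 specifically).

```python
import socket
import time

def dns_lookup_ms(hostname: str) -> float:
    """Time a single name resolution for hostname, in milliseconds.

    Goes through the OS resolver (socket.getaddrinfo), so results
    reflect caching and the locally configured DNS servers.
    """
    start = time.perf_counter()
    socket.getaddrinfo(hostname, None)
    return (time.perf_counter() - start) * 1000.0

# A local name resolves essentially instantly; a cold lookup for a
# distant host can be orders of magnitude slower.
print(f"localhost: {dns_lookup_ms('localhost'):.2f} ms")
```

Running it against a mix of local and far-away hostnames makes the point of the thread concrete: the spread between best-case and worst-case lookup times is huge, and no amount of client-side cleverness removes it.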
Perhaps I should have been more explicit - comparing local audio traffic to local web traffic, there's a heck of a lot of difference. The difference is in the stacks.
I said audio. I provided this as a counterexample to the stated thesis of your post. There exist things that can be done over a network such that latency is not an issue. I am obviously not pulling data over a cross-continental link.
FWIW, the protocols I write at work can do a full data pull - a couple thousand elements and growing - in under half a second end to end. I don't know of any HTML/Web-based protocols that can even get close to that over localhost.
So yeah - we know the Web is an utter pig. My point is that it probably doesn't have to be.