
> Are web pages much heavier than they need to be? Yes

What about the question "do web pages work any better than they did in 2007?", back when we were using full page reloads and server-side logic instead of JavaScript tricks?

I see so much basic brokenness on the web today, from the back button not working to horribly overloaded news websites with mystery-meat mobile navigation, that I find myself wondering what we have really achieved in the last 9 years. This stuff used to work.



I think you are looking at the past through incredibly rose-tinted glasses. The web has been a mess for a long time, and we used to have to make sure our computer was set to 800x600 and we were using Internet Explorer 6 in order to even use it.


I'm pretty sure I had > 1024x768 graphics and have used nothing but Netscape/Seamonkey (because inertia) since the mid '90s.


Sadly, a lot of laptops come with only 1366x768 displays, 20 years later. What a lack of progress.


Laptops sit in a different corner (as in corner solutions).


>make sure our computer was set to 800x600

Thaaaaaaaat's nonsense. I had relatively high-res CRTs (1600x1200) in the late 90s and early 2000s.

My father and I were able to get by with Netscape Navigator and Firefox for quite awhile as well.


Here's a message from 2003. (scroll up)

https://groups.google.com/forum/#!msg/uk.comp.os.linux/5c40N...

> http://www.argos.co.uk

> "Sorry, the Argos Internet site cannot currently be viewed using Netscape 6 or other browsers with the same rendering engine.

> In the meantime, please use a different web browser or call 0870 600 2020 to order items from the Argos catalogue."

> Sorry, I think I'll shop elsewhere until you get it fixed...

Argos was sniffing the user agent. I think people tried changing the user-agent string, and the site worked fine.

This kind of thing wasn't rare, even in 2003.
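A minimal sketch of the kind of user-agent sniffing Argos was presumably doing (the function name and blocklist are hypothetical, not anything from the actual site): the server matches substrings in the User-Agent header, which is exactly why spoofing the string defeated the block.

```python
def is_blocked(user_agent: str) -> bool:
    """Naive user-agent sniffing, circa-2003 style: reject anything
    that looks like Netscape 6 or another Gecko-based browser."""
    ua = user_agent.lower()
    return "netscape6" in ua or "gecko" in ua

# A Netscape 6 UA string trips the check...
assert is_blocked("Mozilla/5.0 (X11; Linux) Gecko/20030312 Netscape6/6.2")
# ...while IE 6 sails through, and so does Netscape pretending to be IE.
assert not is_blocked("Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)")
```

Since the check never tests actual rendering capability, changing the advertised string is all it takes to get in.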


>This kind of thing wasn't rare, even in 2003.

And it is not rare now either. Nothing has changed in that regard. Things have just gotten massively slower, use insane amounts of CPU, and are less functional.


Even the first NeXT computer in 1990 was 1120×832...albeit greyscale...still...800x600 died in the mid-90s - especially for professionals.


Firefox was released in November 2004.


The initial release of Phoenix was in 2002. Many long-time Netscape/Mozilla users (myself included) switched very early on.


I remember even relatively computer un-savvy folk in '98 having 1024x768 monitors.


I know that. The date range was not exclusive. The point was that high resolution CRTs date quite a bit back even for consumer use. We had no problems browsing the web.


That was 1997, not 2007.


IE 6 was released with Windows XP in 2001 and IE 7 didn't come out until the end of 2006.


Opera was awesome even in 2000.


2007 I was working for a shop that only officially supported IE6. It was a health insurance company too. Your premiums at work!


This website itself is not far behind, weighing in at 937 KB and 57 requests: http://www.webpagetest.org/result/160422_KJ_18KN/1/details/


Network latency still hasn't caught up to the time resolution of human perception.

Client-side logic, done right, is a big improvement over a server-side solution.


Which network and which latency? My local network runs audio way below the Haas limit - I can record over the network without incurring any latency penalty.


Comparing local-network performance with "random" cross-internet traffic IMHO isn't very useful, because there is a wide range for internet latency.

My wired desktop gets DNS responses from 8.8.8.8 nearly as if it were in my network, in way under 10 ms, ping responses in 2 ms or so. Accessing websites hosted in e.g. Korea takes >100 ms.

Add a congested wireless connection somewhere (WLAN or mobile network) and you can add another few hundred ms. And neither cross-continent nor congested wireless latency is going to go away.
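The spread described above is easy to observe yourself. A rough sketch (hypothetical helper, not anyone's production code): time a TCP handshake to a host, which approximates one network round trip, and compare a nearby host against a distant one.

```python
import socket
import time

def connect_latency_ms(host: str, port: int = 443) -> float:
    """Rough RTT estimate: time a TCP handshake to (host, port).
    Not an ICMP ping, and it includes DNS resolution if `host` is a
    name, but it's enough to show local vs. cross-continent spread."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=5):
        pass  # connection established; we only wanted the timing
    return (time.perf_counter() - start) * 1000.0
```

On a wired connection, a nearby host typically comes back in single-digit milliseconds, while a host on another continent lands in the 100+ ms range the comment describes; a congested wireless hop inflates both.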


Perhaps I should have been more explicit: comparing local audio traffic to local web traffic, there's a heck of a lot of difference. That difference would be the stacks.


[flagged]


English much?

I said audio. I provided this as a counterexample to the stated thesis of your post. There exist things that can be done over a network such that latency is not an issue. I am obviously not pulling data over a cross-continental link.

FWIW, the protocols I write at work can do a full data pull - a couple thousand elements and growing - in under a half second end to end. I don't know of any HTML/Web based protocols that can even get close to that over localhost.

So yeah - we know the Web is an utter pig. My point is that it probably doesn't have to be.


Reading comprehension much?

The article was specifically about web page payload size. My comment was comparing UX of dynamic client-side logic vs full round trips.

You must have replied to the wrong comment, I would hope.



