Hacker News

bombcar · yesterday at 11:37 PM · 2 replies

You can somewhat "fix" this by using your slow link to connect to a VPS somewhere that then connects to the Internet, either via links or a similar text-mode browser, or other bandwidth-saving gateway.
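A minimal version of that setup, purely as a sketch: open a compressed SSH session to the VPS (hostname and user here are placeholders) and run a text-mode browser on the remote end, so only rendered text crosses the slow link.

```shell
# -C compresses the SSH stream; the heavy page fetch happens on the VPS's
# fast connection, and only the compressed terminal output crosses your link.
ssh -C user@vps.example.com

# ...then, in the remote shell on the VPS:
links https://news.ycombinator.com/
```

The same idea also works with `lynx` or `w3m` in place of `links`, or with `ssh -D` to expose the VPS as a SOCKS proxy for a local browser.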


Replies

ssl-3 · today at 2:55 AM

That's a lot of stuff that I don't want to deal with or maintain. It's simply beyond the tolerance of my gumption, so it's not going to happen. :)

What can happen, instead: I can dream.

In this dream, the process of loading a web page identifies the viewing platform well enough that the server delivers content shaped for it, so it can be downloaded quickly and displayed simply by the end-user device. It's not one-size-fits-all, or even one-size-fits-most: It's a pile of simplistic HTML and maybe some minimal JavaScript and CSS that is meant for whatever the user is using right now.
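A minimal sketch of that server-side shaping, assuming the server keys off the User-Agent header. The page constants, the token list, and the function name are all made up for illustration; a real server would use a proper device database rather than substring matching.

```python
# Hypothetical payloads: a heavy "modern" page and a stripped-down one.
FULL_PAGE = "<html><head><script src='app.js'></script></head><body>...</body></html>"
MINIMAL_PAGE = "<html><body><p>Hello</p></body></html>"

# Deliberately crude classifier: treat known text-mode or vintage browsers
# as constrained clients that should get the simple payload.
CONSTRAINED_TOKENS = ("Lynx", "links", "w3m", "Mosaic", "MacWeb")

def page_for(user_agent: str) -> str:
    """Return simple HTML for constrained clients, the full page otherwise."""
    if any(token in user_agent for token in CONSTRAINED_TOKENS):
        return MINIMAL_PAGE
    return FULL_PAGE
```

The point of the sketch is only where the decision lives: the branch runs on the server, so the constrained client never downloads the heavy payload at all.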

In this way, the same layout jiggering, varnicating, and transfabulating is done as it is today, but the work of doing so principally happens on the server instead of the client.

Also in this dream, I can hear people saying "But that's a can of worms!", and they're right. It's a damned mess -- but it's a mess either way. This just moves the mess from the client to the server.

I can also hear shouts of "But there will be hundreds or even thousands of layout paths!" And all I can think is: If there's a thousand unique device types hitting a given dynamic page, and that scales poorly with the server side doing the work, then that's a problem for the systems guys to direct instead of the web guys.

Which is fine: The web guys hacking away however they want is how we got into this mess of 20-megabyte JavaScript downloads just-to-view-a-web-page to begin with. They've quite broadly proven that they're shit at this kind of work, and in my ideal world they'd be relieved of that duty.

(And yeah, to be sure: After I wake from this dream I'm still going to go outside and yell at the clouds, just as I do every day.)

geerlingguy · today at 1:05 AM

In the article and my pi-isp project, I use MacProxy Classic to strip heavy stuff from web pages through a local proxy service running on the Pi. This helps a lot, but if a page has 20 MB of resources, it can only do so much (without completely disabling JS and images).
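MacProxy Classic has its own filtering pipeline; purely as an illustration of the stripping idea, here is a generic sketch using Python's stdlib HTML parser to drop `<script>` blocks and `<img>` tags before the page crosses the slow link. All names here are mine, not the project's.

```python
from html.parser import HTMLParser

class Stripper(HTMLParser):
    """Rebuild a page while dropping <script> blocks and <img> tags."""

    def __init__(self):
        super().__init__()
        self.out = []
        self.in_script = False  # suppress everything inside <script>...</script>

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.in_script = True
        elif tag != "img":
            attr_text = "".join(f' {k}="{v}"' for k, v in attrs if v is not None)
            self.out.append(f"<{tag}{attr_text}>")

    def handle_endtag(self, tag):
        if tag == "script":
            self.in_script = False
        elif tag != "img":
            self.out.append(f"</{tag}>")

    def handle_data(self, data):
        if not self.in_script:
            self.out.append(data)

def strip_heavy(html: str) -> str:
    """Return the page with scripts and images removed."""
    s = Stripper()
    s.feed(html)
    return "".join(s.out)
```

This is the "can only do so much" tradeoff in miniature: it shrinks the HTML itself, but it can't do anything about a page whose layout or content only exists after 20 MB of JavaScript has run.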
