Hacker News

snickerer · yesterday at 10:25 PM (2 replies)

Allowing scripting on websites (in the mid-90s) was a completely wrong decision, and an outrage. Programs are downloaded to my computer and executed without my being able to review them first, or to rely on audits by people I trust. That is completely unacceptable and fundamentally flawed. Of course you can disable scripts on websites, but some sites are so broken that they no longer work at all, since their developers apparently assume people only view their pages with JavaScript enabled.

It would have been so much better if we had simply decided back in the ’90s that executable programs and HTML don’t belong together. The world would be so much better today.


Replies

coin · yesterday at 10:31 PM

Stepping back, it's pretty ridiculous that I need to download executable code, often bloated, solely to view read-only content. Just render the thing on the backend and send it to the client.
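The "render on the backend" idea can be sketched in a few lines: the server assembles the finished HTML from its data and sends inert markup, so the client needs no executable code to view it. This is a minimal illustration, not any particular framework's API; the names `render_article` and `page` are invented for the example.

```python
import html

def render_article(title: str, body: str) -> str:
    # Build the complete page server-side. Escaping the data means the
    # output is inert markup: nothing for the client to execute.
    return (
        "<!DOCTYPE html><html><head>"
        f"<title>{html.escape(title)}</title></head><body>"
        f"<h1>{html.escape(title)}</h1>"
        f"<p>{html.escape(body)}</p>"
        "</body></html>"
    )

page = render_article("Hello", "Read-only content, no JavaScript required.")
```

The client receives `page` as-is and simply displays it; there is no script for it to download or run.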

SchemaLoad · yesterday at 10:40 PM

There is obviously huge demand for scripting on websites. There is no single authority on what gets allowed on the web; if the existing organisations hadn't implemented it, someone else would have, and users would have moved over once they saw they could access new, more capable, interactive pages.

The 49MB webpage just shows what our priorities are. It shows the target audience has fast internet that can load this without issues. On my average home connection in Australia, I can download a 49MB page in 0.3 seconds. We spend time optimising for what matters to the end user.
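The arithmetic behind page-load timing is simple to check with a back-of-envelope helper (assuming megabytes and megabits use the same prefix, so the factor is just 8 bits per byte; the function name is invented for the example):

```python
def download_seconds(size_mb: float, speed_mbps: float) -> float:
    # Transfer time = size in megabits / link speed in megabits per second.
    return size_mb * 8 / speed_mbps

# 49 MB over a 100 Mbps link takes 49 * 8 / 100 = 3.92 seconds;
# finishing the same page in 0.3 s would need roughly a 1.3 Gbps link.
```

As with any such estimate, real transfers also pay for latency, TCP ramp-up, and server time, so these figures are lower bounds.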