Interesting.
For Hurricane Helene specifically, my team at Newspack actually worked with Blue Ridge Public Radio and a number of other news organizations in the affected area to set up text versions of their websites for low-bandwidth readers[1] and get info to tens of thousands of people[2].
In fact, it was so successful (maybe not at reaching you specifically, though) that we got a grant to roll out a general-purpose plain-text web solution for breaking-news situations to news organizations across the country![3] So I think there may have been a mismatch in that you didn't know about all of the plain text versions of news sites available in your area during the disaster -- that's something we'll have to keep in mind.
[2] https://awards.journalists.org/entries/hell-or-high-water-bp...
The header image in the article is a 2400x1600 PNG that is 500KB in size, apparently due to subtle dithering making it hard to compress. Converting it to a visually identical .avif (quality 90, 12-bit color depth) takes it all the way down to 15KB.
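Once the .avif exists, plain HTML can serve it with a graceful fallback for older browsers; a sketch (filenames made up):
<picture>
  <!-- Browsers that understand AVIF fetch the ~15KB file... -->
  <source srcset="header.avif" type="image/avif">
  <!-- ...everyone else falls back to the original PNG. -->
  <img src="header.png" width="2400" height="1600" alt="Article header image">
</picture>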
Plain HTML and forms for interactivity can be very effective.
In fact, for a long time web forums were largely entirely usable without JS.
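A reply box, for instance, needed nothing more than a form that posts back to the server (the endpoint and field names here are hypothetical):
<form action="/reply" method="post">
  <textarea name="body" rows="6" cols="60"></textarea>
  <button type="submit">Post reply</button>
</form>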
See the degradation of GitHub for a great example. You used to be able to interact with most of it without any JS at all; browsing the code repositories, reading and replying to issues, etc. Now it barely shows anything without JS. Of course, I suspect in that case it's deliberate so they can trick you into running a few more tracking and analytics scripts...
Other Helene stuff I took note of:
- AT&T was completely down for us, but Verizon and its MVNOs were up
- I had a Verizon MVNO secondary eSIM that came free with a home internet plan, unused until the hurricane hit
- It worked pretty well!
- The day the Verizon disaster internet trucks showed up at the police station in our town, my Verizon MVNO internet went down
Non-internet learnings:
- Fill up your vehicle’s fuel tank or charge its battery before any big storm; we spent a lot of time siphoning and otherwise consolidating fuel to get ourselves and our neighbors out of town, particularly because we didn’t know how far we’d have to go to find a gas station with electricity
Vaguely related anecdotes:
- I got caught in the mountains for a few days due to landslides in Nepal. The only available information was relayed by phone between locals. People had no idea what was going on, and everyone's vacation ended the day the road reopened, which caused a pile-up of cars where the road had slid off a few days prior. In some parts, rocks were still falling from the cliffs above. We flagged down a passing car and asked them to keep us updated on WhatsApp instead. We could all have stayed put if we'd had that information earlier.
- During COVID I maintained a page with simplified local restrictions and a changelog of new ones. The alternative was to follow press conferences and re-read the entire regulation the next day, or to keep checking the newspapers. Mine was just a bullet list at a permanent location.
- During the invasion of Ukraine, refugees set up the most impressive ad hoc information network I have ever seen. It was operational within 24 hours and kept improving for weeks. People sorted out routes, transport, border issues, accommodation, translation, and supplies over Telegram, Notion, and Google Docs.
Information propagation is critical during emergencies, and people are really bad at it. Setting up a simple website and two-way communication channels makes a huge difference.
I think many of us in the web space who are old enough were shaped by the "mass outage" that happened on 9/11, when pretty much all news websites went down while we were looking for information. Slashdot was fighting valiantly to stay up and was one of the few sites where one could find information on the events (if you were in a spot without access to cable TV, you were very much in the dark). The web and the infrastructure are substantially different than they were 25 years ago, but I still get a bit of "what if" in the back of my head (not that I work on anything of that level of significance).
A related article I read yesterday lamented that 1GB of RAM isn't enough to run a graphical browser anymore (1). Sure, JavaScript runs fast now, but at the cost of the average website's code size being unnecessarily large: speedy JS and speedy network connectivity allow for more code and more network requests. Another example of Wirth's law.
This was the case when I got a Raspberry Pi 4 with 1GB of RAM in late 2019. You could run one tab of Chrome, but any more and it would be killed.
(1) https://log.schemescape.com/posts/hardware/farewell-to-a-net...
I live just south of Asheville in NC, and we were completely isolated after the storm for a few days. The only reason we were able to get out after a few days was that a fire truck had been abandoned at the bottom of our driveway as trees fell around it, so they came back as soon as they could to retrieve that resource. People just on the other side of those trees were unable to get out for about a week.
Our best source of information, even after we started to get a bit of service, was an inReach. I messaged a friend far from the region and asked them really specific questions like, "Tell me when I can get from our house to I-26 and then south to South Carolina."
Yes, absolutely, emergency information sites should be as light as possible. And satellite devices are incredibly useful when everything local goes down.
There is a movement supporting small websites. The links below may inspire those interested in text-only or small websites.
https://web.archive.org/web/20231208000921/https://10kbclub....
I find that sites designed around being small are usually nice to read, since the effort is put into the content, not the layout.
Additionally, a lot of great sites can be found through something like https://wiby.me/ or via different protocols like Gopher or Gemini.
Nice to see other fellow Western NC folks commenting here; I'm in Asheville. I did not know about all of these text-only versions of major news sites. I'm going to bookmark them.
What saved us from a news deficit after Helene was that we had 2 portable AM/FM radios. Both took batteries, and one could even be charged via a hand crank. I highly recommend having a portable AM/FM radio of some kind. Blue Ridge Public Radio (our local NPR station) was amazing during this time. Their offices are located right in downtown, which never lost power, so they were able to keep operating immediately after the storm.
I also feel this pain of bloated sites taking forever to load when I'm traveling. I'm on an old T-Mobile plan that I've had since around 2001 that comes with free international roaming in 215+ countries. The only problem is that it's a bit throttled. I know that I could just buy a prepaid SIM, or now use an eSIM vendor like Saily, but I'm too cheap and/or the service is just good enough that I'm willing to wait the few extra seconds. Using Firefox for Android with uBlock Origin helps some, but not enough (also, I just switched to iPhone last month). I've definitely been on websites that take painfully long to load because there's just so much in the initial payload. I don't think enough developers are testing their sites using the throttling options in the dev tools.
As a mild lifelong disaster "junkie" (grew up in remote areas; dealt with cyclones, floods, droughts, fires, monsoons, coups d'état, unannounced atomic tests, etc. during decades of global field work), this description:
> As a web developer, I am thinking again about my experience with the mobile web on the day after the storm, and the following week.
> I remember trying in vain to find out info about the storm damage and road closures—watching loaders spin and spin on blank pages until they timed out trying to load.
reminds me why we (locally) still rely on AM radio day in, day out, and will continue to do so for the foreseeable future.
One way to get to this is to start with almost-'94 HTML:
<!doctype html>
<html>
  <head>
    <title>Some Topic</title>
  </head>
  <body>
    <h1>Some Topic</h1>
    <p>Information goes here.</p>
    <p>Information goes here.</p>
    <p>Information goes here.</p>
  </body>
</html>
Then add a little non-'94 CSS styling. If you decide to add an off-the-shelf wad of CSS, like Pico.css, consider hosting it alongside your HTML (rather than turning it into another third-party dependency and CDN cross-site surveillance tracker). Minified, and single-request.
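Or skip the framework entirely; a few hand-written rules in a <style> block already go a long way (these particular rules are just one possible starting point):
<style>
  /* Readable line length and system fonts: no webfont,
     no framework, no extra request. */
  body {
    max-width: 40em;
    margin: 0 auto;
    padding: 0 1em;
    font-family: system-ui, sans-serif;
    line-height: 1.5;
  }
</style>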
It's stuff like this that makes me want to scurry right back into the SRE/sysadmin space. I'm currently back in webdev after a few years out, and I just feel like I'm being a pain in the arse when I comment "could we just write the HTML...?".
This very article loaded 2.49MB (uncompressed) over 30 requests. It's served using Next.js (it does actually load fine with JS disabled).
Ironically, this was a great opportunity for the author to make a stronger point. It could have gone beyond the abstract desire to "go back to basics" and demonstrated the idea by reworking itself to be served as plain HTML and CSS, without any big server-side frameworks.
It's been shared here many times, but Terence Eden has a great anecdote about how the UK's GDS standards (lightweight, simple HTML) meant the site was usable even on a crappy PSP: https://shkspr.mobi/blog/2021/01/the-unreasonable-effectiven...
One thing I'd also suggest folks who are resonating with this piece consider...
Local copies of important information on your mobile device. Generally your laptop is not going to see much use. Mobile apps tend to fake having local data while actually storing lots of things in the cloud, and we tend to ignore things like backups and local copies nowadays. Most of the time we can get away without any worry here, but consider keeping a copy of things like your medications and their non-commercial names for situations like this.
Yeah, you don't get one. The purpose of modern web development is to make sure the developer is intellectually satisfied by over-engineering something that should be relatively simple, justifying their pet projects. If any useful work gets done, that's a side effect, and most of the time an accident. (Just try to use State Farm's website for a great example.)
I can relate to this post. During the fires in Southern California last year, it was confusing and frightening to know that you're surrounded by fire but can't get any news or information due to degraded cell networks. We had no power and were just trying to load pages to figure out what's going on and whether we should evacuate. There were either no emergency alerts, or emergency alerts for irrelevant things.
Another endlessly frustrating aspect is, unfortunately, Facebook. For better or worse, it's become a hub of emergency information via local Facebook groups. In an emergency you want a chronological feed of information, and Facebook insists on engaging you by showing "relevant" posts out of order.
It's hard to understand the privilege bubble you're in unless you actively try to live like your users. My read of the current trend [1] is that building for your marginal users isn't prioritized culturally in orgs or within engineering. Unseating those ways of working in my experience has been immensely challenging, even when everyone can agree on methodologies to put users first in theory [2].
[1] https://infrequently.org/2025/11/performance-inequality-gap-... [2] https://crukorg.github.io/engineering-guidebook/docs/fronten...
I want a plain-text website every day, period. I'd even like a text site summarizing video from YouTube.
The author should look into Winlink. There are Winlink Wednesdays where Winlink is practiced. A lot of reports come out of Winlink, and it's all text. Of course, you need to be an amateur radio operator.
Many years ago, in pre-smartphone days, Google had a service I would use to search when I was away from the PC and needed info like a restaurant's phone number. You would text 46645 and it would send you search results. It was useful during hurricanes.
This should be required reading for anyone building government or emergency sites. During any crisis, people have spotty connections, dying phones, and zero patience for loading spinners. A plain HTML page with bullet points would save lives over a fancy React app that needs 5MB to render. The irony is we have better tools than ever to build fast sites, yet the average webpage keeps getting heavier. Somewhere we forgot that the web worked fine before JavaScript frameworks.
That bulleted newsletter list being the most useful thing says everything.
Reading this on airline wifi right now makes me realise just how unusable some stuff becomes with choppy internet. E.g. I can’t change settings on the LinkedIn app because the request to load the settings page fails :/.
Check out Newswaffle on the gemini:// protocol.
The web could in theory support text-first content, but it won't. The Gemini protocol, though not perfect, was built to avoid extensibility that inevitably leads us away from text. I long for the day more bloggers make their content available on Gemspace, the way we see RSS as an option on some blogs.
The web will continue to stray from text-first content because it is too easy to add things that are not text.
Some things you don't know people need until you're directly affected. For me, it was an injury-related light sensitivity that made me realise dark mode isn't just a frivolous addition for looks.
What you really want is a (mostly) JavaScript-free website. Run NoScript and cut out all the data broker bloat, allowing just a limited number of critical scripts to run. Adding LocalCDN will further reduce wasted transfers of code that you do allow. Then you can decide if you want to show images. The web will be much faster on a fast or slow link.
W3C did this
I'm not a web developer but a product manager, and recently I created my own website using Astro. I do support the point that it's better to have a relatively simple static website: it's really fast and lightweight! An average WordPress website weighs 1 MB+ in the best case, and 4 MB+ in the worst. I don't know how people think it's a good idea for a compressed webpage to be 4 MB!! Let's KISS.
I remember complaining about this around five years ago [^1], and it looks like not much has changed since, save for the number of people complaining about websites being full of garbage that serves no purpose, or static resources being bigger than they should be.
Nice work. Dealt with the same issue during Helene. Would be interesting to do things like converting to Morse code, or to a low-baud-rate modem-over-walkie setup.
I built this repo as a Helene response, trying to use an LLM to help get resources over text message. https://github.com/realityinspector/supply_drop_ai
Wonder if you could get news over SMS, using an LLM to compress it to minimum viable text?
> As a web developer, I am thinking again about my experience with the mobile web on the day after the storm
In some villages where plenty of stone was available, people used it for everything: roof slabs, pillars, walls, flooring, water-storage bowls, etc. Likewise, villages with plenty of wood around used it for everything.
As techies, we say there is an app for everything, or there is a web-technology for everything. When you have a hammer in hand, everything looks like a nail.
I have a tab in Sublime Text that has been open since the pandemic, and I think it's safe to share my idea since I'm not gonna do it.
*4KB webpage files*
So: a website where each page does not exceed 4KB, including whatever styling and navigation is needed. Surprisingly, you can share a lot of information even with such constraints. Bare-bones HTML is remarkably compact, and the browser already does a lot of the heavy lifting.
Why 4KB? Because that's the default memory page size on x86 hardware, so you can grab the whole thing in one chunk.
This whole comment isn't even 1KB.
There's a good talk from Jeremy Keith about building resilient websites:
1. Identify core functionality.
2. Make that functionality available using the simplest technology.
3. Enhance!
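A minimal sketch of that recipe, with a hypothetical /search endpoint: the plain form is the core functionality, and the script is pure enhancement that can fail to load without breaking anything:
<form action="/search" method="get">
  <label>Find shelters: <input type="search" name="q"></label>
  <button type="submit">Search</button>
</form>
<script>
  // Step 3, the enhancement: fetch results in-page instead of reloading.
  // If this script never arrives, the form above still works on its own.
  document.querySelector('form').addEventListener('submit', async (event) => {
    event.preventDefault();
    const q = new FormData(event.target).get('q');
    const response = await fetch('/search?q=' + encodeURIComponent(q));
    document.body.insertAdjacentHTML('beforeend', await response.text());
  });
</script>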
For https://cartes.app, I'm planning to code an extremely light version of the map tiles. Based on the initial network speed, the site would decide to load this style first, which just lets the user see the streets and important places, then load the rest.
This initial map style would be the equivalent of a "text-only" website.
I'm blocked by bugs in Vercel's Turbopack bundle analyzer though, because before optimizing the tiles, I need to optimize the JS that loads the tiles.
I haven't figured out a way to load a MapLibre map server-side, so the JS must be ready before the map starts to load.
Talking about text-first sites: https://wordgag.com brings me a lot of joy every day. They also update their funny quotes collection regularly.
Oh for the time when chairs were still for sitting and PDFs were still for printing.
Restaurant websites were mentioned: the majority of restaurant websites I've encountered were much more annoying and difficult to read than a PDF, even on a small phone screen. Or should I say, especially on a small phone screen. Some would make a 32-inch monitor feel cramped.
This used to actually work, at least on some sites. The text would load first, then reformat as the fonts and CSS assets loaded. Ugly and frustrating, which is probably why you now don't get content until the eye candy is all ready.
But progressive, text-first loading was readable from the get-go, even if further downloads stalled.
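CSS can still opt you back into that text-first behavior today; a sketch, assuming a self-hosted webfont (the font name and file path are made up):
<style>
  @font-face {
    font-family: "Body";
    src: url("/fonts/body.woff2") format("woff2");
    /* Render real text in a fallback font immediately,
       then swap in the webfont whenever it arrives. */
    font-display: swap;
  }
  body { font-family: "Body", system-ui, sans-serif; }
</style>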
Prepend 'pure.md/' in front of any URL.
I'm sure there are more proxies around.
Perhaps this is something FEMA could try to encourage and lightly enforce. No new technologies needed, just a mandatory "old web" option for key .gov, state and news sites.
I built a connection to a web-powered LLM over SMS/iMessage for literally this purpose. While traveling I’d have really bad or sparse service but still needed to find my way around.
Actually, there are readily available tools that can do this. Many websites implement accessibility features for blind users, allowing all on-screen text to be read aloud, along with alt text for accompanying images. These features might be hidden, and many people are unaware of them.
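For instance, a well-written alt attribute (this example is made up) carries the image's information even in a text-only or screen-reader rendering:
<img src="road-closures.png"
     alt="Map of road closures: I-26 open south of Asheville; US-25 closed at the river crossing.">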
CMSs should normalize having a text-only version of your website.
I think with a little effort they could make it pretty frictionless for their users, who in turn would be happy to provide it.
> 66 requests
> 5.1 MB transferred
Ironic.
I'm on 600 Mbps fiber with low latency. Sometimes, I can't be arsed to load the websites linked on HN and simply head straight into the comments. For example, when it's a link to Twitter, I get an endless onslaught of in-site popups: cookie banners, "sign in with Google", "sign in to X", "X is better on the app", and so on and so forth. Meh. I'll sometimes just stick to HN, especially when I'm on my phone on the sofa or something.
Give me a minimal, plain-text website every day; it's not just the link speed.
> I was struck by how something as simple as text content could have such a big impact.
Truly a sign of our times
I now use a text-only CLI utility to read the BBC news. It is (for me) a greatly improved experience.[1]
Check out Reticulum and NomadNet; they meet these needs perfectly!
Several news sites offer text only versions.
https://lite.cnn.com/
https://text.npr.org/
https://wttr.in/
More listed at https://greycoder.com/a-list-of-text-only-new-sites
It’d be great if there were some standard that allowed these to be easily discovered, and supported by local news sites.