Building a sub-512KB website is trivial.
Just don't use external trackers, ads, fonts, or videos.
Building a sub-512KB website that satisfies every department of a company of non-trivial size: that is hard.
I get the appeal of the 512KB Club. Most modern websites are bloated, slow, and a privacy nightmare. I even get the nerdy thrill of fitting an entire website into a single IP packet, but honestly, this obsession with raw file size is kind of boring. It just encourages people to micromanage whitespace, skip images, or cut features like accessibility or responsive layouts.
A truly "suckless" website isn't about size. It's one that uses non-intrusive JS, embraces progressive enhancement, prioritizes accessibility, respects visitors' privacy, and looks clean and functional on any device or output medium. If it ends up small: great! But that shouldn't be the point.
> The 512KB limit isn't just minimalism - it forces architectural discipline.
True. I skimmed the biggest sites on that list, and they are still extremely fast. It's not the size limit itself that makes the difference, but knowing that there is one, which forces you to reason about your choices and use the right tools without cramming in unneeded features.
It would be worth adding some information on the page about the best tools for creating small yet functionally complete and pleasant-looking static sites. A few years ago I'd have said Hugo (https://gohugo.io/), but I haven't checked in a while and there could be better ones by now. Also ultra-cheap hosting options comparable to Neocities (.org) but located in the EU.
Seems like we can join the club! https://www.firefly-lang.org/ is 218 kB uncompressed.
I use an Intel Atom netbook from 2010 as my test system. It has 1 GB of RAM and an in-order x86 processor. CPU Benchmark gives it 120 Mop/s integer and 42 MiB/s for AES. (For comparison, my usual laptop, which is also nearly obsolete with an i5-8350u gives 22,000 Mop/s and 2000 MiB/s respectively.)
The netbook can load Firefox in just a few seconds. And Hacker News loads almost as instantly as on a modern machine. (Hit enter and the page is rendered before you can blink.)
The same machine can also play back 720p H.264 video smoothly.
And yet, if I go to YouTube or just about any other modern site, it takes literally a minute to load and render, none of the UI elements are responsive, and the site is unusable for playing videos. Why? I'm not asking for anything the hardware isn't capable of doing.
If my own work isn't snappy on the Atom I consider it a bug. There are a lot of people using smartphones and tablets with processors in the same class.
Assuming most of these sites use JavaScript, perhaps memory use should also be considered.
I use a text-only HTML viewer, no JavaScript interpreter. This is either a 2M or a 1.3M static binary.
The size of the web page does not slow it down much, and I have never managed to crash it in over 15 years of use, unlike a popular browser.
I routinely load concatenated HTML files much larger than those found on the web. For example, on a severely underpowered computer, loading a 16M stored HTML file into the text-only client's cache takes about 8 seconds.
I can lazily write custom command-line HTML filters that are much faster than Python or JavaScript to extract and transform any web page into SQL or CSV. These filters are each a ~40K static binary (the general idea is sketched below).
As an experiment I sloppily crammed 63 different web page styles into a single filter. The result was a 1.6M static binary.
I use this filter every day for command-line search.
I'm a hobbyist, an "end user", not a developer
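For anyone curious, here is a rough sketch of that extract-and-transform idea. The commenter's filters are tiny static binaries, so this TypeScript version is only an illustration, and the link-to-CSV task is a made-up example, not the commenter's actual tooling:

```typescript
// Hypothetical command-line filter: read an HTML page on stdin, emit its links as CSV.
import { readFileSync } from "node:fs";

const html = readFileSync(0, "utf8"); // fd 0 = stdin

// Naive tag matching is fine for a quick personal filter; a real parser is safer.
const linkRe = /<a\b[^>]*\bhref="([^"]*)"[^>]*>(.*?)<\/a>/gis;
const esc = (s: string) => `"${s.replace(/"/g, '""')}"`; // CSV-escape a field

console.log("href,text");
for (const [, href, text] of html.matchAll(linkRe)) {
  const plain = text.replace(/<[^>]*>/g, "").trim(); // strip nested tags from the link text
  console.log(`${esc(href)},${esc(plain)}`);
}
```

Something like `curl -s https://example.com | npx tsx links.ts > links.csv` would run it, assuming the tsx runner is available.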
I'm nostalgic for an old World Wide Web (which never really existed, thanks to GeoCities and such), and wish that we could form a sect of "Puritans," break away from the High Church, and sail away to some top-level domain of our own where we'll consider any outbound links heretical.
It calls out the NYT at the beginning, but am I supposed to be impressed that a bunch of mostly obscure minimalist blogs are a few megabytes smaller than the biggest online news site (by subscribers) in the world?
What are we doing here? And to brag about this while including image media in the size is just onanistic.
I added one site that is mostly a static recipe site my family uses. It includes VanJS to pick random recipes, and you can save them as you go along to plan what you're having for dinner that night. It also has a filter to find recipes by name and another to filter by type. It's mainly for personal use, but it shows what you can do without a whole lot of code.[1]
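A dependency-free sketch of the kind of name/type filtering and random picking described above. The real site uses VanJS; the element IDs and data attributes here are hypothetical, not taken from the actual page:

```typescript
// Assumed markup: recipe cards like <div class="recipe" data-name="..." data-type="...">,
// a text input #search, and a <select> #type. All of these are illustrative names.
type Recipe = { name: string; type: string; el: HTMLElement };

function collectRecipes(): Recipe[] {
  return Array.from(document.querySelectorAll<HTMLElement>(".recipe")).map((el) => ({
    name: (el.dataset.name ?? "").toLowerCase(),
    type: el.dataset.type ?? "",
    el,
  }));
}

function applyFilters(recipes: Recipe[], query: string, type: string): void {
  for (const r of recipes) {
    const matches =
      r.name.includes(query.toLowerCase()) && (type === "" || r.type === type);
    r.el.hidden = !matches; // hide non-matching cards instead of re-rendering
  }
}

// Pick a random recipe from whatever currently matches the filters.
function randomVisible(recipes: Recipe[]): Recipe | undefined {
  const visible = recipes.filter((r) => !r.el.hidden);
  return visible[Math.floor(Math.random() * visible.length)];
}

const recipes = collectRecipes();
const search = document.querySelector<HTMLInputElement>("#search")!;
const typeSel = document.querySelector<HTMLSelectElement>("#type")!;

search.addEventListener("input", () => applyFilters(recipes, search.value, typeSel.value));
typeSel.addEventListener("change", () => applyFilters(recipes, search.value, typeSel.value));
```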
I also added a question for my soccer app: Cloudflare doesn't know how to work with service-worker-driven applications :-) This one puts the back end in the service worker and uses HTMZ-BE to make it feel like you are using an app. So, basically, a front-end MPA with nice interactivity. Super lightweight for what it does and easy to use.[2]
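For readers unfamiliar with the pattern, here is a rough sketch of the "back end in the service worker" idea. The route and the render function are hypothetical and are not HTMZ-BE's actual API; it assumes TypeScript's "webworker" lib:

```typescript
/// <reference lib="webworker" />
// Intercept navigations to an (assumed) /app/ route and answer with locally
// generated HTML, so the page behaves like a server-rendered MPA with no server.
declare const self: ServiceWorkerGlobalScope;
export {};

self.addEventListener("fetch", (event) => {
  const url = new URL(event.request.url);
  if (url.pathname.startsWith("/app/")) {
    // No network round trip: the "server" response is built right here.
    event.respondWith(renderPage(url));
  }
  // Everything else falls through to the network as usual.
});

async function renderPage(url: URL): Promise<Response> {
  // A real app would read its state from IndexedDB or Cache Storage here.
  const body = `<main><h1>Fixtures</h1><p>Rendered in the service worker for ${url.pathname}.</p></main>`;
  return new Response(body, {
    headers: { "Content-Type": "text/html; charset=utf-8" },
  });
}
```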
> Your total UNCOMPRESSED web resources must not exceed 512KB.
I would be interested to know how they define web resources. HN would only fit this description if we don't count every possible REST resource you could request, but instead just the images (three SVGs), the CSS (news.css), and the JS (hn.js). (One way to tally what a page actually loads is sketched below.)
The second you count any possible response to `https://news.ycombinator.com/item?...` in the total, we've blown past the 512KB cap... and that's where the actual useful content lies.
Feels like regular ol' REST-and-forms webapps aren't quite the target of this list though, so who knows.
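One way to sanity-check a page against the rule from the dev-tools console is to sum the decoded (uncompressed) size of every resource the current page loaded, via the Resource Timing API. Note that cross-origin resources without a Timing-Allow-Origin header report a size of zero here:

```typescript
// Sum the uncompressed size of the document plus every sub-resource it pulled in.
const resources = performance.getEntriesByType("resource") as PerformanceResourceTiming[];
const nav = performance.getEntriesByType("navigation")[0] as PerformanceNavigationTiming | undefined;

// Start from the HTML document itself, then add each sub-resource's decoded size.
const totalBytes = resources.reduce((sum, r) => sum + r.decodedBodySize, nav?.decodedBodySize ?? 0);

console.log(`${resources.length + 1} resources, ${(totalBytes / 1024).toFixed(1)} KB uncompressed`);
```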
As an engineering challenge, I love it.
Other than that, I would've understood this notion better in the '90s when we were all on dial-up. Maybe my perception is skewed from growing up and watching pictures load on websites in real time?
Now, even with outdated hardware on an OK connection, even larger sites like WaPo (3MB) load in what feels like an instant (within 5-10 seconds). If it loaded in 2 seconds or 1 second, I really don't know how that would impact my life in any way.
As long as a site isn't sluggish while you browse around.
It's a fun way to push for a lighter web, but without a way to distinguish the complexity of the sites on the list it's not all that useful. This is sort of addressed in the FAQ: "The whole point of the 512KB Club is to showcase what can be done with 512KB of space. Anyone can come along and make a <10KB site containing 3 lines of CSS and a handful of links to other pages." But without a way for the user to gauge a site's complexity at a glance, I'm not sure I understand the point. FAQ aside, the first few sites I clicked on were indeed quite light, but they also had nothing more than some text and background colors. And any search site is going to end up near the top of the list, e.g. https://steamosaic.com/
Complexity would be a subjective metric, but without it I'm not sure what you take from this other than a fun little experiment, which is maybe all it's meant to be.
> Your total UNCOMPRESSED web resources must not exceed 512KB
JavaScript gets all the hate for size, but images easily surpass even your most bloated frameworks.
Which is why the websites on this list largely don't use media.
---
The problem with JavaScript isn't so much the network size; it's the execution time.
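A quick way to see that execution cost, as opposed to download size, is the Long Tasks API (Chromium-only at the time of writing). This is a sketch for the console, not a complete metric:

```typescript
// Log main-thread stalls caused by script execution: "longtask" entries are
// tasks that blocked the main thread for more than 50 ms.
const observer = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    console.log(`Long task: ${entry.duration.toFixed(0)} ms at ${entry.startTime.toFixed(0)} ms`);
  }
});

observer.observe({ type: "longtask", buffered: true });
```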
> Why does any site need to be that huge? It’s crazy.
It's advertising and data tracking. Every. Single. Time.
Pi-hole and ad blockers have become essential for traversing the cesspool that is the modern internet.
Most thorough discussion here from a few years ago: <https://news.ycombinator.com/item?id=30125633>
Seems like lichess dropped off
> why do large sites
You answered your own question: they wouldn't be large if they were small. Many users means many features, which to some looks like bloat. The reality is that among the millions of users there are hundreds of thousands of frontends, extensions, and other little programs that together use every little aspect of the site, not to mention all the ads and trackers that pay the bills for sites like the NYT.
> Your total UNCOMPRESSED web resources must not exceed 512KB.
I only see domains listed. Does this refer to the main page only, or the entire site?
I could get my future site [0] into this club, at ~434KB, but that would mean removing the blog posts or making them load manually, which I don't see as useful.
I like this website. It's very entertaining to me, and a bit nostalgic too. And those minimalist websites also help us remember the importance of building things that withstand the effects of time. Most of them are good candidates to stay online for the next 15 or 20 Internet years (almost an eternity in human terms).
Would help if there was a short description of what the websites are about, instead of just a list of random URLs.
While a fun idea, arbitrary limits like this just aren't necessary. Yes, it's all well and good in the name of reducing trackers, etc., but what if I want to have an image-heavy site? Why does that get perceived as a bad thing?
Phantasy Star for Master System is 512KB.
A recent retranslation romhack exists[0] and it's pretty good.
Is there a way of enforcing memory limits on websites from browser (user) side?
These clubs have little effect if there are no incentives on the demand side.
So, “Let’s build carbon-titanium-foldable bicycles instead of bloated modern cars, and still get from A to B?”
How many mainstream online platform users care about the difference in KB in their experience, anyway?
The sites in the list are hobbyist clubs with a technical point of view, which wouldn’t make sense for a mass media outlet with millions of daily traffic, and real interdepartmental complexity and compliance issues to deal with.
The very first website has either 404 links or pages with over a megabyte of total payload. The idea is good, but I don't buy a "fast" website that only serves text with CSS from the '70s.
Many of the listed sites are way over 512KB, e.g. golang.cafe.
I hope the club does a routine check with headless browsers or something.
Not sure how a site can fit in that club. For example, if someone uses Google Analytics (which most people do), that alone comes to ~430KB.
The 512KB limit isn't just minimalism - it forces architectural discipline.
I built a Trello alternative (frustrated with its limitations: I wanted rows and decent performance). It came in at ~55KB gzipped by following a few patterns, some of which I'm open sourcing as genX (genx.software - releasing this month):
- Server renders complete HTML (not JSON that needs client-side parsing)
- JavaScript progressively enhances (doesn't recreate what's in the DOM)
- Shared data structures (one index for all items, not one per item)
- Use native browser features (the DOM is already a data structure; article coming)
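A minimal sketch of the first two patterns, server-rendered HTML plus a thin enhancement layer. The `data-enhance` attribute, the target selector, and the fallback are hypothetical illustrations, not genX's actual API:

```typescript
// The server ships a fully working <form>; this script only upgrades POST forms
// marked with a (hypothetical) data-enhance attribute.
document.querySelectorAll<HTMLFormElement>("form[data-enhance]").forEach((form) => {
  if (form.method.toLowerCase() !== "post") return; // plain GET forms keep default behavior

  form.addEventListener("submit", async (event) => {
    event.preventDefault();
    // Ask the server for HTML, not JSON: the response replaces a fragment directly,
    // so there is no client-side templating layer to ship or run.
    const response = await fetch(form.action, { method: "POST", body: new FormData(form) });
    const target = document.querySelector(form.dataset.target ?? "#cards");

    if (response.ok && target) {
      target.innerHTML = await response.text(); // reuse the server's renderer
    } else {
      form.submit(); // fall back to a normal full-page submission
    }
  });
});
```

Without the script the form still submits and the server still renders the full page, which is the progressive-enhancement part.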
Most sites ship megabytes because modern tooling treats size as a rounding error. The 512KB constraint makes you think about what's expensive and get creative. Got rewarded with a perfect Lighthouse score in dev - striving to maintain it through release.
Would love feedback from this community when it's out.