Hacker News

rebane2001 · today at 2:48 AM

The loading issue is just a hug of death: the site is currently getting multiple visitors per second, and that requires more than a gigabit of bandwidth to handle.
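A back-of-the-envelope check of that claim, assuming each visitor pulls the full payload on load and taking the ~40 MB figure mentioned in the replies (the visitor rate of 4/s here is an assumption, not a stated number):

```python
# Rough bandwidth estimate; both constants are assumptions for illustration.
PAYLOAD_BYTES = 40 * 1000 * 1000   # ~40 MB initial data pull
VISITORS_PER_SECOND = 4            # "multiple visitors per second"

bits_per_second = PAYLOAD_BYTES * 8 * VISITORS_PER_SECOND
print(f"{bits_per_second / 1e9:.2f} Gbit/s")  # → 1.28 Gbit/s
```

At that payload size, even a handful of visitors per second does push past a gigabit.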

I sort of need to pull all the data at initialization because I need to map out how every post affects every other - the links between posts are what take up the majority of the storage, not the text inside the posts. It's also kind of the only way to preserve privacy.


Replies

jedberg · today at 8:06 AM

I think I'm missing something, but does every user get the same 40MB? If so, can you just dump the file on a CDN?
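If the 40 MB really is identical for every user, the standard move is to serve it as a static file with long-lived cache headers so a CDN edge absorbs repeat requests. A minimal sketch of that idea, assuming a stdlib `http.server` origin (the filename, port, and max-age here are illustrative, not the site's actual setup):

```python
# Sketch: serve a static dump with cache headers a CDN can honor.
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

class CachedHandler(SimpleHTTPRequestHandler):
    def end_headers(self):
        # "immutable" + long max-age lets a CDN edge answer repeat
        # requests without ever touching this origin server.
        self.send_header("Cache-Control", "public, max-age=86400, immutable")
        super().end_headers()

# To run the origin:
#   ThreadingHTTPServer(("", 8000), CachedHandler).serve_forever()
```

With headers like these, the origin's bandwidth bill is roughly one transfer per CDN edge per cache lifetime, regardless of visitor count.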

goodmythical · today at 2:55 AM

I feel very strongly that you should be able to serve hundreds or thousands of requests per second at Gbps speeds.

Why are you serving so much data personally instead of just reformatting theirs?

Even if you're serving it yourself... I mean, a regular 100 Mbit line should easily support tens or hundreds of text users...
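The arithmetic behind that intuition, assuming a text-heavy page of roughly 100 KB (the page size is an assumption; a plain HTML page is often much smaller):

```python
# How many text pages per second fits through a 100 Mbit line?
LINE_MBIT = 100
PAGE_KB = 100  # assumed per-page transfer size

pages_per_second = (LINE_MBIT * 1_000_000 / 8) / (PAGE_KB * 1000)
print(f"{pages_per_second:.0f} pages/s")  # → 125 pages/s
```

So for text-sized responses the line supports hundreds of users; the tension in this thread is that the payload under discussion is ~40 MB per visitor, not a text page.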

What am I missing?
