
rybosworld yesterday at 10:10 PM

> The basic math is that launching a million tons per year of satellites generating 100 kW of compute power per ton would add 100 gigawatts of AI compute capacity annually, with no ongoing operational or maintenance needs. Ultimately, there is a path to launching 1 TW/year from Earth.

> My estimate is that within 2 to 3 years, the lowest cost way to generate AI compute will be in space.

This is so obviously false. For one thing, in what fantasy world would the ongoing operational and maintenance needs be 0?
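
For what it's worth, the quoted arithmetic itself does multiply out; the dispute is whether any of the per-ton, cost, and maintenance assumptions hold. A quick check, treating the figures purely as Musk's claims rather than established numbers:

    # Sanity check of the quoted figures (these are Musk's claims, not established numbers)
    tons_per_year = 1_000_000
    kw_per_ton = 100                                  # claimed compute power per ton launched
    gw_per_year = tons_per_year * kw_per_ton / 1e6    # kW -> GW
    print(gw_per_year)                                # 100.0 GW of compute capacity added per year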


Replies

wongarsu yesterday at 10:21 PM

You operate them like Microsoft's submerged data center project: you don't do maintenance; whatever fails, fails. You start with enough redundancy in critical components like power and networking and accept that compute capacity will slowly decrease as nodes fail.

No operational needs is obviously ... simplified. You still need to manage downlink capacity, station keeping, collision avoidance, etc. But for a large constellation the per-satellite cost of that would be pretty small.

afavour yesterday at 10:18 PM

As soon as a statement contains a timeframe estimate by Musk you know to disregard it entirely.

CGMthrowaway yesterday at 10:39 PM

There's clearly rhetorical hyperbole happening there. But assuming that thermal rejection works well enough in space and launch costs continue falling, there is a viable path for space-based power generation as earth-based data centers become power/grid-constrained.

The craziest part of those statements is "100 kW per ton." IDK what math he is doing there or what future assumptions he's making, but today we can't even sniff at 10 kW per ton. iROSA [1] on the ISS is about 0.150 kW per ton.

[1] https://en.wikipedia.org/wiki/Roll_Out_Solar_Array

edit: iROSA = 33 kW per ton, thanks friends

b00ty4breakfast yesterday at 10:41 PM

This is par for the course for an Elon-associated endeavor, but it's been leaking out into the broader tech sector: make ludicrous claims and promises and somehow investors just throw money at you. FSD has been "around the corner" for over a decade, Martian colonization has been coming by the end of the decade for the past 20 years, and general super-AI has been a few years away for the past few years.

padjo yesterday at 10:19 PM

It's always 2-3 years with this guy

hristov yesterday at 10:35 PM

Currently, just a cursory Google search shows $1,500-3,000 per kilogram to put something into low Earth orbit. Let's take the low bound because of economies of scale. So $1,500.

A million tons will cost $1,500 × 1,000 × 1,000,000 = $1,500,000,000,000. That is one and a half TRILLION dollars per year, and that is only the lift cost; it does not take into account the cost of manufacturing the actual space data centers. Who is going to pay this?
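
Spelling that estimate out (the $1,500/kg figure is the cursory-search lower bound from above, not a firm price):

    # Lift cost only, at the optimistic $1,500/kg lower bound
    cost_per_kg = 1_500          # USD per kg to LEO (low end of the quoted range)
    kg_per_ton = 1_000
    tons_per_year = 1_000_000
    annual_lift_cost = cost_per_kg * kg_per_ton * tons_per_year
    print(f"${annual_lift_cost:,}")   # $1,500,000,000,000 per year, before building any hardware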

jopsen yesterday at 10:18 PM

> in what fantasy world would the ongoing operational and maintenance needs be 0?

Well, if you can't get there, you can't do maintenance, so there is zero maintenance :)

uplifter yesterday at 11:05 PM

The ISS’s solar arrays each weigh a metric ton and generate 35 kW apiece [0], and that’s just for power collection.

They’d need incredible leaps in efficiency for a single orbiting ton to both collect power for and perform 100 kW of compute.

[0] https://en.wikipedia.org/wiki/Electrical_system_of_the_Inter...
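
Taking the figures in this comment at face value (~35 kW per metric ton for the arrays alone), the implied gap looks roughly like this. A sketch, not a hard bound, since panel technology has improved since the ISS arrays were built:

    # Specific power implied by the claim vs. the ISS arrays cited above
    iss_array_kw_per_ton = 35     # power collection only, per this comment
    claimed_kw_per_ton = 100      # claimed *compute* per ton of complete satellite
    print(claimed_kw_per_ton / iss_array_kw_per_ton)   # ~2.9x the bare arrays,
    # while also fitting radiators, compute and structure into the same ton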

paxys yesterday at 10:30 PM

The famous Musk timeline. "By next year, 2 years tops."

fooker yesterday at 10:54 PM

> in what fantasy world

It is already more expensive to perform maintenance on SoCs than it is to replace them. Remember, these machines are not for serving a database; there are practically no storage needs (and storage is the component that fails most often).

Given that, the main challenge is cooling. I assume that will be figured out before yeeting $100 billion of computers into space. Plenty of smart people work at these companies.

consumer451 yesterday at 10:42 PM

Here is my main question: Musk is on record as being concerned about runaway "evil AI." I used to write that off as sci-fi thinking. For one thing, just unplug it.

So, let's accept that Musk's concern of evil runaway AI is a real problem. In that case, is there anything more concerning than a distributed solar powered orbital platform for AI inference?

Elon Musk appears to be his own nemesis.

5ersi yesterday at 11:06 PM

Launching alone consumes about 75-150 kWh of energy per tonne for fuel only (as per ChatGPT).

The planned lifespan of Starlink satellites is 5 years.

tzs today at 12:23 AM

A million tons a year would be over 18 Starship launches per day.
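
That cadence follows directly if you assume roughly 150 t of payload per flight (itself an unproven figure):

    # Launch cadence implied by a million tons per year
    tons_per_year = 1_000_000
    tons_per_launch = 150        # assumed Starship payload to LEO, not yet demonstrated
    launches_per_day = tons_per_year / tons_per_launch / 365
    print(launches_per_day)      # ~18.3 launches per day, every day of the year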

slg yesterday at 10:44 PM

>This is so obviously false.

One of the biggest but most pointless questions I have about our current moment in history is whether the people in power actually believe the stuff they say or are lying. Ultimately I don't think the answer really matters, since their actions are their actions, but so much of what people like Musk say strains credulity to the point that either they're total idiots or they think the rest of us are total idiots, and I'm genuinely curious which of those is more true.

haritha-j yesterday at 10:59 PM

For any estimate by Elon Musk, you need to add or subtract a zero at the end. Here, I'll fix it for you.

> The basic math is that launching 100,000 tons per year of satellites generating 10 kW of compute power per ton would add 1 gigawatt of AI compute capacity annually, with no ongoing operational or maintenance needs. Ultimately, there is a path to launching 0.01 TW/year from Earth.

> My estimate is that within 20 to 30 years, the lowest cost way to generate AI compute will be in space.

wat10000 yesterday at 10:32 PM

Never mind operational and maintenance costs. In what fantasy world is it cheaper to put a computer in orbit than in a building on the ground? I don't care how reusable and maintenance-free Starship gets, there's no way even absurdly cheap launches are cheaper than a building.

The whole thing makes no sense. What's the advantage of putting AI compute in space? What's even one advantage? There are none. Cooling is harder. Power is harder. Radiation is worse. Maintenance is impossible.

The only reason you'd ever put anything in orbit, aside from rare cases where you need zero-gee, is because you need it to be high up for some reason. Maybe you need it to be above the atmosphere (telescopes), or maybe you need a wide view of the earth (communications satellites), but it's all about the position, and you put up with a lot of downsides for it.

I feel like either I'm taking crazy pills, or all these people talking about AI in space are taking crazy pills. And I don't think it's me.

MagicMoonlight yesterday at 11:00 PM

But more importantly, heat dissipation in space is hard. There’s no atmosphere to cool you, no water you can dump heat into, just an empty void. You can radiate a little, but the sun alone is enough to cook you, even before you add a rack of GPUs inside your satellite.

It’s completely delusional to think you could operate a data centre in a void with nowhere to put the heat.
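
Radiative cooling does work, but the areas get large. A rough Stefan-Boltzmann estimate, assuming a 300 K radiator with emissivity 0.9 and ignoring absorbed sunlight (so this is a lower bound on radiator size):

    # Rough radiator sizing for 100 kW of heat, Stefan-Boltzmann only
    sigma = 5.67e-8              # W / (m^2 * K^4)
    emissivity = 0.9             # assumed coating emissivity
    T = 300.0                    # assumed radiator temperature, kelvin
    flux = emissivity * sigma * T**4    # ~413 W/m^2 radiated per side
    heat_load = 100_000.0        # W, i.e. 100 kW of compute turning into heat
    area = heat_load / flux
    print(area)                  # ~242 m^2 of single-sided radiator per 100 kW

A two-sided radiator roughly halves that, and running hotter helps a lot (flux scales with T^4), but it is still a large structure that has to fit inside the same per-ton mass budget.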

gostsamo yesterday at 10:30 PM

You lost me on million tons.

vessenes yesterday at 10:31 PM

[flagged]

moomoo11 yesterday at 10:17 PM

They'll just be decommissioned and left to burn up on reentry. Nvidia will make space-grade GPUs on a 2-3 year cycle.

tokyobreakfast yesterday at 10:26 PM

> For one thing, in what fantasy world would the ongoing operational and maintenance needs be 0?

Do you not understand how satellites work? They don't send repair people into space.

This had been a solved problem for decades before the AI gold rush came along assuming it had some new otherworldly knowledge to teach the rest of the world.
