Hacker News

cycomanic · last Wednesday at 9:50 PM · 4 replies

People have been doing these sorts of "optical computing" demonstrations for decades, despite David Miller showing that, fundamentally, digital computing with optical photons will be immensely power hungry (I say digital here because there are some applications where analog computing can make sense, but it almost never relies on memory for bits).

Specifically, this paper is based on simulations. I've only skimmed it, but the power efficiency numbers sound great because they quote 40 GHz read/write speeds, yet these devices consume comparatively large power even when not reading or writing (the lasers have to run constantly). I also don't think they included the contributions of the modulators and the required drivers (typically you need quite large voltages). Somebody already pointed out that the size of these devices is massive, and that again is fundamental.
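To make the static-power point concrete, here's a back-of-envelope sketch. The 10 mW laser power and the duty cycles are illustrative assumptions of mine; only the 40 GHz rate comes from the paper:

```python
# Back-of-envelope: why always-on lasers hurt energy per bit.
# The 10 mW static laser power and the duty cycles are assumptions of mine;
# only the 40 GHz read/write rate is taken from the paper.
LASER_POWER_W = 10e-3  # assumed continuous laser power: 10 mW
BIT_RATE_HZ = 40e9     # quoted 40 GHz read/write speed

def energy_per_bit(duty_cycle: float) -> float:
    """Static laser energy amortized over the bits actually accessed."""
    return LASER_POWER_W / (BIT_RATE_HZ * duty_cycle)

for duty in (1.0, 0.1, 0.01):
    print(f"duty cycle {duty:>4.0%}: {energy_per_bit(duty) * 1e15:8.0f} fJ/bit")
```

Under these assumptions the laser alone amortizes to ~250 fJ/bit at full utilization and 100x worse at 1% utilization, which is why an always-on optical power budget matters so much.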

As someone working in the broad field, I really wish people would stop these types of publications. While these numbers might sound impressive at first glance, they really are completely unrealistic. There are lots of legitimate applications of optics and photonics; we don't need to resort to this sort of stuff.


Replies

embedding-shape · last Wednesday at 11:11 PM

> showing that fundamentally digital computing with optical photons will be immensely power hungry

> they really are completely unrealistic

Unrealistic only because they're power hungry? That sounds like a temporary problem, kind of like when we came up with a bunch of ML approaches we couldn't actually run in the 80s/90s because of the hardware resources required, but which work fine today.

Maybe even if the solutions aren't useful today, they could be useful in the future? Or maybe, with these results, more people will be inspired to create solutions specifically targeting the power usage?

"we don't need to resort to this sort of stuff" makes it sound like this is all so beneath you and not deserving of attention, but why are you then paying attention to it?

gsf_emergency_6 · last Thursday at 2:45 AM

I only upvoted to send a message to the moderators not to upweight uni/company press releases :) Sadly, the energy of VC culture goes into refining wolf-crying, despite all the talk of due diligence, "thinking for yourself", and "understanding value".

The core section of the paper (linked below) is pp. 8-9.

2 mW for hundreds of picoseconds is huge.
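Quick arithmetic for scale; the femtojoule comparison at the end is my own rough number, not from the paper:

```python
# Rough scale check with round numbers (my arithmetic, not the paper's):
power_w = 2e-3        # the 2 mW figure quoted above
duration_s = 100e-12  # take 100 ps as the low end of "hundreds of picoseconds"
energy_j = power_w * duration_s
print(f"{energy_j * 1e12:.1f} pJ per event")  # -> 0.2 pJ
# For context (assumption, not from the paper): CMOS SRAM bitcell access
# energies are commonly quoted at femtojoule scale, ~100x smaller than this.
```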

(Also GIANT voltages, if only to illustrate how coarse their simulations are):

> As shown in Figure 6(a), even with up to 1 V of noise on each Q and QB node (resulting in a 2 V differential between Q and QB), the pSRAM bitcell successfully regenerates the previously stored data. It is important to note that higher noise voltages increase the time required to restore the original state, but the bitcell continues to function correctly due to its regenerative behavior.
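The "regenerative behavior" here is the standard bistable cross-coupled-inverter dynamic. A toy discrete-time model (my sketch with assumed parameters, not their actual pSRAM circuit) shows both effects they mention: the stored state is recovered, and larger noise takes longer to restore:

```python
import math

# Toy model of a regenerative bitcell: two cross-coupled inverters.
# Illustrative sketch with assumed parameters, NOT the paper's pSRAM circuit.
VDD = 5.0          # assumed supply/swing (V); chosen so 1 V noise stays recoverable
GAIN = 8.0         # assumed inverter steepness (unitless)
DT_OVER_TAU = 0.1  # assumed time step relative to the node time constant

def inverter(v_in: float) -> float:
    """Smooth inverter transfer curve that switches around VDD / 2."""
    return VDD / (1.0 + math.exp(GAIN * (v_in - VDD / 2)))

def steps_to_restore(q: float, qb: float, tol: float = 0.05) -> int:
    """Relax each node toward the inverter driven by the other node;
    return how many steps it takes to restore the full Q/QB differential."""
    steps = 0
    while abs((q - qb) - VDD) > tol * VDD and steps < 10_000:
        q, qb = (q + (inverter(qb) - q) * DT_OVER_TAU,
                 qb + (inverter(q) - qb) * DT_OVER_TAU)
        steps += 1
    return steps

# Stored '1' (Q = VDD, QB = 0) perturbed by noise on each node:
print(steps_to_restore(VDD - 0.5, 0.5))  # moderate noise: fast recovery
print(steps_to_restore(VDD - 1.0, 1.0))  # 1 V on each node: recovers, but slower
```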

fooker · last Thursday at 10:20 AM

40 GHz memory/compute for 10-100x the power sounds like a great idea to me.

We are going to have abundant energy at some point.