Part 1

The limit is based on a computer operating at the Landauer Limit, at the temperature of the cosmic microwave background, powered by a Dyson sphere operating at the efficiency of a Carnot engine. [EDIT: this proposed limit can be exceeded, since the Landauer Limit can be circumvented (e.g. via reversible computing); it should now be read as a lower bound rather than a hard ceiling.]

 

Relevant equations

Carnot efficiency               η = 1 − (Tc / Th)

Landauer limit                  E = kB Tc ln(2)

Bit rate                        R = P η / E

 

Relevant values

Boltzmann constant [kB] (J K⁻¹)                              1.38E-23

Power output of the Sun [P] (W)                              3.83E+26

Temperature of the surface of the Sun [Th] (K)               5.78E+03

Temperature of the cosmic microwave background [Tc] (K)      2.73

 

Calculations

Carnot efficiency               η = 1 − (Tc / Th)

                                η = 1 − (2.73 / 5.78E+03)

                                η = 1.00

Landauer limit                  E = kB Tc ln(2)

                                E = 1.38E-23 × 2.73 × 0.693

                                E = 2.61E-23 joules per bit

Bit rate                        R = P η / E

                                R = 3.83E+26 × 1.00 / 2.61E-23

                                R = 1.47E+49 bits per second
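The arithmetic above can be reproduced with a short script. This is only a sanity check on the post's numbers, using the values given above (variable names are mine):

```python
import math

# Values from the post
k_B = 1.38e-23   # Boltzmann constant (J/K)
P   = 3.83e26    # power output of the Sun (W)
T_h = 5.78e3     # temperature of the surface of the Sun (K)
T_c = 2.73       # temperature of the cosmic microwave background (K)

eta = 1 - T_c / T_h            # Carnot efficiency (≈ 0.99953, rounds to 1.00)
E   = k_B * T_c * math.log(2)  # Landauer limit: minimum energy per bit erased (≈ 2.61e-23 J)
R   = P * eta / E              # bit rate (≈ 1.47e49 bits per second)

print(f"eta = {eta:.5f}")
print(f"E   = {E:.3e} J/bit")
print(f"R   = {R:.3e} bits/s")
```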

 

Notes

Numbers are shown rounded to 3 significant figures; full (unrounded) values were used in the calculations.

 

 

Part 2

The theoretical computational limit of the solar system is 22 orders of magnitude above the estimated computational ability of all living humans. This is based on estimates of the number of synapses in the human brain, the update rate of those synapses, and the number of humans alive. This estimate is only a rough approximation and should be used with caution.

The purpose of this post was to show that the limit of computation, and therefore of intelligence, is far above that of all humans combined.

 

Relevant equations

Bit rate of all humans                  Rhumans = Nsyn × Rsyn × Nhumans

Comparative rate                        Rc = Rmax / Rhumans

 

Relevant values

Number of synapses in the human brain [Nsyn]                 2.50E+14

Synaptic update rate [Rsyn] (Hz)                             500

Number of humans alive [Nhumans]                             8.07E+09

Theoretical computational limit [Rmax] (bit s⁻¹)             1.47E+49

 

Calculation

Bit rate of all humans                  Rhumans = Nsyn × Rsyn × Nhumans

                                        Rhumans = 2.50E+14 × 500 × 8.07E+09

                                        Rhumans = 1.01E+27 bits per second

Comparative rate                        Rc = Rmax / Rhumans

                                        Rc = 1.47E+49 / 1.01E+27

                                        Rc = 1E+22

 

Notes

Numbers are shown rounded to 3 significant figures; full (unrounded) values were used in the calculations. The final result is rounded to one significant figure due to low confidence in the synaptic update rate.

Synaptic update rate estimated from a 2-millisecond neuronal refractory time (1 / 0.002 s = 500 Hz).
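The Part 2 numbers can be checked the same way, with the synaptic rate derived from the 2 ms refractory time in the note above (variable names are mine):

```python
# Values from the post
N_syn    = 2.50e14    # synapses in a human brain
R_syn    = 1 / 2e-3   # update rate from a 2 ms refractory time = 500 Hz
N_humans = 8.07e9     # number of humans alive
R_max    = 1.47e49    # theoretical limit from Part 1 (bits/s)

R_humans = N_syn * R_syn * N_humans  # ≈ 1.01e27 bits/s
R_c      = R_max / R_humans          # ≈ 1e22 at one significant figure

print(f"R_humans = {R_humans:.3e} bits/s")
print(f"R_c      = {R_c:.0e}")
```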

Comments
If you're trying to maximize computational efficiency, instead of building a Dyson sphere, shouldn't you drop the sun into a black hole and harvest the Hawking radiation?

Hi Robi,

For reference, Anders Sandberg discussed that on The 80,000 Hours Podcast (emphasis mine):

[Anders:] The best method [to extract energy after the era of stars], in my opinion, is to use black holes. I’m very fond of black hole power. And I am assuming that maybe in a few trillion years I’m going to be dealing with protesters saying, “No black holes in our neighbourhood,” and “Don’t build that power plant, Anders.” But they’re actually lovely. Black holes have accretion disks when they suck in matter. Or rather, it’s not that they suck in matter — that’s kind of a picture we get from science fiction — they’re just an object with gravity like anything else. But what happens when you put a lot of junk around a black hole? They form a disk, and the friction between parts of the disk heats up the matter. That means it radiates away energy and gets more tightly bound and slowly spirals in. There is also some angular momentum leaking out at the sides where some dust gets thrown off.

The effect of this is that the potential energy of that junk — and it can be anything: burnt-out stars, old cars, old space probes, planets you don’t care for, et cetera — gets ground down, and the potential energy gets released as radiation. So now you can build a Dyson sphere, a very big one, around this whole system, and get all of that energy.

How much total mass energy can you get? It turns out it’s almost up to 40% for a rapidly spinning black hole. The exact limit depends on where the inner edge of the accretion disk is, because eventually you get close enough that you essentially fall straight in without releasing any more energy, and that gets trapped inside the black hole. Now, converting 40% of the mass energy of old cars and space probes into energy is kind of astonishing: that is way more effective than fusion. So actually, the stars might not be the biggest energy source around. We might actually be able to make the galaxies shine much more if we dump things into black holes and gather that energy.

William, I am guessing you would like Anders' episodes! You can find them searching for "Anders Sandberg" here.

Yeah, I found him to be a fascinating person when I talked to him at EAGx Warsaw.

I'm initially sceptical of getting 40% of the mass-energy out of, well, anything. Perhaps I would benefit from reading more on black holes. 

However, I would in principle agree that if black holes are feasible power sources, this would raise the theoretical maximum computation rate.
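For context on the ~40% figure debated above: for a maximally spinning (Kerr) black hole, matter spiralling in to the innermost stable orbit can radiate away roughly 42% of its rest-mass energy, versus about 0.7% for hydrogen-to-helium fusion. Neither efficiency figure comes from this thread, so treat the comparison below as a rough illustration:

```python
c = 2.998e8  # speed of light (m/s)
m = 1.0      # 1 kg of matter dropped in

# Efficiency figures assumed here, not taken from the thread:
# ~42% for accretion onto a maximally spinning (Kerr) black hole,
# ~0.7% for hydrogen-to-helium fusion.
E_bh     = 0.42  * m * c**2
E_fusion = 0.007 * m * c**2

print(f"Black hole accretion: {E_bh:.2e} J/kg")
print(f"Fusion:               {E_fusion:.2e} J/kg")
print(f"Ratio: ~{E_bh / E_fusion:.0f}x")
```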

Hi William, interesting post :) Some reactions: 

  • I think the LW audience may be interested in this as well, so consider cross-posting there
  • While I don't think this will change your (pretty unobjectionable) bottom-line conclusion that "the limit of computation is far above all humans combined", I'd be curious to know if you've considered other approaches to estimating the theoretical computational limit of the solar system, and how those BOTECs would compare to your current approach (in the spirit of how Ajeya Cotra considered 4 different anchors in her report for estimating training computation requirements for a "transformative" model, and arrived at a pretty wide range)
  • Same question for estimating the computational ability of all humans alive. In particular you may want to check out Open Philanthropy's 2020 report How Much Computational Power Does It Take to Match the Human Brain? (see chart below from the report for how widely the numbers can range)
  • Come to think of it, if the idea is to show that the "limit of computation is far above all humans combined", you may be interested in a computation efficiency-oriented perspective (e.g. normalizing by power consumption), in which case Robert Freitas' sentience quotient may interest you (scroll down to the last section). Seth Lloyd's ultimate laptop calculations may interest you as well 
  • Considering instead the perspective of "ultimate brains", you may be interested in Anders Sandberg's The physics of information processing superobjects: daily life among the Jupiter brains, in particular his calculations w.r.t. the physics and engineering constraints guiding the design specs for Dyson brains, Jupiter brains and "neutronium" brains

Happy reading :) 

[Image: FLOPsBudgets5.png — chart from the Open Philanthropy report showing the range of FLOP/s estimates]

Hi Mo, thanks for the feedback.

  1. Good thought, I've cross-posted it to my account there.
  2. This post was spurred by a conversation I had about the upper limit of AI intelligence and the fact that it was likely very far above all humans combined. This is meant as, like you said, pretty unobjectionable support for my then-assumed conclusion. The conversation was heavily influenced by Cotra's Bioanchors report.
  3. I was estimating the brain's computational ability very roughly. I guessed that more detailed estimates had already been done, but that it would take time to read through them and understand their premises. I'll read through the document when I have some time.
  4. These two look interesting to read.
  5. Anders Sandberg is an interesting person. I suspected someone had done calculations similar to mine; I'm not surprised that he is one of them.

Yeah, your initial guess was right that more detailed estimations had already been done. I figured the reason you posted your rough BOTEC was to invite others to refer you to those more detailed estimates (saves time on your end via crowdsourcing too), since I've done the same as well in the past, hence the shares :) Happy reading

An efficient idea, good thinking.
