ElliotJDavies

1911 karma · Joined March 2020 · Copenhagen, Denmark

Posts: 5


Comments: 218

To a large extent I don't buy this. Academics and journalists could interview an arbitrary EA Forum user on a particular area if they wanted to get up to speed quickly. The fact that they seem not to do this, in addition to not giving a right to reply, makes me think they're not truth-seeking.

Just to note: I have a COI in commenting on this subject. 

I strong-downvoted your comment, as it reads to me as making bold claims whilst providing little supporting evidence. References to "lots of people in this area" could be considered an instance of the bandwagon fallacy.

As you write: 

The result will be a singularity, understood as a fundamental discontinuity in human history beyond which our fate depends largely on how we interact with artificial agents

The discontinuity results from humans no longer being the smartest agents in the world, and no longer being in control of our own fate. After this point, we've passed an event horizon beyond which the outcome is almost entirely unforeseeable.

If you have accelerating growth that isn't sustained for very long, you get something like population growth from 1800-2000

If, after surpassing humans, intelligence "grows" exponentially for another 200 years, do you not think we've passed an event horizon? I certainly do!

If not, then using the metric of single-agent intelligence (i.e. not the sum of intelligence across a group of agents), what point on an exponential growth curve that intersects human-level intelligence would you define as crossing the event horizon?
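To make the arithmetic behind this concrete, here is a minimal sketch. The doubling time is my own assumption for illustration (the paper specifies none); the point is only how quickly exponential growth outruns human level once the curve crosses it.

```python
# Toy illustration with an ASSUMED doubling time -- not a claim from the paper.
DOUBLING_TIME_YEARS = 2.0  # hypothetical parameter
HUMAN_LEVEL = 1.0          # intelligence measured in human-level units

def intelligence_after(years, doubling_time=DOUBLING_TIME_YEARS):
    """Intelligence (in human-level units) after `years` of exponential growth."""
    return HUMAN_LEVEL * 2 ** (years / doubling_time)

for years in (2, 10, 50, 200):
    print(f"after {years:>3} years: {intelligence_after(years):.3g}x human level")
```

Under this (assumed) doubling time, 200 years of exponential growth gives a factor of 2^100, i.e. roughly 10^30 times human level, which is why I struggle to see where on such a curve one would place the event horizon other than at the human-level crossing itself.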

I feel this claim is disconnected from the definition of the singularity given in the paper:

The singularity hypothesis begins with the supposition that artificial agents will gain the ability to improve their own intelligence. From there, it is claimed that the intelligence of artificial agents will grow at a rapidly accelerating rate, producing an intelligence explosion in which artificial agents quickly become orders of magnitude more intelligent than their human creators. The result will be a singularity, understood as a fundamental discontinuity in human history beyond which our fate depends largely on how we interact with artificial agents

Further in the paper you write: 

The singularity hypothesis posits a sustained period of accelerating growth in the general intelligence of artificial agents.

[Emphasis mine.] I can't see any support, in either the original definition or this later addition, for the requirement that the growth be "sustained".

Intelligence Explosion: For a sustained period

[...]

Extraordinary claims require extraordinary evidence: Proposing that exponential or hyperbolic growth will occur for a prolonged period [Emphasis mine]

 

  • I'm not sure why "prolonged period" or "sustained" was used here.
  • I'm also not sure what is meant by "prolonged period": 5 years? 100 years?
    • Whichever it is, why do you believe this would be required?

Just to help nail down the crux here: I don't see why more than a few days of an intelligence explosion would be required for a singularity event.

 

Circuits’ energy requirements have massively increased—increasing costs and overheating.[6]


I'm not sure I understand this claim, and I can't see that it's supported by the cited paper.

Is the claim that energy costs have increased faster than computation? This would be cruxy, but it would also be incorrect. 

The joy in righteousness

 

This is a new one to me! Interesting!

To identify one crux with the idea of using morality to motivate behaviour (e.g. abolitionism): the assumption that it needs to be completely grassroots. The argument often becomes: did slavery end because everyone found it morally bad, or because economic factors etc. fundamentally changed the country?

It becomes much more plausible that morality played an important role when you modify the claim: slavery ended because a group of important people realised it was morally wrong and displayed moral leadership in changing laws.

While I don't think that was inappropriate, it seems fair to give Owen at least some lead time to prepare a statement of his perspective on the matter. 

I think you're right about this, and have changed my mind.
