80,000 Hours has released an interview with Professor Philip Tetlock, one of the world's top experts on how to have accurate beliefs about the future. We asked him a range of questions of interest to the community here:

  • Should people who want to be right just adopt the views of experts rather than apply their own judgement?
  • Why are Berkeley undergrads worse forecasters than dart-throwing chimps?
  • How can listeners contribute to his latest cutting-edge research?
  • What do we know about our accuracy at predicting low-probability high-impact disasters?
  • Does his research provide an intellectual basis for populist political movements?
  • Was the Iraq War caused by bad politics, or bad intelligence methods?
  • Can experience help people avoid overconfidence and underconfidence?
  • When does an AI easily beat human judgement?
  • Could more accurate forecasting methods make the world more dangerous?
  • What are the odds we’ll go to war with China?
  • Should we let prediction tournaments run most of the government?

Here's a preview:

Robert Wiblin: There’s a very active debate in the effective altruism community at the moment about how much people should adopt the inside view versus the outside view, and how much they should just defer to mass opinion on important questions, or to the average view of a bunch of experts. Do you have any views on that? Some people are promoting a very radical view: basically, that you should almost ignore your own inside view and only look at the reported views of other people, or give your own inside view no more weight than anyone else’s. Do you think that’s a good approach to having more accurate beliefs?

Philip Tetlock: I’ve never been able to impose that kind of monastic discipline on myself. The division between the inside and the outside view is blurry on close inspection. I mean, if you start off with a base rate probability of divorce for the couple of 35%, then you … As information comes in about quarrels or about this or about that, you’re going to move your probabilities up or down. That’s kind of inside view information, and that’s proper belief updating.

Getting the mechanics of belief updating right is very tricky. There’s the problem of cognitive conservatism, under-adjusting your beliefs in response to new evidence, and also the problem of excess volatility, over-adjusting and spiking around too much. Both can obviously degrade accuracy…
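The updating Tetlock describes is essentially Bayes' rule: start from the outside-view base rate, then multiply your odds by a likelihood ratio for each piece of inside-view evidence. Here's a minimal sketch in Python, starting from the 35% base rate he mentions; the evidence items and likelihood ratios below are hypothetical numbers chosen purely for illustration, not anything from the episode.

```python
def update(prob: float, likelihood_ratio: float) -> float:
    """Apply one piece of evidence via Bayes' rule in odds form."""
    odds = prob / (1.0 - prob)
    posterior_odds = odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

p = 0.35  # outside view: base rate of divorce for the couple
# Inside-view evidence arrives; each item carries a hypothetical likelihood ratio.
for evidence, lr in [("frequent quarrels", 2.0), ("strong shared commitments", 0.5)]:
    p = update(p, lr)
    print(f"after '{evidence}': p = {p:.2f}")
```

Cognitive conservatism corresponds to acting as if every likelihood ratio were close to 1 (barely moving), and excess volatility to treating weak evidence as if its ratio were extreme (spiking around).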

Continue reading...
