
On the object level, the original question was:

> Are financial statements going to be released? In particular, how much was spent on estate agent fees, maintenance and bills? And the value of the events hosted. Is the reason for the change that EA has less money or that there was an error in the initial reasoning for buying it?

Even given the context, I think this is asking too much. I would support a question like "I would love to know what the reasoning was: in particular, was the project financially unsustainable or were there other reasons?".

Asking whether the financial statements for the project are going to be published is asking for far more information than is necessary or useful. From the follow-up question, it sounds like Dean is interested in doing a financial audit of the project, and maybe thinks they were paying too much for maintenance or something? (Apologies if I'm putting words in your mouth, Dean; I'm just trying to describe how it comes across to me as a reader.) This doesn't seem like a reasonable avenue for Forum commenters to be encouraged to go down.

On the meta level, while there are not that many requests quite as sweeping as this one, the calls for random transparency are constant. Look at almost any post made by an EA org and you will see people asking for everything from full financial statements, to written documentation of hiring processes, to extensive reports on all kinds of internal operations.

To put it another way, by all means ask for transparency when it matters... but stick to when it matters, please!

I think there is a bit of a tendency to assume that it is appropriate to ask for arbitrary amounts of transparency from EA orgs. I don't think this is a good norm: transparency has costs, often significant, and constantly asking for all kinds of information (often with a tone that suggests that it ought to be presented) is, I think, often harmful.

I wonder if we would benefit from something like a system that hides the karma (and treats it as zero for visibility purposes) of posts that have less than <some quantity> of engagement. That way posts would get a "grace period" before downvotes could hide them.
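For concreteness, here's a minimal sketch of the rule I have in mind; the engagement measure, field names, and threshold are all made up for illustration:

```python
def display_karma(karma: int, num_comments: int, num_views: int,
                  engagement_threshold: int = 25) -> int:
    """Karma used for visibility/sorting purposes.

    Posts below the engagement threshold get a "grace period":
    their karma is treated as zero (neither boosted nor buried)
    until enough people have engaged with them.
    """
    engagement = num_comments + num_views  # hypothetical engagement measure
    if engagement < engagement_threshold:
        return 0
    return karma
```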

> Again, I don’t think my picture here is a stretch from the normal English sense of the word “wholesomely”.

The more I read of these essays the less I agree with this. On my subjective authority as a native English speaker, your usage seems pretty far from the normal sense to me. I think what you're gesturing at is a reasonable concept but I think it's quite confusing to call it "wholesome". 

As some evidence, I kept finding myself having to reinterpret sentences to use your meaning rather than what I would consider the more normal meaning. For example, "What is wholesome depends on the whole system." This is IMO kind of nonsensical in normal English.

I don't think we would have been able to use the additional information we would have gained from delaying the Industrial Revolution, but if we could have, then I think the answer might be "yes". It's easy to see in hindsight that it went well overall, but that doesn't mean that the correct ex ante attitude shouldn't have been caution!

100% agree. I think it is almost always better to be honest, even if that makes you look weird. If you are worried about optics, "oh yeah, we say this to get people in but we don't really believe it" looks pretty bad.

I would qualify this statement by saying that it would be nice for OP to have more reasoning transparency, but it is not the most important thing and can be expensive to produce. So it would be quite reasonable for additional marginal transparency to not be the most valuable use of their staff time.

> • Some EAs knew about his relationship with Caroline, which would undermine the public story about FTX<->Alameda relations, but didn't disclose this.
> • Some EAs knew that Sam and FTX weren't behaving frugally, which would undermine his public image, but also didn't disclose.

FWIW, these examples feel hindsight-bias-y to me. They have the flavour of "we now know this information was significant, so of course at the time people should have known this and done something about it". If I put myself in the shoes of the "some EAs" in these examples, it's not clear to me that I would have acted differently and it's not clear what norm would suggest different action.

Suppose you are a random EA. Maybe you run an EA org. You have met Sam a few times, he seems fine. You hear that he is dating Caroline. You go "oh, that's disappointing, probably bad for the organization, but I guess we'll have to see what happens" and get on with your life.

It seems to me that you're suggesting this was negligent, but I'm not sure what norm we would like to enforce here. Always publish (on the forum?) negative information about people you are at all associated with, even if it seems like it might not matter? 

The case doesn't seem much stronger to me even if you're, say, on the FTX Foundation board. You hear something that sounds potentially bad, maybe you investigate a little, but it seems that you want a norm that there should be some kind of big public disclosure, and I'm not sure that really is something we could endorse in general.

To reuse your example, if you were the only person the perpetrator of the heist could con into lending their car to act as a getaway vehicle, then that would make P(Heist happens | Your actions) quite a bit higher than P(Heist happens | You acting differently), but you would still be primarily a mark or (minor) victim of the crime.

Yes, this is a good point. I notice that I don't in fact feel very moved by arguments that P(FTX exists | EA exists) is higher, I think for this reason. So perhaps I shouldn't have brought that argument up, since I don't think it's the crux (although I do think it's true, it's just over-determining the conclusion).

> Only ~10k/10B people are in EA, while they represent ~1/10 of history's worst frauds, giving a risk ratio of about 10^5:1, or 10^7:1, if you focus on an early cohort of EAs.

This seems wildly off to me - I think the strength of the conclusion here should make you doubt the reasoning!

I think the scale of the fraud is a random variable, largely uncorrelated with our behaviour as a community. It seems to me that the relevant outcome is "producing someone able and willing to run a company-level fraud"; given that, whether it's a big one or a small one just adds (an enormous amount of) noise.

How many people able and willing to run a company-level fraud does the world produce? I'm not sure, but I would say it has to be at least a dozen per year in finance alone, and more in crypto. So far EA has produced one. Is that above the base rate? Hard to say, especially if you control for the community's demographics (socioeconomic class, education, etc.).
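To make that base-rate comparison concrete, here's a back-of-the-envelope sketch; every input (community size and age, how many such fraudsters the world produces, the size of the relevant reference population) is an illustrative assumption, not an established figure:

```python
# Back-of-the-envelope fraud-rate comparison, per person-year.
# All inputs below are illustrative assumptions.

ea_people = 10_000                 # rough community size
ea_years = 10                      # rough age of the community
ea_frauds = 1                      # SBF

world_frauds_per_year = 20         # "at least a dozen in finance, more in crypto"
reference_population = 10_000_000  # assumed finance + crypto workforce

ea_rate = ea_frauds / (ea_people * ea_years)
world_rate = world_frauds_per_year / reference_population

print(f"EA rate:    {ea_rate:.1e} per person-year")
print(f"World rate: {world_rate:.1e} per person-year")
print(f"Ratio:      {ea_rate / world_rate:.0f}x")
# With these made-up inputs the ratio is ~5x, not 10^5x, and it is very
# sensitive to the reference population you choose, which is the point:
# the base-rate question is nowhere near as one-sided as the quoted figure.
```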
