Comment author: casebash 28 July 2018 02:11:20PM 0 points [-]

How anonymous would you want this to be? Like would the mods still know who posted it?

Comment author: SiebeRozendal 07 August 2018 09:09:41AM 0 points [-]

I think the mods might know, as long as we don't have too many mods. I have no strong opinion either way.

Comment author: RandomEA 28 July 2018 06:42:18PM 3 points [-]

(I'm not the OP.) How about nobody knows who submits it but the comment only appears if the mods approve it? And to deter rule-violating comments, maybe only people with a certain level of karma should be allowed to submit anonymous comments and they should lose a certain amount of karma if the comment is rejected (though we'd have to figure out how to hide that loss from the moderators)?

Comment author: SiebeRozendal 07 August 2018 09:08:52AM 0 points [-]

I would first opt for the low-effort approach: allow anonymous commenting and only have mods step in when it's reported. If this doesn't work, then you could have all anonymous comments moderated.

I think having a certain level of karma (not high) is a good addition.

Comment author: Gregory_Lewis 23 July 2018 03:45:00PM 3 points [-]

Relatedly, some comments could be marked as "only readable by the author", because it's a remark about sensitive information. For example, feedback on someone's writing style or a warning about information hazards when the warning itself is also an information hazard. A risk of this feature is that it will be overused, which reduces how much information is spread to all the readers.

Forgive me if I'm being slow, but wouldn't private messages (already in the LW2 codebase) accomplish this?

Comment author: SiebeRozendal 07 August 2018 09:04:54AM 0 points [-]

Yes, you're right; I hadn't thought of that. I still think private commenting has a slight benefit because it lowers the barrier (I frequently comment on posts, but wouldn't send someone a private message). However, I don't think the benefit is big enough to be worth the effort.

Comment author: SiebeRozendal 23 July 2018 12:51:03PM *  4 points [-]

Speculative feature request: anonymous commenting and private commenting

Sometimes people might want to comment anonymously because they want to say something that could hurt their reputation or relationships, or affect the response to the criticism in an undesirable way. For example, OpenPhil staff criticising a CEA or 80K post would face awkward dynamics because OpenPhil partly funds these organizations. Having an option to comment anonymously (with names remaining the default) would allow freer speech.

Relatedly, some comments could be marked as "only readable by the author", because it's a remark about sensitive information. For example, feedback on someone's writing style or a warning about information hazards when the warning itself is also an information hazard. A risk of this feature is that it will be overused, which reduces how much information is spread to all the readers.

Meta: not sure if this thread is the best place for these feature requests, but I don't know where else to put them :)

Comment author: MichaelDickens  (EA Profile) 23 July 2018 04:57:43AM 7 points [-]

I'm concerned with the plans to make voting/karma more significant; I would prefer to make them less significant than the status quo rather than more. Voting allows everyone's biases to influence discussion in bad ways. For example, people's votes tend to favor:

  1. things they agree with over things they disagree with, which makes it harder to voice dissenting opinions
  2. entertaining content over important but less-entertaining content
  3. agreeable content without much substance over niche or disagreeable content with lots of substance
  4. posts that raise easy questions and give strong answers over posts that raise hard questions and give weak answers

Sorting the front page by votes, and giving high-karma users more voting power, only does more to incentivize bad habits. I think the current voting system is more suited to something like reddit which is meant for entertainment, so it's reasonable for the most popular posts to appear first. If the idea is to have "all of EA’s top researchers posting and commenting regularly", I don't think votes should be such a strong driver of the UX.

About a year ago I essentially stopped making top-level posts on the EA Forum because the voting system bothers me too much, and the proposed change sounds even worse. Maybe I'm an outlier, but I'd prefer a system that more closely resembled a traditional forum without voting where all posts have equal status. That's probably not optimal and it has its own problems (the most obvious being that low-quality content doesn't get filtered out), but I'd prefer it to the current or proposed system.

Comment author: SiebeRozendal 23 July 2018 12:42:07PM 1 point [-]

I just replied to SamDeere's comment above suggesting multiple types of votes: one indicating agreement and one indicating "helpfulness". You could sort by either, but the forum would be sorted by "helpfulness" by default. Do you think this would fix some of your issues with a voting system?

Comment author: SamDeere 21 July 2018 01:32:16AM 5 points [-]

Thanks for the comments on this Marcus (+ Kyle and others elsewhere).

I certainly appreciate the concern, but I think it's worth noting that any feedback effects are likely to be minor.

As Larks notes elsewhere, the scoring is quasi-logarithmic — to gain one extra point of voting power (i.e. to have your vote be able to count against that of a single extra brand-new user) is exponentially harder each time.

Assuming that it's twice as hard to get from one 'level' to the next (meaning that each 'level' has half the number of users than the preceding one), the average 'voting power' across the whole of the forum is only 2 votes. Even if you make the assumption that people at the top of the distribution are proportionally more active on the forum (i.e. a person with 500,000 karma is 16 times as active as a new user), the average voting power is still only ≈3 votes.

Given a random distribution of viewpoints, this means that it would take the forum's current highest-karma users (≈5,000 karma) 30-50 times as much engagement in the forum to get from their current position to the maximum level. Given that those current karma levels have been accrued over a period of several years, this would entail an extreme step-change in the way people use the forum.

(Obviously this toy model makes some simplifying assumptions, but these shouldn't change the underlying point, which is that logarithmic growth is slooooooow, and that the difference between a logarithmically-weighted system and the counterfactual 1-point system is minor.)
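The toy model above can be sketched in a few lines. This is a minimal illustration under the stated assumptions (each karma "level" has half the users of the one below it, and level k grants a voting power of k + 1); the 20-level cap and the way activity is interpolated up to the 16x figure are my own simplifications, so the activity-weighted number comes out slightly lower than the ≈3 quoted above.

```python
# Toy model of quasi-logarithmic karma weighting: level k holds half as
# many users as level k-1, and grants a voting power of k+1 points.
# Assumed parameters (not from the comment): 20 levels, and activity
# interpolated geometrically from 1x (new user) up to top_activity.

def average_voting_power(levels=20, top_activity=1.0):
    """Mean vote weight across the forum, optionally treating users at
    higher levels as proportionally more active."""
    total_weight = 0.0
    total_power = 0.0
    for k in range(levels):
        population = 2.0 ** -(k + 1)              # half as many users per level
        activity = top_activity ** (k / (levels - 1))
        w = population * activity                 # share of all votes cast
        total_weight += w
        total_power += w * (k + 1)                # voting power at level k
    return total_power / total_weight

print(average_voting_power())                     # ~2 votes on average
print(average_voting_power(top_activity=16))      # still under ~2.5 votes
```

Even with the most active users casting 16 times as many votes, the average weight barely moves, which is the "light thumb on the scale" point made below.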

This means that the extra voting power is a fairly light thumb on the scale. It means that community members who have earned a reputation for consistently providing thoughtful, interesting content can have a slightly greater chance of influencing the ordering of top posts. But the effect is going to be swamped if only a few newer users disagree with that perspective.

The emphasis on "can" in the preceding paragraph is because people shouldn't use strong upvotes as their default voting mechanism; the variance from normal upvotes will be even lower. However, if we thought this system was truly open to abuse, a very simple mitigation would be to limit the number of strong upvotes a user can make in a given period of time.

There's an intersection here with the community norms we uphold. The EA Forum isn't supposed to be a place where you unreflectively pursue your viewpoint, or about 'winning' a debate; it's a place to learn, coordinate, exchange ideas, and change your mind about things. To that end, we should be clear that upvotes aren't meant to signal simple agreement with a viewpoint. I'd expect people to upvote things they disagree with but which are thoughtful and interesting etc. I don't think for a second that there won't be some bias towards just upvoting people who agree with you, but I'm hoping that as a community we can ensure that other things will be more influential, like thoughtfulness, usefulness, reasonableness etc.

Finally, I'd also say that the karma system is just one part of the way that posts are made visible. If a particular minority view is underrepresented, but someone writes a thoughtful post in favour of that view, then the moderation team can always promote it to the front page. Whether this seems good to you obviously depends on your faith in the moderation team, but again, given that our community is built on notions like viewpoint diversity and epistemic humility, then the mods should be upholding these norms too.

Comment author: SiebeRozendal 23 July 2018 12:40:15PM 4 points [-]

Speculative feature request: side votes

The problem now is that some people use upvotes to indicate agreement, while others use them to indicate helpfulness (and many, I suspect, use them interchangeably). Having two types of votes clearly separates these two signals. A vote to the right would mean agree; a vote to the left would mean disagree. It doesn't necessarily need to be a side vote (another symbol might be better), but it's the idea of two types of votes that counts.

Downside: people are unfamiliar with it, and it may be complex to implement. It also further complicates the dynamics of upvoting that others have mentioned in this comment section. However, I think it's fairly straightforward, and people will pick it up easily. Because it won't be confused with other systems (I don't know of other forums with multiple types of votes), people will readily read the mouse-over text to find out what the votes mean.
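A minimal sketch of the two-axis idea (all names here are illustrative, not an existing forum API): each comment tracks agreement and helpfulness tallies separately, and the default sort order consults only helpfulness, so a disagreed-with but substantive comment can still surface.

```python
# Two-axis vote model: "agreement" and "helpfulness" are recorded
# independently, and ranking uses helpfulness alone by default.
from dataclasses import dataclass

@dataclass
class Comment:
    text: str
    agreement: int = 0    # net agree/disagree votes
    helpfulness: int = 0  # net helpful/unhelpful votes

def sort_by_helpfulness(comments):
    # Default forum ordering: most helpful first, agreement ignored.
    return sorted(comments, key=lambda c: c.helpfulness, reverse=True)

comments = [
    Comment("niche but substantive point", agreement=-3, helpfulness=10),
    Comment("popular but shallow take", agreement=12, helpfulness=2),
]
print(sort_by_helpfulness(comments)[0].text)
# The disagreed-with but helpful comment ranks first.
```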

Comment author: SiebeRozendal 23 July 2018 12:31:04PM *  1 point [-]

Feature request: show reading times

It would be useful to show approximate reading times for posts, so readers can decide whether to commit to a long article. This saves EAs' valuable time and improves engagement with the post.

Comment author: saulius  (EA Profile) 13 June 2018 08:03:27PM *  1 point [-]

I don't know; we simply didn't talk about that at all. My guess is that 4 days is not too long. EA Globals sometimes last 3 days if you include the social on Friday. I believe a recent group organisers' retreat lasted an entire week, and an AI camp lasted 10 days. These latter two events are not quite the same, but you could ask Remmelt Ellen whether they felt too long; I believe he was present at both. Hmm, the fact that your event is during winter could matter a bit, though, because going outside is usually a refreshing change of atmosphere during such things.

By the way, this was not a retreat, we did it in an office in London and people slept elsewhere.

Comment author: SiebeRozendal 17 June 2018 01:20:20PM 0 points [-]

Alright thanks! :) Remmelt is also organizing this retreat, so we have that info!

Comment author: SiebeRozendal 13 June 2018 01:28:44PM 0 points [-]

What did you think about the length of the retreat? Would people have liked to stay longer? We're planning to organize a retreat from Thursday to Sunday in The Netherlands in between Christmas and New Year's.

Comment author: SiebeRozendal 07 June 2018 10:10:53AM 0 points [-]

I really admire that you did a study on this, but I think the study shows much less than you claim it does. First of all, you studied support for effective giving (EG), which is different from effective altruism as a whole. I would expect at least the following three factors to differ between EG and EA:

  • Support for cause impartiality, both moral impartiality (measuring each being according to innate characteristics like sentience or intelligence, rather than personal closeness) and means impartiality (being indifferent between different means to an end, e.g. donating money or choosing a career with direct impact)
  • Dedication. I believe that making career changes or pledging at least 10% of your income is quite a high bar, and far fewer people would be inclined to do that.
  • Involvement in the community. As you wrote, the community is quite idiosyncratic. Openness to (some of) its ideas does not imply people will like the movement.

Of course, none of this implies that the study is worthless, that getting people to donate their 1 or 2% more effectively is useless, or that we shouldn't try to make the movement more diverse and welcoming (if this can be done without compromising core values such as epistemic rigour). I think there is a debate to be had about how to differentiate effective giving from EA as a whole, so that we can decide whether to promote effective giving separately and, if so, how.
