I was on Twitter’s Trust and Safety Council. Dissolving it was a bad idea.

  • Eirliani Abdul Rahman is a former member of Twitter’s Trust and Safety Council.
  • Abdul Rahman and two other members resigned earlier this month, and Twitter dissolved the council a few days later.
  • “I don’t know what the future of Twitter looks like, but I don’t think it bodes very well if Musk continues to operate it the way he’s doing it right now,” she said.

This essay is based on a conversation with Eirliani Abdul Rahman and has been edited for length and clarity. Abdul Rahman is the co-founder of Youth, Adult Survivors & Kin In Need (YAKIN) and a former member of Twitter’s Trust and Safety Council. Earlier this month, she resigned from the council, which Twitter later dissolved via email less than an hour before the group’s scheduled meeting.

I loved Twitter as a platform because I felt it was so unique and even today I don’t think there is anything like it. You can tweet anyone. You can talk to anyone. It doesn’t matter how many followers you have. And it is the most democratic platform in that way.

I felt that we at the council were not being listened to. There had been no contact with us since the Musk acquisition. I was quite surprised by the dissolution, given the timing and the manner in which it happened. There should have been a discussion between the new management and the council members before such a decision was made. The way it was done left a lot to be desired.

I like the idea of a public town square that Elon Musk has talked about, but content moderation is very complicated. And I think we can now see the friction between what Musk thinks he can do and the actual reality of content moderation.

I am of the opinion that the platform is no longer safe. The numbers from groups like the Center for Countering Digital Hate and the Anti-Defamation League, showing an increase in hate speech, are truly cause for concern.

It’s not the Twitter I wrote on. You can see that it is very different now and represents a different way of looking at things. How will content be moderated? Will it be completely automated? What about trusted partners on the ground who can help with the nuances and contextualization necessary for this work?

Content moderation is complicated: you can’t fire the people who work on it and just automate it. The platform needs to be able to fix things quickly, and not just react but be proactive, reaching out to trusted partners. All of this is a big part of content moderation; it’s not a simple, static process.

I really admire Musk because of what he’s done with SpaceX and Tesla, but I think the way he’s infusing some of these operating principles into Twitter doesn’t quite work. I think it’s troubling that he’s making major decisions by taking polls and tweeting, “Should I do this?”

As for Twitter Blue, I understand the need to monetize, but Musk’s approach means that people who can afford to pay $8 a month for verification will have a better Twitter experience than those who can’t, and then it’s not really a democratic platform anymore. And you shouldn’t be able to buy credibility in the first place.

Looking back, I am very proud of the work we did on the council. I feel like there’s an exodus from Twitter happening now and I’m sad about it. There is no equivalent to Twitter for its democratic value and what you can do with it.

We’ll see what happens with the new changes. I don’t know what the future of Twitter looks like, but I don’t think it bodes very well if Musk keeps running it the way he’s doing it right now.
