But Signal’s rapid growth has also been a cause for concern. In the months leading up to and following the 2020 US presidential election, Signal employees raised questions about the development and addition of new features that they fear will lead the platform to be used in dangerous and even harmful ways. But those warnings have largely gone unheeded, they told me, as the company has pursued a goal to hit 100 million active users and generate enough donations to secure Signal’s long-term future.
Employees worry that, should Signal fail to build policies and enforcement mechanisms to identify and remove bad actors, the fallout could bring more negative attention to encryption technologies from regulators at a time when their existence is threatened around the world.
“The world needs products like Signal — but they also need Signal to be thoughtful,” said Gregg Bernstein, a former user researcher who left the organization this month over his concerns. “It’s not only that Signal doesn’t have these policies in place. But they’ve been resistant to even considering what a policy might look like.”
If your platform has more than a few hundred users, it is unethical and immoral to take a hands-off approach to content moderation.
Signal seems to be making all the wrong decisions here, but it's no surprise. Like every other tech platform, it needs exponential user growth to satisfy investors (yes, I know Signal is funded by a non-profit foundation, but it has a $50 million loan to deal with). Sadly for Signal, exponential user growth, proper content moderation and community management, and low overhead are not compatible.