Trust & Safety Lessons for National Security Innovation

Trust & Safety has long had important linkages to national security. Nation-states, terrorist groups, and criminal organizations use the internet in innumerable ways - and Trust & Safety teams work to disrupt their activities. To do so, those teams have sometimes looked to the traditional national security community for tips and processes. But the national security community learning from Trust & Safety? That’s less common. Nonetheless, there are key, general lessons - especially about how to build organizations that creatively adapt integrated human-technical systems in adversarial environments.

This article for the CTC Sentinel, a publication of the Combating Terrorism Center at West Point, explains some of those lessons from Facebook’s work against the Islamic State - with an emphasis on creating a culture of innovation and adversarial thinking within a larger bureaucracy.

The central argument is that innovation emerges best when dedicated teams are unshackled from bureaucracy and given responsibility for a critical and well-defined mission. At a high level, it’s a simple recipe, but it is difficult to execute in practice because it is risky. Experimentation entails the risk of failure, which potentially imperils the mission. As difficult as it is for companies managing Trust & Safety to accept that risk, it is even more difficult for national security organizations concerned about kinetic violence. Nonetheless, there are some factors that contribute to positive outcomes:

  • People. Trust & Safety capitalizes on people with a range of skills and experiences - and the government can learn from this. You have tech do-gooders, a diverse range of people with cultural, linguistic, and lived expertise in the harms that manifest online, and folks who enjoy chasing bad guys. Those differences in perspective are not always easy to square, but they are necessary. Tech companies chase talent across the globe; the government too often requires talent to come to it.
  • Organization. Innovative organizations that depend on non-innovative organizations for key resources will be far less effective. This seems obvious, but lots of “innovation centers” depend intrinsically on other bureaucracies for personnel and key resources (such as engineering talent). Truly innovative organizations should be cross-functional. Their outputs may not be as scalable, but that’s a tradeoff for another day.
  • Legitimacy. An innovative organization needs legitimacy, and much of that is inevitably bound up in its leader. This is particularly true because innovative groups often eschew process-heavy decision-making. That increases some types of risk and potentially exposes leaders to the consequences of failure. But productive failure is a hallmark of technical innovation, and government entities need to embrace a culture of risk and iteration to innovate quickly.
  • Tools. Good tools can facilitate innovation. Core tools should be dynamic and easily adaptable to different problems and circumstances. Innovators should expect obsolescence from their solutions - and so their core platform tools should be designed for extensibility, configuration, and modularity (see the sketch after this list). This is how we built Cinder, and the principles are relevant for innovative organizations writ large.
  • Collaboration. Geopolitical security and Trust & Safety require collaboration across institutions, which is far easier said than done. One core lesson I took from building the Global Internet Forum to Counter Terrorism (GIFCT) is that many tools built for one form of collaboration are applicable to others. Another is that coalition partners have very different intrinsic capabilities. Establishing shared capabilities requires ensuring each member of a coalition can access the shared resource.
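
To make the extensibility, configuration, and modularity point concrete, here is a minimal sketch, assuming a simple plugin-style registry in Python. The names (Item, DETECTORS, run_pipeline) and the keyword rule are illustrative assumptions, not Cinder’s actual design: detectors register themselves against a core loop that never changes, and configuration decides which checks run.

```python
# Hypothetical sketch of a modular, configurable review pipeline.
# All names and rules here are illustrative only.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Item:
    """A piece of content under review."""
    item_id: str
    text: str


# Registry of detectors: new checks plug in without touching the core loop.
DETECTORS: Dict[str, Callable[[Item], bool]] = {}


def detector(name: str):
    """Register a detection rule under a configurable name."""
    def wrap(fn: Callable[[Item], bool]):
        DETECTORS[name] = fn
        return fn
    return wrap


@detector("keyword_match")
def keyword_match(item: Item) -> bool:
    # Toy rule standing in for a real classifier or heuristic.
    return any(k in item.text.lower() for k in ("attack plan", "recruitment"))


def run_pipeline(items: List[Item], enabled: List[str]) -> Dict[str, List[str]]:
    """Run only the detectors named in the config; return flags per item."""
    flags: Dict[str, List[str]] = {}
    for item in items:
        hits = [name for name in enabled if DETECTORS[name](item)]
        if hits:
            flags[item.item_id] = hits
    return flags


if __name__ == "__main__":
    sample = [Item("1", "Recruitment video reposted"), Item("2", "Weather update")]
    print(run_pipeline(sample, enabled=["keyword_match"]))
```

The structural point is that when a detector becomes obsolete, it is retired or swapped by configuration rather than by rebuilding the platform - the kind of adaptability an adversarial environment demands.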

Trust & Safety foreshadows future geopolitical conflict in myriad ways that go beyond the scope of this post or the article it draws from. But it is worth noting that this basic observation is likely to be somewhat uncomfortable for both Trust & Safety professionals and traditional national security professionals. Many of the former come from communities skeptical of state power; many of the latter wrestle with what evolving conflict means for the United States’ traditional geopolitical advantages and for a warrior ethos that is both timeless and in a moment of very real change.

Brian Fishman Keynote to Terrorism and Social Media Conference (June 17, 2024)

This is the text of a keynote presentation given at the Terrorism and Social Media (TASM) Conference at Swansea University on June 17, 2024. It has been lightly edited for clarity, contains fewer images than the live presentation, and includes several hyperlink references that were not applicable for a live audience.

Countering Terrorism on Digital Platforms

Digital communications platforms are part of the geopolitical battlespace - and that reality will test platforms in new ways. No company will balance all the considerations perfectly – that cannot and should not be our expectation – but those companies that prepared for these moments will manage better than those that did not. 

We found North Korean engineers in our application pile. Here’s what our ex-CIA co-founders did about it.

Cinder is part of a growing list of US-based tech companies that encounter engineering applicants who are actually suspected North Korean nationals. Here's what we're doing about it.