Covert influence operations, including disinformation operations, seek to influence public opinion and sow division. Using false U.S. personas, adversaries could covertly create and operate social media pages and other forums designed to attract U.S. audiences and spread disinformation or divisive messages. This could happen in isolation or in combination with other operations, and could be intended to foster specific narratives that advance foreign political objectives, or simply to turn citizens against each other. These messages need not relate directly to political campaigns. They could seek to depress voter turnout among particular groups, encourage third-party voting, or convince the public of widespread voter fraud to undermine confidence in election results. These messages could target discrete U.S. populations based on their political and demographic characteristics. They may mobilize Americans to sign online petitions and join issue-related rallies and protests, or even incite violence. For example, advertisements from at least 2015 to 2017 linked to a Russian organization called the Internet Research Agency focused on divisive issues, including illegal immigration and gun rights, among others, and targeted those messages to the groups most likely to react.
Thank you for the invitation to speak today, and for the important work you are doing: both in organizing this conference devoted to the challenges of misinformation and, by attending, in bringing your experience and expertise to bear on the problem. It’s a privilege to help kick off this first day of MisinfoCon, focused on state-sponsored misinformation. To do that, I am going to give you an overview of how the Department of Justice views the problem, where it fits in the context of related national security threats, and how we are addressing it.
As you probably know, the Justice Department recently obtained an indictment of 13 Russian individuals and three entities, including the Internet Research Agency (or IRA), for federal crimes in connection with an effort to interfere in the 2016 Presidential election. The defendants allegedly conducted what they called “information warfare against the United States,” with the stated goal of “spread[ing] distrust towards the candidates and the political system in general.”
According to the indictment, the IRA was a structured organization headed by a management group and arranged in departments. It had a “translator project,” designed to focus on the U.S. population, with more than 80 employees assigned by July 2016. They posed as politically and socially active Americans, advocating for and against particular political candidates. They established social media pages and groups to communicate with unwitting Americans. They also purchased political advertisements on social media.
One of the so-called trolls who worked for the IRA recently spoke to the Washington Post about his work in a different department, attempting to influence a domestic, Russian audience. He described it as “a place where you have to write that white is black and black is white.” Hundreds of people “were all writing absolute untruths.”
But as the indictment alleges, what made the defendants’ conduct illegal in the United States was not the substance of their message or the “accuracy” of their opinions: it was their conspiracy to defraud by, among other ways, lying about who the messenger was. They were not Americans expressing their own viewpoints; they were Russians on the payroll of a foreign company.
Now, the problem of covert foreign influence is not new. In 1938, a congressional committee found that the Nazi government had established an extensive, underground propaganda apparatus inside the United States using American firms and citizens. The response was to recommend a law that would (in the committee’s words) throw these activities under the “spotlight of pitiless publicity.” The result is the Foreign Agents Registration Act (FARA), a disclosure statute that, notably, does not prohibit speech. Rather, FARA requires agents of foreign principals who engage in political activities within the United States to file periodic public disclosures with the Department.
The Act’s purpose is to ensure that the American public and our lawmakers know the source of information provided at the behest of a foreign principal, enhancing the public’s and the government’s ability to evaluate such information.
Transparency, not prohibition, has been the government’s response to misinformation. In the 1980s, the government established an interagency committee, the “Active Measures Working Group,” to counter Soviet disinformation. It did so by exposing forgeries and other propaganda, such as fake stories that the Pentagon developed the AIDS virus as part of a biological weapons research program.
Today, we confront misinformation as only one component of a broader, malign foreign influence effort. As this framework from the Department’s recent Cyber-Digital Task Force report shows, those efforts can also include cyber operations that target election infrastructure or political parties’ networks; covert efforts to assist (or harm) candidates; and overt efforts to influence the American public (for example, through state-run media organizations).
Our responses to those efforts must likewise be multifaceted, from providing indicators and warnings that can help network owners protect themselves from hackers, to criminal investigations and prosecutions, and other measures, like sanctions and expulsions that raise the costs on the states that sponsor such malign activities.
This graphic, also from the Task Force report, depicts the Department’s strategy to counter each phase of a covert influence campaign cycle, from the identification of targets to the production and amplification of content. The middle rows (in red) depict our adversaries’ activities in stages, while the bottom rows (in blue) suggest the means by which private actors and the government can disrupt and deter the activity.
One aspect of this strategy worth highlighting is that the content of a foreign influence campaign may be true or false. Whether the message is accurate or not may not be the point: doxing a candidate or a corporation for political reasons might not involve misinformation, but it may nonetheless violate our laws, threaten our values and way of life, compromise privacy and, sometimes, retaliate against and chill free speech.
Covert foreign influence efforts can take many forms, but recently we have seen increased efforts to influence Americans through social media. To counter these efforts, a key component of our approach is sharing information with social media and other Internet service providers, which we do through the FBI’s Foreign Influence Task Force. It is those providers who bear the primary responsibility for securing their own products and platforms. By sharing information with them, especially about who certain users and account holders actually are, we can assist their own, voluntary initiatives to track foreign influence activity and to enforce their own terms of service.
As the Task Force report also recognizes, there may be circumstances when it is appropriate for the government itself to expose and attribute foreign influence operations as a means of rendering them less effective. But there are often compelling, countervailing considerations.
As a general rule, the Department does not confirm, deny, or comment on pending investigations, both to protect the investigation itself as well as the rights of any accused.
We are also constrained to protect the classified sources and methods that may inform our judgment of what foreign governments are doing.
And, most important of all, we must never act to confer any advantage or disadvantage on any political or social group, individual, or organization, and we must strive to avoid even the appearance of partiality. That could constrain the timing and nature of any disclosure we might make.
All of this is to say, and as the Department’s Policy on the Disclosure of Foreign Influence Operations recognizes, we might not be the best messenger to counter a particular piece of misinformation.
That’s why this conference is so important: what we call the private sector (but which includes a lot of people in public spaces, just like you) has a critical role – larger than the federal government’s – in countering covert foreign influence efforts, particularly misinformation, and ensuring that our democracy rests on the active engagement of an informed public.
The former Russian troll I mentioned at the beginning of my remarks, who worked for the IRA, said his work was “pointless” for Russian audiences, that it would not impact them. But in America, that kind of trickery might have an impact, he said, because we “live in a society in which it’s accepted to answer for your words.” My challenge to us during this conference, if I may make one, is that we find ways to ensure we all continue to answer for our words, so that the trust we enjoy as an aspect of our free, democratic society can thrive.