“Digital Authoritarianism: A Growing Threat” at the Carnegie Endowment for International Peace

 

Carnegie Endowment for International Peace Event

Director of National Intelligence Avril Haines

April 24, 2023

 

On April 24, Director of National Intelligence Avril Haines delivered remarks at the Carnegie Endowment for International Peace on “Digital Authoritarianism: A Growing Threat.” Following the remarks, Carnegie Senior Vice President for Policy Research and Director of the Europe Program Dan Baer moderated a fireside chat with the DNI. The full event is available to view here, and the transcript is below.

 

 

MR. BAER: Good afternoon everyone, and welcome to the Carnegie Endowment for International Peace. My name is Dan Baer, and I'm the Senior Vice President for Policy and Research here. It's a pleasure to have you all here. Today we welcome you to a particularly special event. The Director of National Intelligence is not one of the cabinet officials who most often speaks publicly, and so we're very excited to have Avril Haines with us today.

 

For those of you who have not met Avril before, she has had a distinguished career in the U.S. Government across several administrations. She joined President Biden's cabinet as the seventh confirmed Director of National Intelligence, having previously served as the Principal Deputy National Security Advisor, as the Deputy Director of the CIA, as a lawyer in the State Department, and in many other roles over her career, including independent bookstore owner and Senior Fellow at the Applied Physics Lab.

 

So, she's had a storied career, and as somebody who has had the pleasure of working with her, I can tell you that she is as kind as she is smart, as gracious as she is committed to principles, and it is a real joy to work with her. And Avril, you have the stage.

 

 

DIRECTOR HAINES: Thank you so much, Dan, for that ridiculously kind introduction — it is such a joy to get to see you and to know that you are part of leading this wonderful institution.

 

I also want to thank President Tino Cuellar for inviting me to speak today and to extend my gratitude to the wonderful experts who reside at the Carnegie Endowment for International Peace and whose work has — for years — been enriching our analysis and thinking in the Intelligence Community on the many challenging issues that we face as a country.

 

Speaking as a civil servant in government who has had the chance to see first-hand how important the intellectual exchange is that occurs between government and entities like Carnegie Endowment, I cannot tell you how grateful I am for your work.

 

Many of you challenge our thinking and work to ensure that we are focused on what matters and not missing the broader strategic picture as we manage urgent crises. And many of you have been or will come through government, having had a chance to think things through from the outside, without the standard constraints to which we are subject, to make us better.

 

The Intelligence Community, in particular, benefits from such collaboration because we are trying to understand and reflect on the world around us, and in doing so, we look to bring rigor, expertise, and evidence to bear on our analysis, as all of you do. We recognize, however, that the classified nature and pressure of our work can make us susceptible to insular thinking and cognitive biases. And, as such, our interactions with those outside the community who can test our hypotheses and better inform our work are crucially important.

 

So this leads perfectly into our topic for today — digital repression — an issue for which it is critical to have a wide range of perspectives both in and out of government not only for purposes of understanding the landscape but also to address the problem.

 

And people like Dan and Steve, who literally wrote the book on digital repression, all of you and so many others at Carnegie and other organizations, such as Freedom House, and even in the private sector, have contributed to our thinking on these questions — each of you with different expertise, experiences, information, and perspectives that are fundamental to understanding the landscape.

 

And you all have helped us to focus in on an aspect of a problem that does not, at least in my view, get enough attention. Specifically, the degree to which new technologies, institutional, legal, and organizational approaches to digital repression being deployed by authoritarian governments and exported to other countries are advancing authoritarianism and undermining democratic governance globally.

 

And I realize that for some it may seem strange to have a leader in the Intelligence Community discuss this topic but I actually think it is crucial that we lend our voice, our analysis, and perspective to this issue — making clear the distinction between the work of an intelligence service in a democracy versus an authoritarian state.

 

And while intelligence services in authoritarian countries are often used as tools of the state to enhance digital repression under the direction of their rulers, in democracies they are subject to democratically passed laws, have internal and external safeguards in place, and ultimately are held accountable to oversight mechanisms that work to ensure we remain legal, ethical, and focused on providing the best intelligence to help decision-makers in our government make better national security and foreign policy decisions while being protective of people’s civil liberties and privacy.

 

And, in my view, the Intelligence Community is a critical ally in the fight against authoritarianism and should contribute to the promotion of norms that help to protect against the primary tools of digital authoritarianism, which are censorship, misinformation and disinformation, mass surveillance, and invasive spyware used to suppress public debate.

 

We have a unique perspective, as well as an understanding of state actors and even security services, that can enhance our capacity to reveal structural norms or approaches that might support the resilience of open information environments in democratic societies.

 

But let me back up, and explain why we are so focused on this issue, which I see as a critical threat to our national security. As the President often says, the struggle to bolster democratic governance at home and abroad is the defining challenge of our time, as it remains the best way to realize lasting peace, prosperity, and human dignity.

 

And the President, the Congress, and democratic leaders around the world are pursuing efforts to strengthen democratic resilience through such means as funding to support the rule of law, human rights, good governance, civil society, pluralistic political parties, independent media, and free and fair elections.

 

Such efforts are also — rightly in my view — beginning to include democratic assistance that promotes not only political freedoms but also efforts to counter the indignity of corruption, inequality, and a lack of economic opportunity, which, as Administrator Power and Secretary Yellen recently noted, are also fundamental to strengthening democracy.

 

Perhaps one of the most challenging aspects of the problem, however, is the contest over information, which is defined by the increasing use of digital technology to promote authoritarianism.

 

Today’s digital technologies have profoundly shaped access to information. Initially, such technologies were instrumental in facilitating civil society and freedom of the press in many places, and yet they sparked a backlash from authoritarian regimes, first to contain the risks posed by freer flows of information and then to harness these same technologies in pursuit of broader objectives, including to stifle freedom of expression and to suppress political discourse.

 

And today, we assess that foreign governments are increasingly using digital information and communication technologies to monitor and suppress political debate domestically, as well as in their expat and diaspora communities abroad.

 

And as these technologies, capabilities, policies, and mechanisms are exported and implemented in various countries or territories, they make it that much harder to bolster democratic governance and easier for authoritarians to prevail.

 

Moreover, the use of these technologies and methods to monitor and limit dissent is on a trajectory to become even more pervasive, targeted, and complex in the next few years, further constraining freedoms globally. For example, generative Artificial Intelligence will only increase the sophistication with which such regimes can deploy such tools, making them that much more difficult to counter.

 

So what I want to do is talk about the Chinese, Russian, and Iranian models, describe certain structural aspects of these models and methods that are being exported, and talk about how we might approach creating greater resilience in societies, so as to mitigate the impact of these models and methods with the ultimate objective of really promoting greater resilience in democratic governance.

 

And, to that end, it probably doesn’t come as a surprise to any of you that authoritarian regimes are the biggest drivers in advancing their control mechanisms. Authoritarian leaders often fear how open debate of political or social topics could jeopardize their hold on power.

 

These leaders’ worldviews are often clouded by paranoia and an overarching concern with regime preservation, internal control, and stability.

 

And the history of this trend dates to when some authoritarian regimes grew concerned about the implications of the internet early in its usage, and then the Arab Spring uprisings served as a turning point, when authoritarian governments came to recognize that their publics’ digital connectivity posed an existential threat to their grip on power.

 

Autocrats’ beliefs that Western governments, particularly the United States, have been using the Internet’s influence to undermine their regimes’ stability, alongside the increasing number of public protests around the world, have exacerbated these fears.

 

And the People’s Republic of China or “PRC” is the global leader in digital repression. In fact, for the 8th year in a row, Freedom House identified China as the country providing the least Internet freedom.

 

Compared to Russia, Beijing is better at censoring digital information and surveilling the population, in part because it prioritized digital controls before Moscow did.

 

And the PRC seeks to preempt challenges to its rule by demonstrating its responsiveness, eliminating dissent, and remolding society to achieve China’s “national rejuvenation.” In many ways, the PRC’s extraordinary use of digital repression tools is presented as a feature, rather than a bug.

 

Beijing uses digital repression techniques to control the flow of information, to downplay and disrupt its citizens’ access to information highlighting domestic shortcomings, and to try to reinforce the PRC’s legitimacy as well as its all-encompassing reach. And China’s willingness to share its know-how and export its technology far beyond its borders is a key enabler of transnational repression, even as it makes it easier for other governments to engage in digital repression within their own borders.

 

Furthermore, China’s smart cities use surveillance technology to combine the provision of basic public goods, such as traffic safety, with the projection of authoritarian control.

 

The PRC is furthermore the world’s leading perpetrator of transnational repression, often through digital means.

 

News reports of Chinese transnational repression have become all too common, much of it under the auspices of the PRC’s Operation Fox Hunt — a purported Chinese global anti-corruption effort.

 

And in fact, last week, the US Justice Department charged PRC police officers for wielding thousands of fake social media accounts to spread Chinese propaganda and harass dissidents living in the United States.

 

And next is Russia, which is also a leading perpetrator of digital repression but it takes a different approach than China, relying heavily on complex legal and institutional structures that promote its control over information and, as you undoubtedly have noticed, Russia is particularly active in spreading disinformation.

 

Notably, since the start of Putin’s unjust and illegal invasion of Ukraine, Moscow has employed a full spectrum of malign influence activities to defend its actions, seed doubt with respect to news about what is happening on the battlefield that undercuts Russia’s narrative, and amplify false, misleading, or unsubstantiated narratives to undercut Kyiv and the West.

 

The Russian people are subjected to the sixth least free Internet environment in the world, and Internet freedom reached an all-time low in Russia last year in conjunction with Russia’s invasion of Ukraine, as the Kremlin blocked social media sites and more than 5,000 websites in Russia and introduced a law prescribing up to 15 years in prison for anyone who spread “false information” about the conflict.

 

Moscow’s efforts to force foreign IT giants to abide by its content moderation requirements increased in 2022 as Google, Meta, other firms received fines worth hundreds of millions of dollars for not filtering “banned content.”

 

And this increased targeting of non-Russian platforms, has resulted in increased self-censorship, and in some cases, a full departure from the Russian media space.

 

Efforts to remove non-Russian media have been accompanied by the expansion of Kremlin-aligned media corporations, including the acquisition by social media giant VKontakte (which runs Russia’s most popular social network and is effectively state-controlled) of major Internet and technology firm Yandex, which is, among other things, a news aggregator, giving the Kremlin greater control over the content that Russian citizens encounter online.

 

And the Russian government paired this action with policies that pushed out independent media and social media platforms for fear of prosecution or employee safety concerns. In January 2022, Moscow began requiring foreign IT and media companies with more than 500,000 daily Russian users to maintain staff in local offices and expected the companies to restrict information that violated Russian laws.

 

The Russian government then encouraged Russian companies to create indigenous platforms to replace foreign social media companies, allowing for increased control over content.

 

Russia also undermined internet freedom in Ukraine; the Russian military in 2022 subjected Ukrainian cities and oblasts to 22 internet shutdowns through a combination of cyberattacks, targeted air strikes, and deliberate dismantling of telecommunications infrastructure.

 

Last September, public researchers discovered a Russia-based influence operation in Ukraine that managed more than 60 websites impersonating news organizations and had accounts on major U.S. social media platforms.

 

Finally, Iran for several years has been imposing increasingly sophisticated internet shutdowns to stop protests. This was particularly evident in the monthslong protests across the country following the death last September of 22-year-old Mahsa Amini in police custody, and brought Iran to an unprecedented 18 shutdowns in 2022.

 

And Tehran has also shown a willingness and capability to extend its repressive activity outside its borders, as evidenced by the U.S. Department of Justice’s recent charges against Eastern European criminal organization members hired by the Iranian Government to surveil and murder a human rights activist who has criticized the country’s treatment of women.

 

And I’ve gone through briefly the approaches taken in these three countries to achieve digital repression both in and outside of their borders to give you a sense of the different models, but, as I noted at the outset, authoritarian regimes are not the only governments conducting digital repression — and this is particularly concerning, as it is the key battleground for the competition between democracies and authoritarians.

 

We are seeing more and more instances of other countries engaging in digital repression and their adoption of these approaches is in turn contributing to further democratic erosion.

 

For example, we see other countries increasingly using the tactic of internet shutdowns. In fact, last year, governments and other actors shut down the internet at least 187 times in 35 countries, which was a new record.

 

Shutdowns were imposed during protests, active conflicts, school exams, elections, periods of political instability, or high-profile events, such as religious holidays or visits by government officials, in many cases with the goal of suppressing dissent and silencing voices.

 

We also saw a record number of governments block websites with nonviolent political, social, or religious content, undermining users’ rights to free expression and access to information.

 

And the use of commercial spyware is also on the rise, which journalists in the past year have estimated to be a $12 billion business.

 

While some states use such spyware tools and lawful intercept programs for legitimate purposes such as to target criminals or terrorists, governments are increasingly using spyware, along with legislative efforts that provide a basis for doing so, to target political opponents and critics.

 

And, furthermore, a growing number of internet users around the globe only have access to an online space that mirrors the views of their government and its interests — but this phenomenon is not restricted to China, Russia, and Iran. Authorities in 47 of the 70 countries covered by a recent research study limited users’ access to information sources located outside of their borders.

 

We also see that, even where China is not intentionally exporting its approach for purposes of extending its digital repression efforts, it is making it easier for others to engage in such activity, and it ultimately may use its access to further its own efforts at transnational repression.

 

For example, the Internet of Things is projected to reach 64 billion devices by 2025 and possibly trillions by 2040 — all potentially monitored by various governments. This growth is connected to smart city initiatives, which are using emerging technologies to improve the ability of city leaders to leverage public resources to boost the overall quality of life and while at the same time creating vast amounts of data.

 

China also has a comparative advantage in the global export of facial recognition AI. Autocracies and weak democracies are more likely to acquire this technology from China than from other countries, and they are more likely to import this technology from China when they are experiencing periods of political unrest.

 

In short, these technologies offer new possibilities for tracking and intimidating dissenters, monitoring political opponents, and preempting challenges to government power that are hard to resist without the legal, institutional, and cultural structures that we rely on in our own country to help us avoid such abuses.

 

Moreover, Chinese companies, which are subject to Chinese laws that provide the government with access to their information, are leading providers of technology in a number of countries where Freedom House has highlighted that democracy is backsliding.

 

And, often, such companies are able to offer lower cost solutions that these governments are ill equipped to regulate or operate. And access to the data that is being collected and will increasingly be collected, as these technologies continue to emerge and spread, will enable authoritarian leaders to more effectively monitor populations and potentially manipulate, control, and exploit information based on the insights gained from the data.

 

As I noted at the outset, the President, the Congress, and democratic leaders around the world are together mounting an effort to promote democratic governance and counter the risks posed by the technologies I have discussed today, but significant work remains. What is clear is that, to be successful, a whole-of-government effort that includes not just the traditional public and private sector actors, but also the Carnegies of the world, is really needed.

 

The tools will have to be varied, to address the range of structural issues I have outlined. This is where I think our partnership can be most useful.

 

President Biden, for example, recently signed an Executive Order prohibiting US government use of commercial spyware that poses risks to national security and has supported Secretary Raimondo’s efforts in this arena, as she has aggressively pursued the use of export controls to hold companies accountable that develop, traffic, or use technologies to conduct malicious activities that threaten the cybersecurity of members of civil society, dissidents, government officials, and organizations.

 

And together, we need to build on what has been done to improve the resilience of countries to resist digital repression in all its forms.

 

We, in the Intelligence Community, alongside the private sector and research institutions, can help to create greater awareness regarding the types of technologies, as well as the institutional, legal, and organizational approaches being used to engage in digital repression.

 

And in doing so, we hope to highlight areas where normative frameworks might be developed by experts and policymakers in and outside of government that preserve to the greatest extent the promise of such technologies to support freer flows of information, more timely and cheaper communication, as well as smart technologies that improve the delivery of services and even protect the environment or promote our health, while nevertheless guarding against their use for digital repression.

 

We hope to prompt thinking on technical standards and design approaches that promote not only cybersecurity and appropriate law enforcement and intelligence activities, but also democratic governance, freedom of expression and political discourse.

 

We also hope to encourage data management approaches that recognize the dangers associated with the extraordinary collection of information that happens on a daily basis in our world today, and model laws and organizational structures that make it harder to engage in digital repression.

 

And we need to move with urgency. During the coming years, we can expect that governments will grow more sophisticated in their use of existing repressive technologies and will learn quickly how to exploit new and more intrusive technologies, particularly automated surveillance and identity resolution techniques.

 

The multifaceted challenge of adversaries suppressing information environments cannot be solved by government alone. Digital repression and foreign malign influence are whole-of-society challenges, and we can no longer operate on parallel but distinct tracks. I think now is the time for partnership.

 

If you believe as I do that this is an urgent challenge, we must work together to protect the integrity of our own democracy, and democratic societies around the world.

 

Thank you for your time. I look forward to your questions and to the opportunity to talk to Dan.

 

 

MR. BAER: Thank you very much for those illuminating kickoff remarks, and now you get the hard part. For those of you who are in the room, or watching online: online, you should be able to submit questions on your online platform, and if you're in the room there are QR codes around the room that will allow you to submit questions, and then they will magically show up on my iPad, so when we get to that part, I'll be able to turn to them.

 

So, I wanted to start where you started, which is the unusual nature of the speech that you gave, because we're not actually used to having people from the Intel Community, or certainly not the leader of the Intel Community sounding the alarm in a way. And I guess I wonder why you're giving that speech, rather than say Secretary Blinken, or somebody who is more usually at the front of a public address about a foreign policy, or a security policy issue?

 

DIRECTOR HAINES: Yeah. Thank you so much. And look, it's great to actually be here with all of you. I will say I mean to be fair, Secretary Blinken does give those comments, you know, as does the President and everybody else. I think it's important for our voice to be heard on this too, to make it clear that it's not just the, you know, foreign secretary in a sense that cares about this issue, but it's also the security services that care about this issue.

 

And I do care deeply about this issue. First of all, I really believe it is a national security threat, and if you look at our Annual Threat Assessment, we, for the first time, have a whole section on digital authoritarianism trends. So that is something that we're highlighting.

 

But it's also, it's a place where I think we uniquely can bring analysis, insight into essentially how it's happening and what the challenges are, so I believe that that's something we should be doing more generally.

 

But I also want to make clear that there’s a real distinction between intelligence services in a democratic society and those in an authoritarian one. And there are, you know, structures around that. But that doesn't mean, in other words, that we shouldn't be having the conversation where people have concerns that it is creeping into our own system.

 

And that's something that needs to be on the table as well at the same time that we're talking about how does this happen, and how can we all move forward in a way that's actually going to counter what we're seeing and that we have a concern with. If that makes sense, yeah.

 

MR. BAER: If you had to, I mean that makes sense for why you're speaking out. If you had to kind of analyze the timing of the warning, you described actually two kind of broad trends. One is about geopolitics, and the frame of democracy versus authoritarianism, but really this is a challenge of rising authoritarian powers one could say, and the moment that we're living in. That's one trend.

 

The other is the progress of technology, and the evolution of kind of the next wave of digital technology, the kind of post internet, or building on the internet set of technologies, whether those are AI, or smart cities, internet of things.

 

The timing of your kind of alarm cry, is that driven more by geopolitics, or more by, as somebody sitting over the intel community seeing around corners, saying gosh, there's something that's about to hit us like a wave, and we're not there yet, we need to be ready for this, or be more thoughtful about how we're encountering it?

 

DIRECTOR HAINES: Yeah. So, I'd say it's a combination, not surprisingly. But here's what I think is really intensifying in this moment that makes it so critical. If you look at a combination of the pandemic, right, which basically both put an emphasis on tracking data of individuals and who they had contact with, et cetera, and had all of us engaging more through digital means, right? It sort of pushed society into that space where there were increasing amounts of data that were available publicly about all of us in many respects, with the extraordinary trend of digital technologies, of emerging technologies, right?

 

Where, you know, we talk about the trajectory for the internet of things, which is obviously, you know, enormous as I've indicated in my remarks. But that combined with a whole series of other types of technologies like spyware and so on that are becoming commercially available, that are cheap, that are easy to get, so that you can really engage in digital authoritarianism at scale, right?

 

And then you look at how it's not just being engaged in at scale by sort of the classic authoritarian countries that you're used to looking at, but what you're recognizing is that there's kind of an export of these structures. And really, you know, from my perspective, in addition to obviously wanting to sound the alarm, put forward what the threat is, and try to analyze it, I think part of what I'm hoping to do is to set up a better conversation about how we can create greater resilience to it.

 

Because I think part of the challenge is that you sort of say okay, well you know, there may be some places where a China or Russia is looking to export this technology, so that they can engage in transnational repression, or so that they can help a leader that they want to have, you know, remain in power, to sort of control their own information environment.

 

And that is certainly, you know, one way in which it might occur. But it's also the case that you now, as a leader in a country, can basically purchase a lot of these tools, can see the model that's being used in, you know, a China or a Russia, and decide how it is that you want to apply it in different spaces.

 

And I think unless we start to understand the structural changes that are happening here, and how that scaling works, and how it actually can be so pernicious, then we're not going to be able to create the norms and the frameworks, the technology standards, the export controls, all the tools that can be brought to bear to actually help us promote greater resilience in our societies.

 

MR. BAER: You mentioned that President Biden recently signed an executive order barring the use of spyware. Is that trying to hold back the tide in some way? I mean is the value of something like that more symbolic, in that it signals the need for kind of a normative development, instead of kind of institutional approaches to this, than it is on its face for the specific bar?

 

And then the second part of that question: that action won a lot of praise from the human rights community, from the civil liberties communities, but did it have blowback from partners who make that kind of technology, or who want to use that technology, maybe for legitimate purposes?

 

DIRECTOR HAINES: So, just to answer the last question first. Like, I've had no counterpart, for example, say to me, you know, that this is a problem or an issue with the EO, or disagree with it in any way. And I think it is more than symbolic, I would say, but I do think that part of what it does is, as you say, set the standard.

 

And you know, if we think that norms are a space that we want to move into in this area, right, that we want to create some norms that give us kind of a framework through which to look at what's acceptable and what's not, then in a way the predicate is agreeing on what the norms should be, right?

 

Like, what is the space that is acceptable to occupy for legitimate purposes, and what isn't acceptable? And part of what the President is doing in a scenario like that is actually sort of mapping out that space, right? Saying here is something that is unacceptable, and we want others to join us, and in the context of the democracy summit others did join.

 

And, you know, I think as that starts to promote the norm more generally, it then creates a standard where we say okay, well, that's not okay, and then there are, you know, all of the things that can happen afterwards through policy means to begin to backstop that in a way, and make it more challenging for folks to engage in it.

 

MR. BAER: As you think about, I mean that's one specific set of technologies.

 

DIRECTOR HAINES: Yeah.

 

MR. BAER: You mentioned, I'd be remiss given the moment that we're living through, if I didn't talk about AI, and you mentioned generative AI in your remarks. You know, given the breathlessness of the commentary that any of us have read in the last few months about how AI is going to change everything everywhere all at once, how do you, sitting on top of the Intel Community, choose what to focus on? And have you identified anything that you think is, or any set of things, bullet points, that you think are the real risks that are emergent from AI, or is it too soon to tell?

 

DIRECTOR HAINES: Yeah. I mean, you know, the first thing is you're right. Like there's no way I can sit here and say I understand all of the implications of generative AI for the Community, let alone the world. What I do think is true is that with generative AI and the other emerging technologies that we're seeing, it is making it easier to be surprised by significant developments.

 

And in a way, that's a big part of our work, is indications and warning, right? And we're sort of trying to make sure that we can provide policy makers with some sense of what's likely to happen that is meaningful and important for them to be focused on in order to address major national security and foreign policy issues.

 

And this makes it more challenging. The one thing I can say in this space is it is going to make it easier, right, for basically authoritarian governments, and others who want to engage in digital repression, to do the job. And I'll give you just one example. I listed in my remarks a whole series of different tools that are part of digital repression that we identify, right.

 

And misinformation and disinformation is obviously one of them. And there's just no question that with generative AI you can be far more sophisticated in your production of misinformation and disinformation. That's already obvious, right? That is going to make it harder to counter.

 

And it's critical, obviously, for us to be able to do that as effectively and on as timely a basis as possible. Well, maybe generative AI will also give us some tools to counter it. But I think it is, you know, an increasingly complex environment for us to keep on managing the degree to which things are developing, and we're trying to counter them.

 

And so, having kind of broader frameworks that we're able to stick this into, and you know, is sort of a consensus on what is acceptable and what's not, will be increasingly important, I think.

 

MR. BAER: Are you worried about, I mean most of your remarks were kind of global in nature and looking at things that are happening around the world, but obviously we know that Russia, in particular, and China have both engaged in what we used to call information ops in the United States.

 

And are you worried that the United States, especially we're going to have an election next year, in the run up to those elections, are we prepared enough for the new capacities that are provided by AI for misinformation, disinformation?

 

DIRECTOR HAINES: Yeah. I mean this is an area, obviously, we spend an enormous amount of time on, trying to protect against election influence and interference. And I do think that it's going to be possible, obviously, for foreign actors to engage in more sophisticated kinds of misinformation and disinformation campaigns.

 

But we are working very hard to continue to try to be on the edge of it, so that we're able to warn the American public as appropriate, and we do this through really the law enforcement leg, FBI, and then also through DHS, Department of Homeland Security, that's really on the front lines of this.

 

MR. BAER: Russia and China and Iran figured in your remarks, but the examples you gave with respect to Iran were more focused on Iran's use of digital repression domestically. Russia and China both have more obvious examples of where they've used it beyond their borders.

 

And I was struck by the distinction, what I heard of the distinction, maybe you'll correct me, that it seems like while both use both types of digital repression, that Russia is much more focused on disinformation, misinformation, control of information, whereas the Chinese model is much more built into technology and surveillance. And how do those two kind of different strategies change the way that we have to think about how to respond?

 

DIRECTOR HAINES: Yeah. So, it is true that we see that China is more sophisticated from a technology perspective in censoring and so on. And as I mentioned, that's partially because they just started earlier. They prioritized it at an earlier point.

 

Russia, even culturally, tends to be quite legalistic in addressing these issues. And they've created this incredible organizational structure, their digital ministry; you have to basically get a license, and put forward bandwidth requirements, and all sorts of things that give them the capacity to manage what's happening.

 

And you know, they've put in place a series of laws. This isn't to say that China doesn't have laws on these issues that are also intended to facilitate the work that they do. They do. And also, as you point out, Russia has technology as well, and is deploying it to effect.

 

They do tend to engage more in the disinformation space than China, but that doesn't mean, again, that there isn't overlap in how they work. What I would say is useful about thinking about this in different ways is as follows. Each of them has a slightly different model for how it is they approach it. Even technically, they have slightly different models. And I think part of what I'm hoping is useful is that as we try to lift this up, so that people understand what the model is, then frankly, you know, folks at places like the Carnegie Endowment, and research institutions in our policy community and so on, can begin to think through how do we counter that particular model.

 

How do we deal with this particular type of issue, and how can we create norms, for example, that make it harder for other societies to essentially adapt it wholesale, right? And in part, it could be a technical issue, right? It could be that you're developing a standard for how, you know, and I mentioned data management, right?

 

Like how do you store data? And what are the restrictions around data? And what are the sort of technical ways in which you might do it, you know, homomorphic encryption, or other types of things that might help to protect the privacy of data and make it harder to bring things together in a way that would allow you to use tools that could engage in repression.

 

But part of it can also be, you know, in ways that we have privacy and civil liberties officers in each of our services, right, that we have, you know, a variety of institutional structures that help to promote certain aspects of our systems that we think are important, or values or principles. Those are also things that matter.

 

And I've seen over the years, I think, you know, I sort of feel like I went into law, this was many, many years ago, obviously, decades, but in any event, believing that law was the way to change society. You know, I sort of watched the civil rights movement and other things, and saw how critical it was to society, and to our capacity to change.

 

And I feel as if the older I've gotten, the more I recognize that the cultural, soft law, and institutional issues are actually absolutely critical, if not more important at times, to actually producing change. And I think those are important too, and it's the structure that's being used to engage in digital repression, not just the digital technologies, that is important to this picture.

 

MR. BAER: I wonder whether you've gotten any pushback. You've mentioned several times the export of these technologies, and I would say these specific technologies aren't the only technologies that are being exported, and obviously the export of technology is essential to economic development in key parts of the world.

 

We had a conversation in line with the spring meetings of the IMF and World Bank here about digital public infrastructure, and the way that digital public infrastructure has expanded banking to many millions, hundreds of millions of people in India, and has the promise to do so elsewhere in the world.

 

I wonder whether you've run into this challenge of how you get the benefits of the export of technology, while also being mindful of the risks of, you know, large scale consumption of Chinese technology that may have back doors, or that kind of thing?

DIRECTOR HAINES: Yeah.

 

MR. BAER: How are those conversations with partners who are I guess in good faith, trying to square the circle, and both harvest the benefits of technology for the population, and also be mindful of the risks?

 

DIRECTOR HAINES: Yeah. Absolutely. Because I do think this is one of the challenges. You know, as I look to the next ten years, it strikes me that, you know, we are going to have smart cities, right? We are going to be collecting more data.

 

It's going to be easier to access it in many respects, right, and the question is how do you manage it in a way that provides you with some sense of security or capacity to, you know, not see it manipulated for illicit and unacceptable purposes. And so, you know, thinking through how do you design export control regimes, or other forms of tools that allow you to manage this in a way that gives us the benefit, but without, you know, some of the harms, is obviously the 64-million-dollar question.

 

And I am happy to say that in the Intelligence Community, we are not responsible for solving that one. But I will say, I mean, you can have licensing regimes, right, that say we are fine with you getting this type of technology, so long as you comply with the following requirements, right?

 

And you just have to report on a regular basis that you're doing so, or you know, we have a monitoring capacity, or we have ways to sort of check on issues. So that is sort of one form of addressing that kind of an issue, right? The other is to say that you have to build it into your design, right?

 

We have a lot of building into our design for cybersecurity that we're trying to promote. We should also be doing it for democratic resilience, in a sense, right? There are ways to do this, to try to promote whatever the principle is that you're looking for. It's not to say that it isn't complicated, or that there aren't transaction costs as a consequence, all those things, but that has to be factored in.

 

MR. BAER: I'm going to ask a couple more questions, and then turn to audience questions, so if you have a question please feel free to submit it now. I would be remiss if I didn't give you some harder questions before letting you go. But one of them is about TikTok, because that's another issue that is often in the news these days.

 

And I have read the polling data that suggests that banning TikTok is actually hugely popular with American voters. I suspect that has more to do with parents being concerned that their kids' brains are being turned to mush, than it does with national security concerns.

 

But I wonder, from a national security lens, whether the Intel Community has made any recommendations, either to the oversight committees in Congress, or to the White House, about whether some sort of ban on TikTok, and I understand there are Constitutional questions, makes sense from a national security standpoint? And whether there are specific national security concerns with it?

 

DIRECTOR HAINES: So, we do threat assessments, which obviously we provide to the Congress, and to the Executive Branch, on these kinds of issues. And typically, when we're looking at platforms like TikTok, or other things, what we're trying to look at is basically their capacity to collect data, and then who has access to that data.

 

And what might that data be misused for, essentially, in the context of any particular, you know, scenario, national security or otherwise. And that is certainly something we've done, and of course, you know, the President has directed a ban on TikTok in workspaces for the U.S. Government, which we're implementing, and that is, yeah, pretty much.

 

MR. BAER: Where you are?

 

DIRECTOR HAINES: Yeah.

 

MR. BAER: Okay. Last question, which is a two parter. First of all, did you think, before giving this speech today, or before deciding to publicly weigh in, that there might be some downside risks to the U.S. Director of National Intelligence getting up on a stage and talking about how the world should be concerned about digital tools, and how they might be used?

 

Is there a risk that that gets thrown back at you, or at the U.S. Government? And the second piece, the second piece of kind of self-reflection, is obviously, the Biden administration got a lot of well-deserved praise in the run up to the Ukraine War about the way that it systematically declassified intelligence, and shared it with partners and allies, and indeed with the general public, in order to make people aware of what was about to happen.

 

More recently we've seen news about unauthorized release of classified intelligence, and I wonder how that has spurred reflection on your part, and your colleagues' part, about the balances between classification, compartmentalization, and making sure that people have the tools that they need to do their work.

 

DIRECTOR HAINES: Boy, that's a lot in two questions. Okay. I'll deal with the last one first, and then the first one. So, it's very challenging to talk about the unauthorized disclosure because there's an ongoing criminal process right now, a legal proceeding.

 

But what I can say is I think it is, first of all, just deeply depressing for the folks in the Intelligence Community whenever one of these things occurs. I mean many of us work our butts off, so to speak, you know, just trying to protect our information in an appropriate way. And so, seeing this stuff out there, which may or may not be right, in any form, or any suggestion of this kind of a leak, is just very frustrating.

 

So I'll just tell you, yeah, how challenging it is for many of us in dealing with this. I think we will always, in any scenario, in any incident that occurs, learn lessons once we understand what happened, and ensure that we try to do a better job protecting our information moving forward.

 

It is also the case that in these scenarios what I think we all try to do is learn the right lessons, and then not over torque as a consequence of, you know, an incident. And what I mean by that is to try to promote better practices, while at the same time not undermining our capacity to do appropriate sharing, and you know, and engage in our mission.

 

And I think, you know, during the course of the Ukraine conflict, as you know, we went through a very careful process for ourselves to try to ensure that we could disclose as much information as we thought we could while still preserving essentially our sources and methods.

 

And I think that was an appropriate thing to do. There are always risks that come with that, but I think there's also the benefit it can have for national security and foreign policy, and that's, of course, what our ultimate mission is. So, in trying to work this through, I think we'll just try to continue to approach it in a measured way.

 

It sort of comes back to the first question that you asked though. I mean I think, you know, I obviously believe in the work that we do. I think an Intelligence Community is incredibly important to the security of the country, and to ultimately countering authoritarian aggression, as we saw in Russia's invasion of Ukraine, right?

 

And of course, an Intelligence Community engages in, you know, spying right? But we have to do it in a way that is consistent with a very robust legal framework, and we have to be accountable to that, and we have to be in my view, as transparent as possible when we make mistakes, and ultimately be held accountable for those mistakes, so that we can continue to work in a way that promotes national security, but at the same time, promotes the other equally important values that we hold as a country in our work.

 

So, you know, I think there's always a risk of embarrassment when you go out in public, and I in particular am crappy at it honestly, but you know, I think it's an important place for our voice to be heard, because I think it's part of trying to again draw the distinction between when you do this the right way, versus when you do this the wrong way. And I want us to be helping to lead the charge, in a sense, in countering this.

 

MR. BAER: My colleagues should know that my screen is blank right now, so if it needs to be fixed, please fix it. In the meanwhile, I'm going to continue to ask questions that I have for you. I wanted to talk about the democracy versus authoritarianism frame, which is one that Joe Biden led with at the beginning of the administration.

 

There's been some conversation, I would say, in the last year. I have detected more conversations suggesting that the frame is not useful, or that there's too much of the world for whom it doesn't resonate. And that some actually find it a way of the U.S. demanding that people choose sides.

 

And I wonder whether you think the frame holds up. Obviously, you used it today, and one can use it as an analytic frame without using it as a kind of public diplomacy frame. And I wonder whether you would continue to recommend it as a public frame, or whether it's really only an analytic frame from a utility standpoint?

 

DIRECTOR HAINES: I mean it's more of a policy question, so in some ways it's hard for me to answer. But what I would say is that I do think we've seen the narrative, for example, from China shift a bit, even in the last few years, where a few years ago it was more of: we're presenting another system that is more effective, in contrast to your system, in a way.

 

In other words, our system is functional. Our system delivers results. Yours is a bit of a mess, and kind of that being the sort of intellectual narrative in a way that was being presented, and a bit of that has shifted towards we have an alternative path. Still with some of the same features, in other words we can deliver, but not framing it as authoritarian versus democracy.

 

And to your point, perhaps that is because they see that as being, you know, more compelling. And yet at the same time, when you look at their, you know, sort of best friends, BFF, document with Russia, it's remarkable how much democracy, and sort of, you know, their promotion of an international system that sounds very much like what we're promoting, but isn't, is part of the structure that they're pushing forward.

 

So I don't pretend to know what the right answer is from a policy perspective, but I think these are questions analytically that are important for us to understand, yeah.

 

MR. BAER: Great. We now actually have audience questions. The first one I am going to pass along is how concerned are you about the U.S. private sector, particularly the tech sector, that they may not be fully cognizant of some of the risks that their own tools could be manipulated, or misused to nefarious ends?

 

And do you have worries that they aren't aware of the trends that you are identifying?

 

DIRECTOR HAINES: So, we have been trying in our sort of normal way to engage with the private sector to better understand what they're seeing, and honestly in many scenarios they see things before we do because they have the first sort of basis of information, you know.

 

And so, I guess the answer is, in part they sometimes see things before we do, and we need to learn from them. But in part, I think it is true that they don't necessarily put it into the broader context, which is something we can do, helping them to actually discern from the information what's happening in certain spaces that may be useful.

 

And I think again, you know, as I tried to outline in the remarks, I really think this is an area where you actually need so many different parts of society to engage together in order to put the whole picture together, so that you can actually understand what's happening.

 

And I think that's, you know, an important conversation for us to continue to be having, basically.

 

MR. BAER: There's a question here about the challenges that face democracies with respect to kind of the broad set of issues that you've identified, and that we've seen an increasing trend, one would say, in countries that we've referred to as democracies, taking steps that would chill free speech, that would enhance surveillance without sufficient safeguards.

 

There was obviously at the Summit for Democracy, a declaration on the future of the internet, which was kind of a step towards identifying a normative framework. But how concerned are you about kind of backsliding in democratic partners with respect to these techniques in particular?

 

DIRECTOR HAINES: I am. I mean, you know, as I focused on, I think one of the key pieces of the puzzle that doesn't get as much attention is this space, right? In other words, there's a fair amount of attention on what China and Russia are doing in their countries to manage information, right?

 

And it's useful to unpack the model, but part of what I'm really trying to help people focus in on is how that model can be used in various forms, or in various variations, essentially in other spaces. So I am worried about it, and I think part of what we should be doing is thinking about how do we actually make it harder for a society to begin to go down that road? Are there ways in which we can make the structure a little bit more resilient, and sort of automatically highlight when it's happening, in ways that allow us to then focus a light on it, and take action as a consequence?

 

MR. BAER: You mentioned the kind of the wakeup call, in a way, that the Arab Spring was for some authoritarian regimes about the power of the connection that people have through the internet. And since then, and even before then, we saw this kind of cat and mouse game between repressive regimes, and creative, innovative citizens, who are figuring out ways to kind of hack through the restrictions.

 

If you had to make an assessment, who do you think is winning right now in the cat and mouse game around the world? And also, are there any recent examples of citizens figuring out how to get around some of these restrictions that inspire you?

 

DIRECTOR HAINES: Yeah. So I definitely would not tell you the latter.

 

MR. BAER: I have to ask.

 

DIRECTOR HAINES: But I mean, I do think, you know, I'll just speak the obvious really for all of you who are so sophisticated in these issues. But it is absolutely true that what we saw, I think initially, in terms of the promise of these technologies was kind of turned around by authoritarian governments in order to focus in on, you know, suppressing political dissent.

 

And as a general matter I mean I think yeah, the pendulum has swung more towards the authoritarian space. But I think we're hopefully making some movements to actually bring it back again, so we'll see how that proceeds.

 

MR. BAER: We've talked a lot about the kind of the need for normative innovation, and you know, putting your kind of three or four jobs ago hat back on, and like looking at the possibilities for institutions and frameworks that could be the basis for a kind of international set of standards or agreements around these kinds of tools.

 

Do you see an opportunity? I understand you're not making policy currently, but with your old hat on, do you see an opportunity for some particular fora, or do you see a model, maybe applied in some other space, for the kind of regulation or standard setting that you think is needed in this space?

 

DIRECTOR HAINES: It seems like a terrible idea for me to try to put on my prior hat. I will just say, I mean, I think there are a lot of different fora. And you know, it can go from some of the more obvious U.N. related fora and associated bodies, to things like the International Telecommunication Union, or other fora in the world.

 

But it is always kind of a case-by-case analysis frankly, and the folks who are in it are the ones who are going to be best positioned to know whether or not it's the right place to do that kind of work.

 

MR. BAER: Another question here from the audience about the particular challenge of close strategic partners of the United States, like Turkey or India, where there may be growing concerns about the use of digital coercion, or the capacity to use digital coercion, and how the United States can think about the trade-offs in those relationships, and whether there is actually an opportunity for us to push our concerns about the long term implications of digital repression?

 

DIRECTOR HAINES: Yeah. These are all great questions, but they're all policy questions, you know. So for, I mean for us, what we try to do is just lift up in the Intelligence Community what we're seeing, and then help the policy makers try to figure out how it is that they can address these issues.

 

And you know, as you know, this is an issue that our policy makers care a lot about, and so I think, you know, in trying to manage this they're going to want to take up these questions with countries that are across the spectrum, essentially, between authoritarianism and democracy.

 

MR. BAER: Do you see value in exchange programs as a way of helping people in different countries understand the nature of these threats?

 

DIRECTOR HAINES: Oh, that's really interesting. I mean I'm a huge fan of exchange programs, generally, so I would absolutely see value in that. But, I don't know. I mean I suppose on technologies, perhaps organizationally, so that they can see how it is that different entities work, in ways that allow for, yeah, the kind of cultural and legal norms that we have. I suppose that's possible, yeah.

 

MR. BAER: So, as a closing question I'm going to do another two parter. The first part is: what would you most like the American public to understand about the Intelligence Community that you think is least well understood today? And the second is, more appropriately, about think tanks, which is, you know, one of the things that, as you look around Washington, D.C.

 

There's a ton of think tanks writing about foreign policy and security policy. And there's very little policy work that is done on the IC. And there are some obvious reasons for this, given that obviously much of the work of the IC is not knowable or known by the general public.

 

But there are parts of the IC's work that are knowable or known, and certainly principles, et cetera. And I wonder, I understand you're not a policy maker, but you do work for an organization around which policy can be made, and I wonder if there's any area of policy with respect to the IC per se that you think is ripe for policy research.

 

DIRECTOR HAINES: Yeah. Okay. So on that last one, one of the things that I try to do is, once every month, bring in folks from different NGOs, or think tanks, et cetera, on a particular issue. And ask them, you know, what are the things that they're concerned about that the Intelligence Community might get into, or what are the kinds of critiques that they have of the Intelligence Community in these different spaces.

 

And, you know, it has ranged from emerging technologies to human rights issues, to you know, just a whole series of different things. And it's fascinating because I think it's first of all an opportunity for us to hear directly from them, and we bring in our whole senior leadership team to do these things.

 

But it reveals, from my perspective, two things. One is there are lots of places where folks have identified issues. I'll give you an emerging technology example. On emerging technology, somebody said, look, one of the things we're really worried about in this area is, very much as you were describing with concerns about technology more generally in the context of digital repression, the following.

 

People will be so worried about the national security uses and misuses of this technology, and we will not have the opportunity to gain the advantages of the technology. And they sort of gave specific examples of what they were concerned about, right?

MR. BAER: Vaccines even.

 

DIRECTOR HAINES: Perfect. Right. There are lots of places where this is an issue, right? And I think, you know, where there is that kind of concern, working with the Intelligence Community to help us understand what we can do to try to avoid that, or to try to, you know, develop structures and things like that, can be useful.

 

It's not to say that it's a perfect, you know, silver bullet, or that there will be an easy answer to this, but I think it is at the very least useful to be informed, and to try to think it through in a way that's productive from the very outset. And the advantage of having academia and research institutions, or NGOs, or others talk to us early is that they often see these things before government does.

 

And so, they kind of have an opportunity to bring these issues up in a way that's helpful. But the flip side of it, I think, is that we also hear from experts at institutions, who are at Carnegie and other places, who talk to us about policy work that they're engaged in. And even though we're not engaged in policy work, one of the things that we can do is, if a policymaker is interested in an issue, for example, human trafficking, or something, you know, some particular issue.

 

And you, as an expert, are considering different drivers of human trafficking, or you're looking at different implications, and costs of human trafficking, or other things like that. If you're able to work with us to explain what those drivers are, what the issues are that you're trying to basically create metrics around, we can create methodologies that allow us to actually track it.

 

And that is something that tends to be useful for policymakers who are then trying to promote an argument for, you know, here are the things that you should be looking for. These are the indicators you should be concerned about, right? If we're able to track those indicators, people can act on it, right? Similarly, here's the cost of whatever it is that you are concerned about; if we're able to do that, then we can push back against that.

 

So, I think there's a lot of space for this. And it's part of why I think this question about the structure and scaling, essentially, of digital repression matters. If we're able to lift that up effectively with, frankly, the brains that exist in different parts of, you know, the sort of intellectual ecosystem that Carnegie and others exist in, they can help us to think about how it is that we can monitor and track this, in ways that are going to be more useful to policymakers to actually enact tools to create that kind of greater resilience.

 

You had a second question in there, and I sort of lost it.

 

MR. BAER: Which is what you would want people in the general public to know about the IC?

 

DIRECTOR HAINES: Yeah. Okay. So here is the thing I think about. You know, in many respects when I talk to folks in recruiting, or in talking about, you know, the Intelligence Community, there are kind of caricatures of the Intelligence Community and so on. So, it is often, especially with very young people, I find, viewed through a lens, like a dark lens.

 

Like you're sort of, almost like you're in the conflict, and intelligence is there. And the sorts of visions that they have of what the Intelligence Community does, often, I think from Hollywood and other things, right, is, you know, somehow to do with killing people or hurting things, or, you know, just that kind of thing.

 

It is also the case, and I hope that the Ukraine conflict demonstrated this in many respects, that our mission is actually to promote peace. That is in fact what we are basically designed to do: to try and provide indications and warnings so that the United States doesn't have to go to war, so that we're able to use other tools to manage crises, so that we're able to exist in a more effective and prosperous and peaceful way.

 

And again, hopefully consistent with our values, and promoting those values. I think that's the thing that I would hope to leave people with about the Intelligence Community.

 

MR. BAER: Thank you. I guess I'm going to take a moment to answer my own question in a different way, as a way of saying goodbye to everyone, which is that one of the things that I've always respected about you when you were leading Deputies Committee meetings, and that I think is generally true about people working at senior levels of the U.S. Government in my experience, is the discipline that people have that attaches to their roles.

 

When we're in a national -- a Deputies Committee meeting, the lawyers, generally speaking, are pretty good about making sure that they're not trying to make policy, that they're being lawyers. The intel people will often say, this is just the intel; I can't give you a policy recommendation.

 

And I think most people not only in the United States, but also around the world would be surprised to see how formal those divisions and roles are, even behind closed doors.

 

And how much people respect that institutional set of roles as a way of making sure that our intelligence is not instrumentalized so as to serve a policy agenda, and that the law is not instrumentalized so as to serve a policy agenda, but rather that policymakers are forced to take responsibility for making policy with the insights and advice, from an intel or legal standpoint, that other professionals provide.

 

And I saw you oversee a process like that, and I know that you are now a part of it, and we saw you squirm today when I tried to push you out of your role, which is a good sign of the discipline that you continue to adhere to in your current position. Thank you very much for spending so much time with us this afternoon. Thank you for sharing your views, and we hope you'll come back to Carnegie soon and often.

 

DIRECTOR HAINES: Thank you. Thank you so much.

 

###