Image courtesy of Gerhard Altmann/Pixabay

Creating a model democratic alternative to the surveillance state

By Ishan Sharma, September 8, 2022


Like everything else these days, surveillance has become globalized. In 2019, the New York Times reported a steady spread of surveillance technologies designed to detect Muslim Uyghur minorities across several regions in China (Mozur 2019). In India, law enforcement recently used new wiretapping tools to plant evidence on victims’ computers and then used it to arrest them (Greenberg 2022). In South Africa, private surveillance technology is fueling the beginnings of a “digital apartheid” that could limit black people’s physical movements in white enclaves (Hao and Swart 2022). In Uganda, a cybersurveillance unit dismantled years’ worth of the organizing efforts of tens of thousands of its citizens within days (Parkinson, Bariyo, and Chin 2019). In the United States, immigration officials surveil migrants through a vast dragnet of child welfare, housing, and employment records, as well as utility, phone, and social media data, without a warrant or congressional oversight (Georgetown Law Center on Privacy & Technology 2022).

Rather than the product of a grand strategy to remake the world for autocracy, the globalization of surveillance abuse illustrates a larger problem: Democracies have not succeeded in crafting an alternative that reinforces the fundamental democratic principles of equality, freedom, transparency, and accountability (Fisher 2021). Consequently, only 17 percent of the world views American democracy as worth emulating (Wike et al. 2021). The fact that the United States and its allies have driven much of the democratic world’s decline underscores the need for alignment on a shared vision that repairs, builds, and attracts (Klein 2022; Khatiri 2020).

Achieving this new vision will require more concrete, applied, and solutions-oriented discussions about tackling the erosion of democracy on both the domestic and foreign policy fronts. Widely accepted, common principles about the role of ethics in technology should be applied to usher in a broad alignment among democracies as to how surveillance technology should be used. Below are several recommendations for policy makers in the United States and aligned countries that offer an initial roadmap to build on these principles—and lead by example in pursuit of a credible alternative to the surveillance state.

Lead by principled example

Even the best processes—such as collecting track-and-trace data to respond to the COVID-19 pandemic—can be misused at a later date or by another regime. (Singapore is a recent example: Data that was ostensibly collected to track and trace the spread of COVID-19 was later used for the purposes of law enforcement [Sato 2021]). In general, tools acquired or developed for one discrete capability are often expanded for other intelligence uses at a later date, according to studies of the surveillance industry conducted over the past 20 years (Desombre, Gjesvik, and Willers 2021). This tendency poses a pressing challenge for democracies: Any responsible system must endure changes in power.

To combat the dangers of mission creep, governments need to establish “concrete benchmarks, regulations, and bodies” that empower them to enforce oversight, as noted in a recent working paper published by the National Endowment for Democracy (Feldstein, Ferreyra, and Krivokapić 2022). With that in mind, four principles must be observed for governments to acquire and deploy surveillance technologies in democracies: transparency, proportionality, enforceability, and intersectionality.

Transparency. Citizens should never have to guess who, where, or what is under surveillance. Municipalities need to answer these questions through public reports, posted signs, or other forms of outreach. Judges must be trained as gatekeepers over the warranted use of modern surveillance technology. Governments must unseal warrant requests over time to heighten public knowledge about what was authorized and what was ultimately performed. Local city councils should ensure that the use of surveillance is guided by public input on clearly defined policies on how data will be stored, retained, and secured, and what independent auditing and performance-review procedures are in place.

Proportionality. Law enforcement should use digital surveillance in cases commensurate with the crime committed—such as tracking child traffickers instead of jaywalkers. Initial use-cases for a given technology should explicitly codify this proportionality. Rather than rely on an algorithm as the sole reason to arrest someone, biometric detection should only form an investigative lead that requires further corroboration (Hill 2020). In general, data-gathering should operate under clear policies of secure and limited retention—which may be the only bulwark against the ever-increasing flow and capture of data. Public-private contracts should guarantee that surveillance data will not be resold or repurposed—especially to agencies outside the initially agreed contract.

Enforceability. Citizens should have a clear path to address concerns about how public surveillance data is gathered, breached, or misused—and be able to employ the legal system and the aid of civil society groups to uphold these standards. The recent American Civil Liberties Union (ACLU) settlement against the facial recognition company Clearview AI—which made headlines after it scraped 3 billion photos from the Internet—is a crucial example. Under the settlement agreement, Clearview AI must grant residents the option to block their facial data from its systems (ACLU v. Clearview AI 2022).

Intersectionality. An intersectional approach recognizes that surveillance affects everyone but not equally; some bear the costs of surveillance more than others. Those who have lived in communities with a long history of being under constant oversight—be it in the form of wiretaps, informants, security checkpoints, outright spying, or biased facial recognition (Devich-Cyril 2020)—are uniquely qualified to future-proof a democratic response to surveillance technology. These communities must be invited as key stakeholders in any conversation about the oversight of surveillance technology. The public engagement efforts of the new Privacy Office in Santa Clara, California, offer a prime example (Sharma 2021).

Recommendations: from principles to action

Moving from prescriptions to practices has never been easy, an endeavor compounded here by the skyrocketing demand for surveillance technologies (Greitens 2020a). This demand shows no signs of stopping; at the same time, digital surveillance is steadily becoming cheaper, more accessible, and more intrusive—with the result that the widespread adoption of surveillance technology can be expected to continue its upward trajectory.

Representatives from the surveillance technology industry have told me that unless governments intervene, companies will continue in this downward spiral, feeding the market’s demand for ever-more invasive technologies—a race to invasiveness—including the recent development of potential mind-reading algorithms (Sharma 2021a; IPVM Team 2020; Murgia 2021; Huang 2022).


To combat these developments and reinforce basic, core democratic principles, the surveillance industry’s incentives need to be reoriented. These changes in incentives—or market interventions, if you prefer—will likely come from three different areas: governance, technology, and partnerships.

Governance: Develop a sustainable model of governance that checks power regardless of political party.

One possible solution that has gained traction in the last two years is found at the local level: independent, citizen-run oversight structures that can intervene before any law enforcement agency can procure a technology, and that remain in place throughout the technology’s deployment and evaluation (Marlow 2021; Degroff and Cahn 2021). These citizen-driven bodies have worked in their 15 US pilot cities, because they make decisions after engaging the community—and incorporate real-world data about implementation (Fidler 2020).

For example, the City of Oakland has a board of volunteers, known as the Privacy Advisory Commission, that answers questions before a technology is purchased: how the government plans to use the technology, how it will store (or delete) the collected data, and how it will measure the risks. After purchase, the commission conducts annual reviews to assess the extent to which these technologies meet the community’s crime and equity needs (Sharma 2021a).

Such a solution could be scaled up. In the United States, for example, municipalities could be required to have clearly defined rules in place, or “surveillance ordinances,” regarding the use of surveillance equipment before they receive any federal funding for the purchase of state and local police surveillance equipment (Sharma 2021b). With billions of federal dollars at stake to support the purchase of surveillance technology, one US police chief explained that “surveillance comes at the cost of the taxpayer” (Sharma 2021a).

In tandem, the federal government could provide guidance. The FBI, for example, could create baseline contracts that serve as rough templates for the standards to be followed in the purchase and use of surveillance technology—including criteria such as banning non-disclosure agreements, requiring provisions for limited data retention, prohibiting re-use or third-party sharing of data, and requiring that companies integrate effective ethics reviews into their design and sales processes. Other agencies at the federal and municipal levels could choose to follow these contracts or use them merely as starting points for establishing their own, more protective and specific standards in their contracts.

Abroad, legal systems will differ. But the US State Department’s Bureau of Democracy, Human Rights, and Labor—and the recently established Bureau of Cyberspace and Digital Policy—could coordinate with other countries in exploring where and how the US surveillance ordinance model may be transferable, and vice-versa. In Nigeria, for example, a new model of community-led policing has emerged, which could offer a model for surveillance decisions informed by local input (Campbell 2020; Erezi 2021; Ighobor 2021). Kenya’s High Court has proposed carrying out impact assessments on surveillance technology to better measure its effects, mirroring a crucial component of surveillance ordinances (Bulelani 2022). At the very least, surveillance standards that highlight the key elements of a surveillance ordinance model—community-based oversight and transparency—should inform ongoing conversations at the US-EU Technology and Trade Council and among members of the Summit for Democracy’s Export Controls and Human Rights Initiative.

That said, no model is perfect. For example, one study found that the American public’s perceptions of drones depend on demographics and political affiliation—with participants more likely to approve of law enforcement using drones in primarily black neighborhoods than white neighborhoods (Anania et al. 2019). If what is happening to Muslim Uyghurs in Xinjiang is to be prevented, democracies must build an alternative model based on principles of intersectionality, lest such oversight bodies become rubber stamps for systematic abuse.

In addition, the recently created National AI Advisory Committee’s Subcommittee on Artificial Intelligence and Law Enforcement could establish a federal certification mechanism to stem the spread of insecure, biased technologies and their misuse. Decisions could be based on evidence: submitted answers to an objective questionnaire, evaluation of existing, less-invasive alternatives, software and hardware audits, and intended-use cases, among other considerations.

Such a mechanism would be difficult to get right; most important, it would need to avoid becoming yet another free pass for harmful surveillance. As one potential anti-rubber stamp measure, recertification could be required every three years, with a renewal process based on real-world data including records of the technology’s use, rates of false positives and negatives, reports of misuse and relevant civil society complaints, entities in receipt of technology, and more.

Technology: Provide financial incentives for companies to develop ethical surveillance practices.

Researchers such as Dahlia Peterson and Samantha Hoffman have advanced the idea of an interagency partnership between the National Science Foundation and the Defense Advanced Research Projects Agency that would fund “privacy-preserving computer vision research … [with the] goal of automatically anonymizing all faces in a crowd and deanonymizing only those necessary to pursue investigative leads” (Peterson and Hoffman 2022). Governance recommendations like this will be key to translating research from labs to the marketplace.

At the same time, however, there is also a general need for detailed data about how surveillance technologies are abused. Currently, surveillance technology companies bypass oversight by claiming they don’t know—or can’t know—how their product is used. In the words of the chief executive officer of the Israeli-based surveillance technology company known as the NSO Group:

We are selling our products to governments. We have no way to monitor what those governments do … we cannot be blamed on the misuse that the government did. (Brewster 2021)

Admittedly, the surveillance industry is composed of a dense, multi-layered supply chain. But what if all surveillance technologies came with an internal software feature that alerted companies to wrongful or unintended use? It could reshape the entire marketplace, just as cars are slowly starting to be made safer by driver safety packages (Elliot 2020). Such an idea is within reach. In fact, a version of this internal software for surveillance technology was floated by Microsoft in 2019 as part of its so-called “smart” approach to export controls, while Google and Mozilla also demonstrated it by locking their services in Kazakhstan (O’Neal and Clark 2020). While approaches will differ based on the type of surveillance technology, one possible way of promoting their development might be to launch a $30 million prize to create proofs-of-concept for software controls on five or six common surveillance technologies (Sharma 2021b). The result could serve countless oversight purposes, from enforcing Know-Your-Customer laws to receiving licenses for exporting technology to certain regions.


Partnerships: Use existing coalitions of countries, industries, and civil society as leverage to realize a shared democratic vision.

Why would any government purchase surveillance technology with an oversight package? They wouldn’t—unless it became a global standard. This underscores the need for coordinated, multi-stakeholder partnerships that take the long view when it comes to reinforcing democracy. The United States and its allies should leverage their diplomacy efforts to guide global surveillance-tech markets with international consensus.

Most immediately, democracies should establish multilateral controls over the export of surveillance technology, with the aim of stopping Western-based companies from profiting from digital repression (Kim 2021). Fortunately, last year, the European Union declared that it would require an export license for any technologies—not just those of blacklisted companies—with potential for what it called “internal repression” or other serious human rights violations (European Commission 2021). Currently, the Bureau of Industry and Security, the US export controls arm, only has the authority to consider end-uses for weapons of mass destruction, not human rights. Congress would need to expand the Bureau’s authority to consider human rights as an end-use worth controlling. If that happens, the beginnings of a practical, rights-based, and multilateral export control could be within reach—a vital alternative to slow-moving existing agreements (Lewis 2015). By partnering with the European Union over end-user controls, the United States could exert pressure on allied countries such as Canada, Japan, Switzerland, and Israel to enact similar controls over their exports.

Next, the United States must change its one-size-fits-all messaging regarding China-based technologies; its current generic appeals to “geostrategic rivalry, democracy and human rights, and data security” are not meeting the mark (Greitens 2020b). Instead, such conversations should change to account for subnational actors—the provincial governors, mayors and local officials driving most of the surveillance demand—in a country-specific way that meets US interests. This will not be easy. In certain Latin American countries, for example, the deployment of surveillance technology may relate to other competing US priorities, such as decreasing crime and drug-related activities to lower migration pressures.

As a result, the International Trade Administration’s Digital Attaché Program; the State Department’s Bureau of Democracy, Human Rights, and Labor; the Bureau of Cyberspace and Digital Policy; and the Special Envoy for Critical and Emerging Technologies should evaluate competing priorities and design a specific path forward (Morgus, Woolbright, Sherman 2018). Digital Trade Officers already operate in key markets within Brazil, Mexico, India, Indonesia, South Africa, Japan, the Association of Southeast Asian Nations, and the EU, presenting an opportunity to meet demand and orient market norms towards more ethical use.

To be clear, simply promoting Western-based surveillance technology would not solve any problems; substantive, rights-based coordination must wrap around any surveillance technology sales pitches. But a coordinated partnership would reach more democratically favorable outcomes. Digital Trade Officers, for example, could meet directly with relevant subnational actors who can implement local and regional norms of use. Meanwhile, the State Department could work with Latin American and Southeast Asian leaders to crystallize what the phrase “public security” truly means as a justification for using surveillance powered by artificial intelligence (Peterson and Hoffman 2022).

Another, perhaps even bolder, approach would be to expand the so-called “T3” strategic tech alliance between the United States, India, and Israel beyond next-generation wireless telecommunications and into the domain of surveillance technology (Chikermane 2020). Israel is home to several surveillance industry leaders, while India is the world’s largest democracy—and a rapidly digitizing one, at that. Unfortunately, neither of these countries has a particularly great record with respect to surveillance technology’s impact on democracy. Codifying some of these principles, however—or even reaching a near-term agreement to begin coordinating over the digital authoritarian problem—would be a tremendous win for institutionalizing global responsibility in surveillance exports and deployments.

Governments should also subsidize civil society organizations to participate in local and international settings. Indeed, these organizations have often been the backbone of efforts to preserve democratic values in a rapidly changing world. In my research, I’ve found that groups like the Surveillance Technology Oversight Project and the American Civil Liberties Union play fundamental roles in helping the US judicial system catch up with advances in surveillance. In Argentina, civil society organizations were the only means by which the public could even retain a “glimpse of vendors’ relationships with the local public sector” (Feldstein, Ferreyra, and Krivokapić 2022). Such groups deserve a seat at the global standards-setting table, at home and abroad.

Finally, democratic governments must consider how best to promote the right companies to achieve this global vision. One way would be to subsidize small and medium-sized technology companies to engage in international standards-setting efforts (Thompson and Montgomery 2021). Currently, the price tag for participation at the United Nations International Telecommunication Union is $300,000 per year—for one engineer (Whalen 2020). Unless the company is a Big Tech firm or highly subsidized by the Chinese state, it quickly becomes impossible to participate in discussions that could determine how surveillance technology is developed and used around the world.

Ultimately, preserving the democratic rule of law and human rights in the digital era will require the United States to look inward and call upon its allies to do the same.


The author would like to thank his colleagues Dan Correa, Divyansh Kaushik, and Karinna Gerhardt for their endless support in bringing this piece to life.


American Civil Liberties Union (ACLU). 2022. “ACLU v Clearview AI.” American Civil Liberties Union. May 11.

Anania, E., Rice, S., Pierce, M., Winter, S.R., Capps, J., Walters, N.W., and Milner, M.N. 2019. “Public support for police drone missions depends on political affiliation and neighborhood demographics.” Technology in Society 57: 95-103.

Brewster, T. 2021. “‘If You’re Not A Criminal, Don’t Be Afraid’—NSO CEO On ‘Insane’ Hacking Allegations Facing $1 Billion Spyware Business.” Forbes. July 22.

Bulelani, J. 2022. “Africa: Regulate surveillance technologies and personal data.” Nature 607: 445-449.

Campbell, J. 2020. “Nigeria Launches Community Policing Initiative.” Council on Foreign Relations. September 15.

Chikermane, G. 2020. “From T3, the India-US-Israel tech alliance can become T11.” Observer Research Foundation. September 10.

Degroff, S. and Cahn, A.F. 2021. “New CCOPS on the Beat: An Early Assessment of Community Control of Police Surveillance Laws.” February 10.

Desombre, W., Gjesvik, L. and Willers, J.O. 2021. “Surveillance Technology at the Fair: Proliferation of Cyber Capabilities in International Arms Markets.” Atlantic Council Scowcroft Center for Strategy and Security. November.

Devich-Cyril, M. 2020. “Defund Facial Recognition.” The Atlantic. July 5.

The Economist. 2022. “ ‘You can expect, in a lot of countries, a doubling of unrest over the next year’—inflation and uprising.” June 29.

Elliot, C. 2020. “Are Vehicle Safety Features Actually Reducing Car Accidents?” Forbes. September 2.

Erezi, D. 2021. “New report claims Nigeria is spying on its citizens’ data and communications.” The Guardian. December 8.

European Commission. 2021. “Strengthened EU export control rules kick in.” September 9.

Feldstein, S. 2021. “Digital Technology’s Evolving Role in Politics, Protest and Repression.” United States Institute of Peace. July 21.

Feldstein, S., Ferreyra, E., and Krivokapić, D. 2022. “Working Paper: The Global Struggle Over AI Surveillance.” National Endowment for Democracy. June.

Fidler, M. 2020. “Fourteen Places Have Passed Local Surveillance Laws. Here’s How They’re Doing.” Lawfare. September 3.

Fisher, M. 2021. “U.S. Allies Drive Much of World’s Democratic Decline, Data Shows.” The New York Times. November 16.

Georgetown Law Center on Privacy & Technology. 2022. “American Dragnet: Data-Driven Deportation in the 21st Century.” May 10.

Greenberg, A. 2022. “Police Linked to Hacking Campaign to Frame Indian Activists.” Wired Magazine. June 16.

Greitens, S.C. 2020a. “Dealing with demand for China’s global surveillance exports.” The Brookings Institution. April.

Greitens, S.C. 2020b “Working Paper for the Penn Project on the Future of U.S.-China Relations China’s Surveillance State at Home & Abroad: Challenges for U.S. Policy.” University of Pennsylvania.

Hao, K. and Swart, H. 2022. “South Africa’s private surveillance machine is fueling a digital apartheid.” MIT Technology Review. April 19.

Hill, K. 2020. “Wrongfully Accused by an Algorithm.” The New York Times. August 3.

Huang, J. 2022. “China Boasts of ‘Mind-reading’ Artificial Intelligence that Supports ‘AI-tocracy.’ ” Voice of America. July 9.

Ighobor, K. 2021. “Effective community policing in Nigerian town could be a model for other places.” United Nations. September.

IPVM Team. 2020. “Huawei / Megvii Uyghur Alarms.” IPVM. December 8.

Khatiri, S. 2020. “Make Democracy Attractive Again.” The Bulwark. September 8.

Kim, H. 2021. “Global Export Controls of Cyber Surveillance Technology and the Disrupted Triangular Dialogue.” International and Comparative Law Quarterly, 70(2), 379-415. doi:10.1017/S0020589321000105.

Klein, E. 2022. “What America Needs Is a Liberalism That Builds.” The New York Times. May 29.

Lewis, A. 2015. “The Effectiveness of the Wassenaar Arrangement as the Non-Proliferation Regime for Conventional Weapons.” Stanford University’s Center for International Security and Cooperation Freeman Spogli Institute for International Studies. May.

“Meet TIP – Technology, Innovation and Partnerships: A new directorate at the U.S. National Science Foundation.” National Science Foundation. N.D.

Murgia, M. 2021. “Emotion recognition: can AI detect human feelings from a face?” Financial Times. May 12.

O’Neal, S. O. and Clark, J. 2020. “Microsoft and Open AI Comment on Advance Notice of Proposed Rulemaking (ANPRM) for the Identification and Review of Controls for Certain Foundational Technologies.” Uploaded to Bureau of Industry and Security BIS-2020-0029. November 9.

Parkinson, J., Bariyo, N, and Chin, J. 2019. “Huawei Technicians Helped African Governments Spy on Political Opponents.” The Wall Street Journal. August 15.

Peterson, D, and Hoffman, S. 2022. “Geopolitical implications of AI and digital surveillance adoption.” The Brookings Institution. June.  

Thompson, N. and Montgomery, M. 2022. “Strengthening U.S. Engagement in International Standards Bodies.” Day One Project. June 15.

Sato, M. 2021. “Singapore’s police now have access to contact tracing data.” MIT Technology Review. January 5.

Sharma, I. 2021a. “A More Responsible Digital Surveillance Future.” Federation of American Scientists. February 19.

Sharma, I. 2021b. “A Strategy to Blend Domestic and Foreign Policy on Responsible Digital Surveillance Reform.” February 19.

Sharma, I. 2021c. “Surveillance Technology & the Global Decline in Democracy.” Federation of American Scientists. December 22.

Sharma, I. 2020. “China's Neocolonialism in the Political Economy of A.I. Surveillance.” Cornell International Affairs Review. Spring.

Whalen, Jeanne. 2021. “U.S. bans investment in Chinese surveillance company SenseTime, saying it supports repression of Uyghur minority population.” The Washington Post.

Wike, R., Silver, L., et al. 2021. “What People Around the World Like – and Dislike – About American Society and Politics.” Pew Research. November 1.

