Abstract
The advent of quantum computing will compromise current asymmetric cryptography. Awaiting this moment, global superpowers are routinely collecting and storing encrypted data, so as to later decrypt it once sufficiently strong quantum computers are in place. We argue that this situation gives rise to a new mode of global surveillance that we refer to as a quantum panopticon. Unlike traditional forms of panoptic surveillance, the quantum panopticon introduces a temporal axis, whereby data subjects’ future pasts can be monitored from an unknown “superposition” in the quantum future. It also introduces a new level of uncertainty, in that the future watchman’s very existence becomes a function of data subjects’ efforts to protect themselves from being monitored in the present. Encryption may work as a momentary protection, but increases the likelihood of long-term preservation for future decryption, because encrypted data is stored longer than plaintext data. To illustrate the political and ethical aspects of these features, we draw on cryptographic as well as theoretical surveillance literature and call for urgent consideration of the wider implications of quantum computing for the global surveillance landscape.
1 Introduction
The Mycenaean civilization (1750–1050 BC) is among the world’s best-studied Bronze Age societies. This is largely due to the survival of its written language, referred to by archaeologists as Linear B (Blegen, 1953). The Mycenaeans themselves never intended this, as inscriptions were normally made on soft clay tablets which were destroyed and replaced on a regular basis. Yet, during the civilization’s collapse in the 12th century BC, several palaces were burnt to the ground. In the ensuing heat, the tablets were hardened into bricks, thus solidifying the momentary inscriptions for posterity (Wace, 1953). Some 3,000 years later, archaeologists dug up the tablets, and though they initially struggled to decipher Linear B, its eventual decoding revealed a detailed picture of the everyday life of Mycenaean citizens (Beattie, 1956).
The fate of the Mycenaeans is an apt illustration of the argument proposed in this article. It shows how unexpected events or technical developments may turn the longevity of information storage on its head—preserving the very information that was meant to be hidden or destroyed. For the Mycenaeans, that information included their language and everyday records. In our case, it concerns the data we leave behind every day using the internet. For, just like the Mycenaean clay tablets, much of the information we produce today is not intended for posterity, especially the parts that we consider private (Taylor, 2023)—and yet, it may be precisely these data that survive long-term. The reason for this lies in the current regime of global governmental surveillance, and its anticipation of the looming advent of quantum computers.
Contemporary government surveillance programs collect and store online communication data, sometimes in secrecy and in direct violation of their own constitutions (Ackerman & Roberts, 2013; Satter, 2020). By a variety of means, including deals with global tech platforms and tapping of backbone cable intersections, the superpowers are now able to collect more or less any online communication on a global scale (Bradford Franklin et al., 2024; Gray & Henderson, 2017, pp. 440–441; Snowden, 2019, pp. 6–7). The standard method for resisting this surveillance is through encryption, which is a technical paradigm that has served the internet well for decades, protecting everything from bank transactions to military intelligence, but also major messaging platforms like WhatsApp, Signal and Facebook Messenger (Katz & Lindell, 2021).
There are a number of ways in which encrypted data may be compromised, such as improper implementation of cryptographic protocols or data controllers giving up the cryptographic keys, to name two.Footnote 1 However, our focus in this article is placed on a particularly grave, and—so we claim—theoretically interesting threat, namely the advent of so-called cryptographically relevant quantum computers (CRQC). The threat from CRQCs is that they are expected to break current cryptographic protocols and expose our most sensitive information on a global scale (Ford, 2023). The quantum race between the global superpowers will thus lead, not only to more complex computer systems, but to a disruption of the technical foundations of the current data protection paradigm (Lague, 2023).
Fortunately, new quantum-proof encryption protocols are underway, which will alleviate the immediate effects of CRQCs on encrypted communication (Lindsay, 2022). Yet, this might offer little remedy to those whose data has been encrypted with weaker protocols during the past decades, not least since encrypted communication data is—according to U.S. federal law—stored longer than plaintext data (ProPublica, 2013; Rogers, 2014). Awaiting the arrival of CRQCs (a moment commonly referred to as “Q-day”), governments have been collecting data in bulk for later decryption, following the principle of “harvest now, decrypt later” (HNDL) (Ott et al., 2019). The generation populating the first few decades of the internet may thus face a fate similar to that of the Mycenaeans—their most sensitive (encrypted) data is likely to survive the longest, potentially to be deciphered by future quantum computers for the world to see.
In the present article, we conceptualize this situation as a new mode of global surveillance that we refer to as a quantum panopticon. We highlight two prime features that distinguish this concept from previous panoptic surveillance: The first is the introduction of a new type of temporal axis, whereby data subjects are not merely monitored by a contemporary hidden watchman, but also from a future (super) position.Footnote 2 The second is what we refer to as an uncertainty of existence. Unlike Foucault’s panopticon metaphor, which is predicated only on the uncertainty of the watchman’s gaze, the quantum panopticon introduces an uncertainty of the (future) watchman’s very existence. Current data subjects cannot know whether there will in fact be a future watchman monitoring their data (Foucault, 1995, pp. 208–215).
Ironically, their very efforts to protect themselves in the present (i.e., through encryption) are what may bring the future watchman into existence. This is because encrypted data is more likely to be preserved long-term, whereas plaintext data is typically cleared more frequently in data retention centers (Erwin, 2015; Greenberg, 2013; ProPublica, 2013). Thus, like the Mycenaean clay tablets, today’s encrypted communication will likely be the information that survives longest in government hands.
This means that, unknowingly, data subjects face a forced crossroads: the risk of being monitored now by current superpowers, or having their encrypted data preserved for posterity, at the risk of being (forever) monitored from the future. In this article, we refer to this irony as an instance of what political theorists call absolute recoil—an act of resistance against a specific power structure that backfires into a reinforcement of said structure (Žižek, 2014, p. 66). Such a mechanism is well studied in other contexts, but, as we argue in the second half of this article, it holds considerable value as an analytical tool for quantum surveillance.
The goal of this article is to explore this novel surveillance infrastructure in detail, so as to highlight its ethical and political implications. Our argument is structured as follows. In the following section, we provide a brief primer on the technical and regulatory landscape. In Sect. 3, we discuss the panopticon metaphor in the context of digital surveillance studies, defending its relevance for theorizing specifically modern government surveillance. We differentiate between the theoretical landscapes surrounding private and government surveillance, arguing that the panopticon metaphor applies only to government surveillance, and we operationalize it to match our analysis. In Sect. 4, we provide a detailed explanation of the quantum panopticon based on its two distinguishing features: a new kind of temporal axis and the uncertainty of existence. To further demonstrate the theoretical contribution of our framework, Sect. 5 draws on the concept of turnkey tyranny (Gertheiss et al., 2017, pp. 231–32), and illustrates the political implications of the concept when viewed through the lens of the quantum panopticon. We conclude by calling for technical and political data protection measures that reflect the intergenerational nature of data privacy, rather than its mere momentary value (Öhman, 2024).
2 Technical Primer
Before we detail the features of the quantum panopticon, a technical primer is in order. The technical background consists of three sections: the first presents the global surveillance infrastructure, the second outlines the basics of current cryptographic protocols, and the third gives an account of the technological race between superpowers to develop cryptographically relevant quantum computers.
2.1 The Global Surveillance Infrastructure
In 2013, the NSA analyst Edward Snowden released thousands of highly classified documents to journalists at The Guardian (Greenwald, 2014, pp. 21–25). These revealed how the United States had installed listening points at thousands of backbone internet cables around the world in a global intelligence agreement known as “14 eyes” (Gray & Henderson, 2017, pp. 30–35; MacAskill & Ball, 2013). This type of data tapping is referred to as upstream collection, and the data is stored in gigantic server centers spread across the involved countries, including the NSA data center in Utah, the Dagger Complex in Germany and the GCHQ headquarters in the United Kingdom (Becker et al., 2014; Carroll, 2013; MacAskill & Ball, 2013; Norton-Taylor & Bowcott, 2016).
The documents also disclosed the existence of the PRISM program, which gives the NSA direct access to raw data from the largest internet platforms within their jurisdiction (Gray & Henderson, 2017, pp. 30–35, 108–109; Lyon, 2014), and the Xkeyscore program, a front-end search engine capable of extracting information on anyone in the world, provided that there are known identifiers in the form of the unique digital fingerprints that every internet user leaves behind (Gray & Henderson, 2017, pp. 474–475). Through bilateral agreements, the NSA lets its international partners use Xkeyscore, and the allies provide the NSA with their own databases (SVT, 2013).
Since the Snowden disclosures, Congress has continued in bipartisan agreement within the boundaries of the status quo, repeatedly renewing the FISA amendments that uphold both PRISM and Xkeyscore (Thorp et al., 2024). Furthermore, recent public reports from the Privacy and Civil Liberties Oversight Board (PCLOB) show that the NSA is still conducting aggressive upstream collection around the world, and the board has specifically criticized the NSA for its continued use of batch queries in its searches (Bradford Franklin et al., 2023, p. 201).
The PCLOB reports also show that the PRISM infrastructure is still in place, even though it is nowadays mostly referred to as downstream collection (Bradford Franklin et al., 2023). In addition, the PCLOB recently released a previously top-secret report on Xkeyscore (Klein et al., 2020). The report, which has gone largely under the radar of mainstream media, states that the most far-reaching program in the NSA toolbox is still operational, and the press-release includes references to activities as recent as 2022 (Bradford Franklin et al., 2024). In short, the United States is actively running comprehensive and global surveillance programs, and they save the data in long-term data retention centers.
2.2 The Current Cryptographic Paradigm
The standard way of protecting one’s data from government surveillance on the internet is through encryption. Most encrypted transmissions online rely (in one way or another) on so-called asymmetric cryptography—a term describing algorithms that use two different keys: a public key for encrypting and a private one for decrypting (Menezes et al., 2001, pp. 25–32). This allows users to receive encrypted information securely even if the sender is located far away, like a door that can be locked by anyone who has the public key, but unlocked only by the person who holds the private key (Menezes et al., 2001, pp. 25–32). Although the two keys are mathematically connected, it is exceedingly difficult to calculate the private key from the public key alone (Menezes et al., 2001, pp. 25–32). This method relies on so-called trapdoor functions—mathematical functions that are easy to compute in one direction but difficult to efficiently invert.Footnote 3 A loose analogy: it is far harder to compose Beethoven’s 5th symphony than to verify its key signature by listening to it (Katz & Lindell, 2021).
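The asymmetry of a trapdoor function can be illustrated with a few lines of code. The following sketch (a toy example with numbers far too small for real security; real RSA moduli are thousands of bits long) contrasts the forward direction, multiplying two primes, with the inverse direction, recovering them by search:

```python
# Toy illustration of a trapdoor function: multiplying two primes is a
# single operation, while recovering them from the product requires a
# search whose cost grows with the square root of the product.

def multiply(p: int, q: int) -> int:
    """The 'easy' direction: one multiplication."""
    return p * q

def factor_by_trial_division(n: int) -> tuple:
    """The 'hard' direction: search odd candidates for a divisor.
    At real-world key sizes this search becomes astronomical."""
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 2
    raise ValueError("no nontrivial odd factor found")

p, q = 1_000_003, 1_000_033          # toy "private" primes
n = multiply(p, q)                   # public modulus: computed instantly
print(factor_by_trial_division(n))   # the attacker's view: a long search
```

Even at this toy scale, the inverse direction loops roughly a million times while the forward direction is a single instruction; the gap widens exponentially with key size.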
Today, (non quantum-proof) asymmetric algorithms are mainly used on the internet to transfer session keys and verifiable signatures, thereby establishing more secure forms of symmetric encryption, such as AES-256 (Diffie & Hellman, 1976, pp. 644–648). The AES protocol with a 256-bit key is one of the most secure and scrutinized cryptographic protocols ever created, which is why it is used so widely across platforms and operating systems. However, since AES-256 is a symmetric protocol, it uses a single secret key for both encryption and decryption, which means the key itself cannot be shared openly for remote encryption (Katz & Lindell, 2021). To encrypt communications sent over distances, symmetric protocols therefore rely on vulnerable asymmetric protocols to keep the symmetric key encrypted during transfer (Menezes et al., 2001).
Such vulnerable asymmetric protocols include RSA and ECC (Menezes et al., 2001), which protect most of the TLS traffic transmitted in browsers. For the purpose of this article, the central cryptographic point to keep in mind is that the security of most encrypted communications is mathematically dependent on the security of asymmetric protocols—which, unfortunately, are vulnerable to the looming power of quantum computers.Footnote 4 This cryptographic paradigm is the very foundation that keeps electronic information safe on the internet, and most people use it automatically every day (e.g., in online banking or standard messaging apps). Encryption is thus not merely something that clandestine actors use to hide from the law—it is the fundamental infrastructure that safeguards data privacy (Katz & Lindell, 2021).
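To make that dependency concrete, here is a minimal sketch of the hybrid pattern described above. It uses textbook RSA with deliberately insecure toy parameters, and a simple hash-based XOR keystream stands in for a real symmetric cipher like AES-256; both substitutions are for illustration only.

```python
# Hybrid encryption sketch: an asymmetric step transports a symmetric
# session key, and the bulk message is then encrypted symmetrically.
# Textbook RSA with toy primes and an XOR keystream are stand-ins for
# real RSA/ECC and AES-256 -- insecure, purely illustrative.
import hashlib

# --- toy RSA keypair (real keys are 2048+ bits) ---
p, q = 61, 53
n = p * q                            # public modulus
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudorandom stream from the session key (AES stand-in)."""
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor(data: bytes, stream: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, stream))

# Sender: pick a session key and wrap it with the recipient's PUBLIC key.
session_key = 42                                  # toy key, 0 <= key < n
wrapped = pow(session_key, e, n)                  # asymmetric step
message = b"meet at noon"
ciphertext = xor(message, keystream(session_key.to_bytes(2, "big"), len(message)))

# Recipient: unwrap with the PRIVATE key, then decrypt symmetrically.
unwrapped = pow(wrapped, d, n)
plaintext = xor(ciphertext, keystream(unwrapped.to_bytes(2, "big"), len(ciphertext)))
print(plaintext)  # b'meet at noon'
```

The point of the sketch is structural: a quantum attack on the asymmetric wrapping step (`wrapped`) exposes the session key, and with it everything the "strong" symmetric layer protected.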
2.3 The Quantum Race
Since the public and private keys in asymmetric cryptography are mathematically connected, it is theoretically possible to use the public key to derive the private key. Succeeding with such an attack does not necessarily mean that the adversary has found an efficient inverse function; it could simply mean that they have enough computing power to try possibilities until they can verify the correct key (Katz & Lindell, 2021). Historically, even the strongest supercomputers have been unable to systematically succeed with such decryption attacks. As of today, however, we stand on the threshold of the quantum era, with new computers powerful enough to systematically decrypt current asymmetric protocols (Ford, 2023; Gidney & Ekerå, 2021).
While conventional computers use bits to perform calculations, a quantum computer uses qubits. Specifically, it exploits the mechanisms of quantum physics in a controlled quantum system, entangling qubits and placing them in superpositions, thereby allowing certain calculations to be performed at incredible speed (Ford, 2023; Gidney & Ekerå, 2021). The unprecedented computing power of a single strong quantum computer can easily outperform even the strongest contemporary supercomputers in certain domains. However, a quantum computer is not necessarily suited for performing all kinds of mathematical operations familiar to conventional computers. Still, the probabilistic nature of quantum computers offers performance advantages in several mathematical areas, including factoring the products of large primes and solving the elliptic curve discrete logarithm problem—both of which are directly applicable to decrypting data encrypted with the RSA and ECC protocols (Shor, 1994; Gidney & Ekerå, 2021).
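The division of labor in Shor's algorithm can be sketched classically: the quantum computer is needed only to find the "order" of a random base modulo the key, while turning that order into factors is ordinary classical arithmetic. In the toy sketch below (a standard textbook reduction), brute force stands in for the quantum period-finding subroutine:

```python
# Shor's reduction: the quantum step finds the order r of a base a
# modulo N (the smallest r with a^r = 1 mod N); classical
# post-processing then splits N. Brute-force order finding below is the
# step a quantum computer performs exponentially faster.
from math import gcd

def order(a: int, n: int) -> int:
    """Smallest r > 0 with a^r = 1 (mod n); stand-in for the quantum step."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_part(n: int, a: int):
    """Classical post-processing that turns the order into factors of n."""
    g = gcd(a, n)
    if g != 1:
        return g, n // g          # lucky guess: a already shares a factor
    r = order(a, n)
    if r % 2 == 1:
        return None               # odd order: retry with another base a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None               # trivial square root: retry
    f = gcd(y - 1, n)
    return (f, n // f) if 1 < f < n else None

print(shor_classical_part(15, 7))  # prints (3, 5)
```

For N = 15 and base 7, the order is 4, and gcd(7² − 1, 15) = 3 yields the factors (3, 5); a CRQC performs the same order-finding step on thousand-bit moduli.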
While no superpower has formally admitted to collecting foreign data en masse for later decryption (the HNDL principle), the leading quantum powers all accuse one another of doing so (Lague, 2023). As Krelina (2023) points out, government and private intelligence reports already show clear signs that several countries are saving data with the aim of decrypting it with quantum computers. Additionally, a recent report from Reuters details how the Pentagon is already cooperating with private cybersecurity firms specializing in quantum decryption and quantum systems. One such business representative is Tilo Kunz who, after his briefing at the Pentagon, explained to the Reuters reporter that “Everything that gets sent over public networks is at risk” (Lague, 2023).
Regarding data retention length, leaked documents from the United States show that the NSA can store encrypted data for as long as it takes to decrypt it (Greenberg, 2013; Rogers, 2014). Furthermore, the bill regulating data retention at the NSA—the Authorization Act for Fiscal Year 2015—explicitly states that the agency can keep plaintext data regarding United States citizens for five years, and encrypted data regarding anyone in the world indefinitely (Erwin, 2015; ProPublica, 2013; Rogers, 2014). This means that there are no judicial guardrails ensuring that collected encrypted data is ever cleared from government servers. Not knowing how long our encrypted secrets are stored—for a short while or forever—creates a strange kind of uncertainty, an issue that we will return to in Sect. 4.2.
The fact that encrypted data is being stored for future decryption does not necessarily mean that plaintext data is guaranteed to be cleared shortly after collection, since some governments store plaintext data for prolonged periods as well. However, it is a fact that encrypted communication data collected by the United States is stored longer than plaintext data, as established by the Authorization Act for Fiscal Year 2015, which is still in effect today (ProPublica, 2013; Rogers, 2014).
The advancement of quantum computing, combined with governments’ ambition to disclose encrypted secrets, has given rise to a new field of cryptography known as post-quantum cryptography (PQC). These novel algorithms are designed specifically to withstand the immense computing power of CRQCs. Two notable algorithms are CRYSTALS-Dilithium and FALCON, whose protocols rely on lattice problems, in which finding the shortest vector in a multidimensional coordinate system serves as the trapdoor.Footnote 5 These algorithms have shown promising results (NIST, 2022; Pham et al., 2023), suggesting that the future arrival of CRQCs need not compromise the safety of the internet.
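The hardness assumption behind such lattice schemes can be illustrated in miniature. The toy sketch below (with assumed example numbers) brute-forces the shortest vector in a two-dimensional lattice, which is easy in two dimensions but, in the hundreds of dimensions used by real PQC schemes, is believed intractable even for quantum computers:

```python
# Toy shortest-vector problem (SVP): find the shortest nonzero vector
# in the lattice of integer combinations x*b1 + y*b2. Exhaustive search
# works in two dimensions; in high dimensions no efficient algorithm,
# classical or quantum, is known -- hence the post-quantum security.
from itertools import product

def shortest_vector(b1, b2, bound=12):
    """Brute-force the shortest nonzero lattice vector among small
    integer combinations of the basis vectors b1 and b2."""
    best, best_norm = None, float("inf")
    for x, y in product(range(-bound, bound + 1), repeat=2):
        if (x, y) == (0, 0):
            continue
        v = (x * b1[0] + y * b2[0], x * b1[1] + y * b2[1])
        norm = v[0] ** 2 + v[1] ** 2
        if norm < best_norm:
            best, best_norm = v, norm
    return best

# A deliberately skewed basis hides the short vector it generates:
print(shortest_vector((87, 10), (95, 11)))  # prints (1, 1)
```

The skewed public basis reveals nothing obvious about the short vector (1, 1); in real schemes, knowledge of a "good" (short) basis plays the role of the private key.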
However, the implications still remain uncertain. Depending on how soon Q-day arrives, Lindsay (2020b) has suggested possible outcomes regarding the amount of data that could be compromised. The worst scenario from a privacy standpoint is a rapid emergence of CRQCs, such that vulnerable transmissions can be intercepted in real-time and later decrypted by the computers. Yet, according to current estimates, quantum-proof protocols will be implemented on a large scale before quantum computers can start to decrypt data in bulk (Lindsay, 2022).
The development of PQC protocols is a major achievement for the protection of data privacy. However, as we shall argue in Sect. 4.1, such algorithms are not sufficient to safeguard historic private data in a broader sense. With previous and contemporary data already stored by governments in retention centers, there is a colossal pool of private data that is not protected by the quantum-proof algorithms. As such, a new type of temporal asymmetry arises, where quantum-proof encryption can protect the future, but not the past (Lague, 2023).
3 Theorizing Online Surveillance
There are myriad theories for understanding surveillance in modern society. The most recognized among these is the metaphor of the panopticon, a hypothetical prison originally described by Jeremy Bentham in the 18th century (Bentham & Božovič, 2010). The panopticon has been widely used in academia as a prison metaphor for understanding surveillance (Gandy, 2021). In its design, the panopticon is a circular building with cells facing inwards around the walls, and with a central watchtower in the middle. While the cells are lit up at all times, the watchtower is veiled in darkness, such that a potential watchman can see the inmates at any time, without them knowing whether they are being watched. The mere possibility of such surveillance is meant to inspire obedience, as inmates are conditioned to act as if they are watched at all times (Bentham & Božovič, 2010).
For Foucault (1995), the panoptic surveillance of modernity referred mainly to physical forms of monitoring where institutions such as schools, hospitals, workplaces and prisons assumed the function of factories, with the aim of normalizing citizens into operational gears in the capitalist machinery. This process, Foucault argued, was materially driven by the capitalist need for a disciplined workforce. In modernity, the punishments against bodies that characterized feudal means of discipline were thus largely exchanged for new kinds of punishments against the soul in the form of internalized self-regulation, i.e., the mechanism by which citizens themselves adopt the gaze of the (supposed) watchman (Foucault, 1995, pp. 208–215).Footnote 6
Foucault’s theory of panoptic surveillance has remained the foundation of much subsequent surveillance theory, but the advancement of new information technologies in the 1980s and 1990s radically expanded its scope. As indicated by Clarke’s (1988) influential term dataveillance, the focus shifted from physical forms of surveillance (such as schools and factories) to more explicitly data-driven modes of monitoring through records and automatic data processing. Clarke’s dataveillance theory is one of the earliest academic descriptions of how digital data compresses time into space, allowing for a digital temporal telepresence backwards in time, something that this article builds upon and extends in Sect. 4.1. This data-driven surveillance model of the 1990s and early 2000s introduced new power dimensions and mechanisms of domination by both government and private actors, which have only grown in scope and sophistication in the last decades (Zuboff, 2020).
In the wake of the internet revolution, Haggerty and Ericson (2000) described how collected data separates the information from the actual person and creates a virtual “data double”. The notion of digital data doubles is especially applicable in contemporary society, where vast social, legal and medical records are stored for each citizen—records that can easily be merged into a coherent personal database by anyone who has access to the information.
As described by Gandy (2021), the vast sorting of personal information in the digital era can be used for social control, which he refers to as panoptic sorting. By categorizing data at the level of individuals, as in the Cambridge Analytica scandal (Cadwalladr & Graham-Harrison, 2018; Hinds et al., 2020), which affected the presidential election in the United States, actors who hold centralized access to information (akin to the watchman’s position in the panoptic guard tower) may illicitly influence the public without their knowledge (Gandy, 2021, pp. 15–20).
Despite its central position in the literature, several theorists have questioned the aptness of the panopticon metaphor for the online landscape. As early as 1993, Lyon rejected the idea of an electronic panopticon with a central watchman who monitors digital data (Lyon, 1993). Bauman (2000, pp. 85–86) as well as Haggerty and Ericson (2000) similarly claimed that digital surveillance is more decentralized and subtle than the direct and hierarchical control of a (panoptic) prison. As Bauman and Lyon (2013) argue in a joint discussion, panoptic surveillance is nowadays mainly present at institutions such as actual prisons and psychiatric wards, which still rely on hierarchical mechanisms with direct control. In their perspective, the postmodern society has a dominant mode of societal control that is more liquid: “The architecture of electronic technologies through which power is asserted in today’s mutable and mobile organizations makes the architecture of walls and windows largely redundant” (p. 10). A similar position can be found in Elmer (2003), who maintains that the modern surveillance infrastructure is not as centralized as in a panoptic prison. Elmer also notes that today’s surveillance is not entirely coercive: there are several digital services that we freely sign up for, knowing very well that they will collect our data (pp. 236–237).
Another type of criticism is offered by Dupont (2008), who sees the online surveillance landscape as more horizontal, where users monitor each other, and thus blur the line between who is watching and being watched. Brivot and Gendron (2011) argue along the same lines, noting that the panopticon has lost its theoretical sharpness due to the decentralized nature of today’s surveillance systems: “[…] unlike the panopticon which leaves little doubt as to the existence of a relatively clear surveillance project, there is no unified surveillance master plan underlying the proliferation of technologies, there is no central watching figure either” (Brivot & Gendron, 2011, p. 140). Modern surveillance, that is, does not rely on individual and enclosed spaces that can be centrally monitored—on the internet, there are no coercive enclosures representing the cells of the panoptic prison.
These criticisms may hold true for the monitoring of consumers on social media, which is not hierarchical and disciplinary in the way originally imagined by Foucault. However, they fail to account for the deep interconnectedness between surveillance capitalism and the unquestionably hierarchical surveillance programs led by governments. As we have seen, contemporary surveillance programs do have something that resembles cells and a central watchtower. While the internet is a messy place, there are efficient systems for categorizing data, such as Xkeyscore, which can identify any individual online using only parameters and keywords (Gray & Henderson, 2017, pp. 474–475). Such a tool can take any given person as input and retrieve their vast records of private data in compiled, personal folders accessible only to the watchman, here represented by government agents.
All in all, Bauman, Lyon and their fellow critics of the panopticon metaphor are quite correct in questioning its applicability in a world of liquid surveillance, insofar as they describe only the private collection of data. Governmental surveillance, with its upstream collection, remains comprehensive, far-reaching and coercive. This distinction between government and private surveillance, alongside an understanding of their interconnectedness, is paramount for theoretical clarity in surveillance studies. The only available means of protection against governmental surveillance is encryption: to “hack the panopticon,” as Dupont (2008) puts it, by making the watchman incapable of seeing what is happening within the encrypted cell. Yet, as we will show in the following section, the reality of such a hacking endeavor may actually be inverted. By encrypting their data, digital inmates are led to believe that they are hacking the panopticon, and this may well be true for the present moment; but in the quantum panopticon, this very action also opens a new cell, in which they may be trapped forever.
4 Introducing the Quantum Panopticon
We posit that the advent of quantum computing brings a new mode of surveillance into being. We refer to it as a quantum panopticon. The traditional panopticon metaphor is limited to a spatial area within the prison, and an uncertainty about whether the watchman is looking. As illustrated in Table 1, the quantum panopticon introduces two crucial new features.
The first is a new kind of temporal axis. Data subjects are not only monitored periodically in the present; because of the HNDL principle, they also risk having their pasts monitored by a future watchman equipped with a CRQC, whose gaze pierces not only through time but also into a part of the panoptic cell that was previously unavailable for temporal analysis. As such, the quantum panopticon expands the traditional metaphor across a new dimension: where there was once only a horizontal spatial axis (the prison), there is now also a vertical temporal axis.
It should be noted, again, that surveillance of plaintext data already involves temporal analysis backwards. Indeed, the last decades have seen a societal debate on data privacy and the predicament of persistent data retention on online platforms (Council of the European Union, 2016; Mayer-Schönberger, 2009).Footnote 7 However, as elaborated further in the next section, we argue that the quantum panopticon introduces a new kind of temporal dimension that modifies the already existing temporal axis both in terms of data types (scope) and temporal extension (reach). The scope is expanded because encrypted data was previously unavailable for analysis, and the reach is increased because encrypted data is generally stored longer than plaintext data (Erwin, 2015; Greenberg, 2013; ProPublica, 2013; Rogers, 2014). Meanwhile, the new temporal dimension of decrypted data unlocked by emerging quantum computers is merely one type of temporal surveillance in modern society; there are others, but analyzing them lies outside the scope of this article.
The second feature is a new form of uncertainty. Unlike the original panopticon metaphor, which includes only uncertainty regarding the watchman’s gaze, the quantum panopticon features an uncertainty about his very existence. Inmates cannot know whether the future watchman will actually materialize; it is entirely possible that the quantum panopticon will never come into being, which would remove the future watchman from the equation. Additionally, the spread of the quantum panopticon need not be predetermined or symmetrically distributed; indeed, the spread of any political or technological system is usually asymmetrically distributed around the world. Some countries in the future might suffer under harsh surveillance, intense quantum decryption efforts and nonexistent privacy protection laws, a scenario that would make the advent of the quantum panopticon more likely. Conversely, strong data protection laws and the continuous clearing of private data are factors that could effectively hinder the emergence of a future watchman.
Similar to Schrödinger’s famous cat, the watchman can both exist and not exist, depending on whether data subjects take caution to protect themselves in the present (e.g., by encryption). As such, current data subjects are facing a difficult unconscious dilemma between being monitored now by the current superpowers, or having their encrypted data preserved for posterity, at the risk of being potentially forever monitored by a future watchman. This does not necessarily imply that data subjects are making these decisions consciously in relation to the theory of the quantum panopticon—quite the contrary. People can make decisions that affect their future even if they themselves are unaware of the complex mechanisms involved with their choice, or, interestingly, even if they are unaware that they are making a choice. What interests us here is merely the rational theoretical conditions of their choice.
These two new features of the quantum panopticon do not replace, but are additions to, the traditional features of the panopticon metaphor. They instantiate an extra level on top of what already existed. Both of these new quantum features, the novel temporal axis and the uncertainty of existence, will be elaborated on in the two following sections.
4.1 The Novel Temporal Axis
The novel temporal axis refers to the fact that government surveillance programs, by the principle of HNDL, are routinely preserving encrypted data for posterity so as to decipher the content once CRQCs are in place. As such, the data subjects of today may have not merely their present actions monitored, but also their pasts and futures, by a currently unknown watchman looking back from the future (see Sect. 4.2).
An illustrative analogy for the original type of temporal axis is the problem of search warrants, as elaborated by Öhman (2020). In today’s smart home environments, everything is digitized. A warrant which may be valid only for a few days effectively encompasses the entire period for which the technologies have gathered data, such that “a second of spatial access may grant unlimited temporal access” (2020, p. 1073). This poses a serious problem for current norms of information privacy in forensics. As Öhman (2020, p. 1073) elaborates:
Investigators are obviously not able to travel back in time and access a home from some years ago, and hence, the temporal limits of a [search] warrant have traditionally been equal to the amount of time investigators spend there. However, if we interpret the emergence of the IoT [internet of things] as an erosion of temporal friction, time travel is no longer necessary since the events that have taken place in the home are present [to the investigator] ...
In other words, information compresses time into space. A similar mechanism arises in the context of the quantum panopticon. The inmates of the quantum panopticon may not be monitored right now, but this does not stop the future watchman from retroactively surveilling their behavior (given that their data is preserved). Although the internet is a vast and chaotic space, data subjects have little chance of hiding in the sea of information. Given the remarkable capabilities of tools like Xkeyscore (Gray & Henderson, 2017, pp. 474–475), the future watchman can easily sort past data and link it to specific individuals of interest, which essentially creates a complete prison that extends backwards in time, adding a temporal dimension to the three spatial ones.
Thus, the new kind of temporal axis backwards in time is not a new mode of surveillance per se, since all data records allow backwards gazing. However, this particular brand of retrospective surveillance stands out compared to contemporary plaintext analysis, partly because it concerns data that was once encrypted with the intention of keeping it secret. It not only expands the dataset available for surveillance and analysis, but also adds a new kind of information, one which likely contains more secrets and sensitive data. The principle of HNDL, combined with the prospect of CRQCs, opens up previously safeguarded vaults of private data, whose owners were once, at the moment of encryption, promised perpetual privacy. This is a fundamental technological shift, and it exposes the decrypted data to a wide range of potential analyses. Thus, while previous temporal analysis of stored data has been limited to plaintext data and metadata, the quantum revolution opens up a new dimension of formerly protected secrets. In summary, the emergence of cryptographically relevant quantum computers introduces a new mode of online surveillance that increases the data size, discloses new sensitive information and expands the previous temporal axis.
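The HNDL logic can be made concrete with a deliberately toy-sized sketch (a Python illustration of our own; it does not reflect any actual surveillance system). Ciphertext recorded today remains readable by whoever can later factor the public modulus; brute-force factoring of a tiny key stands in here for what Shor's algorithm on a CRQC would achieve against real 2048-bit RSA keys:

```python
# Toy RSA "harvest now, decrypt later" sketch (illustrative only).
# The 16-bit modulus is trivially factorable; real keys are 2048+ bits,
# where only a cryptographically relevant quantum computer could factor n.
p, q = 211, 239                    # secret primes (tiny, insecure)
n, e = p * q, 65537                # public key, broadcast to the world
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent (never transmitted)

msg = 4242
ciphertext = pow(msg, e, n)            # harvested and stored by an adversary
assert pow(ciphertext, d, n) == msg    # legitimate decryption works today

# Years later: factor n (brute force here; Shor's algorithm at scale),
# rebuild the private key, and read the old traffic.
f = next(k for k in range(2, n) if n % k == 0)
d_recovered = pow(e, -1, (f - 1) * (n // f - 1))
recovered = pow(ciphertext, d_recovered, n)
assert recovered == msg                # the "perpetual" secret is now plaintext
```

The point is structural: the stored ciphertext never changes; only the cost of factoring does, which is why retaining it indefinitely is rational for an adversary anticipating Q-day.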
To contextualize this extension of the original temporal axis, consider past expansions of the panopticon metaphor. In the original outline for the panopticon prison, Bentham described how the mere possibility of being under the watchman’s gaze would alter the inmates’ behavior permanently (Bentham & Božovič, 2010). Foucault broadened the metaphorical scope of this mechanism to the totality of modern society, or at least to key institutions like schools, workplaces, prisons and psychiatric wards (Foucault, 1995). Finally, surveillance scholars like Clarke (1988) and Gandy (2021) expanded the scope of Foucault’s theory—from the physical world of institutions, to the world of data, introducing the conventional temporal axis. It is in view of this incremental expansion that the new temporal axis should be interpreted—as the fourth step in the series following from prison to society, from society to data, and, finally, from data to the expanded temporal axis.
However, unlike previous (spatial) expansions, the conventional temporal axis constitutes an entirely new vector of surveillance. Whereas the move from prison to society is horizontal, the time axis may be seen as a vertical expansion. As such, it grants the watchman a telepresence gaze, which, in the quantum panopticon, turns almost omniscient, as it can both unveil encrypted secrets and transcend momentary time barriers, matching the etymological origins of the panopticon: the Greek word pan-optes (πανόπτης), meaning all-seeing.
The resulting mode of surveillance is neither subtle nor liquid, as imagined by Bauman and Lyon (2013), but rather top-down and rigid. Contrary to Brivot and Gendron’s criticisms (2011, p. 140), it also features both metaphorical cells and a central watchtower. The cells are the personal data archives, which can be identified and compiled with tools such as Xkeyscore, and are impossible to escape from (Elmer, 2003, pp. 236–37). The watchtower is the (future) government(s) who control the archives of encrypted data from the digital past.
Lindsay is correct to say that the new quantum-proof algorithms are being developed faster than the quantum threat is approaching (Lindsay, 2022). However, with encrypted data being stored for extended periods, the old and vulnerable algorithms, which are still used today, will hold the key not only to our contemporary secrets but also to our past. The first generation of internet users might thus live in a technical interregnum between two distinct privacy paradigms. Before the popular breakthrough of the internet, it was easier to maintain control over your private information, since the means of communication were slower and more local. Meanwhile, the future of the internet will likely be protected by more robust privacy frameworks and quantum-proof cryptographic algorithms. As such, the early years of the internet may be just an unstable and chaotic exception, where lawmakers struggled to figure out the new digital landscape while engineers developed the technology faster than they could protect it. The point is that those living in this precarious interregnum may be forever exposed by the future watchman.
In Bentham’s and Foucault’s panoptic prisons, the inmates are thought to change their behavior because the guard may be watching at any given point. The same seems to be the case for the dataveillance landscape. As Stoycheff et al. (2019, pp. 8–9) have suggested, the mere presence of internet surveillance changes people’s online behavior, deterring online criminal activity but, crucially, also decreasing political activity. Do such self-regulatory mechanisms also apply to the novel temporal axis of the quantum panopticon? On the one hand, very few people are aware of the latest developments in cryptography, surveillance and quantum computing. Moreover, since the future watchman does not yet exist, his existence is hardly visible to the inmates in the sense imagined by Bentham and Foucault. On the other hand, this creates a peculiar paradox: the self-regulatory effect is negligible as long as you do not try to resist, but it increases the more you try to learn about your predicament. As we develop in the following section, the self-regulation imposed by the quantum panopticon works like a noose, which tightens the more you resist.
These properties of the novel temporal axis that we propose illustrate why current data privacy solutions are both technically and politically insufficient. When conceptualizing surveillance through the standard panopticon metaphor, we see only the spatial prison at individual moments: a set of cells with a guard watching the inmates in real time. The surveillance here is limited to the present, so that if the guard is absent at any point, they cannot reach back in time to retroactively watch the cell. This traditional metaphor is, of course, still applicable in some parts of society. However, it prevents us from seeing the future watchman enabled by the coming advent of CRQCs. By applying the temporal axis, we acknowledge that the described surveillance and social control can go beyond merely spatial and momentary dimensions. This temporal addition is not a quantitative modification of the panopticon, since it does not make the spatial prison cell larger; instead, it is a qualitative adjustment that allows the future watchman to see how the cell has changed over time. Our contribution to the already existing temporal axis expands its scope and reach, since the quantum decryption process reveals more data, new types of data and older data, given that encrypted data is generally stored longer than plaintext data (Rogers, 2014).
Finally, it may be worth asking about the identity of this future watchman. Is it a government employee, a historian, a genealogist, an artificial agent, or simply a result of our imagination? These are questions that we, just like the ancient Mycenaeans, cannot answer with certainty, since our knowledge about our own place in history is limited to the context we live in. There are both known unknowns as well as unknown unknowns. But as we show in the next section, the answer may, in the most ironic way, depend on our course of action.
4.2 Uncertainty of Existence
The second dimension of the quantum panopticon is the uncertainty of the future watchman’s existence. In addition to the uncertainty about the gaze of the current watchman, the inmates of the quantum panopticon also face uncertainty regarding the existence of a future watchman, whose gaze operates on the novel temporal axis described above.
The development of CRQCs for the purpose of government surveillance is not a historically determined scenario. A considerable number of unknown variables are involved. For example, since decryption requires computing power, not all available encrypted data will be decrypted—some secrets will (probably) live on forever. However, even if governments focused mainly on military secrets, it would be impossible to pinpoint only those, since the very idea of intelligence operations is to use encryption to hide the content. With the content of the encrypted data being impossible to determine, and intelligence secrets trying to blend in among other types of data, it is plausible that governments would decrypt data in bulk and then automatically search within the new plaintext database.
One possible counterargument against the plausibility of such a future watchman is that the vastness of big data is also its weakness—detailed or personal data may drown in the enormity of the digital sea. However, modern computer systems are already more than capable of pinpointing specific data clusters through identifiers. Such features are available in e.g., Xkeyscore, which, as we have seen, can group personal data based on a number of digital identifiers, many of which are not disguised by using a VPN (Klein et al.).
With encrypted data surviving, and potentially being decrypted in the future, the persisting information will not only contain more secrets, and therefore be more valuable; it will also be the very data that people have taken active measures to protect. This connection works both ways: an unprotected transmission will probably expose you more in the moment, while an encrypted transmission exposes your data more in the long run. Taking this logic into historical perspective, plaintext data is more likely to be cleared and lost forever (ProPublica, 2013).
Irrespective of whether such a situation will materialize, we argue that the very uncertainty itself introduces an interesting trade-off among current data subjects. For illustration, consider an analogy from quantum physics. Subatomic particles may exist in a “superposition,” meaning that they exist simultaneously in multiple states until measured. A particle can exist and not exist in a certain state at the same time, and only when observed does the wave function “collapse” into one of them (Von Neumann, 1996).
Our future watchman is also situated in such a superposition. He both exists and does not exist at the same time, depending on whether the inmates take action to protect themselves from the current gaze in the present. They can hide from the current watchman by encrypting their data, but this very act increases the likelihood of their data becoming available to the future watchman, due to governments’ efforts to store encrypted data awaiting Q-day. Conversely, they can increase their chance of hiding from the future watchman, but only by exposing themselves to the present watchman. In other words, it is the inmates’ very actions to protect themselves that bring the future watchman into existence. They are unknowingly trapped in a trade-off between being surveilled in the present and risking eternal surveillance by a future (unknown) watchman.
This kind of causal relationship, in which the very act of trying to achieve a specific goal unintentionally produces the opposite outcome, “a withdrawal that creates what it withdraws from” (Žižek, 2014, p. 66), is commonly referred to by political theorists as an absolute recoil.Footnote 8 The political theory literature is full of such unexpected causal outcomes, which challenge our intuitive understanding of cause and effect in a complex and multifaceted world. One example is capitalism’s ability to neutralize radical resistance by reforming into a fundamentally unchanged order, as identified by Mouffe and Laclau (2001). Another is how consumer culture upholds capitalism’s hegemony by commodifying resistance through countercultural commodities and pseudo-radical consumer gratifications that pacify the political subject (Marcuse, 1991). The latter is also detailed by Žižek (2009), who argues that opposition to market forces can be converted and commodified into charity and socially conscious business models, where the practices of resistance and consumption become intertwined.
We argue that a similar mechanism applies when data subjects try to resist the global surveillance infrastructure. By encrypting their communications, data subjects act merely as isolated individuals, concerned only about their own privacy as a private good. It is, in essence, an act of damage control, aimed at treating an isolated symptom of the order rather than its cause. As such, most individualistic solutions to the threat of government surveillance are likely doomed; they generate an absolute recoil that may protect data subjects in the present, only to risk having their entire lives monitored from the future.
In sum, encrypting information will make it less visible today but risks increasing the duration for which the data is stored, as well as making the data more visible tomorrow. In this rather Mycenaean manner, an unexpected mechanism instigates the survival of the private data by burning it into solid digital remains, to borrow a term from Lingel (2013). Therefore, the people trying to hack the panopticon using encryption, as outlined by Dupont (2008), might instead find themselves being hacked by the future watchman. Following the logic of an absolute recoil, the very act of trying to hide from the watchman is the call that summons him.
5 Concluding Discussion
The chief goal of this article has been to introduce the concept of the quantum panopticon, and to explore its features, the expanded temporal axis and the uncertainty of the future watchman’s existence, so as to highlight their ethical and political implications. In doing so, we hope to have opened a dialogue between literatures that normally overlap only marginally, and we claim that the synergies are many. By combining the cryptographic literature with the social science perspective on surveillance, the study has added a new qualitative perspective to Lindsay’s (2020b) scenario analysis through the introduction of the quantum panopticon. Much like Lindsay’s work, this article has underscored the political nature of this technical development: quantum computers will generally not be used haphazardly by private actors, but will rather cement the current global order between superpowers (Lindsay, 2020a).
Moreover, by considering the nexus between quantum computing, cryptography and sociological surveillance theories, our analysis has provided counterarguments to scholars who question the prevalence of panoptic surveillance in the current technological terrain. It is correct that the landscape of surveillance capitalism is more decentralized and liquid than the panopticon as Foucault imagined it, as argued by Bauman and Lyon (2013) and Brivot and Gendron (2011), among others, at least on its private side. But the idea that private and government surveillance can be fully separated is false, as governments routinely rely on private actors to collect their data. While the private collection of data is alarming in its own right, the process is continuously regulated by lawmakers and thoroughly examined by journalists and researchers. Government collection, however, is shrouded in a thick veil of secrecy, sometimes operating in a legal gray area; and though this collection is more comprehensive, far too little has been said in the data preservation and online privacy literature about what will happen to this data, our data, in the years to come.
The features of the quantum panopticon also underscore that data privacy is more than a mere momentary good. While military and business secrets may have a shorter expiration date with regard to their relevance and value (Lindsay, 2020b), personal data can very well maintain its value, both for the individual whom the data concerns and for anyone interested in buying it. This is especially true since the rise of data brokering and the need for gigantic datasets for training large language models (LLMs), such as GPT, Gemini and Claude. With LLMs having nearly exhausted the available training data from the internet, and little authentic (i.e., non-machine-assisted) plaintext data left to feed the largest models, developers have already turned to synthetic data (Chen et al., 2024; Wang et al., 2024).
This highlights the monetary value and enormous technological potential embodied in the previously encrypted data, which, in contrast to synthetic data, was actually created organically by real people interacting with the world around them. Decryption of such datasets would unlock a large pool of unique, organic and unused information. From a data privacy perspective, this raises the question of the privacy of dead people and the afterlife of data (Lingel, 2013; Öhman, 2024), especially since it is not a trivial task to remove specific neural weights from a large language model after the training process. This exemplifies the contemporary dictum that whoever owns the data in a digital world holds tremendous social, technological and political power.
This point is also important because it demonstrates the utility of the quantum panopticon as a theoretical construct—a means of seeing further by introducing new dimensions of power. For illustration, consider the aforementioned concept of turnkey tyranny, a term popularized by Snowden in 2013. Turnkey tyranny refers to a potential future scenario where an autocratic regime maliciously uses a surveillance system established by a previously democratic government (Gertheiss et al., 2017).
This scenario, while in no way predetermined, captures the danger wherein democratic governments build up sophisticated surveillance states to maintain order, still within the boundaries of the rule of law and legal certainty, but then inadvertently lose this powerful machinery to a new undemocratic leader. This process could occur through democratic elections, without any violent overthrow of democracy—indeed, alarming signals of such democratic backsliding are increasingly frequent, not least in the United States, which holds some of the most powerful surveillance systems—but the end result would still be the same as a full abolishment of democracy (Edgar, 2017, pp. 10–11). As the repressive machinery is already in place when the new leader is elected, all they have to do is step into the office and turn the key (Earl & Beyer, 2014, p. 220).
In the literature, the prospect of a turnkey tyranny has been understood mainly as a problem for the future, such that we can hypothesize about it from our current standpoint within a safe democratic context (Gertheiss et al., 2017). However, through the lens of the quantum panopticon, we clearly see why it is also a present problem. This is, partly, because the mere prospect of a future turnkey tyranny may (in rare cases) alter our behavior today. A theoretical rational agent who is cautious with their privacy may, for instance, refrain from encrypting their data out of fear that the totality of their life may become visible to a watchman lurking in an unforeseeable distant (or close) future. As discussed earlier, the more informed we are about the prospect of surveillance, the bigger the incentive to take protective action—which, as we have seen, may seriously backfire. As such, the rise of a turnkey tyranny reverberates both backwards and forwards in time.
More importantly, though, as inmates of the quantum panopticon, it is reasonable to say that we are potentially already living within the gaze of the future turnkey tyranny’s watchmen. We are the data points upon which they gaze back, our physical selves reflected in the form of a data double (Haggerty & Ericson, 2000). Therefore, the current generation fights not only for the most secret information of its own time, but also for the people living in the past. The fight for our future intrinsically becomes a fight for our past.
Our analysis of the quantum panopticon, that is, allows us to see that turnkey tyranny is not merely a distant process with a looming starting point in the future which, once implemented, will only be forward-facing. By taking what we have called the new type of temporal axis into account, we now see that our most personal pasts may one day become (tele-)present in the future. Quantum computing risks eroding the temporal friction (Öhman, 2020) that upholds the distinction between the present and a future turnkey tyranny. If future generations are in political trouble, due to authoritarian politics without consideration for privacy, then so are past generations. This underscores the importance of democratic systems that can withstand turbulence and respect both current and past generations.
However, it is entirely possible that Q-day will arrive so far in the future that all those who belong to the current generation are dead. Such a scenario may alleviate some of the graver privacy concerns, but may still hold great political significance. As shown by Öhman (2024), the data of the dead is inextricably entangled with that of the living. Moreover, while the individuals whose data may be decrypted by the turnkey tyrant could be dead, the organizations they represent may certainly outlive them. As such, a turning of the future key remains an important political event, no matter how distant.
In turn, this illuminates the notion that the remedy to the repression of the quantum panopticon is not to act as separate, individual inmates, trying to escape the momentary prison simply by encrypting one’s data. As outlined in Sect. 4.2, the individualistic solution of encrypting one’s own data risks leading to an absolute recoil, in the form of a privacy backlash, since encrypted data (vulnerable to quantum computers) is stored longer than plaintext data (Rogers, 2014). This recoil mechanism inadvertently strengthens the very surveillance power structure that the data subject is trying to escape.
In dissecting the mechanics of this absolute recoil, we propose that the remedy is not necessarily technological at all, but rather political. In fact, every generation has a window of opportunity to put forward comprehensive privacy and data protection laws for clearing and storing data, including that which is encrypted, to avoid having their information misused in the future. This is a political struggle that is not easily fought, partly because of the powerful commercial and government interests that willingly collect and profit from the data (Zuboff, 2020). But by highlighting that data privacy is an intergenerational project, rather than a mere momentary good, the stakes of this struggle become more evident, and so do our incentives to pursue it (Öhman, 2024).
As the remarkable properties of quantum physics bring about a new kind of computer, they simultaneously bring about a new world. In the wake of this shift, many of the concepts we use to theorize data privacy and surveillance will need amendment. This article has focused on one such concept, the panopticon metaphor, and revised it to better account for surveillance and privacy in the quantum era. As we have shown, the contemporary situation is full of contradictions. Not only is our present privacy at stake, but also our past and our future, which become inextricably entangled in the quantum panopticon.
Moreover, through a new type of quantum uncertainty, it is now our very means of protection in the present that risks exposing us in the future. As a subtle echo from the Mycenaeans, the very actions intended to make you less visible on the internet might instigate the virtual flame that burns your private data into hardened file vaults for future generations to view—an absolute recoil in the quantum panopticon.
Notes
The leakage of unencrypted data is tangential to the privacy threat of open-source intelligence (OSINT), which has revolutionized journalism, hacking and intelligence gathering in the last decade. With new OSINT tools for reverse email lookup, data mapping and database searching, the passwords and account information of billions of internet users are currently exposed openly on the internet. Furthermore, the issue of cryptographic keys falling into the hands of unauthorized actors is covered under the PRISM surveillance program, elaborated on further in Sect. 2.1 (Bradford Franklin et al., 2023; Gray & Henderson, 2017).
As we elaborate in Sect. 4.2., we here use the term “superposition” in a specific metaphoric sense, not to be literally interpreted.
Trapdoor functions used in cryptography should not be confused with one-way hash functions, such as SHA-512 and MD5, which are used to store passwords on servers (MD5, however, is no longer considered secure for this purpose).
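The one-way property of such hash functions can be illustrated with a minimal sketch using Python's standard library (our own example; real systems would use a deliberately slow key-derivation function such as scrypt or Argon2 rather than plain salted SHA-512):

```python
import hashlib
import secrets

def hash_password(password, salt=None):
    # The server stores only (salt, digest); the digest cannot feasibly
    # be inverted back into the password.
    salt = salt or secrets.token_bytes(16)
    digest = hashlib.sha512(salt + password.encode("utf-8")).hexdigest()
    return salt, digest

def verify_password(password, salt, expected_digest):
    # Recompute and compare: the server confirms the password without
    # ever being able to "decrypt" what it stores.
    return hashlib.sha512(salt + password.encode("utf-8")).hexdigest() == expected_digest

salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)
assert not verify_password("wrong guess", salt, digest)
```

Unlike a trapdoor function, there is no secret that would let anyone, including the server itself, reverse the stored digest.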
While modern handshake protocols often include methods for forward secrecy (which ensures that a unique session key is generated for each user session rather than being derived from long-term asymmetric keys), this measure is specifically designed to minimize the damage if the server is compromised, and is not necessarily effective against an overpowered quantum system that can decrypt the asymmetric keys in real time. For a discussion on forward secrecy, see Katz & Lindell, 2021, p. 493.
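The ephemeral-key idea behind forward secrecy can be sketched with a toy finite-field Diffie–Hellman exchange (a Python illustration of our own, with a deliberately small prime; production protocols such as TLS 1.3 use large groups or elliptic curves):

```python
import secrets

P = 0xFFFFFFFB   # small prime modulus: illustrative only, insecure
G = 5            # public generator

def ephemeral_session_key():
    # Each session draws fresh secrets, so there is no long-term private
    # key whose later theft would unlock previously recorded traffic.
    a = secrets.randbelow(P - 2) + 1    # Alice's per-session secret
    b = secrets.randbelow(P - 2) + 1    # Bob's per-session secret
    A, B = pow(G, a, P), pow(G, b, P)   # exchanged in the clear
    k_alice, k_bob = pow(B, a, P), pow(A, b, P)
    assert k_alice == k_bob             # both sides derive the same key
    return k_alice

# Independent sessions yield independent keys.
k1, k2 = ephemeral_session_key(), ephemeral_session_key()
# Note, however: a CRQC that solves discrete logarithms could still recover
# a secret from the publicly exchanged values, which is why forward secrecy
# alone does not protect traffic recorded under the HNDL principle.
```

The design point is that compromise is compartmentalized per session, not eliminated: the public values still carry enough information for a quantum adversary.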
It should be noted that there are further potential ways to cryptographically combat the emerging computing power, for example by utilizing quantum random numbers in already existing protocols, or by using still-experimental quantum key distribution, which relies on superpositions and polarization filters.
Foucault sometimes referred to this mechanism as governmentality and panopticism.
A peculiar case of data privacy law arises when encrypted data of EU citizens is stored in data retention centers, since these citizens have the right under the GDPR to have their personal data removed upon request (Council of the European Union, 2016). However, a citizen cannot prove that an actor has stored their personal data if the data is encrypted. This creates a democratic deficit, in which EU citizens are unable to exercise their data privacy rights in relation to agencies and companies that store encrypted data without permission. Interestingly, there is no binding case law from the Court of Justice of the European Union addressing this specific data privacy problem.
Translators of Hegel refer to this concept in different ways, for example as “absolute recoil”, “absolute negation” and “counter-thrust”, but Hegel himself mainly used the German term “absoluter Gegenstoß” (Hegel & Miller, 1969, p. 412).
References
Ackerman, S., & Roberts, D. (2013). NSA phone surveillance program likely unconstitutional, federal judge rules. The Guardian, December 16, 2013, from https://www.theguardian.com/world/2013/dec/16/nsa-phone-surveillance-likely-unconstitutional-judge
Bauman, Z. (2000). Liquid modernity. Polity Press.
Bauman, Z., & Lyon, D. (2013). Liquid surveillance: A conversation. Polity Press.
Beattie, A. J. (1956). Mr. Ventris’ decipherment of the Minoan linear B script. The Journal of Hellenic Studies, 76, 1–17. https://doi.org/10.2307/629549
Becker, S., Gude, H., Horchert, J., Müller-Maguhn, A., Poitras, L., Reißmann, O., Rosenbach, M., Schindler, J., Schmid, F., Sontheimer, M., & Stark, H. (2014). Inside Snowden’s Germany file. Der Spiegel, 18 June 2014, from https://www.spiegel.de/international/germany/new-snowden-revelations-on-nsa-spying-in-germany-a-975441.html
Bentham, J., & Božovič, M. (2010). The Panopticon writings. Radical thinkers. Verso.
Blegen, C. W. (1953). The palace of Nestor excavations at Pylos, 1952. American Journal of Archaeology, 57(2), 59–64. https://doi.org/10.2307/501233
Bradford Franklin, S., Felten, E., LeBlanc, T., & Williams, B. (2024). Privacy and Civil Liberties Oversight Board released unclassified 2020 XKeyscore documents. Privacy and Civil Liberties Oversight Board.
Bradford Franklin, S., Felten, E., LeBlanc, T., Williams, B., & DiZinno, R. (2023) Report on the surveillance program operated pursuant to section 702 of the foreign intelligence surveillance act. Privacy and Civil Liberties Oversight Board, https://documents.pclob.gov/prod/Documents/OversightReport/054417e4-9d20-427a-9850-862a6f29ac42/2023%20PCLOB%20702%20Report%20(002).pdf
Brivot, M., & Gendron, Y. (2011). Beyond Panopticism: On the ramifications of surveillance in a contemporary professional setting. Accounting, Organizations and Society, 36(3), 135–155. https://doi.org/10.1016/j.aos.2011.03.003
Cadwalladr, C., & Graham-Harrison, E. (2018). Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach. The Guardian, 17 March 2018.
Carroll, R. (2013). Welcome to Utah, the NSA’s desert home for eavesdropping on America. The Guardian, https://www.theguardian.com/world/2013/jun/14/nsa-utah-data-facility
Chen, H., Waheed, A., Li, X., Wang, Y., Wang, J., Raj, B., & Abdin, M. I. (2024). On the diversity of synthetic data and its impact on training large language models. arXiv. https://doi.org/10.48550/ARXIV.2410.15226
Clarke, R. (1988). Information technology and dataveillance. Communications of the ACM, 31(5), 498–512. https://doi.org/10.1145/42411.42413
Council of the European Union. (2016). Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 (General Data Protection Regulation). Official Journal of the European Union, https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32016R0679
Diffie, W., & Hellman, M. (1976). New directions in cryptography. IEEE Transactions on Information Theory, 22(6), 644–654. https://doi.org/10.1109/TIT.1976.1055638
Dupont, B. (2008). Hacking the Panopticon: Distributed online surveillance and resistance. In Sociology of crime, law and deviance (pp. 257–278). Emerald (MCB UP). https://doi.org/10.1016/S1521-6136(07)00212-6
Earl, J., & Beyer, J. L. (2014). The dynamics of backlash online: Anonymous and the battle for WikiLeaks. In L. M. Woehrle (Ed.), Research in social movements, conflicts and change (pp. 207–233). Emerald Group Publishing Limited. https://doi.org/10.1108/S0163-786X20140000037007
Edgar, T. H. (2017). Beyond Snowden: Privacy, mass surveillance, and the struggle to reform the NSA. Brookings Institution Press.
Elmer, G. (2003). A diagram of panoptic surveillance. New Media & Society, 5(2), 231–247. https://doi.org/10.1177/1461444803005002005
Erwin, M. (2015). The latest rules on how long NSA can keep Americans’ encrypted data look too familiar. Just Security, https://www.justsecurity.org/19308/congress-latest-rules-long-spies-hold-encrypted-data-familiar/
Ford, P. (2023). The quantum cybersecurity threat may arrive sooner than you think. Computer, 56(2), 134–136. https://doi.org/10.1109/MC.2022.3227657
Foucault, M. (1995). Discipline and punish: The birth of the prison (2nd ed.). Vintage Books.
Gandy, O. H. (2021). The panoptic sort: A political economy of personal information. Oxford University Press.
Gertheiss, S., Herr, S., Wolf, K. D., & Wunderlich, C. (Eds.). (2017). Resistance and change in world politics: International dissidence (1st ed.). Springer. https://doi.org/10.1007/978-3-319-50445-2
Gidney, C., & Ekerå, M. (2021). How to factor 2048 bit RSA integers in 8 hours using 20 million noisy qubits. Quantum, 5, 433. https://doi.org/10.22331/q-2021-04-15-433
Gray, D., & Henderson, S. E. (Eds.). (2017). The Cambridge handbook of surveillance law (1st ed.). Cambridge University Press.
Greenberg, A. (2013). Leaked NSA doc says it can collect and keep your encrypted data as long as it takes to crack it. Forbes, https://www.forbes.com/sites/andygreenberg/2013/06/20/leaked-nsa-doc-says-it-can-collect-and-keep-your-encrypted-data-as-long-as-it-takes-to-crack-it/?sh=228defcdb07d
Greenwald, G. (2014). No place to hide: Edward Snowden, the NSA, and the U.S. Surveillance State (1st ed.). Metropolitan Books Henry Holt.
Haggerty, K. D., & Ericson, R. V. (2000). The surveillant assemblage. The British Journal of Sociology, 51(4), 605–622. https://doi.org/10.1080/00071310020015280
Hegel, G. W. F. (1969). Hegel’s science of logic (A. V. Miller, Trans.). Muirhead Library of Philosophy. Allen & Unwin.
Hinds, J., Williams, E. J., & Joinson, A. N. (2020). “It wouldn’t happen to me”: Privacy concerns and perspectives following the Cambridge Analytica scandal. International Journal of Human-Computer Studies, 143, 102498. https://doi.org/10.1016/j.ijhcs.2020.102498
Katz, J., & Lindell, Y. (2021). Introduction to modern cryptography (3rd ed.). CRC Press.
Klein, A., Felten, E., Nitze, J., LeBlanc, T., & Bamzai, A. (2020). Report on certain NSA uses of XKEYSCORE for counterterrorism purposes. Privacy and Civil Liberties Oversight Board.
Krelina, M. (2023). The prospect of quantum technologies in space for defence and security. Space Policy, 65, 101563. https://doi.org/10.1016/j.spacepol.2023.101563
Laclau, E., & Mouffe, C. (2001). Hegemony and socialist strategy: Towards a radical democratic politics (2nd ed.). Verso.
Lague, D. (2023). U.S. and China race to shield secrets from quantum computers. Reuters, https://www.reuters.com/investigates/special-report/us-china-tech-quantum/
Lindsay, J. R. (2020a). Demystifying the quantum threat: Infrastructure, institutions, and intelligence advantage. Security Studies, 29(2), 335–361. https://doi.org/10.1080/09636412.2020.1722853
Lindsay, J. R. (2020b). Surviving the quantum cryptocalypse. Strategic Studies Quarterly, 14(2), 49–73.
Lindsay, J. R. (2022). These are not the droids you’re looking for: Offense, defense, and the social context of quantum cryptology. In J. R. Lindsay (Ed.), Quantum international relations (pp. 153–171). Oxford University Press.
Lingel, J. (2013). The digital remains: Social media and practices of online grief. The Information Society, 29(3), 190–195. https://doi.org/10.1080/01972243.2013.777311
Lyon, D. (1993). An electronic Panopticon? A sociological critique of surveillance theory. The Sociological Review, 41(4), 653–678. https://doi.org/10.1111/j.1467-954X.1993.tb00896.x
Lyon, D. (2014). Surveillance, Snowden, and big data: Capacities, consequences, critique. Big Data & Society, 1(2), 2053951714541861. https://doi.org/10.1177/2053951714541861
MacAskill, E., & Ball, J. (2013). Portrait of the NSA: No detail too small in quest for total surveillance. The Guardian, https://www.theguardian.com/world/2013/nov/02/nsa-portrait-total-surveillance
Marcuse, H. (1991). One-dimensional man: Studies in the ideology of advanced industrial society. Beacon Press.
Mayer-Schönberger, V. (2009). Delete: The virtue of forgetting in the digital age. Princeton University Press.
Menezes, A. J., van Oorschot, P. C., & Vanstone, S. A. (2001). Handbook of applied cryptography. CRC Press.
NIST. (2022). NIST announces first four quantum-resistant cryptographic algorithms. National Institute of Standards and Technology, https://www.nist.gov/news-events/news/2022/07/nist-announces-first-four-quantum-resistant-cryptographic-algorithms
Norton-Taylor, R., & Bowcott, O. (2016). UK spy agencies have collected bulk personal data since 1990s, files show. The Guardian, https://www.theguardian.com/world/2016/apr/21/uk-spy-agencies-collected-bulk-personal-data-since-1990s
Öhman, C. (2020). A theory of temporal telepresence: Reconsidering the digital time collapse. Time & Society, 29(4), 1061–1081. https://doi.org/10.1177/0961463X20940471
Öhman, C. (2024). The afterlife of data: What happens to your information when you die and why you should care. The University of Chicago Press.
Ott, D., Peikert, C., & other workshop participants. (2019). Identifying research challenges in post quantum cryptography migration and cryptographic agility. arXiv. https://doi.org/10.48550/ARXIV.1909.07353
Pham, T. X., Duong-Ngoc, P., & Lee, H. (2023). An efficient unified polynomial arithmetic unit for CRYSTALS-Dilithium. IEEE Transactions on Circuits and Systems I: Regular Papers, 70(12), 4854–4864. https://doi.org/10.1109/TCSI.2023.3316393
ProPublica. (2013). What you need to know about the NSA’s surveillance programs. ProPublica, https://www.propublica.org/article/nsa-data-collection-faq
Rogers, M. (2014). Intelligence Authorization Act for Fiscal Year 2015. Washington, D.C.: 113th Congress of the United States, https://www.congress.gov/bill/113th-congress/house-bill/4681/text
Satter, R. (2020). U.S. court: Mass surveillance program exposed by Snowden was illegal. Reuters, https://www.reuters.com/article/idUSKBN25T3CJ/
Shor, P. W. (1994). Algorithms for quantum computation: Discrete logarithms and factoring. In Proceedings of the 35th Annual Symposium on Foundations of Computer Science (pp. 124–134). IEEE Computer Society Press. https://doi.org/10.1109/SFCS.1994.365700
Snowden, E. J. (2019). Permanent record. Macmillan.
Stoycheff, E., Liu, J., Xu, K., & Wibowo, K. (2019). Privacy and the Panopticon: Online mass surveillance’s deterrence and chilling effects. New Media & Society, 21(3), 602–619. https://doi.org/10.1177/1461444818801317
SVT. (2013). Read the Snowden documents from the NSA. Sveriges Television, https://www.svt.se/nyheter/granskning/ug/read-the-snowden-documents-from-the-nsa
Taylor, P. (2023). Volume of data/information created, captured, copied, and consumed worldwide. Statista, https://www.statista.com/statistics/871513/worldwide-data-created/
Thorp, F., Kapur, S., & Nobles, R. (2024). Senate passes bill renewing key FISA surveillance power moments after it expires. NBC News, https://www.nbcnews.com/politics/congress/senate-renews-fisa-section-702-spying-privacy-rcna148394
Von Neumann, J. (1996). Mathematical foundations of quantum mechanics. Princeton Landmarks in Mathematics and Physics. Princeton University Press.
Wace, A. J. B. (1953). The discovery of inscribed clay tablets at Mycenae. Antiquity, 27(106), 84–86. https://doi.org/10.1017/S0003598X00024613
Wang, K., Zhu, J., Ren, M., Liu, Z., Li, S., Zhang, Z., Zhang, C., Wu, X., Zhan, Q., Liu, Q., & Wang, Y. (2024). A survey on data synthesis and augmentation for large language models. arXiv. https://doi.org/10.48550/ARXIV.2410.12896
Žižek, S. (2009). First as tragedy, then as farce. Verso.
Žižek, S. (2014). Absolute recoil: Towards a new foundation of dialectical materialism. Verso.
Zuboff, S. (2020). The age of surveillance capitalism: The fight for a human future at the new frontier of power. PublicAffairs.
Acknowledgements
This research was generously funded by Wallenberg AI, Autonomous Systems and Software Program—Humanity and Society. Special thanks to David Watson and Yunyun Zhu for proofreading and providing feedback. We would also like to thank the anonymous reviewers for their insightful comments. The article is loosely based on Erik Olsson’s 2024 master’s thesis.
Funding
Open access funding provided by Uppsala University.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Olsson, E., Öhman, C. The Quantum Panopticon: A Theory of Surveillance for the Quantum Era. Minds & Machines 35, 17 (2025). https://doi.org/10.1007/s11023-025-09723-2