The Convergence Puzzle

Note: Jackson Pollock’s masterpiece Convergence has provided the creative foundation for the overarching theme and title of this volume. Known for his eclectic painting style, Pollock is seen as a trailblazer for invention and free expression, admonishing us that “the modern painter cannot express his age, the airplane, the atom bomb, the radio, in the old forms of the Renaissance or any other past culture. Each age finds its own technique.”

When I came across this painting by chance while thinking about how to conceptualize cyberspace, it stood out to me for its portrayal of complexity – yet underpinned by a harmony of sorts. This seemed a fitting frame for the topic, with the dynamics of complexity evident in Convergence speaking to the complexity inherent in cyberspace – a realm of converging and diverging forces and interests: technological, social, political, economic, institutional, cultural, ideational/ideological and strategic. These forces co-exist, compete and act upon each other, forming a complex ecosystem of dynamic, interlinked threat and opportunity vectors.

Once I discovered the inscription provided by the Albright-Knox Art Gallery, where the painting has its home, the parallel became even stronger: “for Pollock, the process of dripping, pouring, and splattering provided him with a combination of chance and control.”

The dialectic between chance and control is also at play in the realm of cyberspace – how we manage it is the puzzle we are asked to solve in our own age. And, taking inspiration from Pollock once more, it requires finding our own technique.

Jackson Pollock, Convergence 1952, oil on canvas

Collection Albright-Knox Art Gallery, Buffalo, New York (Gift of Seymour H. Knox, Jr., 1956) Reproduced here as part of the authorized use for educational purposes (scholarly publication) © Pollock-Krasner Foundation / Artists Rights Society (ARS), New York

“Civil liberty functions today in a changing technological context.”

Ithiel de Sola Pool,
Technologies of Freedom (1984)

Writing in 1983, well before what we now refer to as ‘cyberspace’ was conceived as such, MIT political scientist Ithiel de Sola Pool mapped out the coming technological landscape as one where “most published information will be disseminated electronically”, with networked computers functioning as “the printing presses of the twenty-first century”. In this way, he forecast a convergence of once-separate modes of communication – and the dangers inherent in such ‘electronic hegemony’, as he anticipated an erosion of civil liberties and freedom through heavy-handed government regulation.

While Pool’s vantage point is bounded to some extent by its time and place – in particular traditionally libertarian concerns – his framing of the challenges of the coming information age is still a useful entry point to understand how the accelerating, disruptive nature of technology and hyper-connectivity is giving rise to a new set of socio-political, economic and especially strategic challenges.

In the past, infringements on citizens’ freedoms through government overreach at the hands of surveillance agencies such as GCHQ or the NSA were the main fear. Today, the potential for control originates from a wider array of sources: fears of rival or adversarial actors controlling large parts of the technology and communications infrastructure now run alongside concerns about excessive state power – both domestically and globally. Initially, the diffusion of technology and easier accessibility had given rise to hopeful expectations of the democratizing effects of increasing connectivity, empowering individuals vis-à-vis State power. Yet the increased reach of tech-savvy dictatorships, revisionist powers and violent extremist groups highlighted the dangers of our age, soon giving way to fears of new forms of oppression and violence: technology becoming a handy enabler of greater surveillance, control and coercion – in particular giving asymmetric and revisionist actors a potential advantage over established democracies.

Geopolitically, China’s One Belt One Road Initiative – especially the concept of a Digital Silk Road – is being recognized not only as an instrument for greater connectivity but as a deliberate strategy to exercise control. Likewise, beginning with Russia’s cyber-enabled interference in the 2016 American presidential election, Chinese and Russian attempts at influencing Western politics, media organizations and certain segments of the population illustrate how prevalent the manipulation of public opinion has become – increasingly considered a key national security threat among liberal democracies.

The geostrategic threat to liberal socio-political systems in the digital age is evident in the 5G debates playing out in Western democracies. Europe is a prime current example: pursuing a course of ‘strategic autonomy’, it seeks independence from the US–China superpower rivalry, which it perceives to be behind American efforts to push other nations to exclude Huawei. To this end, the stance on Huawei emerging in Germany and other European countries at the time of writing appears to be one of attempted ‘neutral’ positioning, manifested in a reluctance to endanger economic partnerships with China. Along those lines, the great power rivalry between the United States and China is often described as a ‘new Cold War’, in terms of a cyber or AI ‘arms race’. Even though the accuracy and usefulness of such historical analogies are contested, it could be argued that their frequent use points to a recognition of the fundamental nature of these challenges: as digital technologies provide adversaries with unprecedented opportunities to undermine Western democratic, social and market institutions, these are not only security issues but, more fundamentally, debates about order and global governance.

These new governance challenges for States are also illustrated by the Islamic State’s strategic use of communication technologies: leveraging the opportunities afforded by social media platforms, it managed to augment its reach and incite terrorist acts against the West in a more dispersed manner. Likewise, the Christchurch attack has served as a much-needed reminder that terrorists harnessing technology is not just the purview of jihadists. It points to a bigger problem-set: how cyberspace is enabling extremists of all persuasions to more easily disseminate their narratives, recruit, and inspire or instruct terrorist acts.

These developments result in a new set of challenges that come with regulating the online environment, such as the complexities of responsible encryption, how to deal with AI-enabled deep fakes, and the manipulation of public opinion through computational propaganda (so-called political bots). As these quandaries bear out through governments’ relationships with tech companies, they highlight the blurred boundaries that currently exist in terms of regulation and responsibility.

Here, new points of friction emerge as tech and media companies are asked by governments to monitor the content on their platforms to impede the dissemination of extremist content or misinformation. This move has been perceived as problematic, suggesting that government intelligence gathering is being outsourced to tech companies whose business model is inherently programmed for metrics-driven growth.

In this context, Facebook’s regulation of activities across its platforms along a yardstick of ‘truth versus falsehood’ raises questions about how objective the very act of determining what is ‘true’ can be. Even with a revamped algorithm and fact-checking measures designed to fight the spread of fake news, critics argue that the system easily enables, and in fact incentivizes, cognitive biases. Especially in a contested and polarized information environment, cognitive biases as well as political and economic interests need to be factored in when assessing such metrics.

At the same time, the thesis has been put forward that the only way for big tech corporations to continue dominating the market is by allowing a certain extent of government regulation, resulting in what some analysts imagine as a sort of ‘power sharing agreement’.

What these and similar arguments reveal are the blurred lines between the power of corporations, machines and the state – lines that have led to questions of where power and the ability to control truly lie, and what we, as citizens, can do about it.

As highlighted by the debate about the validity of Cold War analogies, it has become almost a cliché these days to argue that power politics are taking on a new form. Yet the argument is a useful one to examine in this context. Power politics are seen as having moved away from their traditionally narrow containment lines of State sovereignty, increasingly playing out on an expanded playground characterized by a decentralized, shifting system of networks. This idea was expressed, for example, by Anne-Marie Slaughter in The Chessboard and the Web: Strategies of Connection in a Networked World, where she argued that “states still exist and exercise power, but side by side with corporate, civic, and criminal actors enmeshed in a web of networks.” Such developments point to an increasingly symbiotic relationship between States’ digital powers/measures and corporate data collection, giving rise to debates about who is in control in an era where data apparently reigns supreme – what some have called a ‘dictatorship of data’ or, more specifically in regard to governments wanting to protect and control their information-related companies and infrastructure, ‘data mercantilism’. Similarly, Shoshana Zuboff, warning of the effects of what she calls surveillance capitalism, has coined the term “Instrumentarianism” for a new power constellation of the digital revolution. This “new frontier of power” is said to result from the ability to commodify human experience into ‘behavioral data’ by analysing and measuring online human activity – with the end goal of manipulating and monetizing it.

Appreciating these complexities accentuates what lies at the centre of this shift: the tricky issue of in whose hands the responsibility for balancing privacy, free speech, ‘establishing truth’ and national security ultimately ends up – and whether the result is a world we want to live in, a world that still reflects its founding values.

Seeking to avoid technological determinism, answering the question of ‘who is in control’ needs to go beyond focusing on the power of corporations or how authoritarian regimes appropriate new technological advances for their own ends: it should also entail an inquiry into the fundamental societal and political dynamics and structures that enable such abuses – with an eye on our own societies and technology’s potential to weaken democracies if left ungoverned and driven by market principles. This is based on the recognition that in a hyper-connected and highly networked world, technology enables individuals, civil society, non-state actors and institutions to impact on social and political agendas more than ever before. As noted in a recent report, “across social media, people participate in the creation and spread of information, misinformation, and disinformation. Society is not shielded from geopolitics here. Rather society is, wittingly or unwittingly, a participant.” Consequently, human action is still at the core of the information age – enabled by technology but not determined by it.

This means that more technology alone cannot be the answer to the challenges resulting from this shift. Metrics are still driven by human biases. And, in looking for a solution, common descriptors such as ‘fake news contagion’ are often not helpful when they remain ill-defined; equally, recourse to a ‘post-truth’ era gives the impression that little can be done to contain the spread of falsehoods, or even to establish, through critical inquiry, what is true and false.

Assessing the security landscape is therefore not just a matter of simple fact-checking and metrics: how we scale risks and security threats is ultimately a function of how we perceive the world and think it should be ordered. The evolution of any system in society demonstrates this: be they military, informational, political, economic or cultural, systems are driven not by strategic thinking alone but are also firmly rooted in belief systems and values. This makes the above questions not only deeply political and strategic ones but inevitably also ones about ethics. What some have termed a new paradigm of ‘society-centric warfare’ is useful for conceptualizing this: a conflict’s centres of gravity, as well as the end goals of operational and technical forces, are ultimately rooted in society, making factors such as identity, perceptions, emotions and motivations or beliefs paramount.

Accordingly, recognizing that society and individuals have an unprecedented role in an evolving global system of knowledge, power and authority is one thing. What is more important is to acknowledge that the matter is situated in an ideational/ethical sphere rather than being a mere outgrowth of instrumental rationality. Applying the conceptual lens of French sociologist Jacques Ellul can be instructive for understanding the discursive construction of today’s information age. Ellul’s analysis of the forces driving liberal technological societies reveals that democracy itself, meant as the prime vehicle for the free exchange of opinion and ideas, can become an empty myth when allowed to be driven by technocratic and commercial imperatives.

This is not always recognized – and when it is, the overly normative, even ideological character of the debate often obscures the real complexity of the interconnected dynamics between security matters and values or ethics. One illustration of this is how the decline or erosion of Western dominance has become a frequent talking point – evident, for instance, in the Munich Security Conference’s engagement with the concept of “Westlessness”. While seeking to diagnose the challenges of our time, this is a problematic lens on several levels, the principal issue being its reification of ‘the West’ as the original and exclusive home of progressive values, especially when presented in triumphalist tones.

Nevertheless, it highlights an important element of the global political landscape: transcending immediate security concerns, debates have been elevated to a more existential level where the future of our order is framed and questioned in ideational terms. As global power shifts have given rise to competing models of governance and political order, and the way data is governed impacts on our collective and individual freedom, these questions do need to be asked and pursued. But especially in an era of democratic disenchantment, it may be more useful to address them with more humility and less self-assuredness as we reflect on how the order’s key tenets can carry us into the future. The Director of Military Sciences at the Royal United Services Institute, for instance, has noted that “a belief in Western conceptual or intellectual superiority remains deeply entrenched in the Western orthodoxy; such hubris has distinct dangers.”

Recourse to the shared foundation of Western values, reiterating their superiority, is therefore not enough to tackle the complex problems of our time. For one, maintaining the openness and trust that should form the social fabric of our society, and protecting it from compromise, is not an outside problem. Consequently, lamenting “Westlessness” and issuing moralistic calls for restoring Western dominance do little to alleviate the problem. What these dilemmas and complex problem-sets can alert us to, however, is the importance of how we conceptualize and address such prickly challenges of sovereignty, governmental/institutional overreach, transparency and accountability for ourselves.

To return to the starting premise, complexity is inherent not only in the technical and logical layers that make up cyberspace, but also in how ‘cyber’ is embedded in socio-political, cultural and geostrategic structures. Hence, recognizing this complexity as emanating from the interconnectedness of dynamically driven elements within this human-centric space or system means that responsive policy can only be made by grappling with social and ethical complexity, rather than seeking to reduce it. In an essay titled “When Truth Becomes a Commodity”, Daniel T. Rodgers highlights a process that pinpoints the core of this challenge:

“As long as we can click on the truths we want, as long as truth is imagined as a desire satisfied in a politically and commercially saturated market, we will have a superabundance of facts that people hold as true.

Everyone will get what he wants, and the public — and its trust in truth — will fall apart …

… finding our way back to the notion of truth as the result of a public process of search and debate and deliberation will not be easy…above all, it will require a renewed commitment to truth’s complexity and the processes by which one searches for it.”30

What makes this complexity, emanating from the convergence of forces, a puzzle rather than just a tangle of non-linear causes and effects is therefore the end goal: working out and managing the relationship between these forces in a way that aligns with the bigger picture. Ultimately, it is about making them converge in a manner that strengthens rather than undermines the foundations of liberal orders – both domestically and at the multilateral level. This is tricky.

Consequently, the convergence puzzle seeks to serve as a reminder of where the centre of gravity should lie in debates on cybersecurity: in a commitment to the core of the liberal project as its best defence mechanism. The challenge is finding out what this means practically, step by step and for each problem that presents itself.

  1. As quoted in the Albright-Knox Collection, k19567-convergence
  2. Ibid.
  3. Ithiel de Sola Pool, Technologies of Freedom (Cambridge, MA: Harvard University Press, 1984).
  4. Robert Huckfeldt, “The 2016 Ithiel de Sola Pool Lecture: Interdependence, Communication, and Aggregation: Transforming Voters into Electorates”, PS: Political Science & Politics, Vol. 50, No. 1, January 2017, pp. 3–11.
  5. Lucas Kello, The Virtual Weapon and International Order (Oxford: Oxford University Press, 2017), pp. 162–4.
  6. See for example Cathy Downes, “Strategic Blind-Spots on Cyber Threats, Vectors and Campaigns”, The Cyber Defense Review, Vol. 3, No. 1 (2018), pp. 79–104.
  7. Nadège Rolland, China’s Eurasian Century? Political and Strategic Implications of the Belt and Road Initiative, National Bureau of Asian Research, May 2017.
  8. Aaron Friedberg, The Authoritarian Challenge: China, Russia and the Threat to the International Liberal Order (Tokyo: Sasakawa Peace Foundation, 2017); Christopher Walker and Jessica Ludwig, “From Soft Power to Sharp Power: Rising Authoritarian Influence in a Democratic World”, in Sharp Power: Rising Authoritarian Influence (Washington, DC: National Endowment for Democracy, 2017).
  9. Ulrike Franke and Tara Varma, “Independence Play: Europe’s Pursuit of Strategic Autonomy”, European Council on Foreign Relations, July 2019.
  12. See for example Samantha Bradshaw and Philip N. Howard, “The Global Disinformation Order: 2019 Global Inventory of Organised Social Media Manipulation”, Computational Propaganda Research Project, University of Oxford.
  13. David Lyon, The Culture of Surveillance: Watching as a Way of Life (London: Polity Press, 2018);
  14. This is especially the case when the search is for direct causations between what happens on social media platforms and individual actions – which cannot be easily measured – rather than inquiries into the wider strategic consequences and shifts in the information ecology; see for example Alicia Wanless et al., “How Do You Define a Problem Like Influence?”, Journal of Information Warfare, Vol. 18, No. 3 (2019), pp. 1–14.
  17. Anne-Marie Slaughter, The Chessboard and the Web: Strategies of Connection in a Networked World (New Haven: Yale University Press, 2017).
  19. Eric Rosenbach and Katherine Mansted, “Geopolitics of Information”, Belfer Center for Science and International Affairs, Harvard Kennedy School, May 28, 2019.
  20. Shoshana Zuboff, The Age of Surveillance Capitalism. The Fight for a Human Future at the New Frontier of Power (New York: Public Affairs, 2019).
  21. Thomas Rid and Marc Hecker, War 2.0: Irregular Warfare in the Information Age (Westport: Praeger Security International, 2009); Ofer Fridman, Vitaly Kabernik and James C. Pearce (eds.), Hybrid Conflicts and Information Warfare: New Labels, Old Politics (Boulder and London: Lynne Rienner, 2018).
  23. Cecilia Andrews and Edward Lewis, “Simulating Complexity-Based Ethics for Crucial Decision-Making in Counter Terrorism”, in H. Nemati (ed.), Information Security and Ethics: Concepts, Methodologies, Tools and Applications (Hershey, PA: Information Science Reference, 2006), p. 3250.
  24. See Luciano Floridi, The Fourth Revolution: How the Infosphere is Reshaping Human Reality (Oxford: Oxford University Press, 2014)
  25. Emile Simpson, War from the Ground Up: Twenty-First-Century Combat as Politics (New York: Oxford University Press, 2013); Ariel E. Levite and Jonathan (Yoni) Shimshoni, “The Strategic Challenge of Society-centric Warfare”, Survival, Vol. 60, No. 6 (2018), pp. 91–118, DOI: 10.1080/00396338.2018.1542806; Rupert Smith, The Utility of Force: The Art of War in the Modern World (New York: Alfred A. Knopf, 2005); Robert Scales, “Clausewitz and World War IV”, Armed Forces Journal, July 2006, pp. 16–24, 48.
  26. See for example Karim H. Karim, “Cyber-Utopia and the Myth of Paradise: Using Jacques Ellul’s Work on Propaganda to Analyze Information Society Rhetoric”, Information, Communication and Society, Vol. 4, No. 1 (2001), pp. 113–134.
  28. US Secretary of State Pompeo’s pompous ‘The West is Winning’ speech being a case in point
  29. Peter Roberts, “Designing Conceptual Failure in Warfare”, The RUSI Journal, Vol. 162, No. 1 (2017), pp. 14–23, DOI: 10.1080/03071847.2017.130151

Katja Theodorakis

Head of Counterterrorism Program


Katja Theodorakis is the Head of ASPI’s Counterterrorism program.

Before joining ASPI, she was Senior Program Coordinator for Research and Analysis (Foreign/Security Policy) at the Konrad-Adenauer-Foundation’s Australia Office where she coordinated a portfolio spanning the fields of terrorism/extremism, European defence/security matters, cybersecurity and the wider strategic relations in the Indo-Pacific. This entailed engagement and strategic outreach to the national security community, with the aim of enhancing cooperation between Australia and Germany/Europe.
Katja publishes and presents at seminars, conferences and masterclasses (for representatives from DoD, security agencies and DFAT) and has appeared on national TV and radio as a commentator. She has also engaged in academic teaching, most recently as a sessional lecturer for a course on ‘Terrorism and Propaganda in Cyberspace’ at the Australian Graduate School of Policing and Security at Charles Sturt University. At the Australian National University, Katja has taught and tutored on Middle East politics, Islam and international terrorism at the postgraduate and undergraduate levels.

She is also a member of the Future Operations Research Group and co-founder of the associated Women In Future Operations platform at UNSW at ADFA. Here, her focus is on insurgent/extremist propaganda and its strategic use in information operations. A key premise underpinning Katja’s research is the need to look beyond technocratic approaches to propaganda campaigns in order to capture the full breadth of extremism, including cognitive and affective factors.

Katja has lived in the Middle East, where she was engaged in educational projects and NGO work in Syria during the early years of Bashar al-Assad’s presidency. She holds a First-Class Honours degree in International Development from the Australian National University, was the recipient of several scholarships including an Australian Government Research Training PhD Scholarship, and was awarded the 2016 ANU Media and Outreach Award as Emerging Media Talent for her commendable media output and performance as a young researcher.