From Digital Methods to Digital Ontologies: Bruno Latour and Richard Rogers at CSISP

It was a completely full house last week (7 March) in the Ian Gulland Lecture Theatre for Richard Rogers’ and Bruno Latour’s joint presentation as part of The New in Social Research – with students and lecturers lining the steps and craning their necks from the upper deck.

Both speakers were gracious co-hosts: Rogers referred to himself as “the appetiser for the main course”, while Latour framed his talk as “a footnote” to Rogers’. But the two lectures, which addressed “digital methods” and “digital ontology” respectively, were more closely entwined than these modest framings suggest: Rogers’ cutting-edge mapping and internet research techniques provided “an occasion” for Latour to vindicate the theories of Gabriel Tarde, while Latour’s Tardian ontology provided validation and grounding for Rogers’ methodologies.

Rogers, who runs the Digital Methods Initiative in Amsterdam, began with a brief history of internet research. In the late 90s, the hyperbolic pronouncements of the “cyberspace” era gave way to more sociological studies which situated the online in the offline world (following Miller and Slater 2000). But Rogers argued that we have now entered a new phase in which internet activity need not be studied as something categorically separate from “the real”. The online, rather than the real, is now “the baseline”.

He gave the example of Google Flu Trends, which uses keyword searches for “flu-like symptoms” to locate the spread of flu (and other diseases) geographically – much faster than traditional epidemiological techniques. He contrasted these new ways of using Google as a research tool – such as mapping regional differences in language (“soda” versus “pop”) or Thanksgiving eating habits (“sweet potato pie” versus “yams”) – with the traditional equivalents, where sociologists physically drove around the country in “word wagons” collecting data on local language practices. Rogers posed the question: what characteristics does internet data contain (such as time stamps) which distinguish it from inherited forms of data like surveys?
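To make the “soda versus pop” style of query mapping concrete, here is a minimal sketch of the aggregation step: given per-region counts for competing search terms, label each region by its dominant term. The regions, terms and figures below are invented for illustration – this is not Rogers’ or Google’s actual data or pipeline.

```python
# A minimal sketch (not Rogers' actual pipeline) of the "soda vs pop" idea:
# given per-region query counts for competing terms, label each region by
# its dominant term. The sample data below is invented for illustration.
from collections import defaultdict

# (region, term, query_count) -- hypothetical numbers
rows = [
    ("Midwest", "pop", 9200), ("Midwest", "soda", 3100),
    ("Northeast", "soda", 8800), ("Northeast", "pop", 1900),
    ("South", "coke", 7600), ("South", "soda", 2400),
]

totals = defaultdict(dict)
for region, term, count in rows:
    totals[region][term] = totals[region].get(term, 0) + count

for region, counts in totals.items():
    dominant = max(counts, key=counts.get)
    print(f"{region}: dominant term = {dominant!r} ({counts})")
```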

He also addressed the ubiquity of Google and how it has transformed the search engine into a mass media resource. Yet, ironically, from its early days it was thought of as a research tool – as stated in Google’s patent application. Rogers explained how to turn off various settings to convert Google into your own research engine.

Many of the DMI’s recent projects were also on display. Rogers used the Issue Dramaturg tool to reveal a rare incident of a website (a 9/11 conspiracy blog) temporarily disappearing from Google’s listings. He examined differences between the language versions of Wikipedia articles, which spoke volumes about how countries represent historical conflicts differently (different death tolls or more provocative photos). We also saw how the health of national internets (as defined by IP address or whois records) could tell us about what was happening on the ground in conflict zones. But Rogers stressed caution against applying these approaches uncritically, discussing throughout his talk the messiness and limitations of internet data sets, as well as the politics behind them.

While Rogers explored what was exciting and new about internet research, Latour stressed continuity by relating this new kind of data back to the work of Durkheim’s contemporary Gabriel Tarde. Where Durkheim assumed that there was an aggregate called “society”, arising “ex abrupto” out of aggregated individuals, Tarde saw “scale”, in this sense, as a sociological invention. Latour described scale in Tarde’s vocabulary as a function of connectedness between “monads”, which are defined by their relations to other monads through a kind of reciprocal possession. Thus for Tarde, the whole is always smaller than its parts – in the sense that a grouping of people is neither a container for individuals nor a baseline of shared characteristics, but a few select features within each member.

Some authors have already speculated about the potential application of Latour’s reading of Tarde to the internet (Kullenberg and Palmås 2009), but here, and in a forthcoming paper (Latour et al 2012), Latour made this connection explicit. He explained how the internet finally allows us to do sociology in a Tardian way, offering online profiles or CVs as an approximation of a monad in the way they represent internalizations of relations. With the advent of digital methods we now have the ability to see both this individual data and aggregates simultaneously on the same screen with a few clicks. So what has changed is not the basic format of data but the speed at which it can be accessed and zoomed in and out of. It was the slowness of surveys and statistics that kept collectives and individuals at arm’s length in the past. Latour also related these techniques to past studies of scientific journals and “paradigms”, where this level of individual data has been available for some time.
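A toy illustration may help with the claim that “the whole is always smaller than its parts”. In the sketch below – my own gloss, not Latour et al.’s actual procedure – each profile is a set of attributes standing in for a monad’s relations; the “group” retains only the few features its members share, so any individual profile is richer than the aggregate that contains it, and one can move between the two views at will.

```python
# Toy illustration (not Latour et al.'s method) of "the whole is smaller than
# its parts": each profile is a set of attributes (relations to other entities);
# a "group" keeps only the features its members share, so every individual
# profile is richer than the aggregate that contains it.
profiles = {
    "ana":   {"sociology", "amsterdam", "mapping", "cycling"},
    "bruno": {"sociology", "paris", "mapping", "opera"},
    "carla": {"sociology", "london", "mapping", "chess"},
}

group = set.intersection(*profiles.values())   # the aggregate's defining features
print("group features:", group)                # e.g. {'sociology', 'mapping'}

# "Zooming" from the aggregate back to the individuals it summarises
for name, attrs in profiles.items():
    print(name, "has", len(attrs), "attributes; the group keeps only", len(group))
```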

The momentum continued in the by now muggy theatre with a lively question and answer session. Rogers was asked how he triangulated his findings, to which he replied that his team was still nervous about the data but was currently working on triangulating it with other online data. Moderator Noortje Marres similarly wondered if there wasn’t a danger of “going native” – that the methods themselves could format the data, leaving blind spots for the researcher.

An audience member at the end wondered if perhaps the Durkheimian brand of macro-sociology Latour was railing against wasn’t something of a straw man, and that most practitioners of anthropology or social science, at least those in the room, never suffered from these theoretical deficiencies in the first place. Latour agreed that his critique was not directed at the audience, but noted that there were practitioners (economists?) who very much rely on collectives as objects. As usual, Latour defused tensions with his self-deprecating humour and precise comic timing. When asked by one audience member if there were any special properties of “social” monads which made them distinct from, say, bacteria, he pretended to think for a second before tentatively approaching the microphone to bark “non!” to a big laugh.

The general atmosphere seemed positive and receptive, but not without some healthy scepticism towards the newness of Rogers’ techniques or the practicalities of Latour’s “flat ontology”. But as we learned, neither the methods nor the ontology are categorically “new” – they are amplifications of old techniques and a reignition of old concerns. The point where Tarde’s brand of sociology finally becomes realised through new technology on the internet is still a latent potential, but it’s a tantalizing one.

– David Moats

Photos by Jorge Castillo

References:
Kullenberg, C. and Palmås, K. (2009) “Tarde’s contagiontology: From ant hills to panspectric surveillance technologies” Eurozine http://www.eurozine.com/pdf/2009-03-09-kullenberg-en.pdf

Latour, B. et al. (2012) “The Whole is Always Smaller Than Its Parts: A Digital Test of Gabriel Tarde’s Monads” British Journal of Sociology (forthcoming) http://www.bruno-latour.fr/sites/default/files/123-WHOLE-PART-FINAL.pdf

Miller, D. and Slater, D. (2000) The internet: an ethnographic approach. Berg, Oxford & New York

The New in Social Research: Ruppert recording

We are pleased to put online the next in our ‘The New in Social Research’ series, a recording of Evelyn Ruppert’s lecture titled ‘Doing the Transparent State: Methods and their Subjectifying Effects/Affects’ (Feb 28th).

'Who's lobbying?' data interface

Building on themes explored in the previous talk by Fuller and Harwood, Ruppert looked at the effects (and affects) of the UK government’s data ‘Transparency Agenda’, insisting on the generative capacities of this device. This includes the release of detailed data, via publicly accessible, comparatively easy-to-use online platforms (e.g. government-produced data apps), ranging from details of MPs’ expenses to itemised lists of departmental spending. This data, in turn, can be – and increasingly is – downloaded, manipulated and mediated by organisations and institutions, whether by journalists looking to produce eye-catching visualisations, or companies hoping to unearth market value hidden in the relations between and amongst different data sets (a kind of reuse sketched after the figure below).


'Where does my money go?' data visualisation
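For a sense of the kind of reuse described above, here is a minimal sketch of totalling itemised departmental spending by supplier. The column names and figures are hypothetical; real transparency releases differ across departments and platforms.

```python
# Minimal sketch of reusing open spending data: totalling itemised
# departmental spending by supplier. The CSV columns and figures here are
# invented for illustration; real transparency releases vary by department.
import csv
import io
from collections import Counter

sample_csv = """supplier,amount
Acme Consulting,125000
Acme Consulting,84000
Widget Services Ltd,47000
"""

totals = Counter()
for row in csv.DictReader(io.StringIO(sample_csv)):
    totals[row["supplier"]] += float(row["amount"])

for supplier, total in totals.most_common():
    print(f"{supplier}: £{total:,.0f}")
```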

A key argument was that Transparency and Open Data arrangements anticipate the moral failure of government: they enrol people as vigilant subjects monitoring such potential failures. This mode of government/public engagement, Ruppert argued, calls forth a certain kind of witnessing public – the production of what she termed (uncertain, hypervigilant) ‘data subjects’. This mode of witnessing implies a reorientation of both the responsibilities of political subjects and the medium for political action. Increasingly, responsibility for detecting moral and political failure is relocated away from the business of politics itself and onto a public charged with monitoring, sifting, detecting and calling attention to potential government failings lurking in the depths of the data – a shift which also renders subjects complicit in this ultimately passive mode of governance by transparency.

To hear more, including Ruppert’s reflections on whether or not this mode of witnessing can be considered ‘new’ (a key question given the aims of this series), download the recording below.

Recording (to be downloaded; these are not designed to stream)

1. Evelyn Ruppert – Doing the transparent state