- Rob Kitchin
From around 2008 onwards, big data systems and technologies have steadily risen up the hype cycle, aided by a strong boosterist discourse which contends that such data are in the process of revolutionizing how business is conducted and governance enacted. Big data, characterized by enormous volume, high velocity, wide variety, fine resolution, and strong relationality -- and produced by the plethora of digital devices and information and communication technologies that now populate daily life -- provides a data deluge from which valuable information can be extracted. Such information can be used to generate value through gains in efficiency, productivity, competitiveness, and effective marketing and sales.
Nearly all of the popular literature on big data focuses on the generation of such value for business and government, and on the associated risks, such as dataveillance, privacy infringement, profiling, social sorting, and anticipatory governance. Much less discussed are the potential effects of big data on the academy and how knowledge is produced. Even within the academic literature, scholars are only just starting to examine how big data might change the way research is undertaken and knowledge created. This slowness to engage stems in part from established scholars continuing to plough their usual intellectual furrows, practising their established ways of seeing, researching, and knowing the world. They are used to an endless succession of pronouncements about new developments that will change how they think about things, and they know that such novelties rarely alter their viewpoints and practices. Nevertheless, there is now a groundswell of contention that big data is set to challenge and radically transform established epistemologies across the sciences, social sciences and humanities. Underpinning this argument is the observation that '[r]evolutions in science have often been preceded by revolutions in measurement' (Sinan Aral, cited in Cukier 2010).
For anyone who has engaged with big data, it seems obvious that a revolution in measurement is underway. Traditionally, given the costs and difficulties of generating, processing, analysing and storing them, data have been produced in tightly controlled ways using sampling techniques that limit their scope, temporality and size. Academic knowledge has thus developed using approaches and methods designed to produce insights from relatively small numbers of observations, and it struggles to handle and analyse larger datasets. Big data, however, flows as a wide, deep torrent of timely, varied, resolute and relational data. In many cases, the issue now is not data scarcity but data overload. Consequently, the data deluge has been accompanied by new analytical methods suited to extracting insights from massive datasets using machine learning techniques, wherein the power of computational algorithms is harnessed to process and analyse data.
For some, these new forms of data and analytics will inevitably challenge dominant paradigms across the academy, ushering in new epistemologies in all disciplines. This is the contention that I explore in my paper 'Big data, new epistemologies and paradigm shifts', published in Big Data & Society. In particular, I examine three developments: the notion that big data gives rise to the end of theory, enabling a new form of empiricism in which data can speak for themselves; the creation of data-driven rather than knowledge-driven science; and the formation of the digital humanities and computational social sciences, which propose radically different ways to make sense of culture, history, economy and society.
The paper sets out and critically reviews the main arguments being forwarded. It concludes that big data seems set to transform how research is conducted in the sciences, though it will not lead to the end of theory; rather, it will usher in an era of data-driven science. However, much conceptual work is still required to think through the philosophical framing of such an approach. The situation in the humanities and social sciences is somewhat more complex given the diversity of their philosophical underpinnings, and big data and new analytics are unlikely to lead to the establishment of new disciplinary paradigms there. Instead, I suggest, big data will enhance the suite of data available for analysis and enable new approaches and techniques, but will not fully replace traditional small data studies. This will be partly due to the prevalence and resilience of post-positivist thinking, but also because it is unlikely that suitable big data will be produced that can be utilised to answer particular questions. In other words, the effect of big data on the practices of knowledge production will be felt differentially across the academy, but there is no doubt that its effects will be felt, even by those who continue to plough their own intellectual furrows.
Rob's full article 'Big data, new epistemologies and paradigm shifts' is available here
About the author
Rob Kitchin is a professor and ERC Advanced Investigator in the National Institute of Regional and Spatial Analysis at the National University of Ireland Maynooth, of which he was director between 2002 and 2013. He has published widely across the social sciences, including 21 books and over 130 articles and book chapters. He is editor of the international journals Progress in Human Geography and Dialogues in Human Geography, and for eleven years was the editor of Social and Cultural Geography.