This two-day event addressed both the theory and practice of digital hermeneutics. Day 1 had the format of a ‘traditional’ conference with sessions dedicated to the four key aspects of the concept of digital hermeneutics: source criticism, tool criticism, algorithmic criticism and interface criticism. Day 2 consisted of workshops where participants discussed the challenges of introducing digital history training in the history curriculum and demonstrated some best practices in an interactive setting.
Day 1, Conference
The conference opened with a 'Welcome and Introduction' by C²DH director Andreas Fickers, who pointed out today's challenges for the humanities, i.e. fast technological development versus the slow appropriation of these technologies, and the need to update classical hermeneutics for the digital age. In this context, a set of tensions of a methodological (quantitative/qualitative, distant/close reading) and epistemological (statistical/historical evidence, exemplary/representative) nature needs to be dealt with. Hermeneutics, and in particular digital hermeneutics, therefore becomes an art of in-betweenness, requiring interdisciplinarity, multimodal literacy, new conceptions of curricula (train the trainers), new skills (in algorithmic, digital source, tool, interface and simulation criticism) and hands-on reflection, or 'thinkering'.
Session 1 focused on methodological issues in digital hermeneutics. In his presentation, Save your sources! … but can you?, Pascal Föhr (University of Basel) talked about source safeguarding as part of the historical critical method, which requires new infrastructures and skills. Among these infrastructure elements, Föhr mentioned trusted institutions and trusted digital repositories, retrievable objects and a trusted saving procedure. The new skills, on the other hand, imply abilities in various areas: information management (searching, evaluating, selecting and processing data), media (use of all types of analogue and digital media), computer science (programming, in order to understand algorithmic transformation), data assessment (using appropriate means, with knowledge of data storage and search in order to criticise and interpret datasets), and law and archiving (copyright, data protection and privacy).
Tim van der Heijden (C²DH) targeted methodological aspects from the perspective of Doing Digital History and Hermeneutics in an Interdisciplinary Setting. The first part of the presentation centred on the “Digital History & Hermeneutics” Doctoral Training Unit (DTU) at the C²DH, which gathers 13 doctoral researchers from different disciplinary backgrounds. Van der Heijden highlighted the DTU's aims of critically reflecting on the impact of digital techniques on historical research as well as hands-on experimentation. The second part highlighted facets of the DTU as a trading zone (involving locality, interdisciplinarity, the creation of a common ground and the finding of an inter-language), as reflected in a series of interviews with the team members.
The session was concluded by Marijn Koolen (Royal Netherlands Academy of Arts and Sciences - Humanities Cluster) with A hands-on approach to digital tools criticism for (self-)reflection, which addressed the question of how to incorporate methods for digital tools criticism in practice. The presentation elaborated on the results of two workshops on digital tools criticism held at the DH Benelux conferences of 2017 and 2018. Among the findings was the importance of reflection as an integrative practice, relating both to reflective methods (tool evaluation, evaluating bias rather than just error rates) and to reflective tools (including documentation, functionalities for analysing data quality, and features for visualising missing or erroneous data values).
Session 2 was dedicated to use cases. Andreas Müller (University of Halle), in his talk Digital - Is that all there is? A report from digital source criticism of Zedler-Lexikon.de, presented the story of the missing pages of Zedler's lexicon as an example of combining digital metadata and knowledge of content (e.g. about how universal lexicons evolved from brief dictionaries into fully grown encyclopaedias in the eighteenth century) to solve a problem. The conclusion drawn from the experiment was to stay sceptical of what you see or don't see, and, if needed, to consult the analogue source material in order to check the reliability of the digital representation.
Sofia Papastamkou (University of Lille) and Frédéric Clavert (C²DH) brought in the topic of Twitter data as primary sources for historians: a critical approach. Lessons from two projects: the 2015 Greek referendum and the Centenary of the Great War on Twitter. Two types of historian's work that can be supported by social media were pointed out: the study of collective memory, and the study of current events, or 'documenting the now'. The two projects used Twitter as a non-institutional, decentralised, wild, born-digital source for historical study. The workflows (collecting, preparing, analysing and visualising data) leading to interpretation were seen as implying a three-faceted Twitter hermeneutics: a hermeneutics of the APIs (choosing an API already represents a first step towards interpretation), a hermeneutics of hashtags (hashtags are not conversation; massive collective data are not exhaustive data), and a hermeneutics of networks (different degrees of meaning can be associated with tweeting, retweeting, comments and likes).
The last use case in the session was presented by Cristina Vertan (University of Hamburg), with a contribution on Combining digital and hermeneutic approaches for investigating source quotations in early modern historical texts. Her talk illustrated how vagueness may be approached within the framework of HerCoRe (Hermeneutic and Computer based Analysis of Reliability, Consistency and Vagueness in Historical Texts). The study focused on reliability and consistency between originals and translations, and on the encoding of vagueness, in two works from the beginning of the eighteenth century by the Moldavian prince and scholar Dimitrie Cantemir: History of the Growth and Decay of the Ottoman Empire and Descriptio Moldaviae.
Session 3 dealt with models. Francesco Beretta (CNRS - Université de Lyon) discussed the Factoïd-based model versus CIDOC CRM? Extracting structured data from historical sources. Several questions were raised, such as: How do we move from factoids to states of affairs (the world)? Do we model reality or ways of speaking? How do we store different opinions? Can historical criticism be modelled in data? In this context, the need for workflows producing FAIR data (findable, accessible, interoperable, reusable) was emphasised, as were the CIDOC conceptual reference model and the Data for History consortium, which aims to build up an international community and a multi-project distributed information architecture for history.
The next presentation, Implementing Transparency, by Marten Düring and Estelle Bunout (C²DH), highlighted the importance of transparency in research and tool building, within the framework of the Impresso project. It was argued that beauty does not mean perfection, and that imperfection is not necessarily useless. Different problems may occur while dealing with a corpus: OCR errors, metadata quality/availability, text segmentation issues, imperfect annotation of named entities, etc. Integrating transparency into the workflow may give people a sense of the imperfection in the data and create opportunities for more informed decisions and for a critical assessment of methods and decisions. Implementing transparency within an interdisciplinary tool-building framework therefore involves aspects such as a shared understanding of the problems, consultation of domain experts to reduce complexity, discussions and decisions with partners, and proper documentation of the underlying processes.
Another approach to digital hermeneutics was proposed by Joris van Zundert (Huygens Institute for the History of the Netherlands) in his talk Why We Should Think About a Domain Specific Computer Language (DSL) for Scholarship. The presentation was articulated around two main problems in digital hermeneutics. The first is the denial, in current practices, of an important site of interpretation, i.e. the digital code. The second is the complexity of coding, as programming languages are usually engineer-oriented. The proposed solution to these problems was a Domain Specific Computer Language (DSL): truly polyglot, providing support beyond Boolean logic, constructed as a shared task by an interdisciplinary community, and allowing scholars to enter the sites of interpretation created by computer languages.
The official launch of the Ranke.2 teaching platform for Digital Source Criticism by Stefania Scagliola and Andreas Fickers (C²DH) provided a brief history of the project idea and of the project itself, as well as the ways in which the collaboration between various members of the team involved shaped the project.
In his keynote talk, The Historical Imagination in a Digital World, Edward Ayers (University of Richmond) drew attention to the new landscape of memory shaped today by the flood of digital representations of the past. In this context, a lay hermeneutics would imply the interpretation of every piece of evidence created today, and a new, open-source kind of historical practice and scholarship. The digital environment of the Web seems to fit all these requirements of scholarship beautifully: history means not only books, but history done in all types of media, as illustrated by two projects, the Valley of the Shadow and the Digital Scholarship Lab.
Day 2, Workshops
Workshop 1, Sustainability: problematizing fragility and ephemerality in digital research, by Valerie Schafer and Andreas Fickers (C²DH), explored the tension between the instability and ephemerality of digital technologies and the stability and reproducibility that is required in digital research. Aiming to raise awareness of the challenges of sustainability and durability in digital research, the workshop consisted of three parts. The first was dedicated to identifying the phases in the research process where these challenges inevitably occur, from the creation of corpora to the maintenance and preservation of research results and outcomes. The second part addressed the issue from the double perspective of existing solutions, e.g. digital research infrastructures, and of obstacles and limits, e.g. authors' rights. The third part focused on creating practical guidelines that could be useful both for scholars and for students who deal with digital methods, tools and data.
The second keynote talk, by Julia Noordegraaf (University of Amsterdam), addressed the topic of Digital Hermeneutics in Media Historiography: Researching Paul Verhoeven in the CLARIAH Media Suite. The CLARIAH Media Suite is a distributed research infrastructure facilitating access to Dutch audio-visual and contextual collections via media search and analysis tools. Noordegraaf pointed out the underlying principles of the infrastructure, i.e. the separation of tools from data, and transparency, issues that are of increasing importance in discussions of digital source criticism and hermeneutics. The Media Suite gives access to raw lists of metadata and offers researchers the opportunity to build the relevant facets of a dataset query. It also allows comparison across media, e.g. TV, newspapers, books and film, as illustrated by the project Remediating Culture, featuring the Dutch filmmaker Paul Verhoeven. Finally, it was argued that an environment such as this helps to deconstruct our familiarity with tools such as search engines, and thereby enhances our capacity for digital source criticism.
Workshop 2, Strategies for Using Digital Sources in the Classroom, was presented by John Randolph (University of Illinois at Urbana-Champaign) and Gerben Zaagsma (C²DH). In the first part of the workshop, a project at the University of Illinois was presented, entitled The Classroom and the Future of the Historical Record, 2018–2020. The aim of the project is to reflect on how digital technology has transformed the way history is taught, i.e. how materials (not only text but also audio and video) and the producers of sources have changed. For instance, can a short piece of film footage showing an artist creating masks for men disfigured by the war be presented to students as an object of historical investigation? What is needed for interpretation: context, identifiers, rights of use, a description of the editorial principles? Other examples of objects for this type of enquiry were cited, e.g. the McNitt Family Papers and the Letters from Illinois. The proposed curriculum includes training students in documentary editing and editorial peer review for the publication of open educational resources, thus combining educational, publication and social elements. During the second part of the workshop, participants worked in groups to discuss possible teaching strategies based on various digital sources. To encourage reflection, the samples, e.g. a cartoon from 1856 and a pamphlet from the early 1900s, were accompanied by a set of questions for the groups to address, such as: how should these objects be understood and evaluated as primary sources? What role can they play in acquiring digital hermeneutics skills and concepts? And what shifts of an institutional/infrastructural nature would be necessary to adopt this type of teaching?
Workshop 3, Digital Skills - Defining the Bare Minimum, by Ilja Nieuwland (Huygens Institute for the History of the Netherlands), concluded the conference. The introduction focused on two questions: What digital skills should a student have to be considered a competent historian? and Can we formulate universal requirements for handling digital sources? The lack of consensus on the digital skills that historians need, and the need to define a curriculum for this type of skills, were also emphasised. Two kinds of experience were showcased in this context. The first, on the archives of the Internet, presented by Valerie Schafer (C²DH), addressed questions such as How to use a tool?, How to adapt to so many tools? and Should we start from a research question?, as well as the need for skills such as flexibility and adaptation. The second, by Sofia Papastamkou (University of Lille), concerned teaching a DH course to master's students with a classical, non-DH orientation, and the basic notions included in the lessons, e.g. an introduction to and brief history of DH, data definition and FAIR principles, and methods, tools and demonstrations, such as network analysis for people working with correspondence. The hands-on part of the workshop was dedicated to group work and short group presentations reflecting on four categories of skills: (1) searching/finding (beyond Wikipedia); (2) use and criticism of sources and tools (selection, annotation, analysis); (3) structuring and presenting an argument (outlining, writing, presentation); (4) analogue hermeneutic skills that may be disregarded by the first three categories. As a follow-up, the results of the workshop will be communicated, and a second workshop is planned for spring 2019.