Project outline
Brief project outline tbd.
Summary
Introduction
Until a few years ago, articles about the definition of Digital Humanities, and specifically the question of whether it was a discipline, a community of practice, a set of methods, a metaphorical big tent or archipelago, were almost a rite of passage for proponents of Digital Humanities (#REFS). In the meantime, the question appears not to have been settled, but at least to have become somewhat less urgent. The main reason for this is probably that Digital Humanities has by now accumulated so many indicators of institutionalization as a discipline (or, certainly, as a recognizable and enduring, if transdisciplinary, field) that it has become evident that Digital Humanities is, in fact, a discipline. If it walks like a duck and quacks like a duck, it is very likely a duck.
And yet, nothing in the history of this process of institutionalization of Digital Humanities as a field is self-evident or even well-documented, despite several important investigations into the early history of Digital Humanities (#REFS: Terras and Nyhan, more.). Frameworks of disciplinary institutionalization exist in the field of Higher Education Research, but apart from the rare theoretical publication, they operate mostly in the mode of case studies. Still, they can help us define a perspective on the institutionalization of disciplines, a framework of indicators of institutionalization to look out for, and a perspective on the motivations for institutionalization (from branding and authority to the sustainability of structures and funding sources).
Indicators of institutionalization
One way to organize the indicators of institutionalization is to group them into five categories: research, output, teaching, people and recognition (Handbook).
| Research | Output | Teaching | People | Recognition |
|---|---|---|---|---|
| Academic institutions (networks, initiatives, centers, professorships, institutes, departments) | Events (meetings, workshops, conferences, conference series) | Training (courses, workshops, training schools) | Job titles | Awards |
| Scholarly associations (regional, national, international) | Publication venues (conference proceedings, journals, book series) | Degrees (Bachelor, Master, Ph.D. levels) | Roles in scholarly associations | -- |
| Funding programmes | Meta-publications (surveys, reviews, history of DH, definition of DH) | Educational resources (textbooks, handbooks, OER) | Prominent roles in non-DH institutions | -- |
| -- | -- | DH taught in secondary education | -- | -- |
Objectives
- Enable the writing of the institutional history of DH
- Be a hub that points to more detailed information about DH: associations, journals, conferences, centers, study programmes and the people who have participated in them
Survey of the field
- IDHC: conferences
- DHCR: study programmes
- Wikidata: 2000 DHers
- Corpora: CLS INFRA, Lehmann
- Bibliographies (Stylometry)
- CenterNet map and data
Data sources
- See above
Methods: Linked Open Data / Wikibase
- Wikibase cloud
- Data model
- Identifiers
- Linking out
- References
- Mapping
- Queries / SPARQL endpoint
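As a sketch of how a query against the SPARQL endpoint might look, assuming a hypothetical Wikibase cloud base URL and placeholder item and property identifiers (the actual IDs of the Mapping DH instance will differ):

```sparql
# Hypothetical query: list all DH centers together with their country.
# Base URL and all Q-/P-identifiers are placeholders, not the project's actual IDs.
PREFIX wb:  <https://mapping-dh.wikibase.cloud/entity/>
PREFIX wbt: <https://mapping-dh.wikibase.cloud/prop/direct/>
PREFIX wikibase: <http://wikiba.se/ontology#>
PREFIX bd: <http://www.bigdata.com/rdf#>

SELECT ?unit ?unitLabel ?countryLabel WHERE {
  ?unit wbt:P1 wb:Q10 .     # P1 = instance of, Q10 = center (placeholders)
  ?unit wbt:P2 ?country .   # P2 = country (placeholder)
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
```

The label service shown here is the one provided by Wikibase query services; on a given instance the predefined prefixes may make the explicit `wikibase:` and `bd:` declarations redundant.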
The data model distinguishes major classes, i.e. those that are of primary interest to Mapping DH, from contextual item classes, i.e. those that are required to properly describe the major classes. Major classes include unit (with subclasses such as center, network, initiative or department) and institution (with subclasses such as university and academy of science), as well as scholarly programme, association, event and person. Contextual item classes include field of work as well as city, country and world region. There are currently nine major classes, six contextual classes and more than 100 properties.
The data model is described in several ways: as a structured list of classes and subclasses providing a first overview; in the form of scope notes and prose descriptions on the page of each major class; and in the form of Entity Schemas written in ShEx (Shape Expression Language) that (currently) describe the minimal requirements for the major classes. This ensures both that humans can understand the data model and that its correct application can be validated automatically.
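For illustration, a minimal Entity Schema for a major class might look like the following ShEx sketch; the shape and all property identifiers are hypothetical, not the project's actual schemas:

```shex
# Hypothetical minimal shape for a DH unit; base URL and P-identifiers are placeholders.
PREFIX wbt: <https://mapping-dh.wikibase.cloud/prop/direct/>
PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>

<#unit> {
  wbt:P1 IRI+ ;             # instance of: at least one class, e.g. center or network
  wbt:P2 IRI? ;             # country: optional, at most one
  wbt:P4 xsd:dateTime* ;    # inception date(s): optional
}
```

A validator checks each item claimed to be a unit against this shape, so missing required statements (here: the "instance of" claim) are flagged automatically.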
State of play
- 80% of the data model is in place
- 20% of the data is in place
- First analyses become possible
Sample analyses
- Publication models of journals: 75% Diamond OA, more if only active journals are taken into account
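An analysis of this kind can be obtained directly from the SPARQL endpoint, for instance by grouping journals by their access model; again, all identifiers below are placeholders rather than the actual Mapping DH IDs:

```sparql
# Hypothetical aggregation: number of journals per publication/access model.
PREFIX wb:  <https://mapping-dh.wikibase.cloud/entity/>
PREFIX wbt: <https://mapping-dh.wikibase.cloud/prop/direct/>
PREFIX wikibase: <http://wikiba.se/ontology#>
PREFIX bd: <http://www.bigdata.com/rdf#>

SELECT ?modelLabel (COUNT(?journal) AS ?n) WHERE {
  ?journal wbt:P1 wb:Q20 .   # P1 = instance of, Q20 = journal (placeholders)
  ?journal wbt:P5 ?model .   # P5 = publication model (placeholder)
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
GROUP BY ?modelLabel
ORDER BY DESC(?n)
```

Restricting the analysis to active journals would add a filter on a corresponding status or end-date property.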
Challenges
- Geographical / geopolitical biases: address through partnerships and case studies (see e.g. Hong Kong, South Korea, Taiwan: relatively well-represented)
- Sustainability: work with "domain ambassadors" who are trained in the data model and are responsible for updating specific subdomains of the data; most changes going forward will be new board members, new journal editors and, of course, additional events or changes in the study programmes (the most difficult to detect!)
- Limitations inherent in a separate Wikibase instance
Next steps
- More data
- More precise data model
- Bring domain ambassadors on board: the presentation at DH is intended precisely to enable this!