Triangulating Evaluative Data Sources: What’s the point?
There is always a need to evaluate the impact of change. Since the first Covid-19 lockdown in March 2020, evidence gathering has focused on the impact of institutional decision-making and on understanding the experiences of staff and students. Unfortunately, there has been little time to construct a participatory approach to evaluation, and competing priorities have limited stakeholder engagement in data collection and analysis.
As David Parsons (2017) noted, evaluation should be proportionate to needs, circumstances and resources, and designed with regard to both technical choices and political context. The evaluation of our teaching and learning response during the Covid-19 pandemic therefore pragmatically used data triangulation as a guiding principle in our reporting.
Data Curation
A period of data curation ensued; this is no easy task in a large, complex organisation. Data is collected in a university for various reasons: to monitor student retention, to react to concerns about student wellbeing, to ensure quality and standards, and so on. Whilst the data controller is the institution, the data handlers are diverse, using different systems, processes, and reporting mechanisms.
In many cases, the data is not collected for evaluative purposes. It may be time-bound or updated daily, and it is sometimes sensitive, with restricted access. A balance of data types is welcomed but is not designed in, and quantitative indicators often dominate.
New Data
Alongside the curation of existing data sits the identification of new data, sometimes designed quickly and often attending to multiple viewpoints and priorities. Most aspects of institutional data collection are covered by the Student Privacy Notice; other sources gain formal research ethics approval. If the data can be accessed and has been sourced during a similar period of time (in this case March to September 2020), then it can be triangulated to confirm, refute or further explain the emerging findings.
Data triangulation, as a strategy for evaluating the impact of Covid-19, has been messy and often frustrating. However, looking at what data already existed (and there is a lot, in places you would never imagine until you start digging) limited the need, for a period of time, to generate a mass of new data. Ethically, this is a strategy which should limit over-researching, which is especially important in these challenging times.
Outputs and Reporting
A series of evaluative reports, utilising a trajectory approach to evaluation, has been published (July 2020, September 2020), providing an overview of the transition to online delivery of teaching and learning during and beyond the Covid-19 pandemic. These reports explore student and staff experiences, indicators of impact on student groups and subject areas, and the use and impact of our No Detriment Policy. They have been communicated across the institution and to senior leaders to enable evidence-informed decision-making.
If you are a Hallam staff member, you can view these reports in our pilot Evaluation Repository: click here for access to our SharePoint site.
For a full list of the data sources used within these evaluations, please click on this link: Data Sources. This overview of data sources has been aligned with Austen's Typology of Institutional Research and Evaluation (2018, 2020).
References
Austen, L. (2020) The amplification of student voices via institutional research and evaluation, in Lowe, T. & El Hakim, Y. (eds) A Handbook for Student Engagement in Higher Education: Theory into Practice. Routledge. Available at: https://www.routledge.com/A-Handbook-for-Student-Engagement-in-Higher-Education-Theory-into-Practice/Lowe-El-Hakim/p/book/9780367085490
Parsons, D. (2017) Demystifying Evaluation. Bristol: Policy Press.