Exploring the Context, Assumptions and Outcomes of Institutional Interventions via a Data Dashboard
A recent review of access, retention, attainment, and progression literature by a team from Sheffield Hallam (Austen et al., 2021) discussed the evidenced relationships between interventions and student outcomes. Knowing and understanding the demographics of a cohort and tracking outcomes over time were identified as concerns, as was describing how and why change occurs.
At Sheffield Hallam, we recommend that all interventions which aim to impact student outcomes be accompanied by a Theory of Change, developed either before (preferably) or during (hopefully) implementation. Exploring the context, assumptions, and outcomes of these interventions is crucial for designing effective interventions.
This evidence was used to commission, design and test an ‘Interventions Dashboard’. Using student IDs for intervention cohorts and institutional data, a dashboard of demographics and outcomes can be generated. Demographic data includes trends in frequency of participation over time and key protected characteristics. The outcomes data compares the intervention cohort with the general population at Hallam and displays withdrawals (retention) and reasons for leaving, good honours, degree classifications, average module marks and if possible, graduate outcomes. The accessibility of this dashboard would enable an exploration of context, assumptions, and outcomes for each intervention.
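To make the cohort-versus-population comparison concrete, the core calculation can be sketched in a few lines. Everything here is hypothetical (the function name, the record structure, and all figures); the real dashboard is generated from institutional data systems, not from code like this.

```python
# Hypothetical sketch: given student IDs for an intervention cohort and a
# set of institutional records, compare the cohort's withdrawal rate with
# that of the wider population.

def withdrawal_rates(records, cohort_ids):
    """records: dict mapping student_id -> {'withdrawn': bool, ...}
    cohort_ids: IDs of students who engaged with the intervention."""
    def rate(ids):
        matched = [i for i in ids if i in records]
        if not matched:
            return None
        return sum(records[i]['withdrawn'] for i in matched) / len(matched)

    cohort = rate(cohort_ids)
    population = rate(records)  # iterating the dict covers every student
    return cohort, population

# Entirely made-up illustrative data
records = {
    1: {'withdrawn': False}, 2: {'withdrawn': True},
    3: {'withdrawn': False}, 4: {'withdrawn': False},
}
cohort, population = withdrawal_rates(records, {1, 2})
# cohort = 0.5, population = 0.25 in this toy example
```

The same pattern extends to the other outcome measures (good honours, average module marks) by swapping the field being aggregated.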
About the data
Our test dashboard used records of student engagement as a Course Representative. Hallam Students’ Union notes that student representatives work with students and university staff to identify and address issues affecting the educational experience. Their main role is to represent the views of their fellow students and work with staff to contribute to positive changes on their course.
This test dashboard provided an overview of the characteristics of students who became a student representative at some point during their academic studies. In this test version, the data was not able to identify a time period for engagement as a Course Rep; students in this data set could have been a Course Rep during any year of study.
Using the data for evaluation
In evaluative terms, being a student representative involves a series of ACTIVITIES which the students engage with and participate in. Withdrawals, module marks, and good honours are examples of LONG TERM OUTCOMES. Alongside the analysis of this data, it is important to explore what changes in student knowledge, skills and behaviours the ACTIVITIES are hoping to achieve, what INPUTS ensure the ACTIVITIES take place, and what SHORT TERM OUTCOMES and MEDIUM TERM OUTCOMES can be evidenced before any longer term impact claims are made.
How to use this data
This data has many uses and many caveats which limit the claims that can be made, but this is a useful step forward in the development of intervention and evaluation design. Rebecca Hodgson, Assistant Dean Teaching & Learning, College of Social Science & Arts, explains:
“Working with Liz Austen and James Berry on our ideas for an ‘interventions dashboard’ was incredibly exciting. Over the years I have seen many examples of excellent ‘interventions’ in terms of creative ideas to support student progress on a particular course, and I have always been struck by the challenge of accessing relevant data to evaluate such projects in a robust way. Often, the only evaluation evidence would be comments from staff or students. Whilst such qualitative data is of course a vital part of evaluation, data on student progress and outcomes is often ‘the missing link’ that can provide more evidence for longer term strategic planning and policy decisions. Of course, at this stage I need to add the necessary caveat about being cautious about causality – something outlined in detail below. Nevertheless, this new tool adds an extremely helpful ingredient into our ability to develop and maintain evidence informed practice in higher education.”
This data could be used to:
- Test assumptions and prior evidence about the demographic characteristics of Course Reps.
- Identify strengths and weaknesses in the diversity of representation roles at Hallam.
- Explore indicative OUTCOMES of participation and engagement in student representation.
- Design and plan an evaluation and further data gathering on the assumed associations between ACTIVITY and OUTCOMES to provide robust evidence of impact.
This data should not be used to:
- Conclude that there is a causal association between the ACTIVITIES (being a Course Rep) and OUTCOMES (such as withdrawal, attainment).
- Make claims which overlook the limitations of the data, for example the sample size.
- Overlook the need for triangulated evidence, especially a qualitative understanding of how change might occur.
- Make generalisations beyond the Hallam context.
Two further notes apply. The reporting and sharing of this data should be mindful of commercial sensitivities and apply ethical principles to ensure that no individual student is identifiable (combining values where a cell covers fewer than 10 students and reporting it as <10). Ethical approval and individual consent from students are not necessary for the use of this anonymised data.
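The small-cell suppression rule above might be applied as in the sketch below. This is a minimal illustration assuming counts are held as a simple category-to-count mapping; the threshold of 10 comes from the text, while the function name and the 'Other' bucket are assumptions for the example.

```python
def suppress_small_cells(counts, threshold=10):
    """Combine categories with fewer than `threshold` students into an
    'Other' bucket; if the combined cell is still below the threshold,
    report it as '<10' so no small group of students is identifiable."""
    out, small = {}, 0
    for category, n in counts.items():
        if n >= threshold:
            out[category] = n
        else:
            small += n
    if small:
        out['Other'] = small if small >= threshold else f"<{threshold}"
    return out

# Made-up counts for illustration
suppress_small_cells({"Group A": 42, "Group B": 7, "Group C": 15})
# -> {'Group A': 42, 'Group C': 15, 'Other': '<10'}
```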
Next steps
Our data visualisation specialists are working to refine the dashboard and explore the use of statistical tests and the associated claims that could be made. They are also exploring the identification of a comparison group instead of a comparison to the wider population.
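One candidate for the statistical tests mentioned above is a two-proportion z-test on, say, withdrawal rates in the intervention cohort versus a comparison group. This is only a sketch of one option that could be considered, not the dashboard's chosen method, and as the caveats above make clear, a significant difference would still not establish causation.

```python
import math

def two_proportion_z(success1, n1, success2, n2):
    """Two-sided z-test for a difference in proportions, e.g. withdrawals
    in the intervention cohort (success1/n1) vs a comparison group
    (success2/n2). Returns the z statistic and two-sided p-value."""
    p1, p2 = success1 / n1, success2 / n2
    pooled = (success1 + success2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # p-value from the standard normal CDF, computed via math.erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Made-up figures: 10/100 withdrawals in the cohort, 20/100 in comparison
z, p_value = two_proportion_z(10, 100, 20, 100)
```

With small cohorts the test will often be inconclusive, which is another reason the sample-size caveat above matters when interpreting dashboard output.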
Interventions which receive funding to evaluate impact through the SETL Evaluation Bursary Scheme will be able to use a bespoke data dashboard for their intervention cohorts. Work is also underway to test the outcomes of students who receive an intervention based on learning analytics data. The Students' Union would also like to explore the context and outcomes of students who participate in voluntary and extra-curricular activity.
If you are currently implementing an intervention at Sheffield Hallam which aims to impact on retention, attainment, or progression, please contact Liz Austen: l.austen@shu.ac.uk for an opportunity to engage with the intervention data dashboard. For more information on evaluating impact using a Theory of Change, please visit https://blog.shu.ac.uk/steer/evaluative-mindset/.