Hello again!
This is a somewhat overdue follow-up to my blog post from back in 2022 (Our strategy for student feedback in MEE).
A year after the full rollout of a departmental feedback system, I will be discussing how it was implemented and what the impacts have been for staff and students...
...was it worth doing?
Perhaps this will be food for thought for student voice systems in your own department, or maybe you can give me some comments/pointers from your own experiences!
A quick recap
Back in 2022 we identified gaps in MEE's (Multidisciplinary Engineering Education) feedback net. At the time we didn't have a system in place that was able to capture the important day-to-day information about student experience.
The end-of-semester module surveys (called 'TellUs' at Sheffield) and the NSS don't provide a tight enough net - a student would have to remember a comment for weeks or months before they could submit it through the survey. MEE therefore introduced a digital 'open-door' style feedback letterbox allowing students to give anonymous feedback about their classes, on the fly, anytime.
We also made a place to collect this feedback 'data', along with all the other student feedback we could get our hands on (...nothing fancy, just a spreadsheet). The previous blog explains all that in more detail along with all the components of the feedback system, and the aspirations before it was introduced.
What's happened since
After a trial phase, the feedback system was introduced department-wide (in MEE/ The Diamond) in 2023.
Since its initial trial incarnation, the Diamond (MEE) Feedback Letterbox has had a bit of a branding update.
When a student is in a lab class, they can reach the feedback letterbox in just one click (or a scan of the QR code) to leave a comment about their experience.
So over the last year or so, students have been submitting comments through the feedback letterbox. In addition, we sifted through TellUs (and NSS) data and Staff-Student Committee feedback, and collected it all into the central feedback database. With all of this, we:
- Had lots of small wins (low-hanging fruit)
- Shared good news stories with staff
- Identified bigger themes/trends to fix long term
As a result of all this, we have made lots of headway in improving things across the department. Arguably, we have 'plugged gaps' in our feedback net with some success.
...but can we quantify this impact in any way?
Annual Review
It is hard to measure the impact of introducing the Feedback Strategy to MEE. However, as a simple metric, many comments have been addressed and improvements made - so there is demonstrable progress in the right direction. We scooped up over a year's worth of data, and here are some key headlines:
At the end of 2023, MEE's feedback database had 114 rows of data. 33 of these were comments submitted via the feedback letterbox. In general 1 database row equates to 1 comment, but for TellUs comments 1 row in the database represents a collection of comments related to a module. Data was also gathered from NSS, Staff-Student Committees, and Focus Groups.
This feedback data collectively formed 91 actions - of which:
- 88% are completed
- 9% need chasing up
- 3% formed larger/longer-term projects (in progress)
Reflecting on this a bit more:
Categorising actions as 'small' or 'large' was a simple, sensible way of managing lots of data. It allowed us as a department to quickly fix the easy problems (the 'low-hanging fruit'), while identifying recurring themes so that we could take a more strategic view of bigger changes in the long term.
The extra admin effort of managing the system was an early concern. It turned out to be fairly low, both because it was spread over a year and because the volume of comments submitted to the feedback letterbox was not particularly high. Looking ahead:
- The number of submissions (engagement) through the feedback letterbox will likely increase as feedback culture develops in the department/faculty (student awareness).
- However, engagement may decrease as we fix things and there are fewer issues to report.
Closing the loop remains a challenge! It's impractical to close the loop on hundreds of individual comments (without bombarding students with info they won't read). After discussing this challenge with the faculty student experience committee, we decided that MEE would provide short "you said, we did" segments in departmental newsletters - a short summary of key bits of student feedback and the corrective actions we took.
Has the feedback strategy helped?
Yes!
Reviewing the data showed that the introduction of the feedback strategy (including the feedback letterbox) has directly resulted in a sizeable volume of improvements to education quality and student experience across the department. We are now more effective at listening to student voice and implementing changes as a result.
Staff engagement with the process, including receiving feedback and deciding on/taking 'Actions', has been great. With this, and because 'feedback' comes up more often in team meetings, MEE's culture/business-as-usual is now incrementally more focussed on student voice.
Could it be improved?
Yes!
Annual reviews will keep happening to make sure that the system is useful, effective, and evolving with the times!
In the future we want to explore using automations to improve the efficiency of the system, and to explore more ways of closing the feedback loop. There are a few ideas down on paper for this - such as displaying "you said, we did" bulletins on our display screens around the building.
The main future aim is to share more about this work in MEE (hence this blog) and thereby gain insight from others (you?) doing work on student feedback/voice/experience.
Thanks for reading!
Please leave a comment or get in touch if you want to discuss!