Smiley Faces - What Can Instant Student Satisfaction Ratings Tell Us?

Practical teaching is very different to lectures, seminars and problem classes. Students are expected to work with specialist equipment, sometimes independently and sometimes in groups, and they interact with a range of staff members, including teaching technicians and graduate teaching assistants (GTAs).

Yet most methods for student evaluations of teaching ask for feedback on whole modules or even programmes, of which lab sessions constitute just one part. By conflating laboratory experiences with lectures and other activities, the feedback received from students may not be directly applicable to our practical teaching. We also employ a wide range of practical activity styles, from introductory equipment tutorials to open-ended design challenges, and we would like to gain insight into each activity individually.

To capture more granular data specifically on practical activities, MEE developed a system for students to press a "smiley face" button as they leave the laboratory. They can rate their experience of the session from extremely happy to extremely unhappy, although this happiness rating is not in response to any particular question or stimulus.

Every button press is precisely time-stamped and linked to a room and activity, allowing fine-grained data capture of student sentiments for every session that we run.
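As a rough illustration of what a single press generates, the sketch below shows one possible record structure; the field names and the five-point numeric scale are assumptions made for the example, not the actual MEE logging schema.

```python
from dataclasses import dataclass
from datetime import datetime

# Illustrative sketch only: the field names and the 1-5 numeric scale are
# assumptions for this example, not the actual MEE logging schema.
@dataclass
class ButtonPress:
    timestamp: datetime  # precise time of the button press
    room: str            # laboratory the press was recorded in
    activity: str        # practical activity scheduled in that room
    rating: int          # 1 = extremely unhappy ... 5 = extremely happy

# A single anonymous press as it might be logged
press = ButtonPress(
    timestamp=datetime(2020, 2, 13, 16, 57),
    room="Lab 2",
    activity="Introductory equipment tutorial",
    rating=4,
)
```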


The vast amount of data generated by the system stands alone: it is completely anonymous and cannot be linked to any other metrics on the students. This gives us a lot of freedom to analyse it without ethical concerns. However, it comes at the price of having no contextual data, which makes it nearly impossible to ascertain why any particular button was pressed. We can tell how happy a student was, but we don't know why!

To find some reasons behind the happiness scores, we analysed a large dataset covering six weeks of lab sessions, attempting to correlate other anonymous environmental factors with the student feedback. These factors included class size, staff/student ratio, time of day, day of the week and even the external weather! The correlation process looked for trends between the happiness scores and each underlying factor, for example day of the week and time of day.
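As a minimal sketch of this correlation step (assuming a hypothetical sessions.csv with one row per session, a mean_rating column, and numerically encoded factors), the analysis might look something like this:

```python
import pandas as pd

# Hypothetical session-level table: one row per lab session, with the mean
# happiness score and the environmental factors already encoded numerically
# (e.g. day_of_week as 0-4, hour_of_day as 9-17). Column names are assumed.
sessions = pd.read_csv("sessions.csv")

factors = ["class_size", "staff_student_ratio",
           "hour_of_day", "day_of_week", "outside_temperature"]

# Pearson correlation of each factor with the mean happiness score per session
correlations = sessions[factors + ["mean_rating"]].corr()["mean_rating"]
print(correlations.drop("mean_rating").sort_values())
```

Weak correlations across all of these factors are exactly what we found, as summarised in the video below.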

This 5-minute video explains the full system and summarises some of our key findings:


As shown in the video, very few of the environmental factors correlate with the trends observed in the student satisfaction measurements. So even though we have a vast amount of data, and it confirms that MEE is doing a great job of making students happy in the labs, it is of limited use for evaluation and development.

The tool is a great way of gathering large datasets and provides reassurance about aggregate student satisfaction, but for meaningful analysis and conclusions the data must be taken in context alongside questionnaires, focus groups and other free-form responses.

This work was presented at the IEEE FIE Conference 2020, and the full paper can be found here. 
