Rationale for using Slido
As a specialist technician in teaching and learning, I chose Slido as the interactive technology intervention for this study because of its potential to enhance student engagement in the classroom. The decision was also informed by the findings of a previous study by Ningsih (2023), which explored the impact of Slido in English for Specific Purposes (ESP) classrooms using a mixed-methods approach to gather both quantitative and qualitative data. The results were encouraging, demonstrating positive effects on students’ perceptions of engagement, motivation, and overall satisfaction with the learning experience.
Building on Ningsih’s (2023) finding that Slido significantly enhanced student engagement in ESP classrooms, this study aims to explore its potential further. Ningsih’s respondents reported finding Slido engaging (41% strongly agreed), enjoyable (42% strongly agreed) and effective in promoting active participation in class discussions (35% strongly agreed). Most respondents preferred quizzes, word clouds and image polls, indicating a preference for interactive and engaging features, while the lower popularity of the multiple-choice feature highlights the importance of offering a diverse range of features to cater for varied student preferences.
I am optimistic that integrating Slido into the classroom can transform classroom interaction by fostering a dynamic interplay between students and the tutor. Research by Muthmainnah (2019) suggests that Slido can significantly boost student participation, encouraging questions, comments and active engagement with learning materials. Aslan et al. (2019) further showed how real-time interaction tools such as Slido can enhance student engagement and create a more positive learning environment.
Technology has the power to transform language learning. It can enhance learning outcomes, boost student engagement, provide immediate feedback, and facilitate personalized and autonomous learning (Chiu, 2021; Memon et al., 2022). By incorporating technology-based activities, we can provide authentic language practice opportunities, making learning more meaningful (Miller, 2018). Interactive technologies, such as quizzes and games, can improve knowledge retention and recall (Wassalwa and Iffah, 2022; Yang and Chen, 2021). Immediate feedback empowers students to identify areas for improvement and focus on them.
I envision a classroom where technology, particularly tools like Slido, would foster a more engaging and interactive learning environment. By encouraging active participation and collaboration, technology can promote student-centred learning and cultivate creativity and innovation (Fonseca and García-Peñalvo, 2019; Onyema et al., 2019).
Data Collection
This research employed an action research cycle approach (McNiff and Whitehead, 2009). Following step 2 of the cycle (Figure 1), interactive elements were incorporated into the research process.
Figure 1. Action research cycle (McNiff and Whitehead, 2009).

On October 24th, 2024, I signed up for a Slido account on its website (https://www.slido.com/) to explore the platform’s features. To promote student participation, I incorporated three specific Slido features into my presentation (Figure 2):
- Q&A (Questions and Answers): facilitates real-time interaction by allowing students to ask questions directly within the presentation.
- Open Text Input: provides a platform for students to share more detailed thoughts or ideas.
- Emoji Rating: allows students to use emojis to express their understanding of, or reaction to, a specific topic.
To enhance student engagement during my PowerPoint presentation, I integrated these interactive features using the Slido add-on. My primary goal was to evaluate whether student participation would increase when students interacted with the presentation content through Slido.
Figure 2. Overview of Slido interactive features

Measurements: Slido
To evaluate the effectiveness of the Slido intervention in the classroom, I designed a questionnaire to gather student feedback on their Slido experience (Table 1). I kept the questionnaire brief so that students could complete it quickly, and uploaded it to Qualtrics, a software platform for building and distributing surveys.
The questionnaire’s design was inspired by Ningsih’s (2023) study on students’ perceptions of Slido use on mobile devices, as well as by further insights from Slido researchers’ discussions of the tool (Slido, 2023; Soe, 2024) and guidance on open-ended questions from previous research (Dawadi et al., 2021; Strijker et al., 2020).
Minimizing Response Bias Through Questionnaire Design
The design of the questionnaire prioritized minimizing response bias. To achieve this, I focused on clear, neutral wording and avoided complex questions, as well as topics on which students might hold strong but uninformed opinions, a phenomenon known as “non-attitudes” (Converse, 1970; Smith, 1984). Converse and Presser (2011) highlight a key challenge in question design: even straightforward concepts can become difficult to answer if the question is overly complex. In writing the Slido questions, I therefore avoided topics that might elicit strong opinions or about which students might lack knowledge, since such questions could lead to inaccurate or biased responses and hinder the effectiveness of the questionnaire.
Slido Questionnaire
The instrument I constructed employed a mixed-methods approach, collecting both quantitative and qualitative data from participants. I built the questionnaire on the Qualtrics platform and implemented skip logic to streamline the survey process: participants who did not consent to take part were redirected to the end of the survey, ensuring that only those who agreed proceeded with the questionnaire. This step was crucial to uphold ethical principles and maintain the integrity of the research. I distributed the link to the questionnaire directly to students in the classroom on 30th October 2024, and participants were prompted to complete it upon accessing it in order to ensure immediate completion and minimize dropout. An information sheet (Figure 3) and consent form (Figure 4) were attached to the questionnaire, serving as a comprehensive cover page for the survey.
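To make the skip-logic step concrete, the minimal sketch below expresses the consent-gating idea in Python. It is purely illustrative: Qualtrics configures this behaviour through its survey-flow settings rather than through code, and the block names used here are hypothetical.

```python
# Minimal, illustrative sketch of consent-based skip logic.
# Qualtrics applies this via its survey-flow settings; this code only mirrors the idea.

def route_respondent(consent_given: bool) -> list[str]:
    """Return the survey blocks a respondent sees, depending on consent."""
    if not consent_given:
        # Respondents who do not consent skip straight to the end of the survey.
        return ["end_of_survey"]
    # Consenting respondents complete the full questionnaire.
    return ["likert_questions", "open_ended_questions", "end_of_survey"]

print(route_respondent(True))   # ['likert_questions', 'open_ended_questions', 'end_of_survey']
print(route_respondent(False))  # ['end_of_survey']
```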
The questionnaire (Table 1) included six Likert scale questions to assess the following aspects:
- Overall satisfaction with Slido for communicating problems
- Slido’s impact on content delivery, pace, and learning support
- Interest in exploring Slido’s features such as Q&A, rating and emojis
- Overall satisfaction with Slido as a learning tool and suggestions for improving the platform
The questionnaire employed a combination of rating scales. Six-point Likert scales, ranging from “Extremely dissatisfied” to “Extremely satisfied”, were used to assess various aspects, and three-point scales with options such as “Fair”, “Good” and “Excellent” were also incorporated. To gather more detailed insights, two open-ended questions were included following suggestions from my supervisor. These questions prompted students to provide specific examples of how Slido had positively influenced their learning experience and to offer suggestions for improving the platform.
The decision to include open-ended questions was made after a discussion with my supervisor during a workshop on October 25th, 2024. During this workshop, I shared my intention to use the Slido application in the classroom, and my supervisor recommended evaluating its impact through a questionnaire that included open-ended questions.
I administered the questionnaire in the classroom through a QR code generated via Qualtrics and embedded in the PowerPoint slides I presented. According to Converse and Presser (2011), open-ended questions can yield rich, detailed responses, but they are susceptible to ambiguity and tacit assumptions: respondents may inadvertently overlook certain options if these are not explicitly presented. In contrast, closed-ended questions provide a standardized framework that enhances data analysis and validity. To leverage the strengths of both question types, I designed the Qualtrics questionnaire for Slido to incorporate both formats: it begins with closed-ended questions to establish a baseline and concludes with two open-ended questions to allow for deeper exploration and more nuanced responses.
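As an aside, a QR code for a survey link can also be produced programmatically rather than within Qualtrics. The sketch below assumes the third-party Python package qrcode (with Pillow installed) and uses a placeholder URL; it is not how the classroom QR code was actually generated.

```python
# Illustrative only: generating a QR code image for a survey link.
# Assumes `pip install qrcode pillow`; the URL below is a placeholder, not the real survey link.
import qrcode

survey_url = "https://example.qualtrics.com/jfe/form/SV_placeholder"
img = qrcode.make(survey_url)           # build the QR code image
img.save("slido_questionnaire_qr.png")  # save it for embedding in a PowerPoint slide
```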
Table 1. Questionnaire to assess Slido
Available as: Questionnaire for using Slido application.docx (also in Excel format: questionnaire.xlsx)

Procedure: Testing the Slido intervention in the classroom
I piloted the Slido intervention in a Year 2 Psychology class in late October 2024. Prior to the intervention, I prepared the questionnaire on Qualtrics by October 20th and informed the unit leader about my Action Research Project and the planned Slido intervention, obtaining the unit leader’s verbal consent to conduct the intervention in class. On October 24th, I briefed the unit leader further, explaining the specific features of the Slido intervention, including rating polls, questions, open text and a post-intervention questionnaire, and detailing how students would connect via a QR code to participate. The unit leader approved the intervention, and it was conducted at the end of October.
Upon entering the classroom, I informed students about the purpose of my Action Research Project and explained that Slido was being used as a method to assess its impact on student learning and interaction. I distributed the information sheet (Figure 3) and consent form (Figure 4), informed students that I had also uploaded these forms online, and emphasized the anonymity of their responses and their right to withdraw at any time (BERA, 2024).
Figure 3. Information sheet.docx
Figure 4. Consent Form.docx
I instructed students to scan the QR code (Figures 5 and 6) on their mobile devices to access the Slido platform and submit their responses. I emphasized the confidentiality and anonymity of their data and reassured them that participation was entirely voluntary and that it was acceptable to withdraw, given my dual role as both a specialist technician and tutor. By creating a safe space where students could choose not to participate, I aimed to mitigate any power dynamics and prioritize their freedom of choice.
Figure 5. Slido’s QR code on PowerPoint.

Figure 6. Slido joining link via mobile.

What resonates with me is that the digital age has ushered in unprecedented opportunities for data collection and analysis. However, it also presents unique ethical challenges, particularly when relying on mobile devices to gather and process data. As van Doorn (2013) and Kara (2015) highlight, digital interactions can be traced by third parties, potentially compromising participant confidentiality. To mitigate these risks, I took measures to ensure student anonymity and data privacy. By enabling anonymized responses in Qualtrics for the Slido questionnaire, and ticking the option not to record IP addresses, I collected no IP addresses, location data or contact details, safeguarding student privacy and research integrity. It is important to note that not all students actively engaged with the Slido features: of the total participants, 22 actively interacted with the Slido Q&A and polls.
When I tested Slido in the classroom, I displayed the following questions on PowerPoint and asked students to respond:
- ‘I feel confident using coding and coding stripes’: I explained to students that this was a rating question (Figure 7).
- ‘Explain any problems you face classifying cases’: I explained to students that they should write their answer freely, as if writing a text message (Figure 7).
- ‘I am confident in my ability to use the Matrix coding task’: I explained to students that this was an emoji-based question (Figure 7).
During the class, students were able to view and read their classmates’ responses on screen.
Figure 7. Questions students were asked via Slido.

Figure 8. Word cloud and Q&A responses in the classroom.

Following the student responses, I presented the slides (Figures 7 and 8) to visualize the results, using a word cloud (Figure 8). For the statement on feeling confident in using coding, the Slido responses were distributed as follows: 50% agreed, 10% strongly agreed, 20% neither agreed nor disagreed, and 20% disagreed (Figure 7).
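For transparency about how such percentage breakdowns are derived, the short sketch below tallies a set of Likert-style poll responses into percentages. The response list is hypothetical and included purely to illustrate the calculation; it is not the actual classroom data.

```python
# Illustrative only: tallying Likert-style poll responses into percentages.
# The response list below is hypothetical, not the actual classroom data.
from collections import Counter

responses = [
    "Agree", "Agree", "Agree", "Agree", "Agree",
    "Strongly agree",
    "Neither agree nor disagree", "Neither agree nor disagree",
    "Disagree", "Disagree",
]

counts = Counter(responses)
total = len(responses)
for option in ["Strongly agree", "Agree", "Neither agree nor disagree", "Disagree"]:
    print(f"{option}: {counts[option] / total:.0%}")
# Strongly agree: 10%
# Agree: 50%
# Neither agree nor disagree: 20%
# Disagree: 20%
```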
References
Aslan, S., Alyuz, N., Tanriover, C., Mete, S. E., Okur, E., D’Mello, S. K., & Arslan Esme, A. (2019) ‘Investigating the Impact of a Real-time, Multimodal Student Engagement Analytics Technology in Authentic Classrooms’, Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 1–12. Available at: https://doi.org/10.1145/3290605.3300534.
Chiu, T. K. F. (2021) ‘Digital support for student engagement in blended learning based on self-determination theory’, Computers in Human Behavior, 124, 106909. Available at: https://www.sciencedirect.com/science/article/pii/S0747563221002326 (Accessed: 3 September 2024).
Converse, J. M. and Presser, S. (2011) ‘The Tools at Hand’, in Converse, J. M. and Presser, S. (eds.) Survey Questions: Handcrafting the Standardized Questionnaire. Thousand Oaks, CA: SAGE Publications, pp. 48–75. Available at: https://dx.doi.org/10.4135/9781412986045.
Converse, P. E. (1970) Attitude change. New York: Wiley.
Dawadi, S., Shrestha, S. and Giri, R. A. (2021) ‘Mixed-Methods Research: A Discussion on its Types, Challenges, and Criticisms’, Journal of Practical Studies in Education, 2(2), 25–36. Available at: https://doi.org/10.46809/jpse.v2i2.20 (Accessed: 22 October 2024).
Fonseca, D. and García-Peñalvo, F. J. (2019) ‘Interactive and collaborative technological ecosystems for improving academic motivation and engagement’, Universal Access in the Information Society, 18(3), 423–430. Available at: https://doi.org/10.1007/s10209-019-00669-8 (Accessed: 4 September 2024).
Kara, H. (2015) Creative Research Methods in the Social Sciences: A Practical Guide. Bristol: Policy Press, pp. 35–53. Available from: ProQuest Ebook Central (Accessed: 4 November 2024).
Memon, M. Q., Lu, Y., Memon, A. R., Memon, A., Munshi, P. and Shah, S. F. A. (2022) ‘Does the Impact of Technology Sustain Students’ Satisfaction, Academic and Functional Performance: An Analysis via Interactive and Self-Regulated Learning?’, Sustainability, 14(12), 7226. Available at: https://doi.org/10.3390/su14127226 (Accessed: 3 September 2024).
Miller, G. J. (2018) ‘Technologies in the Classroom: Advancing English Language Acquisition’, Kappa Delta Pi Record, 54(4), 176–181. Available at: https://doi.org/10.1080/00228958.2018.1515546 (Accessed: 3 September 2024).
Muthmainnah, N. (2019) ‘An effort to improve students’ activeness at structure class using Slido application’, Journal of English Educators Society, 4(1), 1–7. Available at: https://doi.org/10.21070/jees.v4i1.1868.
Ningsih, F. (2023) ‘Uncovering Students’ Perceptions of Slido: An Innovative Engagement with Real-Time Interactive Technology in ESP’, Issues in Applied Linguistics & Language Teaching, 5(1), 7–15. DOI: 10.37253/iallteach.v5i1.7773 (Accessed: 3 September 2024).
Onyema, E. M., Deborah, E. C., Alsayed, A. O., Noorulhasan, Q. and Sanober, S. (2019) ‘Online discussion forum as a tool for interactive learning and communication’, International Journal of Recent Technology and Engineering, 8(4), 4852–4859. Available at: https://www.ijrte.org/wp-content/uploads/papers/v8i4/D8062118419.pdf (Accessed: 3 September 2024).
Slido (2023) Videos of Slido researchers talking: Elevate – Secrets to Building Trust (keynote). Available at: https://youtu.be/RvluGIi_-8w?si=nM9ryC5YSxkBMjmE (Accessed: 3 September 2024).
Smith, P. B. (1984) On the measurement of attitudes. Amsterdam: Elsevier Science Publishers B.V.
Soe, E. Y. (2024) 5 Must-Have Slido Features to Boost Engagement in Your Next Meetings. Slido blog. Available at: https://blog.slido.com/slido-features-engagement/.
Strijker, D., Bosworth, G. and Bouter, G. (2020) ‘Research methods in rural studies: Qualitative, quantitative and mixed methods’, Journal of Rural Studies, 78, 262–270. Available at: https://doi.org/10.1016/j.jrurstud.2020.06.007.
Wassalwa, A. and Iffah, U. (2022) ‘The Effectiveness of Game Quizizz Using Learning Media On Students’ Outcomes’, Journal of English Ibrahimy, 1(2), 10–17. Available at: https://doi.org/10.35316/joey.2022.v1i2.10-17 (Accessed: 3 September 2024).
Yang, K.-H. and Chen, H. H. (2021) ‘What increases learning retention: employing the prediction-observation-explanation learning strategy in digital game-based learning’, Interactive Learning Environments, 1–16. Available at: https://doi.org/10.1080/10494820.2021.1944219 (Accessed: 4 September 2024).