Want to Know What Students Think About Classroom Tech? Ask Them

Alemayehu Bahta
Published in Age of Awareness · Jan 11, 2021 · 5 min read


In my experience, students rarely have a voice in which tools are selected and used in their classrooms. Gathering input from all relevant district stakeholders is a complicated feat, so it’s no surprise that students often don’t make the cut when staff decide whom to include on steering committees. One of the primary challenges in collecting student input is that the people who buy and implement products rarely have direct access to students. Technology buyers and implementers therefore have to work through school contacts and other intermediaries to rally enough buy-in before they can talk with students.

My district wanted to make sure student voice was considered as we renewed a software contract, so we ran four focus groups of about 15 students each. This approach was by no means perfect, nor should it be considered the gold standard, but it was what was available to us and it could be easily replicated by others.

Here are four tips I’d suggest for anyone looking to get student input on their district’s technology tools.

One: Find Students Where They Are

Due to remote learning, our team had to think about the best way to find students who could provide input. Some districts have student leadership boards or other panels of students tasked with giving district staff the student perspective. Our district’s student panel was unavailable because it was already focused on providing feedback on remote learning, so we worked with our school-level implementation champions to identify a new group of students.

One of our school leads was running an extracurricular group that was already planning to use the tool. We supplemented the extracurricular programming by offering an optional student focus group after students had used the product.

I learned three lessons about forming the optimal student focus group after working with these students. First, make sure the group is diverse. This will let you see whether different student groups react to the product differently and how you can close any gaps. Second, if the tool is used across grade levels, make sure you talk to students across those grades. Third, make sure your students have the right level of exposure to the product. If you are going through a renewal, as we were, then recent usage is important. If you are in the middle of an RFP, students should at least have enough exposure to understand a product’s basic functionality.

Two: Set the Stage for the Focus Group and Ask the Right Questions

Prior to the focus group, we created a discussion guide to help facilitate the conversation, along with artifacts for students to react to. The discussion guide structures the flow of how you want the focus group to go and helps you formulate questions that uncover the real problems you want to solve. A quick Google search on focus group moderation can provide useful guidance on how to create a guide as well as other best practices. Since my team couldn’t actually change the product’s functionality or UI, we focused our questions on how well the tool was helping us reach our district’s learning objectives and goals.

Each focus group started with basic ground rules and what we were hoping to learn through the research. We also explained what a focus group is and how one typically flows. These guidelines are an important step, since many students may never have heard of a focus group and may be under the impression that they can only tell us positive things. As with any focus group or interview, there is always an unstated power dynamic in the room that can sway opinions, so we emphasized the need for honest feedback without judgment. To encourage conversation, we recruited the near-peer mentors who were already working with the students in the extracurricular program, since they had existing relationships with them.

Three: Properly Digest the Feedback

After the sessions, I transcribed the recordings and began my analysis by coding the data into themes.
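If it helps to picture the coding step, here is a minimal sketch in Python of tallying manually coded excerpts by theme. The theme codes and quotes are hypothetical stand-ins, not our actual transcripts, and the structure is just one way to organize this kind of analysis.

```python
from collections import Counter, defaultdict

# Each transcript excerpt is tagged by hand with one or more theme codes.
# These excerpts and codes are illustrative placeholders.
coded_excerpts = [
    {"text": "I couldn't figure out where to submit the activity.", "codes": ["navigation_confusion"]},
    {"text": "It helped me think about what I want to do after graduation.", "codes": ["future_planning"]},
    {"text": "The menus look cluttered on my phone.", "codes": ["ui_enhancement"]},
    {"text": "I got lost trying to find the assignment again.", "codes": ["navigation_confusion"]},
]

# Count how often each code appears and keep supporting quotes per theme.
code_counts = Counter()
quotes_by_theme = defaultdict(list)
for excerpt in coded_excerpts:
    for code in excerpt["codes"]:
        code_counts[code] += 1
        quotes_by_theme[code].append(excerpt["text"])

# Surface the most frequent themes with an example quote for each.
for code, count in code_counts.most_common():
    print(f"{code}: {count} excerpt(s), e.g. \"{quotes_by_theme[code][0]}\"")
```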

My analysis resulted in three themes:

1. Students displayed confusion about how to actually complete activities

2. The tool affirmed the importance of self-discovery and future planning

3. There was a greater need for UI enhancements in the tool (outside of our scope of work)

Before taking any action on the identified problems, we wanted to supplement the qualitative analysis with quantitative data that would show the scale of the problems. I designed a survey instrument based on our focus group findings and distributed it to all students and their families. The survey quantified how widespread the problems were and let us disaggregate findings by grade, school, or race. This step took much more time, but when building the case for additional resources or changes, we wanted broader student representation from across the district.
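For a sense of what that disaggregation can look like, here is a minimal sketch using pandas. The column names, the yes/no survey item, and the sample rows are all hypothetical; the point is simply to show a per-group breakdown of a single survey question.

```python
import pandas as pd

# Imagine each row is one student response; "had_trouble_navigating" is a
# hypothetical yes/no survey item coded as 1/0.
responses = pd.DataFrame({
    "grade":  [9, 9, 10, 11, 12, 10],
    "school": ["North HS", "South HS", "North HS", "South HS", "North HS", "South HS"],
    "race":   ["Black", "Latino", "White", "Black", "Asian", "White"],
    "had_trouble_navigating": [1, 0, 1, 1, 0, 0],
})

# Share of students reporting navigation trouble, broken out by each dimension.
for dimension in ["grade", "school", "race"]:
    rates = responses.groupby(dimension)["had_trouble_navigating"].mean()
    print(f"\nBy {dimension}:")
    print(rates.round(2))
```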

Four: Act on the Findings

After properly identifying the problems, we needed to find out what our implementation team could actually do. The challenge was that we could not write new code for the product, so we had to be creative with the tools and services we could provide to enhance the student experience. The lowest-hanging fruit was the difficulty students had navigating the tool. We learned that although students had figured out how to explore the tool, they often learned it only once and were constantly relearning basic navigation. Moreover, since our support resources lived outside the tool, students often ignored them or simply weren’t aware they existed. We therefore built in-app support using other SaaS tools that guided students through task completion without their having to leave the product.

Lastly, we found one area of positive engagement in both our qualitative and quantitative data, so we decided to double down there. Since most students shared that the product helped with self-discovery and future planning (which aligns with our departmental goals), we used that feedback as the basis for a series of lessons that use the product for post-secondary life planning. After the lessons are created, they will be piloted at a single school to understand their impact and then scaled to other schools.

Regardless of how much time it takes to go through the process of gathering student input, it is a necessary step in the selection of classroom technology. As the ultimate end users of many tools, students should have the opportunity to voice their opinions on tools and their efficacy.
