A Gesture-Based System Puts AI in the Classroom
This article is part of our exclusive IEEE Journal Watch series in partnership with IEEE Xplore.
For as long as classrooms have existed, students have dutifully crossed their “t’s” and dotted their “i’s” on paper using pens, pencils, or paint. But a group of researchers in Taiwan envisions a very different approach to learning, built on an AI edge-computing platform.
The platform lets students write and draw in mid-air with their fingers, while motion-tracking technology projects their work onto a computer screen at the front of the classroom. The approach, described in a study published on 10 April in the IEEE Canadian Journal of Electrical and Computer Engineering, could help teachers more efficiently manage and teach large groups of students—for example, by seeing real-time results of multiple students’ work on a large screen simultaneously.
Liang-Bi Chen is an associate professor and chair of the Department of Computer Science and Information Engineering at National Penghu University of Science and Technology in Taiwan. He notes there are several different challenges that can hinder teaching efficiency in classrooms, especially as individual teachers become responsible for larger numbers of students.
“These [challenges] include limited interaction between teachers and students, and students struggling to understand lesson content in real-time,” Chen says. As well, he notes that classroom supplies can be expensive, and sharing of classroom supplies can be unhygienic.
AI Tools for Education
The new system proposed by Chen and his colleagues could help address all of these issues. Each student is provided with a device that has a screen and webcam, the latter of which tracks 21 joint points of the hand in detail via the MediaPipe gesture-tracking library. An AI model identifies specific hand gestures and recognizes when users want to switch modes (for example, switching from writing to selecting a color). Each student’s work appears on the screen in front of them, but can also be transmitted to the larger screen at the front of the classroom.
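To make the landmark-based approach concrete, here is a minimal Python sketch of how a hand pose could be classified from MediaPipe-style output. MediaPipe Hands reports 21 (x, y) landmarks per hand, with image-space y increasing downward; index 0 is the wrist, 8 is the index fingertip, 6 the index PIP joint, and so on. The simple tip-above-joint rule below is an illustrative assumption, not the AI model described in the study.

```python
# Classify a coarse hand pose from a list of 21 MediaPipe-style
# (x, y) landmarks. The tip-vs-PIP heuristic is an assumption made
# for illustration only.

# Fingertip and PIP-joint landmark indices (thumb omitted for simplicity)
FINGERTIPS = {"index": 8, "middle": 12, "ring": 16, "pinky": 20}
PIP_JOINTS = {"index": 6, "middle": 10, "ring": 14, "pinky": 18}

def extended_fingers(landmarks):
    """Return the set of fingers whose tip is above its PIP joint."""
    return {
        name for name, tip in FINGERTIPS.items()
        if landmarks[tip][1] < landmarks[PIP_JOINTS[name]][1]  # smaller y = higher
    }

def classify_pose(landmarks):
    """Map a landmark list to a coarse gesture label."""
    up = extended_fingers(landmarks)
    if up == {"index"}:
        return "draw"        # one raised finger -> drawing mode
    if not up:
        return "pick_color"  # clenched fist -> color selection
    return "idle"            # anything else is ignored
```

In a real deployment the classification would come from a trained model rather than a hand-written rule, but the landmark indices shown match MediaPipe’s published hand model.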
Students can raise a finger to enter “drawing mode” and clench their fist when they want to select a different color. If the system detects finger movement toward the virtual menu area at the edge of the canvas, it triggers menu operations. As each student makes hand motions in mid-air, their virtual writing or art is projected onto a large screen at the front of the classroom.
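The mode-switching and edge-menu behavior described above can be sketched as a small dispatch loop. The 10-percent-wide menu strip, the state names, and the `CanvasSession` class are illustrative assumptions, not details from the study.

```python
# Sketch of per-frame dispatch: gestures switch modes, and a fingertip
# entering a strip along the left canvas edge triggers the menu.
# MENU_WIDTH and all names are assumptions for illustration.

MENU_WIDTH = 0.10  # fraction of canvas width reserved for the menu strip

class CanvasSession:
    def __init__(self):
        self.mode = "idle"
        self.strokes = []  # fingertip positions recorded while drawing

    def on_frame(self, gesture, fingertip):
        """Handle one tracked frame: a gesture label and a normalized (x, y)."""
        x, y = fingertip
        if x < MENU_WIDTH:           # fingertip moved into the menu area
            return "menu"
        if gesture == "draw":        # raised finger: record the stroke
            self.mode = "drawing"
            self.strokes.append(fingertip)
        elif gesture == "pick_color":  # clenched fist: color selection
            self.mode = "color_select"
        return self.mode
```

A session like this would run per student on the edge device, with the accumulated strokes streamed to the teacher’s display.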
“The teacher receives real-time video and drawing results from each student, which are integrated and displayed on a large screen to facilitate synchronized teaching and interaction,” explains Chen, adding that the students’ writing results are automatically uploaded to a cloud platform, allowing their handwriting to be evaluated after class as well.
In their study, Chen and his colleagues explored the system’s accuracy, defined as its ability to consistently track the video data captured by student-end devices, as more and more students use it simultaneously. The results show that the system performs with 100 percent accuracy when between 5 and 10 students are using the program. This accuracy dropped to 96 percent when 30 students were using it simultaneously. But overall, Chen says these results demonstrate “strong real-time performance and scalability, making it well-suited for practical classroom applications.”
He notes that the overall cost of the system is low, at approximately US $6,250 for 30 students, and would circumvent the need to routinely buy classroom supplies. Chen adds that the system, being touch-free, also promotes better hygiene in the classroom.
A limitation of the current system, he says, is that its responsiveness gradually degrades as the number of users grows. It is also currently limited to virtual writing and drawing features, but his team is working on developing more interactive teaching functions, such as quizzes or group discussions.
The researchers have patented the system and are looking to collaborate with educational institutions or hardware manufacturers to introduce it into schools in Taiwan. They also plan to build upon the system and its functions, for example by developing automated scoring and facial-expression recognition to evaluate students’ attentiveness.
“We believe that education is one of the most promising areas for digital transformation. Although this system was initially designed to support handwriting and drawing instruction, its core concept and architecture can be adapted to a broader range of teaching scenarios,” says Chen.
“Our goal is to continually refine and promote this system, ensuring that every child has access to a safe, interactive, and affordable learning environment.”