
EduSense: Like a FitBit for your teaching skills

Comprehensive classroom sensing system

Date:
November 6, 2019
Source:
Carnegie Mellon University
Summary:
While training and feedback opportunities abound for K-12 educators, the same can't be said for instructors in higher education. Currently, the most effective mechanism for professional development is for an expert to observe a lecture and provide personalized feedback. But a new system offers a comprehensive real-time sensing system that is inexpensive and scalable to create a continuous feedback loop for the instructor.
FULL STORY

While training and feedback opportunities abound for K-12 educators, the same can't be said for instructors in higher education. Currently, the most effective mechanism for professional development is for an expert to observe a lecture and provide personalized feedback. But a new system developed by Carnegie Mellon University researchers offers a comprehensive real-time sensing system that is inexpensive and scalable to create a continuous feedback loop for the instructor.

The system, called EduSense, analyzes a variety of visual and audio features that correlate with effective instruction. "Today, the teacher acts as the sensor in the classroom, but that's not scalable," said Chris Harrison, assistant professor in CMU's Human-Computer Interaction Institute (HCII). Harrison said class sizes have ballooned in recent decades, and it's difficult to lecture effectively in large or auditorium-style classes.

EduSense is minimally obtrusive. It uses two wall-mounted cameras -- one facing students and one facing the instructor. It senses features such as students' posture, which indicates engagement, and how long instructors pause before calling on a student. "These are codified things that educational practitioners have known as best practices for decades," Harrison said.

A single off-the-shelf camera can view everyone in the classroom and automatically identify information such as where students are looking, how often they're raising their hands and if the instructor moves through the space instead of staying behind a podium. The system uses OpenPose, another CMU project, to extract body position. "With advances in computer vision and machine learning, we can now provide insights that would take days if not months to get with manual observation," said Karan Ahuja, a member of the research team who is pursuing his Ph.D. in the HCII.
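The article doesn't describe EduSense's internals, but a rough sketch can illustrate how OpenPose-style body keypoints might be turned into a classroom signal such as hand raises. The array shape, the BODY_25 keypoint indices, the confidence threshold and the wrist-above-nose heuristic below are illustrative assumptions, not the system's actual classifier.

    # Illustrative sketch (not EduSense's actual pipeline): given OpenPose-style
    # BODY_25 keypoints for everyone in a frame, estimate how many students have
    # a hand raised. Keypoints are assumed to be an array of shape
    # (num_people, 25, 3) holding (x, y, confidence), with image y increasing
    # downward -- so a raised wrist has a SMALLER y value than the nose.

    import numpy as np

    # BODY_25 indices used here (standard OpenPose ordering).
    NOSE, R_WRIST, L_WRIST = 0, 4, 7

    def count_raised_hands(keypoints: np.ndarray, min_conf: float = 0.3) -> int:
        """Count people whose left or right wrist is detected above their nose."""
        raised = 0
        for person in keypoints:
            nose_x, nose_y, nose_c = person[NOSE]
            if nose_c < min_conf:
                continue  # face not reliably detected; skip this person
            for wrist_idx in (R_WRIST, L_WRIST):
                wx, wy, wc = person[wrist_idx]
                # Wrist confidently detected and above the nose => likely hand raise.
                if wc >= min_conf and wy < nose_y:
                    raised += 1
                    break  # count each person at most once
        return raised

    # Example with two synthetic people, one of whom has a raised right wrist.
    people = np.zeros((2, 25, 3), dtype=float)
    people[0, NOSE] = [100, 200, 0.9]
    people[0, R_WRIST] = [120, 150, 0.8]   # wrist above nose -> raised
    people[1, NOSE] = [300, 210, 0.9]
    people[1, R_WRIST] = [310, 400, 0.8]   # wrist below nose -> not raised
    print(count_raised_hands(people))       # -> 1

In a real deployment the keypoints would come from running OpenPose on each camera frame; the heuristic above only shows how per-joint coordinates can be reduced to a simple classroom metric.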

Harrison said learning scientists are interested in the instructional data. "Because we can track the body, it's like wearing a suit of accelerometers. We know how much you're turning your head and moving your hands. It's like you're wearing a virtual motion-capture system while you're teaching."

Using high-resolution cameras streaming 4K video for many classes at once is a "computational nightmare," Harrison said. To keep up, resources are elastically assigned to provide the best possible frame rate for real-time data.
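The article doesn't explain the scheduler, but one simple way to picture "elastic" assignment is dividing a fixed pool of processing workers across however many classrooms are live, so each gets the best frame rate currently available. The worker pool size, per-worker throughput, even split and room names below are assumptions for illustration, not the real EduSense scheduler.

    # Illustrative sketch (not the real EduSense scheduler): split a fixed pool
    # of video-processing workers as evenly as possible across active
    # classrooms, then estimate the frame rate each room can get. Assumes each
    # worker handles roughly FPS_PER_WORKER frames per second of 4K video.

    FPS_PER_WORKER = 2.0   # assumed throughput of one worker (frames/sec)

    def assign_workers(total_workers: int, active_rooms: list) -> dict:
        """Spread workers evenly, giving any leftovers to the first rooms."""
        n = len(active_rooms)
        if n == 0:
            return {}
        base, extra = divmod(total_workers, n)
        return {room: base + (1 if i < extra else 0)
                for i, room in enumerate(active_rooms)}

    def estimated_fps(workers: int) -> float:
        return workers * FPS_PER_WORKER

    # Hypothetical rooms: fewer live classes means more workers (and frames
    # per second) for each one.
    allocation = assign_workers(10, ["Room A", "Room B", "Room C"])
    for room, w in allocation.items():
        print(f"{room}: {w} workers, about {estimated_fps(w):.1f} fps")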

The project also has a strong focus on privacy protection, guided by Yuvraj Agarwal, an associate professor in the university's Institute for Software Research. The team didn't want to identify individual students, and EduSense can't. No names or other identifying information is collected, and because camera data is processed in real time, it can be discarded quickly rather than stored.

Now that the team has demonstrated that they can capture the data, HCII faculty member Amy Ogan said their current challenge is wrapping it up and presenting it in a way that's educationally effective. The team will continue working on instructor-facing apps to see if professors can integrate the feedback into practice. "We have been focused on understanding how, when and where to best present feedback based on this data so that it is meaningful and useful to instructors to help them improve their practice," she said.

This research has been presented at UbiComp and the International Conference of the Learning Sciences, and will be presented this coming April at the American Educational Research Association annual meeting.


Story Source:

Materials provided by Carnegie Mellon University. Original written by Virginia Alvino Young. Note: Content may be edited for style and length.


