I love summer vacations. But I hate planning them. So I experimented with a chatbot that helps me plan a family trip. I learned a lot from the experience, and gained new insight into why AI is still a long way from replacing many of the intangibles that humans bring to important jobs like tutoring.
AI is undoubtedly great at making users more efficient and organized — it can quickly evaluate, customize, and present information and options. But it still misses the mark when it comes to capturing the human experience and nuance that makes an impact on everything from motivating a struggling student to planning a memorable family trip.
In my case, the bot suggested an itinerary and then, as if it knew me, told me to “have a great vacation.” The effect fell completely flat. My first reaction was, “Damn, AI.” Then I started thinking: why is AI so bad at giving compliments and fostering connections, yet so good at categorizing?
Science vs. Society
Chatbots are computer programs designed to mimic certain human interactions, often using natural language processing (NLP) to decipher user questions and send automated responses in real time. These chatbots aim to simulate human conversations for a better user experience, but they lack key elements such as tone, context, empathy and humor.
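Under the hood, the simplest chatbots are little more than pattern-matching loops: classify the input, emit a templated response. Here is a minimal, hypothetical sketch of that rule-based style in Python (the rules and replies are invented for illustration; real NLP chatbots use statistical or neural language models, but the limitation it demonstrates is the same):

```python
import re

# Hypothetical keyword rules: each pattern maps to a canned reply.
RULES = [
    (re.compile(r"\b(hi|hello|hey)\b", re.I),
     "Hello! How can I help you plan your trip?"),
    (re.compile(r"\b(beach|sea|ocean)\b", re.I),
     "A beach destination sounds great. How many days?"),
    (re.compile(r"\b(budget|cost|price)\b", re.I),
     "I can suggest options across price ranges."),
]

FALLBACK = "Sorry, I didn't understand. Could you rephrase?"

def reply(message: str) -> str:
    """Return the first canned response whose pattern matches the message."""
    for pattern, response in RULES:
        if pattern.search(message):
            return response
    # No tone, context, or empathy: anything unmatched gets the same fallback.
    return FALLBACK

if __name__ == "__main__":
    print(reply("Hi there!"))
    print(reply("My kid had a rough week at school."))  # falls through to FALLBACK
```

Note what happens to the second message: an emotionally loaded sentence matches no rule and gets a generic shrug. That, in miniature, is the gap between categorizing inputs and understanding people.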
Dr. Rosalind Picard, director of the Affective Computing Group at the MIT Media Lab, points out that emotions are an essential element of human-computer interaction, and has worked for many years to improve computers’ ability to recognize human emotions. Her work, along with many other studies, has repeatedly shown that humans treat computers as social actors rather than inanimate objects, even yelling at them when frustrated.
However, while computers and AI are cognitively intelligent, they are emotionally insensitive, which led to the failure of products like Clippy, the virtual assistant Microsoft introduced in 1996. Clippy offered unsolicited advice with no sense of context or user intent, frustrating users rather than helping them.
So what is effective in motivating people? People. They provide the critical social and emotional engagement. Think about it: if you asked anyone to recall what most inspired, challenged, or motivated them at school, it’s highly unlikely the answer would be a computer program.
Similarly, today’s students are unlikely to cite chatbots as their best motivators. Teachers who understand them and all the complexities shaping their learning, peers who inspire them, and classmates who support and encourage them on their academic journey are far more likely to earn that credit than AI.
This doesn’t mean that AI can’t be used in education. Learning engineers and researchers are increasingly leveraging AI in smart ways to improve learning experiences and outcomes. The key is knowing when to use AI and when to add that all-important human touch.
Adopting a hybrid approach
There are many things artificial intelligence can do in the tutoring space, most importantly making tutoring more accessible, scalable, and efficient. AI apps can set and track goals, provide reminders, and offer new techniques. Chatbots make learning interactions more personal and convenient by letting students access them at whatever time and place suits them best.
AI tutoring platforms are tailored to better identify student needs, fill learning gaps, and customize instruction. But AI tutoring currently fails to empathize with students or to inspire and encourage them in the critical ways they need to overcome learning pain points.
But there is potential for synergy between teachers’ social-emotional learning skills and AI’s effectiveness, and it’s already happening: new AI programs are helping identify which students need help, and others are developing better approaches to give hints to teachers, doing so in ways that keep students from dismissing the AI as useless.
For example, Carnegie Mellon University’s Personalized Learning Village (PLV) is building a scalable pathway to help recruit, train, evaluate, and assign human and computer-based “mentor tutors.” Their intervention, Personalized Learning Squared (PLUS), is a hybrid human-AI tutoring system that provides each student with the amount of tutoring they need based on their individual needs.
Funded by LEVI (Learning Engineering Virtual Institute), the project builds on decades of learning science research through cutting-edge tutor training and an AI-powered app that gives tutors “superhuman” powers to reach every student quickly and effectively. Through the app, tutors can leverage data from students’ existing math software to access motivational aids and personalize learning in real time, resulting in more students receiving high-quality tutoring at a lower cost.
I’ve been working with LEVI and another grant recipient at the University of Colorado Boulder, whose team is doing similarly innovative research at the interface between humans and AI. At Boulder, researchers have partnered with Saga Education to develop a hybrid human-agent tutoring (HAT) platform that analyzes instructor-student interactions and provides instructors with detailed analytical feedback to improve performance. The team aims to blend the best of human tutoring and AI.
The platform leverages many of the strengths of human tutors: recommending challenging assignments, facilitating deep discussions, nurturing student-tutor relationships, providing feedback and guidance, and supporting collaborative learning. Using learning engineering techniques, the project aims to rapidly scale high-quality tutoring to more than 275,000 diverse, low-income students within five years.
Make connections
The importance of personalized instruction in education cannot be overstated, and the need for more scalable ways to deliver it is urgent. Teacher shortages, stagnant or declining test scores, regional resource disparities, and funding challenges for schools, especially those serving diverse populations, could all be eased by personalized instruction.
But removing the social-emotional learning component from tutoring and education is like removing the human element from planning a family vacation. Sure, it lightens the load in some ways, but it ignores the personal insights, traits, and desires that make such adventures meaningful and lasting. Not to mention that a motivational speech from a computer rings hollow.
Education is no different. While AI can enhance learning and chatbots can supplement many aspects of education and tutoring, the real success lies in establishing better tutoring platforms that support teachers, not replace them. Such synergy can help educators close learning gaps and give students the academic and emotional confidence that turns learning challenges into lifelong success.