New Year's Day Holiday

Georgia Tech will be closed in observance of the New Year's Day holiday.

MLK Jr. National Holiday

Georgia Tech will be closed in observance of the M.L.K. Jr. National Holiday.

Wearable Health Equity Workshop: Rural Healthcare and Wellbeing

AGENDA

08:30 - 09:00 a.m.        Registration
09:00 - 10:00 a.m.        Morning Keynote, Dr. Phillipp Gutruf
10:00 - 10:15 a.m.        Break
10:15 - 11:30 a.m.        Technology Panel
11:30 a.m. - 1:00 p.m.    Poster Session and Lunch
1:00 - 2:00 p.m.          Research Presentations
2:00 - 3:00 p.m.          Afternoon Keynote, Dr. Kimberlee McKay
3:00 - 3:15 p.m.          Break

Project 159 with Tim Lieuwen

Join us as Tim Lieuwen, executive vice president of research at Georgia Tech, discusses “Project 159,” an initiative through which Georgia Tech aspires to engage with each of Georgia’s 159 counties. This event is hosted by the Georgia Tech Institute for People and Technology and is sponsored by Tech Square Atlanta and Collaborative Real Estate.

To attend, register here >>

Georgia Tech Team Designing Robot Guide Dog to Assist the Visually Impaired

Georgia Tech researchers test their prototype of a robotic guide dog. Photo by Terence Rushin/College of Computing.

People who are visually impaired and cannot afford or care for service animals might have a practical alternative in a robotic guide dog being developed at Georgia Tech.

Before launching its prototype, a research team within Georgia Tech’s School of Interactive Computing, led by Professor Bruce Walker and Assistant Professor Sehoon Ha, is working to improve its methods and designs based on research within blind and visually impaired (BVI) communities.

“There’s been research on the technical aspects and functionality of robotic guide dogs, but not a lot of emphasis on the aesthetics or form factors,” said Avery Gong, a recent master’s graduate who worked in Walker’s lab. “We wanted to fill this gap.”

Training a guide dog can cost up to $50,000, so few BVI individuals can afford one, and even fewer can afford to feed and care for it. A guide dog also typically has fewer than 10 working years before it must be replaced.

Gong co-authored a paper on the design implications of the robotic guide dog that was presented at the 2025 International Conference on Robotics and Automation (ICRA) in Atlanta in May.

The study’s participants broadly agreed that they would prefer a robotic guide dog that:

  • resembles a real dog and appears approachable
  • has a clear identifier of being a guide dog, such as a vest
  • has built-in GPS and Bluetooth connectivity
  • has control options such as voice command
  • has soft textures without feeling furry
  • has long battery life and self-charging capability

“A lot of people said they didn’t want the dog to look too cute or appealing because it would draw too much attention,” said Aviv Cohav, another lead author of the paper and recent master’s graduate.

“Many people have issues with taking their guide dog to places, whether it’s little kids wanting to play with the dog or people not liking dogs or people being scared of them, and that reflects on the owners themselves. We wanted to look at what would be a good balance between having a functional robot that wouldn’t scare people away or be a distraction.”

The researchers also had to consider the perspectives of sighted individuals and how society at large might view a robotic guide dog.

An example of this is the amount of noise the dog makes while walking. The owner needs to hear the dog is active, but the clanky sound many off-the-shelf robots make could create disturbances in indoor spaces that amplify sounds. To offset the noise, the team developed algorithms that allow the robot to move more quietly.

Walker and his lab have examined similar scenarios that must take public perception into account.

“We like to think of Georgia Tech as going the extra mile,” Walker said. “Let’s not just make a robot, but a robot that’s going to fit into society.

“To have impact, the technologies we produce must be produced with society in mind. This is a holistic design that considers the users and all the people with whom the users interact.”

Taery Kim, a computer science Ph.D. student, began working on the concept of a robotic guide dog when she came to Georgia Tech in 2022. She and Ha, her advisor, have authored papers on building the robot’s navigation and safety components. 

“When I started, I thought it would be as simple as giving the guide dog a command to take me to Starbucks or the grocery store, and it would just take me,” Kim said. “But the user must give waypoint directions — ‘go left here,’ ‘turn right,’ ‘go forward,’ ‘stop.’ Detailed commands must be delivered to the dog.”
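As a rough illustration of that interaction model, the hypothetical Python sketch below maps spoken waypoint phrases to discrete robot commands. The phrase list, command names, and controller stub are illustrative assumptions; the article does not describe the team’s actual software.

```python
from enum import Enum, auto

class Command(Enum):
    """Discrete waypoint commands a handler might issue to the robot."""
    GO_FORWARD = auto()
    TURN_LEFT = auto()
    TURN_RIGHT = auto()
    STOP = auto()

# Hypothetical mapping from spoken phrases to commands; the real system's
# vocabulary and voice pipeline are not specified in the article.
PHRASES = {
    "go forward": Command.GO_FORWARD,
    "go left here": Command.TURN_LEFT,
    "turn right": Command.TURN_RIGHT,
    "stop": Command.STOP,
}

def parse_command(utterance: str) -> Command | None:
    """Return the matching command, or None if the phrase is unrecognized."""
    return PHRASES.get(utterance.strip().lower())

def execute(command: Command) -> str:
    """Stand-in for sending a motion request to the robot's controller."""
    actions = {
        Command.GO_FORWARD: "walking forward",
        Command.TURN_LEFT: "turning left at the next intersection",
        Command.TURN_RIGHT: "turning right at the next intersection",
        Command.STOP: "stopping",
    }
    return actions[command]

if __name__ == "__main__":
    for utterance in ["go left here", "go forward", "stop"]:
        cmd = parse_command(utterance)
        if cmd is not None:
            print(f"'{utterance}' -> {execute(cmd)}")
```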

While a real dog has naturally enhanced senses of hearing and smell that can’t be replicated, technology can provide interconnected safety features during an emergency. The researchers envision a camera system equipped with a 360-degree field of view, computer vision algorithms that detect obstacles or hazards, and voice recognition that detects calls for help. An SOS function could automatically call 911 at the owner’s request or if the owner is unresponsive.

Kim said the robot should also have explainability features to enhance communication with the owner. For example, if the robot suddenly stops or ignores an owner’s commands, it should tell the owner that it’s detecting a hazard in their path.
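Those envisioned safety and explainability behaviors could fit together roughly as in the minimal sketch below, which assumes hypothetical sensor fields and alert hooks that are not part of the published work.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    """Hypothetical snapshot from the robot's perception stack."""
    hazard_detected: bool
    hazard_label: str = ""
    owner_responsive: bool = True

def explain_stop(reading: SensorReading) -> str:
    """Explainability: tell the owner why the robot is overriding a command."""
    return f"Stopping: I detect {reading.hazard_label or 'an obstacle'} ahead."

def check_emergency(reading: SensorReading, sos_requested: bool) -> str | None:
    """Return an alert message if help should be summoned, else None."""
    if sos_requested or not reading.owner_responsive:
        # Stand-in for an automatic 911 call or caregiver notification.
        return "SOS triggered: contacting emergency services."
    return None

if __name__ == "__main__":
    reading = SensorReading(hazard_detected=True, hazard_label="a low-hanging branch")
    if reading.hazard_detected:
        print(explain_stop(reading))  # explain the sudden stop to the owner
    print(check_emergency(reading, sos_requested=False) or "No emergency detected.")
```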

Manufacturing a robot at scale would initially be expensive, but the researchers believe the cost would eventually be offset because of its longevity. BVI individuals may only need to purchase one during their lifetime.

Before introducing a prototype, the multidisciplinary research team recognizes that it will need to enlist experts from other fields to address the project’s remaining implications and research gaps.

Walker said the teams welcome additional partners who are keen to tackle challenges ranging from design and engineering to battery life to human-robot interaction.

A graphic depicts design considerations for the prototype.
 
News Contact

Nathan Deen, Communications Officer
School of Interactive Computing

nathan.deen@cc.gatech.edu

Liberian Students Awarded Georgia Tech Fellowships in Computer Science

University of Liberia President Layli Maparyan is pictured with students starting the Georgia Tech Online Master of Science in Computer Science program this fall.

In a landmark achievement for higher education and international collaboration, 12 faculty and staff from the University of Liberia have been accepted into the Georgia Institute of Technology’s Online Master of Science in Computer Science (OMSCS) program. This marks the first time Georgia Tech has offered full fellowships to students for its acclaimed online graduate program.

The inaugural cohort began their studies in August, setting a precedent for future scholarship opportunities and academic collaboration between Georgia Tech and Liberian institutions. 

The initiative results from a strategic partnership between the University Consortium for Liberia (UCL) and Georgia Tech aimed at expanding access to world-class computer science education for Liberian students. Cynthia Blandford, president and CEO of UCL and former honorary consul for the Republic of Liberia in Atlanta, expressed her pride in the milestone.

“The UCL's mission is to help provide brighter futures through education and understanding, and this includes student and faculty exchanges, curriculum development, academic scholarships, joint research, and fundraising,” said Blandford.

The announcement follows a 2023 visit to Atlanta by Liberian President Joseph Boakai, during which Georgia Tech formally introduced the OMSCS scholarship program for Liberia. Michael Best, executive director of the Institute for People and Technology at Georgia Tech, emphasized the program's significance.

“Georgia Tech was delighted to host the president of Liberia,” said Best. “This is the first time the OMSCS degree at Georgia Tech is providing complete fellowships to students. I am so glad Liberia is our partner in this groundbreaking program.”

The OMSCS program, hailed by Forbes as the “greatest degree program ever,” is the first fully accredited online master’s degree in computer science offered by a major U.S. university. It combines academic rigor with the flexibility of online learning, allowing students to earn the same degree as their on-campus peers.

Best added that completing the program will be a personal achievement for the students and a strategic investment in Liberia’s future.

“The graduates of this program will help to ensure that Liberia is a full participant and contributor to our digital age. These students’ advanced training will position them for leadership and impact within Liberia and beyond.”

University of Liberia (UL) President Layli Maparyan is excited about the collaboration with Georgia Tech and UCL. 

“The Georgia Tech OMSCS is equipping UL’s computer science faculty and IT staff with a profound degree of capacity building,” she stated. “This positions UL well for planned curricular developments in AI, cybersecurity, and other key IT areas of study. We are profoundly grateful to Georgia Tech for the timely launch.”

The 12 University of Liberia students accepted into the program are:

  • Harris Barwu
  • Clarence Carlwolo
  • Viola Cheeseman
  • Alieu Farhat
  • Varney Jarteh
  • Fredrick Juah
  • Abubakar Keita
  • Yougie Kessellie
  • Josephus Nyumalin
  • Melvin Soclo
  • Michael Umunna
  • Martin Wallace
 
News Contact

Walter Rich

Accompaniment, Design, and Research

Carl DiSalvo, Professor, School of Interactive Computing, Georgia Tech

Winter Break: Campus Closed

The Georgia Tech campus is closed for winter break.

The Algorithm Will See You Now — But Only If You’re the Perfect Patient

An illustration representing a doctor working with an AI-powered health device.

In the morning, before you even open your eyes, your wearable device has already checked your vitals. By the time you brush your teeth, it has scanned your sleep patterns, flagged a slight irregularity, and adjusted your health plan. As you take your first sip of coffee, it’s already predicted your risks for the week ahead.

Georgia Tech researchers warn that this version of AI healthcare imagines a patient who is "affluent, able-bodied, tech-savvy, and always available." Those who don’t fit that mold, they argue, risk becoming invisible in the healthcare system.

The Ideal Future

In their study, published in the Proceedings of the ACM Conference on Human Factors in Computing Systems, the researchers analyzed 21 AI-driven health tools, ranging from fertility apps and wearable devices to diagnostic platforms and chatbots. They used sociological theory to understand the vision of the future these tools promote — and the patients they leave out.

“These systems envision care that is seamless, automatic, and always on,” said Catherine Wieczorek, a Ph.D. student in human-centered computing in the School of Interactive Computing and lead author of the study. “But they also flatten the messy realities of illness, disability, and socioeconomic complexity.”

Four Futures, One Narrow Lens

During their analysis, the researchers discovered four recurring narratives in AI-powered healthcare:

  1. Care that never sleeps. Devices track your heart rate, glucose levels, and fertility signals — all in real time. You are always being watched, because that’s framed as “care.”
  2. Efficiency as empathy. AI is faster, more objective, and more accurate. Unlike humans, it doesn’t get tired or biased. This pitch downplays the value of human judgment and connection.
  3. Prevention as perfection. A world where illness is avoided through early detection, provided you have the right sensors, the right app, and the right lifestyle.
  4. The optimized body. You’re not just healthy, you’re high-performing. The tech isn’t just treating you; it’s upgrading you.

“It’s like healthcare is becoming a productivity tool,” Wieczorek said. “You’re not just a patient anymore. You’re a project.”

Not Just a Tool, But a Teammate

This study also points to a critical transformation in which AI is no longer just a diagnostic tool; it’s a decision-maker. Described by the researchers as “both an agent and a gatekeeper,” AI now plays an active role in how care is delivered.

In some cases, AI systems are even named and personified, like Chloe, an IVF decision-support tool. “Chloe equips clinicians with the power of AI to work better and faster,” its promotional materials state. By framing AI this way — as a collaborator rather than just software — these systems subtly redefine who, or what, gets to be treated.

“When you give AI names, personalities, or decision-making roles, you’re doing more than programming. You’re shifting accountability and agency. That has consequences,” said Shaowen Bardzell, chair of Georgia Tech’s School of Interactive Computing and co-author of the study.

“It blurs the boundaries,” Wieczorek noted. “When AI takes on these roles, it’s reshaping how decisions are made and who holds authority in care.”

Calculated Care

Many AI tools promise early detection, hyper-efficiency, and optimized outcomes. But the study found that these systems risk sidelining patients with chronic illness, disabilities, or complex medical needs — the very people who rely most on healthcare.

“These technologies are selling worldviews,” Wieczorek explained. “They’re quietly defining who healthcare is for, and who it isn’t.”

By prioritizing predictive algorithms and automation, AI can strip away the context and humanity that real-world care requires. 

“Algorithms don’t see nuance. It’s difficult for a model to understand how a patient might be juggling multiple diagnoses or understand what it means to manage illness, while also navigating other important concerns like financial insecurity or caregiving. They are predetermined inputs and outputs,” Wieczorek said. “While these systems claim to streamline care, they are also encoding assumptions about who matters and how care should work. And when those assumptions go unchallenged, the most vulnerable patients are often the ones left out.” 

AI for ALL

The researchers argue that future AI systems must be developed in collaboration with those who don’t fit in the vision of a “perfect patient.” 

“Innovation without ethics risks reinforcing existing inequalities. It’s about better tech and better outcomes for real people,” Bardzell said. “We’re not anti-innovation. But technological progress isn’t just about what we can do. It’s about what we should do — and for whom.”

Wieczorek and Bardzell aren’t trying to stop AI from entering healthcare. They’re asking AI developers to understand who they’re really serving.

 

Funding:
This work was supported by the National Science Foundation (Grant #2418059). 

 

 
News Contact

Michelle Azriel, Sr. Writer-Editor
mazriel3@gatech.edu

Georgia Tech’s Jill Watson Outperforms ChatGPT in Real Classrooms

A new version of Georgia Tech’s virtual teaching assistant, Jill Watson, has demonstrated that artificial intelligence can significantly improve the online classroom experience. Developed by the Design Intelligence Laboratory (DILab) and the U.S. National Science Foundation AI Institute for Adult Learning and Online Education (AI-ALOE), the latest version of Jill Watson integrates OpenAI’s ChatGPT and is outperforming OpenAI’s own assistant in real-world educational settings.

Jill Watson not only answers student questions with high accuracy; it also improves teaching presence and correlates with better academic performance. Researchers believe this is the first documented instance of a chatbot enhancing teaching presence in online learning for adult students.

How Jill Watson Shaped Intelligent Teaching Assistants

First introduced in 2016 using IBM’s Watson platform, Jill Watson was the first AI-powered teaching assistant deployed in real classes. It began by responding to student questions on discussion forums like Piazza using course syllabi and a curated knowledge base of past Q&As. Widely covered by major media outlets including The Chronicle of Higher Education, The Wall Street Journal, and The New York Times, the original Jill pioneered new territory in AI-supported learning.

Subsequent iterations addressed early biases in the training data and transitioned to more flexible platforms like Google’s BERT in 2019, allowing Jill to work across learning management systems such as EdStem and Canvas. With the rise of generative AI, the latest version now uses ChatGPT to engage in extended, context-rich dialogue with students using information drawn directly from courseware, textbooks, video transcripts, and more.

Future of Personalized, AI-Powered Learning

Designed around the Community of Inquiry (CoI) framework, Jill Watson aims to enhance “teaching presence,” one of three key factors in effective online learning, alongside cognitive and social presence. Teaching presence includes both the design of course materials and facilitation of instruction. Jill supports this by providing accurate, personalized answers while reinforcing the structure and goals of the course.

The system architecture includes a preprocessed knowledge base, a MongoDB-powered memory for storing conversation history, and a pipeline that classifies questions, retrieves contextually relevant content, and moderates responses. Jill is built to avoid generating harmful content and only responds when sufficient verified course material is available.
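The article does not include the implementation, but the described flow (classify the question, retrieve verified course content, generate an answer, then moderate it) might look roughly like the Python sketch below. Every function body is a stand-in; the classifier, retriever, generation step, and moderation logic shown here are assumptions, not Jill Watson’s actual code.

```python
from dataclasses import dataclass, field

@dataclass
class Conversation:
    """Stand-in for the MongoDB-backed conversation memory described above."""
    history: list[str] = field(default_factory=list)

def classify(question: str) -> str:
    """Hypothetical classifier: route questions by type (logistics vs. content)."""
    return "logistics" if "deadline" in question.lower() else "content"

def retrieve(question: str, knowledge_base: dict[str, str]) -> str | None:
    """Toy keyword retrieval over a preprocessed knowledge base."""
    for key, passage in knowledge_base.items():
        if key in question.lower():
            return passage
    return None

def moderate(draft: str) -> bool:
    """Placeholder safety check run before anything is shown to a student."""
    banned = ("medical advice", "personal data")
    return not any(term in draft.lower() for term in banned)

def answer_question(question: str, kb: dict[str, str], memory: Conversation) -> str:
    """End-to-end pipeline: classify, retrieve, generate, moderate."""
    memory.history.append(question)
    topic = classify(question)  # could steer retrieval or response tone
    context = retrieve(question, kb)
    if context is None:
        # Respond only when sufficient verified course material is available.
        return "I don't have verified course material for that; please ask the instructor."
    draft = f"[{topic}] Based on the course materials: {context}"
    return draft if moderate(draft) else "Response withheld by the moderation step."

if __name__ == "__main__":
    kb = {"midterm": "The midterm covers modules 1-5 and is open book."}
    print(answer_question("What does the midterm cover?", kb, Conversation()))
```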

Field-Tested in Georgia and Beyond

In Fall 2023, Jill Watson was deployed in Georgia Tech’s Online Master of Science in Computer Science (OMSCS) artificial intelligence course, serving over 600 students, and in an English course at Wiregrass Georgia Technical College (WGTC), part of the Technical College System of Georgia (TCSG).

A controlled A/B experiment in the OMSCS course allowed researchers to compare outcomes between students with and without access to Jill Watson, even though all students could use ChatGPT. The findings are striking:

  • Jill Watson’s accuracy on synthetic test sets ranged from 75% to 97%, depending on the content source. It consistently outperformed OpenAI’s Assistant, which scored around 30%.
  • Students with access to Jill Watson showed stronger perceptions of teaching presence, particularly in course design and organization, as well as higher social presence.
  • Academic performance also improved slightly: students with Jill saw more A grades (66% vs. 62%) and fewer C grades (3% vs. 7%).

A Smarter, Safer Chatbot

While Jill Watson uses ChatGPT for natural language generation, it restricts outputs to validated course material and verifies each response using textual entailment. According to a study by Taneja et al. (2024), Jill not only delivers more accurate answers than OpenAI’s Assistant but also produces confusing or harmful content at significantly lower rates.
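The study does not name the entailment model, but the general idea of keeping an answer only when the retrieved course text entails it can be sketched with an off-the-shelf natural language inference model. The checkpoint, threshold, and helper function below are assumptions for illustration, not the system’s actual verifier.

```python
# Sketch: verify a generated answer against source course text with an NLI model.
# Requires: pip install transformers torch
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_NAME = "roberta-large-mnli"  # assumed public checkpoint, not Jill Watson's model
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME)

def is_entailed(course_text: str, answer: str, threshold: float = 0.8) -> bool:
    """Return True if the course text (premise) entails the answer (hypothesis)."""
    inputs = tokenizer(course_text, answer, return_tensors="pt", truncation=True)
    with torch.no_grad():
        probs = torch.softmax(model(**inputs).logits, dim=-1)[0]
    entail_id = model.config.label2id.get("ENTAILMENT", probs.argmax().item())
    return probs[entail_id].item() >= threshold

if __name__ == "__main__":
    premise = "The midterm covers modules 1 through 5 and is open book."
    draft_answer = "The midterm is open book and covers modules 1-5."
    if is_entailed(premise, draft_answer):
        print(draft_answer)
    else:
        print("Answer withheld: it is not supported by the course material.")
```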

The reported numbers bear this out:

  • Accuracy: Jill Watson answers correctly 78.7% of the time, compared with 30.7% for OpenAI’s Assistant.
  • Failure quality: only 2.7% of Jill Watson’s errors are categorized as harmful and 54.0% as confusing, versus harmful failures 14.4% of the time and confusing failures 69.2% of the time for OpenAI’s Assistant.
  • Retrieval failures: 43.2% for Jill Watson versus 68.3% for OpenAI’s Assistant.

What’s Next for Jill

The team plans to expand testing across introductory computing courses at Georgia Tech and technical colleges. They also aim to explore Jill Watson’s potential to improve cognitive presence, particularly critical thinking and concept application. Although quantitative results for cognitive presence are still inconclusive, anecdotal feedback from students has been positive. One OMSCS student wrote:

“The Jill Watson upgrade is a leap forward. With persistent prompting I managed to coax it from explicit knowledge to tacit knowledge. Kudos to the team!”

The researchers also expect Jill to reduce instructional workload by handling routine questions and enabling more focus on complex student needs.

Additionally, AI-ALOE is collaborating with the publishing company John Wiley & Sons, Inc., to develop a Jill Watson virtual teaching assistant for one of their courses, with the instructor and university chosen by Wiley. If successful, this initiative could potentially scale to hundreds or even thousands of classes across the country and around the world, transforming the way students interact with course content and receive support.

A Georgia Tech-Led Collaboration

The Jill Watson project is supported by Georgia Tech, the U.S. National Science Foundation’s AI-ALOE Institute (Grants #2112523 and #2247790), and the Bill & Melinda Gates Foundation.

Core team members are Saptrishi Basu, Jihou Chen, Jake Finnegan, Isaac Lo, JunSoo Park, Ahamad Shapiro, and Karan Taneja, under the direction of Professor Ashok Goel and Sandeep Kakar. The team works under Beyond Question LLC, an AI-based educational technology startup.

 
News Contact

Breon Martin