Working Smarter: Improving Personalized Stem Cell Treatments for Kids
Apr 09, 2025 —

Stem cell therapies are improving recovery and survival rates for pediatric cancer patients. But the treatments can be risky. They can weaken the immune system, making children highly vulnerable to infections. And there are other potential long-term complications, including damage to tissues and organs.
A team of researchers at Georgia Tech has taken on this problem, creating a new way to predict how these cutting-edge treatments might work in a particular patient. The approach could revolutionize treatments for kids with complex immune system challenges.
Read full story here.
By Jerry Grillo
Keiretsu Forum Due Diligence Workshop
REGISTER HERE by Monday, May 26 @ 10:00 a.m.
This workshop is designed to quickly bring members up to speed on the forum's detailed due diligence process, a key element in its successful investment strategy.
INNS Executive Director Search Vision Talk: Candidate 3
Three finalists have been chosen for the role of Executive Director of the Institute for Neuroscience, Neurotechnology, and Society (INNS). Each finalist will meet with Georgia Tech faculty, staff, and IRI leadership and give a seminar on their vision for the INNS.
Finalist 3: Michelle LaPlaca
Date: June 9th, 2025
Time: 11 a.m. - Noon
Location: Callaway Manufacturing Research Building (GT Manufacturing Institute)
813 Ferst Drive NW, Atlanta, GA 30332, seminar room 114
INNS Executive Director Search Vision Talk: Candidate 2
Three finalists have been chosen for the role of Executive Director of the Institute for Neuroscience, Neurotechnology, and Society (INNS). Each finalist will meet with Georgia Tech faculty, staff, and IRI leadership and give a seminar on their vision for the INNS.
Finalist 2: Chris Rozell
Date: June 3rd, 2025
Time: 11 a.m. - Noon
Location: Callaway Manufacturing Research Building (GT Manufacturing Institute)
813 Ferst Drive NW, Atlanta, GA 30332, seminar room 114
INNS Executive Director Search Vision Talk: Candidate 1
Three finalists have been chosen for the role of Executive Director of the Institute for Neuroscience, Neurotechnology, and Society (INNS). Each finalist will meet with Georgia Tech faculty, staff, and IRI leadership and give a seminar on their vision for the INNS.
Finalist 1: Lewis Wheaton
Date: May 28th, 2025
Time: 11 a.m. - Noon
Location: Callaway Manufacturing Research Building (GT Manufacturing Institute)
813 Ferst Drive NW, Atlanta, GA 30332, seminar room 114
Painting a Target on Cancer to Make Therapy More Effective
May 19, 2025 —

The combination approach that Lena Gamboa, seated, Gabe Kwong, foreground, and Ali Zamat developed tags the tumors with a synthetic "flag," then uses specially engineered cells from the patient's own immune system to attack the cancer. They found their approach worked against hard-to-treat breast, brain, and colon cancers. It also turned the cancer into an immune system training ground, allowing the body to recognize and fight any tumors that regrow. (Photo: Candler Hobbs)
Biomedical engineers at Georgia Tech created a treatment that could one day unlock a universal strategy for treating some of the hardest-to-treat cancers — like those in the brain, breast, and colon — by teaching the immune system to see what it usually misses.
Their experimental approach worked against those kinds of cancers in lab tests and didn’t damage healthy tissues. Importantly, it also stopped cancer from returning.
While the therapy is still in the early stages of development, it builds on well-established, safe technologies, giving the treatment a clearer, quicker path to clinical trials and patient care.
Reported in May in the journal Nature Cancer, their technique is a one-two punch that flags tumor cells so they can be recognized and then eliminated by specially enhanced T cells from the patient’s own immune system.
Joshua Stewart
College of Engineering
2025 BIO Vendor Showcase
The Institute for Bioengineering and Bioscience (IBB) and the Bioengineering and Bioscience Unified Graduate Students (BBUGS) are hosting a BIO Vendor Showcase - an opportunity for faculty, staff, and students to explore products and services from more than 25 companies exhibiting their equipment and research techniques. Attendees can enjoy refreshments and enter raffles for vendor-donated prizes throughout the event.
Exhibitor Registration - $275 per table
AI Chatbots Aren’t Experts on Psych Med Reactions — Yet
May 14, 2025 —

The study was led by computer science Ph.D. student Mohit Chandra (pictured) and Munmun De Choudhury, J.Z. Liang Associate Professor in the School of Interactive Computing.
Asking artificial intelligence for advice can be tempting. Powered by large language models (LLMs), AI chatbots are available 24/7, are often free to use, and draw on troves of data to answer questions. Now, people with mental health conditions are asking AI for advice when experiencing potential side effects of psychiatric medicines — a decidedly higher-risk situation than asking it to summarize a report.
One question puzzling the AI research community is how AI performs when asked about mental health emergencies. Globally, including in the U.S., there is a significant gap in mental health treatment, with many individuals having limited to no access to mental healthcare. It’s no surprise that people have started turning to AI chatbots with urgent health-related questions.
Now, researchers at the Georgia Institute of Technology have developed a new framework to evaluate how well AI chatbots can detect potential adverse drug reactions in chat conversations, and how closely their advice aligns with human experts. The study was led by Munmun De Choudhury, J.Z. Liang Associate Professor in the School of Interactive Computing, and Mohit Chandra, a third-year computer science Ph.D. student. De Choudhury is also a faculty member in the Georgia Tech Institute for People and Technology.
“People use AI chatbots for anything and everything,” said Chandra, the study’s first author. “When people have limited access to healthcare providers, they are increasingly likely to turn to AI agents to make sense of what’s happening to them and what they can do to address their problem. We were curious how these tools would fare, given that mental health scenarios can be very subjective and nuanced.”
De Choudhury, Chandra, and their colleagues introduced their new framework at the 2025 Annual Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics on April 29, 2025.
Putting AI to the Test
Going into their research, De Choudhury and Chandra wanted to answer two main questions: First, can AI chatbots accurately detect whether someone is having side effects or adverse reactions to medication? Second, if they can accurately detect these scenarios, can AI agents then recommend good strategies or action plans to mitigate or reduce harm?
The researchers collaborated with a team of psychiatrists and psychiatry students to establish clinically accurate answers from a human perspective and used those to analyze AI responses.
To build their dataset, they went to the internet’s public square, Reddit, where many have gone for years to ask questions about medication and side effects.
They evaluated nine LLMs, including general-purpose models (such as GPT-4o and Llama 3.1) and specialized models trained on medical data. Using the evaluation criteria provided by the psychiatrists, they computed how precise the LLMs were in detecting adverse reactions and correctly categorizing the types of adverse reactions caused by psychiatric medications.
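To picture what that scoring step involves, here is a minimal sketch in Python. The record format, labels, and toy data are illustrative assumptions, not the authors' actual evaluation pipeline.

```python
# Illustrative sketch (not the study's code): scoring an LLM's adverse drug
# reaction (ADR) detection and categorization against psychiatrist labels.
from sklearn.metrics import precision_score

# Hypothetical records: expert label vs. LLM output for each Reddit-style post.
records = [
    {"expert_adr": 1, "llm_adr": 1, "expert_type": "sedation",    "llm_type": "sedation"},
    {"expert_adr": 1, "llm_adr": 0, "expert_type": "weight_gain", "llm_type": "none"},
    {"expert_adr": 0, "llm_adr": 1, "expert_type": "none",        "llm_type": "insomnia"},
    {"expert_adr": 1, "llm_adr": 1, "expert_type": "insomnia",    "llm_type": "agitation"},
]

# Detection precision: of the posts the LLM flagged as ADRs, how many really were?
detection_precision = precision_score(
    [r["expert_adr"] for r in records],
    [r["llm_adr"] for r in records],
)

# Categorization accuracy: among true ADRs, how often did the LLM name the right type?
true_adrs = [r for r in records if r["expert_adr"] == 1]
categorization_accuracy = sum(
    r["llm_type"] == r["expert_type"] for r in true_adrs
) / len(true_adrs)

print(f"Detection precision: {detection_precision:.2f}")          # 0.67 on this toy data
print(f"Categorization accuracy: {categorization_accuracy:.2f}")  # 0.33 on this toy data
```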
Additionally, they prompted the LLMs to generate answers to queries posted on Reddit and compared how well those answers aligned with the clinicians' responses on four criteria: (1) emotion and tone expressed, (2) answer readability, (3) proposed harm-reduction strategies, and (4) actionability of the proposed strategies.
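As a rough illustration of that alignment comparison, the sketch below scores one LLM answer against a clinician answer on the four criteria. The readability check uses the textstat package's Flesch reading-ease score, and the other three criteria are represented by placeholder human-rater scores; the study's exact metrics are not described above, so treat every name and number here as an assumption.

```python
# Illustrative sketch only: comparing an LLM answer to a clinician answer on
# the four alignment criteria named above. These metrics are stand-ins, not
# the study's actual measures.
import textstat  # readability scoring package (assumed installed)

def readability_alignment(llm_answer: str, expert_answer: str) -> float:
    """Map the gap in Flesch reading-ease scores to a 0-1 alignment score."""
    gap = abs(
        textstat.flesch_reading_ease(llm_answer)
        - textstat.flesch_reading_ease(expert_answer)
    )
    return max(0.0, 1.0 - gap / 100.0)

def alignment_report(llm_answer: str, expert_answer: str, rater_scores: dict) -> dict:
    """Combine the automatic readability check with human-rated scores (0-1)
    for emotion/tone, harm-reduction strategies, and actionability."""
    report = {"readability": readability_alignment(llm_answer, expert_answer)}
    for criterion in ("emotion_tone", "harm_reduction", "actionability"):
        report[criterion] = rater_scores[criterion]
    return report

# Example usage with made-up rater scores
print(alignment_report(
    "Stop the medicine and see a doctor soon.",
    "Contact your prescriber today; do not stop the medication abruptly.",
    {"emotion_tone": 0.8, "harm_reduction": 0.4, "actionability": 0.5},
))
```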
The research team found that LLMs stumble when comprehending the nuances of an adverse drug reaction and distinguishing different types of side effects. They also discovered that while LLMs sounded like human psychiatrists in their tones and emotions — such as being helpful and polite — they had difficulty providing true, actionable advice aligned with the experts.
Better Bots, Better Outcomes
The team’s findings could help AI developers build safer, more effective chatbots. Chandra’s ultimate goals are to inform policymakers of the importance of accurate chatbots and help researchers and developers improve LLMs by making their advice more actionable and personalized.
Chandra notes that improving AI for psychiatric and mental health concerns would be particularly life-changing for communities that lack access to mental healthcare.
“When you look at populations with little or no access to mental healthcare, these models are incredible tools for people to use in their daily lives,” Chandra said. “They are always available, they can explain complex things in your native language, and they become a great option to go to for your queries.
“When the AI gives you incorrect information by mistake, it could have serious implications on real life,” Chandra added. “Studies like this are important, because they help reveal the shortcomings of LLMs and identify where we can improve.”
Citation: Lived Experience Not Found: LLMs Struggle to Align with Experts on Addressing Adverse Drug Reactions from Psychiatric Medication Use, (Chandra et al., NAACL 2025).
Funding: National Science Foundation (NSF), American Foundation for Suicide Prevention (AFSP), Microsoft Accelerate Foundation Models Research grant program. The findings, interpretations, and conclusions of this paper are those of the authors and do not represent the official views of NSF, AFSP, or Microsoft.

Munmun De Choudhury, J.Z. Liang Associate Professor in the School of Interactive Computing
Catherine Barzler, Senior Research Writer/Editor
Institute Communications
catherine.barzler@gatech.edu
IBB Seminar
Dae-Hyeong Kim
Professor
School of Chemical and Biological Engineering
Seoul National University
The Kim lab specializes in the research and development of nanomaterial-integrated translational soft devices designed to improve the quality of life for everyone. By pioneering advanced material strategies, their group aims to create innovative systems for biomedical engineering, electronics, optoelectronics, and catalytic applications.
Georgia CTSA Informatics Lunch and Learn
REGISTER HERE
Presented by:
Meredith Lora, MD, Associate Professor, Grady PrEP Program