Scheller Business Insights: How Data Can Transform Museum Experiences

Abhishek Deshmane, assistant professor of operations management

When you walk through a museum, the path you take feels natural — guided by curiosity, aesthetics, and maybe a helpful app. But behind the scenes, that journey is shaped by decisions about layout and design that can make or break your experience. Research by Abhishek Deshmane, assistant professor of operations management, reveals how data-driven models can help cultural institutions — and other experience-based businesses — optimize these layouts to boost engagement.

Read more » 

 

Yellow Jacket Connection Sparks Glaucoma Research Fund at Tech

Hannah Youngblood

An estimated 4 million Americans have glaucoma, a group of eye diseases that can lead to irreversible blindness. Now, Georgia Tech is home to a Glaucoma Research Fund that will support cutting-edge work to understand and advance treatments for the disease.

The new initiative was sparked by ongoing research at Georgia Tech — and a Yellow Jacket connection: when Postdoctoral Research Fellow Hannah Youngblood’s work on exfoliation glaucoma (XFG) was featured by the BrightFocus Foundation, it caught the attention of Jennifer Rucker, an Alabama resident who was diagnosed with XFG several years ago.

Excited that the research could change outcomes for people like her — and proud that it’s happening at the alma mater of her husband, Philip Rucker, EE 72 — Jennifer Rucker reached out to Youngblood and her advisor, Raquel Lieberman, professor and Kelly Sepcic Pfeil, Ph.D. Chair in the School of Chemistry and Biochemistry.

“As the wife of a Georgia Tech graduate and an individual with pseudoexfoliation glaucoma, I was inspired to support the scientists whose efforts may help me and others,” Jennifer Rucker says. What followed was a meaningful dialogue and a shared sense of purpose — and the creation of the Georgia Tech Glaucoma Research Fund (Wreck Glaucoma! Fund). 

“It meant so much that Jennifer took the initiative to reach out to learn more about our research,” says Lieberman. “Moments like this remind me how deeply meaningful it is to connect with people in the broader community who are navigating glaucoma. Opportunities for such personal connections are rare, but they inspire and further motivate us to achieve our lab’s mission to improve the lives of individuals suffering from blindness diseases.”

A personal connection

Youngblood’s interest in glaucoma research also stems from a personal connection: her father was diagnosed with glaucoma as a young adult. Now, Youngblood studies the genetic and molecular factors behind XFG in the Lieberman research lab.

“XFG is an aggressive form of the disease with no known cure,” Youngblood says. While scientists know that XFG is the result of abnormal accumulation of proteins in the eye, current treatments only address symptoms rather than treating the root cause of the disease.

“We know XFG is driven by protein buildup, but we still don’t know why it happens,” she explains. “My work studying specific genetic variants aims to uncover this.” 

The genetics of glaucoma

In particular, Youngblood is researching the role of LOXL1, a protein that plays a role in soft tissue throughout the body, including the eyes.

“Research has shown that people with variants in the genes responsible for this protein are more likely to have XFG,” she says. “That made me curious to see if the variants might be impacting the structure of the LOXL1 protein itself and how those variants might lead to disease.”

Youngblood is currently testing her theory in the lab. “My hope is that new insight into proteins like LOXL1 will bring us closer to treatments that address XFG at its source,” she says. “The new Georgia Tech Glaucoma Research Fund is a tremendous step forward in making that hope a reality.”

Support the Georgia Tech Glaucoma Research Fund

Please visit the Glaucoma Research Fund support page to give to this specific program. To discuss additional philanthropic opportunities, please contact the College of Sciences Development Team: development@cos.gatech.edu

Your investment ensures that these scholars and researchers have world-class resources, facilities, and mentors to excel in this critical work. Thank you for helping us shape the future.

Raquel Lieberman
 
News Contact

Confronting the Roadblocks in Medical Technology Innovation

A panel of five speakers sits at the front of a classroom during a moderated discussion, with a presentation slide displaying their names and academic titles behind them.

Georgia Tech’s Institute for Matter and Systems (IMS) hosted its second Boundaries and Breakthroughs panel on Jan. 27, bringing together leading clinicians, engineers, and data experts to examine why promising medical technologies often fail to translate into clinical practice.

Moderated by IMS Executive Director Eric Vogel, the panel explored how innovation, regulation, economics, and clinical realities intersect to shape the future of medical devices.

The panel featured John Duke, physician and director of the Center for Health Analytics and Informatics at Georgia Tech Research Institute; Matthew Flavin, assistant professor in the School of Electrical and Computer Engineering; HyunJoo Oh, assistant professor in the schools of Industrial Design and Interactive Computing; and Lokesh Guglani, pediatric pulmonologist and clinician-researcher at Children’s Healthcare of Atlanta. 

Vogel opened the event by highlighting the gap between technological novelty and real-world medical adoption. 

“About 75% of medical device start-ups never achieve commercial success or make it to market, and some industry estimates push this higher,” Vogel said. “Even those that reach the market often fail to gain meaningful adoption. This may be because technologists optimize for platforms five or 10 years out and are rewarded by novelty, whereas clinicians demand reliability, interpretability, and outcomes that hold up with real patients, real workflows, and real liability.”

Throughout the discussion, panelists examined the tension between rapid innovation and clinical safety, noting that the level of invasiveness often determines how bold developers can be.

“We must remember that in medicine—and especially when we're dealing with human lives—there's a significant asymmetry of the harm that could be done,” said Guglani. “Even a small change or an oversight at the design level of a medical device can have significant downstream repercussions for patients and create liability for institutions and providers.”

Flavin and Duke added that excessive conservatism, particularly around non-invasive wearables, can also slow potentially life-changing advancements.

All panelists agreed that breakthrough technology alone is not enough to ensure clinical adoption. Usability, workflow fit, and time efficiency often determine whether clinicians adopt a device. Tools that require lengthy calibration or add to a clinician’s already tight schedule rarely succeed. Even when a technology integrates well, reimbursement barriers can prevent adoption. 

 “A lot of technologies come out, but then if the clinic is using them and is not being reimbursed for the time spent, that creates a bottleneck,” said Guglani.

Economic constraints also shape who benefits from innovation. Children with rare diseases, stroke survivors, and other small or heterogeneous patient groups often struggle to attract investors, even when their needs are urgent.

The panelists also discussed the dual role of regulatory and manufacturing standards. Good Manufacturing Practice (GMP) requirements ensure consistent, safe production, but force teams to lock designs earlier than ideal, adding cost and slowing iteration. These requirements protect patients but also function as an economic filter for many early-stage technologies.

The conversation then turned to data, AI, and the education of future innovators. Despite massive amounts of health data, many clinically important areas remain data‑scarce. Wearable devices, such as smart watches, may help close these gaps, but AI models remain limited by the quality of input data. 

When asked about preparing the next generation of MedTech innovators, panelists emphasized the importance of “interface literacy,” or the ability to collaborate across disciplinary boundaries and understand how design decisions cascade into real clinical environments.

“You really do have to be able to be interdisciplinary,” said Duke. “Now of course what makes things go is not often the knowledge of the domain, but the person’s role or connectivity into the system.”

Vogel closed by emphasizing that successful medical technology development requires “ongoing, honest collaboration” across fields. The Boundaries and Breakthroughs series will continue that mission in February with a panel on the future of the electric grid.

 
News Contact

Amelia Neumeister | Communications Program Manager

The Institute for Matter and Systems

From Fusion to Self-Driving Cars, High Performance Computing and AI are Everywhere in 2026

CSE in 2026

While not as highlight-reel worthy as the Winter Olympics and the World Cup, experts expect high-performance computing (HPC) to have an even bigger impact on daily life in 2026.

Georgia Tech researchers say HPC and artificial intelligence (AI) advances this year are poised to improve how people power their homes, design safer buildings, and travel through cities.

According to Qi Tang, scientists will take progressive steps toward cleaner, sustainable energy through nuclear fusion in 2026. 

“I am very hopeful about the role of advanced computing and AI in making fusion a clean energy source,” said Tang, an assistant professor in the School of Computational Science and Engineering (CSE).

“Fusion systems involve many interconnected processes happening across different scales. Modern simulations, combined with data-driven methods, allow us to bring these pieces together into a unified picture.”

Tang’s research connects HPC and machine learning with fusion energy and plasma physics. This year, Tang is continuing work on large-scale nuclear fusion models.

Only a few experimental fusion reactors exist worldwide compared to more than 400 nuclear fission reactors. Tang’s work supports a broader effort to turn fusion from a promising idea into a practical energy source.

Nuclear fusion occurs in plasma, the fourth state of matter, where gas is heated to millions of degrees. In this extreme state, electrons are stripped from atoms, creating a hot soup of fast-moving ions and free electrons. In plasma, hydrogen nuclei overcome their natural electrical repulsion, collide, and fuse together. This releases energy that can power cities and homes.

Computers interpret extreme temperatures, densities, pressures, and plasma particle motion as massive datasets. Tang works to assimilate these data types from computer models and real-world experiments.

To do this, he and other researchers rely on machine learning approaches to analyze data across models and experiments more quickly and to produce more accurate predictions. Over time, this will allow scientists to test and improve fusion reactor designs toward commercial use. 

Beyond energy and nuclear engineering, Umar Khayaz sees broader impacts for HPC in 2026.

“HPC is the need of the day in every field of engineering sciences, physics, biology, and economics,” said Khayaz, a CSE Ph.D. student in the School of Civil and Environmental Engineering.

“HPC is important enough to say that we need to employ resources to also solve social problems.”

Khayaz studies dynamic fracture and phase-field modeling. These areas explore how materials break under sudden, rapid loads. 

Like nuclear fusion, Khayaz says dynamic fracture problems are complex and data-intensive. In 2026, he expects to see more computing resources and computational capabilities devoted to understanding these problems and other emerging civil engineering challenges.

CSE Ph.D. student Yiqiao (Ahren) Jin sees a similar relationship between infrastructure and self-driving vehicles. He believes AI will transform this area in 2026.

At Georgia Tech, Jin develops efficient multimodal AI systems. An autonomous vehicle is a multimodal system that uses camera video, laser sensors, language instructions, and other inputs to navigate city streets under changing scenarios like traffic and weather patterns.

Jin says multimodal research will move beyond performance benchmarks this year. This shift will lead to computer systems that can reason despite uncertainty and explain their decisions. As a result, engineers will redefine how they evaluate and deploy autonomous systems in safety-critical settings.

“Many foundational problems in perception, multimodal reasoning, and agent coordination are being actively addressed in 2026. These advances enable a transition from isolated autonomous systems to safer, coordinated autonomous vehicle fleets,” Jin said. 

“As these systems scale, they have the potential to fundamentally improve transportation safety and efficiency.”

 
News Contact

Bryant Wine, Communications Officer
bryant.wine@cc.gatech.edu

Better Brain-Machine Interfaces Could Allow the Paralyzed to Communicate Again


During a research session, a participant imagines saying the text cue on the screen. The bottom text is the brain-computer interface’s prediction of the imagined words. (Photo courtesy: Chethan Pandarinath)

Last summer, a team of researchers reported using a brain-computer interface to detect words people with paralysis imagined saying, even without them physically attempting to speak. They also found they could differentiate between the imagined words they wished to express and the person’s private inner thoughts.

It’s a significant step toward helping people with diseases like amyotrophic lateral sclerosis, or ALS, reconnect with language after they’ve lost the ability to talk. And it’s part of a long-running clinical trial on brain-computer interfaces involving biomedical engineers from Georgia Tech and Emory University alongside collaborators at Stanford University, Massachusetts General Hospital, Brown University, and the University of California, Davis. 

Together, they’re exploring how implanted devices can read brain signals and help patients use assistive devices to recover some of their lost abilities.

Speech has become one of the hottest areas for these interfaces as scientists leverage the power of artificial intelligence, according to Chethan Pandarinath, associate professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory and one of the researchers involved in the trials.

“We can place electrodes in parts of the brain that are related to speech,” he said, “and even if the person has lost the ability to talk, we can pick up the electrical activity as they try to speak and figure out what they’re trying to say.”

Read the full story in Helluva Engineer magazine.

 
News Contact

Joshua Stewart
College of Engineering

Digital Doppelgängers

Illustration of a laptop computer with a digital silhouette of the Georgia Tech campus on the screen along with lightning bolts and water drops.

Extreme weather, congested streets, aging infrastructure — just some of the challenges that communities and their residents face every day. Solving them requires more than traditional planning; it demands tools that can anticipate problems before they happen. 

One of the tools our researchers are turning to is called a digital twin. These virtual models mirror real-world systems in real time to make communities safer, transportation smarter, and campus operations more efficient.

Unlike static simulations, digital twins evolve with live data. They allow decision-makers to respond to changing conditions with speed and precision. Whether it’s predicting how floodwaters will move through a city or minimizing traffic delays for emergency vehicles, digital twins offer a new way to manage complexity. By blending artificial intelligence, sensor networks, and advanced analytics, Georgia Tech engineers are creating solutions that don’t just react — they prepare, adapt, and improve the systems we rely on every day.

Explore the digital twins in Helluva Engineer magazine.

 
News Contact

Jason Maderer
College of Engineering

Georgia Insurance Claims Database Provides Health Care Cost Comparisons


Georgia residents now have a new way to compare the estimated costs paid for a large variety of health care services in the state, thanks to a resource created by the Georgia All-Payer Claims Database. (iStock photo)

Georgia residents now have a new way to compare the estimated costs paid for a large variety of health care services in the state, thanks to a searchable “shop for care” resource launched as part of the Georgia All-Payer Claims Database (GA APCD).
 

The Georgia APCD Cost Comparison Tool (apcd.georgia.gov/cost-comparison-tool) contains information on more than 200 different medical procedures ranging from cardiac stress tests and childbirth to knee replacement and colonoscopies. The resource provides information on the median cost paid for the procedures statewide, along with information on what individual medical facilities and professional providers have been paid for each type of procedure. 
 

For each procedure, the tool identifies medical facility providers nearest to the consumer, and includes facility ratings collected by the Centers for Medicare & Medicaid Services (CMS). For each facility providing a specific service, the comparison data includes the median cost for the procedure and the range of costs that were paid. Costs can be filtered by payer category, including commercial, Medicare, and Medicaid. While that data is understandably incomplete and includes caveats, developers of the new service say it provides a much-needed resource for Georgians facing a decision on a costly medical procedure.
 

“In health care, there are a lot of factors that can drive cost, and it’s not always a straightforward equation, so it’s worth doing the research,” said Jon Duke, M.D., principal research scientist in the Georgia Tech Research Institute’s (GTRI) Health Emerging and Advanced Technologies Division, which administers the APCD for the state of Georgia. “This is really just one part of health care decision-making, and it will help patients be more proactive advocates for themselves when considering potential options for care.”
 

Read more about this project on the GTRI home page.

 

 
News Contact

gtrimedia@gtri.gatech.edu

Wearing the Future


Worn on the neck, and paired with a smartphone, these haptic actuators designed in Matt Flavin's lab can help people with vision loss navigate their environment. (Photo: Chris McKenney)

If you walked through the Smithsonian American History Museum in the mid-2000s, you might have seen the “Smart Shirt,” the very first garment to seamlessly combine textiles and electronics.

Dubbed a “wearable motherboard,” it acted as a hub for sensors that could collect a range of biometric data.

That shirt foretold a future where health and biometric data could be collected unobtrusively through wearable technology. And it was created by engineers at Georgia Tech.

“What we have is all these nice data buses that are the fabric threads. And we can connect any kind of sensors to them,” said Professor Sundaresan Jayaraman, the shirt’s co-creator. “We were able to route information in a fabric for the first time, just like a typical computer motherboard. That’s why we called it the ‘wearable motherboard.’”

Jayaraman and Sungmee Park created the shirt in response to a Defense Advanced Research Projects Agency (DARPA) call for ideas to protect soldiers in battle. They envisioned a comfortable, flexible garment infused with fiber optics to detect gunshot wounds and vital signs. The data would help medics rapidly triage battlefield injuries in the critical minutes when emergency care is the difference between life and death.

Creating a shirt made it easy: no bulky electronics to add to the gear soldiers carried. Just a piece of clothing to wear under their fatigues. Park and Jayaraman developed a way to weave the garment on a loom, making mass production and consistency far easier.

The original sleeveless shirt is tucked into the Smithsonian archives now. But it’s possible to follow the thread of that first smart textile to the work happening in the pair’s School of Materials Science and Engineering (MSE) lab today. 

Read the full story in Helluva Engineer magazine.

 
News Contact

Joshua Stewart
College of Engineering

Hacking the Grid: How Digital Sabotage Turns Infrastructure Into a Weapon

Today’s power grid equipment incorporates internet-connected – and therefore hackable – computers. Joe Raedle/Getty Images


The darkness that swept over the Venezuelan capital in the predawn hours of Jan. 3, 2026, signaled a profound shift in the nature of modern conflict: the convergence of physical and cyber warfare. While U.S. special operations forces carried out the dramatic seizure of Venezuelan President Nicolás Maduro, a far quieter but equally devastating offensive was taking place in the unseen digital networks that help operate Caracas.

The blackout was not the result of bombed transmission towers or severed power lines but rather a precise and invisible manipulation of the industrial control systems that manage the flow of electricity. This synchronization of traditional military action with advanced cyber warfare represents a new chapter in international conflict, one where lines of computer code that manipulate critical infrastructure are among the most potent weapons.

To understand how a nation can turn an adversary’s lights out without firing a shot, you have to look inside the controllers that regulate modern infrastructure. They are the digital brains responsible for opening valves, spinning turbines and routing power.

For decades, controller devices were considered simple and isolated. Grid modernization, however, has transformed them into sophisticated internet-connected computers. As a cybersecurity researcher, I track how advanced cyber forces exploit this modernization by using digital techniques to control the machinery’s physical behavior.

Hijacked Machines

My colleagues and I have demonstrated how malware can compromise a controller to create a split reality. The malware intercepts legitimate commands sent by grid operators and replaces them with malicious instructions designed to destabilize the system.

For example, malware could send commands to rapidly open and close circuit breakers, a technique known as flapping. This action can physically damage massive transformers or generators by causing them to overheat or go out of sync with the grid. These actions can cause fires or explosions that take months to repair.

Simultaneously, the malware calculates what the sensor readings should look like if the grid were operating normally and feeds these fabricated values back to the control room. The operators likely see green lights and stable voltage readings on their screens even as transformers are overloading and breakers are tripping in the physical world. This decoupling of the digital image from physical reality leaves defenders blind, unable to diagnose or respond to the failure until it is too late.
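The split-reality attack described above can be sketched as a toy simulation. This is purely illustrative — the class names, telemetry fields, and values here are hypothetical, not real industrial control system code — but it shows the core idea: the physical state and the reported state are driven by two different code paths.

```python
# Toy model of a "split reality" attack: a compromised controller flaps a
# circuit breaker while reporting fabricated "normal" telemetry to operators.
# All names and values are illustrative, not drawn from any real ICS.

class Breaker:
    """Physical breaker; rapid switching accumulates mechanical stress."""
    def __init__(self):
        self.closed = True
        self.stress = 0  # count of state changes (wear from flapping)

    def set(self, closed: bool):
        if closed != self.closed:
            self.stress += 1
        self.closed = closed


class CompromisedController:
    """Intercepts operator commands and substitutes malicious ones."""
    def __init__(self, breaker: Breaker):
        self.breaker = breaker

    def handle_command(self, operator_wants_closed: bool) -> dict:
        # Ignore the legitimate command; toggle (flap) the breaker instead.
        self.breaker.set(not self.breaker.closed)
        # Fabricate telemetry consistent with what the operator expects.
        return {"breaker_closed": operator_wants_closed, "voltage_kv": 230.0}


breaker = Breaker()
controller = CompromisedController(breaker)

# The operator repeatedly commands "stay closed" and sees normal readings,
# while the breaker is actually flapping and accumulating damage.
for _ in range(9):
    telemetry = controller.handle_command(operator_wants_closed=True)

print(telemetry)        # {'breaker_closed': True, 'voltage_kv': 230.0}
print(breaker.closed)   # False — actual state diverges from reported state
print(breaker.stress)   # 9 switching cycles of physical wear
```

The key design point of the attack is visible in `handle_command`: the return value (what defenders see) is computed from the operator's intent, not from the device's actual state, which is why the control room stays green while the hardware degrades.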


Today’s electrical transformers are accessible to hackers. GAO

Historical examples of this kind of attack include the Stuxnet malware that targeted Iranian nuclear enrichment plants. The malware destroyed centrifuges in 2009 by causing them to spin at dangerous speeds while feeding false “normal” data to operators.

Another example is the Industroyer attack by Russia against Ukraine’s energy sector in 2016. Industroyer malware targeted Ukraine’s power grid, using the grid’s own industrial communication protocols to directly open circuit breakers and cut power to Kyiv.

More recently, the Volt Typhoon attack by China against the United States’ critical infrastructure, exposed in 2023, was a campaign focused on pre-positioning. Unlike traditional sabotage, these hackers infiltrated networks to remain dormant and undetected, gaining the ability to disrupt the United States’ communications and power systems during a future crisis.

To defend against these types of attacks, the U.S. military’s Cyber Command has adopted a “defend forward” strategy, actively hunting for threats in foreign networks before they reach U.S. soil.

Domestically, the Cybersecurity and Infrastructure Security Agency promotes “secure by design” principles, urging manufacturers to eliminate default passwords and utilities to implement “zero trust” architectures that assume networks are already compromised.

Supply Chain Vulnerability

A further vulnerability lurks within the supply chain of the controllers themselves. A dissection of firmware from major international vendors reveals a significant reliance on third-party software components to support modern features such as encryption and cloud connectivity.

This modernization comes at a cost. Many of these critical devices run on outdated software libraries, some of which are years past their end-of-life support, meaning they’re no longer supported by the manufacturer. This creates a shared fragility across the industry. A vulnerability in a single, ubiquitous library like OpenSSL – an open-source software toolkit used worldwide by nearly every web server and connected device to encrypt communications – can expose controllers from multiple manufacturers to the same method of attack.

Modern controllers have become web-enabled devices that often host their own administrative websites. These embedded web servers present an often overlooked point of entry for adversaries.

Attackers can infect the web application of a controller, allowing the malware to execute within the web browser of any engineer or operator who logs in to manage the plant. This execution enables malicious code to piggyback on legitimate user sessions, bypassing firewalls and issuing commands to the physical machinery without requiring the device’s password to be cracked.

The scale of this vulnerability is vast, and the potential for damage extends far beyond the power grid, including transportation, manufacturing and water treatment systems.

Using automated scanning tools, my colleagues and I have discovered that the number of industrial controllers exposed to the public internet is significantly higher than industry estimates suggest. Thousands of critical devices, from hospital equipment to substation relays, are visible to anyone with the right search criteria. This exposure provides a rich hunting ground for adversaries to conduct reconnaissance and identify vulnerable targets that serve as entry points into deeper, more protected networks.

The success of recent U.S. cyber operations forces a difficult conversation about the vulnerability of the United States. The uncomfortable truth is that the American power grid relies on the same technologies, protocols and supply chains as the systems compromised abroad.


Regulatory Misalignment

The domestic risk, however, is compounded by regulatory frameworks that struggle to address the realities of the grid. A comprehensive investigation into the U.S. electric power sector my colleagues and I conducted revealed significant misalignment between compliance with regulations and actual security. Our study found that while regulations establish a baseline, they often foster a checklist mentality. Utilities are burdened with excessive documentation requirements that divert resources away from effective security measures.

This regulatory lag is particularly concerning given the rapid evolution of the technologies that connect customers to the power grid. The widespread adoption of distributed energy resources, such as residential solar inverters, has created a large, decentralized vulnerability that current regulations barely touch.

Analysis supported by the Department of Energy has shown that these devices are often insecure. By compromising a relatively small percentage of these inverters, my colleagues and I found that an attacker could manipulate their power output to cause severe instabilities across the distribution network. Unlike centralized power plants protected by guards and security systems, these devices sit in private homes and businesses.

Accounting for the Physical

Defending American infrastructure requires moving beyond the compliance checklists that currently dominate the industry. Defense strategies now require a level of sophistication that matches the attacks. This implies a fundamental shift toward security measures that take into account how attackers could manipulate physical machinery.

The integration of internet-connected computers into power grids, factories and transportation networks is creating a world where the line between code and physical destruction is irrevocably blurred.

Ensuring the resilience of critical infrastructure requires accepting this new reality and building defenses that verify every component, rather than unquestioningly trusting the software and hardware – or the green lights on a control panel.

 

This article is republished from The Conversation under a Creative Commons license. Read the original article.

 
News Contact
Author:

Saman Zonouz, Associate Professor of Cybersecurity and Privacy and Electrical and Computer Engineering, Georgia Institute of Technology

Media Contact:

Shelley Wunder-Smith
shelley.wunder-smith@research.gatech.edu