Photochemistry and a New Catalyst Could Make Fertilizer More Sustainable

Closeup view of a red tractor spreading fertilizer pellets in a field.

Georgia Tech engineers are working to make fertilizer more sustainable — from production to productive reuse of the runoff after application — and a pair of new studies is offering promising avenues at both ends of the process.

In one paper, researchers have unraveled how nitrogen, water, carbon, and light can interact with a catalyst to produce ammonia at ambient temperature and pressure, a much less energy-intensive approach than current practice. The second paper describes a stable catalyst able to convert waste fertilizer back into nonpolluting nitrogen that could one day be used to make new fertilizer.

Significant work remains on both processes, but the senior author on the papers, Marta Hatzell, said they’re a step toward a more sustainable cycle that still meets the needs of a growing worldwide population.

“We often think it would be nice not to have to use synthetic fertilizers for agriculture, but that’s not realistic in the near term considering how much plant growth is dependent on synthetic fertilizers and how much food the world’s population needs,” said Hatzell, associate professor in the George W. Woodruff School of Mechanical Engineering. “The idea is that maybe one day you could manufacture, capture, and recycle fertilizer on site.”

Get the full story on the College of Engineering website.

 
News Contact

Joshua Stewart
College of Engineering

Why Your Scissors Glide (or Don't) When You're Wrapping Presents

Wrapping Presents


In the hustle and bustle of the holidays, a moment of transcendence can happen as you wrap presents: scissors in hand, cutting a piece of wrapping paper from the roll, the blades hit their stride and slide from end to end.

Why is it that sometimes the scissors glide, and other times the paper tears a dozen times? Christopher Luettgen says it all has to do with paper quality.

“Good wrapping paper is going to have a prettier surface. It may even have a textured surface, maybe embossed or more three dimensional,” said Luettgen, a professor of the practice with the Renewable Bioproducts Institute and an expert on paper.

High-quality wrapping paper is made from softwood pulp — in particular, the strongest pulp comes from southern pine.

“The really good paper starts with softwood fiber,” he said. “Softwood kraft in particular — ‘kraft’ being an old German word for ‘strong.’ It’s going to be stiffer and stronger in multiple directions. Then it gets coated so you get a nice clay coating on the surface, which will smooth the surface to get it beautifully printed. When you come across weak paper that wants to tear very easily, it is often made with mechanical fibers.”

So, if you want the glide, you want good paper. When might it be worth skimping on quality?

“If you’ve got a big job, like you want to wrap a TV or a large game or something like that, you don’t want to spend a lot of money on the high-end wrapping papers. It’s going to get torn up pretty fast. That’s when you might go with a cheaper, thinner brand.”

Of course, as Luettgen notes, you can’t tear the paper in the store, but looking for a thicker paper is a good start. The thicker paper will also give your presents a more refined look under the tree.

“Let’s say you’re giving a book to somebody. You want nice tight corners. You want good creasing. You really want to make it showy.”

Why, then, does Santa sometimes not wrap his presents? Luettgen believes it’s all a matter of resources leading up to Christmas Eve.

“If he has enough help at his studio, I would think that he’s going to get all of your presents wrapped. But if he’s rushed, with bad weather for instance, he may have to come down the chimney with the presents unwrapped, but they’ll be under the tree.”

 
News Contact

Kristen Bailey

Institute Communications

Remembering Research Scientist Paul Manuel Aviles Baker

Portrait of Paul Manuel Aviles Baker


Like those of many senior scholars, Paul M.A. Baker’s CV runs more than 30 pages, detailing a career’s worth of research, service, and accomplishments. It’s on page two, however, where you may get the strongest sense of Baker’s intellect. He accumulated an eclectic and impressive collection of degrees, five in all, ranging from zoology to theology and bookending his Ph.D. in public policy.

That kind of dedication to learning was quintessential Baker, as was his commitment to helping lift up those around him, especially junior researchers, said Victoria Razin, a senior research engineer at the Georgia Tech Research Institute. She became a friend and mentee of Baker’s after working with him for a year on voting machine accessibility.

“Paul was an incredibly thoughtful researcher, a kind friend, and an incredible mentor who built up the people around him,” Razin said.

Baker, the senior director for research and strategic innovation at the Center for Advanced Communications Policy, passed away suddenly last week after a brief medical emergency, leaving behind an enormous void for his family, friends, and coworkers, as well as a tremendous legacy.

“Paul was like no one else I have met,” said Regents’ Researcher W. Bradley Fain, CACP’s executive director and Baker’s boss since 2019. “To be able to describe Paul succinctly is impossible.”

From Zoology to Technology Policy

After graduating from college with a degree in zoology, Baker worked as an environmental scientist, in real estate, and as a publisher, in addition to later academic roles at George Mason University and Saint Mary’s College. He joined Georgia Tech in 1999 as a visiting assistant professor, where he taught Research Design for the Policy Sciences, American Government, and more.

Two years later, he joined CACP as associate director for policy research, and four years after that he became director of research. In 2011, he was named associate director of the Center for 21st Century Universities, where he oversaw strategic policy initiatives and managed the Center’s policy-focused sponsored research projects. After three years, he returned full-time to CACP, where he was appointed senior director for research and strategic innovation.

In 2020, he took on a new role when the Center for the Development and Application of Internet of Things Technologies moved from GTRI to CACP. Paul became the organization’s chief operations officer, where he worked to further the Center’s mission to spur technology and policy innovation in the internet of things sphere.

“Paul was a wonderful advisor, helping me work through really complicated issues,” Fain said. “Every conversation was an opportunity for him to share knowledge.”

Kaye Husbands Fealing, dean and Ivan Allen Jr. Chair in the Ivan Allen College of Liberal Arts, said Baker was an accomplished researcher who was deeply committed to expanding technology and workforce accessibility for everyone.

“We worked together a few years ago on a project with my research assistant, Andrew Hanus, and Connie McNeely of George Mason University to broaden participation in STEM employment for people with disabilities, and he took the initiative to lead a workshop on how veterans could gain STEM skills. I will miss his keen insight, his passion for his scholarship, and his generosity.”

Regents’ Researcher Emeritus Helena Mitchell, former executive director of CACP, said Baker was the Center’s most published employee whose contributions at Georgia Tech and around the world will continue to be felt.

“He was an excellent researcher, a great networker, a man of passion, integrity, and knowledge,” she said.

She and Baker were close friends for over 20 years, frequently hanging out together before Baker moved to Canada to be with his husband. She said she will miss their wide-ranging discussions over cosmopolitans. 

“He’s like a brother to me,” she said. 

Promoting Equal Access

In each of his roles, Baker approached his work with enormous curiosity, rigor, and a genuine desire to leave the world a better place, said Nathan Moon, director of research at CACP, who worked with Baker for nearly two decades.

“Paul was committed to doing research that would promote equal access for all people,” Moon said.

It shows in his publishing record, where you’ll find papers such as “Wireless Technologies and Accessibility for People with Disabilities: Findings from a Policy Research Instrument”; “E-Accessibility and Municipal Wifi: Exploring a Model for Inclusivity and Implementation”; and “Digital Tech for Inclusive Aging: Usability, Design and Policy.”

In the last few years, he worked with Moon to develop a new seminar course, Policy Innovation for Inclusive Technologies, as part of a grant to develop a new postdoctoral training program for scholars interested in disability and accessible technology policy.

They taught the course together in the recently concluded Fall semester.

“In addition to being an excellent researcher, Paul was a wonderful educator,” Moon said. “He loved teaching and had high hopes and expectations for students, just as he did for junior researchers.”

But Baker’s personality and approach to other people especially set him apart, Razin said.

He had a way of connecting with people that made them feel special. Baker, for instance, was a Quaker who also practiced Buddhism, yet he always took time to send holiday greetings in correct Hebrew to Razin, who is Jewish.

“That was so special,” she said.

Moon said Baker’s legacy will continue to motivate him and other research scientists at CACP and across Georgia Tech who were touched by Baker’s intellect, curiosity, and drive.

“I can say confidently that as both a research scientist and person, Paul left the world a better place than he found it. He was a good friend, and he’ll be missed.”

 
News Contact

Michael Pearson
Ivan Allen College of Liberal Arts

Coskun Lab Pioneering New Field of Research: Single Cell Spatial Metabolomics

Spatial metabolomics

Images of time in space: The top panel image shows pseudo-time single cell metabolic trajectories across distinct biogeographical regions. The dark purple represents early metabolic changes, while the bright yellow represents later metabolic activities. The bottom panel is a spatial projection of single cells’ metabolic trajectories (denoted by arrows in the dark zone and light zone regions) in tonsil tissue. Photo provided by Coskun Lab

Ahmet Coskun and his collaborators plan to create a chemical atlas of all the immune cells in the human body, a 3D micromap to help clinicians navigate the complex role of the entire immune system in the presence of different diseases. 

It’s the kind of massive undertaking that would result in vastly improved precision therapies for patients. And it’s the kind of journey that starts with a single cell. Coskun and team are off to a fast start with the introduction of a new integrative technique for profiling human tissue that enables researchers to capture the geography, structure, movement, and function of molecules in a 3D picture. 

The researchers described their new approach, the Single Cell Spatially resolved Metabolic (scSpaMet) framework, in the journal Nature Communications on Dec. 13. The study builds on a technique Coskun’s team developed and described in a 2021 article, “3D Spatially resolved Metabolomic profiling Framework,” published in Science Advances. In that work, the team introduced a technique that measures the activity of metabolites and proteins as part of a comprehensive profile of human tissue samples. 

“Earlier we couldn’t achieve single-cell resolution, but with this new approach, we can,” said Coskun, Bernie Marcus Early Career Professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University. “With this new approach, we can get spatial details of proteins and metabolites in single cells – no one else has yet reached this level of high subcellular resolution.”

He added, “We’re pioneering a new field of research with this work, single cell spatial metabolomics.”

A Bigger, Better Molecular Picture

Human tissue is spatially crowded with all kinds of stuff, so investigators need tools that can see clearly into, through, and around that multilayered biological traffic – everything, all at once, in high-definition 3D. With scSpaMet, Coskun’s team can capture single cell details such as naturally occurring lipids and proteins, as well as metabolites (with their multiple functions, including energy conversion and cell signaling). It also captures details provided by researchers: intracellular and surface markers used to label and track cell activity and behavior.

The team broadened the scope of this study, extending its investigation beyond human tonsil tissue. 

“We showed the crucial role of immune cells in lung cancer for the study of immunometabolism of T cells and macrophages as they interact with tumors,” Coskun said. “Then we created dynamic immune metabolic changes in tonsils as they go through germinal center reactions to give rise to the antibody-producing cells. Finally, we demonstrated the role of immune cells in the endometrium, a membrane in the uterus that might lead to conditions impacting a woman’s health.”

The wide-angled study required plenty of cross-country collaboration with other institutions, with Coskun’s lab guiding the work and integrating its expertise in bioimaging, chemistry, tissue biology, and artificial intelligence.

Cold Spring Harbor Laboratory (New York) provided access to its endometrium tissue bank. Oak Ridge National Laboratory (Tennessee) provided data from its complex metabolic imaging instrumentation, to further demonstrate how single cell spatial metabolomics imaging can generate rich data. 

The University of California-Davis provided kidney biospecimens as both fixed tissue and frozen embedded tissue, in two halves of the same sample, “so we could demonstrate the effect of tissue preparation on the sensitivity of our single cell spatial metabolomics pipeline,” Coskun said.

The team also included Thomas Hu and Mayar Allam, graduate researchers in Coskun’s lab, who guided the research as lead authors, and Walter Henderson, a research scientist who manages the IEN/IMat Materials Characterization Facility at Georgia Tech.

Considering the Whole Person's Biochemistry

The ability to generate single cell spatial metabolic profiling of individual patients can reveal a world of possibility and potential for clinicians who need to fully understand a patient’s biophysical makeup to contrive the best treatment options.

“For example, it can provide mechanisms of how immune responses can be boosted by adding dietary molecules along with immunotherapies,” Coskun said. “It can also help adjust the dose of cell-based treatments, based on the body mass index of individual patients, whether they are obese or not.”

Coskun believes this new arena of single cell metabolomics research his lab is developing will complement the field of single cell genomics, which has led to genomic medicine. His team’s comprehensive exploration and imaging of the geography of normal and unhealthy human tissues – of every single cell – can further explain cellular regulation in ways that were previously overlooked, due to the lack of technology.

He envisions a future in which a patient’s BMI, dietary habits, and exercise commitments, along with their single cell spatial metabolomic atlas of disease progression, will be analyzed all together to find optimum therapies that can work with biologics and metabolic boosting regimens, potentially increasing survival rates for cancers, women’s diseases, and metabolic disorders.

“We will have opportunities to talk about spatial single cell metabolomic medicine, to stratify patients and design next-generation combination therapies with an integrated view of genes and chemical activity roadmaps, for more efficient management of cancer and other diseases,” Coskun said.

In creating their scSpaMet framework, the researchers must integrate expensive machines that live in the worlds of nanotechnology and chemistry right now. The system will require clinical-friendly optimizations to be able to run single cell metabolic imaging measurements in healthcare settings. Coskun expects the cost and user-friendliness will be improved in the near future to reach the bedside.

“When researchers achieved single cell sequencing, it was a revolutionary moment in medicine,” Coskun said. “Now, we believe single cell spatial metabolic profiling will push the medical practice into new heights.” 

This research was supported by the Burroughs Wellcome Fund and the Bernie Marcus Early Career Professorship, as well as the National Science Foundation (Grants ECCS-1542174 and ECCS-2-25462), the American Cancer Society, and National Institutes of Health grants R21AG081715, R21AI173900, and R35GM151028.

Citation: Thomas Hu, Mayar Allam, Shuangyi Cai, Walter Henderson, Brian Yueh, Aybuke Garipcan, Anton V. Ievlev, Maryam Afkarian, Semir Beyaz, and Ahmet F. Coskun. “Single-cell spatial metabolomics with cell-type specific protein profiling for tissue systems biology,” Nature Communications (Dec. 13, 2023)

Mayar and Thomas

Lead authors Mayar Allam and Thomas Hu

 

Ahmet Coskun photo

Ahmet Coskun

 
News Contact

Researchers Find They Can Stop Degradation of Promising Solar Cell Materials

3D illustration of diamond-shaped perovskite structure in long rows stacked in two layers.

An illustration of metal halide perovskites. They are a promising material for turning light into energy because they are highly efficient, but they also are unstable. Georgia Tech engineers showed in a new study that both water and oxygen are required for perovskites to degrade. The team stopped the transformation with a thin layer of another molecule that repelled water. (Image Courtesy: Juan-Pablo Correa-Baena)

Georgia Tech materials engineers have unraveled the mechanism that causes degradation of a promising new material for solar cells — and they’ve been able to stop it using a thin layer of molecules that repels water.

Their findings are the first step in solving one of the key limitations of metal halide perovskites, which are already as efficient as the best silicon-based solar cells at capturing light and converting it into electricity. They reported their work in the Journal of the American Chemical Society.

“Perovskites have the potential of not only transforming how we produce solar energy, but also how we make semiconductors for other types of applications like LEDs or phototransistors. We can think about them for applications in quantum information technology, such as light emission for quantum communication,” said Juan-Pablo Correa-Baena, assistant professor in the School of Materials Science and Engineering and the study’s senior author. “These materials have impressive properties that are very promising.”

Get the full story on the College of Engineering website.

 
News Contact

Joshua Stewart
College of Engineering

ARCM Facilitates Update of Radio Control System for Army’s UH-60M

GTRI Senior Research Engineer Scott Tompkins is shown reconfiguring an Air Ground Networking Radio (AGNR) for testing

GTRI Senior Research Engineer Scott Tompkins is shown reconfiguring an Air Ground Networking Radio (AGNR) for testing at a lab bench. (Credit: Sean McNeil)

Using a model-based systems engineering (MBSE) approach, researchers from the Georgia Tech Research Institute (GTRI) are developing the software necessary to integrate new control, radio, and cryptographic capabilities into UH-60M Black Hawk helicopters, which are mainstays of the U.S. Army’s helicopter fleet.

The Aviation Radio Control Manager (ARCM) software will enable the sustainment of enduring fleet aircraft by employing a Modular Open Systems Approach (MOSA) to replace obsolete, out-of-production radio equipment and set the stage for future communications suite enhancements. The reusable and adaptable ARCM software is projected to be employed on additional Army aircraft in the future, providing benefits of software reuse that can be leveraged for future efforts.

Now in its third round of software development, ARCM is due to be flight-tested next summer and installed on the first group of UH-60M aircraft in 2025. The project, supported by the U.S. Army’s PEO Aviation in Huntsville, Alabama, will comply with the service’s Future Airborne Capability Environment (FACE™) Technical Standard, Edition 3.1.

Model-based approaches are being used across the Department of Defense (DoD) to accelerate the development of new platforms and updates to existing ones. Beyond reducing costs and getting new capabilities to warfighters more quickly, the process can streamline procurement by clearly spelling out system specifications and key interfaces.

“Model-based approaches have been a very central part of how we’ve approached ARCM, and the return on investment for ARCM generally and for the MBSEs specifically, is based largely on a business case in which you spend a little more to get the models in place and design the system to interface with multiple components,” said Scott Tompkins, a GTRI senior research engineer who leads the project. “Investments in MBSE can provide huge savings when you reuse the work for other systems and shorten the cycle times to bring new capabilities to aircraft platforms.”

In this first application, the ARCM software will facilitate three major improvements for the UH-60M: (1) replacement of the control head unit (CHU) that aircrews use to operate radio equipment, (2) replacement of an obsolete tactical communications radio, and (3) upgrade of cryptographic systems used for secure communications. The replacement radio hardware, which is being built by multiple vendors, interfaces with the aircraft’s unmodified flight management system (FMS) via the ARCM.

“The aircraft needed a new radio, but the Army doesn’t necessarily desire to change the approved and fielded Black Hawk FMS Operational Flight Program (OFP) to integrate that radio,” Tompkins said. “In this project, we are translating the radio’s interface, so they don’t have to change the main aircraft software. This will address three issues at once through software.”

Two different radios with comparable functionality will be available as options for replacing the existing ARC-201D unit. The ARCM software will make the difference between those two alternatives invisible to aircrews and other systems in the aircraft. The software will also allow transparent substitution of radio equipment on Black Hawks used by foreign nations, and it is designed for future support of alternate radio equipment used by National Guard Black Hawks for collaboration with civil defense and domestic first responder agencies.
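
The “invisible substitution” described above is, in software terms, an adapter layer: the aircraft-facing side talks to one fixed interface, and per-radio translation code hides each vendor’s differences. The sketch below illustrates only the general pattern — the class names and vendor command formats are entirely hypothetical, as ARCM’s real interfaces are not public.

```python
# Illustrative adapter pattern (hypothetical; not actual ARCM code).
# Calling code depends only on RadioAdapter, so either vendor's radio
# can be swapped in without changing the aircraft-facing software.
from abc import ABC, abstractmethod

class RadioAdapter(ABC):
    """Common interface the unchanged aircraft-facing layer relies on."""
    @abstractmethod
    def set_frequency(self, mhz: float) -> None: ...
    @abstractmethod
    def status(self) -> str: ...

class VendorARadio(RadioAdapter):
    def set_frequency(self, mhz: float) -> None:
        # Vendor A's (made-up) text command syntax
        self._last_cmd = f"TUNE {mhz:.3f}"
    def status(self) -> str:
        return "OK"

class VendorBRadio(RadioAdapter):
    def set_frequency(self, mhz: float) -> None:
        # Vendor B's (made-up) structured message format
        self._last_cmd = {"cmd": "freq", "khz": int(mhz * 1000)}
    def status(self) -> str:
        return "OK"

def tune(radio: RadioAdapter, mhz: float) -> str:
    # The caller never knows which vendor's radio is installed.
    radio.set_frequency(mhz)
    return radio.status()

print(tune(VendorARadio(), 30.125))  # OK
print(tune(VendorBRadio(), 30.125))  # OK
```

Because both adapters satisfy the same interface, the difference between the two radios is invisible to the code above them — the same property the article describes for aircrews and other aircraft systems.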

“From the models, we generated the vast majority of the code used in the ARCM, and that code meets the FACE Edition 3.1 standard for MOSA software,” Tompkins said. “We have also deployed a development, security, and operations (DevSecOps) pipeline to support our software repository and perform automated testing of the products as part of best practices in software development and acquisition. We are also doing full end-to-end information assurance accreditation.”

Though ARCM has so far been applied only to the UH-60M, the work could also be used with CH-47F Chinook and AH-64 Apache helicopters, as well as the Gray Eagle uncrewed aircraft system (UAS). The Army’s Future Vertical Lift (FVL) platforms could also take advantage of the modeling done for ARCM.

“The FACE model provides the ability to unambiguously communicate about interfaces,” Tompkins said. “We have all the contextual meaning for the data so that when we hand this over, there’s no question about what the data is and how to interpret the messages. We have captured all of that in the model.”

Beyond ensuring compatibility with existing Black Hawk systems, GTRI is also making sure the replacement interface – graphics and buttons that control the radio equipment – makes sense to the aircrews that will use it. “We recently completed another round of crew station working group meetings where we had pilots review our graphical user interface (GUI) and the functionality,” said Tompkins. “It was very encouraging, and we continue to get positive user feedback.”

GTRI is scheduled to deliver its full technical data package (TDP) to the Army in January 2024. The ARCM program will submit the software and its associated development artifacts to the Army for an airworthiness qualification to a DO-178C Design Assurance Level ‘C’ level of rigor in Q3 of fiscal year 2024. It will then be reviewed for a first test flight in early summer of that year. Once flight testing is over, ARCM and the new hardware can begin rolling out to Army units in 2025.

GTRI expects to be part of the test flights and then move on to support the development of additional capabilities, including new waveforms being developed by the radio vendors. Discussions are also underway regarding potential applications to other Army rotorcraft.

“Our goal is to have an ARCM release annually that brings new capabilities,” Tompkins said. “With software-defined radios, the vendors are constantly innovating and improving waveforms. We want to get those enhancements out to aircrews as soon as possible.”

The ARCM program has involved multiple labs within GTRI, as well as Tucson Embedded Systems, which is a FACE Verification Authority.

“We have put together a great multidisciplinary team of modelers, software developers, information assurance experts, human factors specialists, and human systems engineers,” Tompkins said. “It’s been a spectacular project – working with a wonderful team – and I’m really excited to see the first test flight.”

DISCLAIMER: This article contains views and opinions that are not official U.S. Army positions.
 

Writer: John Toon (john.toon@gtri.gatech.edu)  
GTRI Communications  
Georgia Tech Research Institute  
Atlanta, Georgia

The Georgia Tech Research Institute (GTRI) is the nonprofit, applied research division of the Georgia Institute of Technology (Georgia Tech). Founded in 1934 as the Engineering Experiment Station, GTRI has grown to more than 2,900 employees, supporting eight laboratories in over 20 locations around the country and performing more than $940 million of problem-solving research annually for government and industry. GTRI's renowned researchers combine science, engineering, economics, policy, and technical expertise to solve complex problems for the U.S. federal government, state, and industry.

 

 

AGNR control head unit (CHU)

AGNR control head unit (CHU) showing the pilot vehicle interface (PVI) for the GTRI-developed Aviation Radio Control Manager (ARCM) software. (Credit: Sean McNeil)

 
News Contact

Michelle Gowdy

(Interim) Director of Communications

Michelle.Gowdy@gtri.gatech.edu

404-407-8060

GTRI, Children’s Healthcare of Atlanta and Emory Use Wearable Sensors to Address Healthcare Worker Burnout

GTRI and CHOA Research Team

The team leading this project includes, from left to right: GTRI Senior Research Scientist Khatereh Hadi, Children's pediatric cardiologist Dr. Michael Fundora, GTRI Senior Research Engineer Paula Gomez, GTRI Senior Research Scientist Matthew Swarts, and Children's Director of Nursing & Allied Health Research and Evidence Based Practice Christina Calamaro, who is also an associate professor at Emory’s Nell Hodgson Woodruff School of Nursing (Photo Credit: Sean McNeil, GTRI).

Healthcare worker burnout, a topic that received significant attention during COVID-19, continues to pose risks for the nation’s health and economic wellbeing. 

In 2022, nearly half of healthcare workers reported feeling burned out, up from 32% in 2018, and the number of healthcare workers who intended to look for a new job increased by 33% over that same time period, according to a recent report from the Centers for Disease Control and Prevention (CDC). Annual burnout-related turnover costs are estimated to be $9 billion for nurses and $2.6 billion to $6.3 billion for physicians, per the U.S. Surgeon General. 

To address this challenge, the Georgia Tech Research Institute (GTRI), Children’s Healthcare of Atlanta, and Emory University’s Nell Hodgson Woodruff School of Nursing have conducted a study using wearable sensors to better understand how the interplay of workload, stress, and sleep contributes to an elevated risk of burnout among healthcare workers and how to mitigate those risks going forward. 

The group recently measured real-time movement patterns of physicians and nurses in the cardiac intensive care unit (CICU) at Children’s and collected data on their stress levels, work and sleep cycles, healthcare delivery and perceived workloads. The goal of the study is to develop a methodology that can be used by other healthcare systems across the state to minimize turnover costs by better predicting and addressing factors that trigger burnout. 

“Our ultimate goal with this project is to be able to offer our methodology framework to other healthcare systems throughout Georgia so that they can identify and address the specific challenges they are facing on a more granular level,” said Khatereh Hadi, a senior research scientist at GTRI who is leading this project. 

To measure stress, workload and sleep among the study participants, the team used actigraphy sensors developed by Empatica, a spin-off of Massachusetts Institute of Technology (MIT) that designs and develops artificial intelligence (AI) systems to monitor human health through wearable sensors. 

“These sensors are among the few on the market that let you directly download the data you collect,” explained GTRI Senior Research Scientist Matthew Swarts, who led the sensor development aspects of this project. 

The participants also wore tags that were connected to ultra-wideband (UWB) sensor systems installed in the ceiling of the CICU to track their movements throughout their shifts. 

“Because UWB takes up more radio frequency space, it avoids interference issues that affect other technologies such as Wi-Fi and Bluetooth. This allowed us to have more penetration and better accuracy,” Swarts said. 

The study collected data on 40 total participants, who were evaluated over a four-week time period. The team also used the NASA Task Load Index (NASA-TLX), a widely used assessment tool that rates perceived workload, to gather data on the participants’ workload perceptions. 
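
NASA-TLX combines six subscale ratings — mental, physical, and temporal demand, performance, effort, and frustration, each scored 0–100 — into a single workload score; in the weighted version, each subscale’s weight comes from 15 pairwise comparisons. As a rough illustration of the standard scoring arithmetic (not the study’s actual analysis code, and with made-up ratings), it might look like:

```python
# Hypothetical NASA-TLX scoring sketch with invented example data.
SUBSCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def nasa_tlx_raw(ratings):
    """Raw TLX: simple mean of the six 0-100 subscale ratings."""
    return sum(ratings[s] for s in SUBSCALES) / len(SUBSCALES)

def nasa_tlx_weighted(ratings, weights):
    """Weighted TLX: each rating is weighted by how often that subscale
    was chosen in the 15 pairwise comparisons (weights sum to 15)."""
    assert sum(weights.values()) == 15
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15

ratings = {"mental": 80, "physical": 30, "temporal": 70,
           "performance": 40, "effort": 65, "frustration": 55}
weights = {"mental": 5, "physical": 1, "temporal": 4,
           "performance": 2, "effort": 2, "frustration": 1}

print(nasa_tlx_raw(ratings))                 # mean of the six ratings
print(nasa_tlx_weighted(ratings, weights))   # 65.0
```

The pairwise weighting is what lets the instrument reflect that, for example, temporal demand may dominate a CICU nurse’s perceived workload even when other ratings are similar.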

Paula Gomez, a GTRI senior research engineer who led the development of the project’s research methodology, said it was rewarding bringing the theoretical aspects of this project into practical application.

“Since GTRI is the applied research arm of Georgia Tech, it is really important for us to have access to a real-world environment to test and validate the theoretical research,” Gomez said. 

GTRI conducted this study with Dr. Michael Fundora, a pediatric cardiologist at Children’s who specializes in congenital heart disease and clinical research, and Christina Calamaro, the Director of Nursing & Allied Health Research and Evidence Based Practice at Children’s and an associate professor at Emory’s Nell Hodgson Woodruff School of Nursing. 

Fundora and Calamaro noted that current data collection methods that examine healthcare worker burnout are retrospective and may miss certain nuances that are crucial for developing a comprehensive understanding of the issue. 

“A lot of the literature that’s been done in this area looks at big data sets that, for the most part, aren’t in real time,” said Calamaro. “This is one study that’s able to quantify what are the factors that may impact care at the current time and can set the stage, with the use of technology, for giving us a better measurement of what issues nurses and physicians are facing, versus going back and doing a secondary analysis of big data.” 

While burnout is commonly perceived as affecting only those experiencing it, left unchecked it can also lead to diminished patient care and higher mortality rates, said Fundora. 

“People talk about burnout in the sense that it's about the individual, and that's certainly important,” Fundora said. “But we conducted this study to understand how burnout also affects our patients because that's the only way I believe that we're going to get to the root of the problem.” 

Now that the data has been collected, it will be analyzed and interpreted before potential solutions are evaluated. The team agreed that the interdisciplinary nature of the study will help them generate more impactful solutions. 

“As a physician, working on this study opened my eyes to everything I didn’t know about nurses – they are operating very sophisticated, complex equipment and nearly everything they do in the ICU has a life-or-death impact,” said Fundora. “The solution-oriented approach of GTRI also gave me a fresh perspective.” 

Calamaro added: “I think every healthcare study should have an engineer involved in some way because they see things that we as healthcare professionals don’t. It's like, I never thought of that.” 

 

Writer: Anna Akins 
Photos: Sean McNeil 
GTRI Communications
Georgia Tech Research Institute
Atlanta, Georgia

The Georgia Tech Research Institute (GTRI) is the nonprofit, applied research division of the Georgia Institute of Technology (Georgia Tech). Founded in 1934 as the Engineering Experiment Station, GTRI has grown to more than 2,900 employees, supporting eight laboratories in over 20 locations around the country and performing more than $940 million of problem-solving research annually for government and industry. GTRI's renowned researchers combine science, engineering, economics, policy, and technical expertise to solve complex problems for the U.S. federal government, state, and industry.

Wearable Healthcare Sensor

A close-up of the tags and sensors that were used to measure stress, workload, and sleep among the study participants (Photo Credit: Sean McNeil, GTRI).

 
News Contact

(Interim) Director of Communications

Michelle Gowdy

Michelle.Gowdy@gtri.gatech.edu

404-407-8060

Poor and Disadvantaged People Sit in the Dark Longer After a Storm Outage

A young girl wrapped in a blanket holds a candle during a power outage.

Extreme weather events impact disadvantaged communities more harshly, and extended power outages can be dangerous and life-threatening.

Hurricanes and other extreme weather events often affect disadvantaged communities more severely, and extended power outages are some of the most harmful effects. Concerns over the intensification of hurricanes have led to new environmental justice policies that aim to mitigate the unequal impacts of major storms. Now, policy experts and engineers are directing their attention toward identifying the causes of those unequal impacts.

Researchers at the Georgia Institute of Technology sought to investigate whether socioeconomically vulnerable households experienced longer power outage durations after extreme weather events. The team analyzed data from the top eight major Atlantic hurricanes between 2017 and 2020 that knocked out power for over 15 million customers in nine states across the southeastern U.S. The team found that people in lower socioeconomic tiers wait significantly longer to have power restored after a major storm — nearly three hours longer on average.  

The interdisciplinary research team consists of Chuanyi Ji, an associate professor in the School of Electrical and Computer Engineering; Scott Ganz, a policy researcher at Georgetown University and a former Georgia Tech faculty member; and Chenghao Duan, a Ph.D. student in Ji’s lab.

Their research paper, titled “Socioeconomic Vulnerability and Differential Impact of Severe Weather-Induced Power Outages,” was published in the journal PNAS Nexus.

“Not only do extreme weather events impact disadvantaged communities more harshly, but power disruption can be dangerous and even life-threatening in certain contexts,” Ji said.  “Those with fewer resources are limited in their ability to evacuate from severe weather situations, and for individuals with electric medical equipment, an extended power outage can be disastrous.”

Ji, who specializes in large-scale data analytics for power grid resilience, has done previous work on power restoration procedures involving infrastructure and utility services, but wanted to expand the work into the realm of communities. The team hypothesized that disadvantaged communities likely wait longer for power to be restored, but to get a realistic picture of the mechanisms at play, the team needed to analyze troves of data.

They obtained weather data for eight major hurricanes between 2017 and 2020 from the National Oceanic and Atmospheric Administration and additional flood databases. They also examined power failure data for 15 million customers for the same time period, which spanned nine states, 588 counties, and 108 utility service regions in the Southeast.

The team used spatial data analytics to model weather impact across regions. They then measured customers’ socioeconomic status by using the social vulnerability index, a tool produced by the Centers for Disease Control and Prevention that considers indicators related to poverty, housing costs, education, health insurance, and other factors to determine socioeconomic status. Duan and Ji designed the models and estimates, and then analyzed the results to reveal the underlying relationship between customers' socioeconomic status and their power outage durations.

Their results show that, given the same kind of impact from weather events, poor communities experienced power outages averaging 170 minutes longer than those in affluent communities. Specifically, they found that a one-decile drop in socioeconomic status is associated with a 6.1% longer outage duration. Their results indicate a statistically significant relationship between socioeconomic vulnerability and the time that elapses before power is restored.

“Our study also tries to rule out some possible explanations for why socioeconomically disadvantaged people take longer to get their power back on,” Ganz said. “For example, our study controls for population density in a county and the peak number of outages in that county, and we still observe that socioeconomically disadvantaged communities experience longer outages.”

He theorized that the “primary cause is that poorer communities are also likely to be more distant from critical infrastructure or require more significant repairs to power lines, but these are important questions for future research.”

The results have important implications for policymakers, pointing to the necessity of reexamining post-storm recovery and resource allocation policies. Service and utility providers approach power recovery by adhering to procedures and regulations that are policy-driven. This research shows that the standard procedures for restoring power following big storms, while procedurally fair, may contribute to unequal outcomes. A greater focus on communities could help to correct the issue.

“Power grid resilience is not just about the infrastructure and utility companies — it’s also about the people they serve,” Ji said. “Success in achieving policy goals depends on our ability to identify the features that contribute most to these unequal impacts, which can in turn help us design appropriate interventions to improve outcomes.”

 

Funding: The authors acknowledge financial support from the Georgia Tech Energy Innovation and Policy Center and the Strategic Energy Institute, Georgia Tech School of Electrical and Computer Engineering, Georgia Tech School of Public Policy, and Georgetown McDonough School of Business.

Citation: Scott C. Ganz, Chenghao Duan, and Chuanyi Ji, “Socioeconomic Vulnerability and Differential Impact of Severe Weather-Induced Power Outages,” PNAS Nexus, Volume 2, Issue 10, October 2023.

DOI: https://doi.org/10.1093/pnasnexus/pgad295

Chuanyi Ji

Chuanyi Ji, associate professor of Electrical and Computer Engineering at Georgia Tech

Scott Ganz

Scott Ganz, associate teaching professor at Georgetown University and research fellow at the American Enterprise Institute

Chenghao Duan

Chenghao Duan, a Ph.D. student in Ji's lab at Georgia Tech. 

 
News Contact

Catherine Barzler, Senior Research Writer/Editor

catherine.barzler@gatech.edu

Charlotte Alexander Uses NSF Grants to Create an AI-Powered, Publicly Accessible Court Data Platform

Headshot of Charlotte Alexander

Imagine accessing court documents and data, both civil and criminal, in the state of Georgia through a free central repository. Now imagine this access across the entire U.S. court system.

Charlotte Alexander, professor of Law and Ethics at the Georgia Tech Scheller College of Business, is working on a project that uses AI to mine the text of court records. Her work includes pulling key pieces of information out of court documents and making it freely available to attorneys, judges, prosecutors, criminal defendants, civil litigants, journalists, policymakers, researchers, and any member of the public. 

Currently, court records are stored in systems that are expensive, fragmented, outdated, and hard to navigate. Alexander sees a lack of good data as a key problem impeding court reform efforts. Better data, she says, "would shed light on questions around efficiency and time of action, how long things take, and why there are delays. But it also raises big, heavy, substantive questions about bias and who wins and who loses. Does our legal system actually deliver justice, and if so, to whom?"

Her work, funded primarily through National Science Foundation (NSF) grants, is multi-faceted. She and a team of researchers received an initial grant from the NSF’s Convergence Accelerator Project, which was designed to fund efforts to create new sources of data and then make that data publicly available.

Working on the Federal Level

This initial work with colleagues at Georgia State University, Northwestern University, the University of Richmond, and the University of Texas at Austin focused on the federal courts.

"When we started all of this on the federal level, we assembled court records from two full years of all federal cases filed, so everything filed in 2016 and 2017, we downloaded four years later. So, by 2020 and 2021, most of those cases had concluded. Now, we have this big snapshot of federal litigation, including comprehensive data on the progress, pathways, and outcomes of cases that we built using machine and deep learning tools on all those documents," said Alexander.

For example, Alexander provided a small glimpse into how this system might improve court operations. When plaintiffs file a civil case in federal court, they are responsible for a filing fee of $400. The fee can be waived, but individual judges make fee waiver decisions, developing their own separate sets of rules.

The research team's data extracted from court records showed that some judges granted more than 80% of waiver requests, whereas others granted fewer than 20% (https://www.science.org/doi/10.1126/science.aba6914).

In other words, whether a litigant received a fee waiver depended on the luck of the draw – on the judge to whom the case was randomly assigned. This analysis has prompted courts to reconsider their fee waiver procedures to ensure greater consistency.

"We found in our conversations with judges that there's a lot of appetite for this type of system-level knowledge. And by that, I mean, 'I know how I manage the cases in my courtroom, but I don't really have a good way to know how other judges handle similar cases,'" she said.

Working on the State Level

Fast forward a few years, and Alexander is currently working to extend her work beyond the federal courts with funding from the NSF’s Prototype Open Knowledge Network (Proto-OKN) program, which supports the development of "an interconnected network of knowledge graphs supporting a very broad range of application domains."

"We've got all this data that we generated, and now we want to flesh it out further, and then feed it into this larger technical apparatus that the NSF is helping fund, which is the knowledge graph infrastructure," she said. "The NSF wants to map different pockets of knowledge so we might connect, for example, census tract level poverty data to different measures of economic development and economic activity to court data using the concept of a knowledge graph to organize all of these nodes."

Alexander and her collaborators received a $1.5 million grant to continue their work on court data access, but this time, on the state level. They are particularly interested in criminal case data from the state courts because, as she puts it, "most criminal prosecutions in the U.S. happen at the state level, not the federal level."

They're focusing on two initial sites: Georgia, beginning with Fulton and Clayton Counties, and Washington State. Using their experience in these two states, they hope to add data from other states and eventually build out a full picture of both criminal and civil litigation on both the state and federal levels.

AI and Machine Learning

With AI and machine learning, Alexander and her colleagues can identify and create results from their data more quickly than they would have even five years ago.

"In any case, civil or criminal, in either state or federal court, the court generates a docket sheet, which is a chronological list of events in the case. Descriptions can be very different using very different language, even if they're talking about the same underlying event,” she explained. “This variation in how court events are recorded makes it difficult to get a system-level view. So, we've used AI, particularly deep learning using large language models to train a model or a set of models to recognize all the different ways litigation events show up.”

Because her research reaches many disciplines, she plans to work with collaborators across Tech. She sees value in bringing in students from the Scheller College of Business and other schools including the College of Computing, Ivan Allen College of Liberal Arts, and Vertically Integrated Projects.

"If we solve the data problem, we're better equipped to attack the procedural and substantive problems around how the courts actually operate. What's exciting is the methodological advances in computer science and natural language processing that have cracked wide open the types of questions that are now answerable, which then allows us to change society for the better," said Alexander.

During the Fall 2023 semester, Alexander is on a Fulbright scholarship in Santo Domingo, Dominican Republic, until December to study the country's digital transformation efforts within its court system and to explore using data to diagnose problems and create more efficiency and transparency.

"A court is an organization and systems-level, organizational thinking about courts is not confined to the U.S. We can start to draw connections and collaborations across international boundaries, which I think is pretty exciting," she said.

 
News Contact

Lorrie Burroughs