IDEaS Awards Grants and Cyberinfrastructure Resources for Thematic Programs and Research in AI


In keeping with a strong strategic focus on AI for the 2023-2024 academic year, the Institute for Data Engineering and Science (IDEaS) has announced the winners of its 2023 Seed Grants for Thematic Events in AI and its Cyberinfrastructure Resource Grants, which support AI research requiring secure, high-performance computing capabilities. Thematic event award recipients will receive $8K to support their proposed workshop or series. Cyberinfrastructure winners will receive research support consisting of 600,000 CPU hours on the AMD Genoa server, 36,000 hours of NVIDIA DGX H100 GPU server usage, and 172 TB of secure storage.

Congratulations to the award winners listed below!

Thematic Events in AI Awards

Proposed Workshop: “Foundation of scientific AI (Artificial Intelligence) for Optimization of Complex Systems”
Primary PI: Raphael Pestourie, Assistant Professor, School of Computational Science and Engineering
Secondary PI: Peng Chen, Assistant Professor, School of Computational Science and Engineering

Proposed Series: “Guest Lecture Seminar Series on Generative Art and Music”
Primary PI: Gil Weinberg, Professor, School of Music

Cyberinfrastructure Resource Awards

Title: Human-in-the-Loop Musical Audio Source Separation
Topics: Music Informatics, Machine Learning
Primary PI: Alexander Lerch, Associate Professor, School of Music
Co-PIs: Karn Watcharasupat, Music Informatics Group | Yiwei Ding, Music Informatics Group | Pavan Seshadri, Music Informatics Group

Title: Towards A Multi-Species, Multi-Region Foundation Model for Neuroscience
Topics: Data-Centric AI, Neuroscience
Primary PI: Eva Dyer, Assistant Professor, Biomedical Engineering

Title: Multi-point Optimization for Building Sustainable Deep Learning Infrastructure
Topics: Energy Efficient Computing, Deep Learning, AI Systems Optimization
Primary PI: Divya Mahajan, Assistant Professor, School of Electrical and Computer Engineering, School of Computer Science

Title: Neutrons for Precision Tests of the Standard Model
Topics: Nuclear/Particle Physics, Computational Physics
Primary PI: Aaron Jezghani, OIT-PACE

Title: Continual Pretraining for Egocentric Video
Primary PI: Zsolt Kira, Assistant Professor, School of Interactive Computing
Co-PI: Shaunak Halbe, Ph.D. Student, Machine Learning

Title: Training More Trustworthy LLMs for Scientific Discovery via Debating and Tool Use
Topics: Trustworthy AI, Large Language Models, Multi-Agent Systems, AI Optimization
Primary PIs: Chao Zhang, School of Computational Science and Engineering & Bo Dai, College of Computing

Title: Scaling up Foundation AI-based Protein Function Prediction with IDEaS Cyberinfrastructure
Topics: AI, Biology
Primary PI: Yunan Luo, Assistant Professor, School of Computational Science and Engineering        

News Contact

Christa M. Ernst - Research Communications Program Manager
Robotics | Data Engineering | Neuroengineering

IDEaS Awards 2023 Seed Grants to Seven Interdisciplinary Research Teams

The teams awarded will focus on strategic new initiatives in Artificial Intelligence.

The Institute for Data Engineering and Science, in conjunction with several Interdisciplinary Research Institutes (IRIs) at Georgia Tech, has awarded seven teams of researchers from across the Institute a total of $105,000 in seed funding aimed at better positioning Georgia Tech to perform world-class interdisciplinary research in data science and artificial intelligence development and deployment.

The goals of the funded proposals include identifying prominent emerging research directions in AI, shaping IDEaS' future strategy in the initiative area, building an inclusive and active community of Georgia Tech researchers in the field that potentially includes external collaborators, and identifying and preparing the groundwork for competing for large-scale grant opportunities in AI and its use in other research fields.

Below are the 2023 recipients and the co-sponsoring IRIs:

 

Proposal Title: "AI for Chemical and Materials Discovery" + “AI in Microscopy Thrust”
PI: Victor Fung, CSE | Vida Jamali, ChBE | Pan Li, ECE | Amirali Aghazadeh Mohandesi, ECE
Award: $20k (co-sponsored by IMat)

Overview: The goal of this initiative is to bring together expertise in machine learning/AI, high-throughput computing, computational chemistry, and experimental materials synthesis and characterization to accelerate material discovery. Computational chemistry and materials simulations are critical for developing new materials and understanding their behavior and performance, as well as aiding in experimental synthesis and characterization. Machine learning and AI play a pivotal role in accelerating material discovery through data-driven surrogate models, as well as high-throughput and automated synthesis and characterization.

Proposal Title: " AI + Quantum Materials”
PI: Zhigang JIang, Physics | Martin Mourigal, Physics
Award: $20k (Co-Sponsored by IMat)

Overview: Zhigang Jiang is currently leading an initiative within IMat entitled “Quantum responses of topological and magnetic matter” to nurture multi-PI projects. By crosscutting the IMat initiative with this IDEaS call, we propose to support and feature the applications of AI to predictive and inverse problems in quantum materials. Understanding the limits and capabilities of AI methodologies poses a high barrier to entry for physics students, because researchers in the field already need heavy training in quantum mechanics, low-temperature physics, and chemical synthesis. Our most pressing need is for our AI-inclined quantum materials students to find a broader community to engage with and learn from. This is the primary problem we aim to solve with this initiative.

Proposal Title: “Harnessing Large Language Models for Targeted and Effective Small Molecule Library Design in Challenging Disease Treatment”
PI: Jeffrey Skolnick, Bio Sci | Chao Zhang, CSE
Award: $15k (co-sponsored by IBB)

Overview: Our objective is to use large language models (LLMs) in conjunction with AI algorithms to identify effective driver proteins, develop screening algorithms that target appropriate binding sites while avoiding deleterious ones, and consider bioavailability and drug resistance factors. LLMs can rapidly analyze vast amounts of information from literature and bioinformatics tools, generating hypotheses and suggesting molecular modifications. By bridging multiple disciplines such as biology, chemistry, and pharmacology, LLMs can provide valuable insights from diverse sources, assisting researchers in making informed decisions. Our aim is to establish a first-in-class, LLM driven research initiative at Georgia Tech that focuses on designing highly effective small molecule libraries to treat challenging diseases. This initiative will go beyond existing AI approaches to molecule generation, which often only consider simple properties like hydrogen bonding or rely on a limited set of proteins to train the LLM and therefore lack generalizability. As a result, this initiative is expected to consistently produce safe and effective disease-specific molecules.

Proposal Title: “AI for Climate Resilient Energy Systems”
PI: Yiyi He, School of City and Regional Planning | Jun Rentschler, World Bank
Award: $15k (co-sponsored by SEI)

Overview: We are committed to building a team of interdisciplinary and transdisciplinary researchers and practitioners with a shared goal: developing a new framework that models future climatic variations and the interconnected, interdependent energy infrastructure network as complex systems. To achieve this, we will harness the power of cutting-edge climate model outputs, sourced from the Coupled Model Intercomparison Project (CMIP), and integrate approaches from machine learning and deep learning. This strategic amalgamation of data and techniques will enable us to gain profound insights into the intricate web of future climate-change-induced extreme weather conditions and their immediate and long-term ramifications on energy infrastructure networks. The seed grant from IDEaS stands as the crucial catalyst for kick-starting this ambitious endeavor. It will empower us to form a collaborative and inclusive community of GT researchers hailing from various domains, including City and Regional Planning, Earth and Atmospheric Science, Computer Science and Electrical Engineering, and Civil and Environmental Engineering, among others. By drawing upon the wealth of expertise and perspectives from these diverse fields, we aim to foster an environment where innovative ideas and solutions can flourish. In addition to our internal team, we also plan to collaborate with external partners, including the World Bank, the Stanford Doerr School of Sustainability, and the Berkeley AI Research Initiative, who share our vision of addressing the complex challenges at the intersection of climate and energy infrastructure.

Proposal Title: “Physics-informed Deep Learning for Real-time Forecasting of Urban Flooding”
PI: Jian Luo, Civil & Environmental Eng | Yi Deng, EAS
Award: $15k (co-sponsored by BBISS)

Overview: Our research team envisions a significant trend in the exploration of AI applications for urban flooding hazard forecasting. Georgia Tech possesses a wealth of interdisciplinary expertise, positioning us to make a pioneering contribution to this burgeoning field. We aim to harness the combined strengths of Georgia Tech's experts in civil and environmental engineering, atmospheric and climate science, and data science to chart new territory in this emerging trend. Furthermore, we envision the potential extension of our research efforts towards the development of a real-time hazard forecasting application. This application would incorporate adaptation and mitigation strategies in collaboration with local government agencies, emergency management departments, and researchers in computer engineering and social science studies. Such a holistic approach would address the multifaceted challenges posed by urban flooding. To the best of our knowledge, Georgia Tech currently lacks a dedicated team focused on the fusion of AI and climate/flood research, making this initiative even more pioneering and impactful.

Proposal Title: “AI for Recycling and Circular Economy”
PI: Valerie Thomas, ISyE and PubPoly | Steven Balakirsky, GTRI
Award: $15k (co-sponsored by BBISS)

Overview: Most asset management and recycling-use technology has not changed for decades. The use of bar codes and RFID has provided some benefits, such as for retail returns management. Automated sorting of recyclables using magnets, eddy currents, and laser plastics identification has improved municipal recycling. Yet the overall field has been challenged by not-quite-easy-enough identification of products in use or at end of life. AI approaches, including computer vision, data fusion, and machine learning, provide the additional capability to make asset management and product recycling easy enough to be nearly autonomous. Georgia Tech is well suited to lead in the development of this application. With its strength in machine learning, robotics, sustainable business, supply chains and logistics, and technology commercialization, Georgia Tech has the multidisciplinary capability to make this concept a reality, in research and in commercial application.

Proposal Title: “Data-Driven Platform for Transforming Subjective Assessment into Objective Processes for Artistic Human Performance and Wellness”
PI: Milka Trajkova, Research Scientist, School of Literature, Media, and Communication | Brian Magerko, School of Literature, Media, and Communication
Award: $15k (co-sponsored by IPaT)

Overview: Artistic human movement at large stands at the precipice of a data-driven renaissance. By leveraging novel tools, we can usher in a transparent, data-driven, and accessible training environment. The potential ramifications extend beyond dance. As sports analytics have reshaped our understanding of athletic prowess, a similar approach to dance could redefine our comprehension of human movement, with implications spanning healthcare, construction, rehabilitation, and active aging. Georgia Tech, with its prowess in AI, HCI, and biomechanics, is primed to lead this exploration. To actualize this vision, we propose the following research questions, with ballet as a prime example of one of the most complex types of artistic movement: 1) What kinds of data (real-time kinematic, kinetic, biomechanical, etc.), captured through accessible off-the-shelf technologies, are essential for effective AI assessment in ballet education for young adults? 2) How can we design and develop an end-to-end ML architecture that assesses artistic and technical performance? 3) What feedback elements (combination of timing, communication mode, feedback nature, polarity, and visualization) are most effective for AI-based dance assessment? 4) How does AI-assisted feedback enhance physical wellness, artistic performance, and the learning process in young athletes compared to traditional methods?

News Contact

Christa M. Ernst | Research Communications Program Manager
Robotics | Data Engineering | Neuroengineering
christa.ernst@research.gatech.edu

Physicists Solve Mysteries of Microtubule Movers


Three noticeable out-of-plane microtubule bundles are misaligned with the rest of the microtubules at the bottom left of the image.

Active matter is any collection of materials or systems composed of individual units that can move on their own, thanks to self-propulsion or autonomous motion. They can be of any size — think clouds of bacteria in a petri dish, or schools of fish.

Roman Grigoriev is mostly interested in the emergent behaviors in active matter systems made up of units on a molecular scale — tiny systems that convert stored energy into directed motion, consuming energy as they move and exert mechanical force.

“Active matter systems have garnered significant attention in physics, biology, and materials science due to their unique properties and potential applications,” Grigoriev, a professor in the School of Physics at Georgia Tech, explains. 

“Researchers are exploring how active matter can be harnessed for tasks like designing new materials with tailored properties, understanding the behavior of biological organisms, and even developing new approaches to robotics and autonomous systems,” he says.

But that’s only possible if scientists learn how the microscopic units making up active matter interact, and whether they can affect these interactions and thereby the collective properties of active matter on the macroscopic scale. 

Grigoriev and his research colleagues have found a potential first step by developing a new model of active matter that generated new insight into the physics of the problem. They detail their methods and results in a new study published in Science Advances, “Physically informed data-driven modeling of active nematics.”

Grigoriev’s co-authors include School of Physics graduate researchers Matthew Golden and Jyothishraj Nambisan, as well as Alberto Fernandez-Nieves, professor in the Department of Condensed Matter Physics at the University of Barcelona and a former associate professor of Physics at Georgia Tech. 

A two-dimensional 'solution?'

The research team focused on one of the most common examples of active matter, a suspension of self-propelled particles, such as bacteria or synthetic microswimmers, in a liquid medium. These particles cluster, swarm, and otherwise form dynamic patterns due to their ability to move and interact with each other.

“In our paper, we use data from an experimental system involving suspensions of microtubules, which provide structural support, shape, and organization to eukaryotic cells (any cell with a clearly defined nucleus),” Grigoriev explains. 

Microtubules, as well as actin filaments and some bacteria, are examples of nematics, rod-like objects whose "heads" are indistinguishable from their "tails."

The motion of microtubules is driven by molecular motors powered by a protein, kinesin, which consumes adenosine triphosphate (ATP) dissolved in the liquid to slide a pair of neighboring microtubules past one another. The researchers’ system used microtubules suspended between layers of oil and water, which restricted their movement to two dimensions.

“That makes it easier to visualize the microtubules and track their motion. By changing the kinesin or ATP concentrations, we could control the motion of the microtubules, making this experimental setup by far one of the most popular in the study of active nematics and even more generally, active matter,” Grigoriev said.

‘This is where the story gets interesting’

Getting a clearer picture of microtubular movements was just one discovery in the study. 

Another was learning more about the relationships between the characteristic patterns describing the orientation and motion of nematic molecules on a macroscopic scale. Those patterns, or topological defects, determine how the nematics orient themselves at the oil-water interface, that is, in two spatial dimensions.

“Understanding the relationship between the flow — the global property of the system, or the fluid — and the topological defects, which describe the local orientation of microtubules, is one of the key intellectual questions facing researchers in the field,” Grigoriev said. “One needs to correctly identify the dominant physical effects which control the interaction between the microtubules and the surrounding fluid.”

“And this is where the story gets interesting,” Grigoriev adds. “For over a decade, it was believed that the key physics were well understood, with a large number of theoretical and computational studies relying on a generally accepted first principles model” — that is, one based on established science — “that was originally derived for active nematics in three spatial dimensions.”

In the Georgia Tech model, though, the dynamics of active nematics — more specifically, the length and time scales of the emerging patterns — are controlled by a pair of physical constants describing those assumed dominant physical effects: the stiffness of the microtubules (their flexibility), and the activity describing the stress, or force, generated by the kinesin motors.

“Using a data-driven approach, we inferred the correct form of the model demonstrating that, for two-dimensional active nematics, the dominant physical effects are different from what was previously assumed,” Grigoriev says. “In particular, the time scale is set by the rate at which bundles of microtubules are stretched by kinesin.” It is this rate, rather than the stress, that is constant. 
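The study's full inference pipeline is beyond a short example, but the core move in this kind of data-driven model discovery can be sketched with sparse regression: evaluate a library of candidate physical terms on measured data, then keep only the terms the data supports. The Python sketch below illustrates the generic technique (sequentially thresholded least squares) on synthetic data; it is not the authors' code, and the candidate library is hypothetical.

```python
import numpy as np

# A generic sketch of sparse model discovery (sequentially thresholded
# least squares), in the spirit of data-driven model inference but NOT
# the authors' code: the candidate library and data are hypothetical.

def stls(theta, dxdt, threshold=0.1, n_iter=10):
    """Fit dxdt ~ theta @ xi, then repeatedly zero out small coefficients
    and refit, keeping only the candidate terms the data supports."""
    xi = np.linalg.lstsq(theta, dxdt, rcond=None)[0]
    for _ in range(n_iter):
        small = np.abs(xi) < threshold
        xi[small] = 0.0
        if (~small).any():
            xi[~small] = np.linalg.lstsq(theta[:, ~small], dxdt, rcond=None)[0]
    return xi

# Synthetic "measurements" generated by du/dt = 2u - 0.5u^3.
rng = np.random.default_rng(0)
u = rng.normal(size=500)
theta = np.column_stack([u, u**2, u**3])   # library of candidate terms
dudt = 2.0 * u - 0.5 * u**3
print(stls(theta, dudt))                   # recovers ~[2.0, 0.0, -0.5]
```

In the active nematics setting, the library would contain candidate terms of the governing equations, and the surviving coefficients point to the physical effects that actually dominate.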

The danger of confirmation bias 

Grigoriev said the results of the study have important implications for the understanding of active nematics and their emergent behaviors, explaining that they help rationalize a number of previously unexplained experimental results, such as how the density of topological defects scales with the concentration of kinesin and the viscosity of the fluid layers.

“More importantly, our results demonstrate the danger associated with traditional assumptions that established research communities often land on and have difficulty overcoming,” Grigoriev said. “While data-driven methods may have their own sources of bias, they offer a perspective which is different enough from more traditional approaches to become a valuable research tool in their own right.”

 

About Georgia Institute of Technology

The Georgia Institute of Technology, or Georgia Tech, is one of the top public research universities in the U.S., developing leaders who advance technology and improve the human condition. The Institute offers business, computing, design, engineering, liberal arts, and sciences degrees. Its more than 45,000 undergraduate and graduate students, representing 50 states and more than 148 countries, study at the main campus in Atlanta, at campuses in France and China, and through distance and online learning. As a leading technological university, Georgia Tech is an engine of economic development for Georgia, the Southeast, and the nation, conducting more than $1 billion in research annually for government, industry, and society.

 

Funding: This study was funded by the National Science Foundation, grant no. CMMI-2028454. “Physically informed data-driven modeling of active nematics,” DOI: 10.1126/sciadv.abq6120

 

Left, a graphic showing microtubules orienting themselves in the experiment. Right, a still from a video showing microtubules moving at the interface of oil and water. Graphic by Roman Grigoriev


Roman Grigoriev


News Contact

Writer: Renay San Miguel
Communications Officer II/Science Writer
College of Sciences
404-894-5209

Editor: Jess Hunt-Ralston

NSF RAPID Response to Earthquakes in Turkey


In February, a major earthquake event devastated the south-central region of the Republic of Türkiye (Turkey) and northwestern Syria. Two earthquakes, one magnitude 7.8 and one magnitude 7.5, occurred nine hours apart, centered near the heavily populated city of Gaziantep. The total rupture length of the two events was up to 250 miles. The president of Turkey has called it the “disaster of the century,” and the threat is still not over — aftershocks could still affect the region.

Now, Zhigang Peng, a professor in the School of Earth and Atmospheric Sciences at Georgia Tech, and graduate students Phuc Mach and Chang Ding, alongside researchers at the Scientific and Technological Research Institution of Türkiye (TÜBİTAK) and at the University of Missouri, are using small seismic sensors to better understand just how, why, and when these earthquakes are occurring.

Funded by an NSF RAPID grant, the project is unique in that it aims to actively respond to the crisis while it’s still happening. National Science Foundation (NSF) Rapid Response Research (RAPID) grants are used when there is a severe urgency with regard to availability of or access to data, facilities or specialized equipment, including quick-response research on natural or anthropogenic disasters and other similar unanticipated events.

In an effort to better map the aftershocks of the earthquake event — which can occur weeks or months after the main event — the team placed approximately 120 small sensors, called nodes, in the East Anatolian fault region this past May. Their deployment continues through the summer. 

It’s the first time sensors like this have been deployed in Turkey, says Peng.

“These sensors are unique in that they can be placed easily and efficiently," he explains. "With internal batteries that can work up to one month when fully charged, they’re buried in the ground and can be deployed within minutes, while most other seismic sensors need solar panels or other power sources and take much more time and space to deploy.” Each node is about the size of a 2-liter soda bottle, and can measure ground movement in three directions.

“The primary reason we’re deploying these sensors quickly following the two mainshocks is to study the physical mechanisms of how earthquakes trigger each other,” Peng adds. The mainshock is the largest earthquake in a sequence. “We’ll use advanced techniques such as machine learning to detect and locate thousands of small aftershocks recorded by this network. These newly identified events can provide important new clues on how aftershocks evolve in space and time, and what drives foreshocks that occur before large events.”
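The team's machine-learning detectors are not described in detail in this article, but the underlying idea of automated event detection can be illustrated with the classical STA/LTA trigger, a longstanding precursor to modern ML pickers. The sketch below runs it on synthetic data; the window lengths, threshold, and trace are illustrative assumptions, not the team's settings.

```python
import numpy as np

# A classical STA/LTA event trigger: a standard pre-machine-learning
# detector, shown here only to illustrate automated aftershock detection.
# All parameters and the synthetic trace are hypothetical.

def sta_lta(trace, fs, sta_win=0.5, lta_win=10.0):
    """Ratio of short-term to long-term average signal energy; the ratio
    spikes when a seismic arrival begins. fs is the sampling rate in Hz."""
    nsta, nlta = int(sta_win * fs), int(lta_win * fs)
    energy = np.asarray(trace, dtype=float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(energy)))
    sta = (csum[nsta:] - csum[:-nsta]) / nsta      # short-window means
    lta = (csum[nlta:] - csum[:-nlta]) / nlta      # long-window means
    n = min(len(sta), len(lta))                    # align windows at trace end
    return sta[-n:] / np.maximum(lta[-n:], 1e-12)

# Synthetic 60 s trace at 100 Hz: background noise plus a 2 s "event".
fs = 100.0
t = np.arange(0.0, 60.0, 1.0 / fs)
noise = np.random.default_rng(1).normal(scale=0.1, size=t.size)
event = np.where((t > 30) & (t < 32), np.sin(2 * np.pi * 5 * t), 0.0)
ratio = sta_lta(noise + event, fs)
print("trigger" if ratio.max() > 5.0 else "no trigger")   # prints "trigger"
```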

Unearthing fault mechanisms

The team will also use the detected aftershocks to illuminate active faults where three tectonic plates come together — a region known as the Maraş Triple Junction. “We plan to use the aftershock locations and the seismic waves from recorded events to image subsurface structures where large damaging earthquakes occur,” says Mach, the Georgia Tech graduate researcher. This will help scientists better understand why sometimes faults ‘creep’ without any large events, while in other cases faults lock and then violently release elastic energy, creating powerful earthquakes.

Getting high-resolution data of the fault structures is another priority. “The fault line ruptured in the first magnitude 7.8 event has a bend in it, where earthquake activity typically terminates, but the earthquake rupture moved through this bend, which is highly unusual,” Peng says. By deploying additional ultra-dense arrays of sensors in their upcoming trip this summer, the team hopes to help researchers ‘see’ the bend under the Earth’s surface, allowing them to better understand how fault properties control earthquake rupture propagation.  

The team also aims to learn more about the relationship between the two mainshocks that recently rocked Turkey, sometimes called a doublet event. Doublets can happen when the initial earthquake triggers a secondary earthquake by adding extra stress loading. While in this instance the second event followed only nine hours after the first, such secondary earthquakes have been known to take place days, months, or even years after the initial one — a famous example being the sequence of earthquakes that spanned 60 years in the North Anatolian fault region in northern Turkey.

“Clearly the two mainshocks in 2023 are related, but it is still not clear how to explain the time delays,” says Peng. The team plans to work with their collaborators at TÜBİTAK to re-analyze seismic and other types of geophysical data from right before and after the two mainshocks in order to better understand the triggering mechanisms.

“In our most recent trip in southern Türkiye, we saw numerous buildings that were partially damaged during the mainshock, and many people will have to live in temporary shelters for years during the rebuilding process,” Peng adds. “While we cannot stop earthquakes from happening in tectonically active regions, we hope that our seismic deployment and subsequent research on earthquake triggering and fault imaging can improve our ability to predict what will happen next — before and after a big one — and could save countless lives.”

 

Grad student Phuc Mach holds a node
Members of the team in the field in Turkey
Georgia Tech graduate student Chang Ding pointing at a deployed seismic node in Southern Turkey
A nodal seismic station deployed by a TUBITAK scientist in Southern Turkey
Georgia Tech graduate student Chang Ding posing with a local villager at a seismic site in Southern Turkey
Georgia Tech scientist Zhigang Peng posing with TUBITAK scientist Ekrem Zor right in front of a possible surface rupture produced by the 2023 magnitude 7.8 earthquake
Researchers from Georgia Tech, Univ. of Missouri and TUBITAK before heading to the field on May 1st, 2023
News Contact

Written By:
Selena Langner

Media Contact:
Jess Hunt-Ralston

Balancing Act of Hurricane Season Sways With Climate Change


Hurricane season is underway and runs through Nov. 30. While the National Oceanic and Atmospheric Administration is forecasting a “near-normal” 2023, experts say that climate change paints a more unpredictable picture for the future.

Behind the 2023 projections is a balancing act of rising oceanic temperatures and the onset of the climate phenomenon El Niño, explains Susan Lozier, dean and Betsy Middleton and John Clark Sutherland Chair in the College of Sciences. The waters of the tropical Atlantic Ocean are currently 1–3°C above average, which would typically signify the potential for more intense activity, but the wind shear associated with El Niño acts as a deterrent for hurricane formation.

Increasing Intensity

But what could happen when the shield of El Niño isn't present to counteract the rising temperatures in the tropical Atlantic?

"Climate change is leading to warmer surface temperatures. We know that will lead to more intense hurricanes, but we don't know if it will necessarily lead to more hurricanes. As climate change progresses, we are interested in understanding how weather patterns will be disrupted, including those related to hurricane formation and pathways," said Lozier, who recently served as president of the American Geophysical Union.

She further explained that the increased intensity is a result of the warm waters releasing additional energy into the storm as it forms. This consequence of climate change could present problems for the Tech campus and the city of Atlanta due to the risk of torrential rainfall. According to the National Weather Service, flooding has proven to be the deadliest hazard associated with hurricanes over the past decade.

"When people think about hurricanes, they generally think about damaging winds. Winds are damaging, but increasingly, the most damaging part of a hurricane is the immense amount of moisture they carry," Lozier said, reflecting on the 2017 landfall of Hurricane Harvey. "An area like Atlanta could be affected by heavy rainfall associated with the path of a hurricane. The winds will have mostly died down by the time a storm reaches Atlanta, but as the climate warms, warmer air holds more moisture, and because of that, the expectation is that there will be more rainfall associated with hurricanes and tropical storms.”

Beyond Reducing Carbon Emissions

Fueling the rising temperatures in the world's oceans is an increase in carbon emissions, and simply curtailing them may not be a solution.

"The private and public sectors are increasingly looking at actively removing carbon from the atmosphere because we are unlikely to limit global warming simply by curtailing emissions. Active carbon drawdown from the atmosphere and the ocean are active areas of research right now,” Lozier said.

Tech researchers are at the forefront of this effort, highlighted by a partnership between the Institute, the Georgia Aquarium, and Ocean Visions — the Center for Ocean-Climate Solutions. Lozier represents the Institute as a partnership lead at the center, where the primary focus is the design and delivery of scalable and equitable ocean-based solutions to reduce the effects of climate change and build climate-resilient marine ecosystems and coastal communities.

Associate Professor Chris Reinhard is exploring how coastal ecosystem restoration can permanently capture carbon dioxide from the atmosphere as it becomes buried in sediments on the seafloor. The overall process of removing carbon from the air can be costly. To combat that, a team of researchers in the School of Chemical and Biomolecular Engineering is developing a traditional direct air capture system that is cheaper to operate and more efficient. Helping to craft policy and research climate solutions, Marilyn Brown, Regents’ Professor and the Brook Byers Professor of Sustainable Systems in the School of Public Policy, serves on the leadership council of Drawdown Georgia.

A certain level of unpredictability will always exist when dealing with natural disasters, but understanding humans’ role in controlling climate change could be a key factor in our ability to accurately assess the threat of developing storms. 

News Contact

Steven Gagliano - Communications Officer 

Institute Communications

Gauging Glaciers: Alex Robel Awarded NSF CAREER Grant for New Ice Melt Modeling Tool

A stylized glacier (Selena Langner)

Alex Robel is improving how computer models of melting ice sheets incorporate data from field expeditions and satellites by creating a new open-access software package — complete with state-of-the-art tools and paired with ice sheet models that anyone can use, even on a laptop or home computer.

Improving these models is critical: while melting ice sheets and glaciers are top contributors to sea level rise, there are still large uncertainties in sea level projections at 2100 and beyond.

“Part of the problem is that the way that many models have been coded in the past has not been conducive to using these kinds of tools,” Robel, an assistant professor in the School of Earth and Atmospheric Sciences, explains. “It's just very labor-intensive to set up these data assimilation tools — it usually involves someone refactoring the code over several years.”

“Our goal is to provide a tool that anyone in the field can use very easily without a lot of labor at the front end,” Robel says. “This project is really focused around developing the computational tools to make it easier for people who use ice sheet models to incorporate or inform them with the widest possible range of measurements from the ground, aircraft and satellites.”

Now, a $780,000 NSF CAREER grant will help him to do so. 

The National Science Foundation Faculty Early Career Development Award is a five-year funding mechanism designed to help promising researchers establish a personal foundation for a lifetime of leadership in their field. Known as CAREER awards, the grants are NSF’s most prestigious funding for untenured assistant professors.

“Ultimately,” Robel says, “this project will empower more people in the community to use these models and to use these models together with the observations that they're taking.”
 

Ice sheets remember

“Largely, what models do right now is they look at one point in time, and they try their best — at that one point in time — to get the model to match some types of observations as closely as possible,” Robel explains. “From there, they let the computer model simulate what it thinks that ice sheet will do in the future.”

In doing so, the models often assume that the ice sheet starts in a state of balance, and that it is neither gaining nor losing ice at the start of the simulation. The problem with this approach is that ice sheets dynamically change, responding to past events — even ones that happened centuries ago. “We know from models and from decades of theory that the natural response time scale of thick ice sheets is hundreds to thousands of years,” Robel adds.

By informing models with historical records, observations, and measurements, Robel hopes to improve their accuracy. “We have observations being made by satellites, aircraft, and field expeditions,” says Robel. “We also have historical accounts, and can go even further back in time by looking at geological observations or ice cores. These can tell us about the long history of ice sheets and how they've changed over hundreds or thousands of years.”

Robel’s team plans to use a set of techniques called data assimilation to adjust, or ‘nudge’, models. “These data assimilation techniques have been around for a really long time,” Robel explains. “For example, they’re critical to weather forecasting: every weather forecast that you see on your phone was ultimately the product of a weather model that used data assimilation to take many observations and apply them to a model simulation.”

“The next part of the project is going to be incorporating this data assimilation capability into a cloud-based computational ice sheet model,” Robel says. “We are planning to build an open source software package in Python that can use this sort of data assimilation method with any kind of ice sheet model.”
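As a rough illustration of the concept (the project's software is still to be built, and this sketch is not it), the simplest form of data assimilation is "nudging": free-run a model between observations and relax the state toward each measurement as it becomes available. The toy ice-thickness model, its parameters, and the observation values below are all hypothetical.

```python
# A minimal "nudging" sketch of data assimilation, assuming a toy
# zero-dimensional ice-thickness model dH/dt = accumulation - melt(H).
# The model, parameters, and observations are hypothetical; this
# illustrates the general idea, not the project's software.

def melt(h):
    return 0.05 * h                     # toy thickness-dependent melt (1/yr)

def step(h, accumulation, dt):
    return h + dt * (accumulation - melt(h))   # explicit Euler update

def assimilate(h0, accumulation, dt, n_steps, obs, gain=0.3):
    """Free-run the model; whenever an observation exists at a step,
    relax the state toward it: h <- h + gain * (h_obs - h)."""
    h = h0
    for k in range(n_steps):
        h = step(h, accumulation, dt)
        if k in obs:
            h += gain * (obs[k] - h)    # nudge toward the measurement
    return h

# Sparse thickness observations (meters) at two time steps.
observations = {50: 900.0, 120: 870.0}
h_final = assimilate(h0=1000.0, accumulation=40.0, dt=0.1,
                     n_steps=200, obs=observations)
print(f"assimilated thickness after 20 model years: {h_final:.1f} m")
```

Real packages use far more sophisticated schemes (ensemble and variational methods), but the goal is the same: keep the model trajectory consistent with the full history of measurements rather than a single snapshot.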

Robel hopes it will expand accessibility. “Currently, it's very labor-intensive to set up these data assimilation tools, and while groups have done it, it usually involves someone re-coding and refactoring the code over several years.”

Building software for accessibility

Robel’s team will then apply their software package to a widely used model, which now has an online, browser-based version. “The reason why that is particularly useful is because the place where this model is running is also one of the largest community repositories for data in our field,” Robel says.

Called Ghub, this relatively new repository is designed to be a community-wide place for sharing data on glaciers and ice sheets. “Since this is also a place where the model is living, by adding this capability to this cloud-based model, we'll be able to directly use the data that's already living in the same place that the model is,” Robel explains. 

Users won’t need to download data, or have a high-speed computer to access and use the data or model. Researchers collecting data will be able to upload their data to the repository, and immediately see the impact of their observations on future ice sheet melt simulations. Field researchers could use the model to optimize their long-term research plans by seeing where collecting new data might be most critical for refining predictions.

“We really think that it is critical for everyone who's doing modeling of ice sheets to be doing this transient data simulation to make sure that our simulations across the field are all doing the best possible job to reproduce and match observations,” Robel says. While in the past, the time and labor involved in setting up the tools has been a barrier, “developing this particular tool will allow us to bring transient data assimilation to essentially the whole field.”

Bringing Real Data to Georgia’s K-12 Classrooms

The project's broad applications and user base expand beyond the scientific community, and Robel is already developing a K-12 curriculum on sea level rise in partnership with Georgia Tech CEISMC researcher Jayma Koval. “The students analyze data from real tide gauges and use them to learn about statistics, while also learning about sea level rise using real data,” he explains.

Because the curriculum matches with state standards, teachers can download the curriculum, which is available for free online in partnership with the Southeast Coastal Ocean Observing Regional Association (SECOORA), and incorporate it into their preexisting lesson plans. “We worked with SECOORA to pilot a middle school curriculum in Atlanta and Savannah, and one of the things that we saw was that there are a lot of teachers outside of middle school who are requesting and downloading the curriculum because they want to teach their students about sea level rise, in particular in coastal areas,” Robel adds.

In Georgia, many high schools offer a data science class that is part of the state's computer science standards. “Now, we are partnering with a high school teacher to develop a second standards-aligned curriculum that is meant to be taught ideally in a data science class, computer class or statistics class,” Robel says. “It can be taught as a module within that class and it will be the more advanced version of the middle school sea level curriculum.”

The curriculum will guide students through using data analysis tools and coding to analyze real sea level data sets, while learning the science behind what causes variations in sea level, what causes sea level rise, and how to predict sea level changes.
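For flavor, an exercise of the kind such a module might include (an illustration, not the curriculum itself) can be written in a few lines of Python: fit a linear trend to monthly tide-gauge readings and read off the rate of sea level rise. The data below are synthetic stand-ins for a real station's records.

```python
import numpy as np

# An illustrative exercise (not the actual curriculum): fit a linear trend
# to monthly tide-gauge readings. Synthetic data stand in for a station's
# real records; the ~3.2 mm/yr trend is the value we try to recover.
rng = np.random.default_rng(42)
years = np.arange(1990, 2024, 1 / 12)              # monthly samples
seasonal = 60 * np.sin(2 * np.pi * years)          # annual cycle, mm
trend = 3.2 * (years - years[0])                   # steady rise, mm
sea_level = trend + seasonal + rng.normal(scale=25, size=years.size)

# Least-squares line: the slope estimates the rate of sea level rise.
slope, intercept = np.polyfit(years, sea_level, deg=1)
print(f"estimated trend: {slope:.2f} mm/yr")       # close to 3.2
```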

“That gets students to think about computational modeling and how computational modeling is an important part of their lives, whether it's to get a weather forecast or play a computer game,” Robel adds. “Our goal is to get students to imagine how all these things are combined, while thinking about the way that we project future sea level rise.”

 

Alex Robel (Credit: Allison Carter)
News Contact

Written by Selena Langner

Contact: Jess Hunt-Ralston

The Oceans Are Missing Their Rivers

In a rhythm that’s pulsed through epochs, a river’s plume carries sediment and nutrients from the continental interior into the ocean, a major exchange of resources from land to sea. More than 6,000 rivers worldwide send freshwater surging into the oceans, delivering nutrients, including nitrogen and phosphorus, that feed phytoplankton, generating a bloom of life that in turn feeds progressively larger creatures. These plumes may even influence ocean currents in ways researchers are just starting to understand. But today, in rivers around the world, humans are altering this critical phenomenon.