The Dynamics of Deformable Systems: Study Unravels Mathematical Mystery of Cable-like Structures


Are our bodies solid or liquid? We all know the convention that solids maintain their shapes, while liquids fill the containers they’re in. But in the real world, those lines are often blurred. Imagine walking on a beach. Sometimes the sand gives way underfoot, deforming like a liquid, but when enough sand grains pack together, they can support weight like a solid surface.

Modeling these kinds of systems is notoriously difficult — but Zeb Rocklin, an assistant professor in the School of Physics at Georgia Tech, has written a new paper doing just that. 

Rocklin’s study, “Rigidity percolation in a random tensegrity via analytic graph theory,” is published in the journal Proceedings of the National Academy of Sciences (PNAS). The results have the potential to impact fields spanning biology to engineering and nanotechnology, showing that these types of deformable solids offer a rare combination of durability and flexibility.

"I'm very proud of our team, especially Will and Vishal, the two Georgia Tech undergraduates who co-led the study,” Rocklin says. 

The lead author, William Stephenson, and co-author Vishal Sudhakar both completed their undergraduate studies at the Institute during this research. Stephenson is now a first-year graduate student at the University of Michigan, Ann Arbor, and Sudhakar has been admitted to Georgia Tech as a graduate student. Additionally, co-author Michael Czajkowski is a postdoctoral researcher in the School of Physics, and co-author James McInerney completed his graduate studies in the School of Physics under Rocklin. McInerney is now a postdoctoral researcher at the University of Michigan.

Connecting the dots… with cables

Imagine building molecules in chemistry class using large wooden spheres connected with sticks or rods. While many models, including mathematical ones, use rods, biological systems in real life are constructed of polymers, which function more like stretchy strings.

Likewise, when creating mathematical or biological models, researchers frequently treat all the elements as rods as opposed to treating some of them as cables, or strings. But, “there are tradeoffs between how mathematically tractable a model is and how physically plausible it is,” Rocklin says. “Physicists can have some beautiful mathematical theories, but they aren’t always realistic.”  For example, a model using connective rods might not capture the dynamics that connective strings provide. “With a string you can stretch it, and it'll fight you, but when you compress it, it collapses.”

“But, in this study, we’ve extended the current theories,” he says, adding cable-like elements. “And that actually turns out to be incredibly difficult, because these theories use mathematical equations. In contrast, the distance between the two ends of a cable is represented by an inequality, which is not an equation at all. So how do you create a mathematical theory when you aren't starting from equations?” While a rod's length enters a model as an equation, the distance between the ends of a cable can only be required to be less than or equal to a certain length.
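Rocklin's contrast between equations and inequalities is easy to see in code. A minimal sketch, with hypothetical endpoint coordinates and lengths (not from the study):

```python
import math

def satisfies_rod(p1, p2, length, tol=1e-9):
    # A rod fixes the distance between its endpoints: an equation.
    return math.isclose(math.dist(p1, p2), length, abs_tol=tol)

def satisfies_cable(p1, p2, length):
    # A cable only bounds the distance: an inequality.
    # It resists stretching but collapses under compression.
    return math.dist(p1, p2) <= length

# Illustrative endpoints three units apart:
a, b = (0.0, 0.0), (3.0, 0.0)
print(satisfies_rod(a, b, 3.0))    # rod of length 3: satisfied
print(satisfies_rod(a, b, 4.0))    # rod of length 4: violated
print(satisfies_cable(a, b, 4.0))  # slack cable of length 4: satisfied
```

The asymmetry in `satisfies_cable` is exactly what breaks equation-based analytic theories: slack configurations satisfy the constraint without saturating it.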

In this situation “all the usual analytic theories completely break,” Rocklin says. “It becomes very difficult for physicists or for mathematicians.”

“The trick was to notice that these physical systems were logically equivalent to something called a directed graph,” Rocklin adds, “where different modes of deformation are linked to each other in specific ways. This allows us to take a relatively complicated system and massively compress it to a much smaller system. And when we did that, we were able to turn it into something that becomes extremely easy for the computer to do.”
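The paper's actual construction is specific to tensegrities, but the compression Rocklin describes is in the spirit of a standard operation on directed graphs: condensation, which collapses each strongly connected component (a cluster of mutually linked deformation modes) into a single node. A purely illustrative sketch with made-up mode labels:

```python
from collections import defaultdict

def condense(edges):
    """Collapse the strongly connected components of a directed graph
    into single nodes (Kosaraju's algorithm), returning the component
    labels and the much smaller condensed edge set."""
    graph, rgraph, nodes = defaultdict(list), defaultdict(list), set()
    for u, v in edges:
        graph[u].append(v)
        rgraph[v].append(u)
        nodes |= {u, v}

    order, seen = [], set()
    def finish(u):              # first pass: record finish order
        seen.add(u)
        for v in graph[u]:
            if v not in seen:
                finish(v)
        order.append(u)
    for u in nodes:
        if u not in seen:
            finish(u)

    comp = {}
    def assign(u, label):       # second pass: label components
        comp[u] = label
        for v in rgraph[u]:
            if v not in comp:
                assign(v, label)
    for u in reversed(order):
        if u not in comp:
            assign(u, u)

    condensed = {(comp[u], comp[v]) for u, v in edges if comp[u] != comp[v]}
    return comp, condensed

# Hypothetical deformation modes: A, B, and C are mutually linked
# (one strongly connected cluster); D hangs off that cluster.
comp, small = condense([("A", "B"), ("B", "C"), ("C", "A"), ("C", "D")])
print(small)  # the four-edge graph compresses to a single edge
```

In this toy case, four edges collapse to one; in a large random tensegrity, the same kind of reduction is what makes the computation "extremely easy for the computer to do."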

From biology to engineering

Rocklin’s team found that when modeling with cables and springs, the target range changed — becoming softer, with a wider margin for error. “That could be really important for something like a biological system, because a biological system is trying to stay close to that critical point,” says Rocklin. “Our model shows that the region around the critical point is actually much broader than what models that only used rods previously showed.”

Rocklin also points out applications for engineers. For example, since Rocklin's new theory suggests that even disordered cable structures can be strong and flexible, it may help engineers leverage cables as building materials to create safer, more durable bridges. The theory also provides a way to easily model these cable-based structures, ensuring their safety before they are built and giving engineers a way to iterate on designs.

Rocklin also notes potential applications in nanotechnology. “In nanotechnology, you must accept an increasing amount of disorder, because you can't just have a skilled worker actually go in and put segments there, and you can't have a conventional factory machine put segments there,” Rocklin says. 

But biology has known how to lay down effective, but disordered, rod and cable structures for hundreds of millions of years. “This is going to tell us what sorts of machines we can make with those disordered structures when we're getting to the point of being able to do what biology can do. And that's a possible future design principle for the engineers to explore, at very small scales, where we can't choose exactly where each cable goes,” Rocklin says.

“Our theory shows that with cables, we can maintain a combination of flexibility and strength with much less precision than you might otherwise need.”

 

Funding: This research was funded by the Army Research Office through the MURI program (#W911NF2210219).

DOI: https://doi.org/10.1073/pnas.2302536120


Figure caption: Systems of rigid rods acquire rigidity via the addition of random additional rods and cables, as captured via a graph theory. The research team's main object of study, shown here, is structures that consist of large numbers of pores — arranged in columns and rows with cables and rods added at random.

News Contact

Written by Selena Langner

Editor and Contact: Jess Hunt-Ralston

Researchers Leverage AI to Develop Early Diagnostic Test for Ovarian Cancer

Micrograph of a mucinous ovarian tumor (Photo National Institutes of Health)

For over three decades, a highly accurate early diagnostic test for ovarian cancer has eluded physicians. Now, scientists in the Georgia Tech Integrated Cancer Research Center (ICRC) have combined machine learning with information on blood metabolites to develop a new test able to detect ovarian cancer with 93 percent accuracy among samples from the team’s study group.

John McDonald, professor emeritus in the School of Biological Sciences, founding director of the ICRC, and the study’s corresponding author, explains that the new test detects ovarian cancer more accurately than existing tests among women clinically classified as normal, with a particular improvement in detecting early-stage ovarian disease in that cohort.

The team’s results and methodologies are detailed in a new paper, “A Personalized Probabilistic Approach to Ovarian Cancer Diagnostics,” published in the March 2024 online issue of the medical journal Gynecologic Oncology. Based on their computer models, the researchers have developed what they believe will be a more clinically useful approach to ovarian cancer diagnosis — whereby a patient’s individual metabolic profile can be used to assign a more accurate probability of the presence or absence of the disease.

“This personalized, probabilistic approach to cancer diagnostics is more clinically informative and accurate than traditional binary (yes/no) tests,” McDonald says. “It represents a promising new direction in the early detection of ovarian cancer, and perhaps other cancers as well.”

The study co-authors also include Dongjo Ban, a Bioinformatics Ph.D. student in McDonald’s lab; Research Scientists Stephen N. Housley, Lilya V. Matyunina, and L.DeEtte (Walker) McDonald; Regents’ Professor Jeffrey Skolnick, who also serves as Mary and Maisie Gibson Chair in the School of Biological Sciences and Georgia Research Alliance Eminent Scholar in Computational Systems Biology; and two collaborating physicians: University of North Carolina Professor Victoria L. Bae-Jump and Ovarian Cancer Institute of Atlanta Founder and Chief Executive Officer Benedict B. Benigno. Members of the research team are forming a startup to transfer and commercialize the technology, and plan to seek requisite trials and FDA approval for the test.

Silent killer

Ovarian cancer is often referred to as the silent killer because the disease is typically asymptomatic when it first arises — and is usually not detected until later stages of development, when it is difficult to treat.

McDonald explains that the average five-year survival rate for late-stage ovarian cancer patients, even after treatment, is around 31 percent — but if ovarian cancer is detected and treated early, the average five-year survival rate is more than 90 percent.

“Clearly, there is a tremendous need for an accurate early diagnostic test for this insidious disease,” McDonald says.

And although development of an early detection test for ovarian cancer has been vigorously pursued for more than three decades, the development of early, accurate diagnostic tests has proven elusive. Because cancer begins on the molecular level, McDonald explains, there are multiple possible pathways capable of leading to even the same cancer type.

“Because of this high-level molecular heterogeneity among patients, the identification of a single universal diagnostic biomarker of ovarian cancer has not been possible,” McDonald says. “For this reason, we opted to use a branch of artificial intelligence — machine learning — to develop an alternative probabilistic approach to the challenge of ovarian cancer diagnostics.”

Metabolic profiles

Georgia Tech co-author Dongjo Ban, whose thesis research contributed to the study, explains that “because end-point changes on the metabolic level are known to be reflective of underlying changes operating collectively on multiple molecular levels, we chose metabolic profiles as the backbone of our analysis.”

“The set of human metabolites is a collective measure of the health of cells,” adds coauthor Jeffrey Skolnick, “and by not arbitrarily choosing any subset in advance, one lets the artificial intelligence figure out which are the key players for a given individual.”

Mass spectrometry can identify the presence of metabolites in the blood by detecting their mass and charge signatures. However, Ban says, determining the precise chemical makeup of a metabolite requires much more extensive characterization.

Ban explains that because the precise chemical composition of less than seven percent of the metabolites circulating in human blood has, thus far, been chemically characterized, it is currently impossible to accurately pinpoint the specific molecular processes contributing to an individual's metabolic profile.

However, the research team recognized that, even without knowing the precise chemical make-up of each individual metabolite, the mere presence of different metabolites in the blood of different individuals, as detected by mass spectrometry, can be incorporated as features in the building of accurate machine learning-based predictive models (similar to the use of individual facial features in the building of facial pattern recognition algorithms).

“Thousands of metabolites are known to be circulating in the human bloodstream, and they can be readily and accurately detected by mass spectrometry and combined with machine learning to establish an accurate ovarian cancer diagnostic,” Ban says.
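As a toy illustration of the general approach (not the team's actual model, features, or data), the presence or absence of metabolites detected by mass spectrometry can serve as binary features for a simple classifier:

```python
def train_centroids(samples, labels):
    """Average the binary metabolite-presence vectors per class.
    A toy nearest-centroid classifier; the study's actual machine
    learning models and data are far richer."""
    sums, counts = {}, {}
    for x, y in zip(samples, labels):
        acc = sums.setdefault(y, [0.0] * len(x))
        for i, xi in enumerate(x):
            acc[i] += xi
        counts[y] = counts.get(y, 0) + 1
    return {y: [s / counts[y] for s in acc] for y, acc in sums.items()}

def predict(centroids, x):
    # Assign the class whose centroid is nearest (squared distance).
    def d2(c):
        return sum((xi - ci) ** 2 for xi, ci in zip(x, c))
    return min(centroids, key=lambda y: d2(centroids[y]))

# Hypothetical data: each row marks which of five metabolites
# were detected in a blood sample.
X = [[1, 1, 0, 0, 1], [1, 1, 0, 1, 1], [0, 0, 1, 1, 0], [0, 1, 1, 1, 0]]
y = ["cancer", "cancer", "normal", "normal"]
model = train_centroids(X, y)
print(predict(model, [1, 1, 0, 0, 0]))  # prints "cancer"
```

The analogy in the text holds: just as facial recognition works from features without naming each one, the classifier here never needs to know what each metabolite is chemically.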

A new probabilistic approach

The researchers developed their integrative approach by combining metabolomic profiles and machine learning-based classifiers to establish a diagnostic test with 93 percent accuracy when tested on 564 women from Georgia, North Carolina, Philadelphia, and Western Canada. Of the study participants, 431 were active ovarian cancer patients, while the remaining 133 women did not have ovarian cancer.

Further studies have been initiated to explore whether the test can detect very early-stage disease in women displaying no clinical symptoms, McDonald says.

McDonald anticipates a clinical future where a person with a metabolic profile that falls within a score range that makes cancer highly unlikely would only require yearly monitoring. But someone with a metabolic score that lies in a range where a majority (say, 90%) have previously been diagnosed with ovarian cancer would likely be monitored more frequently — or perhaps immediately referred for advanced screening.
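The envisioned workflow amounts to mapping a personalized probability onto a follow-up action. A minimal sketch, with thresholds invented for illustration rather than taken from the study:

```python
def follow_up(cancer_probability):
    """Map a personalized cancer probability to a clinical
    recommendation. Thresholds are hypothetical placeholders,
    not values from the published study."""
    if cancer_probability < 0.05:
        return "yearly monitoring"
    elif cancer_probability < 0.50:
        return "more frequent monitoring"
    else:
        return "referral for advanced screening"

print(follow_up(0.02))  # low-risk band: yearly monitoring
print(follow_up(0.90))  # high-risk band: advanced screening
```

The point of the probabilistic framing is that the output is a graded risk band rather than a binary yes/no call.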

Citation: https://doi.org/10.1016/j.ygyno.2023.12.030

Funding

This research was funded by the Ovarian Cancer Institute (Atlanta), the Laura Crandall Brown Foundation, the Deborah Nash Endowment Fund, Northside Hospital (Atlanta), and the Mark Light Integrated Cancer Research Student Fellowship.

Disclosure

Study co-authors John McDonald, Stephen N. Housley, Jeffrey Skolnick, and Benedict B. Benigno are the co-founders of MyOncoDx, Inc., formed to support further research, technology transfer, and commercialization for the team’s new clinical tool for the diagnosis of ovarian cancer.

News Contact

Writer: Renay San Miguel
Communications Officer II/Science Writer
College of Sciences
404-894-5209

Editor: Jess Hunt-Ralston

 

Smart Solids: Zeb Rocklin Awarded NSF CAREER for Flexible Metamaterials Research

A mechanical metamaterial: a series of squares connected at their corners, which can move by flexing at the hinges where the corners are connected.

Imagine materials that respond to their environment: winter jackets that become thicker as temperatures drop, shoes that return energy with each stride, and robots that adapt to better accomplish their task as they aid in space exploration. All of these ideas could be made into a reality through mechanical metamaterials, a group of flexible solids that blur the traditional definition of what a solid is. 

Understanding these metamaterials is key to “programming” them correctly, maximizing their utility. “One of the paradigms of this research is that the material is the machine,” Zeb Rocklin, an assistant professor in the School of Physics, explains. “We're creating a material that performs the mechanical tasks that we want it to, and processes forces and displacements in the ways we want it to.”

A new $630,000 NSF CAREER grant will help Rocklin continue that research.

The National Science Foundation Faculty Early Career Development Award is a five-year grant designed to help promising researchers establish a foundation for a lifetime of leadership in their field. Known as CAREER awards, the grants are NSF’s most prestigious funding for untenured assistant professors.

The award, for “Geometric and topological mechanics of flexible structures,” will help Rocklin continue developing a new, unified theory for mechanical metamaterials — a group of structures that can flex and move, while having traditional solid components that make it easier to model. The theory could then be applied by other scientists and engineers to create responsive objects with smart fabrics that could respond to changes in environment — like novel knee replacements, responsive airplane wings, and better robots.

Materials as machines

“A solid is defined by the fact that it has a shape, and if I try to change the shape it might generate patterns of stress, or if I hit it, you might hear noise, because it's vibrating,” says Rocklin. “While we often think about things in terms of solids, liquids, and gasses, a lot of the things that are very important to us are not what we think of as a conventional solid.”

Flexible solids, like clothing, robots, and even our own bodies permeate our world, and are often some of the most useful materials we encounter. “This creates this huge challenge,” Rocklin says, “because flexible solids can't always be understood using current techniques of physics. We can write down the equations, but the equations are often too hard for anyone to solve.” For example, imagine trying to predict or replicate the infinite ways a piece of paper can crumple. As a result, flexible solids are often expensive and time consuming to model.

That’s where Rocklin’s new theory comes in.

Mechanical metamaterials

By combining well-known solids with flexible properties, Rocklin hopes to create a mathematically simple theory. “There are philosophical differences and limitations here,” he says, “but as a physicist, I’m looking for universal principles that can apply to a variety of things. Our technique is meant to complement the existing simulations, and it's meant to provide us more insight into these systems so that we can understand how to control them better.”

By building a theory around materials made of repeating solids connected by flexible hinges, Rocklin hopes to make a computationally inexpensive technique to predict and control the deformation of flexible structures. One example of this type of structure consists of  solid square pieces connected by their corners in a checkerboard pattern. The pieces pivot against each other at these hinged corners, allowing the structure to easily expand and contract. “These materials find a sweet spot in between simple solids that were well-characterized in the nineteenth century and the flexible objects that are just too complicated for us to fully describe,” Rocklin adds.
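The corner-connected checkerboard Rocklin describes is the well-studied "rotating squares" mechanism. Assuming perfectly rigid squares counter-rotated by an angle θ at their corner hinges, each lattice dimension scales with cos θ + sin θ, peaking at θ = 45°; a short sketch of that kinematics:

```python
import math

def expansion_factor(theta_deg):
    """Linear expansion of the rotating-squares lattice relative to
    the fully closed state, for rigid squares hinged at their corners
    and counter-rotated by theta. A simplified kinematic model."""
    t = math.radians(theta_deg)
    return math.cos(t) + math.sin(t)

for theta in (0, 15, 30, 45):
    f = expansion_factor(theta)
    print(f"theta={theta:2d} deg  linear x{f:.3f}  area x{f * f:.3f}")
```

Because both lattice dimensions scale identically, opening the hinges expands the structure equally in every direction, giving the material its characteristic Poisson's ratio of −1.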

While the material can only deform via one method (by flexing at the hinges), this does not mean that there is only one way the material deforms. Rather, through this single mechanism, the material can assume an infinite number of modes or configurations, illustrating Rocklin's key insight: that a single flexible mode inevitably gives rise to a whole host of complex deformations.

“There's very simple universal math to describe how this type of material operates,” Rocklin adds. “And, when people actually make this material, it turns out that it actually looks like this, and it actually deforms in this way.”

Broad applications

As a theoretical physicist, Rocklin is focused on developing a unified theory that experts across many fields can apply. One example is collapsible biomedical devices like stents, which should be small when inserted but need to expand once inside the body. Inspired by the ever-adapting wings of birds, adaptable airplane wings are also an intriguing frontier.

Rather than minute adjustments via circuitry, airplane wings could be built from these flexible solids, which could be designed to automatically adapt when given a signal from the wind. Building an antenna from materials that respond to certain electromagnetic frequencies, to optimize signal reception, is another of many possible applications for the work. 

News Contact

Written by Selena Langner

Contact: Jess Hunt-Ralston

American Mathematical Society Honors Trio of Faculty with Top Research Prize, Fellows Recognitions

Jennifer Hom

The American Mathematical Society (AMS) recently announced top honors for three School of Mathematics professors, including a top research award and a pair of faculty recognized as AMS Fellows for their work in advancing the field.

Levi L. Conant Prize

School of Mathematics Associate Professor Jennifer (Jen) Hom has received the 2024 Levi L. Conant Prize from the AMS. The award recognizes the best expository paper published in either Notices of the AMS or Bulletin of the AMS in the preceding five years. Hom is recognized for her paper, “Getting a handle on the Conway knot,” which was published in the Bulletin in 2021.

“These awards partially signal the depth and breadth of accomplishment and influence of these three remarkable scholars," says School of Mathematics Chair Mike Wolf. "Jen’s tribute is for how she was able to communicate the clarity of her understanding of fundamental and difficult mathematics to a wide audience, creating a resource that will affect mathematics and mathematicians for many years."

“I am honored to receive the 2024 Levi L. Conant Prize,” Hom says. “An extremely important but often undervalued part of our job as mathematicians is communication, and I’m grateful to the AMS for valuing high-quality exposition in their publications.” 

Hom adds that she’s “proud to be in the company of my esteemed former colleague Dan Margalit,” who won the Conant Prize in 2021. Margalit was then a mathematics professor at Georgia Tech, and is now Stephenson Professor and Chair of the Department of Mathematics at Vanderbilt University.

AMS cited Hom’s article as “a wonderful resource for the community on timely and important material,” adding that “Hom’s paper packs a remarkable amount of knot theory into 11 pages, but remains clear, engaging, and easy to read throughout. Readers are left with new understanding and a sense of excitement for the future of this field.”

American Mathematical Society Fellows

Last year, Hom was also recognized as an AMS Fellow for her topology research and service to the mathematical community.

This season, two fellow School of Mathematics faculty, Professor Greg Blekherman and Professor Thang Le, also have joined those ranks.

“Thang and Greg received a distinction reserved for only the top few percent of research mathematicians nationwide," Wolf says. "Thang was singled out for his deep work in the creation and development of the fairly new subject of quantum topology over the last quarter century as well as for that subject’s implications for the very classic area of low-dimensional topology. Quantum topology is now a vast area, but many of its most prominent achievements came about through the work of Thang.

“Greg’s work courses through and blends algebraic and convex geometry as well as combinatorics and optimization — and also mathematical biology," Wolf explains. "Most notably, Greg is known for his diverse and important contributions to the theory of nonnegative and “sum of squares” polynomials, a hugely important topic in contemporary optimization theory."

Blekherman and Le are among more than three dozen mathematical scientists from around the world named 2024 AMS Fellows — a cohort which also includes Kasso Okoudjou, a former School of Mathematics Ph.D. student advised by Professor Christopher Heil.

“It is my pleasure to congratulate and welcome the new class of AMS Fellows, honored for their outstanding contributions to the mathematical sciences and to our profession,” notes AMS President Bryna Kra. "This year's class was selected from a large and excellent pool of candidates, highlighting the many ways in which our profession is advanced, and I look forward to working with them in service to our community."

“The school is just thrilled by these well-deserved awards to our wonderful colleagues," Wolf added.

About Jennifer (Jen) Hom

Hom joined Georgia Tech as an assistant professor in 2015 after she served as a Ritt Assistant Professor at Columbia University. She has been an associate professor in the School of Mathematics since 2018. Hom’s research centers on low-dimensional topology, which she usually studies using Heegaard Floer homology. She was asked to speak in the topology section of the 2022 International Congress of Mathematicians, the world’s largest gathering of mathematicians. Hom has held a Sloan Fellowship and a Simons Fellowship, is an AMS Fellow, and holds a National Science Foundation CAREER award. 

About Greg Blekherman

Blekherman, who joined Georgia Tech in 2011, is a 2012 recipient of the Sloan Research Fellowship. His research interests lie at the intersection of convex and algebraic geometry. Blekherman received his Ph.D. in 2005 from the University of Michigan under the direction of Alexander Barvinok, and he has held postdoctoral positions at the Microsoft Research Theory Group, Virginia Bioinformatics Institute, Institute for Pure and Applied Math (UCLA) and UC San Diego before joining Georgia Tech.

About Thang Le

Le received his M.S. and Ph.D. from Lomonosov Moscow State University, and joined Georgia Tech in 2003. His research interests include differential topology, 3-manifolds, knot theory, and quasicrystals. He serves as an editor of Quantum Topology, The Journal of Knot Theory and Its Ramifications, and Acta Mathematica Vietnamica. 

Read the AMS press releases on the 2024 Conant Prize and AMS Fellows here.

Greg Blekherman

Thang Le

News Contact

Writer: Renay San Miguel
Communications Officer II/Science Writer
College of Sciences
404-894-5209

Editor: Jess Hunt-Ralston

 

Study Reveals Wintertime Formation of Large Pollution Particles in China’s Skies

Beijing pollution (Photo Kevin Dooley, Creative Commons)

Previous studies have found that the particles that float in the haze over the skies of Beijing include sulfate, a major source of outdoor air pollution that damages lungs and aggravates existing asthmatic symptoms, according to the California Air Resources Board.

Sulfates usually are produced by atmospheric oxidation in the summer, when ample sunlight facilitates the oxidation that turns sulfur dioxide into dangerous aerosol particles. How is it that China can produce such extreme pollution loaded with sulfates in the winter, when there’s not as much sunlight and atmospheric oxidation is slow?

Yuhang Wang, professor in the School of Earth and Atmospheric Sciences at Georgia Tech, and his research team have conducted a study that may have the answer: All the chemical reactions needed to turn sulfur dioxide into sulfur trioxide, and then quickly into sulfate, primarily happen within the smoke plumes causing the pollution. That process not only creates sulfates in the winter in China, but it also happens faster and results in larger sulfate particles in the atmosphere.

“We call the source ‘in-source formation,’” Wang says. “Instead of having oxidants spread out in the atmosphere, slowly oxidizing sulfur dioxide into sulfur trioxide to produce sulfate, we have this concentrated production in the exhaust plumes that turns the sulfuric acid into large sulfate particles. And that's why we're seeing these large sulfate particles in China.”

The findings of in-source formation of larger wintertime sulfate particles in China could help scientists accurately assess the impacts of aerosols on radiative forcing — how climate change and global warming impact the Earth’s energy and heat balances — and on health, where larger aerosols mean larger deposits into human lungs.

“Wintertime Formation of Large Sulfate Particles in China and Implications for Human Health,” is published in Environmental Science & Technology, an American Chemical Society publication. The co-authors include Qianru Zhang of Peking University and Mingming Zheng of Wuhan Polytechnic University, two of Wang’s former students who conducted the research while at Georgia Tech. 

Explaining a historic smog

China still burns a lot of coal in power plants because its costs are lower compared to natural gas, Wang says. It also makes for an easy comparison between China’s hazy winters and a historic event that focused the United Kingdom’s attention on dangerous environmental hazards — the Great London Smog.

The event, depicted in the Netflix show “The Crown,” saw severe smog descend on London in December 1952. Unusually cold weather preceded the event, which brought the coal-produced haze down to ground level. UK officials later said the Great London Smog (also called the Great London Fog) was responsible for 4,000 deaths and 100,000 illnesses, although later studies estimated a higher death toll of 10,000 to 20,000.

“From the days of the London Fog to extreme winter pollution in China, it has been a challenge to explain how sulfate is produced in the winter,” Wang says. 

Wang and his team decided to take on that challenge. 

Aerosol size and heavy metal influence?

The higher sulfate levels in China, notably in January 2013, defy conventional explanations that relied on standard photochemical oxidation. It was thought that nitrogen dioxide or other mild oxidants found in alkaline or neutral particles in the atmosphere were the cause. But measurements revealed the resulting sulfate particles were highly acidic. 

During Zheng’s time at Georgia Tech, “She was just looking for interesting things to do,” Wang says of the former student. “And I said, maybe this is what we should do — I wanted her to look at aerosol size distributions, how large the aerosols are.” 

Zheng and Wang noticed that the size of the sulfate particles from China’s winter were much larger than those that resulted from photochemically-produced aerosols. Usually measuring 0.3 to 0.5 microns, the sulfate was closer to 1 micron in size. (A human hair is about 70 microns.) Aerosols distributed over a wider area would normally be smaller. 

“The micron-sized aerosol observations imply that sulfate particles undergo substantial growth in a sulfur trioxide-rich environment,” Wang says. Larger particles increase the risks to human health.

“When aerosols are large, more is deposited in the front part of the respiratory system but less on the end part, such as alveoli,” he adds. “When accounting for the large size of particles, total aerosol deposition in the human respiratory system is estimated to increase by 10 to 30 percent.”

Something still needs to join the chemical mix, however, so the sulfur dioxide could turn into sulfur trioxide while enlarging the resulting sulfate particles. Wang says a potential pathway involves the catalytic oxidation of sulfur dioxide to sulfuric acid by “transition metals.”

High temperatures, acidity, and water content in the exhaust can greatly accelerate catalytic sulfur dioxide oxidation “compared to that in the ambient atmosphere. It is possible that similar heterogeneous processes occurring on the hot surface of a smokestack coated with transition metals could explain the significant portion of sulfur trioxide observed in coal-fired power plant exhaust,” Wang says.

“A significant amount of sulfur trioxide is produced, either during combustion or through metal-catalyzed oxidation at elevated temperatures.”

An opportunity for cleaner-burning coal power plants

The impact of in-source formation of sulfate suggests that taking measures to cool off and remove sulfur trioxide, sulfuric acid, and particulates from the emissions of coal-combustion facilities could be a way to cut down on pollution that can cause serious health problems.

“The development and implementation of such technology will benefit nations globally, particularly those heavily reliant on coal as a primary energy source,” Wang says.

 

DOI: https://doi.org/10.1021/acs.est.3c05645

Funding: This study was funded by the National Natural Science Foundation of China (nos. 41821005 and 41977311). Yuhang Wang was supported by the National Science Foundation Atmospheric Chemistry Program. Qianru Zhang would also like to thank the China Postdoctoral Science Foundation (2022M720005) and China Scholarship Council for support. Mingming Zheng is also supported by the Fundamental Research Funds for the Central Universities, Peking University (7100604309).

Yuhang Wang

News Contact

Writer: Renay San Miguel
Communications Officer II/Science Writer
College of Sciences
404-894-5209

Editor: Jess Hunt-Ralston

Physicists Focus on Neutrinos With New Telescope

The Trinity Demonstrator telescope. (Photo Nepomuk Otte)

Georgia Tech scientists will soon have another way to search for neutrinos, those hard-to-detect, high-energy particles speeding through the cosmos that hold clues to massive particle accelerators in the universe — if researchers can find them. 

“The detection of a neutrino source or even a single neutrino at the highest energies is like finding a holy grail,” says Professor Nepomuk Otte, the principal investigator for the Trinity Demonstrator telescope, which was recently built by his group and collaborators and is designed to detect neutrinos after they interact within the Earth.

The National Science Foundation (NSF)-funded effort will eventually create “the world’s most sensitive ultra-high energy neutrino telescope.” The Trinity Demonstrator is the first step toward an array of 18 telescopes located at three sites, each on top of a high mountain. 

Earlier in the year, Otte’s group flew a neutrino telescope tethered to a massive NASA-funded balloon — though a leak brought the telescope down earlier than planned. The effort was part of the EUSO-SPB2 collaboration, which wants to study cosmic-particle accelerators with detectors in space.

“This was the first time our group had built an instrument for a balloon mission,” Otte says. “And the big question was if it would work at the boundary to space at -40F and in a vacuum. Even though we only flew 37 hours (of a 50-hour mission), we could show that our instrument worked as expected. We even accomplished some key measurements, like making a measurement of the background light, which no one has done before.”

The search for neutrinos

Otte is the second Georgia Tech physicist to lead a search for neutrinos. Professor Ignacio Taboada is the spokesperson for IceCube, an NSF neutrino observatory located at the South Pole. IceCube uses thousands of sensors buried in the ice to detect neutrinos.

Meanwhile, Trinity telescopes will be especially sensitive to higher-energy neutrinos. “With Trinity, we can potentially open a new, entirely unexplored window in astronomy,” Otte says. “IceCube gives us a couple of good pointers on what to observe. That is also why we modified the building of the Trinity Demonstrator to point toward the only two high-energy neutrino sources” already identified by IceCube scientists.

‘Cherenkov lights’ illuminate ‘air showers’

The Trinity Demonstrator is not your typical astronomy telescope. Instead of looking up into the sky, it looks at the horizon, waiting for a flash of light that lasts only tens of billionths of a second. 

That flash is the end of a chain of events that begins when a high-energy neutrino enters the Earth at a shallow angle. After penetrating Earth and traveling along a straight line for a hundred miles, the neutrino eventually interacts inside the Earth, producing a tau particle, which is like a short-lived, massive electron. 

The tau continues to travel through the Earth, and when it emerges from the ground, it decays into millions of electrons and positrons, which zip through the air. Because the electrons and positrons travel faster than light does in air, they emit Cherenkov light — the short flash the Trinity Demonstrator telescope detects. Computer algorithms then analyze the recorded Cherenkov flashes to reconstruct the energy and arrival direction of the neutrino. 
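The “faster than light in air” condition behind that flash is the standard Cherenkov threshold. As a brief textbook sketch (the refractive-index figures below are generic values for air, not from the article):

```latex
% Cherenkov emission requires the particle speed v to exceed the
% phase velocity of light in a medium with refractive index n:
\[
  v > \frac{c}{n}
  \quad\Longleftrightarrow\quad
  \beta \equiv \frac{v}{c} > \frac{1}{n},
\]
% and the light is emitted on a cone at the angle
\[
  \cos\theta_c = \frac{1}{n\beta}.
\]
% For air near sea level, n \approx 1.0003, so only particles with
% \beta > 0.9997 radiate, on a cone no wider than roughly 1.4 degrees:
% a faint, tightly beamed, nanosecond-scale flash.
```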

Otte and his team of Georgia Tech postdoctoral and graduate scholars developed and built the Trinity Demonstrator. Undergraduate students have also had significant responsibilities in designing its optics. “It is good for the students because they are involved in all aspects of the experiment. In big collaborations, you are an expert on one aspect only,” Otte says.

The largest collaboration Otte is currently involved with is the Cherenkov Telescope Array, which involves more than 2,000 researchers. That planned international project will involve 60 next-generation gamma-ray telescopes in Chile and on the Canary Island of La Palma.

Next year, Otte says he and his researchers will apply for funding to build a much bigger telescope, which will be the foundation for the NSF 18-telescope array. For now, the team is busy observing with the Trinity Demonstrator atop Frisco Peak in Utah.

“With a bit of luck, we will detect the first neutrino source at these energies,” Otte said.

Funding: National Science Foundation (NSF)

The Trinity Demonstrator team, graduate scholar Jordan Bogdan, postdoctoral scholar Mariia Fedkevych, graduate scholar Sofia Stepanoff, and Professor Nepomuk Otte.

News Contact

Writer: Renay San Miguel
Communications Officer II/Science Writer
College of Sciences
404-894-5209

Editor: Jess Hunt-Ralston

Water on the Moon May Be Forming Due to Electrons From Earth

Thom Orlando (left) and Brant Jones

Scientists have discovered that electrons from Earth may be contributing to the formation of water on the Moon’s surface. The research, published in Nature Astronomy, has the potential to impact our understanding of how water — a critical resource for life and sustained future human missions to Earth's moon — formed and continues to evolve on the lunar surface.

“Understanding how water is made on the Moon will help us understand how water was made in the early solar system and how water inevitably was brought to Earth,” says Thom Orlando, Regents' Professor in the School of Chemistry and Biochemistry with a joint appointment in the School of Physics, who played a critical role in the discovery alongside Brant Jones, a research scientist in the School of Chemistry and Biochemistry at Georgia Tech.

“Understanding water formation and transport on the Moon will help explain current and future observations,” Jones adds. “It can help predict areas with high concentrations of water that will help with mission planning and in situ resource utilization (ISRU) mining. This is absolutely necessary for sustained human presence on the Moon.”

The role of solar wind

“Water production on airless bodies is driven by a combination of solar wind, heat, ionizing radiation and meteorite impacts,” Jones explains. Solar wind — a continuous stream of protons and electrons emitted by the Sun, traveling at 250 to 500 miles per second — is widely thought to be one of the primary ways in which water has been formed on the Moon. 

While the solar wind buffets the Moon’s surface, Earth is protected by its magnetosphere, a shield created by the magnetic fields of the hot metals circulating in Earth’s molten interior. The solar wind interacts with the magnetosphere in turn, forming the northern lights (aurora borealis) and southern lights (aurora australis) at Earth’s poles and stretching the shield into a long “tail” — the magnetotail — which the Moon passes through periodically on its orbit around Earth. When the Moon is in the magnetotail region, it is temporarily shielded from the protons in the solar wind but still exposed to photons from the Sun.

"This provides a natural laboratory for studying the formation processes of lunar surface water," says University of Hawai‘i (UH) at Mānoa Assistant Researcher Shuai Li, who led the research study. "When the Moon is outside of the magnetotail, the lunar surface is bombarded with solar wind. Inside the magnetotail, there are almost no solar wind protons and water formation was expected to drop to nearly zero."

Surprisingly, while the Moon was within the magnetotail, and shielded from solar wind, the researchers found that the rate of water formation did not change. Since water was still forming in the absence of solar wind, the researchers began to theorize what could be responsible for forming the water.

Building on previous research, Orlando and Jones hypothesized that electrons from Earth could be responsible. 

A work in progress

“This work was actually based, in part, on our previous studies examining the role of ionizing radiation in metal-oxide particles present in nuclear waste storage tanks,” Orlando explains, adding that in a previous project as part of an Energy Frontier Research Center called IDREAM, they showed that water forms when a mineral called boehmite is irradiated with electrons produced by energetic particles after radioactive decay.

While boehmite does not exist on the Moon’s surface, minerals with similar compositions are present, and Orlando and Jones theorized that, like the boehmite, irradiation from electrons might be producing water on lunar surface grains. “Despite the incredibly different physical environments,” Orlando says, “the chemistry and physics is likely very similar.”

The solar wind water cycle has the potential to make huge impacts on human exploration of the Moon and beyond. “While some of these water molecules will be destroyed by the Sun, some will eventually make it to the cold spots in permanently shadowed regions at higher latitudes,” Orlando says, in “the regions where some of the planned Artemis landings will be.” The next step? “Our hope is to prove that the hypothesis is correct!”

Orlando and Jones have been studying the role of solar wind on the in situ production of water on the Moon, Mercury, and other airless bodies as part of the NASA Solar System Exploration Research Virtual Institute (SSERVI) center called Radiation Effects on Volatile Exploration of Asteroids and Lunar Surfaces (REVEALS). 

The realization that electrons from Earth were part of the dynamic lunar water cycle was a direct result of the interactions of several scientists with diverse backgrounds made possible by the SSERVI support. The work, which will further expand on the solar wind water cycle — including other sources of energy beyond surface temperature like meteorite and electron impacts — will continue in a new Georgia Tech-led SSERVI program, the Center for Lunar Environment and Volatile Exploration Research (CLEVER).

News Contact

Written by Selena Langner
Editor: Jess Hunt-Ralston

Digging Into Greenland Ice: Unraveling Mysteries in Earth's Harshest Environments

The team snowmobiling to a remote field site.

“You're in the middle of an ice sheet, and it’s one of the most desolate places on Earth. There are no living animals there. There are no plants there. The only animals you see are birds. They might be lost.”

That’s how Rachel Moore describes the view from the top of the Greenland ice sheet. “It's a really challenging environment, but it was really, really interesting to be there. I was there for nearly 50 days.”

Moore is an expert at collecting data in difficult research environments, traveling to some of the most extreme places on Earth to study microbes and the hints they might offer for astrobiology. 

“It all started in grad school, when I joined a microbial ecology lab,” Moore recalls. “I pretty quickly learned that I love to do really difficult, challenging projects. I got interested in working around fire, biomass burning and forests, and I started collecting bacteria from the air. That was a challenge in and of itself, just trying to collect these really tiny things while standing in the smoke from the forest fires. But from that I learned that I loved to go out into the environment and collect things and try to understand everything around me.”

“I have a lot of different projects, but they all connect through astrobiology,” Moore says. “I’m interested in anything that hasn't been answered yet.” Moore is also leading a project called EXO Methane, which is investigating whether different archaea could survive in Martian and Enceladus-like environments. She’s also collaborating on a project that will send a probe to Venus next year.

Moore started her postdoctoral research at Georgia Tech, and is now continuing her work as a research scientist in the same laboratory. “The first project I started in this lab focused around how microbes can survive a really, really dry environment,” she adds. To study this, Moore traveled to the Atacama Desert in Chile — the driest place on Earth, and also one of the best analogs to the surface of Mars. “What we were interested in there is how organisms survive intense radiation and intense desiccation. And how does that change as you look at different sites in the Atacama?”

Then, this past summer, Moore traveled to another extreme environment — Greenland. “Instead of being hot and dry, Greenland is extremely cold and dry,” Moore explains. “So it was similar in some aspects, but completely different in terms of logistics and sampling methods. Because we were there in the summer, the sun never set. We were also at high elevation — 10,530 feet above sea level.”

Beneath the ice

The project was started by Nathan Chellman and Joe McConnell from the Desert Research Institute (DRI), and Moore’s role this year was to investigate the microbiology component of the research. “They had been seeing some anomalies in methane and carbon monoxide in ice samples,” Moore says. “We were curious if microbes might be producing some of this, either in the ice core after it’s been sampled, or while it’s still in the glacier.”

“The microbes would not be swimming around or anything” in the ice cores, Moore explains, “but it’s possible that their metabolism is still active, and they’re potentially able to make some of the gases, like methane, in this frozen environment. Our goal was to measure these things in the environment.”

Gathering samples wasn’t easy. “We set up a lab on the glacier, and we set it up in a trench to try to keep any of the ice cores that we pulled out roughly at the same temperature as the glacier itself,” Moore says. Because of that, “weather was a huge, huge thing. Anytime it would get stormy, the wind would blow all of the snow around, and it would fill the entrance to our trench. We had to dig ourselves out several times. People would put out flags so that you could see your way back to the main house or back to your dorms.”

The team hopes that this research will give a more defined record of the past from the Greenland ice sheet, improving climate change predictions. Moore also notes applications in astrobiology, adding that “there are a lot of icy worlds like Mars, Enceladus, and Europa, with either an icy crust over the ocean or glaciers on the northern and southern poles.”

Moore was also able to test new technology in the field, using a tool built by Georgia Tech undergraduates alongside her advisor Christopher Carr, assistant professor in the School of Earth and Atmospheric Sciences. An ice melter that can be used to take and clean ice samples, the tool is a miniaturized prototype that may be able to help take measurements on Mars, or in similar remote environments in the future.

“Being able to take a tool that Georgia Tech undergraduates made to Greenland and test it on 600-year-old ice in the field was a really cool experience,” Moore adds. “We brought Starlink with us, and so I was able to video call the undergraduate team while I was testing their tool, which was really special.”

The team is now lab-analyzing ice cores that they brought back from Greenland, unraveling which microbes might be present and potentially active. “It's really interesting to see: Is this all chemistry? Is it biology based? Or is there some intersection of the two?” Moore says. “Maybe there's some chemistry or photochemistry happening, plus some biology happening. Whatever it is, we'll have to wait and see.”

Moore stands inside a small space, wearing a mask.
Left to right, PhD student Benjamin Riddell-Young, Nathan Chellman, and Rachel Moore holding an ice core at a remote field site.
Moore at the research station in Greenland.
Moore pictured on her birthday, holding the final ice core.
Nathan Chellman walking into the research trench over drifted snow.
The collected boxes of ice cores.
The team's remote field site.
The research team in Greenland.
The team standing in the research trench.
News Contact

Written by Selena Langner
Editor: Jess Hunt-Ralston

AI/ML Conference Helps School of Physics Launch New Research Initiative

Physicists from around the country come to Georgia Tech for a recent machine learning conference. (Photo Benjamin Zhao)

The School of Physics’ new initiative to catalyze research using artificial intelligence (AI) and machine learning (ML) began October 16 with a conference at the Global Learning Center titled Revolutionizing Physics — Exploring Connections Between Physics and Machine Learning.

AI and ML have the spotlight right now in science, and the conference promises to be the first of many, says Feryal Özel, Professor and Chair of the School of Physics. 

"We were delighted to host the AI/ML in Physics conference and see the exciting rapid developments in this field,” Özel says. “The conference was a prominent launching point for the new AI/ML initiative we are starting in the School of Physics."​ 

That initiative includes hiring two tenure-track faculty members, who will benefit from substantial expertise and resources in artificial intelligence and machine learning that already exist in the Colleges of Sciences, Engineering, and Computing.

The conference attendees heard from colleagues about how the technologies were helping with research involving exoplanet searches, plasma physics experiments, and combing through terabytes of data. They also learned that a rough search of keyword titles by Andreas Berlind, director of the National Science Foundation’s Division of Astronomical Sciences, showed that about a fifth of all current NSF grant proposals include artificial intelligence and machine learning components.

“That’s a lot,” Berlind told the audience. “It’s doubled in the last four years. It’s rapidly increasing.”

Berlind was one of three program officers from the NSF and NASA invited to the conference to give presentations on the funding landscape for AI/ML research in the physical sciences. 

“It’s tool development, the oldest story in human history,” said Germano Iannacchione, director of the NSF’s Division of Materials Research, who added that AI/ML tools “help us navigate very complex spaces — to augment and enhance our reasoning capabilities, and our pattern recognition capabilities.”

That sentiment was echoed by Dimitrios Psaltis, School of Physics professor and a co-organizer of the conference. 

“They usually say if you have a hammer, you see everything as a nail,” Psaltis said. “Just because we have a tool doesn't mean we're going to solve all the problems. So we're in the exploratory phase because we don't know yet which problems in physics machine learning will help us solve. Clearly it will help us solve some problems, because it's a brand new tool, and there are other instances when it will make zero contribution. And until we find out what those problems are, we're going to just explore everything.”

That means trying to find out if there is a place for the technologies in classical and modern physics, quantum mechanics, thermodynamics, optics, geophysics, cosmology, particle physics, and astrophysics, to name just a few branches of study.

Sanaz Vahidinia of NASA’s Astronomy and Astrophysics Research Grants told the attendees that her division was an early and enthusiastic adopter of AI and machine learning. She listed examples of the technologies assisting with gamma-ray astronomy and analyzing data from the Hubble and Kepler space telescopes. “AI and deep learning were very good at identifying patterns in Kepler data,” Vahidinia said. 

Several of the physicists’ presentations at the conference showcased pattern recognition and other capabilities of AI and ML. 

Alves’s presentation inspired another physicist attending the conference, Psaltis said. “One of our local colleagues, who's doing magnetic materials research, said, ‘Hey, I can apply the exact same thing in my field,’ which he had never thought about before. So we not only have cross-fertilization (of ideas) at the conference, but we’re also learning what works and what doesn't.”

More information on funding and grants at the National Science Foundation can be found here. Information on NASA grants is found here.

School of Physics Professor Tamara Bogdanovic prepares to ask a question at the recent machine learning conference at Georgia Tech. (Photo Benjamin Zhao)

Matthew Golden, graduate student researcher in the School of Physics, presents at a recent machine learning conference at Georgia Tech. (Photo Benjamin Zhao)

News Contact

Writer: Renay San Miguel
Communications Officer II/Science Writer
College of Sciences
404-894-5209

Editor: Jess Hunt-Ralston

IDEaS Awards Grants and Cyberinfrastructure Resources for Thematic Programs and Research in AI

3D Graphic of a Server Room

In keeping with a strong strategic focus on AI for the 2023–2024 academic year, the Institute for Data Engineering and Science (IDEaS) has announced the winners of its 2023 Seed Grants for Thematic Events in AI and its Cyberinfrastructure Resource Grants, which support AI research requiring secure, high-performance computing capabilities. Thematic event award recipients will receive $8K to support their proposed workshop or series. Cyberinfrastructure winners will receive research support consisting of 600,000 CPU hours on the AMD Genoa server, 36,000 hours of NVIDIA DGX H100 GPU server usage, and 172 TB of secure storage.

Congratulations to the award winners listed below!

Thematic Events in AI Awards

Proposed Workshop: “Foundation of scientific AI (Artificial Intelligence) for Optimization of Complex Systems”
Primary PI: Raphael Pestourie, Assistant Professor, School of Computational Science and Engineering
Secondary PI: Peng Chen, Assistant Professor, School of Computational Science and Engineering

Proposed Series: “Guest Lecture Seminar Series on Generative Art and Music”
Primary PI: Gil Weinberg, Professor, School of Music

Cyberinfrastructure Resource Awards

Title: Human-in-the-Loop Musical Audio Source Separation
Topics: Music Informatics, Machine Learning
Primary PI: Alexander Lerch, Associate Professor, School of Music
Co-PIs: Karn Watcharasupat, Music Informatics Group | Yiwei Ding, Music Informatics Group | Pavan Seshadri, Music Informatics Group

Title: Towards A Multi-Species, Multi-Region Foundation Model for Neuroscience
Topics: Data-Centric AI, Neuroscience
Primary PI: Eva Dyer, Assistant Professor, Biomedical Engineering

Title: Multi-point Optimization for Building Sustainable Deep Learning Infrastructure
Topics: Energy Efficient Computing, Deep Learning, AI Systems Optimization
Primary PI: Divya Mahajan, Assistant Professor, School of Electrical and Computer Engineering, School of Computer Science

Title: Neutrons for Precision Tests of the Standard Model
Topics: Nuclear/Particle Physics, Computational Physics
Primary PI: Aaron Jezghani, OIT-PACE

Title: Continual Pretraining for Egocentric Video
Primary PI: Zsolt Kira, Assistant Professor, School of Interactive Computing
Co-PI: Shaunak Halbe, Ph.D. Student, Machine Learning

Title: Training More Trustworthy LLMs for Scientific Discovery via Debating and Tool Use
Topics: Trustworthy AI, Large-Language Models, Multi-Agent Systems, AI Optimization
Primary PIs: Chao Zhang, School of Computational Science and Engineering & Bo Dai, College of Computing

Title: Scaling up Foundation AI-based Protein Function Prediction with IDEaS Cyberinfrastructure
Topics: AI, Biology
Primary PI: Yunan Luo, Assistant Professor, School of Computational Science and Engineering        

News Contact

Christa M. Ernst - Research Communications Program Manager
Robotics | Data Engineering | Neuroengineering