Study Reveals Wintertime Formation of Large Pollution Particles in China’s Skies

Beijing pollution (Photo Kevin Dooley, Creative Commons)

Previous studies have found that the particles that float in the haze over the skies of Beijing include sulfate, a major source of outdoor air pollution that damages lungs and aggravates existing asthmatic symptoms, according to the California Air Resources Board.

Sulfates usually are produced by atmospheric oxidation in the summer, when ample sunlight facilitates the oxidation that turns sulfur dioxide into dangerous aerosol particles. How is it that China can produce such extreme pollution loaded with sulfates in the winter, when there’s not as much sunlight and atmospheric oxidation is slow?

Yuhang Wang, professor in the School of Earth and Atmospheric Sciences at Georgia Tech, and his research team have conducted a study that may have the answer: the chemical reactions that turn sulfur dioxide into sulfur trioxide, and then quickly into sulfate, happen primarily within the smoke plumes causing the pollution. That process not only creates sulfates in the winter in China, but it also happens faster and results in larger sulfate particles in the atmosphere.

“We call the source ‘in-source formation,’” Wang says. “Instead of having oxidants spread out in the atmosphere, slowly oxidizing sulfur dioxide into sulfur trioxide to produce sulfate, we have this concentrated production in the exhaust plumes that turns the sulfuric acid into large sulfate particles. And that's why we're seeing these large sulfate particles in China.”

The findings of in-source formation of larger wintertime sulfate particles in China could help scientists more accurately assess the impacts of aerosols on radiative forcing — the change aerosols impose on the Earth’s energy and heat balances, a key driver of climate change and global warming — and on health, where larger aerosols mean larger deposits in human lungs.

“Wintertime Formation of Large Sulfate Particles in China and Implications for Human Health” is published in Environmental Science & Technology, an American Chemical Society publication. The co-authors include Qianru Zhang of Peking University and Mingming Zheng of Wuhan Polytechnic University, two of Wang’s former students who conducted the research while at Georgia Tech.

Explaining a historic smog

China still burns a lot of coal in power plants because it costs less than natural gas, Wang says. That reliance on coal also invites an easy comparison between China’s hazy winters and a historic event that focused the United Kingdom’s attention on dangerous environmental hazards — the Great London Smog.

The event, depicted in the Netflix show “The Crown,” saw severe smog descend on London in December 1952. Unusually cold weather preceding the event trapped the coal-produced haze at ground level. UK officials later said the Great London Smog (also called the Great London Fog) was responsible for 4,000 deaths and 100,000 illnesses, although later studies estimated a higher death toll of 10,000 to 20,000.

“From the days of the London Fog to extreme winter pollution in China, it has been a challenge to explain how sulfate is produced in the winter,” Wang says. 

Wang and his team decided to take on that challenge. 

Aerosol size and heavy metal influence?

The high sulfate levels in China, notably in January 2013, defied conventional explanations based on standard photochemical oxidation. It was thought that nitrogen dioxide or other mild oxidants found in alkaline or neutral particles in the atmosphere were the cause. But measurements revealed the resulting sulfate particles were highly acidic.

During Zheng’s time at Georgia Tech, “She was just looking for interesting things to do,” Wang says of the former student. “And I said, maybe this is what we should do — I wanted her to look at aerosol size distributions, how large the aerosols are.” 

Zheng and Wang noticed that the sulfate particles from China’s winters were much larger than photochemically produced aerosols. While the latter usually measure 0.3 to 0.5 microns, the wintertime sulfate was closer to 1 micron in size. (A human hair is about 70 microns wide.) Aerosols distributed over a wider area would normally be smaller.

“The micron-sized aerosol observations imply that sulfate particles undergo substantial growth in a sulfur trioxide-rich environment,” Wang says. Larger particles increase the risks to human health.

“When aerosols are large, more is deposited in the front part of the respiratory system but less on the end part, such as alveoli,” he adds. “When accounting for the large size of particles, total aerosol deposition in the human respiratory system is estimated to increase by 10 to 30 percent.”

Something still needs to join the chemical mix, however, for the sulfur dioxide to turn into sulfur trioxide while the resulting sulfate particles grow. Wang says a potential pathway involves the catalytic oxidation of sulfur dioxide to sulfuric acid by “transition metals.”

High temperatures, acidity, and water content in the exhaust can greatly accelerate catalytic sulfur dioxide oxidation “compared to that in the ambient atmosphere. It is possible that similar heterogeneous processes occurring on the hot surface of a smokestack coated with transition metals could explain the significant portion of sulfur trioxide observed in coal-fired power plant exhaust,” Wang says.

“A significant amount of sulfur trioxide is produced, either during combustion or through metal-catalyzed oxidation at elevated temperatures.”
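
Schematically, the pathway Wang describes reduces to a familiar pair of reactions (a textbook rendering for illustration, not notation taken from the paper itself):

\[
\mathrm{SO_2} + \tfrac{1}{2}\,\mathrm{O_2} \xrightarrow{\ \text{metal catalysis, high temperature}\ } \mathrm{SO_3}
\]
\[
\mathrm{SO_3} + \mathrm{H_2O} \rightarrow \mathrm{H_2SO_4} \rightarrow \text{condensation and growth into large sulfate particles}
\]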

An opportunity for cleaner-burning coal power plants

The impact of in-source formation of sulfate suggests that taking measures to cool off and remove sulfur trioxide, sulfuric acid, and particulates from the emissions of coal-combustion facilities could be a way to cut down on pollution that can cause serious health problems.

“The development and implementation of such technology will benefit nations globally, particularly those heavily reliant on coal as a primary energy source,” Wang says.

 

DOI: https://doi.org/10.1021/acs.est.3c05645

Funding: This study was funded by the National Natural Science Foundation of China (nos. 41821005 and 41977311). Yuhang Wang was supported by the National Science Foundation Atmospheric Chemistry Program. Qianru Zhang would also like to thank the China Postdoctoral Science Foundation (2022M720005) and China Scholarship Council for support. Mingming Zheng is also supported by the Fundamental Research Funds for the Central Universities, Peking University (7100604309).

 

Yuhang Wang

News Contact

Writer: Renay San Miguel
Communications Officer II/Science Writer
College of Sciences
404-894-5209

Editor: Jess Hunt-Ralston

 

Three Earth and Atmospheric Sciences Researchers Awarded DOE Earthshot Funding for Carbon Removal Strategies

Earth (Credit NASA/Joshua Stevens)

Three Georgia Tech School of Earth and Atmospheric Sciences researchers — Professor and Associate Chair Annalisa Bracco, Professor Taka Ito, and Georgia Power Chair and Associate Professor Chris Reinhard — will join colleagues from Princeton, Texas A&M, and Yale University for an $8 million Department of Energy (DOE) grant that will build an “end-to-end framework” for studying the impact of carbon dioxide removal efforts across land, rivers, and seas.

The proposal is one of 29 DOE Energy Earthshot Initiatives projects recently granted funding, and among several led by and involving Georgia Tech investigators across the Sciences and Engineering.

Overall, DOE is investing $264 million to develop solutions for the scientific challenges underlying the Energy Earthshot goals. The 29 projects also include establishing 11 Energy Earthshot Research Centers led by DOE National Laboratories. 

The Energy Earthshots connect the Department of Energy's basic science and energy technology offices to accelerate breakthroughs towards more abundant, affordable, and reliable clean energy solutions — seeking to revolutionize many sectors across the U.S., and relying on fundamental science and innovative technology to be successful.

Carbon Dioxide Removal 

The School of Earth and Atmospheric Sciences project, “Carbon Dioxide Removal and High-Performance Computing: Planetary Boundaries of Earth Shots,” is part of the agency’s Science Foundations for the Energy Earthshots program. Its goal is to create a publicly accessible computer modeling system that will track progress in two key carbon dioxide removal (CDR) processes: enhanced earth weathering and global ocean alkalinization.

In enhanced earth weathering, crushed minerals like basalt are spread on land, where they react with carbon dioxide dissolved in rainwater to form bicarbonate. Rivers wash that bicarbonate into the ocean, where the carbon is ultimately stored on the ocean floor. If used at scale, these nature-based climate solutions could remove atmospheric carbon dioxide and alleviate ocean acidification.
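
As a textbook illustration of that chemistry (not an equation from the study itself), the weathering of a calcium silicate mineral of the kind abundant in basalt can be written:

\[
\mathrm{CaSiO_3} + 2\,\mathrm{CO_2} + 3\,\mathrm{H_2O} \rightarrow \mathrm{Ca^{2+}} + 2\,\mathrm{HCO_3^{-}} + \mathrm{H_4SiO_4}
\]

The bicarbonate ions on the right-hand side are what rivers carry to the sea.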

The research team notes that there is currently “no end-to-end framework to assess the impacts of enhanced weathering or ocean alkalinity enhancement — which are likely to be pursued at the same time.” 

 “The proposal is for a three-year effort, but our hope is that the foundation we lay down in that time will represent a major step forward in our ability to track carbon from land to sea,” says Reinhard, the Georgia Power Chair who is a co-investigator on the grant. 

“Like many folks interested in better understanding how climate interventions might impact the Earth system across scales, we are in some ways building the plane in midair,” he adds. “We need to develop and validate the individual pieces of the system — soils, rivers, the coastal ocean — but also wire them up and prove from observations on the ground how a fully integrated model works.”

That will involve the use of several existing computer models, along with Georgia Tech’s PACE supercomputers, Professor Ito explains. “We will use these models as a tool to better understand how the added alkalinity, carbon and weathering byproducts from the soils and rivers will eventually affect the cycling of nutrients, alkalinity, carbon and associated ecological processes in the ocean,” Ito adds. “After the model passes the quality check and we have confidence in our output, we can start to ask many questions about assessment of different carbon sequestration approaches or downstream impacts on ecosystem processes.”

Professor Bracco, whose recent research has focused on rising ocean heat levels, says CDR is needed just to keep ocean systems from warming by about 2 degrees Celsius.

“Ninety percent of the excess heat caused by greenhouse gas emissions is in the oceans,” Bracco shares, “and even if we stop emitting altogether tomorrow, that change we imprinted will continue to impact the climate system for many hundreds of years to come. So in terms of ocean heat, CDRs will help in not making the problem worse, but we will not see an immediate cooling effect on ocean temperatures. Stabilizing them, however, would be very important.”

Bracco and co-investigators will study the soil-river-ocean enhanced weathering pipeline “because it’s definitely cheaper and closer to scale-up.” Reverse weathering can also happen on the ocean floor, where new clays form chemically from seawater and marine sediments, a process that also involves CO2. “The cost, however, is higher at the moment. Anything that has to be done in the ocean requires ships and oil to begin,” she adds.

Reinhard hopes any tools developed for the DOE project would be used by farmers and other land managers to make informed decisions on how and when to manage their soil, while giving them data on the downstream impacts of those practices.

“One of our key goals will also be to combine our data from our model pipeline with historical observational data from the Mississippi watershed and the Gulf of Mexico,” Reinhard says. “This will give us some powerful new insights into the impacts large-scale agriculture in the U.S. has had over the last half-century, and will hopefully allow us to accurately predict how business-as-usual practices and modified approaches will play out across scales.”

(From left) Annalisa Bracco, Taka Ito, Chris Reinhard

News Contact

Writer: Renay San Miguel
Communications Officer II/Science Writer
College of Sciences
404-894-5209

Editor: Jess Hunt-Ralston

 

NSF RAPID Response to Earthquakes in Turkey

Grad student Phuc Mach places a node

In February, a major earthquake event devastated the south-central region of the Republic of Türkiye (Turkey) and northwestern Syria. Two earthquakes, one magnitude 7.8 and one magnitude 7.5, occurred nine hours apart, centered near the heavily populated city of Gaziantep. The combined rupture length of the two events was up to 250 miles. The president of Turkey has called it the “disaster of the century,” and the threat is still not over — aftershocks could still affect the region.

Now, Zhigang Peng, a professor in the School of Earth and Atmospheric Sciences at Georgia Tech, and graduate students Phuc Mach and Chang Ding, alongside researchers at the Scientific and Technological Research Institution of Türkiye (TÜBİTAK) and at the University of Missouri, are using small seismic sensors to better understand just how, why, and when these earthquakes are occurring.

Funded by an NSF RAPID grant, the project is unique in that it aims to actively respond to the crisis while it’s still happening. National Science Foundation (NSF) Rapid Response Research (RAPID) grants are used when there is a severe urgency with regard to availability of or access to data, facilities or specialized equipment, including quick-response research on natural or anthropogenic disasters and other similar unanticipated events.

In an effort to better map the aftershocks of the earthquake event — which can occur weeks or months after the main event — the team placed approximately 120 small sensors, called nodes, in the East Anatolian fault region this past May. Their deployment continues through the summer. 

It’s the first time sensors like this have been deployed in Turkey, says Peng.

“These sensors are unique in that they can be placed easily and efficiently," he explains. "With internal batteries that can work up to one month when fully charged, they’re buried in the ground and can be deployed within minutes, while most other seismic sensors need solar panels or other power sources and take much longer time and space to deploy.” Each node is about the size of a 2-liter soda bottle, and can measure ground movement in three directions.

“The primary reason we’re deploying these sensors quickly following the two mainshocks is to study the physical mechanisms of how earthquakes trigger each other,” Peng adds. A mainshock is the largest earthquake in a sequence. “We’ll use advanced techniques such as machine learning to detect and locate thousands of small aftershocks recorded by this network. These newly identified events can provide new important clues on how aftershocks evolve in space and time, and what drives foreshocks that occur before large events.”
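
For readers curious what automated aftershock detection looks like in practice, here is a minimal sketch using a classic STA/LTA trigger with the open-source ObsPy library — a deliberately simple stand-in for the machine-learning detectors Peng describes, with a hypothetical file name:

```python
# Minimal aftershock-detection sketch: a classic short-term-average /
# long-term-average (STA/LTA) trigger on one nodal recording.
# Illustrative only -- the team's pipeline uses machine-learning detectors.
from obspy import read
from obspy.signal.trigger import classic_sta_lta, trigger_onset

st = read("node_station.mseed")                    # hypothetical node data file
tr = st[0]
tr.filter("bandpass", freqmin=2.0, freqmax=20.0)   # typical local-earthquake band

df = tr.stats.sampling_rate
cft = classic_sta_lta(tr.data, int(1 * df), int(10 * df))  # 1 s STA, 10 s LTA

# Declare an event when STA/LTA exceeds 3.5; end it when the ratio drops below 0.5.
for on, off in trigger_onset(cft, 3.5, 0.5):
    t_on = tr.stats.starttime + on / df
    print(f"candidate aftershock at {t_on}, duration {(off - on) / df:.1f} s")
```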

Unearthing fault mechanisms

The team will also use the detected aftershocks to illuminate active faults where three tectonic plates come together — a region known as the Maraş Triple Junction. “We plan to use the aftershock locations and the seismic waves from recorded events to image subsurface structures where large damaging earthquakes occur,” says Mach, the Georgia Tech graduate researcher. This will help scientists better understand why sometimes faults ‘creep’ without any large events, while in other cases faults lock and then violently release elastic energy, creating powerful earthquakes.

Getting high-resolution data of the fault structures is another priority. “The fault line ruptured in the first magnitude 7.8 event has a bend in it, where earthquake activity typically terminates, but the earthquake rupture moved through this bend, which is highly unusual,” Peng says. By deploying additional ultra-dense arrays of sensors in their upcoming trip this summer, the team hopes to help researchers ‘see’ the bend under the Earth’s surface, allowing them to better understand how fault properties control earthquake rupture propagation.  

The team also aims to learn more about the relationship between the two main shocks that recently rocked Turkey, sometimes called doublet events. Doublets can happen when the initial earthquake triggers a secondary earthquake by adding extra stress. While in this instance the second shock followed only nine hours after the first, such secondary earthquakes have been known to take place days, months, or even years after the initial one — a famous example being the sequence of earthquakes that spanned 60 years in the North Anatolian fault region in northern Turkey.

“Clearly the two main shocks in 2023 are related, but it is still not clear how to explain the time delays,” says Peng. The team plans to work with their collaborators at TÜBİTAK to re-analyze seismic and other types of geophysical data right before and after those two main shocks in order to better understand the triggering mechanisms.

“In our most recent trip in southern Türkiye, we saw numerous buildings that were partially damaged during the mainshock, and many people will have to live in temporary shelters for years during the rebuilding process,” Peng adds. “While we cannot stop earthquakes from happening in tectonically active regions, we hope that our seismic deployment and subsequent research on earthquake triggering and fault imaging can improve our ability to predict what will happen next — before and after a big one — and could save countless lives.”

 

Grad student Phuc Mach holds a node
Members of the team in the field in Turkey
Georgia Tech graduate student Chang Ding pointing at a deployed seismic node in Southern Turkey
A nodal seismic station deployed by a TUBITAK scientist in Southern Turkey
Georgia Tech graduate student Chang Ding posing with a local villager at a seismic site in Southern Turkey
Georgia Tech scientist Zhigang Peng posing with TUBITAK scientist Ekrem Zor right in front of a possible surface rupture produced by the 2023 magnitude 7.8 earthquake
Researchers from Georgia Tech, Univ. of Missouri and TUBITAK before heading to the field on May 1st, 2023
News Contact

Written By:
Selena Langner

Media Contact:
Jess Hunt-Ralston

Gauging Glaciers: Alex Robel Awarded NSF CAREER Grant for New Ice Melt Modeling Tool

A stylized glacier (Selena Langner)

Alex Robel is improving how computer models of melting ice sheets incorporate data from field expeditions and satellites by creating a new open-access software package — complete with state-of-the-art tools and paired with ice sheet models that anyone can use, even on a laptop or home computer.

Improving these models is critical: while melting ice sheets and glaciers are top contributors to sea level rise, there are still large uncertainties in sea level projections for 2100 and beyond.

“Part of the problem is that the way that many models have been coded in the past has not been conducive to using these kinds of tools,” Robel, an assistant professor in the School of Earth and Atmospheric Sciences, explains. “It's just very labor-intensive to set up these data assimilation tools — it usually involves someone refactoring the code over several years.”

“Our goal is to provide a tool that anyone in the field can use very easily without a lot of labor at the front end,” Robel says. “This project is really focused around developing the computational tools to make it easier for people who use ice sheet models to incorporate or inform them with the widest possible range of measurements from the ground, aircraft and satellites.”

Now, a $780,000 NSF CAREER grant will help him to do so. 

The National Science Foundation Faculty Early Career Development Award is a five-year funding mechanism designed to help promising researchers establish a personal foundation for a lifetime of leadership in their field. Known as CAREER awards, the grants are NSF’s most prestigious funding for untenured assistant professors.

“Ultimately,” Robel says, “this project will empower more people in the community to use these models and to use these models together with the observations that they're taking.”
 

Ice sheets remember

“Largely, what models do right now is they look at one point in time, and they try their best — at that one point in time — to get the model to match some types of observations as closely as possible,” Robel explains. “From there, they let the computer model simulate what it thinks that ice sheet will do in the future.”

In doing so, the models often assume that the ice sheet starts in a state of balance, neither gaining nor losing ice at the start of the simulation. The problem with this approach is that ice sheets change dynamically, responding to past events — even ones that happened centuries ago. “We know from models and from decades of theory that the natural response time scale of thick ice sheets is hundreds to thousands of years,” Robel adds.

By informing models with historical records, observations, and measurements, Robel hopes to improve their accuracy. “We have observations being made by satellites, aircraft, and field expeditions,” says Robel. “We also have historical accounts, and can go even further back in time by looking at geological observations or ice cores. These can tell us about the long history of ice sheets and how they've changed over hundreds or thousands of years.”

Robel’s team plans to use a set of techniques called data assimilation to adjust, or ‘nudge’, models. “These data assimilation techniques have been around for a really long time,” Robel explains. “For example, they’re critical to weather forecasting: every weather forecast that you see on your phone was ultimately the product of a weather model that used data assimilation to take many observations and apply them to a model simulation.”
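
To make the idea of “nudging” concrete, here is a toy sketch in Python (our illustration, not the project’s software): a simple thinning-ice model whose biased state is relaxed toward sparse, noisy observations.

```python
import numpy as np

# Toy nudging data assimilation: a biased ice-thickness model is relaxed
# toward sparse, noisy observations. Illustrative only; the project targets
# full ice sheet models, not this scalar example.
rng = np.random.default_rng(0)

n_steps, dt = 200, 1.0
truth = 1000.0 - 0.5 * dt * np.arange(n_steps)   # "true" thickness, thinning 0.5 m/step

model = np.empty(n_steps)
model[0] = 1100.0                                # model starts with a 100 m bias
obs_every, obs_noise, gain = 20, 5.0, 0.5

for k in range(1, n_steps):
    model[k] = model[k - 1] - 0.5 * dt           # same dynamics as the truth
    if k % obs_every == 0:                       # an observation arrives
        y = truth[k] + rng.normal(0.0, obs_noise)
        model[k] += gain * (y - model[k])        # nudge the state toward the data

print("error without nudging: 100.0 m (the initial bias never decays)")
print(f"error with nudging:    {abs(model[-1] - truth[-1]):.1f} m")
```

Each observation pulls the model partway toward the data, so the initial bias shrinks geometrically instead of persisting through the whole simulation — the same principle, at vastly larger scale, that weather forecasting relies on.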

“The next part of the project is going to be incorporating this data assimilation capability into a cloud-based computational ice sheet model,” Robel says. “We are planning to build an open source software package in Python that can use this sort of data assimilation method with any kind of ice sheet model.”

Robel hopes it will expand accessibility. “Currently, it's very labor-intensive to set up these data assimilation tools, and while groups have done it, it usually involves someone re-coding and refactoring the code over several years.”

Building software for accessibility

Robel’s team will then apply their software package to a widely used model, which now has an online, browser-based version. “The reason why that is particularly useful is because the place where this model is running is also one of the largest community repositories for data in our field,” Robel says.

Called Ghub, this relatively new repository is designed to be a community-wide place for sharing data on glaciers and ice sheets. “Since this is also a place where the model is living, by adding this capability to this cloud-based model, we'll be able to directly use the data that's already living in the same place that the model is,” Robel explains. 

Users won’t need to download data, or have a high-speed computer to access and use the data or model. Researchers collecting data will be able to upload their data to the repository, and immediately see the impact of their observations on future ice sheet melt simulations. Field researchers could use the model to optimize their long-term research plans by seeing where collecting new data might be most critical for refining predictions.

“We really think that it is critical for everyone who's doing modeling of ice sheets to be doing this transient data simulation to make sure that our simulations across the field are all doing the best possible job to reproduce and match observations,” Robel says. While in the past, the time and labor involved in setting up the tools has been a barrier, “developing this particular tool will allow us to bring transient data assimilation to essentially the whole field.”

Bringing Real Data to Georgia’s K-12 Classrooms

The tool’s broad applications and user base extend beyond the scientific community, and Robel is already developing a K-12 curriculum on sea level rise, in partnership with Georgia Tech CEISMC Researcher Jayma Koval. “The students analyze data from real tide gauges and use them to learn about statistics, while also learning about sea level rise using real data,” he explains.

Because the curriculum matches with state standards, teachers can download the curriculum, which is available for free online in partnership with the Southeast Coastal Ocean Observing Regional Association (SECOORA), and incorporate it into their preexisting lesson plans. “We worked with SECOORA to pilot a middle school curriculum in Atlanta and Savannah, and one of the things that we saw was that there are a lot of teachers outside of middle school who are requesting and downloading the curriculum because they want to teach their students about sea level rise, in particular in coastal areas,” Robel adds.

In Georgia, many high schools offer a data science class that is part of the state’s computer science standards. “Now, we are partnering with a high school teacher to develop a second standards-aligned curriculum that is meant to be taught ideally in a data science class, computer class or statistics class,” Robel says. “It can be taught as a module within that class and it will be the more advanced version of the middle school sea level curriculum.”

The curriculum will guide students through using data analysis tools and coding to analyze real sea level data sets, while learning the science behind what causes variations in sea level, what causes sea level rise, and how to predict sea level changes.
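
For a flavor of the kind of exercise such a module might contain (our illustration, not the actual SECOORA curriculum), fitting a linear trend to tide-gauge data takes only a few lines of Python:

```python
import numpy as np

# Classroom-style example: estimate a sea level trend from synthetic
# monthly tide-gauge anomalies. The 3 mm/yr rate is an illustrative figure.
rng = np.random.default_rng(42)

years = np.arange(1990, 2024, 1 / 12)                                 # monthly samples
sea_level = 3.0 * (years - years[0]) + rng.normal(0, 20, years.size)  # anomalies in mm

slope, intercept = np.polyfit(years, sea_level, 1)                    # least-squares line
print(f"estimated trend: {slope:.2f} mm/yr")
```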

“That gets students to think about computational modeling and how computational modeling is an important part of their lives, whether it's to get a weather forecast or play a computer game,” Robel adds. “Our goal is to get students to imagine how all these things are combined, while thinking about the way that we project future sea level rise.”

 

Alex Robel (Credit: Allison Carter)
News Contact

Written by Selena Langner

Contact: Jess Hunt-Ralston

The Oceans Are Missing Their Rivers

In a rhythm that’s pulsed through epochs, a river’s plume carries sediment and nutrients from the continental interior into the ocean, a major exchange of resources from land to sea. More than 6,000 rivers worldwide surge freshwater into oceans, delivering nutrients, including nitrogen and phosphorus, that feed phytoplankton, generating a bloom of life that in turn feeds progressively larger creatures. They may even influence ocean currents in ways researchers are just starting to understand. But today, in rivers around the world, humans are altering this critical phenomenon.