New Method Uses Collisions to Break Down Plastic for Sustainable Recycling

The high impact between the metal balls in a ball mill reactor and the polymer surface is sufficient to momentarily liquefy the polymer and facilitate chemical reactions.

While plastics help enable modern standards of living, their accumulation in landfills and the overall environment continues to grow as a global concern.

Polyethylene terephthalate (PET) is one of the world’s most widely used plastics, with tens of millions of tons produced annually in the production of bottles, food packaging, and clothing fibers. The durability that makes PET so useful also means that it is more difficult to recycle efficiently.

Now, researchers have developed a method to break down PET using mechanical forces instead of heat or harsh chemicals. Published in the journal Chem, their findings demonstrate how a “mechanochemical” method — chemical reactions driven by mechanical forces such as collisions — can rapidly convert PET back into its basic building blocks, opening a path toward faster, cleaner recycling.

Led by postdoctoral researcher Kinga Gołąbek and Professor Carsten Sievers of Georgia Tech’s School of Chemical and Biomolecular Engineering, the research team struck solid pieces of PET with metal balls at the same force they would experience in a machine called a ball mill. These impacts deliver enough energy for PET to react with solid chemicals such as sodium hydroxide (NaOH), breaking the plastic’s chemical bonds at room temperature without the need for hazardous solvents.

“We’re showing that mechanical impacts can help decompose plastics into their original molecules in a controllable and efficient way,” Sievers said. “This could transform the recycling of plastics into a more sustainable process.”

Mapping the Impact

In demonstrating the process, the researchers used controlled single-impact experiments along with advanced computer simulations to map how energy from collisions distributes across the plastic and triggers chemical and structural transformations. 

These experiments showed changes in the structure and chemistry of PET in tiny zones that experience different pressures and heat. By mapping these transformations, the team gained new insights into how mechanical energy can trigger rapid, efficient chemical reactions.

“This understanding could help engineers design industrial-scale recycling systems that are faster, cleaner, and more energy-efficient,” Gołąbek said.

Breaking Down Plastic

Each collision created a tiny crater, with the center absorbing the most energy. In this zone, the plastic stretched, cracked, and even softened slightly, creating ideal conditions for chemical reactions with sodium hydroxide.

High-resolution imaging and spectroscopy revealed that the normally ordered polymer chains became disordered in the crater center, while some chains broke into smaller fragments, increasing the surface area exposed to the reactant. Even without sodium hydroxide, mechanical impact alone caused minor chain breaking, showing that mechanical force itself can trigger chemical change.

The study also showed the importance of the amount of energy delivered by each impact. Low-energy collisions only slightly disturb PET, but stronger impacts cause cracks and plastic deformation, exposing new surfaces that can react with sodium hydroxide for rapid chemical breakdown. 

“Understanding this energy threshold allows engineers to optimize mechanochemical recycling, maximizing efficiency while minimizing unnecessary energy use,” Sievers explained.
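The threshold reasoning above can be made concrete with a back-of-the-envelope calculation: the energy a single milling ball delivers scales with its mass and impact velocity. The ball mass, velocity, and threshold value in this sketch are illustrative assumptions, not figures from the study.

```python
# Illustrative estimate of the kinetic energy one milling ball delivers
# on impact. Mass, velocity, and threshold are assumed values for
# illustration only, not numbers reported in the study.

def impact_energy_joules(mass_kg: float, velocity_m_s: float) -> float:
    """Kinetic energy E = 1/2 * m * v^2 at the moment of impact."""
    return 0.5 * mass_kg * velocity_m_s ** 2

# A 10 g steel ball moving at 3 m/s (plausible for a lab-scale mill):
energy = impact_energy_joules(0.010, 3.0)  # 0.045 J

# Hypothetical threshold below which an impact only elastically
# disturbs the polymer instead of cracking and deforming it.
THRESHOLD_J = 0.02

print(f"Impact energy: {energy:.3f} J")
print("Above deformation threshold" if energy >= THRESHOLD_J else "Below threshold")
```

Doubling the ball's velocity quadruples the delivered energy, which is why impact speed, not just ball mass, dominates whether a collision crosses the reaction threshold.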

Closing the Loop on Plastic Waste

These findings point toward a future where plastics can be fully recycled back into their original building blocks, rather than being downcycled or discarded. By harnessing mechanical energy instead of heat or harsh chemicals, recycling could become faster, cleaner, and more energy-efficient.

“This approach could help close the loop on plastic waste,” Sievers said. “We could imagine recycling systems where everyday plastics are processed mechanochemically, giving waste new life repeatedly and reducing environmental impact.”

The team now plans to test real-world waste streams and explore whether similar methods can work for other difficult-to-recycle plastics, bringing mechanochemical recycling closer to industrial use.

“With millions of tons of PET produced every year, improving recycling efficiency could significantly reduce plastic pollution and help protect ecosystems worldwide,” Gołąbek said.

CITATION: Kinga Gołąbek, Yuchen Chang, Lauren R. Mellinger, Mariana V. Rodrigues, Cauê de Souza Coutinho Nogueira, Fabio B. Passos, Yutao Xing, Aline Ribeiro Passos, Mohammed H. Saffarini, Austin B. Isner, David S. Sholl, Carsten Sievers, “Spatially-resolved reaction environments in mechanochemical upcycling of polymers,” Chem, 2025.


Storms Are Changing — Should the Hurricane Scale Change Too?

The Saffir-Simpson scale classifies hurricanes solely by sustained wind speed, ranging from Category 1 to Category 5.

As climate change continues to reshape the intensity and behavior of hurricanes, meteorologists and researchers are examining whether the Saffir-Simpson Hurricane Wind Scale, a decades-old classification system, still adequately communicates the full scope of hurricane hazards. While the scale remains a widely recognized tool, experts like Zachary Handlos, director of Atmospheric and Oceanic Sciences at Georgia Tech, suggest that a complementary system could enhance public understanding of the broader risks hurricanes pose. 

Developed in 1969 by civil engineer and Georgia Tech alumnus Herbert Saffir, CE 1940, and meteorologist Robert Simpson, the scale classifies hurricanes solely by sustained wind speed, ranging from Category 1 to Category 5. It has long served as the primary tool for describing hurricane intensity in forecasts and media coverage. 

“For anyone who follows hurricane coverage on TV, social media, the internet, or in any other form, the Saffir-Simpson scale is the way that hurricanes are described and classified,” said Handlos.

Toward a More Comprehensive Hazard Framework 

Handlos noted that while the scale is widely recognized, it does not account for other major hazards such as storm surge, inland flooding, tornadoes, and storm size. “Maximum wind speeds are certainly a threat if one is in the path of a hurricane,” he said, “but several other hazards are also problematic.” 

A new scale to complement the Saffir-Simpson scale could be beneficial. It would need to communicate every aspect of a hurricane event accurately while continuing to record Saffir-Simpson data for comparison with past events.

Any effort to revise or supplement the scale would require broad collaboration across sectors. Handlos emphasized that input from government agencies, emergency managers, academic researchers, and private industry would be essential, and that formal adoption of any new system would likely involve coordination with the National Oceanic and Atmospheric Administration and the National Hurricane Center.

He added, “If there is a way to update this scale or devise a new scale that both accounts for all types of hurricane hazards and is something that is digestible to the general public, this could be helpful in the future.” 

Forecasting Advances and Communication Challenges 

Climate change is not currently altering how hurricane strength is measured, but it is changing the conditions in which hurricanes form. Handlos said that with the observed increase in global average temperature over the past several decades, scientists anticipate that sea surface temperatures will continue to rise. This would transfer additional heat energy from the ocean’s surface to the atmosphere, further fueling hurricanes. It also raises the potential for hurricane development farther poleward in both hemispheres.

He also pointed to changes in atmospheric moisture. As air temperature rises, the atmosphere’s capacity to hold water vapor is expected to increase. One possible consequence of this is that any rainfall associated with hurricanes could be associated with higher rain rates and more total precipitation, which could intensify inland flooding.  

Advances in forecasting technology are helping meteorologists improve how hurricane hazards are predicted and communicated. According to Handlos, the integration of traditional numerical weather prediction models with artificial intelligence and machine learning techniques, alongside data from radar, satellites, weather balloons, and aircraft, has significantly enhanced the accuracy of hurricane forecasts over the past two decades. 

Still, Handlos cautioned that effectively reaching the public remains a persistent challenge. “Despite repeated warnings and widespread messaging, we often hear stories of individuals choosing not to evacuate, because they’ve weathered previous storms without issue,” he said. “In today’s environment of nonstop social media, constant notifications, and information overload, people can struggle to identify which messages are most important and trustworthy.” 

News Contact
Siobhan Rodriguez
Senior Media Relations Representative 
Institute Communications

BBISS Announces 2025 Sustainability Next Seed Grant Recipients

The 2025 round of Sustainability Next Research Seed Grants has been awarded to 17 transdisciplinary research teams representing a vibrant network of 51 collaborators from across Georgia Tech. These teams span 21 unique units from six of the seven Colleges, including Schools, research centers, and Interdisciplinary Research Institutes. 

The seed grant program, administered by the Brook Byers Institute for Sustainable Systems (BBISS), reaches many faculty members from a diverse array of disciplines due to the generous support provided by broad-based partnerships in addition to the Sustainability Next funds. This year’s partners are the Georgia Tech Arts Initiative, BBISS, Wallace H. Coulter Department of Biomedical Engineering, School of Civil and Environmental Engineering, College of Design, School of City and Regional Planning, School of Computer Science, Ray C. Anderson Center for Sustainable Business, Energy Policy and Innovation Center, Parker H. Petit Institute for Bioengineering and Bioscience, Institute for Matter and Systems, Institute for People and Technology, Institute for Robotics and Intelligent Machines, Strategic Energy Institute, and Center for Sustainable Communities Research and Education.

The goal of the program is to nurture promising research areas for future large-scale collaborative sustainability research, research translation, and/or high-impact outreach; to provide mid-career faculty with leadership and community-building opportunities; and to broaden and strengthen the Georgia Tech sustainability community as a whole. The call for proposals was modeled after the Office of the Executive Vice President for Research’s Moving Teams Forward and Forming Teams programs.

Looking ahead, BBISS will support and nurture these projects in collaboration with the relevant funding partners. Beginning in October, BBISS will host a series of focused workshops designed to foster collaboration and provide additional support to help advance these initiatives. Projects have been grouped into five thematic clusters, each of which will be the focus of an upcoming workshop:

  • Circularity Programs
  • Adaptation to the Changing Environment
  • Community Engagement and Education
  • Climate Science and Solutions
  • Environmental and Health Impacts

BBISS faculty fellows, past seed grant recipients, and other interested Georgia Tech faculty are invited to participate. If you are interested in participating in the workshops, please email kristin.janacek@gatech.edu. The first session on Circularity Programs is Oct. 16 at 1 p.m. in the Peachtree Room (3rd floor) of the John Lewis Student Center.

The 2025 Sustainability Next Seed Grant awards are:

Forming Teams:

Moving Teams Forward:

This round of funding was highly competitive, with 45 proposals submitted. BBISS extends its gratitude to all the individuals and groups who applied, as well as to the faculty and staff who contributed their time and expertise to evaluate the proposals. Their thoughtful input was essential to achieving a fair and collaborative selection process, ensuring that the awarded proposals align strongly with the BBISS strategy and show promise for long-term impact and future research opportunities.

According to Beril Toktay, BBISS executive director and Brady Family Chair in Management, “The high level of participation demonstrates the enduring commitment to sustainability research and engagement by the Georgia Tech community. BBISS honors this commitment by looking for collaboration opportunities with all who are driving sustainability efforts at Georgia Tech.”

 
News Contact

Brent Verrill, Research Communications Program Manager, BBISS

ChBE Professor Leads Team Awarded $9.2M NSF Grant to Build “Plug-and-Play” Biotechnology

Imagine if building new medicines or sustainable materials were as straightforward as snapping together LEGO® bricks. That’s the goal of a new project led by the Georgia Institute of Technology that could help transform the future of biomanufacturing.

The project, headed by Professor Mark Styczynski in Georgia Tech’s School of Chemical and Biomolecular Engineering (ChBE@GT), recently received a $9.2 million grant from the National Science Foundation Directorate for Technology, Innovation and Partnerships (NSF TIP) to accelerate the adoption of cell-free systems in biomanufacturing.

Promising Technology

Biotechnology has largely relied on living cells to produce products such as medicines, fragrances, and renewable fuels. But working with living cells can be complex and expensive.

Cell-free systems, by contrast, strip biology down to its essential parts, the enzymes and molecules that carry out life’s chemical reactions. This can simplify and speed up biomanufacturing, making it easier to scale.

The challenge, Styczynski explained, is that most cell-free projects still require custom-built setups. “Right now, engineering biology is like reinventing the wheel for every application,” he said. “You have to figure out how all the parts fit together each time. We want to change that by making ready-to-use modules that work right out of the box.”

Styczynski’s project, called Meta-PURE (PUrified Recombinant Elements), will create eight standardized modules, each designed for a key function in cell-free systems, such as generating energy, producing proteins, or assembling complex molecules.

“Like interchangeable puzzle pieces, these modules can be mixed and matched to support different applications,” Styczynski said.

Demonstrating Uses

His team will demonstrate the system’s versatility by producing santalene (a plant-derived fragrance used widely in consumer products), GamS protein (a tool that can improve cell-free processes), and a bacteriophage (a virus that can be safely used in research and the development of new therapeutic treatments).

These examples highlight the technology’s potential across industries ranging from pharmaceuticals and agriculture to chemicals and sustainable materials.

“We want to make these tools so that someone in industry can create their molecule or product more quickly and efficiently, and get it out the door,” Styczynski said. 

“Right now, cell-free systems are mostly limited to high-value products because the cost is too high. The goal is to drive costs down and productivity up, so we can move closer to commodity chemicals like biofuels or monomers for polymers, not just niche applications. One of our partners recently developed a butanol process that shows where this can go,” he said.

NSF Initiative

Styczynski’s team is one of four sharing an inaugural investment of $32.4 million to help grow the U.S. bioeconomy under an initiative called NSF Advancing Cell-Free Systems Toward Increased Range of Use-Inspired Applications (NSF CFIRE).

“NSF is resolute in our commitment to advancing breakthroughs in biotechnology, advanced manufacturing, and other key technologies of significance to the U.S. economy,” said Erwin Gianchandani, assistant director for NSF TIP. “The novel approaches from these four CFIRE teams will speed up and expand the adoption of cell-free systems across a variety of industries and ensure America’s competitive position in the global bioeconomy.”

Collaborative Effort

While ChBE@GT is the lead, Meta-PURE is a broad collaboration with partners across academia, industry, and government. Co-principal investigators include Paul Opgenorth, co-founder and vice president of development at the biotech firm eXoZymes; Nicholas R. Sandoval, associate professor in Tulane University’s Department of Chemical and Biomolecular Engineering; and Anton Jackson-Smith, founder of the biotech startup b.next.

Meta-PURE will also train graduate students and postdocs in partnership with industry, government, and other universities, helping prepare trainees to be the future of a highly interdisciplinary U.S. bioeconomy. The team will also engage the scientific community on the implementation of metrics and standards in cell-free biotechnology to better facilitate broad adoption and interoperability of not just the results of the Meta-PURE project, but of cell-free efforts more broadly. 


Lack of Charging Station Data Deters Widespread Adoption of Electric Vehicles

Electric vehicles (EVs) can be environmentally friendly and more cost-effective — until drivers plan a road trip. Charging stations aren’t as prevalent as traditional gas stations, and even if they can be found along the route, they may not be functioning or may already be occupied by other cars. 

While EV charging locator apps can show drivers where the nearest charger is, they aren’t always accurate enough to show real-time status, such as whether a charger is working and available. How are drivers supposed to hit the road when they aren’t sure where their next charge is coming from? This uncertainty can be enough to deter drivers from purchasing an EV altogether.

New research from Georgia Tech, Harvard University, and Massachusetts Institute of Technology suggests that state governments should step in to help. The right policy could inspire data transparency by station hosts, ensuring that EV drivers have reliable networks — and thus encourage EV ownership. The researchers presented their findings in the paper “Charger Data Transparency: Curing Range Anxiety, Powering EV Adoption,” published by Brookings in September.

Data Deserts

The researchers conducted a field experiment to discover the extent of the problem. Their analysis showed that just 34% of EV charging stations provide real-time status updates across six major interstates in 40 U.S. states. The researchers found 150- to 350-mile stretches without real-time charger availability data, longer than the stated range of many EV models. This leaves thousands of miles of highways in a data desert.

“We just don’t have the real-time data infrastructure necessary to build confidence in the reliability of charging, especially in communities along transit corridors,” said Omar Asensio, an associate professor in the Jimmy and Rosalynn Carter School of Public Policy. “It’s not that the capability isn’t there. It’s that there aren’t clear incentives to encourage EV charging station operators to do the right thing and share the data.”

Charging Transparency

Government regulation is necessary to improve charging reliability, according to the researchers. State governments could offer funding for charging stations only if the station host agrees to data transparency. A simpler policy proposal would be for all fast chargers on highways to post their real-time status to an application programming interface, where software developers could access it. This approach would provide reliable information on whether a public charger is operational, and it can make government spending more efficient by leveraging network effects. The research team is already collaborating with state governments from Massachusetts to Georgia to discuss how to make this government regulation a reality. 
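To illustrate what such a disclosure standard could look like in practice, the sketch below serializes a charger's real-time status as a JSON record that a station host might post to a public API. The field names, station ID, and record shape here are hypothetical; the article does not specify a schema.

```python
# Hypothetical real-time status record for a fast charger. The field
# names and station ID format are illustrative assumptions, not part
# of any proposed standard.
import json
from datetime import datetime, timezone

def make_status_record(station_id: str, operational: bool, ports_free: int) -> str:
    """Serialize a charger's current state as a JSON payload."""
    record = {
        "station_id": station_id,
        "operational": operational,
        "ports_available": ports_free,
        "reported_at": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(record)

payload = make_status_record("GA-I85-042", True, 2)
data = json.loads(payload)
print(data["station_id"], data["operational"], data["ports_available"])
```

A routing app polling records like this could tell a driver not only where the nearest charger is, but whether it is working and has an open port right now — exactly the gap the researchers identify.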

State governments will also benefit, as EVs can help them close the gap on their carbon emission reduction goals.

“Electric vehicles are a key strategy for decarbonizing the transportation sector and delivering public health co-benefits, but consumers need to trust that public chargers will work when they need them,” Asensio said. “Until real-time data disclosure standards are addressed, reliable, widespread adoption will be hard. A data-centric approach can enhance the efficiency of existing transportation investments.”

Many states, including Georgia, have also supported EV manufacturing. EV brand Rivian recently broke ground on an assembly plant outside Atlanta. More widespread EV adoption is paramount to making these plants economic successes. Data transparency regulations could be a start toward finally making EVs the ideal road trip vehicle. 

 
News Contact

Tess Malone, Senior Research Writer/Editor

tess.malone@gatech.edu

AI for Science and Engineering Collaboration Workshop


Artificial Intelligence (AI) and Machine Learning (ML) are transforming science and engineering — from groundbreaking achievements like protein structure prediction (AlphaFold) to the broad adoption of large language models. Building on this momentum, the Institute for Data Engineering and Science (IDEaS) will host a one-day workshop on Monday, October 13, to explore how AI/ML can drive the next wave of advances in science and engineering at Georgia Tech.

The Future of Antarctic Ice: New Study Reveals the Mathematics of Meltwater Lakes

A view of Greenland's ice sheet from the NASA/USGS Landsat 8 satellite showing meltwater lakes on a glacier. (Credit: NASA)

Georgia Tech researchers have developed a mathematical formula to predict the size of lakes that form on melting ice sheets — discovering their depth and span are linked to the topography of the ice sheet itself. 

The team leveraged physics, model simulations, and satellite imagery to develop simple mathematical equations that can easily be integrated into existing climate models. It’s a first-of-its-kind tool that is already improving those models.

“Melt lakes play an important role in ice sheet stability, but previously, there were no constraints on what we would expect their maximum size to be in Antarctica,” says study lead Danielle Grau, a Ph.D. student in the School of Earth and Atmospheric Sciences. “I was intrigued by the idea of quantifying how much of a role we could expect them to play in the future.”

The paper, “Predicting mean depth and area fraction of Antarctic supraglacial melt lakes with physics-based parameterizations,” was published in Nature Communications. In addition to Grau, the research team includes School of Earth and Atmospheric Sciences Professor Alexander Robel, who is Grau’s advisor, and Azeez Hussain (PHYS 2025).

Their predictions show that the majority of these lakes will be less than a meter deep and span up to 40% of the ice sheet surface area.

“Many models don’t include any data about lakes on the surface of ice sheets, while others simulate these melt lakes growing until the ice collapses,” Robel says. “Our results show that the reality is somewhere in between — and that the maximum size of these lakes can be predicted using these new equations. This gives us real, concrete numbers to use in climate models.”

From summer project to satellite discovery 

Grau first started working on the project as an undergraduate student when she applied for a Summer Research Experiences for Undergraduates program hosted by the School of Earth and Atmospheric Sciences.

Inspired by terrestrial lake research, Grau and Robel investigated the “self-affinity” of the Antarctic ice sheet — a property associated with surface roughness across various scales. For example, a landscape like Badlands National Park, with many rolling hills of a wide range of sizes, would have a different self-affinity than a flat prairie with three large volcanoes.

“A previous study had used this property to predict the size of terrestrial lakes and ponds, and we were curious if we could use a similar approach for supraglacial lakes in Antarctica,” Grau says. “Establishing that the Antarctic ice sheet also has this property was the first step in pursuing this research in more depth.” 
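One common way to quantify self-affinity is to estimate a roughness (Hurst) exponent H: for a self-affine surface, the typical height difference between points a distance r apart grows as r to the power H. The sketch below estimates H from a synthetic 1D profile; both the synthetic data and this particular estimation method are illustrative and are not the procedure used in the paper.

```python
# Illustrative roughness-exponent estimate for a 1D elevation profile.
# A self-affine surface obeys S(r) ~ r**H, where S(r) is the RMS height
# difference at horizontal lag r. The profile here is synthetic.
import math
import random

def structure_function(profile, lag):
    """Root-mean-square height difference at a given horizontal lag."""
    diffs = [(profile[i + lag] - profile[i]) ** 2
             for i in range(len(profile) - lag)]
    return math.sqrt(sum(diffs) / len(diffs))

# Synthetic self-affine profile: a random walk, which has H = 0.5.
random.seed(0)
profile = [0.0]
for _ in range(20000):
    profile.append(profile[-1] + random.gauss(0, 1))

# Estimate H from the log-log slope of S(r) between two lags.
r1, r2 = 10, 1000
h_est = (math.log(structure_function(profile, r2) /
                  structure_function(profile, r1)) / math.log(r2 / r1))
print(f"Estimated roughness exponent H ~ {h_est:.2f}")  # near 0.5 for a random walk
```

A rolling badlands landscape and a flat plain with a few volcanoes would yield different values of H, which is what makes the exponent a compact fingerprint of surface roughness across scales.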

The mathematics of melt

Grau continued the investigation as a Ph.D. student in Robel’s lab. Together, they unraveled the physics of how meltwater moves across the ice surface, designing a ‘glacier in a computer’ that mimics meltwater accumulation and movement across various topographies.

“We designed an algorithm and integrated it into a model that the GT Ice & Climate Group has used in the past,” Grau says. “From that, we were able to see how lakes would form on different surfaces across thousands of scenarios. This was the foundation for the mathematical equations I developed, which can predict the lake depth and lake surface area based on the self-affinity property.”
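A core ingredient of any such meltwater model is depression filling: water collects in low spots of the topography until it reaches a spill point. The 1D toy below fills every depression in an elevation profile to its spill level; it is a deliberately simplified illustration, not the team's algorithm.

```python
# Toy 1D depression filling: compute the water depth at each point if
# every hollow in an elevation profile fills to its spill level.
# This is an illustrative simplification, not the study's algorithm.

def fill_depressions(elevation):
    """Return per-point water depth after all depressions fill."""
    n = len(elevation)
    left, right = [0.0] * n, [0.0] * n
    m = elevation[0]
    for i in range(n):                 # highest barrier to the left
        m = max(m, elevation[i])
        left[i] = m
    m = elevation[-1]
    for i in range(n - 1, -1, -1):     # highest barrier to the right
        m = max(m, elevation[i])
        right[i] = m
    # Water rises to the lower of the two bounding barriers.
    return [min(left[i], right[i]) - elevation[i] for i in range(n)]

depths = fill_depressions([3, 1, 2, 0, 4, 2])
print(depths)  # [0, 2, 1, 3, 0, 0]
```

From an output like this, the two quantities in the paper's title fall out directly: the mean depth is the average of the water depths, and the area fraction is the share of points where the depth is greater than zero.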

To check their results, Grau enlisted the help of Hussain — then an undergraduate in the School of Physics — to examine satellite data from the Landsat satellite program (which captures detailed photography of the Earth’s surface from space) to measure existing supraglacial lakes and surface topography. 

“It was exciting to see how our predictions lined up with what we were seeing in the satellite imagery,” Robel explains. “This shows that our solution is a concrete avenue for climate models to realistically incorporate supraglacial lakes.”

Grau is already working to incorporate the team’s equations into an atmospheric model used by NASA in addition to an ice sheet model developed by the NASA Jet Propulsion Laboratory and Dartmouth College. 

“By turning complicated models and satellite data into simple predictive equations, we’re giving climate models a new lens to see the future,” she says. “It’s a small piece of the puzzle, but one that helps us understand how ice sheets respond to a warming world.”

 

Funding: NASA Modeling, Analysis, and Prediction Program

DOI: https://doi.org/10.1038/s41467-025-61798-8 

 
News Contact

Written by Selena Langner

Atlanta Science Festival Kickoff at Georgia Tech | Celebrate STEAM

Georgia Tech is excited to kick off the 13th annual Atlanta Science Festival by welcoming the community to our campus to Celebrate STEAM! Attendees can participate in hands-on STEAM activities, watch science and technology demonstrations, connect with student researchers, and discover the exciting advancements happening at Georgia Tech. The event is free, and there is something for everyone, with robotics, brains, biology, space, art, nanotechnology, paper, computer science, wearables, bioengineering, chemical engineering, and systems engineering.

Assembling the Future: Emerging Trends in Auto Manufacturing

Join us for a dynamic and forward-looking conversation with Krishna Bandaru, a leading voice in automotive innovation and manufacturing.