3.8‑Billion‑Year‑Old Titanium Clue Sheds New Light on the Moon’s Early Chemistry


Taken aboard Apollo 8 by Bill Anders, this iconic picture shows Earth peeking out from beyond the lunar surface as the first crewed spacecraft circumnavigated the Moon, with astronauts Anders, Frank Borman, and Jim Lovell aboard. (Credit: NASA)

A chemical signature hidden in a 3.8‑billion‑year‑old lunar rock is offering new insights into the availability of oxygen within the young Moon.

Published today in the journal Nature Communications, the paper “Trivalent Titanium in High-Titanium Lunar Ilmenite” confirms titanium in a reduced, trivalent state in a black, metal-rich lunar mineral called ilmenite. It’s a state only possible in low-oxygen environments, conditions researchers refer to as “reducing.”

“Models have suggested that these reducing conditions may have varied at different locations and times across the surface of the Moon,” says lead author Advik Vira, who conducted the research as a graduate student in the School of Physics and recently earned his doctoral degree. “We hope our microscopy technique can be a valuable step in mapping and understanding the Moon’s 4.5-billion-year history.”

The team anticipates that their technique could be used on many of the lunar samples collected more than 50 years ago by the Apollo missions in addition to the Apollo Next Generation Samples — a group of lunar samples that have been stored under pristine conditions — and new samples from the planned Artemis missions, with Artemis II slated for launch this spring. The technique might also be applicable to samples collected from the far side of the Moon and returned in 2024 by the Chang’e-6 mission.

“The Moon holds clues not only to its own past, but also to the earliest eras of Earth’s evolution — history that has long since been erased from our planet,” Vira says. “This study is a step toward understanding the history of both and a reminder that there is still so much left to learn from the lunar rocks we’ve brought back to Earth.”

The School of Physics research team included corresponding authors Vira and Professor Phillip First, in addition to graduate student Roshan Trivedi; undergraduate students Gabriella Dotson, Keyes Eames, Dean Kim, and Emma Livernois; and Professor Zhigang Jiang; along with Institute for Matter and Systems Materials Characterization Facility Senior Research Scientist Mengkun Tian; School of Chemistry and Biochemistry Senior Research Scientist Brant Jones; and Thom Orlando, Regents' Professor in the School of Chemistry and Biochemistry with a joint appointment in the School of Physics.

The Georgia Tech team was joined by Addis Energy Senior Geochemist Katherine Burgess; Macalester College Assistant Professor of Geology Emily First; and Lawrence Berkeley National Laboratory Research Scientist Harrison Lisabeth, Senior Scientist Nobumichi Tamura, and Postdoctoral Fellow Tyler Farr, who recently earned a Ph.D. from Georgia Tech’s George W. Woodruff School of Mechanical Engineering.

CLEVER research

The investigation began with a dark gray rock called a lunar basalt, formed when ancient magma erupted on the Moon’s surface. As the magma cooled, minerals crystallized, preserving key information in their structures. Billions of years later, the rock was brought to Earth by the 1972 Apollo 17 mission; a small piece is now stored at Georgia Tech’s Center for Lunar Environment and Volatile Exploration Research (CLEVER), a NASA Solar System Exploration Research Virtual Institute (SSERVI) center led by Orlando.

As a NASA virtual institute, CLEVER supports researchers exploring lunar conditions and developing tools for the upcoming crewed Artemis missions, and provided the lunar samples for this research. The SSERVI also plays a critical role in training the next generation of planetary researchers: both Vira and Farr earned their Ph.D.s while on the CLEVER team.

“At CLEVER, we are very interested in understanding the impacts of space weathering,” Vira says. “We implemented modern sample preparation and advanced microscopy techniques to image samples at the atomic level, and were curious to apply them more broadly to the collection of Apollo rocks in the Orlando Lab. This sample caught our attention.”

“When we imaged an ilmenite crystal from the lunar basalt, what struck us first was how uniform and perfect the crystal structure was,” he recalls. “We found no defects from space weathering and instead saw an undamaged, pristine crystal — undisturbed for 3.8 billion years.”

To investigate further, the team analyzed small chips of the rock with Burgess, a member of the RISE2 SSERVI team and then a geologist at the U.S. Naval Research Laboratory. Using state-of-the-art electron microscopy and spectroscopy techniques, Vira determined the oxidation states of the elements present in the ilmenite.

In spectroscopy measurements, each element leaves a distinct “signature,” Vira explains. “When we brought our results back to Georgia Tech’s Materials Characterization Facility, Mengkun (Tian) noticed something unusual: the signature showed titanium might be present in the trivalent state.”

The presence of trivalent titanium had long been suspected in this lunar mineral but never directly confirmed. The team was intrigued.

A new window into old rocks

With funding from Georgia Tech’s Center for Space Technology and Research (CSTAR), Vira returned to the U.S. Naval Research Laboratory to analyze additional samples. The results confirmed that more titanium was present than the mineral’s formula (FeTiO₃) predicts — indicating a portion of the titanium present was trivalent.

“That led me to place our measurements in terms of the broader geological context,” Vira shares. Working with First, Vira explored how ilmenite with trivalent titanium could help reconstruct the nature of ancient magmas from the Moon, especially the chemical availability of oxygen.

“Because its location on the Moon was noted during the Apollo mission, we know exactly where this rock is from, and we can determine how old the rock is,” he explains. “When coupled with our trivalent titanium measurements, we can use that information to estimate the reducing conditions for this specific region at the specific time our rock formed.”

If the upcoming Artemis missions return samples suitable for the team’s technique, these rocks could provide a new window into ancient lunar geology. The research also highlights that many lunar samples already on Earth could be reexamined to look for trivalent titanium.

“There is still so much to learn from the lunar samples we have already brought to Earth,” Vira says. “It’s a testament to the long-term value of each sample return mission. As technology continues to advance, this type of work will continue to give us critical insights into our planet and our place in the universe for years to come.”

 

DOI: 10.1038/s41467-026-69770-w

Funding: This work was directly supported by the NASA SSERVI under CLEVER. Researchers were also supported by the NASA RISE2 SSERVI and the Heising-Simons Foundation. Funding for collaborations between the U.S. Naval Research Laboratory and Georgia Tech for the investigation of lunar minerals was provided by the Georgia Tech Center for Space Technology and Research. Sample preparation was performed at the Georgia Tech Institute for Matter and Systems, which is supported by the National Science Foundation. This work utilized the resources of the Advanced Light Source, a user facility supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, and was supported in part by the Laboratory Directed Research and Development program.


Advik Vira


An illustration of the Apollo rock 75035 on the Moon, an atomic image of the sample, and its spectral signature. (Credit: August Davis)


An optical image of the chip from the lunar rock the team investigated.


An image of the chip from the sample, imaged using scanning electron microscopy. Titanium is shown in light blue, and white boxes show areas where samples were extracted to analyze the ilmenite crystal.

 
News Contact

Written by:

Selena Langner
College of Sciences
Georgia Institute of Technology

A Successful USDA Program That Has Supported More Than 533,000 Affordable Rental Homes in Rural America is Getting Phased Out

Low-income Americans in rural areas can struggle to pay market-rate rents. mphillips007/iStock via Getty Images Plus


The high cost of renting and buying homes in U.S. cities is no secret. But this affordability problem isn’t limited to urban regions – it affects rural areas as well.

Rural areas, home to about 25% of Americans, benefit from federally supported rental housing programs – particularly a U.S. Department of Agriculture program to provide affordable homes for low-income residents.

The USDA’s Section 515 program is the primary way that the U.S. government finances affordable rental homes in rural communities. Since its inception in 1963, the program has supported the construction of over 533,000 apartments, townhouses and other small, multifamily rental homes.

The program offers below-market-rate loans to private and nonprofit developers who build and manage rental housing for low-income residents in small towns and rural counties. The terms of the deal between property owners and the government oblige these landlords to keep rents affordable for their occupants for decades, generally restricting rent to about 30% of tenants’ income.
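That 30% affordability standard is simple arithmetic. A minimal sketch (the function name is ours, for illustration, not part of the USDA program):

```python
def affordable_monthly_rent(annual_income, share=0.30):
    """Cap monthly rent at a share of household income,
    following the common 30% affordability standard."""
    return annual_income * share / 12

# At the roughly $16,000 average annual income reported for Section 515
# tenants, the 30% standard works out to about $400 a month, in line
# with the roughly $325 typical rents in the program.
print(affordable_monthly_rent(16_000))  # 400.0
```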

 

Last New Loans Were in 2011

People who live in Section 515 housing typically pay around US$325 per month. That’s much less than rural market-rate rents, which typically run $800-$1,100 per month for modest homes.

Because the USDA stopped issuing new Section 515 loans in 2011, this arrangement is phasing out now as existing loans mature.

Loans for about 90% of all remaining Section 515 homes will mature by 2045, according to the Housing Assistance Council, a national nonprofit that supports affordable housing efforts throughout rural America. By 2050, the owners of nearly all properties currently in the program’s portfolio are projected to have paid off their mortgages.

And once most of the owners of these homes exit the Section 515 program, it will have been fully phased out.

An Often-Overlooked Housing Program

As a public policy professor who studies housing, I wanted to understand what happens when Section 515 loans mature. I also was interested in what determines whether properties remain affordable or leave the program after the loans are paid off.

To find out, I worked with three other housing policy researchers on a national study that was peer-reviewed and published in Housing Policy Debate in September 2025.

As of 2024, these loans were still supporting some 400,000 homes on almost 13,000 properties across 87% of all U.S. counties.

The roughly 750,000 Americans in those homes are among the nation’s poorest. The average household income of someone living in Section 515 housing in 2023 was about $16,000 per year, roughly one-fifth of the national median household income of about $76,600 in inflation-adjusted 2023 dollars.

In addition to having a very low income, more than 60% of the people enrolled in the program are over 62, have disabilities, or fall into both of those categories.

Market-Rate Options After Maturity

The vast majority of these affordable rental homes were built in the 1970s through the 1990s and financed with USDA loans that last between 30 and 50 years.

By 2050, virtually no Section 515 housing is projected to remain.

The owners of these rental properties no longer have to keep rents affordable once they have paid off their loans. Owners and tenants may also lose access to a USDA rental assistance program, which helps keep tenants’ housing costs low.

Owners can refinance the homes or sell the properties. They can also continue to charge affordable rents or convert the units to market rate. Because of this flexibility, a large share of rural affordable housing units could soon be converted to market-rate rentals.

What the Data Shows So Far

For this study, our research team analyzed data from nearly 15,000 Section 515 properties placed in service across the country since 1963, including many that are no longer providing rural affordable housing.

We found that the largest factors determining whether a building remains affordable after a Section 515 loan matures are who owns and manages that property. Buildings owned by for-profit companies are far more likely to leave the program than those that belong to nonprofit housing organizations.

Nonprofit-owned buildings, after accounting for building age and local market conditions, are 30% to 40% less likely to convert formerly Section 515 affordable housing into market-rate properties after the owners pay off their loans.

After analyzing this data, we also concluded that buildings run by small property management companies are more likely to leave the program than those managed by larger ones. Properties where the owner manages the homes are also more likely to exit.

Landlords owning more residential properties were also more likely to exit the program. This indicates that larger landlords may be able to afford the renovations and upgrades required to turn their buildings into market-rate housing once restrictions end.


Time is running out on the nation’s main affordable housing program in rural areas. Max Zolotukhin/iStock via Getty Images Plus

 

Why Subsidies and Local Markets Matter

Having subsidies through other government programs can help keep affordable housing units from being converted to market-rate housing.

One-third of Section 515 properties also get support from other programs, including Section 8 vouchers and low-income housing tax credits. Those tax credits are another federal incentive that’s provided to developers who build and rehabilitate affordable rental housing while allowing lower rents for low-income tenants.

Those properties are more likely to remain affordable, even years after some of these tax incentives expire.

Local economic conditions can play a role too. In areas with high unemployment rates, large military populations and low housing inventory, properties are also more likely to exit the program.

That means the same rural counties experiencing economic or demographic pressures are often the most likely to have a decline in affordable housing units when owners pay off their Section 515 loans.

Steps That Can Be Taken

Congress and the USDA have taken some steps to slow the loss of affordable housing in rural areas.

For example, the USDA has funded preservation efforts such as the Multifamily Housing Preservation and Revitalization pilot program, which provides grants, loan restructuring and other financing tools to help repair aging Section 515 properties and extend their affordability.

These efforts have helped preserve some buildings and support ownership transfers from private sector landlords to nonprofit housing groups. But they spend only tens of millions of dollars per year and focus mainly on maintaining existing properties rather than building new housing.

Researchers estimate that about $5.6 billion in repairs would be needed to preserve the affordable housing currently tied to the Section 515 program.

Some lawmakers have proposed reforms aimed at doing more than chipping away at the loss of this kind of affordable housing. The bipartisan Rural Housing Service Reform Act, first introduced in 2023 and reintroduced in 2025, would modernize USDA rural housing programs and allow certain rental assistance contracts to continue after mortgages mature. As of early 2026, the bill remains under consideration.

Over the next two decades, most of these landlords will pay off their Section 515 loans. Unless the government reinvigorates the program or replaces it with something else, much of rural America’s affordable rental housing could gradually disappear as owners convert Section 515 properties to market-rate housing.

Whether rural communities retain affordable housing will depend not only on what the federal government does, but also on the properties’ owners.

 

This article is republished from The Conversation under a Creative Commons license. Read the original article.

 
News Contact
Author:

Brian Y. An, Co-Director of Center for Urban Research, Director of Master of Science in Public Policy Program, and Assistant Professor of Public Policy, Georgia Institute of Technology

Media Contact:

Shelley Wunder-Smith
shelley.wunder-smith@research.gatech.edu

Georgia Tech Renews Memorandum of Understanding With Sandia

Group of people at Georgia Tech/Sandia MOU signing

Photo by Alicia Bustillos from Sandia National Laboratories

Since 2020, Georgia Tech has partnered with Sandia National Laboratories, a federally funded research and development center focused on national security. In February, the two institutions renewed their collaboration with a new Memorandum of Understanding (MOU), reaffirming a relationship that has already strengthened research capabilities on both sides.

The partnership has driven progress in areas ranging from hypersonics to bioscience, while also deepening institutional ties beyond research. Joint faculty appointments — such as Anirban Mazumdar, who holds roles at both Sandia and the George W. Woodruff School of Mechanical Engineering — demonstrate how closely the organizations work together. The collaboration has also expanded student talent pipelines, providing more avenues for Georgia Tech students to pursue careers at the national lab.

“At its core, this partnership is about people,” said Tim Lieuwen, executive vice president for Research at Georgia Tech. “Sandia and Georgia Tech share a commitment to discovery and developing the talent, creativity, and collaboration our nation needs.”

The renewed MOU, he said, “strengthens connections between our researchers, opens new doors for our students, and builds meaningful career pathways into national service. When our communities work together to address national priorities, we not only accelerate technological advances — we expand opportunities for the people who will shape the future of our nation’s security.”

Under the new MOU, Sandia and Georgia Tech will focus on integrated research across key national security‑aligned areas, including secure artificial intelligence and computing, quantum technologies, critical minerals, advanced manufacturing, energy and grid resilience, and hypersonics. The partnership emphasizes connecting manufacturing, computation, and systems approaches directly to national security applications.

“Together, we have been solving new and unprecedented challenges in science and engineering, and now we have a great opportunity to develop this partnership,” said Dan Sinars, Sandia’s deputy chief research officer. “Our research benefits both national security and national prosperity, and keeps the country at the forefront of the world.”

With this strengthened connection, the partners aim to grow their shared research footprint through increased funding, publications, and faculty-led startups. Over the long term, Georgia Tech intends to become one of Sandia’s top hiring pipelines, ensuring that talent developed through joint research continues into national security careers.

History of the Partnership

The Institute’s collaboration with Sandia began in the mid‑2010s, when Sandia selected Georgia Tech as one of its partner institutions. The first MOU, signed in 2015, formalized the relationship and outlined initial technical focus areas.

In 2018, George White, executive director of strategic partnerships, and Olof Westerstahl, senior director of strategic initiatives in the Office of Corporate Engagement, helped expand the partnership. They launched “Sandia Day,” an event designed to introduce Georgia Tech faculty to Sandia researchers and spark new collaborations. By 2020, the organizations signed a second MOU that expanded the partnership’s technical focus areas to include energy and grid security, materials and nanotechnology, advanced electronics, advanced manufacturing, advanced computing, cyber and information security, bioscience, hypersonics, quantum information science, and engineering sciences.

The results have been substantial. Since 2018, Sandia has sponsored $35 million in research collaborations with Georgia Tech. Researchers from both institutions have co-authored 450 publications since 2016. Research activity continues to accelerate, with $1.6 million in new contracts in the past year alone. As of August 2025, Sandia employs 325 Georgia Tech alumni — a testament to the impact of the growing talent pipeline.

“We view our work with Sandia as the model for engagement with other national labs,” said White. “With the new MOU, we will continue to grow the Sandia partnership. I would like to see our footprint double in scope in the next five years.”

 

 
News Contact

Tess Malone, Senior Research Writer/Editor

tess.malone@gatech.edu

When GPS Lies at Sea: How Electronic Warfare is Threatening Ships and Their Crews

Cyberattacks like GPS spoofing threaten oil supertankers and cargo ships at sea. Ping Shu/Moment via Getty Images


The war in Iran has dominated headlines with reports of airstrikes and escalating military activity. But beyond the immediate devastation, the conflict has also illuminated a quieter and rapidly growing danger: the vulnerability of ships, and the people who operate them, to disruption of their navigation systems.

Modern shipping depends heavily on GPS satellite navigation. When those signals are disrupted or manipulated, ships can suddenly appear to their navigators and to other ships to be somewhere they are not. In some cases, vessels have been shown jumping across maps, drifting miles inland or appearing to circle in impossible patterns. The risk is even higher in war zones, where ships could be misdirected into harm’s way.

As a cybersecurity researcher studying critical infrastructure and maritime systems, I investigate how digital threats affect ships and the people who operate them.

To understand the threat from GPS disruptions, it helps to first understand how GPS works. GPS systems determine location using signals from satellites orbiting Earth. A receiver calculates its position by measuring how long those signals take to arrive. Because those signals are extremely weak by the time they reach Earth, they are relatively easy to disrupt.
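At heart, that position fix is a least-squares problem: given known satellite positions and distances inferred from signal travel times, solve for the receiver's location. Below is a minimal, illustrative 2D sketch using Gauss-Newton iteration; it is not from the article, and it ignores the receiver clock bias that real GPS solvers also estimate:

```python
import math

def locate(sats, ranges, guess=(0.0, 0.0), iters=20):
    """Estimate a 2D receiver position from measured distances to
    known satellite positions via Gauss-Newton least squares."""
    x, y = guess
    for _ in range(iters):
        # Accumulate the 2x2 normal equations J^T J * step = J^T r.
        JTJ = [[0.0, 0.0], [0.0, 0.0]]
        JTr = [0.0, 0.0]
        for (sx, sy), r in zip(sats, ranges):
            d = math.hypot(x - sx, y - sy)
            if d == 0:
                continue
            ux, uy = (x - sx) / d, (y - sy) / d  # direction satellite -> receiver
            res = d - r                          # predicted minus measured range
            JTJ[0][0] += ux * ux; JTJ[0][1] += ux * uy
            JTJ[1][0] += uy * ux; JTJ[1][1] += uy * uy
            JTr[0] += ux * res;   JTr[1] += uy * res
        det = JTJ[0][0] * JTJ[1][1] - JTJ[0][1] * JTJ[1][0]
        if abs(det) < 1e-12:
            break
        dx = (JTJ[1][1] * JTr[0] - JTJ[0][1] * JTr[1]) / det
        dy = (JTJ[0][0] * JTr[1] - JTJ[1][0] * JTr[0]) / det
        x, y = x - dx, y - dy
    return x, y

# Three "satellites" and a receiver whose true position is (3, 4):
sats = [(0.0, 10.0), (10.0, 0.0), (10.0, 10.0)]
truth = (3.0, 4.0)
ranges = [math.hypot(truth[0] - s[0], truth[1] - s[1]) for s in sats]
est = locate(sats, ranges)
print(est)  # converges to approximately (3.0, 4.0)
```

The sketch also shows why spoofing works: the solver trusts whatever ranges it is given, so counterfeit signals that imply different travel times simply produce a confidently wrong position.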

GPS Jamming and Spoofing

In GPS jamming, an attacker blocks the real satellite signals by overwhelming them with electromagnetic noise so receivers cannot detect them. When this happens, navigation systems lose their position. On a phone, it might look like the map freezing or jumping erratically.

GPS spoofing is more sophisticated. Instead of blocking signals, an attacker transmits fake satellite signals designed to mimic the real ones. The receiver accepts these signals and gives a false location. Imagine driving north while your navigation system suddenly insists you are traveling south. The receiver is not malfunctioning; it has simply been tricked.


Circular loops in the Black Sea show spoofed ship positions recorded in January 2025. The red points represent false GPS locations broadcast during spoofing events, making vessels appear to move in perfect circles on tracking maps even though they were actually hundreds of miles away. These disruptions are widely believed to be linked to electronic interference in the region during the war in Ukraine. Image created with data from Spire Global. Anna Raymaker

For mariners at sea, spoofing can have serious consequences. In the open ocean, there are few landmarks to verify a ship’s position if GPS behaves strangely. Nearshore, the margin for error disappears: Water depths change quickly and hazards are everywhere, especially in narrow routes like the Strait of Hormuz near Iran, where reports indicate that GPS spoofing has been happening since the outbreak of the war. Because ships are large and slow to maneuver, even small navigation errors can lead to groundings or collisions.

Red Sea Grounding

One example came in May 2025. While transiting the Red Sea, the container ship MSC Antonia began showing positions far from its true location. To navigators onboard, this looked like they had jumped hundreds of miles south on the map and started moving in a new direction. This caused the crew to become disoriented, and the ship eventually ran aground. The grounding caused millions of dollars in damage and required a salvage operation that lasted over five weeks.


MSC Antonia route comparison showing the vessel’s true route and grounding point, left, versus the spoofed route, right. The red and black lines on the right show the spoofed locations where the ship appeared to suddenly jump to on GPS. These lines confused the navigators and caused them to run aground. Images created with data from VT Explorer. Anna Raymaker

Incidents like the MSC Antonia are not isolated. Vessel-tracking data has revealed clusters of ships suddenly appearing in impossible locations, sometimes far inland or moving in perfect circles. These anomalies are increasingly linked to GPS spoofing in regions experiencing geopolitical conflict.

But GPS interference is only one type of cyber threat facing ships. Industry reports have documented ransomware attacks on shipping companies, supply chain compromises and increasing concern about the security of onboard control systems, including engines, propulsion and navigation equipment. As ships become more connected through satellite internet systems and remote monitoring tools, the number of potential entry points for cyberattacks is growing.

Military vessels often address these risks through stricter network segregation and regular training exercises such as “mission control” drills, which simulate operating with compromised communications or navigation systems. Some cybersecurity experts argue that similar practices could help commercial shipping improve its resilience, although smaller crews and limited resources make adopting military-style procedures more difficult.

Mariners’ Experiences

Much of the public discussion around maritime cybersecurity focuses on technical vulnerabilities in ship systems. But an equally important piece of the puzzle is the people who must interpret and respond to these technologies when something goes wrong.

In recent research, my colleagues and I interviewed professional mariners about their experiences with cyber incidents and their preparedness to respond to them. The interviews included navigation officers, engineers and other crew members responsible for ship systems. What emerged was a consistent picture: Cyber threats are increasingly occurring at sea, but crews are not well prepared to deal with them.

Many mariners told us that their cybersecurity training focused almost entirely on email phishing and USB drives. That kind of training may make sense in an office, but it does little to prepare crews for cyber incidents on a ship, where navigation and control systems can be the primary targets. As a result, many mariners lack clear guidance on how cyberattacks might affect the equipment they rely on every day.


Commercial shipping crews are generally poorly trained to deal with cyber threats. MenzhiliyAnantoly/iStock via Getty Images

This becomes a problem when ship systems begin behaving strangely. Mariners described GPS showing incorrect positions or temporarily losing signal. It can be difficult to tell whether these incidents are equipment failures or signs of cyber interference.

Even when mariners suspect something may be wrong, many ships lack clear procedures for responding to cyber incidents. Participants frequently described situations where they would have to improvise if navigation or other digital systems behaved unexpectedly. Unlike equipment failures, which have established checklists and procedures, cyber incidents often fall into a gray area where responsibility and response plans are unclear.

Another challenge is the gradual disappearance of traditional navigation practices. For centuries, mariners relied on paper charts and celestial navigation to determine their position. Today, most commercial vessels rely almost entirely on electronic systems.

Many mariners noted that paper charts are not available onboard, and celestial navigation is rarely practiced. If GPS or electronic navigation systems fail, crews have limited ways to independently verify their position. One mariner bluntly described the risk to us: “If you don’t have charts and you’re being spoofed, you’re a little screwed.”

A crew member explains the instruments on the bridge of an oil tanker.

Increasing Connectivity, Increasing Risk

At the same time, ships are becoming more connected. Modern vessels increasingly rely on satellite internet systems like Starlink and remote monitoring tools to manage operations and communicate with shore.

While these technologies improve efficiency, they also expand the vulnerability of ship systems. Connectivity that allows crews to send emails or access the internet can also provide pathways for cyber threats to reach onboard systems.

As GPS spoofing becomes more common in regions experiencing geopolitical conflict, the challenges mariners described in our research are becoming harder to ignore. The oceans may seem vast and empty, but the digital signals that guide modern ships travel through crowded and contested space.

When those signals are manipulated, the consequences do not stay confined to military systems. They reach the commercial vessels that carry most of the world’s goods and the crews responsible for navigating them safely.

 

This article is republished from The Conversation under a Creative Commons license. Read the original article.

 
News Contact
Author:

Anna Raymaker, Ph.D. Candidate in Electrical and Computer Engineering, Georgia Institute of Technology

Media Contact:

Shelley Wunder-Smith
shelley.wunder-smith@research.gatech.edu

Sheepdogs Reveal a Better Way to Guide Robot Swarms


Sheepdogs herding sheep on a farm, moving in a pattern that resembles a flock of birds.

Sheepdogs, bred to control large groups of sheep in open fields, have demonstrated their skills in competitions dating back to the 1870s.

In these contests, a handler directs a trained dog with whistle signals to guide a small group of sheep across a field and sometimes split the flock cleanly into two groups. But sheep do not always cooperate.

Researchers at the Georgia Institute of Technology studied how handler–dog teams manage these unpredictable flocks in sheepdog trials and found principles that extend beyond livestock herding.

In a study published in Science Advances as the cover feature, the researchers applied those insights to computer simulations showing how similar strategies could improve the control of robot swarms, autonomous vehicles, AI agents, and other networked systems where many machines must coordinate their actions despite uncertain conditions.

Group Movement Dynamics

“Birds, bugs, fish, sheep, and many other organisms move in groups because it benefits individuals, including protection from predators,” said Saad Bhamla, an associate professor in Georgia Tech’s School of Chemical and Biomolecular Engineering. “The puzzle is that the ‘group’ is not a single organism. It is built from many individuals, each making local, imperfect decisions.”

When a predator threatens a herd of sheep, individuals near the edge often move toward the center to reduce their own risk, Bhamla explained. “This is ‘selfish herd’ behavior,” he said. “Shepherds exploit that instinct using trained dogs.”

From examining hours of contest footage, the researchers found that controlling small groups of sheep can be harder than managing large ones. A larger group, with more sheep protected in the center, may behave more coherently than a small group as the animals constantly shift between two instincts: “follow the group” and “flee the dog.”

“That switching behavior makes the group unpredictable,” said Tuhin Chakrabortty, a former postdoctoral researcher in the Bhamla Lab who co-led the study.

Looking closely at how dogs and their handlers guide small groups, the researchers found that unpredictability in the flock’s behavior does not always make control harder. “Under the right conditions, that ‘noisy’ behavior might actually be a benefit,” Bhamla said.

Successful Sheep Herding

Sheepdog handlers categorize sheep by how strongly they respond to a dog’s threatening pressure. Some very responsive sheep might panic under too much pressure, while others might ignore mild pressure and require stronger positioning by the dog.

The researchers observed that successful control often followed a two-step pattern. First, the dog subtly influenced the sheep’s orientation while the animals were mostly standing still. Once the flock was aligned in the desired direction, the dog increased pressure to trigger movement. The timing of those actions was critical, because alignment within a small group could disappear quickly as individuals switched between instincts.

“In our simulations, increasing pressure makes the flock reach the desired orientation faster, but how long the flock stays aligned is set mainly by noise,” Chakrabortty said. “In essence, dogs can steer the direction, but they can’t hold that decision indefinitely, so timing matters.”
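Chakrabortty’s point can be illustrated with a toy model (our own sketch, not the study’s published simulation; the function names, parameters, and dynamics here are all hypothetical): the dog’s pressure sets how fast a heading relaxes toward the target, while a separate instinct-switching probability, standing in for noise, sets how long alignment survives once reached.

```python
import random
import statistics

def time_to_align(pressure, target=1.0, tol=0.05, seed=1):
    # A single heading relaxes toward the target at a rate set by the
    # dog's pressure, with a little sensory noise added each step.
    theta, rng, t = 0.0, random.Random(seed), 0
    while abs(theta - target) > tol:
        theta += pressure * (target - theta) + rng.gauss(0, 0.01)
        t += 1
    return t

def alignment_lifetime(switch_prob, n_sheep=10, trials=200, seed=2):
    # Once aligned, each sheep independently flips instinct ("follow the
    # group" vs. "flee the dog") with probability switch_prob per step;
    # the flock counts as aligned until the first sheep flips.
    rng = random.Random(seed)
    lifetimes = []
    for _ in range(trials):
        t = 0
        while all(rng.random() > switch_prob for _ in range(n_sheep)):
            t += 1
        lifetimes.append(t)
    return statistics.mean(lifetimes)

print(time_to_align(0.5), time_to_align(0.05))      # strong pressure aligns faster
print(alignment_lifetime(0.1), alignment_lifetime(0.01))
```

In this sketch, raising `pressure` shortens `time_to_align`, but `alignment_lifetime` depends only on `switch_prob`, echoing the finding that the dog can steer the direction quickly yet cannot make the decision persist.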

Developing Computer Models

To understand the broader implications of that behavior, the team developed computer models that captured how sheep respond both to the dog and to one another. The models allowed the researchers to test different strategies for guiding groups whose members make independent decisions under uncertainty.

They then applied those ideas to simulations of robotic swarms. Engineers often design such systems so that each robot blends signals from all nearby robots before deciding how to move. While that approach works well when signals are clear, it can break down when information is noisy or conflicting, Bhamla explained.

To explain why that switching strategy can work under noisy conditions, the researchers used an analogy of a smoke-filled room where only one person can see the exit, and no one knows who that person is. If everyone polls everyone else and averages the guesses, the one correct signal can get diluted by many noisy ones.

“That’s the counterintuitive part. When only one person has the right information, averaging can wash out the signal. But if you follow one person at a time, and keep switching who that is, the right information can spread through the crowd,” Bhamla said.

Building on that idea, the researchers tested a strategy inspired by the switching behavior they observed in sheep. In the simulations, each robot paid attention to just one source at a time (either a guiding signal or a neighboring robot) and switched that source from one step to the next.

Under noisy conditions, this switching strategy required less effort to keep the group moving along a desired path than either averaging-based strategies or fixed leader-follower strategies.

The researchers call their approach the Indecisive Swarm Algorithm. The name reflects a counterintuitive insight: allowing influence to shift among individuals over time can make groups easier to guide when conditions are uncertain.
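The smoke-filled-room intuition can be made concrete with a loose illustration (our own sketch, not the authors’ actual algorithm; the function names and the infection-style spreading rule are assumptions): one-shot averaging dilutes the lone correct signal to a weight of 1/N, while consulting one randomly chosen source per step, and switching that source every step, lets the correct information spread by contact.

```python
import random

def averaging_estimate(values):
    # Averaging blends the one correct reading (1.0) with everyone
    # else's uninformative readings (0.0), diluting it to 1/N.
    return sum(values) / len(values)

def switching_rounds(n_agents=100, rounds=15, seed=0):
    # Each step, every agent consults ONE randomly chosen agent
    # (a new one each step) and becomes informed if that agent is.
    rng = random.Random(seed)
    informed = [False] * n_agents
    informed[0] = True  # the lone agent who can "see the exit"
    for _ in range(rounds):
        nxt = informed[:]
        for i in range(n_agents):
            source = rng.randrange(n_agents)
            if informed[source]:
                nxt[i] = True
        informed = nxt
    return sum(informed)

values = [1.0] + [0.0] * 99
print(averaging_estimate(values))  # prints 0.01: the signal is washed out
print(switching_rounds())          # most of the 100 agents end up informed
```

The real Indecisive Swarm Algorithm involves continuous motion and control effort rather than this toy infection model, but the contrast is the same: averaging weights the correct source at 1/N forever, whereas per-step switching lets it propagate.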

“Our findings suggest that the same dynamics that make small animal groups unpredictable may also offer new ways to control complex engineered systems,” Bhamla said.

CITATION: Tuhin Chakrabortty and Saad Bhamla, “Controlling noisy herds: Temporal network restructuring improves control of indecisive collectives,” Science Advances, 2026

Sheepdog herding sheep

A dog herding sheep in a sheepdog trial

 
News Contact

US Military Leans Into AI for Attack on Iran, But the Tech Doesn’t Lessen the Need for Human Judgment In War

black and white aerial view of an airfield

AI is helping U.S. forces find and choose targets in Iran, like this airfield. U.S. Central Command via AP

The U.S. military was able “to strike a blistering 1,000 targets in the first 24 hours of its attack on Iran” thanks in part to its use of artificial intelligence, according to The Washington Post. The military has used Claude, the AI tool from Anthropic, combined with Palantir’s Maven system, for real-time targeting and target prioritization in support of combat operations in Iran and Venezuela.

While Claude is only a few years old, the U.S. military’s ability to use it, or any other AI, did not emerge overnight. The effective use of automated systems depends on extensive infrastructure and skilled personnel. It is only thanks to many decades of investment and experience that the U.S. can use AI in war today.

In my experience as an international relations scholar studying strategic technology at Georgia Tech, and previously as an intelligence officer in the U.S. Navy, I find that digital systems are only as good as the organizations that use them. Some organizations squander the potential of advanced technologies, while others can compensate for technological weaknesses.

Myth and Reality in Military AI

Science fiction tales of military AI are often misleading. Popular ideas of killer robots and drone swarms tend to overstate the autonomy of AI systems and understate the role of human beings. Success, or failure, in war usually depends not on machines but the people who use them.

In the real world, military AI refers to a huge collection of different systems and tasks. The two main categories are automated weapons and decision support systems. Automated weapon systems have some ability to select or engage targets by themselves. These weapons are more often the subject of science fiction and the focus of considerable debate.

Decision support systems, in contrast, are now at the heart of most modern militaries. These are software applications that provide intelligence and planning information to human personnel. Many military applications of AI, including in current and recent wars in the Middle East, are for decision support systems rather than weapons. Modern combat organizations rely on countless digital applications for intelligence analysis, campaign planning, battle management, communications, logistics, administration and cybersecurity.

Claude is an example of a decision support system, not a weapon. Claude is embedded in the Maven Smart System, used widely by military, intelligence and law enforcement organizations. Maven uses AI algorithms to identify potential targets from satellite and other intelligence data, and Claude helps military planners sort the information and decide on targets and priorities.

The Israeli Lavender and Gospel systems used in the Gaza war and elsewhere are also decision support systems. These AI applications provide analytical and planning support, but human beings ultimately make the decisions.

Researcher Craig Jones explains how the U.S. military is using artificial intelligence in its attack on Iran, and some of the issues that arise from its use.

The Long History of Military AI

Weapons with some degree of autonomy have been used in war for well over a century. Nineteenth-century naval mines exploded on contact. German buzz bombs in World War II were gyroscopically guided. Homing torpedoes and heat-seeking missiles alter their trajectory to intercept maneuvering targets. Many air defense systems, such as Israel’s Iron Dome and the U.S. Patriot system, have long offered fully automatic modes.

Robotic drones became prevalent in the wars of the 21st century. Uncrewed systems now perform a variety of “dull, dirty and dangerous” tasks on land, at sea, in the air and in orbit. Remotely piloted vehicles like the U.S. MQ-9 Reaper or Israeli Hermes 900, which can loiter autonomously for many hours, provide a platform for reconnaissance and strikes. Combatants in the Russia-Ukraine war have pioneered the use of first-person view drones as kamikaze munitions. Some drones rely on AI to acquire targets because electronic jamming precludes remote control by human operators.

But systems that automate reconnaissance and strikes are merely the most visible parts of the automation revolution. The ability to see farther and hit faster dramatically increases the information processing burden on military organizations. This is where decision support systems come in. If automated weapons improve the eyes and arms of a military, decision support systems augment the brain.

Cold War era command and control systems anticipated modern decision support systems such as Israel’s AI-enabled Tzayad for battle management. Automation research projects like the United States’ Semi-Automatic Ground Environment, or SAGE, in the 1950s produced important innovations in computer memory and interfaces. In the U.S. war in Vietnam, Igloo White gathered intelligence data into a centralized computer for coordinating U.S. airstrikes on North Vietnamese supply lines. The U.S. Defense Advanced Research Projects Agency’s strategic computing program in the 1980s spurred advances in semiconductors and expert systems. Indeed, defense funding originally enabled the rise of AI.

Organizations Enable Automated Warfare

Automated weapons and decision support systems rely on complementary organizational innovation. From the Electronic Battlefield of Vietnam to the AirLand Battle doctrine of the late Cold War and later concepts of network-centric warfare, the U.S. military has developed new ideas and organizational concepts.

Particularly noteworthy is the emergence of a new style of special operations during the U.S. global war on terrorism. AI-enabled decision support systems became invaluable for finding terrorist operatives, planning raids to kill or capture them, and analyzing intelligence collected in the process. Systems like Maven became essential for this style of counterterrorism.

The impressive American way of war on display in Venezuela and Iran is the fruition of decades of trial and error. The U.S. military has honed complex processes for gathering intelligence from many sources, analyzing target systems, evaluating options for attacking them, coordinating joint operations and assessing bomb damage. The only reason AI can be used throughout the targeting cycle is that countless human personnel everywhere work to keep it running.

AI gives rise to important concerns about automation bias, or the tendency for people to give excessive weight to automated decisions, in military targeting. But these are not new concerns. Igloo White was often misled by Vietnamese decoys. A state-of-the-art U.S. Aegis cruiser accidentally shot down an Iranian airliner in 1988. Intelligence mistakes led U.S. stealth bombers to accidentally strike the Chinese embassy in Belgrade, Serbia, in 1999.

Many Iraqi and Afghan civilians died due to analytical mistakes and cultural biases within the U.S. military. Most recently, evidence suggests that a Tomahawk cruise missile struck a girls school adjacent to an Iranian naval base, killing about 175 people, mostly students. This targeting could have resulted from a U.S. intelligence failure.

 

Automated Prediction Needs Human Judgment

The successes and failures of decision support systems in war are due more to organizational factors than technology. AI can help organizations improve their efficiency, but AI can also amplify organizational biases. While it may be tempting to blame Lavender for excessive civilian deaths in the Gaza Strip, lax Israeli rules of engagement likely matter more than automation bias.

As the name implies, decision support systems support human decision-making; AI does not replace people. Human personnel still play important roles in designing, managing, interpreting, validating, evaluating, repairing and protecting their systems and data flows. Commanders still command.

In economic terms, AI improves prediction, which means generating new data based on existing data. But prediction is only one part of decision-making. People ultimately make the judgments that matter about what to predict and how to use predictions. People have preferences, values and commitments regarding real-world outcomes, but AI systems intrinsically do not.

In my view, this means that increasing military use of AI is actually making humans more important in war, not less.

 

This article is republished from The Conversation under a Creative Commons license. Read the original article.

 
News Contact
Author:

Jon R. Lindsay, associate professor of Cybersecurity and Privacy and of International Affairs, Georgia Institute of Technology

Media Contact:

Shelley Wunder-Smith
shelley.wunder-smith@research.gatech.edu

Effective Carbon Removal Requires Transparency, Says New Georgia Tech Research

A tall red‑and‑white industrial smokestack releasing a thick plume of light‑colored smoke into the sky.

Carbon dioxide continues to push global temperatures toward dangerous thresholds that affect everything from public health to economies. To mitigate these effects, researchers are looking into carbon removal methods such as direct air capture machines that can chemically bind with carbon or simple ecological strategies like adding trees to unwooded areas. These approaches could potentially supplement the decarbonization of transport, industry, and the energy system.

But as carbon removal grows, so does a core problem: The carbon removal industry is largely unregulated, particularly for more novel technologies without long-standing norms around reporting and verification. In today’s “voluntary carbon market,” a private company can claim it removed a certain amount of carbon, list that amount for sale, and allow another company to buy it to offset its emissions — with little independent oversight or transparency.

A new npj Climate Action article argues that this system isn’t enough to meet global climate goals and could even end up causing harm. In the paper, Chris Reinhard, Georgia Power Chair and associate professor in Georgia Tech’s School of Earth and Atmospheric Sciences, and Noah Planavsky of the Yale Center for Natural Carbon Capture call for a fundamental shift: Carbon removal should be quantifiable, economically viable, and pursued in ways that benefit local communities, all underpinned by greater transparency in carbon removal practice.

“We argue that it’s important to understand and quantify carbon removal practices that can benefit local communities, like better crop yields, and that this understanding is really only possible if these practices are pursued transparently,” Reinhard said. “The data used to quantify carbon removal and how much it costs need to be transparent — the surest route toward learning what works and building public trust in carbon removal as a solution.”

Transparency Trouble

Reinhard and Planavsky bring a unique technical and policy perspective to the issue. As geochemists, they study how Earth’s chemical composition and geological processes control the carbon cycle. Reinhard also co-founded a carbon removal startup, from which he has since divested. That insider experience and academic background helped them see the disconnect between what is technologically possible and what market forces, cultural or commercial, actually incentivize.

Today’s carbon removal startups often guard their methods and data as proprietary intellectual property. Without regulatory requirements or pressure from corporate carbon buyers, these startups have little reason to disclose carbon accounting practices, cost structures, or actual long-term impacts. The researchers argue that policy guidance and advocacy are needed to shift the industry toward meaningful openness.

“Our expertise is most firmly grounded in the technical dimensions of these carbon removal processes,” Reinhard said, “but we saw an opportunity here to push for better policy and start this dialogue about what transparency really means, in part to foster more public debate about what carbon removal ought to be doing for society.”

Community Beyond Carbon

The authors also stress that carbon removal should deliver benefits beyond atmospheric cleanup that communities can see and advocate for. For example, liming, or adding limestone to soil, can remove carbon while also improving crop yields and reducing erosion. Coastal ecosystem restoration can sequester carbon while strengthening shorelines and supporting fisheries. Georgia Tech’s own direct air capture work builds community engagement into the process to ensure that carbon removal is equitable. 

Reinhard and Planavsky say the next best step for the carbon removal industry is to identify which removal pathways offer the clearest benefits, what they cost, and where transparency gaps are most damaging. This foundation will help create policies that make carbon removal reliable, verifiable, and community-centered. 

Without oversight, they argue, carbon removal risks remaining a niche, market-defined practice — when the climate challenge demands a trusted, scalable, and democratically governed solution.

CITATION: Reinhard, C.T., Planavsky, N.J. The importance of radical transparency for responsible carbon dioxide removal. npj Clim. Action 5, 7 (2026). https://doi.org/10.1038/s44168-025-00324-4

 
News Contact

Tess Malone, Senior Research Writer/Editor
tess.malone@gatech.edu

Georgia Tech Energy Day: Meeting AI’s Growing Energy Demands

Georgia Tech Energy Day 2026 Header Image with three boxes showing an image of a datacenter, an electric bulb with energy sources around it and a multi-colored critical mineral

Georgia Tech Energy Day returns this year on March 19 with an expanded focus and new collaborative momentum. Cohosted by the Georgia Tech Institute for Matter and Systems (IMS) and the Strategic Energy Institute (SEI), with plenary session support from the Energy Policy and Innovation Center, Energy Day 2026 convenes leaders from academia, industry, and government, along with students, to address the challenges of meeting the rapidly growing electricity demand driven by artificial intelligence (AI) and high-performance computing. 

Set in the heart of Tech Square on the Georgia Tech campus, this year’s event explores how energy systems, materials, technologies, supply chains, and policy must evolve in response to AI’s accelerating impact. As digital infrastructure expands and computation intensifies, the need for reliable, resilient, and sustainable power has never been more urgent. 

“Energy Day reflects Georgia Tech’s strength in connecting world-class research in materials and components with the infrastructure and partnerships needed to translate discovery into scalable energy technologies that serve industry, society, and the future economy,” said Eric Vogel, executive director of the IMS and the Hightower Professor in Materials Science and Engineering. 

Energy Day 2026 also marks an important milestone with the introduction of its first group of corporate sponsors: GE Vernova, Southern Company, Georgia Power, Southwire Spark, ExxonMobil, Gems Setra, and Tektronix. Their support reflects a shared commitment to advancing energy solutions. 

“Tektronix is excited to be part of Energy Day because advancing the future of energy starts with precise measurement and trusted insights,” said Christopher Bohn, president of Tektronix. “From power electronics and high voltage systems to grid scale renewables and AI driven control technologies, the breakthroughs discussed here directly align with the innovations we support through our products and solutions. Collaborating with Georgia Tech allows us to engage early with emerging research and the next generation of engineers—critical collaborators in building a cleaner, smarter, and more resilient energy ecosystem.”

The keynote address will be delivered by Vanessa Z. Chan, a nationally recognized leader at the intersection of innovation, commercialization, and emerging technologies. Chan will provide insights on accelerating technological discovery, emphasizing how AI is transforming energy and materials design. She will discuss how commercialization strategies must rapidly evolve across multidisciplinary energy domains from grid modernization to advanced batteries and clean manufacturing.

Building on the themes introduced in the keynote, the program transitions into a fireside chat with Georgia Tech Executive Vice President for Research Tim Lieuwen, featuring Amit Kulkarni and Jim Walsh. Kulkarni is vice president of Product Management and Strategy for the Gas Power business within GE Vernova, where he oversees the world’s largest portfolio of power generation equipment. Walsh, vice president of GE Vernova’s Consulting Services, leads teams providing innovative solutions across the full spectrum of power generation, delivery, and utilization.

Next comes a policy-focused panel that will explore the surge in power demand driven by AI, how the United States is addressing today’s most urgent energy challenges, and the long-term implications of today’s decisions for a sustainable energy future. Bringing together leading voices in U.S. environmental and energy policy, the panel features Joe Aldy of Harvard University and former special assistant to the president for Energy and Environment; Al McGartland of New York University’s Institute for Policy Integrity and former Environmental Protection Agency lead economist and director of the National Center for Environmental Economics; and Kevin Rennert, fellow and director of the Comprehensive Climate Strategies Program at Resources for the Future and former staff member on the U.S. Senate Committee on Energy and Natural Resources.

The second panel focuses on critical materials — the foundation of advanced energy systems and digital technologies. As AI, data centers, and advanced energy technologies drive demand for critical materials, securing them now requires integration and coordination across the entire value chain. Panelists include Rachel Galloway, British consul general in Atlanta; Vijay Murugesan, head of Materials Intelligence and Digital Innovation at Amazon; Charles Sims, Tennessee Valley Authority Distinguished Professor of Energy and Environmental Policy at the University of Tennessee; and Nortey Yeboah, principal engineer at Southern Company. Together, they will offer perspectives on the policy and economic frameworks shaping the energy supply chain, from developing raw resources to manufacturing the technologies essential to future energy systems.

In the afternoon, participants can dive deeper into specialized topics through three focused technical tracks. 

  • “Meeting the Demand for Power” will examine how emerging technologies, advanced nuclear systems, and renewable integration can work together to deliver reliable, resilient electricity.
  • “Data Center Infrastructure and Resources” will explore innovations in thermal management technologies, energy-efficient computing, and the broader resource impacts of expanding digital infrastructure.
  • “Grid Technologies and Markets” will highlight strategies for strengthening grid capacity, incorporating demand-side management, and optimizing carbon performance as energy systems evolve.

“Meeting the rapidly rising electricity demand driven by AI requires bold ideas, coordinated action, and research that moves at the speed of innovation,” said Yuanzhi Tang, executive director of the SEI. “Energy Day 2026 brings together the people and expertise needed to shape resilient, sustainable energy systems for the future. At Georgia Tech, we see this event as a catalyst for new partnerships, new solutions, and a shared commitment to strengthening the nation’s energy foundation.”

Energy Day 2026 is designed for researchers advancing emerging energy technologies, policymakers navigating shifting regulatory and geopolitical landscapes, industry professionals seeking insight into emerging tools and supply chains, and students preparing to enter one of the most consequential sectors of the decade. It also welcomes anyone interested in AI, sustainability, electrification, and critical materials. 

Join us to explore the future of energy. To learn more and register, visit: Energy Day 2026.

 
News Contact

Priya Devarajan | Communications Program Manager

$8.9 Million Approved for Georgia Forestry Innovation Initiative

Tall pine trees in a sunlit forest with dense green grasses and undergrowth covering the forest floor.

Georgia Tech is pleased to partner with the Georgia Forestry Commission on the approved $8.9 million Georgia Forestry Innovation Initiative included in Gov. Brian Kemp’s amended FY 2026 budget.

Georgia’s forest industry has long been a pillar of the state’s rural economy. But in recent years, mill closures and shifting markets have put pressure on landowners, workers, and entire communities, particularly in south Georgia. A recently approved $8.9 million Georgia Forestry Innovation Initiative will help chart a new path forward, creating more value from Georgia’s abundant forest resources and expanding opportunities for the people and regions depending on them. 

The initiative aims to transform low-value wood and mill byproducts into high-value materials, strengthening Georgia’s forest-based economy and supporting new commercial opportunities across the state. It will establish pilot facilities and accelerate technology-to-business transfer in partnership with industry, with the long-term goal of enabling multiple manufacturing sites across Georgia.  

“We appreciate the state’s investment in helping move these innovations from the lab to Georgia businesses,” said Carson Meredith, executive director of Tech’s Renewable Bioproducts Institute (RBI). “We also acknowledge the critical support of industry collaborators and partners like the Georgia Forestry Association and Georgia Forestry Foundation.” 

The work builds on collaborative interdisciplinary research at Georgia Tech involving School of Chemical and Biomolecular Engineering Professors Andreas Bommarius, Chris Luettgen, and Meredith; School of Chemistry and Biochemistry Professor Stefan France and Professor of the Practice A.J. “Bo” Arduengo; and H. Milton Stewart School of Industrial and Systems Engineering Professor Valerie Thomas. Gary Black, RBI program manager, has also contributed to the effort, which is led by RBI’s Center for a Renewables-Based Economy from Wood (ReWOOD). It reflects years of cross-disciplinary collaboration among faculty and staff committed to advancing sustainable, wood-based technologies. 

 
News Contact

Media Contact: Jennifer Martin | jennifer.martin@research.gatech.edu