Gauging Glaciers: Alex Robel Awarded NSF CAREER Grant for New Ice Melt Modeling Tool

A stylized glacier (Selena Langner)

Alex Robel is improving how computer models of melting ice sheets incorporate data from field expeditions and satellites by creating a new open-access software package — complete with state-of-the-art tools and paired with ice sheet models that anyone can use, even on a laptop or home computer.

Improving these models is critical: while melting ice sheets and glaciers are top contributors to sea level rise, there are still large uncertainties in sea level projections at 2100 and beyond.

“Part of the problem is that the way that many models have been coded in the past has not been conducive to using these kinds of tools,” Robel, an assistant professor in the School of Earth and Atmospheric Sciences, explains. “It's just very labor-intensive to set up these data assimilation tools — it usually involves someone refactoring the code over several years.”

“Our goal is to provide a tool that anyone in the field can use very easily without a lot of labor at the front end,” Robel says. “This project is really focused around developing the computational tools to make it easier for people who use ice sheet models to incorporate or inform them with the widest possible range of measurements from the ground, aircraft and satellites.”

Now, a $780,000 NSF CAREER grant will help him to do so. 

The National Science Foundation Faculty Early Career Development Award is a five-year funding mechanism designed to help promising researchers establish a personal foundation for a lifetime of leadership in their field. Known as CAREER awards, the grants are NSF’s most prestigious funding for untenured assistant professors.

“Ultimately,” Robel says, “this project will empower more people in the community to use these models and to use these models together with the observations that they're taking.”
 

Ice sheets remember

“Largely, what models do right now is they look at one point in time, and they try their best — at that one point in time — to get the model to match some types of observations as closely as possible,” Robel explains. “From there, they let the computer model simulate what it thinks that ice sheet will do in the future.”

In doing so, the models often assume that the ice sheet starts in a state of balance, neither gaining nor losing ice at the start of the simulation. The problem with this approach is that ice sheets change dynamically, responding to past events — even ones that happened centuries ago. “We know from models and from decades of theory that the natural response time scale of thick ice sheets is hundreds to thousands of years,” Robel adds.

By informing models with historical records, observations, and measurements, Robel hopes to improve their accuracy. “We have observations being made by satellites, aircraft, and field expeditions,” says Robel. “We also have historical accounts, and can go even further back in time by looking at geological observations or ice cores. These can tell us about the long history of ice sheets and how they've changed over hundreds or thousands of years.”

Robel’s team plans to use a set of techniques called data assimilation to adjust, or ‘nudge’, models. “These data assimilation techniques have been around for a really long time,” Robel explains. “For example, they’re critical to weather forecasting: every weather forecast that you see on your phone was ultimately the product of a weather model that used data assimilation to take many observations and apply them to a model simulation.”
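The nudging idea Robel describes can be sketched in a few lines. The following is an illustrative toy only, not Robel's actual method: the model variable, observation value, and relaxation weight are all invented for demonstration. Each assimilation cycle relaxes the model state part of the way toward the measurement.

```python
def nudge(model_state, observation, weight=0.3):
    """Relax a model state toward an observation.

    weight=0 ignores the data entirely; weight=1 replaces the
    model value with the observation outright.
    """
    return model_state + weight * (observation - model_state)

# Toy example: a modeled ice thickness (in meters) nudged toward a
# hypothetical satellite-derived measurement over repeated cycles.
state = 1000.0   # model's current ice thickness
obs = 950.0      # observed thickness (invented value)
for _ in range(5):
    state = nudge(state, obs)
print(round(state, 1))  # the model drifts toward the observation
```

Real data assimilation schemes (Kalman filters, variational methods) are far more sophisticated, weighting model and data by their respective uncertainties, but the core idea of blending a simulation with observations is the same.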

“The next part of the project is going to be incorporating this data assimilation capability into a cloud-based computational ice sheet model,” Robel says. “We are planning to build an open source software package in Python that can use this sort of data assimilation method with any kind of ice sheet model.”

Robel hopes the package will expand accessibility by sparing research groups the years of re-coding and refactoring that setting up data assimilation tools has traditionally required.

Building software for accessibility

Robel’s team will then apply their software package to a widely used model, which now has an online, browser-based version. “The reason why that is particularly useful is because the place where this model is running is also one of the largest community repositories for data in our field,” Robel says.

Called Ghub, this relatively new repository is designed to be a community-wide place for sharing data on glaciers and ice sheets. “Since this is also a place where the model is living, by adding this capability to this cloud-based model, we'll be able to directly use the data that's already living in the same place that the model is,” Robel explains. 

Users won’t need to download data, or have a high-speed computer to access and use the data or model. Researchers collecting data will be able to upload their data to the repository, and immediately see the impact of their observations on future ice sheet melt simulations. Field researchers could use the model to optimize their long-term research plans by seeing where collecting new data might be most critical for refining predictions.

“We really think that it is critical for everyone who's doing modeling of ice sheets to be doing this transient data assimilation to make sure that our simulations across the field are all doing the best possible job to reproduce and match observations,” Robel says. While in the past, the time and labor involved in setting up the tools has been a barrier, “developing this particular tool will allow us to bring transient data assimilation to essentially the whole field.”

Bringing Real Data to Georgia’s K-12 Classrooms

The project’s applications and user base extend beyond the scientific community: Robel is already developing a K-12 curriculum on sea level rise, in partnership with Georgia Tech CEISMC Researcher Jayma Koval. “The students analyze data from real tide gauges and use them to learn about statistics, while also learning about sea level rise using real data,” he explains.

Because the curriculum matches with state standards, teachers can download the curriculum, which is available for free online in partnership with the Southeast Coastal Ocean Observing Regional Association (SECOORA), and incorporate it into their preexisting lesson plans. “We worked with SECOORA to pilot a middle school curriculum in Atlanta and Savannah, and one of the things that we saw was that there are a lot of teachers outside of middle school who are requesting and downloading the curriculum because they want to teach their students about sea level rise, in particular in coastal areas,” Robel adds.

Many Georgia high schools offer a data science class that is part of the state’s computer science standards. “Now, we are partnering with a high school teacher to develop a second standards-aligned curriculum that is meant to be taught ideally in a data science class, computer class or statistics class,” Robel says. “It can be taught as a module within that class and it will be the more advanced version of the middle school sea level curriculum.”

The curriculum will guide students through using data analysis tools and coding to analyze real sea level data sets, while learning the science behind what causes variations in sea level, what causes sea level rise, and how to predict sea level changes.
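An exercise in this spirit might look like the sketch below. It is a hypothetical example, not taken from the curriculum: the tide-gauge record is synthetic, with an invented trend, seasonal cycle, and noise, and a straight-line fit recovers the long-term trend despite the short-term variability.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(0, 30, 1 / 12)              # 30 years of monthly samples
true_trend = 3.0                              # assumed trend, mm per year
seasonal = 40 * np.sin(2 * np.pi * years)     # annual cycle, mm
noise = rng.normal(0, 20, years.size)         # storm and tide variability, mm
sea_level = true_trend * years + seasonal + noise

# A linear least-squares fit separates the slow rise from the
# seasonal cycle and noise.
trend, intercept = np.polyfit(years, sea_level, 1)
print(f"estimated trend: {trend:.2f} mm/yr")
```

Students working with real gauge records (for example, from NOAA tide stations) would load measured data in place of the synthetic series, but the statistical reasoning is the same.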

“That gets students to think about computational modeling and how computational modeling is an important part of their lives, whether it's to get a weather forecast or play a computer game,” Robel adds. “Our goal is to get students to imagine how all these things are combined, while thinking about the way that we project future sea level rise.”

 

Alex Robel (Credit: Allison Carter)
 
News Contact

Written by Selena Langner

Contact: Jess Hunt-Ralston

The Oceans Are Missing Their Rivers

In a rhythm that’s pulsed through epochs, a river’s plume carries sediment and nutrients from the continental interior into the ocean, a major exchange of resources from land to sea. More than 6,000 rivers worldwide surge freshwater into oceans, delivering nutrients, including nitrogen and phosphorus, that feed phytoplankton, generating a bloom of life that in turn feeds progressively larger creatures. They may even influence ocean currents in ways researchers are just starting to understand. But today, in rivers around the world, humans are altering this critical phenomenon.

Using Rocks to Hammer Out a Connection Between Visual Gaze and Motor Skills Learning

For his latest research on motor skills, visual learning, and their effects on human physiology, School of Biological Sciences associate professor Lewis Wheaton and his team went all the way back to the Paleolithic Era to study a very retro skill: stone toolmaking.

“One of the cool things about this particular study,” Wheaton says, “is this opportunity to look at a completely novel motor task, something most people have no idea how to do, and that’s making a stone tool.”

The new research, published today in Communications Biology, attempts to fill in the gaps when it comes to the science of how we learn complex motor skills — and what may be required to relearn them. 

Wheaton says there are studies researching the behavioral changes involved in learning complex skills. But research is still thin on how people adapt their visuomotor skills (how vision and movements combine) to carry out a complex task. Wheaton’s current study sought to quantify and evaluate changes in action perception processes – how we understand actions, then select, organize, and interpret what needs to be done for a particular task.

“The overall motivation was to determine if we could see any kind of emerging relationship between the perceptual system and the motor system, as somebody is really trying to learn to do this skill,” Wheaton says. Those are important processes to understand, he adds, not just for how people attain complex motor skills learning, but what would be needed for motor relearning, as in a rehabilitation setting.

Wheaton conducted the research with graduate students Kristel Yu Tiamco Bayani and Nikhilesh Natraj, plus three researchers from Emory University’s Department of Anthropology.

Tracking the eyes to learn about learning 

The test subjects in the study watched videos of Paleolithic stone toolmaking over more than 90 hours of training. The subjects’ visual gaze patterns and motor performance were checked at three training time points: the first time they watched the video, at 50 hours of training, and at approximately 90 hours. Every subject was able to make a stone tool (with varying degrees of success) by 90 hours, but some picked up the skills at 50 hours.

Wheaton says there was a lot of information to pay attention to in the videos. “There’s a lot of physics in (making stone tools). You’re hitting a rock which is made up of all different kinds of material. There could be a fissure or fault lines, and if you hit it the wrong way it could crumble. When you’re doing it at first, you don’t know that.”

As the video training went on, the participants started to pick up cues about how to strike the rock, along with other aspects of toolmaking. “At first you’re watching from curiosity, then you’re watching with intent.”

That was the exciting part for Wheaton and his team: being able to see the different phases of learning during the training, which they could observe by monitoring gaze tracking, or where the subjects’ eyes landed on the video screen as they watched (see photo).

“Part of the study was to understand the variability in where they are visually focused as they get better at the task,” he says.

That’s how Wheaton’s team found there are certain parts of the skills learning that connect better to gaze, but others that connect better to the physical act of making a stone tool. “As you’re going through time, your motor abilities are changing, and at some point that allows you to watch somebody else perform the same task differently, suggesting you’re able to follow the action better, and pull more information from the video in a much clearer way.”

The study not only found a connection between gaze and motor skills learning, but that the connection evolved as the learning went on. The next step in this research, Wheaton says, should include brain imaging “heat maps” to determine where learning takes place with this process. 

That could also help Wheaton’s team apply these lessons for rehabilitation purposes.

“That’s the link between that and some of the other work we’ve done in a rehab context,” he says. “If you’re watching somebody perform a task, if you’re undergoing rehab, there are different ways you’re watching the task. You’re not always watching it the same way. Maybe it depends on how good you are, or how you’re impaired, but all those variables play a role into what you’re visually pulling out” of the rehab training.

 

DOI: doi.org/10.1038/s42003-021-02768-w

 
News Contact

Renay San Miguel
Communications Officer II/Science Writer
College of Sciences
404-894-5209

 

Specialized Cells or Multicellular Multitaskers? New Study Reshapes Early Economics and Ecology Behind Evolutionary Division of Labor

A new study from researchers in the School of Biological Sciences and School of Physics focuses on the evolution of reproductive specialization – how early single cells first got together to create more complex multicellular organisms. In particular, scientists leading the study sought to better understand how those early cells decided which ones would focus on reproduction, and which ones would get busy building parts of a larger organism.

The work, published this month in the journal eLife, references “division of labor,” “trade,” “productivity” and “return on investment” (ROI) to describe those cellular activities. If that sounds like a paper destined for a business magazine instead of a peer-reviewed journal on biological sciences research, there’s a good reason.

As the study, led by assistant professor Peter Yunker and associate professor Will Ratcliff, notes in the abstract, “A large body of work from evolutionary biology, economics, and ecology has shown that specialization is beneficial when further division of labor produces an accelerating increase in absolute productivity.” In other words, the prevailing theories state that specialization pays off only when it increases total productivity – whether it’s a multicellular organism or widgets streaming out of a factory.
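That “accelerating returns” condition can be illustrated with a classic toy model of reproductive division of labor (a Michod-style fitness calculation, not the paper's topological model; the power-law return and effort values below are invented for demonstration). Each cell splits one unit of effort between reproduction and viability, returns on effort follow a power law, and group fitness is mean fecundity times mean viability.

```python
def group_fitness(efforts, alpha):
    """Group fitness as mean fecundity times mean viability.

    efforts[i] is the fraction of cell i's effort put into
    reproduction; the remainder goes to viability. Returns on
    effort follow the power law x**alpha: alpha > 1 means
    accelerating returns, alpha < 1 diminishing returns.
    """
    fec = [x ** alpha for x in efforts]
    via = [(1 - x) ** alpha for x in efforts]
    B = sum(fec) / len(efforts)   # mean fecundity
    V = sum(via) / len(efforts)   # mean viability
    return B * V

generalists = [0.5, 0.5]   # every cell does both tasks
specialists = [1.0, 0.0]   # one reproductive cell, one somatic cell

# Accelerating returns (alpha > 1): full specialization wins.
print(group_fitness(specialists, 2.0) > group_fitness(generalists, 2.0))  # True
# Diminishing returns (alpha < 1): generalists do better.
print(group_fitness(generalists, 0.5) > group_fitness(specialists, 0.5))  # True
```

The Yunker-Ratcliff result described below is striking precisely because it finds specialization favored even in regimes where a calculation like this one would predict generalists should win, once the branchy topology of cell interactions is taken into account.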

What Yunker, from the School of Physics and the Parker H. Petit Institute for Bioengineering and Bioscience, and Ratcliff, from the School of Biological Sciences and co-director of the Interdisciplinary Ph.D. in Quantitative Biosciences (QBioS), have found is that the conditions for the evolution of specialized cells were actually much broader than previously thought. Absolute productivity be darned, the cells seem to say; specialization appeared to be a winning strategy, even under conditions that should favor cellular self-sufficiency.

Why? It has to do with the topology of the network of cells within the organism – what Ratcliff calls a branchy structure. That topology means the division of labor can be favored even when productivity suffers.

“Topological constraints in early multicellularity favor reproductive division of labor” is the title of the team’s paper. Yunker and Ratcliff collaborated with several other Georgia Tech faculty and graduate students on the research: Joshua S. Weitz, Patton Distinguished Professor in the School of Biological Sciences and co-director of QBioS; School of Physics graduate students David Yanni and Shane Jacobeen; and School of Biological Sciences graduate student Pedro Marquez-Zacarias. All are members of Georgia Tech’s Center for Microbial Dynamics and Infection.

Multicellular multitasking

As cells get more complex, they begin to specialize. Some cells are dedicated to reproduction, while others are devoted to other general tasks such as making and maintaining the organism’s body. “In this paper, what we’re trying to figure out is, when is it a good idea to specialize and have that pay off, and when it is a good idea for your cells to remain generalists?” Ratcliff says. “Under what conditions does evolution favor specialization, and in what conditions do simple multicellular organisms keep every cell a generalist?”

For centuries, scientists have known that specialization is very important for multicellularity. “Once we had microscopes, we were off to the races learning about specialization,” Ratcliff says. 

The thinking for the last few decades has been that more specialized cells evolve when specialization results in increasingly higher productivity. “That will push things to complete specialization because there’s more to be gained by specializing than not specializing.” 

Yet what if those cells are not interacting randomly with a lot of other cells, but only with a few cells over and over again? “This is actually the case for a little branchy structure that contains mom and all her kids. The only cells you are attached to are the ones that gave rise to you, and the ones that arise from you,” he says. Those “branchy structures” offer the topological constraints mentioned in the title of the research study. 

Branch banking of cellular products

Yunker explains that those tree-branchy structures can be thought of as similar to fractals, in which math functions are repeated again and again and are depicted as jagged borders stretching into infinity. 

“Mandelbrot sets and the broader study of fractals have been an inspiration for a lot of this,” Yunker says. “After the concepts behind fractals were identified, people eventually started to see them everywhere. Instead of some unique esoteric thing, it was pervasive. In a similar vein, the structures that we find make evolving division of labor easier, these sparse filaments and branched topologies, are common in nature,” including so-called snowflake yeast and some forms of algae.

Yunker agrees that it may seem counterintuitive, but according to his team’s mathematical models, restricting cellular interactions, such as the swapping of products that can enhance reproduction, actually makes specialization easier to evolve.

Cells that produce the same products won’t interact or 'trade' with each other, since that would be a waste of energy and efficiency. “A redundancy comes into play here,” Yunker says. “If you have a lot of similar cells trading, that increased productivity doesn’t do you a lot of good. Whereas if you have dissimilar or opposites trading, even with lower productivity, they’re able to direct those resources in a more efficient manner.”

What can economists and cancer researchers learn from these cells?

Since economics has already figured into the study of how multicellular organisms evolved, with all of that labor and trade and ROI, could that discipline have something to learn from Yunker and Ratcliff’s new theory — could the lessons mean a more efficient way to make all kinds of products?

“Could this apply in economics? Could it apply elsewhere?” Yunker echoes. “This is something we would love to pursue going forward.”

Ratcliff notes the multidisciplinary approach his biophysics and biosciences team took to the study, which also involved mathematical models developed by Weitz. “We were really motivated by understanding both how life got to be complex, and the rules for why it did,” he says. “This paper follows into the ‘why’ category. Fundamental mathematics tells you about the rules evolution plays by, and there are a lot of downstream applications, like cancer research, agriculture, and infectious disease. You never really can predict how someone will leverage basic insight.”

 
News Contact

Renay San Miguel
Communications Officer
College of Sciences
404-894-5209