Celebrate STEAM | Atlanta Science Festival Launch at Georgia Tech

Members of the Georgia Tech community are excited to welcome visitors back to campus for the kickoff event of the 12th annual Atlanta Science Festival. Formerly known as Georgia Tech Science and Engineering Day, Celebrate STEAM will feature hands-on activities for participants of all ages. Whether your interests lie in robotics, brains, biology, space, art, nanotechnology, paper, computer science, wearables, bioengineering, chemical engineering, or systems engineering, there is something for everyone.

Georgia Tech’s Executive Vice President for Research Search: Finalist 1 Seminar

Each candidate’s bio and curriculum vitae, along with further details, will be accessible through the EVPR search site two business days ahead of each visit. Georgia Tech credentials are required to access all materials. Information is being made available in this manner to protect the confidentiality of the finalists.

Finalists Chosen in Georgia Tech’s Executive Vice President for Research Search

Historical sign depicting information about Tech Tower

Georgia Tech’s Executive Vice President for Research (EVPR) search committee has selected three finalists. Each candidate will visit campus and present a seminar sharing their broad vision for the Institute's research enterprise. 

The seminars are open to all faculty, students, and staff across the campus community. Interested individuals can attend in person or participate via Zoom (pre-registration is required for the webinar).

All seminars will take place at 11 a.m. on the following dates:  

  • Candidate 1: Monday, January 13, Scholars Event Theater, Price Gilbert 1280 (register for webinar)  
  • Candidate 2: RESCHEDULED to Wednesday, January 29, Scholars Event Theater, Price Gilbert 1280 (register for webinar)
  • Candidate 3: Monday, January 27, Scholars Event Theater, Price Gilbert 1280 (register for webinar)  

Each candidate’s bio and curriculum vitae, along with further details, will be accessible through the EVPR search site 48 hours prior to each visit. Georgia Tech credentials are required to access all materials. Information is being made available in this manner to protect the confidentiality of the finalists. Following each candidate’s visit, the campus community is invited to share their comments via a survey that will be posted on the candidate’s webpage.   

The search committee is chaired by Susan Lozier, dean of the College of Sciences. Search committee members include a mix of faculty and staff representing colleges and units across campus. Georgia Tech has retained the services of the executive search firm WittKieffer for the search.  

News Contact

Shelley Wunder-Smith | shelley.wunder-smith@research.gatech.edu
Director of Research Communications
 

Intelligent XR for Adaptive Task Guidance in Future Factories

Mohsen Moghaddam
Gary C. Butler Family Associate Professor
H. Milton Stewart School of Industrial and Systems Engineering
George W. Woodruff School of Mechanical Engineering
Georgia Institute of Technology

Monday, April 7
12:00 - 1:00 PM Eastern Time
Location: Callaway/GTMI Building, Room 114

Lunch provided for in-person attendees on a first-come, first-served basis.

Gregory Sawicki to Serve as Interim Director of the Institute for Robotics and Intelligent Machines

Gregory Sawicki


Effective January 1, Gregory Sawicki will serve as interim executive director of the Georgia Tech Institute for Robotics and Intelligent Machines (IRIM). Sawicki is a professor and the Joseph Anderer Faculty Fellow in the George W. Woodruff School of Mechanical Engineering with a joint appointment in the School of Biological Sciences.

“Professor Greg Sawicki will make a great interim executive director of IRIM. He brings experience with robotics and collaborative research to this role,” said Julia Kubanek, professor and vice president for interdisciplinary research at Georgia Tech. “He’ll be a strong partner to faculty, students, and the EVPR team as we explore the future of IRIM and robotics over the next several months.”

Sawicki succeeds Seth Hutchinson, who will be taking a new position at Northeastern University in Boston. Hutchinson, professor and KUKA Chair for Robotics in Georgia Tech’s College of Computing, has served as executive director of IRIM for five years. During Hutchinson’s tenure as executive director, IRIM expanded its industry outreach activities, developed more consistent communications, and grew its faculty pool at Georgia Tech to include a diverse cohort from across the Colleges of Engineering and Computing and the Georgia Tech Research Institute. 

"I am extremely excited to step into this leadership role for IRIM, maintain our research excellence in the foundational areas of robotics, and proactively leverage opportunities to grow across campus and beyond in novel, creative interdisciplinary directions,” said Sawicki. “This will involve new initiatives to incentivize connections with GTRI and other IRI's on campus, to build new industry partnerships, and continue to strengthen the M.S./Ph.D. program in Robotics by engaging with Schools beyond those with a traditional footprint in robotics education and research.”

Sawicki directs the Human Physiology of Wearable Robotics (PoWeR) Lab, where he and his group seek to discover physiological principles underpinning locomotion performance and apply them to develop lower-limb robotic devices capable of improving both healthy and impaired human locomotion. By focusing on the human side of the human-machine interface, his team has begun to create a roadmap for the design of lower-limb robotic exoskeletons that are truly symbiotic – that is, wearable devices that work seamlessly in concert with the underlying physiological systems to augment human locomotion performance.

Sawicki earned a B.S. in mechanical and aerospace engineering from Cornell University in 1999, an M.S. in mechanical and aeronautical engineering from the University of California, Davis, in 2001, and a Ph.D. in neuromechanics from the University of Michigan, Ann Arbor, in 2007. He completed his postdoctoral studies in integrative biology at Brown University in 2009.

Sawicki has been recognized for his interdisciplinary research and teaching. He recently received a $2.6 million Research Project Grant from the National Institutes of Health (NIH) to study how optimization and artificial intelligence can personalize exoskeleton assistance for individuals with symptoms resulting from stroke.* Sawicki was also selected as a 2021 George W. Woodruff School Academic Leadership Fellow and received the 2022 College of Sciences Student Recognition of Excellence in Teaching and the 2023 American Society of Biomechanics Founders’ Award for excellence in research and mentoring. He has also been featured as an expert voice on exoskeletons and human neuromechanics in numerous print and television news stories.

--Christa M. Ernst

*Joint Award with Aaron Young, Assistant Professor in the Woodruff School of Mechanical Engineering

News Contact
Christa M. Ernst | christa.ernst@research.gatech.edu
Research Communications Program Manager
Topic Expertise: Robotics | Data Sciences | Semiconductor Design & Fab

 

Bridging Tradition and Technology: Robotics and AI Open a New Path for Classical Indian Music


Ph.D. student Raghav Sankaranarayanan with Hathaani, his violin-playing robot. (Credit: Wes McRae)

Raghavasimhan Sankaranarayanan has over 200 album and film soundtrack credits to his name, and he has performed in more than 2,000 concerts across the globe. He has composed music across many genres and received numerous awards for his technical artistry on the violin. 

He is also a student at Georgia Tech, finishing up his Ph.D. in machine learning and robotics. 

One might wonder why a successful professional musician would choose to become a student again.

“I always wanted to integrate technology, music, and robotics because I love computers and machines that can move,” he said. “There’s been little research on Indian music from a technological perspective, and the AI and music industries largely focus on Western music. This bias is something I wanted to address.”

Sankaranarayanan, who began playing the violin at age 4, has focused his academic studies on bridging the musically technical with the deeply technological. Over the past six years at Georgia Tech, he has explored robotic musicianship, creating a robot violinist and an accompanying synthesizer capable of understanding, playing, and improvising the music closest to his heart: classical South Indian music.

The Essence of Carnatic Music

Carnatic music, a classical form of South Indian music, is believed to have originated in the Vedas, or ancient sacred Hindu texts. The genre has remained faithful to its historic form, with performers often using non-amplified sound or only a single microphone. A typical performance includes improvisation and musical interaction between musicians, in which violinists play a crucial role. 

Carnatic music is characterized by intricate microtonal pitch variations known as gamakas — musical embellishments that modify a single note’s pitch or seamlessly transition between notes. In contrast, Western music typically treats successive notes as distinct entities.
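
To make the contrast concrete, here is a minimal, illustrative Python sketch (not code from the project) that renders the same two pitches first as discrete Western-style notes and then as a gamaka-like continuous contour. The 440 Hz tonic, the contour rate, and the ornament shape are assumptions chosen purely for illustration.

    # Illustrative sketch only: contrasting discrete notes with a gamaka-like
    # continuous pitch contour between the same two pitches.
    import numpy as np

    rate = 100                       # pitch samples per second (contour rate, not audio rate)
    t = np.linspace(0.0, 1.0, rate)  # one second of pitch contour

    f_start, f_end = 440.0, 493.9    # assumed tonic (sa) and the note a whole tone above (ri)

    # Western-style rendering: two distinct, steady notes.
    discrete = np.where(t < 0.5, f_start, f_end)

    # Gamaka-style rendering: a continuous glide decorated with a decaying
    # oscillation, so pitch is a smooth curve rather than two flat segments.
    glide = f_start + (f_end - f_start) * t
    ornament = 12.0 * np.sin(2 * np.pi * 5 * t) * np.exp(-3 * t)
    gamaka = glide + ornament

    print(discrete[:3], gamaka[:3])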

Out of a desire to contribute technological advancements to the genre, Sankaranarayanan set out to innovate. When he joined the Center for Music Technology program under Gil Weinberg, professor and the Center’s director, no one at Georgia Tech had ever attempted to create a string-based robot.

“In our work, we develop physical robots that can understand music, apply logic to it, and then play, improvise, and inspire humans,” said Weinberg. “The goal is to foster meaningful interactions between robots and human musicians that spark creativity and the kind of musical discoveries that may not have happened otherwise.”

The Brain and the Body

Sankaranarayanan conceptualizes the robot as comprising two parts: the brain and the body. The “body” consists of mechanical systems that require algorithms to move accurately, including sliders and actuators that convert electrical signals into motion to produce sound. The “brain” consists of algorithms that enable the robot to understand and generate music.
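
As a rough sketch of that division of labor, the hypothetical interface below separates a "brain" that decides what to play from a "body" that turns pitch contours into actuator commands. All class names, constants, and the ideal-string position formula are illustrative assumptions, not the robot's actual software.

    # Hypothetical sketch of the "brain"/"body" split; names and numbers are
    # illustrative assumptions, not the robot's actual interfaces.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class NoteEvent:
        onset_s: float                  # when the note starts, in seconds
        duration_s: float
        pitch_contour_hz: List[float]   # continuous pitch samples (gamakas included)
        bow_pressure: float             # normalized 0..1

    class Brain:
        """Musical intelligence: decides what to play."""
        def improvise(self, heard: List[NoteEvent]) -> List[NoteEvent]:
            return heard                # placeholder; a real system would run a learned model

    class Body:
        """Mechanical control: decides how to play it."""
        STRING_LENGTH_M = 0.33          # assumed vibrating string length
        OPEN_STRING_HZ = 440.0          # assumed open-string pitch

        def execute(self, events: List[NoteEvent]) -> None:
            for ev in events:
                # Ideal-string approximation: finger position measured from the nut.
                positions = [self.STRING_LENGTH_M * (1.0 - self.OPEN_STRING_HZ / f)
                             for f in ev.pitch_contour_hz]
                self._drive_actuators(positions, ev.bow_pressure, ev.duration_s)

        def _drive_actuators(self, positions, pressure, duration_s) -> None:
            pass                        # hardware-specific; omitted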

In robotic musicianship, algorithms interpret and perform music, but building these algorithms for non-Western music is challenging; far less data is available for these forms. This lack of representation limits the capabilities of robotic musicianship and diminishes the cultural richness diverse musical forms can offer. 

Classical algorithms would struggle to capture the nuances of Carnatic music. To address this, Sankaranarayanan collected data specifically to model gamakas in Carnatic music. Then, using audio from performances by human musicians, he developed a machine-learning model to learn those gamakas. 
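
A toy version of that idea, sketched below in PyTorch, treats a gamaka as a pitch-contour sequence and trains a small recurrent network to predict the next contour sample. The architecture, sizes, and synthetic training data are illustrative assumptions, not the model used in the research.

    # Toy sketch: a small recurrent model that predicts the next pitch-contour
    # sample. Illustrative only; not the architecture used in the research.
    import torch
    import torch.nn as nn

    class GamakaContourModel(nn.Module):
        def __init__(self, hidden: int = 64):
            super().__init__()
            self.rnn = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
            self.head = nn.Linear(hidden, 1)

        def forward(self, contour: torch.Tensor) -> torch.Tensor:
            # contour: (batch, time, 1) normalized pitch values
            out, _ = self.rnn(contour)
            return self.head(out[:, -1])        # next-sample prediction

    # One training step on a synthetic contour; real data would come from
    # pitch-tracked recordings of human violinists.
    model = GamakaContourModel()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    contour = torch.sin(torch.linspace(0, 6.28, 101)).reshape(1, -1, 1)
    x, y = contour[:, :-1, :], contour[:, -1, :]
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    optimizer.step()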

“You may ask, ‘Why not just use a computer?’ A computer can respond with algorithms, but music’s physicality is vital,” Weinberg said. “When musicians collaborate, they rely on the visual cues of movement, which make the interaction feel alive. Moreover, acoustic sound created by a physical instrument is richer and more expressive than computer-generated sound, and a robot musician provides this.”

Sankaranarayanan built the robot incrementally. Initially, he developed a bow mechanism that moved across wheels; now, the robot violin uses a real bow for authentic sound production.

Developing a New Musical Language

Another challenge involves technologies like MIDI (Musical Instrument Digital Interface), a protocol that enables electronic musical instruments and devices to communicate and sync by sending digital information about musical notes and performances. MIDI, however, is based on Western music systems and is limited in its application to music with microtonal pitch variations such as Carnatic music.
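
The sketch below, using the mido library, illustrates why: approximating a gamaka in MIDI means sending a note-on followed by a stream of channel-wide pitch-bend messages, with the contour clipped to the synthesizer's bend range (commonly two semitones by default). The contour values and helper function are illustrative, not part of the project's software.

    # Illustrative sketch: approximating a continuous pitch contour with MIDI
    # pitch-bend messages exposes MIDI's limits for gamakas.
    import mido

    BEND_RANGE_SEMITONES = 2.0    # common default pitch-bend range on synthesizers

    def gamaka_to_midi(base_note: int, contour_semitones: list) -> list:
        """Approximate a pitch contour (offsets in semitones from base_note).
        Offsets beyond the bend range are clipped, and each bend message
        affects the whole channel, not just this note."""
        msgs = [mido.Message('note_on', note=base_note, velocity=80)]
        for offset in contour_semitones:
            clipped = max(-BEND_RANGE_SEMITONES, min(BEND_RANGE_SEMITONES, offset))
            bend = int(8191 * clipped / BEND_RANGE_SEMITONES)
            msgs.append(mido.Message('pitchwheel', pitch=bend))
        msgs.append(mido.Message('note_off', note=base_note))
        return msgs

    # A slow oscillation around the note, sampled coarsely:
    messages = gamaka_to_midi(69, [0.0, 0.5, 1.0, 0.5, 0.0, -0.5])
    print(len(messages), "MIDI messages for one ornamented note")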

So Sankaranarayanan and Weinberg developed their own system. Using audio files of human violin performances, the system extracts musical features that inform the robot on bowing techniques, left-hand movements, and pressure on strings. The software synthesizer then listens to Sankaranarayanan’s playing, responding and improvising in real time and creating a dynamic interplay between human and robot.
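
One plausible step in a pipeline like this is sketched below with librosa's pYIN pitch tracker: it extracts a continuous pitch contour from a recorded phrase and expresses it in cents relative to an assumed tonic, the kind of feature that could inform left-hand movement. The file name, tonic, and frequency range are illustrative assumptions rather than details of the actual system.

    # Hedged sketch: track a violin phrase's pitch contour and express it in
    # cents relative to an assumed tonic. Not the project's actual pipeline.
    import librosa
    import numpy as np

    y, sr = librosa.load('violin_phrase.wav', sr=None)      # hypothetical recording
    f0, voiced, _ = librosa.pyin(y,
                                 fmin=librosa.note_to_hz('G3'),
                                 fmax=librosa.note_to_hz('E7'),
                                 sr=sr)

    tonic_hz = 311.1                                         # assumed tonic (sa)
    contour_cents = 1200.0 * np.log2(f0 / tonic_hz)          # NaN where unvoiced

    voiced_cents = contour_cents[voiced]
    print(f"{voiced_cents.size} voiced frames, spanning "
          f"{np.nanmin(voiced_cents):.0f} to {np.nanmax(voiced_cents):.0f} cents")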

“Like in many other fields, bias also exists in the area of music AI, with many researchers and companies focusing on Western music and using AI to understand tonal systems,” Weinberg said. “Raghav’s work aims to showcase how AI can also generate and understand non-Western music, which he has achieved beautifully.”

Giving Back to the Community

Carnatic music and its community of musicians shaped Sankaranarayanan's musical sensibility, motivating him to give back. He is developing an app that teaches Carnatic music, with the goal of making the genre more appealing to younger audiences.

“By merging tradition with technology, we can expand the reach of traditional Carnatic music to younger musicians and listeners who desire more technological engagement,” Sankaranarayanan said. 

Through his innovative work, he is not just preserving Carnatic music but also reshaping its future for a digital age, inviting a new generation to engage with its deep heritage.

 

News Contact

Catherine Barzler, Senior Research Writer/Editor

catherine.barzler@gatech.edu