When people live and work on the Moon during the Artemis missions, they will need good navigation aids. Sure, they’ll have a GPS equivalent to help them find their way, and there will be LunaNet, the Moon’s equivalent of the Internet. But some places on the Moon are quite remote, and in those cases explorers might need more than one method of communication and navigation. That prompted NASA Goddard research engineer Alvin Yew to develop an AI-driven local mapping service that uses nearby landmarks for navigation.
The idea is to use surface data already collected from astronaut photos and mapping missions to provide overlapping navigational aids. “It is important for safety and scientific geotagging that explorers know exactly where they are when exploring the lunar landscape,” said Alvin Yew, research engineer at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “Equipping an onboard device with a local map would support any mission, whether robotic or human.”
Having a map-based system as a backup would make life much easier for researchers in craters, for example, Yew said. “The motivation for me was to enable lunar crater exploration where the entire horizon would be the crater rim.”
The collection of ridges, craters, and boulders that make up a lunar horizon can be used by artificial intelligence to precisely locate a lunar traveler. The system being developed by research engineer Alvin Yew would provide a backup location service for future explorers, robotic or human. Credit: NASA/MoonTrek/Alvin Yew
Use of moon mapping data as a navigational aid
At the heart of Yew’s system is data from NASA’s Lunar Reconnaissance Orbiter (LRO). The spacecraft has been mapping the lunar surface in great detail while carrying out other lunar research and exploration tasks, and its onboard Lunar Orbiter Laser Altimeter (LOLA) provides high-resolution topographic maps of the Moon.
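As a rough, hypothetical illustration of working with this kind of data (not part of Yew’s actual pipeline), a gridded LOLA elevation product exported as a GeoTIFF could be loaded into Python with the rasterio library; the file name below is made up.

```python
import numpy as np
import rasterio  # common geospatial raster reader

# Hypothetical file name: a gridded LOLA elevation product exported as GeoTIFF.
DEM_PATH = "lola_ldem_crater_region.tif"

with rasterio.open(DEM_PATH) as src:
    dem = src.read(1).astype(np.float32)  # elevation grid, meters
    xres, yres = src.res                  # grid spacing in map units
    print(f"DEM shape: {dem.shape}, cell size: {xres} x {yres}")
```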
Yew fed LOLA data into an AI program that uses digital elevation models to recreate features on the lunar horizon, rendering them as they would appear to an explorer standing on the lunar surface. The result is a series of digital panoramas that the AI can correlate with known surface features, such as large boulders or ridges, with the goal of providing accurate location identification for a given area.
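To make the idea concrete, here is a minimal sketch, in Python with hypothetical function names, of how horizon matching could work in principle: sweep azimuths around a candidate position, record the highest elevation angle along each ray to build a synthetic horizon, then pick the candidate whose synthetic horizon best fits an observed profile. This is an illustration of the general technique, not Yew’s implementation; a real system would also model lunar curvature and the unknown heading of the camera.

```python
import numpy as np

def horizon_profile(dem, cell_size, pos, n_az=360, max_range=30_000.0,
                    step=30.0, eye_height=2.0):
    """Synthetic horizon seen from grid position `pos` (row, col):
    for each azimuth, the maximum elevation angle out to `max_range` meters.
    Simplified flat-grid sketch; a real tool would account for curvature."""
    r0, c0 = pos
    z0 = dem[r0, c0] + eye_height
    nrows, ncols = dem.shape
    profile = np.full(n_az, -np.pi / 2)
    for i, az in enumerate(np.linspace(0.0, 2.0 * np.pi, n_az, endpoint=False)):
        for dist in np.arange(step, max_range, step):
            r = int(round(r0 - dist * np.cos(az) / cell_size))  # north = -row
            c = int(round(c0 + dist * np.sin(az) / cell_size))  # east  = +col
            if not (0 <= r < nrows and 0 <= c < ncols):
                break
            angle = np.arctan2(dem[r, c] - z0, dist)
            if angle > profile[i]:
                profile[i] = angle
    return profile

def match_location(observed, dem, cell_size, candidates):
    """Return the candidate (row, col) whose synthetic horizon best matches
    the observed elevation-angle profile, by least-squares error."""
    best_pos, best_err = None, np.inf
    for pos in candidates:
        synthetic = horizon_profile(dem, cell_size, pos, n_az=len(observed))
        err = float(np.sum((synthetic - observed) ** 2))
        if err < best_err:
            best_pos, best_err = pos, err
    return best_pos, best_err
```

In practice, the azimuth origin of an explorer’s panorama would not be known in advance, so a matcher would also test circular shifts of the observed profile before scoring it.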
“Conceptually, it’s like going outside and trying to figure out where you are by looking at the horizon and surrounding landmarks,” Yew said. “While estimating the distance to a baseball field may be easy for a person, we want to demonstrate accuracy on the ground to less than 9 meters (30 feet). That accuracy opens the door to a wide range of mission concepts for future exploration.”
Yew’s geolocation system also builds on the capabilities of the Goddard Image Analysis and Navigation Tool (GIANT), developed by Goddard engineer Andrew Liounis. Scientists used GIANT to check and verify navigation data for NASA’s OSIRIS-REx mission, the spacecraft that flew to the asteroid Bennu to collect a sample for return to Earth.
Moon maps on your device
There may soon come a time when lunar explorers set out to study various surface features, equipped with cameras and communication devices, much like geologists on Earth heading into the field with a DSLR and a cell phone with GPS and satellite access. You can find your way around by noting landmarks, but it’s always useful to have backup methods, and here on Earth we have multiple communication networks to fall back on.
LunaNet concept art for a possible communication and navigation device to be used on the moon. Credit: NASA/Reese Patillo
That infrastructure does not yet exist on the Moon, although it should be in place by the time the Artemis missions are in full swing. Still, it won’t be long before lunar geologists are “on the ground” themselves, and they will need all the help they can get to do their jobs. According to a study published by Goddard researcher Erwan Mazarico, an explorer on the lunar surface can see at most about 300 kilometers from any unobstructed spot on the Moon, which makes long-distance surveys over wide areas more difficult. Ideally, a surface explorer could use the “app” Yew is developing in a handheld device. Like a handheld GPS unit, a lunar pathfinding device would help astronauts in regions without good lines of sight, with onboard terrain datasets, including elevation data, built into its software.
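As a rough back-of-the-envelope sketch (separate from Mazarico’s study), the geometric horizon distance on an idealized smooth Moon shows why sight lines are so limited: an observer at eye level sees only a few kilometers, and sight lines of hundreds of kilometers require high vantage points and tall distant terrain. The heights below are purely illustrative.

```python
import math

LUNAR_RADIUS_M = 1_737_400  # mean lunar radius in meters

def horizon_distance_m(height_m: float) -> float:
    """Geometric distance to the horizon for a viewpoint `height_m` above
    a smooth sphere of lunar radius: d = sqrt(2*R*h + h^2)."""
    return math.sqrt(2.0 * LUNAR_RADIUS_M * height_m + height_m ** 2)

for h in (2.0, 100.0, 2000.0):  # eye level, a crater rim, a high massif (meters)
    print(f"{h:6.0f} m -> horizon at {horizon_distance_m(h) / 1000:6.1f} km")
```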
Yew’s geolocation system has likely applications beyond the Moon. Even on Earth, location technology like this could help researchers in areas where GPS signals are obstructed or subject to interference. Matching AI-interpreted visual data against known surface models could provide a new generation of navigation tools not only for Earth and the Moon, but for Mars as well.
For more information
NASA is developing AI to navigate by landmarks – on the Moon
Lunar Reconnaissance Orbiter
LunaNet: Empowering Artemis with communications and navigation interoperability