How AR And VR Could Help Get Humans To Mars

Lockheed Martin has rolled out its MAIA “digital ecosystem,” which combines machine learning with AR/VR to provide more self-reliance to manned space missions.
Fast Company, April 16, 2018

Having your vehicle break down when you’re far from home can be inconvenient and at times alarming, but not usually an insurmountable problem. Having it break down when you’re more than 100 million miles from home can be disastrous.

To help space travelers recover from the kinds of mishaps that can occur in the reaches of deep space—mishaps that can end missions and potentially end lives—Lockheed Martin is combining machine learning and artificial intelligence with augmented and virtual reality interfaces to give manned space exploration missions a bit more self-reliance at distances where a reply from mission control can take more than 40 minutes.

Known as MAIA—the Model-based Artificial Intelligent Assistant—this “digital ecosystem,” as Lockheed calls it, is getting its first rollout today as part of Lockheed’s support for NASA’s NextSTEP program, which partners with private-sector companies to develop the technologies that will eventually take men and women to Mars.

The system combines three elements: a full digital representation of a spacecraft or surface installation, updated in real time; machine learning algorithms that can look ahead to anticipate what might occur under various conditions; and interfaces, including augmented and virtual reality gear, that can walk crew members through repairs and maintenance or let them explore parts of a spacecraft without having to actually visit them.
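
To make that concrete, here is a minimal sketch, with invented names, of how such a digital-twin loop might be wired together: live telemetry mirrors into a model of the craft, and a learned predictor decides what to surface to the crew through the AR layer. Nothing below is Lockheed’s actual interface; it is only an illustration of the pattern the company describes.

```python
# Hypothetical sketch of a digital-twin loop like the one described for
# MAIA: live telemetry updates a mirror of the craft, a learned predictor
# looks ahead, and anything off-nominal is handed to the AR layer.
# All names here are illustrative, not Lockheed's actual API.
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    state: dict = field(default_factory=dict)  # latest reading per sensor

    def ingest(self, sensor_id: str, value: float) -> None:
        """Mirror a live sensor reading into the twin."""
        self.state[sensor_id] = value

    def look_ahead(self, predictor) -> list[str]:
        """Ask a learned model which readings deserve crew attention."""
        return [s for s, v in self.state.items() if predictor(s, v)]

def simple_predictor(sensor_id: str, value: float) -> bool:
    # Stand-in for a trained model: flag readings outside a nominal band.
    nominal = {"o2_tank_psi": (860, 935), "cabin_temp_c": (18, 27)}
    lo, hi = nominal.get(sensor_id, (float("-inf"), float("inf")))
    return not (lo <= value <= hi)

twin = DigitalTwin()
twin.ingest("o2_tank_psi", 996.0)   # off-nominal reading
twin.ingest("cabin_temp_c", 21.5)
print(twin.look_ahead(simple_predictor))  # -> ['o2_tank_psi']
```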

Lockheed hopes to apply the system first to NASA’s Orion spacecraft, which is slated to start taking astronauts to space in the 2020s as part of NASA’s plan to send humans to Mars in the early 2030s. The idea is to bring a crucial measure of self-reliance to crews operating in deep space, where assistance from Earth may not be immediately available, says Bill Pratt, program manager for Lockheed’s NextSTEP work with NASA.

AN ARMY OF GENIUSES
“In low Earth orbit there’s essentially no communications delay, and if you have an emergency you can push the button and come home in a matter of hours,” Pratt says. “At Mars, there’s a 40-minute round-trip delay in communications. So if a problem pops up at or near Mars and it’s a critical event, we may have a lot of smart people on the ground, an army of geniuses, but they may not be able to jump in. So the crew has to become a lot more self-sufficient.”
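
That figure is simply light-travel time, and easy to sanity-check with rough distances: the round trip runs from about six minutes at Mars’s closest approach to roughly 45 at its farthest, which makes 40 minutes a fair near-worst-case number. A quick back-of-the-envelope check (the distances below are approximate):

```python
# Round-trip light-travel delay from approximate distances.
C_KM_S = 299_792        # speed of light, km/s
MOON_KM = 384_400       # average Earth-Moon distance
MARS_MIN_KM = 54.6e6    # Mars at closest approach
MARS_MAX_KM = 401e6     # Mars at farthest

def round_trip_minutes(distance_km: float) -> float:
    return 2 * distance_km / C_KM_S / 60

print(f"Moon:        {round_trip_minutes(MOON_KM) * 60:.1f} s")   # ~2.6 s
print(f"Mars (near): {round_trip_minutes(MARS_MIN_KM):.1f} min")  # ~6.1 min
print(f"Mars (far):  {round_trip_minutes(MARS_MAX_KM):.1f} min")  # ~44.6 min
```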

The canonical example is the Apollo 13 mission of 1970, which experienced a “critical event” two and a half days after launch when an oxygen tank exploded, causing a host of problems and jeopardizing the entire mission, including the three crew members’ lives. To solve one problem, NASA technicians on the ground devised a fix known as “the mailbox” using duct tape and an air hose from a space suit. But they were able to tell the crew how to implement it only because the round-trip communications delay between Earth and the Moon is less than three seconds. A 40-minute delay would in many cases render such assistance useless.

With MAIA, crew members on their way to Mars, or living at the Lunar Orbital Platform-Gateway NASA envisions in orbit around the moon, would be able to devise solutions to such problems using a “digital twin” provided by the system. That digital doppelganger would incorporate machine learning, through an analytics engine known as System Invariant Analysis Technology (SIAT), developed by Japan’s NEC Corp. Lockheed is using SIAT to crunch sensor data from satellites to help the system predict problems that might arise, and determine how a spacecraft would react to changes—changes like an oxygen tank exploding, say, or a hard-pressed astronaut duct-taping a cube-shaped lithium hydroxide canister to a cylindrical socket via a repurposed air hose.
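
NEC has described SIAT as learning stable relationships, or invariants, among sensor streams during normal operation, then flagging the moment one of those relationships breaks. Under that reading, a minimal sketch of the idea looks like the following; it is an illustration only, not NEC’s implementation, and the thresholds and toy telemetry are made up.

```python
# Invariant-style anomaly detection in the spirit of SIAT: fit pairwise
# linear relations between sensor streams on normal data, then flag any
# sample on which a learned invariant breaks. Illustration only.
import numpy as np

def fit_invariants(normal: np.ndarray, tol: float = 0.05) -> dict:
    """Learn y ~= a*x + b for each sensor pair that holds on normal data."""
    invariants = {}
    n = normal.shape[1]
    for i in range(n):
        for j in range(i + 1, n):
            a, b = np.polyfit(normal[:, i], normal[:, j], 1)
            resid = normal[:, j] - (a * normal[:, i] + b)
            if np.abs(resid).max() < tol:      # relation held throughout
                invariants[(i, j)] = (a, b)
    return invariants

def broken(sample: np.ndarray, invariants: dict, tol: float = 0.05) -> list:
    """Return the sensor pairs whose learned relation fails on this sample."""
    return [(i, j) for (i, j), (a, b) in invariants.items()
            if abs(sample[j] - (a * sample[i] + b)) >= tol]

# Toy telemetry: sensor 1 normally tracks 2x sensor 0.
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 200)
normal = np.column_stack([x, 2 * x + rng.normal(0, 0.01, 200)])
inv = fit_invariants(normal)
print(broken(np.array([0.5, 1.01]), inv))  # relation holds  -> []
print(broken(np.array([0.5, 1.60]), inv))  # relation broken -> [(0, 1)]
```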

“MAIA will allow a crew member to slip on an AR visor and see real-time data overlaid on top of physical space, using sensor data such as thermal, gas mixtures, things like that,” says Tony Antonelli, a former space shuttle pilot who now directs the advanced program group for Lockheed’s commercial-civil space business. While NASA hasn’t yet incorporated MAIA into its official programs, Lockheed is prime contractor on the Orion craft, and similar AR technology is already in use on the International Space Station, Antonelli says. “They’re not widely used on ISS today, because ISS came up in a different time, before those tools were really available. But those tools are moving so quickly now, it’s almost like you can’t imagine a world in deep space without those tools.”

MAKING MAIA SMARTER
“With the Gateway, we have the opportunity to build those things in from the very beginning,” Antonelli says. Rather than just scan and analyze the finished product, MAIA will start learning about the craft it is to replicate as early as the design phase.

“Even before we first cut metal for the spacecraft, we’re already building the digital assistant. It gets built not just as you launch the spacecraft, but when you first start designing the spacecraft,” Pratt says. “We’re building it now as we’re studying the Gateway. We’re building the first parts of MAIA through simulations and digital models and virtual representations. Every time we build a simulation of a subsystem, that becomes a part of MAIA. As we move through not just the design but the fabrication, the assembly, and the testing of the spacecraft, all the knowledge we gain from those activities only makes MAIA smarter.”
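
The accretion Pratt describes can be pictured as a registry that subsystem models join, and are swapped out of, as fidelity improves from design through fabrication and test. A toy sketch of that pattern, with hypothetical names and made-up models:

```python
# Toy sketch of an "accreting" digital assistant: each subsystem
# simulation built along the way registers as a component model, and a
# later, higher-fidelity model simply replaces the earlier one.
# Names and models are illustrative only.
class DigitalAssistant:
    def __init__(self) -> None:
        self.models = {}  # subsystem name -> simulation callable

    def register(self, subsystem: str, simulation) -> None:
        """Fold a new or improved subsystem simulation into the assistant."""
        self.models[subsystem] = simulation

    def simulate(self, subsystem: str, **conditions) -> float:
        """Run the latest model of one subsystem under given conditions."""
        return self.models[subsystem](**conditions)

maia = DigitalAssistant()
# A design-phase thermal model joins first...
maia.register("radiator", lambda heat_kw: 21.0 + 3.2 * heat_kw)
# ...and is later replaced by a test-calibrated, higher-fidelity one.
maia.register("radiator", lambda heat_kw: 20.4 + 3.4 * heat_kw ** 1.02)
print(maia.simulate("radiator", heat_kw=2.5))
```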

Lockheed is already using augmented reality in the design and construction of the Orion spacecraft. Engineers working on Orion’s Forward Bay Cover (which protects the craft’s parachutes) use AR work instructions to streamline the assembly of the component, saving time (and money) and leading to lower error rates.

Much of the AR work that will go into MAIA is being done in partnership with Scope AR (which creates AR work instructions like those used on the Forward Bay Cover), and being delivered via Microsoft HoloLens. The machine learning systems, including those provided by NEC, will also help predict “on-orbit issues” before they arise. “We never flew a shuttle mission without a whole list of technical things, electrical shorts, computer single-event upsets,” Antonelli says. “In the future, you can start predicting that kind of stuff.”

So will crew members merely be along for the ride? “You can’t do deep-space human exploration without humans, so there will be that human problem-solver doing the sanity check,” Antonelli says. “You’ll be there, covering things that weren’t anticipated anywhere in the design. But that’s the beauty of having the digital twin there: you can work through that together with all the data and the machine learning and the human, right there in the spacecraft.” Even if it takes 40 minutes to get an answer when you phone home.