Astronaut Andreas Mogensen shows off a pair of Astrobee robotic free flyers with the Obruta software. Image credit: NASA JSC.

Obruta, along with its partners, has successfully concluded its autonomous spacecraft mission aboard the International Space Station.

This was the culmination of “17 months of rigorous testing,” the announcement said, focused on giving spacecraft the ability to dock with each other autonomously using conventional monocular computer vision augmented by AI.

Obruta and RPOD

Obruta is a company that made a dramatic pivot early in its life, moving from orbital debris removal to autonomous spacecraft guidance during its successful run in the University of Toronto’s Creative Destruction Lab Space Stream.

Their current focus is providing spacecraft with autonomous “RPOD” capabilities: Rendezvous, Proximity Operations, and Docking. This will let spacecraft perform in-orbit refuelling, servicing, orbit corrections, and other interventions on other spacecraft without direction from human controllers. The goal is to provide this as a simple, “turnkey” solution through RPOD kits that can be integrated into customers’ spacecraft.

In previous SpaceQ coverage, Obruta co-founder and CEO, Kevin Stadnyk, said that these capabilities could become “critical aspects of the in-space economy.” The Canadian Space Agency appeared to agree, providing Obruta with a set of STDP awards to develop both computer vision and guidance/navigation capabilities suitable for spacecraft docking. 

AstroSee and the Astrobees

A National Laboratory Research Announcement (NLRA) award put them in partnership with the American company GeoJump for testing these capabilities. 

The mission was called “AstroSee”, and had GeoJump and Obruta working with NASA to test Obruta’s technology using the ISS’s “Astrobee” robots. The Astrobees are three small cube-shaped autonomous vehicles owned by NASA’s Ames Research Center. They’re used aboard the ISS for routine tasks like moving cargo and taking inventory inside the Kibo (Japanese Experiment Module) laboratory.

NASA said that these tests would be particularly useful, as vision-based navigation would “leverage lightweight, inexpensive, and less power-intensive optical sensors compared to larger, higher cost, and more power intensive sensors such as Lidar and Radar”. Up to now, however, reliable orbital navigation using monocular computer vision had been “an unsolved problem”. 

The AstroSee mission was a potential way for NASA to get that problem solved.

For their part, Obruta saw the Astrobees as a perfect platform for testing “the full suite of the guidance, navigation and control software”, according to Stadnyk, as well as for performing computer vision tests. So, with the help of GeoJump and the ISS National Lab, Obruta used the Astrobees to perform the tests.

These tests were scheduled to take place during 2024 over three sessions, but an extra test was added. Stadnyk told SpaceQ in October that this final test was intended to cover everything they’d done so far, and would involve a “full end-to-end mission.” The mission had one of the Astrobees “navigating visually within its environment” using “real time, computer vision-based relative navigation” to guide itself to the other Astrobee and perform a simulated docking.

Obruta said in their announcement that this final test was “the culmination of 17 months of rigorous testing.”

Obruta’s accomplishments with AstroSee

According to the announcement, they had several important accomplishments:

  • They “independently validated our AI vision system,” including proving its suitability for edge computing, as well as “its effectiveness for autonomous operations even in the cluttered environment of the ISS.” They also said that it demonstrates “our proprietary simulation-to-reality transfer success,” applying terrestrially-trained models to the orbital environment.
  • They “validated our fuel- and time-optimal guidance system,” which exploited the oddities of orbital mechanics to deliver “35% greater fuel savings during experiments.”
  • They “completely rewrote the Astrobee controller and replaced it with our own,” Obruta said, and “demonstrated its effectiveness even against the many perturbing fans inside the ISS.”
  • Finally, Obruta said that their last flight brought all the systems together “for the grand finale” (the end-to-end docking test mentioned earlier) and successfully used all of them to “perform the first-ever fully autonomous in-space docking using a monocular vision system,” apparently solving NASA’s navigation dilemma.

To that list, readers can add one more accomplishment that Stadnyk mentioned to SpaceQ last October: upgrading the Astrobees so they can properly employ AI-based computer vision. That wasn’t technically possible before, but it is now, and other Astrobee users will be able to use these upgraded capabilities for their own tests.

Obruta said that after this successful round of testing, “we will be delivering the technology to our customers” and “enabling missions spanning in-space refueling, logistics, satellite servicing, space debris removal, and space situational awareness.” They claimed that the RPOD technology will become “a game-changer for future space missions.”

Craig started writing for SpaceQ in 2017 as their space culture reporter, shifting to Canadian business and startup reporting in 2019. He is a member of the Canadian Association of Journalists, and has a Master's Degree in International Security from the Norman Paterson School of International Affairs. He lives in Toronto.
