Metaspectral is getting ready to roll out a new tool this week: an AI-powered interface called “Clarity AI” that aims to make it easier for users to discover relevant materials, objects and other findings within hyperspectral imagery.
The Vancouver-based company gave more details in an announcement on September 26th, in which co-founder and CEO Francis Doumet provided a sample case of Clarity AI identifying likely copper deposits in a particular hyperspectral image.
Hyperspectral and Metaspectral
Hyperspectral imaging is somewhat different from the kind of imaging that many readers might be familiar with. Instead of the red, green, and blue bands of visible light found in a typical pixel of a color image, each pixel in a hyperspectral image contains a complete spectrum that includes hundreds of different wavelength bands. Each image is therefore more like a data “cube”: the x dimension, the y dimension, and a third “dimension” containing those hundreds of wavelength measurements per pixel.
This creates possibilities across many industries related to earth observation, including resource extraction, agriculture, forestry, defence, climate modelling, and wildfire control, among many others. But it also creates issues, especially when operating in orbit.
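To picture that structure, it can help to think of the cube as a three-dimensional array with a full spectrum stored at every pixel location. The sketch below is a hypothetical illustration only; the band count and array shape are assumptions for the example, not Metaspectral’s actual data format.

```python
import numpy as np

# Hypothetical hyperspectral "cube": 512 x 512 pixels, each with 224 wavelength bands.
# An ordinary RGB image of the same size would carry only 3 values per pixel.
rows, cols, bands = 512, 512, 224
cube = np.zeros((rows, cols, bands), dtype=np.float32)

# A single (row, col) location holds a complete spectrum, not one colour value.
pixel_spectrum = cube[100, 200, :]
print(pixel_spectrum.shape)  # (224,)
```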
One issue, for example, is storage and transmission: a single image can be gigabytes in size, which creates serious challenges for both storing and transmitting the imagery. This is part of Metaspectral’s AI work; the company uses machine learning techniques and custom FPGA hardware to shrink these images by 60% to 90%, depending on how much lossiness is acceptable in the imagery.
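Metaspectral hasn’t published the internals of that compression pipeline, so the example below is purely illustrative: it shows one generic way hyperspectral cubes are often shrunk, by keeping only the strongest principal components along the spectral axis. The function names and component count are assumptions for the sketch, and the real ML/FPGA approach may look nothing like this.

```python
import numpy as np

def pca_compress(cube: np.ndarray, n_components: int):
    """Lossy reduction: keep only the top principal components of the spectra."""
    rows, cols, bands = cube.shape
    flat = cube.reshape(-1, bands).astype(np.float64)   # pixels x bands
    mean = flat.mean(axis=0)
    _, _, vt = np.linalg.svd(flat - mean, full_matrices=False)
    basis = vt[:n_components]                            # n_components x bands
    scores = (flat - mean) @ basis.T                     # pixels x n_components
    return scores.reshape(rows, cols, n_components), basis, mean

def pca_decompress(scores: np.ndarray, basis: np.ndarray, mean: np.ndarray):
    """Approximate reconstruction from the compressed representation."""
    rows, cols, k = scores.shape
    flat = scores.reshape(-1, k) @ basis + mean
    return flat.reshape(rows, cols, basis.shape[1])

# Keeping ~22 of 224 bands' worth of information cuts storage by roughly 90%,
# at the cost of some reconstruction error (the "lossiness" trade-off).
```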
Metaspectral has also worked with Armada on AI-based techniques that integrate Metaspectral’s imagery with Armada’s “Galleon” edge-computing datacenters. The goal: relieving data storage and transmission requirements by performing analysis on-site in those Galleon datacenters. At the time, Doumet said that the arrangement would let Metaspectral “push analysis results to the cloud for consumption by the customer.”
Another issue is actually interpreting the data. Since most of the bands aren’t human-visible, and since the “image” is technically more akin to a cube, it’s extraordinarily difficult to find what you’re looking for, or even to know what you’re looking for. This is another area where AI plays a big role: machine learning algorithms and models are routinely used to interpret this information.
That kind of ML-supported analysis forms the basis for Metaspectral’s Fusion software, which can be used for everything from wildfire mapping to precision agriculture to detecting the polymers and materials that make up a particular object.
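The announcement doesn’t describe Fusion’s internals, but a simple, standard example of this kind of spectral interpretation is the “spectral angle” comparison sketched below, which scores how closely each pixel’s spectrum matches a known material signature. It is offered only as a generic illustration of per-pixel spectral matching, not as Metaspectral’s method.

```python
import numpy as np

def spectral_angle_map(cube: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Angle (radians) between each pixel's spectrum and a reference spectrum;
    smaller angles mean the pixel looks more like the reference material."""
    flat = cube.reshape(-1, cube.shape[-1]).astype(np.float64)
    dots = flat @ reference
    norms = np.linalg.norm(flat, axis=1) * np.linalg.norm(reference)
    angles = np.arccos(np.clip(dots / np.maximum(norms, 1e-12), -1.0, 1.0))
    return angles.reshape(cube.shape[:2])

# Example: flag pixels within ~0.1 radians of a lab-measured mineral spectrum.
# matches = spectral_angle_map(cube, mineral_signature) < 0.1
```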
Metaspectral and Clarity AI
The recent Metaspectral announcement takes this idea of AI-based analysis one step further, by incorporating the kind of large language model (LLM) made familiar by tools like ChatGPT. This particular model appears to be a “reasoning” model: the kind that breaks queries down into specific, concrete steps for analysis and improved decision-making.
Doumet explained in the announcement that, historically, “advanced, specialized skills and expertise were required to derive insights of any value from hyperspectral imagery.” Clarity AI would change that, allowing users “with or without technical expertise” to load a hyperspectral image, ask the LLM for insights “in plain language,” and then have Clarity AI “perform deep analysis of the scene in seconds and deliver expert, actionable, evidence-backed insights.”
The announcement said that this was made possible by Metaspectral using “deep learning to rapidly analyze these massive visual datasets and identify hidden signals within the data.” Clarity AI “instantly conducts background research, runs target detection models, analyzes the image, and directly overlays the results,” said Doumet, “reducing the need for manual analysis and expert interpretation.”
On Metaspectral’s website, Doumet provided a more detailed example of Clarity AI at work, using an image captured over Yerington, Nevada and provided by SpecTIR, and asking “where should I look in my image to mine copper.” Clarity AI “confirmed the area’s geology matched the known characteristics of the Yerington district,” Doumet said, and revealed a “classic pattern” of a “phyllic zone” rich in illite surrounded by a “propylitic zone” rich in chlorite. This was identified as a “geological fingerprint of a large scale copper deposit.”
Clarity AI then identified the thermal and sulfide cores of the deposit, Doumet added, finding the points where these cores overlap and identifying them as “the most favorable zones for high-grade copper to have formed” before ranking them by potential quality. Ultimately, the point it identified was “within or adjacent to the known Ann Mason copper deposit, a major copper resource in the district,” meaning that Clarity AI had “independently rediscovered a known deposit.”
Doumet said that this was made possible by “Clarity AI’s ability to act as a research collaborator.” He pointed to a YouTube video of Clarity AI finding the copper site using the Spectral Explorer interface, where (Doumet said) it “proactively assisted in the generation of various band math indices” before “performing analysis by thresholding and overlaying.”
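The specific indices generated in that video aren’t spelled out, but “band math” of this sort generally means arithmetic on individual wavelength bands followed by a threshold to produce an overlay. A minimal, hypothetical example, with placeholder band numbers and cutoff, might look like this:

```python
import numpy as np

def band_ratio(cube: np.ndarray, band_a: int, band_b: int) -> np.ndarray:
    """Simple band-math index: the per-pixel ratio of two wavelength bands."""
    a = cube[:, :, band_a].astype(np.float64)
    b = cube[:, :, band_b].astype(np.float64)
    return a / np.maximum(b, 1e-12)

# Placeholder band numbers and cutoff, purely for illustration.
# index = band_ratio(cube, band_a=120, band_b=80)
# overlay_mask = index > 1.5   # pixels to highlight on top of the scene
```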
Finally, Doumet said that the waitlist was open for potential users to try it themselves, adding that Clarity AI “turns passive data exploration into an active, collaborative dialogue.”
