The most interesting customer conversations I am having these days no longer start with requests for specific sensor data. Instead, customers describe real-world missions that no single sensor can solve: a defense agency that needs 24/7 port monitoring, an insurance company that must assess damage after a disaster, or an NGO tracking vessels whose transponders have gone dark.
After several years working on these complex customer missions, I’ve become convinced that data fusion (piecing together information from multiple sources) is essential. Sometimes this means combining sensors of the same type (such as multiple SAR satellites), and often it requires integrating different modalities, including Synthetic Aperture Radar (SAR), Radio Frequency (RF), Electro-Optical (EO), and even open-source intelligence.
The real question our customers are asking is no longer “Which sensor should we use?” It’s “How do I get actionable insights that combine multiple datasets in the right way for the mission at hand?” This shift is reshaping how we deliver intelligence to customers across the markets we serve.
When Individual Sensors Meet Their Match
Let me give you a concrete example. A customer recently asked us to provide imagery of a site multiple times a day. By 2030, our 30-satellite constellation will be able to revisit any location on Earth within an hour; today we are close, but not quite there. The traditional response would be “build more satellites.” There is a smarter way to think about this, though: create what the industry calls a virtual constellation.
Here’s the reality: even when a satellite passes over a region, it may not have the capacity to image it, because that satellite time is already committed to a different customer. That’s where a virtual constellation comes into play. A virtual constellation pools multiple satellites and sensors, of varying modalities and from multiple sources, so you can determine which sensor has the capacity to take an image at a given moment and deliver timely information to the customer.
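To make that concrete, here is a minimal sketch of how a tasking layer might pick a collector from a virtual constellation. The `Asset` fields and the greedy earliest-available-pass rule are illustrative assumptions, not a description of any production scheduler:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Asset:
    name: str
    modality: str            # "SAR", "EO", "RF", ...
    next_pass_hours: float   # hours until this asset next passes the target
    has_capacity: bool       # False if the pass is already sold to another customer

def next_collect(assets: list) -> Optional[Asset]:
    # Consider only assets whose upcoming pass is not committed elsewhere.
    available = [a for a in assets if a.has_capacity]
    # Greedy rule: take the earliest available pass, regardless of
    # modality or owner -- that is the point of a virtual constellation.
    return min(available, key=lambda a: a.next_pass_hours, default=None)

# Example: the soonest SAR pass is sold out, so the virtual constellation
# falls back to a partner's EO satellite that arrives later but is free.
fleet = [
    Asset("sar-1", "SAR", next_pass_hours=0.5, has_capacity=False),
    Asset("eo-partner-3", "EO", next_pass_hours=1.2, has_capacity=True),
    Asset("sar-2", "SAR", next_pass_hours=3.0, has_capacity=True),
]
print(next_collect(fleet))  # -> eo-partner-3
```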
Each modality and sensor brings different strengths to this concept. EO sensors provide intuitive images that humans can interpret directly, as well as information about specific object and feature signatures derived from reflected sunlight. SAR, on the other hand, provides 24/7, all-weather imaging, a major differentiator for certain missions. RF sensors monitor radio emissions that can yield signals intelligence. The Automatic Identification System (AIS) reports the positions that ships broadcast about themselves. Curating these disparate datasets ensures consistent quality across geography and time, letting you derive insights that draw on all of them.
Data Curation
Data fusion isn’t just about piling multiple datasets together. One of its key aspects is data curation. For satellite imagery, a key factor is location accuracy: each sensor has its own spatial errors, so images of the same geographic location taken by multiple sensors over time will not align with one another. One part of data curation is therefore co-registering all the datasets so they match geographically.
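As an illustration, a common first step in co-registration is estimating the pixel offset between two images of the same site. The phase-correlation approach below is one standard way to do it; this is a simplified sketch that assumes integer shifts and already-resampled, single-band arrays with no geolocation metadata:

```python
import numpy as np

def coregister_shift(reference, target):
    """Estimate the (row, col) pixel shift to apply to `target` so it
    aligns with `reference`, using phase correlation."""
    f_ref = np.fft.fft2(reference)
    f_tgt = np.fft.fft2(target)
    # Normalized cross-power spectrum; its inverse FFT peaks at the shift.
    cross_power = f_ref * np.conj(f_tgt)
    cross_power /= np.abs(cross_power) + 1e-12
    correlation = np.abs(np.fft.ifft2(cross_power))
    peak = np.unravel_index(np.argmax(correlation), correlation.shape)
    # Peaks past the halfway point wrap around to negative offsets.
    return tuple(p if p <= s // 2 else p - s
                 for p, s in zip(peak, correlation.shape))

def apply_shift(image, shift):
    """Apply an integer pixel shift. np.roll wraps at the edges, which is
    acceptable for a sketch; production code would crop or pad instead."""
    return np.roll(image, shift, axis=(0, 1))
```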
Then there’s the consistency of the imagery itself, across geographies and over time. If SAR satellites image a site at different angles (known as off-nadir angles), shadows fall in different places. To perform something like interferometry, you need consistent geometry. So when you think about data fusion, you also need to consider the geometry of the various datasets: can they be fused as collected, or can you plan the collection so they can be combined later? Other factors, such as image quality (e.g., sensor noise) and time of capture, also have to be considered during curation.
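For example, a curation step might screen candidate SAR scene pairs for geometric compatibility before attempting interferometry. The scene fields and thresholds below are placeholders; real systems derive baselines from orbit state vectors rather than angle differences alone:

```python
from dataclasses import dataclass
from itertools import combinations

@dataclass
class SarScene:
    scene_id: str
    off_nadir_deg: float   # imaging angle for this acquisition
    acquired_day: float    # days since some reference epoch
    orbit_direction: str   # "ascending" or "descending"

def interferometric_pairs(scenes, max_angle_diff_deg=0.5, max_days=24):
    """Return scene pairs whose geometry is consistent enough to fuse."""
    pairs = []
    for a, b in combinations(scenes, 2):
        if a.orbit_direction != b.orbit_direction:
            continue  # shadows and layover differ between pass directions
        if abs(a.off_nadir_deg - b.off_nadir_deg) > max_angle_diff_deg:
            continue  # mismatched geometry breaks interferometry
        if abs(a.acquired_day - b.acquired_day) > max_days:
            continue  # long gaps reduce coherence
        pairs.append((a.scene_id, b.scene_id))
    return pairs
```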
Once curation is done, the data is analytics-ready (Analytics Ready Data, or ARD) and can feed analytical workflows, whether AI/ML models or algorithms such as Moving Target Indication (MTI). Curation minimizes noise and enables these algorithms to generate consistent information and insights across multiple modalities.
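Putting the pieces together, a toy curation pass might look like the sketch below. It reuses the co-registration helpers from the earlier example and adds a crude amplitude normalization as a stand-in for radiometric calibration; real ARD processing involves far more:

```python
import numpy as np

def curate_to_ard(reference, scenes):
    """Align each scene to the reference grid and normalize amplitudes,
    yielding a consistent stack that downstream AI/ML or MTI-style
    algorithms can consume. Assumes coregister_shift/apply_shift above."""
    stack = [reference]
    for scene in scenes:
        aligned = apply_shift(scene, coregister_shift(reference, scene))
        # Zero-mean, unit-variance normalization damps per-sensor
        # radiometric differences across the stack.
        stack.append((aligned - aligned.mean()) / (aligned.std() + 1e-12))
    return np.stack(stack)
```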
Applications
Four national security customer missions in particular illustrate the power of data fusion. First is foundational Geospatial Intelligence (GEOINT): creating maps and foundational insights about the globe. Second is broad-area search, where customers look for a particular object across a large geographic area, the size of a country or even a continent.
The third mission type is Early Indicators and Warnings (EI&W): customers watch a given site or location to understand patterns of life and look for anomalies that can trigger early warnings. Finally, there’s custody of objects: customers want to maintain custody of an object, such as a ship, plane, or truck, or perform persistent surveillance.
Now let’s walk through an example of data fusion: maritime monitoring. With SAR we can detect various types of ships and characterize them by class, length, heading, and velocity. But when you overlay AIS data on top of those detections, you can identify what we call dark vessels: ships that are either spoofing their location or have switched off their AIS transponders. That’s intelligence you simply can’t get from any single sensor.
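In code, the core of that fusion step can be as simple as a spatio-temporal join: any SAR ship detection with no AIS broadcast nearby in space and time becomes a candidate dark vessel. The record layout and the 1 km / 10 minute thresholds below are illustrative assumptions:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def find_dark_vessels(sar_detections, ais_reports, max_km=1.0, max_minutes=10):
    """Flag SAR detections with no AIS report close in space and time.
    Each record is a dict with "lat", "lon", and "time_min" keys."""
    dark = []
    for det in sar_detections:
        matched = any(
            haversine_km(det["lat"], det["lon"], a["lat"], a["lon"]) <= max_km
            and abs(det["time_min"] - a["time_min"]) <= max_minutes
            for a in ais_reports
        )
        if not matched:
            dark.append(det)  # ship seen by radar but silent on AIS
    return dark
```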
Another example of data fusion is disaster damage assessment. If you’re looking for buildings that have collapsed or been damaged after a typhoon or cyclone, SAR can see through clouds and capture an image immediately after the event, and advanced analytics can pinpoint where buildings have been damaged. EO images captured after the disaster give first responders visual context on the extent of the destruction. Together, they deliver a comprehensive damage assessment far faster and more efficiently than traditional methods that send field personnel out to assess the damage.
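A minimal sketch of the SAR side of that workflow is log-ratio change detection between pre- and post-event amplitude images, a common baseline technique. The 3 dB threshold is a placeholder; operational systems add speckle filtering and intersect the change mask with building footprints:

```python
import numpy as np

def damage_mask(pre_amplitude, post_amplitude, threshold_db=3.0):
    """Flag pixels whose radar backscatter changed sharply between the
    pre- and post-event SAR acquisitions (log-ratio change detection).
    Assumes the two images are co-registered, e.g. via the curation
    steps sketched earlier."""
    eps = 1e-6  # avoid division by zero in empty pixels
    ratio_db = 10.0 * np.log10((post_amplitude + eps) / (pre_amplitude + eps))
    # Large positive or negative changes both indicate disturbance,
    # such as collapsed structures or new debris.
    return np.abs(ratio_db) >= threshold_db
```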