In my previous blog, I discussed why data fusion is important for modern geospatial intelligence. Now, let's explore how this actually works in practice for frequent monitoring and object custody through a multi-sensor virtual constellation.
One of the most practical applications is what we call tip and cue for frequent monitoring and object custody. The workflow is straightforward. You take an image, whether SAR or electro-optical, of an area of interest, which could span hundreds or thousands of square kilometers. You identify the objects of interest, or in the case of object custody, one specific object of interest. You then tell the other sensors: here is the predicted latitude and longitude of the object or objects of interest, and this is the time frame in which we want the next image collected. You iterate this process over the duration required for object custody.
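As a minimal sketch, the iterative cycle described above might look like the Python below. Everything here is illustrative: the toy linear motion model, the `Detection` class, and the tasking records stand in for real detection pipelines and provider-specific tasking APIs.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    lat: float
    lon: float
    speed_deg_per_hr: float  # toy linear motion model, for illustration only

def predict_position(det: Detection, hours_ahead: float):
    """Dead-reckon the object's future position (toy linear model)."""
    return det.lat + det.speed_deg_per_hr * hours_ahead, det.lon

def tip_and_cue(initial_detections, cycles=3, revisit_hours=1.5):
    """Run a few tip-and-cue cycles, returning the tasking requests issued."""
    tasks = []
    detections = initial_detections
    for cycle in range(cycles):
        for det in detections:
            # Tip: predicted coordinates. Cue: a tasking request for the next pass.
            lat, lon = predict_position(det, revisit_hours)
            tasks.append({"cycle": cycle, "lat": lat, "lon": lon})
        # In reality, the next cycle's detections come from the newly collected
        # image; here we simply carry the predicted positions forward.
        detections = [Detection(t["lat"], t["lon"], d.speed_deg_per_hr)
                      for t, d in zip(tasks[-len(detections):], detections)]
    return tasks

tasks = tip_and_cue([Detection(lat=35.0, lon=139.8, speed_deg_per_hr=0.01)])
```

In an operational system, the prediction step would use a real motion model and the tasking step would call each provider's API, but the tip-predict-cue-collect loop is the same.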
Let me give you a specific example of port monitoring. Say a SAR satellite detects unusual activity at a port at 1 AM: several ships that weren't there the day before have appeared. SAR object detection provides the precise coordinates of the newly arrived ships, and correlation with AIS data reveals anomalies in the GPS positions transmitted by a couple of them. When an optical satellite passes over in daylight, it can then focus on just the coordinates of the ships with anomalous AIS transmissions and capture detailed imagery of them.
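The SAR/AIS correlation step in this example can be sketched as a simple position cross-check: flag any ship whose AIS-reported position is far from where SAR actually detected it, or that transmits no AIS at all. The function names, dict shapes, and 1 km threshold below are all assumptions for illustration.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def ais_anomalies(sar_detections, ais_reports, threshold_km=1.0):
    """Flag ships whose AIS report disagrees with the SAR detection.

    Both inputs are dicts keyed by a ship identifier (e.g. MMSI) with
    (lat, lon) tuples; keys, values, and threshold are illustrative.
    """
    flagged = []
    for ship_id, (slat, slon) in sar_detections.items():
        ais = ais_reports.get(ship_id)
        if ais is None:
            flagged.append(ship_id)  # dark ship: detected by SAR, no AIS at all
        elif haversine_km(slat, slon, *ais) > threshold_km:
            flagged.append(ship_id)  # AIS position doesn't match the SAR detection
    return flagged

sar = {"A": (1.0, 103.8), "B": (1.0, 103.9), "C": (1.0, 104.0)}
ais = {"A": (1.0, 103.8), "B": (1.05, 103.9)}  # B ~5.6 km off, C silent
suspects = ais_anomalies(sar, ais)
```

The flagged identifiers are exactly the coordinates you would pass to the optical satellite as the cue.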
The tip-and-cue approach makes a multi-sensor virtual constellation more efficient by directing sensors to where and when they are most valuable. The tip from one sensor provides the latitude and longitude and cues other sensors to collect new, pertinent information, and this cycle iterates for the mission duration, which can be hours or days. I've also seen tip and cue leverage a combination of sensors, including EO, SAR, RF, and AIS; information from multiple domains (space, air, land, and sea); and other sources of intelligence, such as OSINT (Open Source Intelligence).
Addressing Technical Challenges with Tip and Cue of Virtual Constellation
There are definitely challenges in managing a virtual constellation. Different commercial satellite companies use different mechanisms for tasking their satellites, although several have now built tasking APIs that streamline the operation. One key issue that still needs to be addressed is a machine-to-machine interface for tipping and cueing between multiple commercial satellites without a human in the loop: when tracking fast-moving objects or time-sensitive threats, even minutes of human decision-making delay can mean losing track of the target entirely. For time-sensitive missions such as object custody, the industry needs to come together to solve this problem.
At Synspective, we are designing our constellation for a low-latency response to support our customer missions. We have developed a tasking API that allows for virtual constellation orchestration by communicating the availability of our satellites.
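To make the machine-to-machine idea concrete, a cue passed between systems could be as simple as a small structured payload: target coordinates, a collection window, and a priority. The field names and schema below are hypothetical, not Synspective's actual tasking API.

```python
import json
from datetime import datetime, timedelta, timezone

def build_cue_request(lat, lon, start, end, sensor="SAR"):
    """Assemble a hypothetical machine-to-machine cue payload.

    Field names are illustrative; a real tasking API would define
    its own schema, authentication, and feasibility checks.
    """
    return {
        "target": {"lat": lat, "lon": lon},
        "window": {"start": start.isoformat(), "end": end.isoformat()},
        "sensor": sensor,
        "priority": "time_sensitive",
    }

# A tip produced at 01:00 UTC cues a collect within the next six hours.
now = datetime(2025, 1, 1, 1, 0, tzinfo=timezone.utc)
payload = json.dumps(build_cue_request(35.45, 139.65, now, now + timedelta(hours=6)))
```

Because the payload is plain structured data, any provider that publishes availability through a tasking API can consume it without a human in the loop.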
Leveraging SAR for Frequent Monitoring and Object Custody
At Synspective, we believe that while humans can interpret SAR imagery, its true power lies in automated analysis: AI/ML and computer vision algorithms deriving actionable insights from images that can be collected 24/7, in all weather conditions.
At Synspective, we have been developing analytical products since the company's inception, using not only our StriX constellation but also data from other public and private satellites. We leverage multiple SAR frequencies (X, C, and L) as well as electro-optical imagery in our analytics. Today, we offer six analytical products, including an object detection service that can rapidly identify objects of interest for frequent monitoring and object custody use cases. Our disaster assessment and flood damage assessment products are designed to give first responders actionable insights at low latency. We are also investing in the development of techniques such as MTI (Moving Target Indicator), both on land and at sea, to derive the speed and heading of objects, which can be used to predict an object's future location for tracking purposes.
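The prediction step that MTI enables can be sketched with simple dead reckoning: given a speed and heading estimate, project the position forward to the next sensor pass. This flat-earth approximation is my own simplification, reasonable for short horizons; a real tracker would use a proper geodesic and an uncertainty model such as a Kalman filter.

```python
import math

EARTH_RADIUS_KM = 6371.0

def predict_future_position(lat, lon, speed_knots, heading_deg, hours):
    """Dead-reckon a future position from an MTI-style velocity estimate.

    Flat-earth approximation: fine for short prediction horizons,
    not for long ones or near the poles.
    """
    distance_km = speed_knots * 1.852 * hours  # 1 knot = 1.852 km/h
    heading = math.radians(heading_deg)        # 0 deg = due north
    dlat = (distance_km * math.cos(heading)) / EARTH_RADIUS_KM
    dlon = (distance_km * math.sin(heading)) / (EARTH_RADIUS_KM * math.cos(math.radians(lat)))
    return lat + math.degrees(dlat), lon + math.degrees(dlon)

# A vessel heading due north at 10 knots, predicted one hour ahead.
lat2, lon2 = predict_future_position(0.0, 0.0, 10.0, 0.0, 1.0)
```

The predicted coordinates are what gets handed back into the tip-and-cue cycle as the next collection target.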
Where This All Leads
I see the industry moving toward information as a service. Customers want insights, not raw imagery that they have to interpret themselves. This naturally favors companies that can effectively integrate multiple data sources.
I expect we’ll see more partnerships between companies operating different sensor types. Rather than every company trying to build every capability, specialization and integration will prove more effective. A SAR company partnering with electro-optical providers, RF specialists, and analytics companies can deliver more comprehensive solutions.
As constellation sizes increase and data volumes expand, the capability to extract actionable intelligence from multiple sensor types will become a critical differentiator. Success will depend on transforming diverse data streams into meaningful insights rather than optimizing individual sensor performance.