AI-powered image processing can now run on board satellites in space.
Source: IEEE Spectrum
Planet Labs, based in California, released an image captured by its Pelican-4 multispectral satellite showing an airport in Alice Springs, Australia. More than a dozen aircraft are scattered across the tarmac, each highlighted in a neat green box drawn by an AI model running aboard the satellite.
Planet Labs’ engineers worked for 18 months to achieve reliable autonomous object classification from space. They hope the technology will put Earth observation on steroids, enabling autonomous tasking and real-time delivery of insights to users on Earth.
“We have very good eyes in space looking at everything that’s going on. But then, we collect so much data and have to wait six to 12 hours to get the information out. So, you’re essentially looking at the past,” said Kiruthika Devaraj, vice president of engineering at Planet Labs.
Planet Labs currently operates a constellation of several hundred Dove and SuperDove CubeSats, each only 30 centimeters long. These low-cost space cameras scan the entire surface of Earth multiple times a day at a resolution of around 5 meters. The company is also building up a fleet of 32 larger satellites, called Pelicans, which image the planet’s surface in 30-centimeter detail. The fourth of these, deployed into orbit in 2025, ran the airplane-recognition algorithm.

All of Planet’s satellites combined generate 30 terabytes of data per day, the equivalent of 10,000 hours of high-definition video. The data is beamed to the ground for processing and analysis via dozens of ground stations scattered around the world.
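The two figures quoted above are mutually consistent. A rough check, assuming decimal units (1 TB = 10^12 bytes), shows that 30 terabytes spread over 10,000 hours of video implies a bitrate typical of high-definition streams:

```python
# Sanity-check the article's equivalence: 30 TB/day ~= 10,000 hours of HD video.
TB = 1e12                               # decimal terabyte, in bytes
total_per_day = 30 * TB                 # daily constellation output, bytes
hd_hours = 10_000                       # claimed HD-video equivalent

bytes_per_hour = total_per_day / hd_hours          # 3e9 bytes = 3 GB per hour
bitrate_mbps = bytes_per_hour * 8 / 3600 / 1e6     # convert to megabits/second
print(round(bitrate_mbps, 1))                      # ≈ 6.7 Mbps
```

About 6.7 megabits per second is a plausible encoding rate for 1080p video, so the comparison holds up.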
Transferring the downlinked data to the cloud for processing and subsequent AI analysis takes hours. That delay could mean a new wildfire is noticed only after it has grown too large to contain quickly.
The AI image-recognition algorithms developed by Devaraj and her team analyze a single Pelican image comprising 16,000 pixels in half a second, using onboard GPUs. The results can be in users’ hands within minutes of the image being taken.
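A back-of-envelope comparison of the two pipelines, using the article's figures, shows why onboard inference matters. The 5-minute delivery time below is an assumption standing in for the article's "minutes":

```python
# Latency comparison: ground-based pipeline vs. onboard AI (article's figures).
inference_s = 0.5                 # onboard analysis of one Pelican image
ground_loop_s = 6 * 3600          # best-case downlink-and-cloud pipeline (6 h)
delivery_s = 5 * 60               # assumed "minutes" to reach users on Earth

onboard_total_s = inference_s + delivery_s
speedup = ground_loop_s / onboard_total_s
print(round(speedup))             # ≈ 72x faster, best case
```

Even against the optimistic 6-hour end of the ground pipeline, analyzing imagery in orbit cuts the wait by roughly two orders of magnitude.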
So far, only the Pelican satellites carry AI-capable processors: the Nvidia Jetson Orin GPU modules frequently used in autonomous drones. But Devaraj says Planet plans to augment the SuperDove constellation with a new type of satellite, called the Owl, which will provide daily revisits at a higher resolution of up to 1 meter and will also carry Nvidia Jetson processors capable of running AI detection models.
In the future, the company wants to switch to more-powerful Nvidia Jetson Thor processors and eventually run large language models (LLMs) in space.
To run AI image analysis on board in space, the algorithms need to handle unprocessed raw data that hasn’t been smoothed and corrected, unlike the data crunched by AI algorithms on Earth.
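One generic way to feed raw sensor counts to a detector is to normalize each spectral band on the fly before inference. The sketch below is a minimal illustration of that idea, not Planet's actual onboard pipeline; the black level and percentile values are placeholder assumptions:

```python
import numpy as np

def normalize_raw_frame(raw, black_level=64.0, clip_percentile=99.5):
    """Hypothetical per-band normalization of a raw multispectral frame.

    `raw` has shape (bands, height, width) holding unscaled sensor counts.
    This is a generic sketch: subtract a dark offset, then scale each band
    independently so a detector trained on uncorrected imagery sees inputs
    in a consistent [0, 1] range.
    """
    frame = raw.astype(np.float32) - black_level   # remove sensor dark offset
    frame = np.clip(frame, 0.0, None)              # counts can't be negative
    for b in range(frame.shape[0]):
        # Robust per-band scale: ignore the brightest outlier pixels.
        hi = np.percentile(frame[b], clip_percentile)
        frame[b] = np.clip(frame[b] / max(hi, 1e-6), 0.0, 1.0)
    return frame
```

The per-band, percentile-based scaling makes the step cheap enough to run on an embedded GPU and robust to hot pixels, which matters when there is no ground segment to clean the data first.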
Planet expects to make the space-based real-time AI-detection service available to customers within the next six to nine months.
Planet collaborates with Google on the Suncatcher project, which intends to deploy a vast constellation of data-processing satellites into Earth’s orbit. The project is one of a plethora of recently discussed ventures that envision moving Earth-based data-crunching infrastructure off the planet. Proponents, including tech giants SpaceX and Amazon, believe that in Earth’s orbit, power-hungry computers will be able to run on free solar power and be easily cooled without straining water supplies. But critics question whether large-scale computing infrastructure could ever be launched cheaply enough to compete with technology on Earth.

