A Bird’s-Eye View of the Chinese Balloon
The Visual Investigations team at The New York Times relies on visual evidence, such as video footage and photography, along with forensic analysis, to conduct its investigations and offer readers a more holistic view of crucial news moments.
For one of its latest projects, the team looked to the sky.
In late February, Muyi Xiao, a reporter on the Visual Investigations desk who covers China, was brainstorming ways her team could examine the Chinese balloon that floated across the United States weeks earlier. Ms. Xiao began researching satellite imagery of high-altitude balloons online. She learned of a company, Synthetaic, that used artificial intelligence to find several satellite images that contained the balloon, including one image that showed it moving over South Carolina about four hours before it was shot down. With these images, the company was able to trace the balloon’s trajectory over the United States.
Ms. Xiao contacted Synthetaic’s founder, Corey Jaskolski, that night to learn more about the technology. For about a month, Ms. Xiao and Visual Investigations, along with The Times’s Graphics desk, collaborated with Synthetaic and Planet Labs, the satellite image provider, to better understand the balloon’s capabilities and trace a more precise path, from the balloon’s launch site in Hainan, China, in mid-January to its downing weeks later.
In February, U.S. officials concluded that the balloon was part of a global surveillance fleet designed to collect information on militaries around the world; China has insisted that it was a weather balloon that drifted off course. The work of the Visual Investigations team reveals new information about the balloon’s movements and abilities, including that it was remotely maneuvered at points on its journey. In an interview, Ms. Xiao shared more about her investigation into the balloon and the challenges of tackling fast-moving news. This conversation has been edited.
What were your goals heading into the investigation?
There are two things we really wanted. One was to find the balloon’s trace in satellite imagery when it was in Asia so we could have a more complete picture of its journey using very precise sightings by satellites. The second goal was to calculate the balloon’s altitude. When I was talking with the founder of Synthetaic, he told me he had calculated the altitude at a few points and explained how he did it. I thought, OK, that makes so much sense. Now we can do that for every location of the balloon. We refined the methodology after several conversations with him and Planet Labs and calculated the altitude for every location of the balloon.
If you know the altitude of each location, then you have evidence to show whether this balloon was maneuverable. You know if it was being steered.
How does Synthetaic find objects in satellite images?
It uses an unsupervised A.I. model, which means the algorithm doesn’t have to be trained with lots of reference images of what an object looks like. You can give it a sketch of something and it will look for it in satellite images. The software will return a cluster of results that it thinks are similar to what you drew.
After you see this cluster, you manually select the images that look most like what you drew. In this process, you are teaching the algorithm what you’re looking for in a more precise way. After rounds of selection and refining, the platform will most likely give you the right results.
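Broadly, that workflow resembles a similarity search over image embeddings that a person refines by confirming matches. The sketch below illustrates only that general pattern; the embedding arrays, the refine_query helper and all of the numbers are hypothetical stand-ins, not Synthetaic’s platform or its actual method.

```python
# A minimal sketch of the general idea (not Synthetaic's actual system):
# rank image-tile embeddings by similarity to a query derived from a sketch,
# then fold the user's confirmed matches back into the query.
import numpy as np

def cosine_similarity(query, tiles):
    """Cosine similarity between one query vector and many tile vectors."""
    query = query / np.linalg.norm(query)
    tiles = tiles / np.linalg.norm(tiles, axis=1, keepdims=True)
    return tiles @ query

def refine_query(query, selected_tiles):
    """Average the user's confirmed matches into the query vector."""
    return np.mean(np.vstack([query, selected_tiles]), axis=0)

# Hypothetical data: embeddings for 10,000 satellite-image tiles and a sketch.
rng = np.random.default_rng(0)
tile_embeddings = rng.normal(size=(10_000, 128))
query = rng.normal(size=128)              # embedding of the user's sketch

for _ in range(3):                        # a few rounds of human-in-the-loop refinement
    scores = cosine_similarity(query, tile_embeddings)
    top = np.argsort(scores)[::-1][:50]   # cluster of candidate matches
    confirmed = top[:5]                   # stand-in for the tiles a person would select
    query = refine_query(query, tile_embeddings[confirmed])
```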
How did you trace the balloon’s path and altitude?
To map out the most precise path to date, you need coordinates. Based on sightings and images alone, you only have a rough region. We got the precise coordinates from Synthetaic.
To calculate the balloon’s altitude, you need raw data from Planet Labs, the satellite image provider: the satellite’s speed and altitude at any given moment; the time difference between the two images used for the calculation; and the exact resolution, meaning how many meters of ground one pixel in the imagery covers. This data isn’t publicized; we got it exclusively from Planet Labs.
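The exact formula isn’t spelled out here, but one common way to combine those inputs is a parallax calculation: the satellite moves between the two frames, so a high-altitude object appears to shift against the ground, and the size of that shift reveals its height. The sketch below shows that approach with illustrative numbers; every value, and the simplified geometry itself, is an assumption, not Planet Labs’ raw data or the team’s actual methodology.

```python
# A simplified parallax calculation, assuming the balloon's apparent ground
# position shifts between two frames because the satellite moves while the
# balloon sits far above the terrain the imagery is referenced to.
# All numbers below are illustrative assumptions, not Planet Labs' raw data.

satellite_altitude_m = 475_000      # assumed orbital altitude
satellite_speed_mps = 7_500         # assumed along-track speed
time_between_images_s = 1.0         # assumed gap between the two frames
ground_resolution_m = 3.0           # assumed meters of ground per pixel
pixel_shift = 98                    # assumed measured displacement in pixels

baseline_m = satellite_speed_mps * time_between_images_s   # how far the satellite moved
shift_m = pixel_shift * ground_resolution_m                # apparent shift on the ground

# Similar triangles: shift = baseline * h / (H - h)  =>  h = H * shift / (baseline + shift)
balloon_altitude_m = satellite_altitude_m * shift_m / (baseline_m + shift_m)
print(f"Estimated balloon altitude: {balloon_altitude_m / 1000:.1f} km")
```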
For a time, you weren’t sure whether the balloon you were tracking was the Chinese balloon. What evidence did you have to confirm that this was the actual balloon?
Julian Barnes, who covers national security from Washington, passed questions to his U.S. government sources and got some pretty good information. The most important reporting that came from Julian’s communication with his sources was the confirmation that the launch date was mid-January. The balloon we tracked launched on Jan. 15, so we confirmed it was the same balloon.
What are your takeaways from the balloon’s altitude fluctuations?
I lean on the takeaways of a balloon expert. We found a retired NASA engineer who designed high-altitude superpressure balloons, the same type of balloon as the Chinese balloon. He took a look at our altitude calculations and said the range of the altitude change along the path was not caused by natural forces; it was man-made. Based on this altitude change, the balloon is an altitude-controlled balloon, meaning that it could be steered.
What were some of your challenges during reporting?
I think the most difficult part for me was that, before this, I’d only done one article at The New York Times that involved satellite imagery. I basically had to do a crash course in some pretty advanced technology, including satellite technology and geospatial analysis, so that nothing was misrepresented in the story.
Is there some larger aspect of being on a Visual Investigations team that’s particularly complicated?
Our team is really trying to offer clarity on quick-moving news moments, and that’s challenging. It is not easy to make any story meet an investigative threshold. A lot of times, you just need more time. Even though we put a lot of time and effort into looking into something, it might not yield any results. Tackling news moments under time pressure is another persistent challenge for our team.