Satellites often assume that things on the ground aren't moving quickly (which is usually a fair assumption), so instead of trying to capture all color channels at once, they photograph each color channel in series and stitch the channels together afterwards, remapping them based on altitude data.
This works great as long as the thing they're photographing is actually on the ground and isn't moving at hundreds of miles per hour.
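To make that concrete, here's a rough simulation (my own toy numbers, not any real satellite's parameters) of how a per-channel capture delay turns a fast-moving object into separated red, green, and blue ghosts while the static ground stays aligned:

```python
import numpy as np

def capture_channel(object_x: float) -> np.ndarray:
    """Return a 1-pixel-high strip with a bright object at object_x (metres)."""
    strip = np.zeros(200)
    px = int(object_x)            # assume 1 pixel per metre of ground, for simplicity
    if 0 <= px < strip.size:
        strip[px] = 1.0
    return strip

speed_mps = 250.0                 # assumed: something doing roughly 560 mph
channel_delay_s = 0.1             # assumed gap between the R, G and B exposures

# Each channel sees the moving object a bit further along its track.
red   = capture_channel(100 + 0 * speed_mps * channel_delay_s)
green = capture_channel(100 + 1 * speed_mps * channel_delay_s)
blue  = capture_channel(100 + 2 * speed_mps * channel_delay_s)

# Static ground pixels would be identical in all three channels; the moving object
# ends up as three separated single-colour blobs, i.e. the rainbow ghosting effect.
print("object position per channel (px):", [int(c.argmax()) for c in (red, green, blue)])
```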
Here's a better example of the same issue.
I use the Google Street View app for this:
https://play.google.com/store/apps/details?id=com.google.android.street&hl=nl&gl=US
It has a "take 360 image" button. When you press it, it turns the camera on and shows a yellow sphere at the horizon; hold the camera very steady and it scans that orb (i.e. it takes a picture), then you rotate the phone so it scans the spheres at the sides, top, and bottom.
Once you've scanned all the orbs, or you press the continue button, it moves on to the next step. There it inspects all the "progress pictures" and assembles them into a full 360 photo; this takes about half a minute and deals with the "seaming issues".
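If you're curious what that assembly step boils down to, here's a minimal sketch (not Google's actual code, just the standard spherical-projection math) of how the direction a photo was taken in maps onto a spot in the final equirectangular panorama:

```python
import math

def pixel_to_panorama(yaw_deg, pitch_deg, pano_w=4096, pano_h=2048):
    """Map the centre ray of a photo taken at (yaw, pitch) to panorama pixel coords."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    # Direction the camera was pointing, as a unit vector.
    x = math.cos(pitch) * math.sin(yaw)
    y = math.sin(pitch)
    z = math.cos(pitch) * math.cos(yaw)
    lon = math.atan2(x, z)                 # -pi..pi around the horizon
    lat = math.asin(y)                     # -pi/2..pi/2 up/down
    u = (lon / (2 * math.pi) + 0.5) * pano_w
    v = (0.5 - lat / math.pi) * pano_h
    return int(u), int(v)

# The orb straight ahead lands in the middle of the panorama, the one 90 degrees
# to the right lands three quarters of the way across, the one overhead at the top.
print(pixel_to_panorama(0, 0))     # ~ (2048, 1024)
print(pixel_to_panorama(90, 0))    # ~ (3072, 1024)
print(pixel_to_panorama(0, 90))    # ~ (2048, 0)
```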
The trick to a good 360 photo is to rotate the phone around the location of the lens, so every photo lines up perfectly. This is especially important for nearby objects.
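A quick back-of-the-envelope (with made-up but plausible numbers) of why that matters: if you pivot around your wrist instead of the lens, the lens itself translates between shots, and nearby objects visibly shift while distant ones don't:

```python
import math

pivot_offset_m = 0.10      # assumed: lens sits ~10 cm away from the rotation axis
turn_deg = 30.0            # assumed: rotation needed to reach the next "orb"

# How far the lens physically moves during that turn (chord length).
lens_shift = 2 * pivot_offset_m * math.sin(math.radians(turn_deg) / 2)

for distance_m in (0.5, 5.0, 50.0):
    # Apparent angular shift of an object at that distance, in degrees.
    parallax_deg = math.degrees(lens_shift / distance_m)
    print(f"object at {distance_m:>4} m -> ~{parallax_deg:.2f} deg of parallax")

# An object half a metre away jumps by several degrees between shots (visible seams);
# at 50 m the shift is well under a tenth of a degree and stitches cleanly.
```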
I would guess that he is using one of these apps or something similar.
https://play.google.com/store/apps/details?id=com.lge.sc
https://play.google.com/store/apps/details?id=fr.giroptic.cam
In both of these apps the interface looks about the same, so it's hard to tell from that blurry picture.