A U.K. woman was photographed standing in a mirror where her reflections didn’t match, but not because of a glitch in the Matrix. Instead, it’s a simple iPhone computational photography mistake.
I see your point, though I wouldn’t take it that far. It’s an edge case that has to happen within a very short window.
Similar effects can be achieved with traditional cameras that use a rolling shutter.
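To make the rolling-shutter comparison concrete, here’s a toy simulation (not any camera’s actual readout code): each row of the output image is sampled from a different moment in time, so a moving vertical bar that is straight in every single frame comes out slanted in the composite.

```python
import numpy as np

def rolling_shutter(frames):
    """Simulate a rolling shutter: row r of the output comes from
    the frame captured at the time row r was read out.
    `frames` is a (T, H, W) stack of snapshots of a moving scene."""
    T, H, W = frames.shape
    out = np.empty((H, W), dtype=frames.dtype)
    for r in range(H):
        # Map row index to a frame index across the readout window.
        t = r * (T - 1) // (H - 1)
        out[r] = frames[t][r]
    return out

# A vertical bar moving rightward one column per frame: straight in
# any single frame, but diagonal in the rolling-shutter output.
T, H, W = 8, 8, 16
frames = np.zeros((T, H, W))
for t in range(T):
    frames[t, :, t] = 1.0  # bar sits at column t in frame t

img = rolling_shutter(frames)
```

The slant appears because no single instant is captured: the top of the image is "older" than the bottom, which is the same class of artifact being discussed here, just on a millisecond rather than multi-second scale.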
If you’re only concerned about the relative positions of different people within a time frame, I don’t think you need to be that worried. Being aware of it is enough.
I don’t think that’s what’s happening. I think Apple is “filming” over the course of the seconds you have the camera open, and uses the press of the shutter button to select a specific shot from the hundreds of frames that have been captured as video. Then, some algorithm appears to be assembling different portions of those shots into one “best” shot.
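The per-region assembly idea above can be sketched in a few lines. This is purely a conceptual toy, not Apple’s actual pipeline: the function name, the tiling scheme, and the precomputed per-tile “score” (imagine a sharpness or eyes-open metric) are all made up for illustration. Each tile of the output is taken from whichever buffered frame scored best for that tile.

```python
import numpy as np

def composite_best_regions(frames, scores):
    """Hypothetical per-region frame selection: split the image into
    tiles and, for each tile, keep the pixels from the buffered frame
    with the highest score for that tile.
    `frames`: (T, H, W) frame buffer; `scores`: (T, tiles_y, tiles_x)."""
    T, H, W = frames.shape
    ty, tx = scores.shape[1:]
    th, tw = H // ty, W // tx
    out = np.empty((H, W), dtype=frames.dtype)
    for i in range(ty):
        for j in range(tx):
            # Pick the frame that scored best for this tile.
            best = int(np.argmax(scores[:, i, j]))
            out[i*th:(i+1)*th, j*tw:(j+1)*tw] = \
                frames[best, i*th:(i+1)*th, j*tw:(j+1)*tw]
    return out
```

If neighboring tiles get pulled from frames captured seconds apart, a subject can appear in two inconsistent poses in one photo, which would explain a mirror reflection that doesn’t match the person in front of it.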
It’s not just a mechanical shutter effect.
I’m aware of the differences. I’m just pointing out that similar phenomena have been discussed ever since rolling shutter artifacts became a thing. It still only takes milliseconds for an iPhone to finish capturing its plethora of photos to composite. For the majority of forensic use cases, it’s a non-issue imo. People don’t move quickly enough to change relative positions substantially irl.
Did you look at the example in the article? It’s clearly not milliseconds. It’s several whole seconds.