Support for iPhone Live Photos

Issue
PhotoStructure displays different timestamps for the JPG and MOV files from iPhone Live Photos. The .MOV files are timestamped +9 hours ahead of the JPG files in PhotoStructure, which causes them to appear out of order instead of next to each other. The JPG and MOV files have the same timestamp when viewed in a file explorer.

Setup

  • iPhone w/ iOS 14.3 using “Most Compatible” format (JPG/MOV)
  • PhotoStructure v0.9.1 AppImage running on Ubuntu 20.04
  • PhotoSync syncs photos from iPhone to Synology NAS (storage location for PhotoStructure)

Steps to Replicate

  • Take photos on iPhone
  • PhotoSync syncs separate JPG and MOV files to Synology NAS photos folder
  • PhotoStructure scans Synology NAS photos folder and imports new iPhone live photos

EXPECTED RESULT: JPG and MOV files appear with the same timestamps and are right next to each other in PhotoStructure (or even better, merged)

OBSERVED RESULT: JPG and MOV files are separate with different timestamps. MOV files are +9 hours ahead of JPG times

Is there a better way to deal with iPhone live photos using PhotoStructure?


Sorry about this! You’re actually seeing both a bug and a missing feature.

The bug: videos are N hours off because PhotoStructure v0.9.1 doesn’t correctly infer the timezone when it’s missing from tags (which is frequently the case with videos).

The missing feature: live photos need to be “deduped” into a single asset, which will require several changes to the current deduping heuristics. Live photos (with autoplay-video-that-fades-to-high-resolution-image) need some custom FE work, too. It’s on my todo list.

I’m going to recategorize this post as a feature request so people can vote for live photo support. :+1:

I am not sure if this helps, but I found this link elsewhere: Working with Live Photos | Limit Point
This at least seems to indicate that live photos need not rely on a heuristic, but can be identified deterministically. Personally, I feel that UI magic like autoplay-on-hover or auto-playing videos in the thumbnails can be considered advanced features.
I am currently evaluating PhotoStructure, and having the live videos all over the place makes the overview very messy. I would be perfectly fine with having the live videos only accessible indirectly at first, or even not at all, to be honest. :slight_smile:

Wow, that’s a great tip, thanks for sharing that! I’ll add this to the deduplication heuristics as soon as I can.

Hi,

Are there any updates since the last reply, or any new Live Photo-related features in PhotoStructure now?

Jacob

Howdy: live photo support won’t be in the next release, v2.1.

I did find the content UUID that Apple uses, exposed via standard tags (the approach in the link above requires Apple's Photos library, and PhotoStructure needs to work on Linux and Windows as well).

There are two bits of work for this feature: backend aggregation support, and frontend rendering support.

Depending on the file, MediaGroupUUID or ContentIdentifier can be used as a content UUID to aggregate the HEIC and HEVC files. I need a new indexed content UUID database column to efficiently glue the variants together at import time. That’s totally doable, but v2.1 is feature frozen (it keeps getting pushed out due to a terrible case of “while I’m in there, just one more improvement”-itis).
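For the curious, here's a minimal sketch (not PhotoStructure internals) of reading that shared content UUID with exiftool-vendored, the ExifTool wrapper PhotoStructure uses. The file names are placeholders and the contentUuid helper is just for illustration:

```typescript
import { exiftool, Tags } from "exiftool-vendored";

// Read whichever of the two tags is present; the value is the UUID Apple
// writes into both halves of a Live Photo pair.
function contentUuid(tags: Tags): string | undefined {
  const t = tags as unknown as Record<string, string | undefined>;
  return t.ContentIdentifier ?? t.MediaGroupUUID;
}

async function main(): Promise<void> {
  // File names are placeholders: substitute a real still/video pair.
  const still = await exiftool.read("IMG_1234.HEIC");
  const video = await exiftool.read("IMG_1234.MOV");
  console.log(contentUuid(still)); // the same UUID should print twice
  console.log(contentUuid(video));
  await exiftool.end(); // shut down the ExifTool child process
}

void main();
```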

The next bit of work is to make the frontend handle smoothly transitioning from the autoplayed movie to the higher resolution still image. Again, totally doable, but it needs to be in a future release.
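To give a flavor of that, here's a rough sketch of the crossfade idea in plain DOM TypeScript, assuming the video and image elements are stacked on top of each other by the surrounding CSS. This is not PhotoStructure's actual player code:

```typescript
// Rough sketch: autoplay the muted video, then fade in the high-res still
// (and fade out the video) once playback ends.
function attachLivePhotoCrossfade(
  video: HTMLVideoElement,
  still: HTMLImageElement
): void {
  still.style.opacity = "0";
  still.style.transition = "opacity 300ms ease-in";
  video.style.transition = "opacity 300ms ease-out";
  video.muted = true; // most browsers only allow muted autoplay
  video.playsInline = true;

  video.addEventListener("ended", () => {
    still.style.opacity = "1"; // fade the still in...
    video.style.opacity = "0"; // ...while the video fades out
  });

  video.play().catch(() => {
    // If autoplay is blocked, skip the animation and show the still.
    still.style.opacity = "1";
  });
}
```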

Apologies for the delay, and thanks for your patience!

Would this also apply to non-iPhone Live Photos? And does this thread cover non-iPhone live photos as well?

@Samit_Shaikh if you have a “live photo” image/video pair from a non-iPhone, and it doesn’t contain any private content, please send them in an email to support@photostructure.com and I can take a look at what I’d need to do to support it.

Eh, maybe? It really depends on how different the code will have to be to properly aggregate and render those other “live photos”.

Wow! In case this is interesting: that’s a far more conclusive answer than the most senior Apple media support rep available was able to give me. When I asked for this level of detail about how Apple must be storing Live Photo UUIDs somewhere in the .jpg/.mov metadata, after four separate rounds of “I don’t know, let me escalate you” I was finally met with “that is something so low-level and niche that our Photos.app developers do not even tell us how that works”.

At the moment my 40,000-item media library is aggregated in a few central folders, since I’m trying to keep things out of Photos.app, but I have not been able to find a way to mark, filter, or otherwise organize Live Photo pairs.

Depending on the file, MediaGroupUUID or ContentIdentifier can be used as a content UUID to aggregate the HEIC and HEVC files.

I believe you may have just given the most concrete and specific answer anywhere on the internet to what happens with Live Photos at the metadata level.

My only question now is: is there any way I can use this info on my own to organize my dangling Live Photos until PhotoStructure v2.1 comes out?

Thanks so much for your efforts!

Oh OK, I’ll get on it soon.

Maybe? It depends on what you’re going to accomplish once you find each “pair”.

PhotoStructure uses ExifTool (via exiftool-vendored) to extract these tags. If you’re handy with shell scripting, you could cook up a script to move or rename files based on their content UUID, perhaps?
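For example, something along these lines (a rough, unsupported sketch, with a placeholder directory path) would print each group of files that share a content UUID, so you could then rename or move the pairs however you like:

```typescript
import { exiftool } from "exiftool-vendored";
import { readdir } from "node:fs/promises";
import { join } from "node:path";

// Group files in one folder by their Live Photo content UUID
// (ContentIdentifier on stills, MediaGroupUUID on some videos)
// and print each group of matching files.
async function main(dir: string): Promise<void> {
  const groups = new Map<string, string[]>();
  for (const name of await readdir(dir)) {
    if (!/\.(jpe?g|heic|mov|mp4)$/i.test(name)) continue;
    const path = join(dir, name);
    const tags = (await exiftool.read(path)) as unknown as
      Record<string, string | undefined>;
    const uuid = tags.ContentIdentifier ?? tags.MediaGroupUUID;
    if (uuid != null) {
      groups.set(uuid, [...(groups.get(uuid) ?? []), path]);
    }
  }
  for (const [uuid, files] of groups) {
    if (files.length > 1) console.log(uuid, files); // a Live Photo still/video pair
  }
  await exiftool.end(); // shut down ExifTool when done
}

// The directory is a placeholder: point it at a *copy* of your photos.
void main("/path/to/photos");
```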

Whatever you do, please have a full, offline backup of your stuff: it’s astoundingly easy to get things not quite right, and computers are fast, especially when it comes to cataclysmic terribleness. I actually have several full offline backups: lots of copies keep stuff safe.

Thanks for sharing: bummer that Apple gave you the runaround!

Thanks so much for keeping this on your to-do list!

For what it’s worth, although I think it would be super cool to have the Live Photo video play and then display the still image in PhotoStructure like Apple does, I’d be very happy in the meantime to have the date mismatch issue sorted out so that the video and photo portions of the Live Photos show up in pairs right next to each other in my library, and don’t end up hours or even days apart. I’ve been holding off on loading lots of our photos from the past several years into PhotoStructure because we have so many Live Photos. Based on my spot testing, it looks like they’re going to be pretty scattered, with photo/video pairs split up all over the place. :slight_smile: If it would help for me to resend some of the Live Photos that get sorted several days apart, please let me know!

Thanks again for sending those examples! I replied via email just now, but it looks like the new EXIF parsing code in v2.1 is handling the timezone offsets correctly in your files now.

If you try v2.1 and find it’s not doing the right thing, please do tell me!

Cheers!