Combining Images

Is PhotoStructure supposed to combine similar but not identical images? I noticed that a number of images taken by timer (like 10 images in quick sequence) were combined into 1 by PhotoStructure (see image). Is this the expected behavior?

The deduping heuristics are fairly complicated:

You can use the "info" tool and give it two files to have PhotoStructure explain why two images will or will not be considered duplicates.

https://photostructure.com/server/tools/#file-comparisons

It will aggregate images that are taken with the same camera and the same exposure information at the same moment (where "moment" has subsecond accuracy), with looser image-content matching (to ensure RAW and JPEG pairs get combined). If you DM me a couple of example images, I'd be happy to look into it: the heuristics could certainly use tweaking for new use cases I haven't considered.
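Roughly, the shape of that gate looks something like the following TypeScript sketch. This is only an illustration: the FileMeta type, the perceptual-hash comparison, and the threshold numbers are all hypothetical, not PhotoStructure's actual code.

interface FileMeta {
  cameraMakeModel: string; // e.g. "Canon EOS R5"
  capturedAtMs: number;    // capture time in epoch milliseconds (subsecond precision)
  imageHash: bigint;       // a perceptual hash of the rendered image
}

// Count differing bits between two perceptual hashes.
function hammingDistance(a: bigint, b: bigint): number {
  let x = a ^ b;
  let bits = 0;
  while (x > 0n) {
    bits += Number(x & 1n);
    x >>= 1n;
  }
  return bits;
}

function looksLikeSameAsset(a: FileMeta, b: FileMeta): boolean {
  const sameCamera = a.cameraMakeModel === b.cameraMakeModel;
  const sameMoment = Math.abs(a.capturedAtMs - b.capturedAtMs) < 1000; // within a second
  // When camera and moment line up, tolerate a larger hash distance so a RAW
  // file and its JPEG (which render slightly differently) still merge;
  // otherwise require a near-exact content match.
  const maxHashDistance = sameCamera && sameMoment ? 12 : 2;
  return hammingDistance(a.imageHash, b.imageHash) <= maxHashDistance;
}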

Cheers!

I guess I don't know if it is necessarily a bad thing or not. I could understand grouping photos like that together (when taken in quick succession with a timer), although sometimes you want to see a "funny" photo and a "serious" photo. Perhaps there could be a way to break the photos apart, and a way to choose the "best" photo to show from the group. Also, I'll have to check whether it accurately grouped all of the photos from the timer shots we took.

I can send you some images from this.

What would be the simplest way to "un-group" two or more images, aside from updating the DB? Change the EXIF capture timestamp to be more than (what value?) seconds apart?

K

If a photo was captured more than two seconds apart from the others, or has substantively different exposure information (different when compared at 2 significant digits), it'll be kicked into a different asset.
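In sketch form (these helpers are made up for illustration, not PhotoStructure's real API, but the thresholds match the description above):

// True when two capture times are within two seconds of each other.
function withinTwoSeconds(a: Date, b: Date): boolean {
  return Math.abs(a.getTime() - b.getTime()) <= 2000;
}

// True when two exposure values agree at two significant digits,
// e.g. 1/125s (0.0080) vs 1/128s (0.0078) do NOT agree.
function sameAtTwoSigFigs(a: number, b: number): boolean {
  return a.toPrecision(2) === b.toPrecision(2);
}

// A file gets kicked into a different asset when either check fails:
function shouldSplit(
  aTime: Date, bTime: Date,
  aShutterSec: number, bShutterSec: number
): boolean {
  return !withinTwoSeconds(aTime, bTime) || !sameAtTwoSigFigs(aShutterSec, bShutterSec);
}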

I thought about it, and I could add a "this isn't a duplicate of these images" button to the asset info panel. It would add a new UUID to the asset's files and a different UUID to the file that needs to be kicked out of the group. The deduper will then be able to keep those files apart in subsequent syncs or rebuilds.
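Something like this sketch of the idea (not a committed design; the field names here are hypothetical):

import { randomUUID } from "node:crypto";

interface TaggedFile {
  path: string;
  assetGroupUuid?: string; // persisted with the file's metadata
}

// Split one file out of an asset: the files that stay together share one
// fresh UUID, and the outlier gets its own, so the deduper can tell them
// apart on the next sync or rebuild.
function splitFromAsset(keep: TaggedFile[], kickOut: TaggedFile): void {
  const sharedUuid = randomUUID();
  for (const f of keep) f.assetGroupUuid = sharedUuid;
  kickOut.assetGroupUuid = randomUUID();
}

// The deduper would then refuse to merge files whose group UUIDs differ:
function mayMerge(a: TaggedFile, b: TaggedFile): boolean {
  return (
    a.assetGroupUuid === undefined ||
    b.assetGroupUuid === undefined ||
    a.assetGroupUuid === b.assetGroupUuid
  );
}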

This sounds great. I analyzed two images I want to "split" with exiftool, and I don't see a proper createTS in them, yet PS somehow managed to find it correctly. HOW!? I looked at various fields in the exiftool output, but there are no valid candidates for this. These three below wrongly point to filesystem-level properties:

File Modification Date/Time     : 2021:04:28 21:09:32-04:00
File Access Date/Time           : 2021:05:28 18:56:25-04:00
File Inode Change Date/Time     : 2021:04:28 21:09:32-04:00

PS's info tool for the same image gives a capturedAt JSON element with what I think are the correct dates.

{
      date: u {
        start: ExifDateTime {
          year: 2016,
          month: 12,
          day: 17,
          hour: 17,
          minute: 52,
          second: 4,
          millisecond: 0,
          tzoffsetMinutes: undefined,
          rawValue: '2016:12:17 17:52:04',
          zoneName: undefined
        },
        middle: ExifDateTime {
          year: 2016,
          month: 12,
          day: 17,
          hour: 21,
          minute: 25,
          second: 3,
          millisecond: 250,
          tzoffsetMinutes: -300,
          rawValue: undefined,
          zoneName: 'America/Toronto'
        },
        end: ExifDateTime {
          year: 2016,
          month: 12,
          day: 18,
          hour: 11,
          minute: 37,
          second: 0,
          millisecond: 250,
          tzoffsetMinutes: -300,
          rawValue: '2016:12:18 11:37:00.250',
          zoneName: undefined
        },

Found this: PhotoStructure | How does PhotoStructure capture captured-at? Seems we are getting to Step 4: Infer a date interval from siblings. This is freaking awesome and so close to black magic… :astonished:
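If I'm reading that doc right, the inference is something like this rough sketch (my own guess in TypeScript, definitely not PhotoStructure's actual implementation): when a file has no usable capture tags, look at the nearest sibling files in the same directory that do have capture dates, and treat those dates as the start and end of an interval the orphan must fall inside.

interface Sibling {
  name: string;      // filename, assumed roughly chronological when sorted
  capturedAt?: Date; // undefined when no capture tags were found
}

function inferInterval(
  siblings: Sibling[],
  orphanName: string
): { start: Date; end: Date } | undefined {
  const sorted = [...siblings].sort((a, b) => a.name.localeCompare(b.name));
  const i = sorted.findIndex((s) => s.name === orphanName);
  if (i < 0) return undefined;

  // Nearest dated sibling before and after the orphan:
  const start = sorted.slice(0, i).reverse().find((s) => s.capturedAt)?.capturedAt;
  const end = sorted.slice(i + 1).find((s) => s.capturedAt)?.capturedAt;
  if (!start || !end) return undefined;
  return { start, end };
}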

K

We just had a photoshoot, and when I added the photos to the library it combined a lot of them. Based on how it works, that makes sense, but I would rather it not combine them: I had to use Windows Photo Viewer to actually flip through and view each image easily, when I would have rather used PhotoStructure.

Perhaps there could be a switch that turns off duplicate combining, or even enables/disables certain rules so it combines exactly how you want. Or perhaps a switch in the UI that splits all the files so you can look at each of them for one option, and combines them back together per the de-dupe rules for the other.


If you can send me some incorrectly-deduplicated image pairs, I can take a look and tweak the heuristics.

Hmm, that's an interesting approach! I'll think about that a bit.

If I can stick my oar in, I'd love the ability to turn off de-duping entirely.

My PhotoStructure library is an export from my Lightroom catalogue, and I've got no duplicates, because the catalogue is already maintained and de-duped.
I'm concerned that I might have photos which could be de-duped when I don't want that, e.g. burst sequences, film scans which all have the same capture date, or old photos which have all been set to 1-Jan-1990. De-duping has no value for my library, although I can imagine it being very helpful in other scenarios.

Same. I am also in that boat where my photos are already deduped. Therefore I would also love a switch that turns off de-duping heuristics entirely.

So far I have been able to turn it off myself by setting all the coefficients to 100, but something more user-friendly would make this easier! :slight_smile:

Needless to say, I'm quite jealous of you two that your libraries are already so tidy. I'll think about how to add a simple "only very strict deduping" setting to make this easier to configure.


To include in your thinking…

The ability to 'de-dupe' different file types into one asset, but never include multiples of the same file type.

That is, I've also spent a lot of time cleaning up my photos, but I shoot a lot of JPEG+RAW that I would still like combined into one asset.

Just a thought.

@rgrandy shared a couple of the incorrectly-aggregated images with me, and it turns out that the photographer stripped the images of all standard timestamp tags (!!) and added Metadata Date, which is an XMP tag I've never seen in the wild before (I suspect it's from Lightroom).

@bdillahu, @Adrian-Ng: if you can send me some incorrectly aggregated images, it might be another case of a simple tweak I can make, and then you won't have to fight the de-duplicator anymore.

This fix will be in 1.0.0-beta.4 (which will be released soon).

@rgrandy just shared a couple more examples: it turns out that when I added the fuzzy date comparator, I'd inadvertently deleted the simple non-fuzzy date check.

@bdillahu and @Adrian-Ng, you'll probably want to run "rebuild library" when you upgrade to 1.0.0-beta.4: this will re-aggregate the assets in your library and fix these issues.

I've actually just removed this tag from the default set of values for the capturedAtTags setting in v2.1. According to the XMP spec, MetadataDate is the last time the metadata was edited: it's certainly not the correct value for when the image was captured.