Process question: non-destructive edits in source libraries

This is kind of a general question and I am not sure there is a good answer to it, but I am curious to hear how people deal with image adjustments applied to originals by upstream photo management software.
An example: I have a photo that looks bad, so I import it into Lightroom and adjust it, and now it looks good. Yet, since the edit is non-destructive, when I import the original into PhotoStructure, I do not see my changes: PhotoStructure shows the original botched exposure.
I can export the image from Lightroom and import that into PhotoStructure, but then the connection to the original is broken.
It feels like a “wanting to have my cake and eat it too” kind of question, but I am wondering if folks have found solutions for this because, let’s face it, we all do want to have our cake and eat it too! :slight_smile:

I’d love to be able to “replay” the transformations made in other applications (like Lightroom, digiKam, darktable, …), but the last time I looked into this:

a) there didn’t seem to be any standardization, so each editor would require specialized work

b) there wasn’t a straightforward way for PhotoStructure to convert those transforms into a rasterized image.

(FWIW, I’d be happy to add a setting so people could configure calls to external applications to “export on the fly” to make this less onerous, but we’d need to look into exactly what that would entail so I could design the new setting correctly)
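As a concrete sketch of what “export on the fly” could look like: darktable already ships a headless darktable-cli that renders a raw plus its XMP sidecar into a regular image. A minimal sketch, assuming darktable-cli is on the PATH (the wrapper function is hypothetical, not an existing PhotoStructure hook):

  // Sketch: render a raw + darktable XMP sidecar to a JPEG via darktable-cli.
  // darktable-cli <input> [<xmp>] <output> applies the sidecar's edit history.
  import { execFile } from "node:child_process";
  import { promisify } from "node:util";

  const run = promisify(execFile);

  async function exportWithDarktable(
    rawPath: string, // the untouched original
    xmpPath: string, // the non-destructive edit recipe
    outPath: string  // the rendered proxy that PhotoStructure would import
  ): Promise<string> {
    await run("darktable-cli", [rawPath, xmpPath, outPath]);
    return outPath;
  }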

If you did export it, wouldn’t PhotoStructure manage to connect it to the original?
Essentially, it should merge the two photos (the original and the edited version) into one asset.

It depends, unfortunately: if the exporting application erases enough metadata, PhotoStructure has to resort to image content matching, and frequently a large exposure, contrast, or saturation edit will result in a different-enough image hash to make PhotoStructure think it’s a different asset.

But ideally, yes, the deduper would do its job properly (and if you ever see otherwise, please tell me: the heuristics can only get better with more examples!)
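To make that failure mode concrete, here’s a toy “difference hash” (an illustration only, not PhotoStructure’s actual deduper): a heavy exposure or contrast edit can flip enough of the gradient bits that the Hamming distance between the original and the edit exceeds any reasonable “same asset” threshold.

  // Toy perceptual "difference hash" (dHash). Illustrative only.
  import sharp from "sharp";

  async function dHash(path: string): Promise<bigint> {
    // 9x8 grayscale thumbnail -> 64 bits, one per horizontal gradient.
    const px = await sharp(path)
      .grayscale()
      .resize(9, 8, { fit: "fill" })
      .raw()
      .toBuffer();
    let hash = 0n;
    for (let y = 0; y < 8; y++) {
      for (let x = 0; x < 8; x++) {
        hash = (hash << 1n) | (px[y * 9 + x] > px[y * 9 + x + 1] ? 1n : 0n);
      }
    }
    return hash;
  }

  // Count of differing bits: small = probably the same image.
  function hammingDistance(a: bigint, b: bigint): number {
    let bits = 0;
    for (let diff = a ^ b; diff > 0n; diff >>= 1n) bits += Number(diff & 1n);
    return bits;
  }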

I use Folder Publisher by Jeffrey Friedl to publish/export my Lightroom images, and then use those exported JPGs for PhotoStructure. This way PS has all the edits.

When I make changes to images in Lightroom, it republishes the changes to PS. I find I need to click Restart Sync in PS to pick up the changes or new photos, but otherwise it’s quite a quick and seamless workflow.

I imagine the scenario where the original is P12345.jpg and the edited version is P12345_edited.jpg (I think I’ve seen that naming convention in one of my sources; I forget which). I would think PS could give special consideration in the matching when it sees this sort of filename pattern.

Ah, nice! It’d be easy for me to automatically reduce the image correlation threshold if the file names seemed to “match.”
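Something like this sketch is what I mean by “seemed to match” (the suffix list is illustrative; a real setting would make it configurable):

  // Sketch: treat "P12345_edited.jpg" as a likely sibling of "P12345.jpg".
  import { basename, extname } from "node:path";

  const EditSuffixes = ["_edited", "-edited", "(1)", "(2)"]; // hypothetical defaults

  function stemOf(file: string): string {
    const stem = basename(file, extname(file));
    for (const suffix of EditSuffixes) {
      if (stem.endsWith(suffix)) return stem.slice(0, -suffix.length);
    }
    return stem;
  }

  // If the stems match after suffix-stripping, the deduper could lower its
  // image-correlation threshold instead of requiring a near-exact match.
  function namesSeemToMatch(a: string, b: string): boolean {
    return stemOf(a) === stemOf(b);
  }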

@awojtas would this help you too? What naming convention does your tool use by default? Basically, can you send me

  1. a full path to an original image, and
  2. a full path to an exported image?

Yes, these ideas are similar to what I was kicking around in my head. Rather than a token in the name, I was thinking about using a keyword in the metadata that would tell PS that this image is an edited version that has been “developed” and represents the intended look, superseding the original. So, assuming PS properly matches the original with the edited version, it would automatically display the developed version, but you could click a button to display the original.
Since the editing is done in another program that uses non-destructive edits, what PS needs is a PROXY of the edited work rather than a fully baked-out version (a quality-85 JPG versus a full 16-bit TIFF). This would give you something temporary you could use right away, plus the info you would need to go back and edit the original.
Obviously, the issue is how you marry these edited files to their originals so PS can bundle them back up as the same asset. I could use a specific token when naming the file during export, or use Jeff Friedl’s Lightroom plugin called “metadata wrangler,” which would let me embed any extra info.
The problem with the filename token is that among my 150,000 photos, I do have some names that repeat across completely different originals, and the edits (crops, flips, extreme color corrections) might make it hard to find the correct source.
The metadata keyword relies on the functionality of the image editing software, and there is no consistency there. Maybe the original capture date could help differentiate the correct original source?
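As a sketch of what detecting such a marker could look like (this uses exiftool-vendored to read IPTC/XMP keywords; the “developed” keyword itself is just a made-up convention):

  // Sketch: check an exported file for a hypothetical "developed" keyword.
  import { exiftool } from "exiftool-vendored";

  async function isDevelopedProxy(path: string): Promise<boolean> {
    const tags = await exiftool.read(path);
    // Keywords may come back as a single string or an array.
    const kw = tags.Keywords ?? [];
    const keywords = Array.isArray(kw) ? kw : [kw];
    return keywords.includes("developed");
  }
  // (Call exiftool.end() once the process is done reading files.)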
Even so, my feeling is that the filename token is the way to go.

Make the token configurable? For those of us who have hopped from tool to tool, there are likely multiple variations on this theme. Then there is the question of multiple edits… so maybe it’s more a pattern than a specific token.

It reminds me a little of what I see all over my library, thanks to sloppy cut and paste on some OSes: IMG123.jpg, IMG123(1).jpg, IMG123(2).jpg… Those are getting matched by PS just because they’re truly identical files, but what if there was a setting that could explicitly match a pattern? Dare I suggest regex? (shudder in fear)

Agreed: me too. That’s why I think a better approach would be to lower the threshold for image correlation, rather than just ignoring image correlation.

Yup! I was just asking so I could make a “reasonable” set of defaults.

Now now, regular expressions aren’t that scary.

Unless you’re trying to validate emails. But don’t do that.

(I was actually assuming I’d use regular expressions here: there are already several settings that are either directly interpreted as RegExp or assembled into one; see sensitiveEnvRegExp and validationErrorBlocklist.)
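For instance, a hypothetical editedFilenamePattern setting whose first capture group is the original’s stem could look like this (both the pattern and the setting name are illustrative):

  // Sketch: a user-configurable RegExp whose first capture group is the
  // stem of the original. "editedFilenamePattern" is a hypothetical name.
  const editedFilenamePattern = /^(.+?)(?:_edited|-\d+|\(\d+\))$/;

  function originalStemFor(stem: string): string | undefined {
    return editedFilenamePattern.exec(stem)?.[1];
  }

  originalStemFor("P12345_edited"); // => "P12345"
  originalStemFor("IMG123(2)");     // => "IMG123"
  originalStemFor("P12345");        // => undefined (not an edit)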

No, the naming doesn’t matter for me. If a photo is exported via the Folder Publisher plugin for Lightroom, I want it shown in the library. I wish I could turn off the de-duping entirely.

Ah, we talked about this a while back.

I’ve since added “meta” settings that set a bunch of other settings. disableAllFilters is one of those.

I’ve just made the strictDeduping setting set the following:

  Settings.useImageHashes.tmpValue = true
  Settings.minExposureSettingsCoeffPct.tmpValue = 95
  Settings.minImageCoeffPctWithExactDate.tmpValue = 95
  Settings.minImageCoeffPctWithFuzzyDate.tmpValue = 95
  Settings.minGreyscaleImageCoeffPct.tmpValue = 98
  Settings.minColorCoeffPct.tmpValue = 95
  Settings.minMeanCoeffPct.tmpValue = 95
  Settings.modeCorrCieDiffWeight.tmpValue = 1
  Settings.modeCorrIndexDiffWeight.tmpValue = 2

This should effectively disable “fuzzy” merging, but be aware that RAW and JPG versions of the same photo may no longer match, either.
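(Like other boolean settings, this should be settable via strictDeduping = true in your library’s settings.toml, or the matching PS_STRICT_DEDUPING environment variable.)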

This will drop with v1.0.0-beta.10.
