Thanks for taking the time to share this! I saw Simon's post when it was on Hacker News, which pointed to osxphotos: both are excellent work.
(full disclosure: I've had some horrible experiences with Apple Photos corrupting my library, several times, so personally I don't use it. But I understand the desire to not redo any organization or curation work that people have done in other apps, which is why the next version pulls in Picasa and IPTC tags.)
Caution: the following is me prattling on, rubber-ducky style.
Ideally there would be a way to plug third-party "curators" into PhotoStructure, so if someone wanted to incorporate new tags from different sources, they could.
The reason I haven't just pulled this all into PhotoStructure already is that, at least currently, the exact same code runs on all 5 different flavors of PhotoStructure, and my only runtime requirement for every PhotoStructure edition is Node.
Both of these projects require a Python 3 runtime (which, if I could add it as a dependency, would help with so many other things, especially ML/AI-related projects, but it would be a bear to package into the already-large PhotoStructure for Desktops editions).
The other issue with curation is very similar to the problem with keyword and other metadata editing: when there are several "variants" for a given asset, which variant (just the "best" one, or all of them?) should contribute to the final asset's metadata?
Currently, metadata tag values are "layered": the variants are sorted by a set of heuristics, and for any given tag, the asset's value comes from the best-sorted variant that has a value for that tag.
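To make that concrete, here's a minimal sketch of the layering idea (hypothetical code, not PhotoStructure's actual implementation):

```javascript
// Hypothetical sketch of tag "layering": variants are assumed to be
// pre-sorted best-first by some heuristic, and each tag takes its value
// from the first (best) variant that defines it.
function layerMetadata(sortedVariants) {
  const merged = {}
  for (const variant of sortedVariants) {
    for (const [tag, value] of Object.entries(variant.tags)) {
      if (!(tag in merged)) merged[tag] = value
    }
  }
  return merged
}

// Example: the RAW variant "wins" for Rating, but the JPEG still
// contributes Keywords because the RAW has none.
const asset = layerMetadata([
  { name: "IMG_0001.CR2", tags: { Rating: 5 } },
  { name: "IMG_0001.JPG", tags: { Rating: 3, Keywords: ["beach"] } },
])
// asset is { Rating: 5, Keywords: ["beach"] }
```

Note that no variant's metadata is discarded outright: a lower-ranked variant can still supply any tag the better variants are missing.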
Apple Photos doesn't really de-dupe things (at least in my library) very well, so there is likely to be metadata in the photos library for more than one variant. I guess the layering strategy still makes sense in this case, as well?
In any case, osxphotos supports a one-time extraction of library metadata into sidecars. PhotoStructure already looks for and reads from these sidecars (and support for Title and Description is coming as well). So, if you're willing to run osxphotos export ... --sidecar ..., the metadata that gets added to those sidecars will already make its way into your PhotoStructure library.
A couple beta users are already using osxphotos to feed PhotoStructure, fwiw.
For tags that aren't added to sidecars: I'm already spinning up ExifTool to do metadata extraction, so there could be one or more tools, configured via a setting, that are handed a path to a file and return a JSON object with N tags (like the ZOVERALLAESTHETICSCORE). I guess that score could be translated into a star rating for photos, and labels could be interpreted as additional keywords, perhaps?
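The score-to-stars translation could be as dumb as a linear mapping. This is a hypothetical sketch: ZOVERALLAESTHETICSCORE is a 0.0–1.0 float in the Photos database, and the thresholds here are made up for illustration:

```javascript
// Hypothetical mapping from Apple Photos' ZOVERALLAESTHETICSCORE
// (a 0.0-1.0 float) to a 1-5 star rating. Linear scaling with a
// floor of 1 star; real thresholds would want tuning.
function scoreToStars(score) {
  if (score == null || Number.isNaN(score)) return undefined
  const clamped = Math.min(1, Math.max(0, score))
  return Math.max(1, Math.round(clamped * 5))
}

scoreToStars(0.93) // -> 5
scoreToStars(0.55) // -> 3
scoreToStars(0)    // -> 1 (floor, so "scored but ugly" still shows a star)
```

Whether a near-zero aesthetic score should really earn one star (versus no rating at all) is exactly the kind of thing a pluggable curator setting could decide.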
And thus concludes this rubber-ducky episode. Thanks for tuning in.