Blur content that's NSFW, NSFL, gory, disturbing, etc.

While going through my photos to make sure nothing private was present before sharing with my family, I came across some really gory ones from a hand injury. Some people absolutely don’t want to see any blood; it makes them nauseous :nauseated_face:.

I suggest that if a photo is tagged NSFL or NSFW, it is not just blurred (which can still show some detail) but instead replaced with a placeholder warning image. Maybe let people view all of an image’s tags before the content is shown, so users can better determine whether they actually want to see the image.

This is a great idea!

UX-wise, is it enough to just blur thumbnails, and un-blur the image when they click it?

Unfortunately the current database schema makes this lookup somewhat expensive. I suspect I’d need to materialize the blur/nsf* flag into the asset table and add it to the cover index. I have a bunch of other materialized bits in the schema already (like asset tag counts), so it’s just another thing the db janitor needs to do.
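To make the idea concrete, here's a minimal sketch of what "materialize the flag and index it" could look like. This uses SQLite with invented table and column names (`asset`, `asset_tag`, `is_nsfw`), not PhotoStructure's actual schema; it's just an illustration of the janitor-style backfill plus an index that covers the flag:

```python
import sqlite3

# Hypothetical mini-schema; names are illustrative only.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE asset (id INTEGER PRIMARY KEY, path TEXT,
                    is_nsfw INTEGER NOT NULL DEFAULT 0);
CREATE TABLE asset_tag (asset_id INTEGER, tag TEXT);
""")
db.executemany("INSERT INTO asset (id, path) VALUES (?, ?)",
               [(1, "a.jpg"), (2, "b.jpg"), (3, "c.jpg")])
db.executemany("INSERT INTO asset_tag VALUES (?, ?)",
               [(1, "NSFW"), (2, "vacation"), (3, "NSFL")])

# "Janitor" pass: materialize the nsf* flag onto the asset row so
# cover/thumbnail queries don't need a per-asset join against tags.
db.execute("""
UPDATE asset SET is_nsfw = EXISTS (
  SELECT 1 FROM asset_tag t
  WHERE t.asset_id = asset.id AND t.tag IN ('NSFW', 'NSFL')
)
""")

# Include the flag in the index used to pick covers, so the lookup
# can stay an index scan instead of a join.
db.execute("CREATE INDEX idx_asset_cover ON asset (is_nsfw, id)")

flagged = [r[0] for r in db.execute(
    "SELECT id FROM asset WHERE is_nsfw = 1 ORDER BY id")]
print(flagged)  # → [1, 3]
```

The tradeoff is the usual one for materialized data: reads get cheap, but the janitor has to re-run the backfill whenever tags change.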

(There are also ML models that do NSFW-ish detection, so this could be automated, but what constitutes NSFW varies enormously across customs and cultures, so it’d need to be configurable.)
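The "configurable" part could be as simple as per-category thresholds applied to whatever scores a detector emits. This is a sketch under assumptions: the category names (`explicit`, `gore`, `suggestive`) and the shape of the scores dict are invented, not tied to any particular model:

```python
# Assumed: some detector returns per-category scores in [0, 1].
# Categories and default thresholds here are purely illustrative.
DEFAULT_THRESHOLDS = {"explicit": 0.5, "gore": 0.5, "suggestive": 0.8}

def sensitive_tags(scores: dict[str, float],
                   thresholds: dict[str, float] = DEFAULT_THRESHOLDS) -> list[str]:
    """Return the categories whose score meets the user's threshold.

    Unknown categories default to a threshold of 1.0, i.e. never flagged.
    """
    return sorted(c for c, s in scores.items()
                  if s >= thresholds.get(c, 1.0))

# A library whose users never want to see blood can lower the gore
# threshold; a permissive library can raise thresholds or drop categories.
print(sensitive_tags({"explicit": 0.1, "gore": 0.92}))  # → ['gore']
print(sensitive_tags({"gore": 0.92}, {"gore": 0.95}))   # → []
```

Whatever model is used, keeping the cultural judgment in user-editable thresholds (rather than baked into the model) is what makes the feature tolerable across audiences.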

(I also suspect that some people may not want to blur NSF* content at all, but rather “hide” or “archive” it away from public views altogether.)

That’s absolutely good enough to start with. Can always be more elaborate later if needed.

That probably depends on the particular reason the content is sensitive. E.g., explicit material with adult themes may not be in PhotoStructure at all, due to the chance of children stumbling across it even if hidden/archived. But medical images with gore, blood, stitches, open wounds, etc., would probably stay available to all, just censored by default for individuals who feel faint at the sight of blood or simply don’t like it.