Mirror of https://github.com/photoprism/photoprism.git (synced 2026-03-02 22:57:18 -05:00)
NSFW filter not moving files to private #1721
Originally created by @kloknibor on GitHub (Feb 25, 2023).
1. What is not working as documented?
According to the docs, when the flag `PHOTOPRISM_DETECT_NSFW=true` is enabled, pictures should be checked for NSFW content and, when detected, moved to the Private folder. This does not seem to work.
2. How can we reproduce it?
Fresh install of PhotoPrism.
Make sure `PHOTOPRISM_UPLOAD_NSFW=true`
and `PHOTOPRISM_DETECT_NSFW=true`.
Get five clearly NSFW pictures from a Google search and upload them.
All files will appear in search results and not in the Private folder.
Funnily enough, when `PHOTOPRISM_UPLOAD_NSFW=false`, the same pictures are blocked from uploading, so the NSFW model does seem to classify them correctly.
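For reference, the two settings from the steps above can be combined in a docker-compose service block like this (a sketch only; image tag, ports, and volumes are placeholders to adapt to your own setup):

```yaml
services:
  photoprism:
    image: photoprism/photoprism:latest
    environment:
      # Allow NSFW uploads, but expect detection to flag them as private:
      PHOTOPRISM_UPLOAD_NSFW: "true"
      PHOTOPRISM_DETECT_NSFW: "true"
```

With this combination, uploads are never blocked, so any failure to mark files as private can be attributed to the detection step alone.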
3. What behavior do you expect?
I expect the test images with clear NSFW content (I have 5 pictures; the upload filter does stop them from uploading when enabled) to be moved to the Private folder automatically.
4. What could be the cause of your problem?
The NSFW detection is not kicking in, although some log messages show it is active. As a result, NSFW images are not being marked as private.
5. Can you provide us with example files for testing, error logs, or screenshots?
nsfw_test.txt
6. Which software versions do you use?
(a) PhotoPrism Architecture & Build Number: AMD64, ARM64, ARMv7,...
AMD64
(b) Database Type & Version: MariaDB, MySQL, SQLite,...
Tried with MariaDB 10 and SQLite
(c) Operating System Types & Versions: Linux, Windows, Android,...
Debian LXC and Docker on a Synology NAS
(d) Browser Types & Versions: Firefox, Chrome, Safari on iPhone,...
Brave, Chrome
(e) Ad Blockers, Browser Plugins, and/or Firewall Software?
none
Build [221118-e58fee0fb] - Running from Docker on a Synology NAS (DS920+) with MariaDB 10.5 in Docker
Build [221118-e58fee0fb] - Running from an LXC on a NUC8 with MariaDB 10.5 in Docker/SQLite - used https://github.com/tteck/Proxmox for the install
7. On what kind of device is PhotoPrism installed?
(a) Device / Processor Type: Raspberry Pi 4, Intel Core i7-3770, AMD Ryzen 7 3800X,...
AMD64
(b) Physical Memory & Swap Space in GB
8 GB RAM, DS920+ - Synology NAS
4 GB RAM, 8 cores - NUC
(c) Storage Type: HDD, SSD, RAID, USB, Network Storage,...
HDD - Synology NAS
NVME SSD - NUC
(d) Anything else that might be helpful to know?
no
8. Do you use a Reverse Proxy, Firewall, VPN, or CDN?
no
@lastzero commented on GitHub (Feb 27, 2023):
Have these test images already been indexed? What file format? This will only flag new photos, so you can remove the flag after indexing without having it reset all the time.
@kloknibor commented on GitHub (Mar 8, 2023):
I just checked; they seem to be done indexing. Three were JPEGs, one was a JFIF, and one was a WebP.
@kloknibor commented on GitHub (Mar 8, 2023):
Is there possibly a test picture for this AI, to make sure the problem isn't the test pictures I used? I'm not sure I can share mine due to copyright.
@lastzero commented on GitHub (Mar 8, 2023):
https://github.com/photoprism/photoprism/tree/develop/internal/nsfw/testdata
@kloknibor commented on GitHub (Mar 12, 2023):
I have checked the test library, and the hentai sample is successfully detected even when I make small changes to it. However, I have tried a lot of other test pictures (hentai and real content) from a Google search, and none are detected. So detection is quite limited, but that is acceptable, since there is already a warning about this.
However, how is it possible that upload detection (PHOTOPRISM_UPLOAD_NSFW) seems to function better than the move-to-private detection (PHOTOPRISM_DETECT_NSFW)? I just checked: the upload filter detects 50% of the images correctly (set of 12), versus 1 for the move to private.
Separate question: can we train the model ourselves to improve it?
@lastzero commented on GitHub (Mar 12, 2023):
We use different thresholds depending on the use case. Blocking all pictures that could contain pr0n is easy: simply block everything. You see the problem?
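To illustrate the point about per-use-case thresholds: blocking uploads can afford a low (aggressive) cutoff, while flagging already-indexed files as private wants a high (conservative) one to avoid false positives. The sketch below is not PhotoPrism's actual code; the `Scores` struct, the label names, and the threshold values are all made up for illustration.

```go
package main

import "fmt"

// Scores holds hypothetical per-class confidences from an NSFW
// classifier (label names follow the common nsfw_model convention;
// this is an assumption, not PhotoPrism's API).
type Scores struct {
	Drawing, Hentai, Neutral, Porn, Sexy float32
}

// maxUnsafe returns the highest confidence among the unsafe classes.
func maxUnsafe(s Scores) float32 {
	m := s.Hentai
	for _, v := range []float32{s.Porn, s.Sexy, s.Drawing} {
		if v > m {
			m = v
		}
	}
	return m
}

// Two illustrative thresholds (values invented for this sketch):
const (
	uploadThreshold  = 0.25 // low: err on the side of blocking uploads
	privateThreshold = 0.75 // high: err on the side of keeping files visible
)

func main() {
	// A borderline picture: moderately "sexy", mostly neutral.
	s := Scores{Neutral: 0.40, Sexy: 0.45, Porn: 0.10}
	fmt.Println("block upload:", maxUnsafe(s) > uploadThreshold)  // true
	fmt.Println("mark private:", maxUnsafe(s) > privateThreshold) // false
}
```

A borderline image like this one trips the strict upload filter but not the lenient private filter, which matches the behavior reported above: far more images blocked at upload than moved to private.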
@kloknibor commented on GitHub (Mar 16, 2023):
I understand. Is there any way to use the same threshold as the NSFW upload filter? Or, even better, to set your own percentage thresholds? Or is everything hardcoded? I couldn't find anything in the docs about it. Thanks for the help, by the way!
@lastzero commented on GitHub (Mar 16, 2023):
It's a variable in our code that you cannot change through the config right now. This is the first time someone has asked for this.
@Menethoran commented on GitHub (Apr 25, 2024):
Since this thread is still open, I would like to add that I would also find it beneficial to be able to set a stricter threshold for what is and is not considered NSFW.