Deduplication: Not found or no asset.delete access #4516

Closed
opened 2026-02-20 03:06:29 -05:00 by deekerman · 33 comments

Originally created by @shad00m on GitHub (Dec 20, 2024).

The bug

When trying to deduplicate all my pictures in the GUI, it gives me the error "Not found or no asset.delete access".

Trying to deduplicate them one by one works, and it can move them to the trash just fine. There's also no problem cleaning the trash.
All my photos are in an external library, and this library has the same permissions set on all files and folders in DSM.
I assume 154,637 duplicates is too many for it, or there's a problem with "some" of the duplicates. How can I find which duplicate is causing this error?

The OS that Immich Server is running on

Synology DSM via Portainer

Version of Immich Server

v1.123.0

Version of Immich Mobile App

none

Platform with the issue

  • [x] Server
  • [x] Web
  • [ ] Mobile

Your docker-compose.yml content

name: immich

services:
  immich-server:
    container_name: immich_server
    image: ghcr.io/immich-app/immich-server:${IMMICH_VERSION:-release}
    # extends:
    #   file: hwaccel.transcoding.yml
    #   service: cpu # set to one of [nvenc, quicksync, rkmpp, vaapi, vaapi-wsl] for accelerated transcoding
    volumes:
      # Do not edit the next line. If you want to change the media storage location on your system, edit the value of UPLOAD_LOCATION in the .env file
      - ${UPLOAD_LOCATION}:/usr/src/app/upload
      - /etc/localtime:/etc/localtime:ro
      - /volume1/homes/shadoom/Photos:/home/photos
    env_file:
      - stack.env
    ports:
      - '2283:2283'
    depends_on:
      - immich_redis
      - database
    restart: always
    healthcheck:
      disable: false

  immich-machine-learning:
    container_name: immich_machine_learning
    # For hardware acceleration, add one of -[armnn, cuda, openvino] to the image tag.
    # Example tag: ${IMMICH_VERSION:-release}-cuda
    image: ghcr.io/immich-app/immich-machine-learning:${IMMICH_VERSION:-release}
    # extends: # uncomment this section for hardware acceleration - see https://immich.app/docs/features/ml-hardware-acceleration
    #   file: hwaccel.ml.yml
    #   service: cpu # set to one of [armnn, cuda, openvino, openvino-wsl] for accelerated inference - use the `-wsl` version for WSL2 where applicable
    volumes:
      - /volume2/docker/immich/cache:/cache
    env_file:
      - stack.env
    restart: always
    healthcheck:
      disable: false

  immich_redis:
    container_name: immich_redis
    image: docker.io/redis:6.2-alpine@sha256:eaba718fecd1196d88533de7ba49bf903ad33664a92debb24660a922ecd9cac8

    healthcheck:
      test: redis-cli ping || exit 1
    restart: always

  database:
    container_name: immich_postgres
    image: docker.io/tensorchord/pgvecto-rs:pg14-v0.2.0@sha256:90724186f0a3517cf6914295b5ab410db9ce23190a2d9d0b9dd6463e3fa298f0
    environment:
      POSTGRES_PASSWORD: ${DB_PASSWORD}
      POSTGRES_USER: ${DB_USERNAME}
      POSTGRES_DB: ${DB_DATABASE_NAME}
      POSTGRES_INITDB_ARGS: '--data-checksums'
    volumes:
      # Do not edit the next line. If you want to change the database storage location on your system, edit the value of DB_DATA_LOCATION in the .env file
      - ${DB_DATA_LOCATION}:/var/lib/postgresql/data
    command: >-
      postgres
      -c shared_preload_libraries=vectors.so
      -c 'search_path="$$user", public, vectors'
      -c logging_collector=on
      -c max_wal_size=2GB
      -c shared_buffers=512MB
      -c wal_compression=on
    restart: always

Your .env content

UPLOAD_LOCATION=/volume2/docker/immich/upload
DB_DATA_LOCATION=/volume2/docker/immich/db
IMMICH_VERSION=release
DB_PASSWORD=dbpassword
DB_USERNAME=postgres
DB_DATABASE_NAME=immich
REDIS_HOSTNAME=immich_redis

Reproduction steps

  1. Review Duplicates

  2. Click "Deduplicate All"

  3. Confirm
    image

  4. Top right corner shows this error:
    image

...

Relevant log output

[Nest] 17  - 12/20/2024, 12:10:48 PM   DEBUG [Api:LoggingInterceptor~52hrbm3m] DELETE /api/assets 204 3120.61ms ::ffff:172.27.0.1
[Nest] 17  - 12/20/2024, 12:10:48 PM VERBOSE [Api:LoggingInterceptor~52hrbm3m] {"ids":["46d68ba2-257b-4b9d-b66d-6bf8067854b5","313bc638-4f07-4322-a3e1-6b6ab63c650a","2f92f43d-6e50-4bc8-8572-5cfa36f76a1f","3f6ff897-8377-4d0a-84c7-a48faa29de08","ccd3e0a9-bab3-4ddc-9c0e-c5be3dc8ed8f","a25be367-cbe3-4801-85ae-860b056fe426","c891967f-ca26-4829-9a68-ee51ef2c8e94","3daf6043-e213-4aec-aaa8-bc43d1b0fded","6ba23e1e-795c-4abc-acd4-b570dab06c94","a1148fb2-4279-43df-b045-0fdc30df02e2","2c8cdb68-2b38-416d-8e56-2eb67473ceaf","a58b08dc-17a0-45f2-92f9-aec433bf8d05","25c2be32-8ca4-4bfb-8ab6-d7fe250654c6","877e849d-b8ee-41c9-b1dc-740d7f452aff","5dfc9fb6-5ca2-41f8-8808-4553dbeebb48","caaf409e-f6d6-4e80-825a-a19ee85b6f68","83fa2be9-dc53-41f6-96d1-81444be77ca4","f4d5de3f-f46b-4e0b-b09e-a8391fe85934","39ad0e6f-dc20-4ce0-b486-2163defa8e56","e208a51e-8af5-47fd-a8df-7a8a0a1bae19","d7a8425b-443c-46da-9b19-4096dc15d61c","17087e6f-17b7-4e3c-8910-24f7643340f0","4096ca51-d22c-4ae1-bef6-691a9ca0d2a5","63a8436e-ec47-4b83-bf3f-9a2cae800953","4bd3ae2a-a936-4426-856e-ee1e024a63c5","9d27b9ce-a40c-427a-9d00-46582f357a30","b405e88f-ec9b-4cad-ad21-45944a9595aa","580c2b04-57ff-4afa-b6a2-11172a36c973","b0cc779b-49b4-40d4-ba7a-c13320c2ae8d","0d2b2504-1f45-4ac0-bb28-9153d1e55f65","61f782e4-2fac-4c10-8cc4-9c1a521cdf8e","80645d58-641f-43ec-bf1f-273529add736","0264db4a-38bb-42d3-a8b8-ec61f60db22a","d1a9469b-79c1-4946-b9a4-c7c55367a208","fad98ac3-4676-4d9d-a6eb-1858e459c062","31b6d2f2-c121-4afb-9558-398fc09e3750","60ba97c3-309c-4794-bcae-f8dd1830722b","01fb2101-41f5-4ac0-8a6f-1df50262395c","2e9b158a-99cc-4fd5-9983-f0ade1a0849d","f0a282df-26d5-4036-a74b-d18e850dccf3","b34dda89-60ce-499a-b26b-ec1c2cd17906","86b2f26a-d368-4ee9-ae48-684cdb269288","5dfcdbe2-83ae-4a38-a3b9-468c7f869769","9414eee5-28e5-4831-985a-9e041eb721ff","49f4de11-313a-44a1-8176-34bfced30dfb","2e03f81f-0de7-4f04-8011-1a11b7076bcd","70e14285-73cd-4099-bca8-3059855e1e62","a57e4809-3c61-4539-bb5a-748fd3d2441d","fbdfa59a-2773-4a53-9901-422d0fa462bf","b89f075d-e127-481a-8098-0b9133c36df5","43b4e984-104e-41c3-878e-dd6571e71bf4","26938466-2f41-4d90-8b41-598668151a27","9bf4b23f-cc97-44ae-879b-727821f59d8e","1b6d0dda-6d57-4c29-b702-fef29f70ace9","1050890a-67b8-4e5e-9ede-dad3c0663d85","4895fce1-57cf-4ede-b863-dd58a291f61b","94c185ab-ce0f-4604-96fd-9f05e206e49a","a1fbcd7e-39f2-47f0-ba1f-af106c149123","8dd8d079-e9ab-4fd5-bf8f-a7b1134175d8","3119ab6b-1c77-4e78-9fab-360b246ba6a7","29ebab94-fa5a-45ee-88e9-cdd4bffa9c42","e1ea3d77-90c1-43bd-b914-ff455af595f5","83a8db03-2cba-46ed-882e-77d063149a76","bd17fa44-0e7f-49b2-aaef-e5ce7696ba8a","97bb3e84-15c0-49e5-a761-5ca56129ad61","6309034b-7304-42f4-9f10-2a6d85019f05","5500027b-48a2-4bc5-869f-ff209eea6190","1014ee7c-a695-4ee0-b069-7824f6676c6f","aac4e5f2-b187-4386-9002-68535989e96d","54ca49ac-e894-4c39-923d-f5df0449a5d8","47782279-6c2d-46d0-bb1c-29880ac8b3b7","68824585-98c3-4afb-93ff-46f3c4cbf6ab","ff6fd083-5ea7-4c59-9496-c56db5a59ecf","d4f71d8d-04a7-458a-944e-13b6e0671a2c","74be561f-14d8-419d-b49e-fd67d1371129","9a48b898-61f6-47c0-a127-c3e951da7cb2","6dc246b1-04f3-403e-b672-c269cad6c639","ff656157-a2be-44ce-996d-0e377ce156eb","7a3440a5-6de0-4575-a20c-37b28201a4db","5ebd07cf-be02-4eaa-890c-d5efd24796fc","5ede83d4-dd0f-4bdd-9197-abe2e6ae9590","429179c2-8a73-4f0a-84a5-a3fccb1336ed","881968ca-88ce-4733-a790-0207e10b3ba9","43842535-3727-4437-9617-3256609172d2","58783f5c-90c8-41cf-9d4a-9a08ce3822ad","eaee754c-296a-46f2-ad59-dc6b10820882","866aac07-5283-46a3-a254-8a58f449bdcb","36a12f0a-e7cc-4558-bab8-d3e03f901215","3a0a3163-ed0d-4e70-83dd-63501f14c15
8","1dda6f53-8b56-4b64-98de-96be938336ef","9a83fbd3-058a-47cf-aa99-7ee3e0901ee4","f3ae5dc3-c756-4bab-bf8c-029afd313c0b","fc4c46bc-de7d-443b-b74c-a62d15fbf0cd","26c4aa96-6b92-49c0-aba7-f286a7c2b834","da042d54-2881-48fc-a482-8b5daba6355a","28d6e8f1-e548-4b44-90cd-f463a68b1fb4","9f02d9aa-bb04-4b26-81c0-87bbe605c73d","a3c794a5-515b-48f6-8811-2d6664cbb82a","d3131c64-88b7-4d71-8b06-a9640e14107a","ff593132-9db4-41c8-b14f-e84d92baa699","...and 154637 more"],"force":false}
[Nest] 17  - 12/20/2024, 12:10:48 PM   DEBUG [Api:GlobalExceptionFilter~52hrbm3m] HttpException(400): {"message":"Not found or no asset.delete access","error":"Bad Request","statusCode":400}
[Nest] 17  - 12/20/2024, 12:11:04 PM   DEBUG [Api:LoggingInterceptor~1xlr9aph] GET /api/server/ping 200 0.22ms ::ffff:127.0.0.1

Additional information

image


@ASupinski commented on GitHub (Dec 22, 2024):

Identical issue for me, also a new setup running on the latest recommended Docker image with a single giant external library. I'm also able to remove duplicates one at a time; same error in the logs and in the browser (Firefox). Happy to provide any additional information.


@monocycler commented on GitHub (Dec 31, 2024):

I have the same issue, with 83,288 duplicates. I am using v1.123.0 with no external library.

My photos are mounted like this:

user@tcclxc4docker ~/immich-app$ df /home/user/immich-app/photoslibrary
Filesystem           1K-blocks       Used  Available Use% Mounted on
qpool/photoslibrary 2941178112 1787351808 1153826304  61% /home/user/immich-app/photoslibrary

docker-compose.yml:

name: immich

services:
  immich-server:
    container_name: immich_server
    image: ghcr.io/immich-app/immich-server:${IMMICH_VERSION:-release}
    volumes:
      - ${UPLOAD_LOCATION}:/usr/src/app/upload
      - /etc/localtime:/etc/localtime:ro
    env_file:
      - .env
    ports:
      - 2283:2283 # was 3001 in 1.117.0
    depends_on:
      - redis
      - database
    restart: always

/home/user/immich-app/.env:

UPLOAD_LOCATION=./photoslibrary
IMMICH_VERSION=release
DB_PASSWORD=dbpassword
DB_HOSTNAME=immich_postgres
DB_USERNAME=postgres
DB_DATABASE_NAME=immich

image

Chrome says:

DELETE http://192.168.1.200:2283/api/assets 400 (Bad Request)

handle-error.Bpe9OZ77.js:1 [handleError]: Unable to resolve duplicate dt: Error: 400
at Object.it [as ok] (http://192.168.1.200:2283/_app/immutable/chunks/fetch-client.DUPjmxtR.js:1:7428)
at async http://192.168.1.200:2283/_app/immutable/nodes/27.FsQ_Vtz6.js:2:11207
at async p (http://192.168.1.200:2283/_app/immutable/nodes/27.FsQ_Vtz6.js:2:10135) Error: Error: 400
at Object.it [as ok] (http://192.168.1.200:2283/_app/immutable/chunks/fetch-client.DUPjmxtR.js:1:7428)
at async http://192.168.1.200:2283/_app/immutable/nodes/27.FsQ_Vtz6.js:2:11207
at async p (http://192.168.1.200:2283/_app/immutable/nodes/27.FsQ_Vtz6.js:2:10135)


@wtrdk commented on GitHub (Jan 6, 2025):

I experience the same. Deduplication works, but when I want to delete more than one search result, I get this error. I can only delete one search result at a time.


@Dantheman61 commented on GitHub (Jan 6, 2025):

Same error with 98,318 duplicates.


@DoctorFranky commented on GitHub (Jan 27, 2025):

Same error with 71,469 duplicates across 3 external libraries.
I can delete them one by one without any problems.


@blackfox33 commented on GitHub (Jan 30, 2025):

Same error for me with 208,058 duplicates.
I run Immich on Unraid v7.0.0.
Immich version is 1.125.7.
I can delete single duplicates, but batch delete does not work.


@Eragon277 commented on GitHub (Feb 10, 2025):

Same issue for me with 177,493 duplicates on one external library.
Version is 1.126.1.


@StyleSnap commented on GitHub (Feb 17, 2025):

Same issue for me with 83,865 duplicates.
I run Immich in Docker inside a Proxmox LXC container.


@esagheer commented on GitHub (Mar 5, 2025):

I have the same issue with 42,113 duplicates.
Maybe not related to this bug, but these were just imported from my Google Photos. I'm not sure why I have duplicates at all; I remember the import reporting that duplicates were not actually uploaded, so I am confused.

When I go back to the deduplication utility, it says no duplicates were found. They are now in the trash.


@endotronic commented on GitHub (Mar 21, 2025):

Same issue here, same environment (external libraries with duplicates). I collected logs on both frontend and backend in case anything was missing from this issue, but I have nothing to add that isn't already here.

[Nest] 17  - 03/21/2025, 7:35:17 PM   DEBUG [Api:LoggingInterceptor~qaaem79s] DELETE /api/assets 204 489.26ms 192.168.107.105
[Nest] 17  - 03/21/2025, 7:35:17 PM VERBOSE [Api:LoggingInterceptor~qaaem79s] {"ids":["..and 69856 more"],"force":false}
[Nest] 17  - 03/21/2025, 7:35:17 PM   DEBUG [Api:GlobalExceptionFilter~qaaem79s] HttpException(400): {"message":"Not found or no asset.delete access","error":"Bad Request","statusCode":400}
[Nest] 17  - 03/21/2025, 7:35:26 PM   DEBUG [Api:LoggingInterceptor~qwusf6rx] GET /api/server/ping 200 0.44ms ::ffff:127.0.0.1

@WitnessMee commented on GitHub (Mar 28, 2025):

I get the "Not found or no asset.delete access" 400 error when trying to directly delete at /api/assets with the API

When I do a get at /api/assets/7b33008d-fa2c-4062-a63c-b5c0a6455941 it returns the asset.

It didn't work because the asset had a different owner id than the the owner linked to the API key. So I was not allowed to delete duplicate assets of my partner.
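
One way to check for this owner mismatch from the command line (a minimal sketch; the x-api-key header and the /api/users/me endpoint are assumptions based on the Immich API, and the host and asset id are placeholders):

# Compare the asset's ownerId with the user id the API key belongs to.
curl -s -H "x-api-key: $IMMICH_API_KEY" \
  http://your-immich-host:2283/api/assets/7b33008d-fa2c-4062-a63c-b5c0a6455941 | jq '.ownerId'
curl -s -H "x-api-key: $IMMICH_API_KEY" \
  http://your-immich-host:2283/api/users/me | jq '.id'
# If the two ids differ, that asset cannot be deleted with this key, matching the 400 error above.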


@hendkai commented on GitHub (Apr 10, 2025):

I have the same problem, but I have only one user and I'm the admin. So is there a workaround currently? It worked with an older Immich version; I can't tell which version it was.


@snachodog commented on GitHub (May 8, 2025):

I am also getting the Not found or no asset.delete access (Immich Server Error) error on v1.132.3. I'm the only user.


@alextran1502 commented on GitHub (May 8, 2025):

The issue should be fixed on main, and will be available in the next release


@monocycler commented on GitHub (Jun 7, 2025):

The error persists for me on v1.134.0


@wtrdk commented on GitHub (Jun 7, 2025):

For me as well.


@monocycler commented on GitHub (Jun 7, 2025):

I found a workaround that can get you out of this situation. See https://github.com/immich-app/immich/issues/14562

"Try setting the duplicate detection threshold in the ML settings to 0.001, save, then run duplicate detection on all assets. The idea is that a stricter threshold means fewer detected duplicates. You can raise it once you're done with those."

I went from 0.1 (fail) to 0.001 (success) to 0.005 (success) to 0.01 (success). I succeeded in freeing 130GB.
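
For reference, the current duplicate-detection settings can also be read over the API (a rough sketch; the /api/system-config endpoint, the x-api-key header, and the machineLearning.duplicateDetection key path are assumptions that may differ between Immich versions):

# Inspect the duplicate-detection config, including the detection threshold (adjust host and key).
curl -s -H "x-api-key: $IMMICH_API_KEY" http://your-immich-host:2283/api/system-config \
  | jq '.machineLearning.duplicateDetection'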


@BudroidWV commented on GitHub (Jun 13, 2025):

This is a workaround, not a fix. On Ubuntu, I opened the Duplicates utility in a browser, then opened a terminal. Position the terminal so you can see the Trash button for Immich clearly in the browser window. In the terminal, type

sudo apt install xdotool

Once xdotool is installed, type in a command similar to this BUT DON'T HIT ENTER YET

xdotool click --repeat 1000 --delay 250 1

Adjust the repeat count to however many deduplication clicks you need, then move the mouse pointer to hover over the Trash button. Press Enter and your mouse will click that many times. Let it run until done. Depending on your count, you may want to start it when you can let the system run for a while without interaction. Worked well for me.
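
Put together, the steps above amount to roughly this (a sketch of the approach described in the comment; the repeat count and delay are the example values given there and should be tuned to your duplicate count):

sudo apt install xdotool                    # install the auto-clicker
# Hover the mouse pointer over the Trash button in the Duplicates view, then run:
xdotool click --repeat 1000 --delay 250 1   # 1000 left-clicks (button 1), 250 ms apart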


@bt4y1or commented on GitHub (Jun 14, 2025):

I was having the same issue. Do you have photos in the Locked Folder? After I removed them and tried again, the duplicate mass delete worked as it should.


@BudroidWV commented on GitHub (Jun 14, 2025):

I don't have anything in the Locked folder. I had never set it up and thought maybe that was the issue, but I still get the same error after setting up the Locked folder. Thanks for the idea to check.


@BudroidWV commented on GitHub (Jun 14, 2025):

Well, I stand corrected. I did receive an error after I set up the Locked Folder and went back to Deduplicate All. However, when I refreshed the page, there were no more duplicates. Thanks again!


@Lakenheathen commented on GitHub (Jun 15, 2025):

Wanted to comment: for me it was due to locked folder contents. Thanks!


@blackfox33 commented on GitHub (Jun 15, 2025):

For me the error persists in v1.134.0.
I checked the locked folder; it wasn't even set up until now, and after the initial setup it was empty.
Same error again when trying to mass deduplicate.


@wernerno commented on GitHub (Jun 15, 2025):

I found a workaround:
First unlock the locked folder and then immediately delete all duplicates. That's what worked for me.


@kvvoff commented on GitHub (Jul 13, 2025):

I found a workaround: First unlock the locked folder and then immediately delete all duplicates. That's what worked for me.

Works for me on v1.134.0. Thanks.
Before unlocking the locked folder, duplicate thumbnails were not loaded. After unlocking, they became visible and deletion worked. This clearly needs to be fixed.


@chchia commented on GitHub (Jul 20, 2025):

I am not using the locked folder and none of my photos are locked, but I still have this problem when I try to deduplicate about 110k photos.

I am using the latest main image published to Docker yesterday.


@kresbeatz commented on GitHub (Aug 16, 2025):

Same error for me. I'm also not using the locked folder (although I created it with a PIN, tried putting a couple of photos inside, then removed them again; no change), trying to deduplicate 66,000 photos. Using the latest Docker image (v1.138.0) with the latest TrueNAS 25.04.


@thomaslr commented on GitHub (Aug 24, 2025):

Same error here too. The locked folder trick mentioned earlier didn't help; I still get the "Not found or no asset.delete access" Immich server error.


@estebanpapp commented on GitHub (Aug 25, 2025):

Hit the same issue on an instance with >55k duplicates

Was able to work around the problem with the following script (in Chrome, go to the three vertical dots in the top right, More Tools -> Developer Tools, paste it into the Console, and hit Enter):

// Selector that appears to match the red Trash button in the Duplicates view (class list may change between versions).
const buttonSelector = 'button.ring-offset-background.focus-visible\\:ring-ring.flex.items-center.justify-center.gap-1.font-medium.whitespace-nowrap.transition-colors.focus-visible\\:ring-2.focus-visible\\:ring-offset-2.focus-visible\\:outline-none.px-4.py-2.text-sm.rounded-lg.bg-danger.text-light.hover\\:bg-danger\\/80.rounded-s-full';
let button = document.querySelector(buttonSelector);
// Keep clicking until no matching button is left, pausing briefly between clicks.
while (button) {
  button.click();
  await new Promise(resolve => setTimeout(resolve, 50));
  button = document.querySelector(buttonSelector);
}

It's not ideal and will take a while depending on the number of duplicates. I stopped it at around 33k and was then able to hit Deduplicate All without issues, so you don't need to delete all of them with the script above (to stop it, just refresh the browser).


@thomaslr commented on GitHub (Aug 26, 2025):

My workaround was to use Apple's Automator app to send "Shift+C" keystrokes, maybe 4 per second, after adding a 10-second initial delay so I could switch to the browser window after starting the script in Automator. My duplicate count was over 200k; I will see if I can get it down to around 33k as mentioned above.


@Crypto-Cow commented on GitHub (Aug 29, 2025):

Is this ever going to get officially fixed?


@JarekLB commented on GitHub (Sep 28, 2025):

I'm now having this issue as well, on v1.143.1 with 103k duplicates.


@organom commented on GitHub (Oct 7, 2025):

Still fails with the latest released version, v2.0.1 (with 27k duplicates).
@alextran1502 Any chance of getting this fixed and/or reopening the issue?

In the meantime, thank you @estebanpapp for the workaround; it still works perfectly (I just raised the delay so as not to overload my tiny server).
