Feature Request: Add an option to run "Import using script" scripts concurrently. #9343

Open
opened 2026-02-20 00:12:57 -05:00 by deekerman · 0 comments
Owner

Originally created by @Xenoxis on GitHub (Jan 14, 2026).

Is there an existing issue for this?

  • I have searched the existing open and closed issues

Is your feature request related to a problem? Please describe

My *arrs are running inside Docker containers.

I've made a bash script that sends the inputFile and outputFile paths over TCP to a Node.js TCP server running in a container (on the same Docker network as my *arrs) that uses ffmpeg for transcoding.

The script waits until the remote server closes the socket, checks for potential errors, and then reports the result back to Radarr.

My server implementation has an internal queue that holds each TCP socket until it has been processed by a worker: an ffmpeg process is spawned, and the worker returns regardless of the transcoding result. This is done asynchronously using JS promises.
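For illustration, a minimal sketch of such a promise-based worker queue (all names here are hypothetical and the details certainly differ from the actual server, which additionally holds the TCP socket open until the job settles and spawns ffmpeg instead of an arbitrary work function):

```javascript
// Minimal promise-based worker queue sketch (hypothetical names).
// Jobs are processed one at a time; the promise returned by enqueue()
// settles when a worker has handled the job, whatever the result.
class WorkerQueue {
  constructor(work) {
    this.work = work;     // async (job) => result, e.g. spawn ffmpeg
    this.queue = [];
    this.running = false;
  }

  enqueue(job) {
    return new Promise((resolve, reject) => {
      this.queue.push({ job, resolve, reject });
      this.drain();
    });
  }

  async drain() {
    if (this.running) return;  // a single worker drains the queue
    this.running = true;
    while (this.queue.length > 0) {
      const { job, resolve, reject } = this.queue.shift();
      try {
        resolve(await this.work(job));
      } catch (err) {
        reject(err);
      }
    }
    this.running = false;
  }
}
```

In the server described above, the work function would presumably spawn ffmpeg (e.g. via `child_process.spawn`) and the TCP socket would be closed once the job's promise settles, which is the event the client-side bash script waits for.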

But Radarr (and all *arrs) using "Import using custom script" waits until the script has ended before starting another import, which currently breaks my queue implementation.
Moreover, if the transcode takes a long time, the import delay can become very long, which is not optimal.

Describe the solution you'd like

Add a checkbox (or simply redefine the default behaviour) to allow multiple import scripts to run concurrently, which would fit such implementations perfectly.
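If Radarr did fire import scripts concurrently, the server side could still bound how many transcodes run in parallel. A hypothetical sketch of such a concurrency limiter (not part of the author's implementation, names invented for illustration):

```javascript
// Hypothetical sketch: wrap an async work function so that at most
// `limit` jobs run at once; extra jobs wait in a FIFO queue.
function limitConcurrency(work, limit) {
  let active = 0;
  const waiting = [];
  const next = () => {
    if (active >= limit || waiting.length === 0) return;
    active++;
    const { job, resolve, reject } = waiting.shift();
    work(job)
      .then(resolve, reject)
      .finally(() => { active--; next(); });  // free a slot, pull next job
  };
  return (job) => new Promise((resolve, reject) => {
    waiting.push({ job, resolve, reject });
    next();
  });
}
```

With `limit` set to the number of ffmpeg jobs the host can sustain, concurrent script invocations from Radarr would queue up server-side instead of overloading the machine.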

Describe alternatives you've considered

Another possibility is to give my container access to /var/run/docker.sock, add a Docker client to the container, and rework my worker implementation to run one container per transcode. But that would be a dirtier way to do the same thing, and exposing docker.sock may introduce vulnerabilities.

Anything else?

Nothing, but I can help with the implementation of this.
