How can naming and post-processing options be applied when aria2 is used to download videos concurrently? #12583

Closed
opened 2026-02-21 06:42:08 -05:00 by deekerman · 2 comments

Originally created by @MarkKoz on GitHub (Jan 15, 2018).

What is the purpose of your issue?

  • [ ] Bug report (encountered problems with youtube-dl)
  • [ ] Site support request (request for adding support for a new site)
  • [ ] Feature request (request for a new functionality)
  • [x] Question
  • [ ] Other

Description

I want to download videos concurrently and then have youtube-dl write metadata to the videos and format their titles.

As described in #350, aria2 can be used to download videos concurrently. However, with `--external-downloader`, youtube-dl still hands aria2 only one video at a time (aria2 can split that single video into segments, but videos are not downloaded concurrently). To fix this, the full list of URLs would need to be given to aria2 ahead of time. However, youtube-dl's `-a` can't be combined with aria2's `-i`: youtube-dl would invoke aria2 once per parsed video, and each invocation would download the entire list.

Remarks

aria2 has event hooks that may be of use, particularly `--on-download-complete`. Unfortunately, from my understanding of the documentation, it seems to execute only when the entire download completes, not once per video. Even if it did fire once per video, I don't see how youtube-dl could modify an already-downloaded video, or how the necessary information (e.g. the original link) could be passed back to it so that metadata could be retrieved for the video.

I've observed that youtube-dl can detect when a video has already been downloaded. I assume it does this with a simple file-name check. I could download all the videos with aria2 and then run youtube-dl again with the list of original links. However, because this relies on the file names matching, I unfortunately wouldn't be able to format them.

Possible Solutions

My current solution is a script that starts a new youtube-dl process with `--external-downloader` for every line in a file of URLs. If anyone sees something horribly wrong with this or has a better idea, please do tell.
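A minimal sketch of that per-line approach, assuming a plain-text `urls.txt` with one URL per line (the file name and the concurrency level are my own choices, not from the thread). Each URL gets its own youtube-dl process, so youtube-dl still does per-video naming and post-processing while aria2c handles the actual transfer:

```shell
# Run the given command once per line of urls.txt, up to 4 at a time.
run_all() {
    xargs -P 4 -n 1 "$@" < urls.txt
}

# Real invocation (requires youtube-dl and aria2c on PATH):
# run_all youtube-dl --external-downloader aria2c \
#     --add-metadata -o '%(title)s.%(ext)s'
```

`xargs -P` caps the number of simultaneous processes, which also keeps a long URL list from spawning hundreds of downloads at once.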

Alternatively, I could use simulation options or `--write-info-json` to dump metadata to files. The videos would then be downloaded by a single aria2 process. The aforementioned event hook would execute a program I'd write that parses the dumped metadata and writes it to the videos. This seems more proper than the first solution, but it's also substantially more work.
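A rough sketch of what that hook program could look like, assuming the metadata was dumped beforehand with `youtube-dl --skip-download --write-info-json` next to each video. Per the aria2 documentation, the `--on-download-complete` command is invoked with three arguments: the GID, the number of files, and the path of the first file. The function name and the tagging step are hypothetical:

```shell
# Hook body (hypothetical): aria2 passes GID, file count, first file path.
on_download_complete() {
    path="$3"
    # Look up the .info.json dumped earlier for this file.
    info="${path%.*}.info.json"
    [ -f "$info" ] || return 0
    title=$(python3 -c 'import json,sys; print(json.load(open(sys.argv[1]))["title"])' "$info")
    echo "would tag '$path' with title: $title"
    # A real version would embed the tag instead of printing, e.g.:
    # ffmpeg -i "$path" -c copy -metadata title="$title" "tagged-$path"
}
```

Wired up as `aria2c --on-download-complete=/path/to/hook.sh -i urls.txt`, with the function body as the script, this is one way the dumped metadata could find its way back to the downloaded file.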


@dstftw commented on GitHub (Jan 15, 2018):

So what do you expect to hear? Spawning a youtube-dl process per URL is the most obvious way to parallelize downloads, yes. Alternatively, you handle everything yourself, using youtube-dl only as a URL resolver and metadata provider.


@MarkKoz commented on GitHub (Jan 15, 2018):

I expected to perhaps hear something helpful which I missed. Thanks.
