ytdlp.org

Practical yt-dlp docs: install, commands, fixes, cookies, and workflows.

Workflow guide

How to batch download videos with yt-dlp

Running yt-dlp one URL at a time works until you have ten URLs. Or fifty. Or a list that someone sends you every week. Batch downloading is where yt-dlp stops being a manual tool and starts being a workflow.

Quick answer

yt-dlp -a urls.txt

Put one URL per line in a text file, then pass it with -a (short for --batch-file). That is the standard way to batch download with yt-dlp.

Create the batch file

The batch file is just a plain text file with one URL per line. Blank lines and lines starting with # are ignored, so you can organize and comment your list.

# Conference talks
https://www.youtube.com/watch?v=dQw4w9WgXcQ
https://www.youtube.com/watch?v=jNQXAC9IVRw

# Tutorials
https://www.youtube.com/watch?v=9bZkp7q19f0
https://vimeo.com/148751763

You can mix URLs from different sites in the same file. yt-dlp handles each one with the correct extractor automatically.
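For long lists that accumulate over time, it can help to normalize the file before a big run. The sketch below strips comments and blank lines and drops duplicate URLs with standard shell tools; the sample file stands in for your real urls.txt, and all file names are examples.

```shell
# Sample batch file standing in for your real list (note the
# duplicate URL and the comment line).
cat > urls.txt <<'EOF'
# Conference talks
https://www.youtube.com/watch?v=dQw4w9WgXcQ

https://www.youtube.com/watch?v=dQw4w9WgXcQ
https://vimeo.com/148751763
EOF

# Strip comments and blank lines, then drop duplicate URLs:
grep -vE '^[[:space:]]*(#|$)' urls.txt | sort -u > urls.clean.txt
```

You can then pass the cleaned file with yt-dlp -a urls.clean.txt. Deduplicating is optional since the download archive (covered below) also prevents repeat downloads, but a clean list makes dry runs and reviews easier.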

Pass multiple URLs inline

If you only have a handful of URLs and do not want to create a file, pass them directly as arguments.

yt-dlp "URL1" "URL2" "URL3"

This works fine for a few URLs but becomes unmanageable past five or six. Use a batch file instead.

Skip already-downloaded videos

If you rerun the same batch file regularly, use the download archive to skip videos you already have. This is the single most important flag for repeatable batch workflows.

yt-dlp -a urls.txt --download-archive downloaded.txt

yt-dlp writes the ID of each downloaded video to downloaded.txt. On the next run, it checks the file and skips anything already listed. No duplicates, no wasted bandwidth.
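The archive is itself a plain text file, one entry per line, in the form "extractor id". The sketch below illustrates the format (using the video IDs from the sample list above) and a quick way to check whether a given ID is already recorded; the file name is an example.

```shell
# Illustrative contents of a download archive file: one
# "extractor id" pair per line, written by yt-dlp as it goes.
cat > downloaded.txt <<'EOF'
youtube dQw4w9WgXcQ
youtube jNQXAC9IVRw
EOF

# Quick manual check: is a specific video ID already archived?
grep -q 'dQw4w9WgXcQ' downloaded.txt && echo "already downloaded"
```

Because it is plain text, you can inspect, back up, or edit the archive by hand, for example removing a line to force one video to be re-downloaded.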

Combine with output templates

Batch downloads without good naming turn into a mess fast. Pair the batch file with an output template to keep things organized.

yt-dlp -a urls.txt \
  --download-archive downloaded.txt \
  -o "%(uploader)s/%(title)s.%(ext)s"

This downloads each video into a folder named after the uploader. Combined with the archive flag, you can safely rerun this daily without thinking about it.

Handle failures gracefully

When one URL in a batch fails, yt-dlp continues to the next one by default. You do not need to do anything special. But if you want to limit retries or ignore errors explicitly:

yt-dlp -a urls.txt -i --retries 3

-i (short for --ignore-errors) tells yt-dlp to keep going even when a download or post-processing step fails. --retries 3 sets how many times to retry each failed download before moving on (the default is 10).
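If you want a record of what failed, redirect stderr to a log when you run the batch (yt-dlp -a urls.txt -i 2> errors.log) and filter it afterwards: yt-dlp prefixes failures with "ERROR:". The sketch below uses a fabricated sample log in place of a real run; file names and IDs are illustrative.

```shell
# Sample stderr log standing in for the output of a real batch run.
cat > errors.log <<'EOF'
ERROR: [youtube] abc123: Video unavailable
ERROR: [vimeo] 999: This video does not exist
EOF

# Pull out just the failures for review:
grep '^ERROR' errors.log

# Or count them:
grep -c '^ERROR' errors.log
```

Reviewing the error log after each run tells you which URLs to prune from the batch file, so dead links do not waste retries forever.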

Speed up batch downloads

yt-dlp downloads one video at a time by default. For large batches, you can speed things up with concurrent fragment downloads:

yt-dlp -a urls.txt -N 4

-N 4 (short for --concurrent-fragments) downloads up to 4 fragments of each video in parallel. This helps significantly on sites that serve DASH or HLS streams. Note: this parallelizes fragments within a single video, not across multiple videos.

Common batch download mistakes

  • not using a download archive, then redownloading everything on each run
  • dumping hundreds of files into one flat folder with no output template
  • assuming all URLs in a batch will succeed without adding error handling
  • relying on a daily batch run without realizing yt-dlp has no built-in scheduler, so something external (cron, a task scheduler) has to trigger it
  • not testing the batch with 2-3 URLs first before running the full list
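Since yt-dlp has no scheduler of its own, the usual approach on Linux and macOS is a small wrapper script triggered by cron. The sketch below writes one that combines the flags from this guide; every path and file name is an example assumption to adjust for your setup.

```shell
# Sketch: generate a cron-ready wrapper script. Paths are examples.
cat > run-batch.sh <<'EOF'
#!/bin/sh
set -eu
cd "$HOME/videos"                       # hypothetical download directory
yt-dlp -a urls.txt \
  --download-archive downloaded.txt \
  -o "%(uploader)s/%(title)s.%(ext)s" \
  -i --retries 3 >> batch.log 2>&1
EOF
chmod +x run-batch.sh

# Example crontab entry (runs daily at 03:00):
#   0 3 * * * /path/to/run-batch.sh
```

Appending to batch.log gives you a history to check when a scheduled run misbehaves; pair it with the error-log filtering shown earlier if you want failures separated out.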

When batch files stop being enough

Batch files work for tens of URLs. Once you are dealing with hundreds of URLs, multiple sources updating on different schedules, or need to process the output programmatically, you are building a pipeline, not running a command. That is a different problem with different tools.

Workflow

Turn this into a repeatable workflow

If you are doing this more than once, the real win is not memorizing more flags. It is making the workflow reusable, organized, and less manual. That is where Importly starts making sense.