multifiledownloader: a concurrent and configurable multi-file downloader CLI tool

Keywords: downloader, completion, config, count, progress, statistics, automatic, logging, issue

4 releases (Rust 2024 edition, MIT license, #56 in HTTP client, 29 KB / 611 lines):

0.2.0  May 25, 2025 (latest)
0.1.2  May 24, 2025
0.1.1  May 24, 2025
0.1.0  May 24, 2025

Multi File Downloader


A high-performance, concurrent multi-file downloader written in Rust with progress tracking and error handling.

Features

  • 🚀 Concurrent downloads with a configurable worker count (see the sketch after this list)
  • 📊 Progress bars for individual files and overall progress
  • 🔄 Resume support for partially downloaded files
  • 🗑️ Clean destination directory before downloading
  • 📂 Customizable destination directory (supports tilde expansion)
    • The destination directory is created automatically if it does not exist
  • 🔄 Shell completion script generation (bash, zsh, fish, PowerShell, Elvish)
  • 📊 Human-readable download statistics
  • 🛠️ Robust error handling and logging
  • 📝 dotenv support for configuration
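
To make the concurrency model concrete, here is a minimal sketch (not the crate's actual source) of worker-limited downloads in Rust, using the tokio, reqwest, futures, and anyhow crates; the workers value plays the role of the -w/--workers option:

    // Illustrative only: a worker-limited download loop, not multifiledownloader's source.
    use futures::{stream, StreamExt};
    use std::path::Path;
    use tokio::{fs::File, io::AsyncWriteExt};

    async fn download_one(client: &reqwest::Client, url: &str, dest: &Path) -> anyhow::Result<()> {
        let name = url.rsplit('/').next().unwrap_or("download");
        let mut resp = client.get(url).send().await?.error_for_status()?;
        let mut file = File::create(dest.join(name)).await?;
        // Stream the body chunk by chunk so large files are never fully buffered in memory.
        while let Some(chunk) = resp.chunk().await? {
            file.write_all(&chunk).await?;
        }
        Ok(())
    }

    #[tokio::main]
    async fn main() -> anyhow::Result<()> {
        let urls = ["https://example.com/file1.txt", "https://example.com/file2.txt"];
        let workers = 8; // comparable to -w/--workers
        let client = reqwest::Client::new();
        // buffer_unordered caps how many downloads run at the same time.
        stream::iter(urls)
            .map(|url| download_one(&client, url, Path::new(".")))
            .buffer_unordered(workers)
            .for_each(|res| async {
                if let Err(e) = res {
                    eprintln!("download failed: {e}");
                }
            })
            .await;
        Ok(())
    }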

Usage

Basic Usage

Download multiple files concurrently:

multifiledownloader -w 8 --dest ~/Downloads --urls "https://example.com/file1.txt,https://example.com/file2.txt"

Advanced Usage

  • Specify custom destination directory:

    multifiledownloader -d ~/Downloads/custom-dir -u "url1,url2,url3"
    
  • Clean destination directory before downloading:

    multifiledownloader --clean -u "url1,url2"
    
  • Set custom number of workers:

    multifiledownloader -w 4 -u "url1,url2"
    

Reading URLs from a File

You can read URLs from a file that lists one URL per line:

# Create a file with URLs
$ cat > urls.txt << EOF
https://example.com/file1.txt
https://example.com/file2.txt
EOF

# Download using the file
$ multifiledownloader -w 8 --dest ~/Downloads --urls "$(tr '\n' ',' < urls.txt | sed 's/,$//')"

Shell Completion

Generate shell completion scripts for your shell:

# Bash
multifiledownloader --completion bash | sudo tee /usr/local/etc/bash_completion.d/multifiledownloader

# Zsh (or any other directory in your $fpath)
multifiledownloader --completion zsh | sudo tee /usr/local/share/zsh/site-functions/_multifiledownloader

# Fish
multifiledownloader --completion fish | sudo tee /usr/local/share/fish/vendor_completions.d/multifiledownloader.fish

# PowerShell
multifiledownloader --completion powershell | Out-File -FilePath "$(Split-Path $PROFILE)\multifiledownloader.ps1"

# Elvish
multifiledownloader --completion elvish | tee $HOME/.elvish/completions/multifiledownloader.elv
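
For context, completion scripts like the ones above are commonly produced with the clap_complete crate; below is a hedged sketch of that approach, assuming a clap-based CLI (this may not match the crate's actual code):

    // Illustrative sketch of emitting a completion script with clap + clap_complete;
    // not necessarily how multifiledownloader implements --completion.
    use clap::{CommandFactory, Parser};
    use clap_complete::{generate, Shell};
    use std::io;

    #[derive(Parser)]
    #[command(name = "multifiledownloader")]
    struct Cli {
        /// Shell to generate a completion script for (bash, zsh, fish, powershell, elvish)
        #[arg(long, value_enum)]
        completion: Option<Shell>,
    }

    fn main() {
        let cli = Cli::parse();
        if let Some(shell) = cli.completion {
            let mut cmd = Cli::command();
            // Write the script to stdout so it can be piped into a completions directory.
            generate(shell, &mut cmd, "multifiledownloader", &mut io::stdout());
        }
    }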

Options

Option           Description                                      Default
-w, --workers    Number of concurrent download workers            number of CPU cores
-d, --dest       Destination directory for downloaded files       current directory
-u, --urls       Comma-separated list of URLs to download         required
-c, --clean      Clean destination directory before downloading   false
--completion     Generate shell completion script                 -
-h, --help       Show help message                                -
-V, --version    Show version information                         -
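
For reference, a hedged sketch of how the options above could map onto clap's derive API (illustrative only; the crate's real definitions may differ, and --completion is simplified to a plain string here):

    // Hedged sketch of the option surface from the table above; not the crate's actual CLI definition.
    use clap::Parser;

    #[derive(Parser, Debug)]
    #[command(name = "multifiledownloader", version)]
    struct Cli {
        /// Number of concurrent download workers (default: number of CPU cores)
        #[arg(short = 'w', long, default_value_t = default_workers())]
        workers: usize,

        /// Destination directory for downloaded files
        #[arg(short = 'd', long, default_value = ".")]
        dest: String,

        /// Comma-separated list of URLs to download (required unless --completion is used)
        #[arg(short = 'u', long, value_delimiter = ',')]
        urls: Vec<String>,

        /// Clean destination directory before downloading
        #[arg(short = 'c', long)]
        clean: bool,

        /// Generate a shell completion script (bash, zsh, fish, powershell, elvish)
        #[arg(long)]
        completion: Option<String>,
    }

    fn default_workers() -> usize {
        std::thread::available_parallelism().map(|n| n.get()).unwrap_or(1)
    }

    fn main() {
        println!("{:?}", Cli::parse());
    }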

Installation

Using Cargo

cargo install multifiledownloader

From Source

git clone https://github.com/tralahm/multifiledownloader-rs.git
cd multifiledownloader-rs
cargo build --release
cp target/release/multifiledownloader /usr/local/bin/

Troubleshooting

Common Issues

  1. Permission Errors

    • Ensure you have write permissions to the destination directory
    • Use the --clean flag to remove existing files before downloading
  2. Network Issues

    • Check if URLs are accessible
    • Use fewer workers if experiencing connection timeouts
  3. Progress Bar Issues

    • Progress bars may not display correctly in some terminals
    • Try using a different terminal emulator if experiencing issues

Debugging

To enable debug logging:

RUST_LOG=debug multifiledownloader -u "url1,url2"
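
RUST_LOG is the standard filter variable used by the log/env_logger (and tracing) ecosystem; here is a minimal sketch of that wiring, assuming env_logger (the crate's actual logging setup may differ):

    // Minimal sketch of RUST_LOG-driven logging with the log and env_logger crates;
    // not necessarily multifiledownloader's exact setup.
    fn main() {
        env_logger::init(); // reads the RUST_LOG environment variable
        log::info!("shown when RUST_LOG=info, debug, or trace is set");
        log::debug!("shown only when RUST_LOG=debug or trace is set");
    }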

License

This project is licensed under the MIT License - see the LICENSE file for details.

Dependencies

~30–45 MB, ~752K SLoC