List of links to be downloaded

I have a list of over 100 links. Each line is formatted to be executed from the command prompt, like this:

powershell -c "Invoke-WebRequest -Uri 'https://theswissbay.ch/xy.pdf' -OutFile 'e:/xy.pdf'"

If I run a single line at the command prompt, it works. But when I put all the lines into a .bat file to run them in a row, every request fails with "404 Not Found".
(The same thing happens with PowerShell: a single line works, but running the whole list as a .ps1 script does not.)

How can I use the command prompt or PowerShell to run the whole list?

No need for batch files; it would be best to convert your sites and file names into a CSV (or another format) that PowerShell can consume as objects. CSV is the most straightforward: use Import-Csv in place of the $csv here-string that mocks the import below, then use a loop to enumerate the CSV and process each row. Here is a basic example:

$csv = @'
Site,File
https://theswissbay.ch,xy.pdf
https://foo.com,foo.pdf
'@ | ConvertFrom-Csv

foreach ($row in $csv) {
    # Build the full download URL and the local destination path
    $uri = '{0}/{1}' -f $row.Site, $row.File
    $outfile = 'e:\{0}' -f $row.File

    'Processing {0} and exporting to {1}' -f $uri, $outfile
    #Invoke-WebRequest -Uri $uri -OutFile $outfile
}

Output:

Processing https://theswissbay.ch/xy.pdf and exporting to e:\xy.pdf
Processing https://foo.com/foo.pdf and exporting to e:\foo.pdf
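
When the real list lives in a file, you can swap the here-string mock for Import-Csv; the path below is just an example:

$csv = Import-Csv -Path 'C:\links.csv'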

I would recommend adding try/catch and setting -ErrorAction on the Invoke-WebRequest call so that you can track which specific site is returning a 404 and fix it.
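
A minimal sketch of that error handling, assuming the same $csv layout as in the example above:

foreach ($row in $csv) {
    $uri = '{0}/{1}' -f $row.Site, $row.File
    $outfile = 'e:\{0}' -f $row.File

    try {
        # -ErrorAction Stop turns a failed request (e.g. a 404) into a terminating error
        Invoke-WebRequest -Uri $uri -OutFile $outfile -ErrorAction Stop
    }
    catch {
        # Log which URI failed and why, then carry on with the next row
        Write-Warning ('Failed to download {0}: {1}' -f $uri, $_.Exception.Message)
    }
}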


By the way, I found out why it doesn't work with the batch file, which would of course be the easiest solution: some characters in the links are URL-encoded in the form %hh, and cmd treats % in a .bat file as the start of a variable or argument reference, so those sequences get mangled when the file is read. You would have to convert e.g. every occurrence of %20 (= ASCII 32) within a link into a space, and so on.
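
If you would rather keep the original list of command lines, one sketch of a workaround: in a .bat file a literal % is written as %%, so a one-time conversion of the list should be enough. The file names here are assumptions:

# Double every % so cmd passes sequences like %20 through literally
# (in batch files, %% is the escape for a literal percent sign)
(Get-Content -Path 'C:\links.txt') -replace '%', '%%' |
    Set-Content -Path 'C:\links.bat'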
