How do I download a website from a link?

I need to download the site from the link so that I can open it without internet. The site has photos, videos, GIFs, text, design and scripts. On Windows 10 I used wget and the command “wget -r -l 5000 -k -p -E -nc http://example.com --no-check-certificate”, but on Windows 11 it doesn’t work. I know that there are now two aliases for the Invoke-WebRequest cmdlet in PowerShell: iwr and wget. But I have no idea how to work with them. Please help me. :sob:

This is the familiar place anyone starts from when trying to use something new. I would begin by reading its help in full, then working through the examples.

If you encounter a specific issue or question, feel free to ask for assistance at that point with as many details as you can.
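For example, you can check what wget actually resolves to in your console and pull up the cmdlet’s built-in examples. This is just a quick sketch; note that in PowerShell 7 and later the wget and curl aliases were removed, so Get-Alias may return fewer results there.

# See whether 'wget' resolves to the Invoke-WebRequest alias or to an external wget.exe on your PATH
Get-Command wget

# List the aliases defined for Invoke-WebRequest (iwr, wget and curl in Windows PowerShell 5.1)
Get-Alias -Definition Invoke-WebRequest

# Read the full help, then the worked examples
Get-Help Invoke-WebRequest -Full
Get-Help Invoke-WebRequest -Examples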

1 Like

I downloaded “wget-1.21.4-win64” separately and used the command: wget.exe -w 1 -r -l 1 -k -p -E -nc https://joyreactor.cc/best/26090 -np -o log.txt --save-cookies file.txt --no-cache --no-check-certificate
The .webm and .mp4 files did not download, nor did the comments section. Help! :sob:

this isn’t really a Powershell question since you’re specifically using the wget executable

As a starting point, I took your wget command and ran it through Copilot, asking it to translate it to PowerShell. Here’s what it gave me. I have not tested this at all, so no idea if it works. Just somewhere for you to start.

$webpage = "https://joyreactor.cc/best/26090"
$logFile = "log.txt"
$cookiesFile = "file.txt"

# Download the main page
$response = Invoke-WebRequest -Uri $webpage -UseBasicParsing -SessionVariable session

# Save cookies
$session.Cookies | Export-CliXml -Path $cookiesFile

# Parse the links from the main page
$links = $response.Links | Where-Object { $_.href -match "^https?://" }

# Download each file
foreach ($link in $links) {
    $url = $link.href
    $fileName = [System.IO.Path]::GetFileName($url)
    Start-BitsTransfer -Source $url -Destination $fileName
}

# Log the operation
$response | Out-File -FilePath $logFile

I executed the script you wrote, but I couldn’t figure out where the files were downloaded to.

Bold of you to blindly run PowerShell code off the internet. It’s always a good idea to read through it and understand what it’s doing first.
The line with Start-BitsTransfer has a -Destination parameter that specifies where the data is transferred to.
Running the [System.IO.Path]::GetFileName() method against a test URL like https://google.com returns “google.com”, so I take that to mean it will download to a file in the current directory.
Same thing with the Out-File line at the end: $logFile is defined at the top as just “log.txt” with no path, which means it will be relative to wherever PowerShell’s current directory is.
I would look in the directory you ran the script from, and in the directory that the PowerShell executable lives in.
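If it helps, you can check straight from the same console. Just a sketch: show the current directory (where relative paths like log.txt end up) and list the files most recently written there.

# Show PowerShell's current working directory
Get-Location

# List the most recently written files there, newest first
Get-ChildItem -File | Sort-Object LastWriteTime -Descending | Select-Object -First 10 Name, LastWriteTime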

2 Likes

Agreed, especially since I didn’t write the code - Copilot did. Not sure which I trust less to write good code, a random person on the internet or an AI.

3 Likes

I didn’t find. It’s too complicated. Is there any way to specify the path to save to?

Of course there is. Both lines of code I called out can be modified however you see fit:

# change this variable to point to whatever you want, e.g. c:\users\MaximFox\Desktop\log.txt
$logFile = "log.txt"

And in the loop you can pick whatever you want for the -Destination parameter. However, some experimenting with the code Copilot made shows that the [System.IO.Path]::GetFileName method probably isn’t what you want.
As an example, look what happens here:

PS> [System.IO.Path]::GetFileName('https://joyreactor.cc/best/26090')
26090

Probably not what you would want to call that file.
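If you want to control both the folder and the file names, one way is to rework the loop along these lines. This is only a sketch, untested like the rest: it assumes the $links variable from the script above, $saveDir is a made-up example path, and the fallback naming just tacks .html onto links that have no extension.

# Hypothetical destination folder - change it to wherever you want the files to land
$saveDir = 'C:\Users\MaximFox\Desktop\joyreactor'
if (-not (Test-Path $saveDir)) { New-Item -ItemType Directory -Path $saveDir | Out-Null }

foreach ($link in $links) {
    $url = $link.href

    # Take the last segment of the URL path as the file name
    $fileName = [System.IO.Path]::GetFileName(([uri]$url).AbsolutePath)

    # Fall back to something sensible when the URL ends in / or has no extension
    if ([string]::IsNullOrWhiteSpace($fileName)) { $fileName = 'index' }
    if (-not [System.IO.Path]::GetExtension($fileName)) { $fileName += '.html' }

    Start-BitsTransfer -Source $url -Destination (Join-Path $saveDir $fileName)
}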

Again, I don’t know if PowerShell is the best solution for what you’re trying to accomplish here, and the wget.exe program isn’t PowerShell, so we’re probably not much help to you there.

1 Like

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.