Want to download web URL content

Hi All,

I have one URL and I want to download all of its web content and copy it to a text file, but I am unable to do so. That URL requires a username and password as well, so please guide me on how to do this. Do I need to pass credentials in the script too? If yes, then how? Any insight would be appreciated.

The code I have so far is:

Invoke-WebRequest -Uri abc.com/#/queues -Credential "guest"
$WebClient = New-Object System.Net.WebClient
$WebClient.DownloadFile('abc.com/#/queues/%2F/onp.document.dlq','C:\Users\ankitpar\test.txt')

but when it is executed it does not give the desired result. It is not copying the web page.

Thanks
Ankit

A brief illustration:

$Cred = Get-Credential
$Webclient = New-Object System.Net.WebClient
# WebClient wants a NetworkCredential, so convert the PSCredential first
$Webclient.Credentials = $Cred.GetNetworkCredential()
$Webclient.DownloadFile($Url, $Destination)

Personally I find this stuff a bit easier with the built-in Invoke-WebRequest cmdlet, but either works, really. 🙂
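
Something like this, for instance - treat the URI and output path as placeholders for your own values:

$Cred = Get-Credential
# -Credential supplies the username/password; -OutFile writes the response body to a file
Invoke-WebRequest -Uri 'http://abc.com/#/queues' -Credential $Cred -OutFile 'C:\Users\ankitpar\test.txt'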

Hi Joel,

Thanks for replying, but I have already tried this (found via Google) and didn't get the actual result. The issue is that the web URL has a "get-message" tab, and the result only shows after clicking on that tab. I think I have to trigger that "get-message" tab in the script, and only then will the result come, so do you have any idea how to do that?
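
It looks like this is the RabbitMQ management page (the #/queues address suggests that). If so, my guess is the data behind the "get-message" tab comes from its HTTP API rather than the HTML page itself, so something roughly like the sketch below may be the direction - the API path, vhost, queue name, message count, and body fields here are only my assumptions and can differ between RabbitMQ versions:

$Cred = Get-Credential
# Guessing at the management API's "get messages" endpoint; %2F is the default "/" vhost
$Body = '{"count":5,"ackmode":"ack_requeue_true","encoding":"auto"}'
$Messages = Invoke-RestMethod -Uri 'http://abc.com/api/queues/%2F/onp.document.dlq/get' -Method Post -Credential $Cred -Body $Body -ContentType 'application/json'
$Messages | ConvertTo-Json -Depth 5 | Out-File 'C:\Users\ankitpar\test.txt'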

Thanks in advance
Ankit