Unable to compare website hashes with the script below

Hey Everyone,

I am unable to compare website hashes with the script below. It generates a different hash for the same page every time I run it. Please help.

$url1 = ""
$url2 = ""

$tempFile1 = New-TemporaryFile
$tempFile2 = New-TemporaryFile

Invoke-WebRequest -Uri $url1 -OutFile $tempFile1

Invoke-WebRequest -Uri $url2 -OutFile $tempFile2

$hash1 = Get-FileHash -Path $tempFile1 -Algorithm SHA256
$hash2 = Get-FileHash -Path $tempFile2 -Algorithm SHA256 


if ($hash1.Hash -eq $hash2.Hash) {
    Write-Output "Web pages are the same"
}
else {
    Write-Output "Web pages are different"
}

Remove-Item $tempFile1
Remove-Item $tempFile2

Did you check whether the files really are the same every time you request them? You could use a file comparison tool to check. I'd assume there is a variable part in the website you're checking.
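To illustrate, here is a small sketch that fetches the same page twice and lists only the lines that differ between the two downloads; those lines are the "variable part" making the hashes differ. The URL is a placeholder, substitute the page you're checking:

```powershell
# Sketch: fetch the same page twice and show which lines differ.
# $url is a placeholder - substitute the page you are checking.
$url = "https://example.com/article"

$first  = (Invoke-WebRequest -Uri $url).Content -split "`n"
$second = (Invoke-WebRequest -Uri $url).Content -split "`n"

# Compare-Object emits only the lines that are not identical in both
# downloads, with <= / => indicating which download each line came from.
Compare-Object -ReferenceObject $first -DifferenceObject $second |
    ForEach-Object { "{0} {1}" -f $_.SideIndicator, $_.InputObject }
```

If nothing is printed, the two downloads were byte-for-byte identical and hashing would have worked.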


There are quite a few security features that more and more websites are implementing which would cause this behavior.


CSRF Tokens

These are just a few examples.


You are right. I used a file comparison tool, and both times the content returned from the same webpage was different. Is there another known way to compare the hashes?

Not easily. Are these sites you need to monitor? What exactly is your objective?

So basically, someone on my team follows articles on these websites. Once the articles are modified, they have to read hundreds of them to figure out which one changed. I was trying to create a script that alerts them when content is updated, so they don't need to read all of them every time and can go directly to the one that was updated.

Are they updated internally? If so, a script that reads the page files directly from the server might be more useful, or there may be a built-in alerting mechanism.

These websites are public and all different, so I don't have much information about them. All I can do is create something on my end that tracks changes.

What changes normally, the text? Why not just keep track of each article by URL and word count? If the word count changes by X amount, up or down, consider it changed. That's just one idea. Of course, this goes beyond the general scope of this PowerShell forum.
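A rough sketch of that word-count idea, assuming hypothetical placeholder URLs, a hypothetical state file path, and an arbitrary threshold of 20 words:

```powershell
# Sketch: track word counts per URL and flag articles whose count moved
# by more than a threshold. All paths/URLs/values below are placeholders.
$urls      = @("https://example.com/article1", "https://example.com/article2")
$threshold = 20                                # flag if count moves this much
$statePath = Join-Path $env:TEMP "wordcounts.json"

# Load the counts from the previous run, if any
$previous = @{}
if (Test-Path $statePath) {
    (Get-Content $statePath -Raw | ConvertFrom-Json).PSObject.Properties |
        ForEach-Object { $previous[$_.Name] = $_.Value }
}

$current = @{}
foreach ($url in $urls) {
    # Strip HTML tags crudely, then count the remaining words
    $html  = (Invoke-WebRequest -Uri $url).Content
    $text  = $html -replace '<[^>]+>', ' '
    $count = ($text -split '\s+' | Where-Object { $_ }).Count
    $current[$url] = $count

    if ($previous.ContainsKey($url) -and
        [math]::Abs($count - $previous[$url]) -ge $threshold) {
        Write-Output "CHANGED: $url ($($previous[$url]) -> $count words)"
    }
}

# Save this run's counts for the next comparison
$current | ConvertTo-Json | Set-Content $statePath
```

Scheduling this as a task and having it email the "CHANGED" lines would give you the alerting you described.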


@krzydoug That hadn't come to my mind, digging into the data rather than comparing the whole webpage. So far this looks good and is helping me.

Thanks a lot.

As long as they don't have dynamic content, that should work. If the content is fairly static, you may even get away with just tracking the content length: the tokens and nonces I mentioned should always be the same length, so they wouldn't affect the content length of the page.
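That content-length check could look something like this minimal sketch (the URL is a placeholder):

```powershell
# Sketch: track the byte length of a page instead of its hash.
# $url is a placeholder - substitute the page you are checking.
$url      = "https://example.com/article"
$response = Invoke-WebRequest -Uri $url

# RawContentLength is the size of the response body in bytes; if the only
# variable parts are fixed-length tokens, this should be stable between runs.
$length = $response.RawContentLength
Write-Output "$url -> $length bytes"
```

Storing that number per URL and comparing it on the next run works exactly like the word-count approach above, just cheaper, since you never have to parse the HTML.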