Mike, let me try to explain. MS Exchange Server is my hobby. Strange, I know. I decided to create reporting / alerting for suspicious logons into an on-premises Exchange server. That means logons that occurred from places so far apart that the time between them is shorter than the time needed to travel that far.
I know there are tools for that, but I like to make it on my own. It helps me learn about IIS logs, PowerShell and much more. And it actually works already. It is not a piece of art, but it works. At 9 AM I take the IIS logs from the previous and current day and process them. Thanks to your optimization it takes under 3 hours to process.
What it does:
Reads the log files and stores the lines containing Microsoft-Server-ActiveSync, EWS or OWA into the variable $iislog.
Then I create $shortlog, as I need only a few columns. Some of the conditions are redundant, but they do not hurt:
$shortlog = $iislog | Where-Object {
    ($_.csuseragent -notlike "MicrosoftNinja*") -and
    ($_.scstatus -eq 200) -and
    ($_.csusername -ne "-") -and
    ($_.cip -ne "127.0.0.1") -and
    ($_.cip -ne "::1") -and
    (($_.csuristem -like "*OWA*") -or ($_.csuristem -like "*EWS*") -or ($_.csuristem -like "*Microsoft-Server-ActiveSync*"))
} | Select-Object date, time, csusername, cip
Then I unify the usernames, because users use different account formats: firstname.lastname@domain, firstname.lastname, domain\firstname.lastname, etc. I keep just firstname.lastname in the variable, all in lowercase.
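The normalization step can be sketched like this (the domain names below are just examples, not our real ones):

```powershell
# Sketch of the username normalization (domain names are made-up examples)
function Normalize-Username {
    param([string]$Name)
    $n = $Name.ToLower()
    $n = $n -replace '^.*\\', ''   # strip a leading "domain\" prefix, if present
    $n = $n -replace '@.*$', ''    # strip a trailing "@domain" suffix, if present
    return $n
}

Normalize-Username 'CONTOSO\Jan.Kovar'      # -> jan.kovar
Normalize-Username 'jan.kovar@contoso.com'  # -> jan.kovar
```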
I also replace our internal private IP addresses with our external IP address, because I use a public IP geolocation database to acquire latitude and longitude, and a private address would not work there.
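That replacement is just a pattern match on the private ranges. A sketch (the external address below is a placeholder, not our real one):

```powershell
# Replace RFC 1918 private addresses with the company's external IP (placeholder value)
$externalIp = '203.0.113.10'
$privatePattern = '^(10\.|192\.168\.|172\.(1[6-9]|2[0-9]|3[01])\.)'
foreach ($row in $shortlog) {
    if ($row.cip -match $privatePattern) {
        $row.cip = $externalIp
    }
}
```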
I read the data for previously seen IP addresses from a CSV. If an address from the log is not in the CSV, I send a request to the online database and then store the result in my CSV for next time.
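A hashtable keyed by IP makes that cache lookup cheap. A sketch, where the file name, columns and API call are made up, not my real ones:

```powershell
# Sketch of the geolocation cache (file name, columns and API are made-up examples)
$cacheFile = 'ipcache.csv'
$cache = @{}
if (Test-Path $cacheFile) {
    Import-Csv $cacheFile | ForEach-Object { $cache[$_.cip] = $_ }
}

$ip = '188.244.55.168'   # example address from the log
if (-not $cache.ContainsKey($ip)) {
    # $geo = Invoke-RestMethod "https://geo.example/$ip"   # your real API call goes here
    $geo = [pscustomobject]@{ cip = $ip; lat = 50.08; lon = 14.43; country = 'CZ' }
    $cache[$ip] = $geo
    $geo | Export-Csv $cacheFile -Append -NoTypeInformation
}
```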
I remove all users with just one IP address in the log. That is the part you helped with:
$shortlog = $shortlog |
    Group-Object csusername, cip |
    Group-Object @{e={($_.Name -split ",")[0]}} |
    Where-Object Count -gt 1 |
    Select-Object -ExpandProperty Group |
    Select-Object -ExpandProperty Group |
    Sort-Object csusername, date, time
Then I remove redundant records from the log. If the user connected 5 times in a row from a single address, I just need the first and last occurrence.
$userrecords = $shortlog | Where-Object {$_.csusername -eq $name} | Sort-Object date, time
# Helping variable. This part makes things a lot faster. We do not need to compare all records. If the user connected from the same IP address several times in a row,
# we just need the first and last record. Everything in between can be removed. But that is valid only for a group of connections that was not interrupted by a connection
# from another address. (AAAABBBBAAAACCCCAAAAADDDDD) becomes (AABBAACCAADD)
$stillthesame = 0
# The list of indexes to delete. An ArrayList, so that we can Add() to it.
$collindex = New-Object System.Collections.ArrayList
foreach ($userrecord in $userrecords) {
$ip1 = $userrecord.cip
# We need to know the position of the record in the array so that we can delete it if needed.
$index1 = $userrecords.indexof($userrecord)
# We need to know the IP address in the next record, so that we can compare it. For that we need index + 1
$index2 = $index1 + 1
$ip2 = $userrecords[$index2].cip
# Compare IP addresses
if ($ip1 -eq $ip2) {
# If they are the same, but the variable $stillthesame is 0, it is the first address of the run and we need to keep it. But now we know that we have two same addresses, so we change the variable to 1.
if ($stillthesame -eq 0) {
$stillthesame = 1
}
# If $stillthesame is already 1, then the previous record and the next record have the same IP address, so this record can be deleted. We just add its index to the array list.
# We are going to remove the records later.
else {
$collindex.add($index1) > $null
}
}
# If the addresses were not the same, lets reset the help variable.
else {
$stillthesame = 0
}
}
# We have gathered the indexes of records that can be deleted because they contain redundant addresses. We need to delete them from the end. If we deleted them from the start,
# the positions of the following records would change and we would delete the wrong records. So we sort the indexes in descending order.
$collindex = $collindex | Sort-Object -Descending
# We need to remove records from the array, and that is not possible with a plain array because it has a fixed size. So we convert it to System.Collections.ObjectModel.Collection.
$final = {$userrecords}.Invoke()
# Now we can remove the records on indexes that we gathered.
Write-Host "Removing redundant records for $name"
Get-Date -Format HH:mm:ss
foreach ($index in $collindex) {
$final.removeat($index)
}
And then the main part comes:
I go user by user, and for each user I compare every two of his records. Not unique records, every two records. If the IP addresses are NOT the same, I calculate the distance between those two points. I have the date and time for each record, so I can compute the speed. Then I use the distance and the speed to set the severity.
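The distance itself can be computed with the haversine formula from the latitude and longitude in my CSV. A sketch (the sample coordinates are roughly Prague and Brno):

```powershell
# Great-circle distance in km between two lat/lon points (haversine formula)
function Get-DistanceKm {
    param([double]$Lat1, [double]$Lon1, [double]$Lat2, [double]$Lon2)
    $r = 6371.0                     # Earth radius in km
    $toRad = [math]::PI / 180
    $dLat = ($Lat2 - $Lat1) * $toRad
    $dLon = ($Lon2 - $Lon1) * $toRad
    $a = [math]::Sin($dLat / 2) * [math]::Sin($dLat / 2) +
         [math]::Cos($Lat1 * $toRad) * [math]::Cos($Lat2 * $toRad) *
         [math]::Sin($dLon / 2) * [math]::Sin($dLon / 2)
    2 * $r * [math]::Asin([math]::Sqrt($a))
}

$distance = Get-DistanceKm 50.08 14.43 49.20 16.61   # Prague -> Brno, about 185 km
# $speed = $distance / $howlong                      # km/h, with $howlong in hours
```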
# Anything under a speed of 100 km/h and a distance of 300 km is not important. Increase severity with speed.
if (($speed -gt 100) -and ($distance -gt 300)) {
$sev="Low"
if ($speed -gt 500) {
$sev="Medium"
if ($speed -gt 1000) {
$sev="High"
}
}
# Mobile operators usually report the location as the capital city. So even for a big distance within one country, the severity stays low.
if ($country1 -eq $country2) {
$sev = "Low"
}
And I store the results for suspicious logons.
Back to the main part. What do I do. Let us say, this is the log for my account:
"date";"time";"csusername";"cip"
"2021-01-11";"13:03:14";"jan.kovar";"188.244.55.168"
"2021-01-11";"13:03:50";"jan.kovar";"37.9.192.134"
"2021-01-11";"13:03:55";"jan.kovar";"37.9.192.134"
"2021-01-11";"13:06:46";"jan.kovar";"188.244.55.168"
"2021-01-11";"13:09:50";"jan.kovar";"37.9.192.134"
"2021-01-11";"13:09:55";"jan.kovar";"37.9.192.134"
"2021-01-11";"16:35:36";"jan.kovar";"37.48.0.203"
"2021-01-11";"16:35:45";"jan.kovar";"37.9.192.134"
"2021-01-11";"16:46:04";"jan.kovar";"89.24.45.135"
"2021-01-11";"16:46:16";"jan.kovar";"37.48.0.203"
"2021-01-11";"16:48:56";"jan.kovar";"89.24.45.135"
This is the content of $collection1. I need to compare these rows with each other, but I did not know how else to do it, so I create $collection2, which is the same as $collection1. I have two loops: an outer foreach for $collection1 and an inner foreach for $collection2. I do not want to compare a row with itself, so in each pass of $collection1 I remove the first row from $collection2 with $collection2.RemoveAt(0). Like this:
foreach ($collrow1 in $collection1) {
$ip1 = $collrow1.cip
$date1 = $collrow1.date
$time1 = $collrow1.time
# Creating date + time variable
$dt1 = $date1 + " " + $time1
# For the last record in collection 1, collection 2 would be empty. So let's check that it is not.
if ($null -ne $collection2) {
# Removing the first row from the second collection, because it is the same as the current row from the first one. This way each pair is compared only once: A-B, A-C, B-C.
$collection2.RemoveAt(0)
# Compare all remaining records from collection 2 with selected record from collection 1
foreach ($collrow2 in $collection2) {
$ip2 = $collrow2.cip
$date2 = $collrow2.date
$time2 = $collrow2.time
# Creating date + time variable
$dt2 = $date2 + " " + $time2
# Calculating time difference in hours.
$howlong = (New-TimeSpan -Start $dt1 -End $dt2).totalhours
# Same IP addresses = same location, no reason to calculate the distance. If they are different, let's calculate.
if ($ip1 -ne $ip2) {
"Do the math, calculate distance, speed and severity"
}
}
}
}
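By the way, I think the same A-B, A-C, B-C pairing could be done with two index-based loops over a single collection, so there would be no need to clone and shrink $collection2. A sketch, with sample rows shaped like my log above:

```powershell
# Every pair (i, j) with i < j over one collection; no RemoveAt needed
$collection1 = @(
    [pscustomobject]@{ date = '2021-01-11'; time = '13:03:14'; cip = '188.244.55.168' },
    [pscustomobject]@{ date = '2021-01-11'; time = '13:03:50'; cip = '37.9.192.134' },
    [pscustomobject]@{ date = '2021-01-11'; time = '13:06:46'; cip = '188.244.55.168' }
)
$pairs = for ($i = 0; $i -lt $collection1.Count - 1; $i++) {
    for ($j = $i + 1; $j -lt $collection1.Count; $j++) {
        # Same IP = same location, only different addresses are candidates
        if ($collection1[$i].cip -ne $collection1[$j].cip) {
            $dt1 = $collection1[$i].date + ' ' + $collection1[$i].time
            $dt2 = $collection1[$j].date + ' ' + $collection1[$j].time
            [pscustomobject]@{
                From  = $dt1
                To    = $dt2
                Hours = (New-TimeSpan -Start $dt1 -End $dt2).TotalHours
            }
        }
    }
}
# Two of the three possible pairs have different IPs, so $pairs holds 2 candidates
```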
Does it make sense?
Thank you
Honza