My PowerShell script doesn't throw any errors, nor does it load data into a flat file

Hi all, I have a PowerShell script that works fine from SERVER A. As we are migrating, I moved the script to SERVER B; when I execute it, I do not get any error message and the flat file doesn't load any records.

The Task Scheduler uses this action: F:\PVD_GIS_Extracts\PowerShell_Scripts\DataExtract-Parcel.cmd

The DataExtract-Parcel.cmd code is:

powershell "& ""F:\PVD_GIS_Extracts\PowerShell_Scripts\ParcelExtract_CLOUD-Email.ps1"""

Please advise.

Thanks

What flat file? what records?

Why are you using a .cmd file to run a PowerShell script, rather than running the script directly?

What are the differences between Server A and Server B? Are they the same OS? Are the user accounts migrated, or re-created from scratch? Is this a local AD domain, or Azure, or some other cloud-hosted infrastructure? Are the shared drives/file system paths the same? Are the group policies the same?

Have you reviewed the script for things that will be affected by configuration differences between server A and B? Have you reviewed your ACLs?

I'd say you don't really need the intermediate .cmd file. You can just create a scheduled task that calls powershell with F:\PVD_GIS_Extracts\PowerShell_Scripts\ParcelExtract_CLOUD-Email.ps1 as the argument. Having said that, it doesn't explain why it worked on one machine and not the other. To troubleshoot, log into SERVER B and make sure F:\PVD_GIS_Extracts\PowerShell_Scripts\ParcelExtract_CLOUD-Email.ps1 is a correct path. If so, just try to run it manually. If it works, then the problem is with your scheduled task and not the script (which you didn't post). If it still doesn't work, then the issue is probably with the script itself. If you post it, I may be able to help you troubleshoot it.
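For example, the task action can call powershell.exe with the script passed via -File (the path is taken from your post; -NoProfile and -ExecutionPolicy Bypass are optional extras I'm assuming you can use, not something from your current setup):

Program/script: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
Add arguments:  -NoProfile -ExecutionPolicy Bypass -File "F:\PVD_GIS_Extracts\PowerShell_Scripts\ParcelExtract_CLOUD-Email.ps1"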

[quote quote=272722]What flat file? what records?

Why are you using a .cmd file to run a PowerShell script, rather than running the script directly?

What are the differences between Server A and Server B? Are they the same OS? Are the user accounts migrated, or re-created from scratch? Is this a local AD domain, or Azure, or some other cloud-hosted infrastructure? Are the shared drives/file system paths the same? Are the group policies the same?

Have you reviewed the script for things that will be affected by configuration differences between server A and B? Have you reviewed your ACLs?

[/quote]
Thanks,

Sorry, my apologies. Server A is Windows Server 2008 and Server B is Windows Server 2016. We have been having this issue; the server team and I went through everything and the two servers are identical. Also, this script is one of five PS scripts that were migrated over. The other four scripts have no issues, only this one.

Thanks

[quote quote=272728]I’d say you don’t really need the intermediate .cmd file. You can just create a scheduled task to call powershell with the argument of F:\PVD_GIS_Extracts\PowerShell_Scripts\ParcelExtract_CLOUD-Email.ps1 Having said that, it doesn’t explain why it worked on one machine and not the other. To troubleshoot, log into SERVER B and make sure F:\PVD_GIS_Extracts\PowerShell_Scripts\ParcelExtract_CLOUD-Email.ps1 is a correct path. If so, just try to run it manually. If it works, then the problem is with your scheduled task and not the script (which you didn’t post). If it still doesn’t work, then the issue is probably with the script itself. If you post it, I may be able to help you troubleshoot it.

[/quote]
Thanks. I tried from Task Scheduler and manually, with the same results. I will post the script here.

Thanks

Thanks MikeR. I set the Task Scheduler action to F:\PVD_GIS_Extracts\PowerShell_Scripts\ParcelExtract_CLOUD-Email.ps1 and it's taking forever; usually it takes 3 minutes.

$From = ""
$To = ""
$Cc = ""
#$Attachment = "C:\temp\Some random file.txt"
$SMTPServer = ""
$SMTPPort = "25"
$Subject = "Success - PVD Parcel Data Extract Completed Successfully"
$Body = "PVD Parcel Data Extract CompleSQLPPted Successfully"
$StartTime = Get-Date

function ErrorHandler($error) 
{

$ErrorTime = Get-Date
$Subject = "Failure - PVD Parcel Data Extract Process Started at $StartTime and failed to Complete at $ErrorTime"
$Body = "PVD Parcel Data Extract Process failed to Complete"
Send-MailMessage -From $From -to $To -Cc $Cc -Subject $Subject -Body $Body -SmtpServer $SMTPServer

}

trap { ErrorHandler $_; break }
$ErrorActionPreference = "Stop"

#Prod Setup using Cloud URL
#Setup your credential
$username = ""
$password = "" | ConvertTo-SecureString -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential($username,$password)

#Get a session object to pass to other cmdlet's

#Prod Cloud Session
$session = Get-PVDSession -URL "" -Credential $credential -UseSSL

#$query = "SELECT top(100000) * FROM vd_vw_VendorDB_ParcelGIS;"
#$query = "SELECT * FROM vd_vw_VendorDB_ParcelGIS;"
$query = "SELECT 
       [ParcelNumber]
      ,[AltParcelNumber]
      ,[MapNumber]
      ,case OwnerName when 'NA' then 'Occupant' else OwnerName end ownername
      ,case OwnerName when 'NA' then '' else OwnerName2 end ownername2
      ,case OwnerName when 'NA' then '' else OwnerName3 end ownername3
      ,case OwnerName when 'NA' then '' else OwnerName4 end ownername4
      ,[OwnerAddress]
      ,[OwnerAddress2]
      ,[OwnerCity]
      ,[OwnerState]
      ,[OwnerZip]
      ,[OwnerForeignState]
      ,[OwnerForeignCountry]
      ,[Township]
      ,[LegalDescription]
      ,[Acreage]
      ,[PropertyClass]
      ,[PropertySubClass]
      ,[PropertySubClassDescription]
      ,[AddressNumber]
      ,[AddressNumberFraction]
      ,[AddressDirection]
      ,[AddressStreetName]
      ,[AddressStreetSuffix]
      ,[AddressStreetSuffix2]
      ,[AddressCity]
      ,[AddressZip]
      ,[FullAddress1]
      ,[FullAddress2]
      ,[Block]
      ,[Lot]
      ,[StateBlock]
      ,[StateTaxDistrictID]
      ,[TaxDistrict]
      ,[STR_Section]
      ,[STR_Township]
      ,[STR_Range]
      ,[AssessorDistrict]
      ,[SubdivisionID]
      ,[SubdivisionName]
      ,[LandNeighborhood]
      ,[Neighborhood]
      ,[NeighborhoodFactor]
      ,[TrendingFactor]
      ,[PlatBook]
      ,[PPFormType]
      ,[CondoType]
      ,[Sewer]
      ,[Water]
      ,[Gas]
      ,[Electricity]
      ,[Alley]
      ,[Sidewalk]
      ,[RoadType]
      ,[Topography]
      ,[AssessorYear]
      ,[AssessorYear_LandC1]
      ,[AssessorYear_LandC2]
      ,[AssessorYear_LandC3]
      ,[AssessorYear_ImpC1]
      ,[AssessorYear_ImpC2]
      ,[AssessorYear_ImpC3]
      ,[AssessorYear_LandTotal]
      ,[AssessorYear_ImpTotal]
      ,[AssessorYear_TotalAV]
      ,[CertifiedYear]
      ,[CertifiedYear_LandC1]
      ,[CertifiedYear_LandC2]
      ,[CertifiedYear_LandC3]
      ,[CertifiedYear_ImpC1]
      ,[CertifiedYear_ImpC2]
      ,[CertifiedYear_ImpC3]
      ,[CertifiedYear_LandTotal]
      ,[CertifiedYear_ImpTotal]
      ,[CertifiedYear_TotalAV]
      ,[Status]
      ,[AssessedBy]
      ,[AssessedDate]
      ,[CheckedBy]
      ,[CheckedDate]
      ,[UpdatedBy]
      ,[UpdateDate]
      ,[CamaParcelID] FROM vd_vw_VendorDB_ParcelGIS;"

Get-PVDAdhocQuery -Session $session -Query $query | Out-PVDQueryToFile -FileName "C:\PVD_GIS_Extracts\Extract_Files\Parcel_extract_cloud_test.csv" -Delimited -Delimiter "|"

$EndTime = Get-Date
$Body = "PVD Parcel Data Extract Started at $StartTime and Completed Successfully at $EndTime"
Send-MailMessage -From $From -to $To -Cc $Cc -Subject $Subject -Body $Body -SmtpServer $SMTPServer

exit

Mark,
I posted the script here. Please advise where it is wrong.
Thanks

Are all of these PVD cmdlets from an internally developed API wrapper? Nothing posted is necessarily wrong, and per your post you are not getting errors; it's just taking a long time to execute. First, what is "a long time"? Does it finish? If you execute this manually on the server, does it complete? One of the things that stands out to me is Out-PVDQueryToFile. Why do you need a wrapper for this? Normally, you establish a session and perform a GET, in your case a SQL query. The query result should come back as a PSObject so you can use PowerShell's built-in export cmdlets: New-Session | Get-Something | Export-Csv -Delimiter '|'. I would run this in pieces to determine where time is being consumed: create the session, run the query, then run the parse to a file.
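For example, reusing the variables from your script, something like this would show where the time goes and how many rows come back (a rough sketch; I'm assuming Get-PVDAdhocQuery returns plain objects that Export-Csv can consume):

$sw = [System.Diagnostics.Stopwatch]::StartNew()

# Step 1: create the session
$session = Get-PVDSession -URL "" -Credential $credential -UseSSL
"Session created after $($sw.Elapsed)"

# Step 2: run the query
$rows = @(Get-PVDAdhocQuery -Session $session -Query $query)
"Query returned $($rows.Count) rows after $($sw.Elapsed)"

# Step 3: write the file with the built-in exporter instead of Out-PVDQueryToFile
$rows | Export-Csv -Path 'C:\PVD_GIS_Extracts\Extract_Files\Parcel_extract_cloud_test.csv' -Delimiter '|' -NoTypeInformation
"Export finished after $($sw.Elapsed)"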

Thanks Rob Simmers. I think I am confusing everyone here. Like I mentioned, there are two servers, and there are five .ps1 scripts on the old server. All five scripts were copied to the new server. Of the five, the other four are working, except this one (the posted script). Here are the cases on the new server for the problematic script.

Case 1: The script is executed using Task Scheduler.
The Task Scheduler uses this action (F:\PVD_GIS_Extracts\PowerShell_Scripts\DataExtract-Parcel.cmd)

DataExtract-Parcel.cmd code is:

powershell "& ""F:\PVD_GIS_Extracts\PowerShell_Scripts\ParcelExtract_CLOUD-Email.ps1"""

It completes in 2 minutes and NO DATA is loaded.

Case 2: When I run F:\PVD_GIS_Extracts\PowerShell_Scripts\ParcelExtract_CLOUD-Email.ps1 directly from Task Scheduler, it runs for hours and won't stop until I manually stop it. On the OLD SERVER this takes 3 minutes and loads the data (350,000 records).

Case 3: I manually executed the script and it completes in 3 minutes with NO RECORDS loaded.

Thanks

You are running the script directly as the Program\Script? That would just attempt to open the .ps1, not execute it. Just like in the batch file, you need to call powershell.exe and pass the script as a parameter:

C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -File F:\PVD_GIS_Extracts\PowerShell_Scripts\ParcelExtract_CLOUD-Email.ps1

The Program\Script would be C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe and -File … would be the Argument.
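If you would rather build the task from PowerShell on Server B, the ScheduledTasks cmdlets can do the same thing. A minimal sketch (the task name and trigger are examples, not from your setup):

$action  = New-ScheduledTaskAction -Execute 'C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe' -Argument '-NoProfile -File "F:\PVD_GIS_Extracts\PowerShell_Scripts\ParcelExtract_CLOUD-Email.ps1"'
$trigger = New-ScheduledTaskTrigger -Daily -At 3am
Register-ScheduledTask -TaskName 'PVD Parcel Extract' -Action $action -Trigger $trigger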

Thanks Rob Simmers; apologies for my ignorance regarding PowerShell.
So should I just open a new window in PowerShell ISE, paste C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -File F:\PVD_GIS_Extracts\PowerShell_Scripts\ParcelExtract_CLOUD-Email.ps1, and run it?

Thanks. In Task Scheduler I created a new task.
Now it runs for a few seconds and loads nothing.

Hi, any update on this?
Thanks

You are not providing a great deal of information for us to assist you. Assuming you have the task set up to execute the script properly, the next step might be validating that the script is actually executing. Add a line to the beginning of the script:

Set-Content -Path C:\Scripts\PVD.log -Value 'Script started'

If you run the task and see the file, then the script is executing. If it is executing, then you need to look at what is incorrect in the script. What credentials are running the script? Are all of the settings and credentials in the script correct?
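If it does turn out to be executing, it can also help to capture a transcript of the scheduled run so any hidden errors end up in a file. A minimal sketch, assuming C:\Scripts is writable by the account the task runs under (the log paths are just examples):

# At the very top of ParcelExtract_CLOUD-Email.ps1
Set-Content -Path 'C:\Scripts\PVD.log' -Value "Script started $(Get-Date) as $env:USERNAME"
Start-Transcript -Path 'C:\Scripts\PVD_transcript.log' -Append

# ... existing script body ...

# At the very end of the script
Stop-Transcript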

Thanks to all. Here is what I did: in my SELECT query, instead of bringing in all 350K records, I added TOP 100 and I was able to load the data into the flat file. Now my question is, is it some kind of timeout issue? If it is, where should I look to increase the timeout?
Thanks

https://stackoverflow.com/questions/18932130/task-scheduler-powershell-timeout
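If the culprit is the task's own "Stop the task if it runs longer than" setting, you can also raise that limit from PowerShell, for example (the task name is an assumption, substitute yours):

$settings = New-ScheduledTaskSettingsSet -ExecutionTimeLimit (New-TimeSpan -Hours 2)
Set-ScheduledTask -TaskName 'PVD Parcel Extract' -Settings $settings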

Thanks all,
After many iterations, what I found out is that if I use TOP 40000 in my SELECT statement, the data loads; anything more than that produces no errors, but the file doesn't get loaded. I tried the timeout options provided by Rob Simmers, still the same issue; only 40K records get loaded into the flat file.

You’ve mentioned that this one script out of a set of five is experiencing an issue, but the others seem fine. What does this script do that is different from the other four? Do the others handle the same records and process them into output files?

If I am understanding correctly from looking at your script, it connects to a database stored on a cloud server (Oracle VPD?) and submits a SQL query for ~80 fields of the records stored in the database (real estate data of some kind). Are the other scripts providing the custom PVD commands that are being used in this script? Please also answer Rob's earlier question about whether these PVD cmdlets are an internally developed API wrapper.

A query for ~80 fields across 40,000 records would produce a results table with more than 3 million values (80 × 40,000 = 3.2 million). Is this normal for how you've handled this data in the past? I'm not a database expert, but it seems likely that you would run into memory issues at some point, or connection limitations on Oracle's side. Have these records been stored in the cloud for a long time, or were they migrated recently?

Frankly, due to the amount of data involved, this looks like a job for an actual application and not a script.

If this is a wrapper for an API and you can get to a specific number like that before it stops, this points to throttle limits on the API, which comes down to the paging configuration of the API calls. Say the limit is 800 API calls within 15 minutes and paging returns 50 records per page; then 800 × 50 = 40,000 records is exactly where you would hit the throttle limit. If you want 80,000 records, you would have to page with 100 records per page. If you were running this on another server without issue, that does not make much sense unless an IP or name was whitelisted on the API side so its calls are not throttled.
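If paging does turn out to be the limit, one workaround is to pull the extract in chunks below the size where it stops and append each chunk to the file. A rough sketch, reusing the session from your script and assuming the backend accepts T-SQL OFFSET/FETCH and that the view can be ordered by ParcelNumber (the full column list and CASE expressions are omitted for brevity):

# Pull the extract in 40,000-row pages and append each page to the output file
$pageSize = 40000
$offset   = 0
$outFile  = 'C:\PVD_GIS_Extracts\Extract_Files\Parcel_extract_cloud_test.csv'
Remove-Item $outFile -ErrorAction SilentlyContinue

do {
    $pagedQuery = "SELECT * FROM vd_vw_VendorDB_ParcelGIS ORDER BY ParcelNumber " +
                  "OFFSET $offset ROWS FETCH NEXT $pageSize ROWS ONLY;"
    $rows = @(Get-PVDAdhocQuery -Session $session -Query $pagedQuery)
    if ($rows.Count -gt 0) {
        $rows | Export-Csv -Path $outFile -Delimiter '|' -NoTypeInformation -Append
    }
    $offset += $pageSize
} while ($rows.Count -eq $pageSize)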