I’m a support analyst/help desk technician by trade. Over the years, I’ve cobbled together a PowerShell script that harvests details from Windows client and server platforms.
So, being the over-achieving techie/geek that I am, I decided to rearrange the variable and function definition blocks and the actual function calls to improve readability and performance.
Big mistake!
Functional script outline before the changes:
- Variable Declarations
- Function Definitions
- Function Calls (main script)
Non-functional script outline after the changes:
- Variable Declarations
- Function Calls (main script)
- Function Definitions
I ran the modified script in the PowerShell ISE, where it immediately failed at the first function call with a command-not-found error (“The term '…' is not recognized as the name of a cmdlet, function, script file, or operable program”). Script functionality was restored once I reversed the changes.
Conclusion: PowerShell runs uncompiled scripts interpretively. The engine parses the whole file up front (which is why pure syntax errors surface immediately), but statements execute sequentially from the top down, and a function only exists once execution has actually reached its `function` definition. My first function call ran before the definition had been executed, which raised the command-not-found error. The same rule applies whether the script runs in the console, the ISE, or Visual Studio Code.
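Here’s a minimal sketch of the ordering issue (the function name `Get-Greeting` is just a made-up placeholder, not from my actual script):

```powershell
# BROKEN layout: call before definition.
# PowerShell has parsed the whole file, but the `function` statement
# below has not executed yet, so the call fails with
# CommandNotFoundException ("...is not recognized as the name of a
# cmdlet, function, script file, or operable program").
Get-Greeting

function Get-Greeting {
    "Hello from Get-Greeting"
}
```

Moving the definition above the call restores the working layout:

```powershell
# WORKING layout: definition executes first, so the name is registered
# in the session before the call site is reached.
function Get-Greeting {
    "Hello from Get-Greeting"
}

Get-Greeting   # succeeds
```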
I don’t have much use for compilers, e.g. Invoke-PS2EXE .\myscript.ps1 .\myscript.exe, and in my script’s case compiling would completely defeat the purpose of the exercise. But I am curious whether a “compiled” script would exhibit the same behavior. (As I understand it, PS2EXE doesn’t truly compile; it wraps the script in an executable PowerShell host, so I’d expect the same top-down execution rules to apply.)
Here’s the PowerShell script I was messing with:
I use this script to streamline information gathering, minimize redundant back-and-forth, and improve case note documentation. It saves me a buttload of time by eliminating guesswork about system architecture and cutting out the follow-up emails needed to collect details I should have had in the first place.