Wondering what the general opinion is on having functions rely only on parameters explicitly passed into them (to aid Pester testing, debugging, etc.), versus whether it's OK for functions to use variables that are not set in their own scope.
For example, the project I'm working on has a centralised configuration JSON file that is loaded whenever any script is invoked (containing information such as SQL, file, and event logging settings). The project has many functions, all of which use the logging information at a minimum. So the question is: should every function that calls another function always pass the necessary config information along as parameters, or is it "acceptable" for the functions to use the logging variables from a higher scope (i.e. module scope, but definitely not global)?
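For context, the load step looks roughly like this. A minimal sketch, assuming a hypothetical `config.json` next to the script and made-up property names:

```powershell
# Hypothetical: load a centralised JSON config into a script-scoped variable
# so every function in the module can read it without it being global.
$script:PlatformConfig = Get-Content -Path "$PSScriptRoot\config.json" -Raw |
    ConvertFrom-Json

# e.g. $script:PlatformConfig.SqlServer, $script:PlatformConfig.EventLogName
```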
I’m totally fine with module-level variables, a la the “Preference” global variables. Just export them as members of the module (Export-ModuleMember). And name them appropriately. E.g., in a “SQLStuff” module, $SQLStuffLoggingPreference would be totally legit for me.
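A minimal sketch of that suggestion, with a hypothetical module name and function (nothing here is from a real module):

```powershell
# SQLStuff.psm1 -- module-scoped "preference" variable, exported for callers
$SQLStuffLoggingPreference = 'Continue'

function Get-SQLStuffData {
    # Read the module-level preference rather than requiring a parameter
    if ($SQLStuffLoggingPreference -eq 'Continue') {
        Write-Verbose 'SQLStuff logging is enabled'
    }
}

Export-ModuleMember -Function Get-SQLStuffData -Variable SQLStuffLoggingPreference
```

Because the variable is exported, a caller (or a Pester test) can set `$SQLStuffLoggingPreference` in its own session to change the module's behaviour.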
Thanks for the quick response Don!
The platform config is exported out of a "utilities"-type module using Export-ModuleMember. However, it's breaking my goal of having all functions as self-contained as possible, and I can't see any other way except passing config items as parameters into every function, which seems impractical.
To be clear, these are not just preferences but also specific config information: SQL server, database, logging location, event log IDs, etc. If that is also acceptable to you, then I'm good.
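One middle-ground pattern worth noting: default parameter values that fall back to the module-scoped config. Functions stay testable (Pester can pass explicit values) without every caller having to thread the config through. A sketch, with hypothetical function and property names:

```powershell
function Write-SqlLog {
    param(
        [string]$Message,
        # Defaults come from the module-scope config; tests can override them
        [string]$SqlServer = $script:PlatformConfig.SqlServer,
        [string]$Database  = $script:PlatformConfig.LogDatabase
    )
    Write-Verbose "Logging to $SqlServer/$Database : $Message"
}

# Normal use picks up the shared config:
#   Write-SqlLog -Message 'started'
# A Pester test stays self-contained:
#   Write-SqlLog -Message 'test' -SqlServer 'TestSrv' -Database 'TestDb'
```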
You say “configuration,” PowerShell says “preference.” Same thing. “I prefer to use __ server, __ logging location, __ whatever.”