Central PowerShell Repository?

Hi there,

I was wondering if anyone knows of a good tool that could act as an internal centralised PowerShell script repository.

I ask because I’ve recently started a new job where I work in a team, and I’ve been busy writing PoSh scripts for various tasks. Ideally I’d like to share these with my colleagues in a way that lets them use, review and amend the scripts in a simple, logical and controlled manner, AND lets us dot-source them from scripts that run as scheduled tasks on our server estate (as we could if we ran them from a centralised file server).

Effectively I’m looking for something that can be used like SourceSafe, GitHub, SVN or another version control system, but that also lets us run the scripts by dot-sourcing them from the central location.

We’ve got SharePoint, but it doesn’t look like I can create a link to the script files from the server (although it appears I can from the client), though this could be more about my lack of SharePoint experience than anything else. If there is a way to access these files from PowerShell (read-only would be BRILLIANT) so they could be dot-sourced from a server, then SharePoint would do fantastically for what we want.

To recap:

Ideally I’d like something that lets us accomplish the following (in priority order), and I was wondering if anyone knew of anything that exists, or anything that could be made to work:

  1. Version control
  2. Check-out type facility to prevent simultaneous edits
  3. Authorisation of edits before they’re committed (so others can check the changes in case they rely on the item)
  4. Central repository that can be dot sourced from script files

Thanks,

Owen

Most folks either use SharePoint or a source control tool. SAPIEN offers ChangeVue, for example, and there are obviously numerous open-source tools, and Microsoft has Team Foundation Server. Most don’t let you “link” a file from a file server; you “import” the file into the tool, which is how it gets under source control. Your colleagues would need to “get” the file to use it, unless you got it yourself and published the latest version to a file server.

Don’t force them to do dot-sourcing, though. Do the right thing - build modules that they can import. They’re no harder to build, and much more structured to use. Dot-sourcing is for beginners!

The only tool I know of that has all that out of the box is SVN’s WebDAV support:
https://svn.apache.org/repos/asf/subversion/trunk/notes/http-and-webdav/webdav-usage.html

You could, of course, set up custom read-only WebDAV based on any source code repository. http://sabredav.org/ is an easy-to-work-with, open-source PHP WebDAV server that leaves it up to you to provide a storage back end. I’ve actually been toying with the idea of making a SabreDAV plugin to work on top of Git.

Thanks Don,

So if I’ve understood you correctly you’re saying something along the lines of the following would be a better idea:

We create a module, or series of modules with similar functions grouped together, and then use the Import-Module cmdlet to bring those into the scripts (or commands) we run on the servers (in some cases as scheduled tasks)?

This is kind of how I’ve approached the scripts with my dot-sourcing idea. While searching for a repository I came across a blog (Scripting Guy, I think…) about creating modules. Conceptually I don’t really see the difference between the two approaches; have I missed something here?

What are your thoughts on adding these modules to the default PowerShell profiles on the target servers? Would it be too much of a waste to load things that might never be used? Would the default PowerShell profile location cover things like service accounts, which don’t interactively log on to the machines but run tasks in the Task Scheduler and via the “Run as a different user” option?

Joel, thanks for your idea.

If I’ve understood correctly, you’re saying I should be able to set up WebDAV on top of an SVN repository, which would allow the dot-sourcing idea I had (though that might now be moot).

We create a module, or series of modules with similar functions grouped together, and then use the Import-Module cmdlet to bring those into the scripts (or commands) we run on the servers (in some cases as scheduled tasks)?

Correct. The difference between a module and dot-sourcing involves a couple of things. One, PowerShell knows where to look for modules. In v3 and later, it can provide help for commands that aren’t yet loaded, and implicitly load commands that you try to execute. So it’s as if all modules are loaded all the time, which helps with command discovery. Unlike dot-sourcing, which pollutes the global scope, modules can also be cleanly unloaded. Modules also support the creation of private module-level variables, aliases, and functions, meaning you can have “private” utility functions that aren’t seen by someone using the module. Modules support module-level documentation (comment-based help). They are the correct way to package sets of commands for consumption by others.
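
Just to make that concrete, here’s roughly what a script module looks like (the module and function names below are only placeholders):

    # ServerTools.psm1 - a made-up example of a script module

    # Private helper; it isn't exported, so people using the module never see it
    function Get-ConfigPath {
        Join-Path $PSScriptRoot 'settings.xml'
    }

    function Get-ServerReport {
        <#
        .SYNOPSIS
            Returns a simple report object for the named server.
        #>
        [CmdletBinding()]
        param(
            [Parameter(Mandatory=$true)]
            [string]$ComputerName
        )
        # Real work would go here; this just returns a placeholder object
        [pscustomobject]@{ ComputerName = $ComputerName; Checked = Get-Date }
    }

    # Only the public command is exported; Get-ConfigPath stays module-private
    Export-ModuleMember -Function Get-ServerReport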

Modules can also evolve from a single script file into a multi-file module with a manifest. That might include your script, supporting scripts, formatting views, type extensions, and so on. They all get loaded and unloaded as a single piece.
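
If you go that route, New-ModuleManifest will generate the manifest for you; something like this works (all of the file names below are hypothetical):

    New-ModuleManifest -Path .\ServerTools\ServerTools.psd1 `
        -RootModule 'ServerTools.psm1' `
        -FormatsToProcess 'ServerTools.Format.ps1xml' `
        -Author 'Your Team' `
        -ModuleVersion '1.0.0'

The manifest is just a .psd1 data file, so you can open it afterwards and fill in anything else (required modules, exported commands, and so on) by hand.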

What are your thoughts on adding these modules to the default PowerShell profiles on the target servers? Would it be too much of a waste to load things that might never be used? Would the default PowerShell profile location cover things like service accounts, which don’t interactively log on to the machines but run tasks in the Task Scheduler and via the “Run as a different user” option?

In v3, there’s no reason to import modules in a profile, provided the module is located in one of the paths referenced in the PSModulePath environment variable (which you can add to). That environment variable tells the shell where modules live; it will auto-load the modules on-demand provided they live in that path. That way you get the convenience of having them loaded all the time, without the downside of loading something you won’t use.
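
For example (the share name here is invented), you could add a central modules folder to the search path and then just call the commands; in v3 the shell imports the right module for you on first use:

    # Add a central module share to the module search path for this session
    $env:PSModulePath += ';\\fileserver\PSModules'

    # No Import-Module needed - calling a command auto-loads its module
    Get-ServerReport -ComputerName 'SERVER01'

To make the extra path permanent you’d set it machine-wide (via Group Policy, or [Environment]::SetEnvironmentVariable) rather than per-session.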

That said, your scripts should always document the modules they require - and the ability to do so, using the #Requires statement, is another advantage of modules over dot-sourcing. PowerShell reads those statements and can fail the script with a useful error message if the required module(s) aren’t available.
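
For instance, a scheduled-task script that depends on a (hypothetical) ServerTools module could open with:

    #Requires -Version 3.0
    #Requires -Modules ServerTools

    Get-ServerReport -ComputerName 'SERVER01'

If ServerTools can’t be found via PSModulePath, the script stops with a clear error instead of failing halfway through.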

Dot-sourcing was a quick-and-dirty means of including one script within another in v1, when the team was frankly a bit strapped for time on certain features they’d originally wanted to add. When v2 introduced modules, that was clearly the direction they’d wanted to go. Dot-sourcing is fine for quick-and-dirty, but it lacks the structure and support of modules. You can’t “discover” a dot-sourced script; you have to know about it in advance. The shell can’t tell you what commands the script implements without you loading it, and even then the shell won’t know which commands came from that script - there’s no connection back to it. A dot-sourced script can’t declaratively refer to other files (like views or type extensions); it must explicitly load them as part of its code. Dot-sourced scripts can’t be un-dot-sourced to remove them.

With modules, you get all of those things.

And frankly, from a technical perspective, a module is just the script you already have, with a .psm1 filename extension instead of .ps1, and located in the correct path for auto-discovery and auto-loading. There’s no extra “overhead” in making a script module.
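
As a quick sketch (module name made up), promoting an existing script is really just a copy into the right spot:

    # Create a per-user module folder and drop the renamed script into it;
    # the folder name must match the module name for auto-discovery to work
    $dest = Join-Path $home 'Documents\WindowsPowerShell\Modules\ServerTools'
    New-Item -ItemType Directory -Path $dest -Force | Out-Null
    Copy-Item .\ServerTools.ps1 -Destination (Join-Path $dest 'ServerTools.psm1')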

Thanks for the detailed reply, Don; I really do appreciate it.

Having come from an environment where “quick and dirty” was the de facto method, it’s nice to have somewhere that someone will not only say what the best practice is but also explain why.

I take it the auto-loading is what I’ve seen when using Get-ADUser without first loading the module, which now that I think about it makes a lot of sense. Honestly it’s pretty amazing that I can take my PS1 files and turn them into something like that with a change of file extension (and proper location).

I think I might use SharePoint for version control and then install the modules on the given servers (or discuss the idea of using a GPO to push them out to all the servers).

Also, I’m looking forward to the Scripting Games; the new team-based approach looks interesting (in a good way). It also seems like a great opportunity to get some of my new colleagues involved.

Yeah, for what it’s worth … I agree completely with Don that you should be using modules.

Of course, you still need to put them somewhere (ideally in source control) and get access to them on your computers.

SVN’s WebDAV would essentially let you map a drive to the module repository. I’ve never tried to use it that way (and I’m not sure I’d want to, but that’s what you asked for).
What I would say about source control and automatically deploying modules to servers (whether by UNC path, or by copying to the server, etc) is that you really want to make sure that whatever method you use lets you:

  1. be sure you have the current version
  2. develop separately from deployment

In other words: if you use SVN and WebDAV, you need to use branching or a second repository for development … to make sure that the code that the servers are using doesn’t have half-finished changes. If you use GPOs to copy the files from a central file server to the individual machines, you need to make sure that the file copies are being refreshed regularly… I have some thoughts along those lines, but they’re not ready for prime time.
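
As a very rough sketch of the “refresh regularly” part (the share and folder names are made up), the scheduled job on each server could just mirror a released-modules folder into a local path that’s been added to PSModulePath:

    # Hypothetical scheduled-task payload: keep local modules in sync
    $source = '\\fileserver\PSModules\Release'
    $dest   = 'C:\PSModules'
    robocopy $source $dest /MIR /R:2 /W:5

The important part is that the servers only ever pull from a “release” location, never from wherever you’re actively editing.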

I think I’ve stumbled into an area that’s traditionally covered by formal development cycles, and through lack of experience I’m not really applying the best approach to it.

It certainly seems like PowerShell blurs the line between scripting and development the more you use it and get to know it, which is a brilliant thing to be honest.

Taking on board the comments here, I think I’ll adopt an approach whereby we write and develop scripts/modules in a central repository (probably SharePoint for now) and then copy and deploy them from there (somehow, maybe just manually to start off with).

Is there a way that I can use the #Requires tag to do a versioning check on the module to be imported?

For example if I create a module today and then tomorrow make some amendments can I assign those amendments as version 2 of that module and then use the #Requires tag accordingly? If not I presume I could achieve this by renaming the whole module (i.e. Module-Name-v1 then Module-Name-v2).

Instead of pestering both of you in your free time, have you got any links you would recommend reading with regard to PS modules?

Thanks,

Owen

Apologies, I suffer from premature posting (where I post something, then continue to search for information around that item afterwards).

I’ve found the TechNet about_Requires article which explains it rather well. (http://technet.microsoft.com/en-us/library/hh847765.aspx)
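
For anyone else reading: the version check I asked about is covered there too. Something along these lines (the module name and version are just examples) should refuse to run unless at least that version of the module is available:

    #Requires -Modules @{ ModuleName = 'ServerTools'; ModuleVersion = '2.0' }

The version itself comes from the ModuleVersion entry in the module’s .psd1 manifest, so I’ll need manifests for the modules rather than bare .psm1 files.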

Thanks for the leads & tips.