What is your process for developing & testing DSC?

I’ve been working with DSC for about 2 years now and am surprised by how little I have found on this topic. I come from an application development background so I have high expectations of my local development process. Specifically:

  1. Easily list dependencies in 1 place. Use that list for both local development and in production deployments.
  2. Rapid iterations of change code => build code => test change => repeat. Rapid meaning 1 - 10 seconds.
My current process does not meet my expectations. I'm hoping others are willing to share their setup so I can spot ways to improve. Here is my setup and how I do it:


  • Azure Automation pull server
  • Local Windows 10 dev machine
  • Target DSC node - an Azure VM. OS varies but is usually Windows Server 2012 R2
Process A

This is how I used to do it:

  1. Deploy the VM. Do not register it to Azure Automation.
  2. Log into the VM.
  3. Install git, VS Code, clone repo.
  4. Source code has all PowerShell modules in it. This means third-party open-source modules (gross, I know) and custom modules we wrote.
  5. Source code has DSC configs.
  6. Custom build script copies all modules to a directory in the PSModulesPath.
  7. Custom build script compiles the config (dot sourced).
  8. Custom build script applies the config to localhost with Start-DscConfiguration.
  9. Rapidly repeat steps 6 - 8 to work through any compilation or runtime errors.
  10. When task is complete, check in changes to modules & config.
  11. Delete the VM.
  12. Push modules to Azure Automation. All modules needed are in the project so we know we are deploying the same versions we just tested.
  13. Push configs to Azure Automation. Compile them.
  14. Deploy the VM to Azure. This time, let it register to Azure Automation.
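Steps 6 - 8 of a build script like this might look roughly as follows; the paths, module layout, and configuration name (MyConfig) are assumptions for illustration, not the actual script:

```powershell
# build.ps1 - rough sketch of steps 6-8; paths, the repo layout, and the
# configuration name (MyConfig) are hypothetical
$ErrorActionPreference = 'Stop'

# Step 6: copy every module in the repo into a directory on the PSModulePath
$moduleSource = Join-Path $PSScriptRoot 'Modules'
$moduleDest   = 'C:\Program Files\WindowsPowerShell\Modules'
Get-ChildItem -Path $moduleSource -Directory |
    Copy-Item -Destination $moduleDest -Recurse -Force

# Step 7: dot-source the config script, then compile it to MOF files
. (Join-Path $PSScriptRoot 'Configs\MyConfig.ps1')   # defines 'configuration MyConfig'
MyConfig -OutputPath (Join-Path $PSScriptRoot 'output')

# Step 8: apply the compiled configuration to localhost and wait for the result
Start-DscConfiguration -Path (Join-Path $PSScriptRoot 'output') -Wait -Verbose -Force
```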
Pros:

  • Very rapid feedback when developing both configs and custom modules.

Cons:

  • Setting up a dev machine on the target DSC node is cumbersome. Imagine if the DSC failed on step 13. I'd have to set the whole thing up again to troubleshoot.
  • Obvious problem: third-party modules are included in the project. This is fixed in the next process described below.

Process B

This is the direction I’m heading for an improved process:

  1. Deploy a VM to Azure. Let it register with Azure Automation pull server. (Assume a minimal or empty DSC config exists just to let the initial registration succeed).
  2. On my local workstation, Install-Module all modules my config depends on.
  3. Compile the config (dot source). Iterate to work through compilation errors.
  4. Upload the mof(s) to Azure Automation.
  5. Remote into the Azure VM and trigger an on-demand pull with Update-DscConfiguration.
  6. Review errors in Event logs.
  7. Repeat from 2 - 6 until task is complete.
  8. Deploy all 3rd party and custom modules to Azure Automation using an ARM template or similar.
  9. Let Azure Automation compile the DSC.
  10. Verify the config was applied successfully.
  11. Check in changes.
The one variation of this is if I need to author a custom DSC module. In this case, I'd also add a symlink in step 2.
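Steps 3 - 6 of this process could be sketched roughly like this, assuming the Az.Automation module and made-up resource group, automation account, and configuration names:

```powershell
# Hypothetical sketch of steps 3-6; resource names are examples only
# (requires the Az.Automation module on the workstation)

# Step 3: dot-source and compile the config locally
. .\MyConfig.ps1
MyConfig -OutputPath .\output

# Step 4: upload the compiled MOF as a node configuration in Azure Automation
Import-AzAutomationDscNodeConfiguration -ResourceGroupName 'rg-dsc' `
    -AutomationAccountName 'aa-dsc' -ConfigurationName 'MyConfig' `
    -Path .\output\localhost.mof -Force

# Step 5: on the target VM, trigger an on-demand pull
Update-DscConfiguration -Wait -Verbose

# Step 6: review recent DSC errors in the event log (also on the VM)
Get-WinEvent -LogName 'Microsoft-Windows-Dsc/Operational' -MaxEvents 50 |
    Where-Object LevelDisplayName -eq 'Error'
```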


Pros:

  • No need to set up dev tools on the target DSC node. I can just use my local workstation.
  • Removed source-controlled third-party modules from my project. I can now just install the package and version I want, when I want it.

Cons:

  • There is no single manifest for modules. Notice step 2 uses Install-Module and step 8 uses something else. We must roll our own manifest to prevent simple mistakes.
  • The feedback loop is slower. Seeing the results of a change takes significantly longer than in Process A.
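One way to roll that single manifest could be a plain .psd1 data file that both the local step (Install-Module) and the deployment step read from. This is a hypothetical sketch, not an established convention; the module names and versions are examples only:

```powershell
# Hypothetical single-manifest approach: a requirements.psd1 such as
#   @{ Modules = @(
#       @{ Name = 'xWebAdministration';    Version = '3.2.0' }
#       @{ Name = 'ComputerManagementDsc'; Version = '8.5.0' }
#   ) }
# can then drive both step 2 (local install) and step 8 (deployment):
$manifest = Import-PowerShellDataFile -Path .\requirements.psd1
foreach ($m in $manifest.Modules) {
    Install-Module -Name $m.Name -RequiredVersion $m.Version -Scope CurrentUser
}
```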

How do you do it?


That’s a great question, and I’m also surprised by how few people ask that question…

the TL;DR is… Test-Kitchen

Now in a bit more detail, although I feel I've already talked about it a lot (maybe not enough):

There are different elements you want to test (custom DSC resources, composite resources, configurations) and different types of tests, such as unit, integration, and “operation validation” tests, where appropriate…

From your description you seem focused on testing your configurations, and maybe also a bit on custom DSC resources, so I'll quickly cover only those.


Test-Kitchen is a testing harness that enables test-driven infrastructure development by handling and abstracting the repetitive create/test/destroy cycle for nodes and configurations. It is, at its core, a single-node testing tool (you can't test multi-node inter-dependencies out of the box).

Stuart Preston (from Chef) and I talked about it at the last PS Conf EU; watch the video for an introduction.
I've also written about the benefits and workflow for DSC in detail in a blog post a while ago.

So the workflow is roughly:

  • you develop your module/config
  • run:
    • kitchen create to create the VM
    • kitchen converge to apply the config
    • kitchen verify to run the tests
    • kitchen destroy to remove the VM
    • all of those can be abstracted behind a single kitchen test
  • Test-Kitchen (and its libraries) allows you to pull required modules from a gallery
  • you can use a driver for Azure, Hyper-V, Vagrant, AWS, and others…
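A minimal .kitchen.yml for a DSC setup might look something like this; the plugin names (kitchen-dsc, kitchen-pester, kitchen-hyperv) and settings are a sketch to check against each plugin's own documentation:

```yaml
# .kitchen.yml - hypothetical minimal sketch; verify plugin names and options
# against the kitchen-dsc, kitchen-pester, and driver documentation
driver:
  name: hyperv            # swap for azurerm, vagrant, ec2, ...

provisioner:
  name: dsc               # kitchen-dsc: compiles and applies your configuration

verifier:
  name: pester            # kitchen-pester: runs Pester tests on the node

platforms:
  - name: windows-2012r2

suites:
  - name: default
```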
The downside when using a cloud provider is the added delay. Local virtualisation can usually run a (relatively simple) end-to-end test in about 5 minutes (from creating the VM to destroying it). Some configurations can be faster (differencing disks, linked clones, better hardware...).
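For the kitchen verify step, the tests are typically Pester tests executed against the converged node; a minimal hypothetical example (the feature and port checked are examples only, not from the original post):

```powershell
# Hypothetical Pester tests that 'kitchen verify' could run on the node
Describe 'Web server configuration' {
    It 'has the IIS feature installed' {
        (Get-WindowsFeature -Name Web-Server).Installed | Should -Be $true
    }
    It 'is listening on port 80' {
        (Test-NetConnection -ComputerName localhost -Port 80).TcpTestSucceeded |
            Should -Be $true
    }
}
```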

There’s much more to it (e.g. a test matrix with multiple suites on multiple platforms…), but feel free to look at the links and ask further questions.

Side note: that does not remove the need for further integration testing in Azure (if that’s your prod environment), such as making sure the group of nodes works together…