Building a script module

As I said in the previous post, I like to write short functions and keep each one in its own file under source control. The problem then is making those functions accessible. The usual approach is to dot-source them. I’m not particularly wild about the idea, but let’s have a look at it first.

Sample code

This example is taken from a utility that synchronizes two NuGet servers. Only some of the packages should be synchronized, so I use a blacklist to filter out the unwanted ones.

I usually start a project like this with a monolithic script. That could be sufficient, but in this instance I wanted to reuse the code that identifies the packages on each server. Once I had it working well enough against one server, I refactored it into functions, each saved in its own script.

The easy way: use dot-sourcing

Here is a simple example of how it could be done with dot-sourcing. First a code block.

$blacklist = @{}
$blacklist['Powershell'] = $true
$blacklist['Sandbox'] = $true
$blacklist['sqlccJamie'] = $true
$blacklist['Graphics'] = $true
$blacklist['CentinelTest'] = $true
$blacklist['Backoffice Service Portal'] = $true

This is the list of projects to be ignored, and I wanted to use a hash for the purpose. The first refactoring is to extract the names and use a loop:

$blacklist = @{}
'Powershell','Sandbox','sqlccJamie','Graphics','CentinelTest','Backoffice Service Portal' | % {
	$blacklist[$_] = $true
}

Next I refactored this into a function:

function Get-Blacklist {
	param(
		[string[]]$Blacklist
	)
	$blacklistHash = @{}
	$Blacklist | % {
		$blacklistHash[$_] = $true
	}
	$blacklistHash
}

which would be invoked thus:

. .\Scripts\Get-Blacklist.ps1
$blacklist = Get-Blacklist -Blacklist 'Powershell','Sandbox','sqlccJamie','Graphics','CentinelTest','Backoffice Service Portal'
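
With the hash in hand, filtering the packages becomes a cheap lookup. Here is a minimal sketch; $packages and its Id property are assumptions, standing in for whatever the repository query actually returns:

# $packages is assumed to hold package objects whose Id matches the
# blacklisted names; a missing key yields $null, so the -not test
# keeps exactly the packages that are not blacklisted
$toSync = $packages | ? { -not $blacklist[$_.Id] }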

While I am still busy refactoring, I like to put the dot-sourcing line near the first usage of the function; the line can then be moved to a new location together with the function. Later this can be handled differently. I put the scripts into a separate folder so that they can be manipulated as a group, for example by dot-sourcing them all:

ls .\Scripts\*.ps1 -Recurse | % {
    $path = $_.FullName
    . $path
}

What not to like about dot-sourcing

The problem for me comes with deploying the code. The fewer files and folders the better. The simplest deployment is a single script that contains all the functions as well as the top-level code – a single .ps1 file. When there are multiple scripts to deploy, it becomes more convenient to package the shared functions in a script module – a single .psm1 file. To be more professional, that should be accompanied by a module manifest carrying metadata such as description, version and visibility rules – a .psd1 file.

The visibility rules are another thing to dislike about dot-sourcing: everything is visible.
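
For contrast, a module manifest makes the visibility explicit. Here is a minimal sketch, assuming the module is called SyncNuGetRepos and exports only Get-Blacklist; the version and description are placeholders:

# generate bin\SyncNuGetRepos.psd1 next to the built module
New-ModuleManifest -Path .\bin\SyncNuGetRepos.psd1 `
    -RootModule 'SyncNuGetRepos.psm1' `
    -ModuleVersion '1.0.0' `
    -Description 'Synchronizes selected packages between two NuGet servers' `
    -FunctionsToExport 'Get-Blacklist'

Anything not named in FunctionsToExport stays private to the module.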

Packaging the module

My script is SyncPackages.ps1. My plan is to deploy it with the SyncNuGetRepos script module. Everything happens in the SyncNuGetRepos folder, and I put my script there. I create an empty file called SyncNuGetRepos.psm1; later I can put shared code there. The folder also contains the Scripts subfolder with my functions.
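
The working layout then looks like this (bin appears once the build below has run):

SyncNuGetRepos\
    SyncPackages.ps1
    SyncNuGetRepos.psm1
    Build.ps1
    Scripts\
        Get-Blacklist.ps1
        ...
    bin\
        SyncNuGetRepos.psm1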

Now I create the Build.ps1 script that assembles the module. The new module is put in the bin subfolder. It is all very simple:

# start with the hand-written part of the module
[string]$text = gc .\SyncNuGetRepos.psm1 | Out-String
# append each function script in turn
ls .\Scripts\*.ps1 -Recurse | % {
	$path = $_.FullName
	$fn = gc $path | Out-String
	$text = "$text
$fn
"
}
# make sure the output folder exists
if (-not (Test-Path .\bin)) {
	mkdir .\bin | Out-Null
}

$text | Out-File .\bin\SyncNuGetRepos.psm1 -Encoding utf8
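
A quick way to check the result is to import the freshly built module and list what it exposes:

Import-Module .\bin\SyncNuGetRepos.psm1 -Force
Get-Command -Module SyncNuGetRepos

Without a manifest every function in the .psm1 is exported, which is exactly the visibility point made above.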

We need to import the module when running the script, so we add this code to the beginning of SyncPackages.ps1:

if (Get-Module SyncNuGetRepos -All) {
    Remove-Module SyncNuGetRepos
}
if (Test-Path .\bin\SyncNuGetRepos.psm1) {
    Import-Module .\bin\SyncNuGetRepos.psm1
} else {
    Import-Module .\SyncNuGetRepos.psm1
}

The idea is to prefer the module we have just built in the bin folder. When we deploy the thing, the script and the module will be in the same folder. Of course, this relies on the current directory pointing at that folder when the script runs.
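
One way to remove that dependency on the current directory is to resolve the paths against the script’s own location with the automatic variable $PSScriptRoot. A sketch of the same import logic:

# prefer the freshly built module in bin, fall back to the module
# next to the script; -Force replaces any copy already loaded
$modulePath = Join-Path $PSScriptRoot 'bin\SyncNuGetRepos.psm1'
if (-not (Test-Path $modulePath)) {
    $modulePath = Join-Path $PSScriptRoot 'SyncNuGetRepos.psm1'
}
Import-Module $modulePath -Force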


The Journey Begins

Thanks for joining me!

Good company in a journey makes the way seem shorter. — Izaak Walton

About this blog

I have been a programmer for a very long time, and have learned a few things getting here. I want to share those things.

For the last few years I have been working on a challenging project, upgrading my employer’s Continuous Integration process. Being a Microsoft shop, the powers that be decided to use Microsoft products unless there is something much better for our needs.

Thus each developer is expected to use Visual Studio as the primary development tool, and Azure DevOps for team management, builds and releases. We have a lot of reuse, so components are distributed internally using NuGet.

NuGet is great for sharing components and controlling their versions. NuGet is also a challenge because it has no native support for SQL Server or PowerShell projects. That is a problem because we are predominantly a database shop. SQL Server projects and their build output, dacpac packages, are excellent for building database applications with many components.

My role is CI Architect, and I designed a VS database solution template. It includes custom support for consuming and publishing database packages. The aim is to make the development experience the same for all our developers.

Choices made

The old CI process depends heavily on PowerShell scripts. Since I have to support those assets anyway, I decided to stick with PowerShell for the new process.

My other choices follow from my background as a programmer.

  • I like small, focused routines
  • I like modular designs
  • I like reuse
  • I like Test Driven Development (TDD)
  • I like the safety of source control
    • I am particularly fond of having my routines individually under source control
  • I like encapsulating external resources
    • I like making them replaceable, say Subversion with Git, Azure DevOps with other frameworks

Coming up

The next posts will be about these decisions, and how they manifested in PowerShell.