
MoppleIT Tech Blog


Pipeline-Friendly PowerShell: ValueFromPipelineByPropertyName for Real-World Objects

If you build commands that play nicely with the pipeline, everything downstream becomes cleaner, safer, and more reusable. In PowerShell, the secret weapon is ValueFromPipelineByPropertyName: it binds a parameter from any incoming object that exposes a property with a matching name (or alias). That means you can accept plain path strings, FileInfo/DirectoryInfo objects, or CSV rows without wrapper code, validate early, and return PSCustomObject output for predictable chaining.

Why Pipeline-Friendly Functions Matter

Pipeline-first design unlocks a few immediate wins for you and your team:

  • Less glue code: no manual property mapping or Select-Object gymnastics.
  • Predictable behavior: consistent shapes (PSCustomObject) make filtering and exporting trivial.
  • Composability: your function works with cmdlets that output strings, FileInfo/DirectoryInfo, or anything with a matching property.
  • Performance and clarity: streaming results via the Process block keeps memory usage low and feedback quick.

Binding by Property Name vs. Value

  • ValueFromPipeline binds the entire input object to a parameter. Great when your function expects a specific type (e.g., FileInfo).
  • ValueFromPipelineByPropertyName binds when the input object exposes a property whose name matches your parameter (or its aliases). Ideal for generic objects like CSV rows.

You can combine both to cover the widest range of inputs with minimal ceremony.
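
To see the difference in isolation, here is a minimal sketch with two throwaway functions (Show-ByValue and Show-ByPropertyName are made-up names for illustration):

# Binds the whole incoming object; expects a specific type.
function Show-ByValue {
  param(
    [Parameter(ValueFromPipeline)]
    [System.IO.FileInfo]$File
  )
  Process { "By value: $($File.FullName)" }
}

# Binds from any object that exposes a FullName property (FileInfo, CSV rows, hand-built objects).
function Show-ByPropertyName {
  param(
    [Parameter(ValueFromPipelineByPropertyName)]
    [string]$FullName
  )
  Process { "By property name: $FullName" }
}

Get-ChildItem -File | Select-Object -First 1 | Show-ByValue
Get-ChildItem -File | Select-Object -First 1 | Show-ByPropertyName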

Core Pattern: Early Validation + Clean Output

Here’s a compact example that demonstrates best practices: bind from property names, validate early, and return PSCustomObject for clean chaining.

function Get-Checksum {
  [CmdletBinding()]
  param(
    # Binds from a plain string, FileInfo.FullName, or any object with a Path/FullName/LiteralPath property.
    [Parameter(Mandatory, ValueFromPipeline, ValueFromPipelineByPropertyName)]
    [Alias('FullName','LiteralPath')]
    [string]$Path,

    [ValidateSet('SHA256','SHA1','MD5')]
    [string]$Algorithm = 'SHA256'
  )
  Process {
    try {
      # Validate early: Resolve-Path throws on missing paths, so bad input fails fast and clearly.
      $full = (Resolve-Path -LiteralPath $Path -ErrorAction Stop).ProviderPath
      $hash = Get-FileHash -LiteralPath $full -Algorithm $Algorithm
      [pscustomobject]@{
        Path      = $hash.Path
        Algorithm = $hash.Algorithm
        Hash      = $hash.Hash
      }
    } catch {
      Write-Warning ("Skip {0}: {1}" -f $Path, $_.Exception.Message)
    }
  }
}

# Usage
# 'C:\app\data.txt' | Get-Checksum
# Get-ChildItem -Path 'C:\app' -File | Get-Checksum -Algorithm SHA1
# Import-Csv -Path '.\files.csv' | Get-Checksum  # requires a 'Path' column

This gives you:

  • Direct binding from FileInfo.FullName, literal string paths, or CSV rows with a Path column.
  • Early validation via Resolve-Path, which fails fast and clearly.
  • Consistent PSCustomObject output for easy Select-Object, Where-Object, Export-Csv, or ConvertTo-Json chaining.

Real-World Inputs That “Just Work”

  • Literal strings: 'C:\data\a.txt' | Get-Checksum
  • FileInfo: Get-ChildItem -File | Get-Checksum
  • DirectoryInfo: Get-ChildItem -Directory outputs DirectoryInfo objects whose FullName maps cleanly to Path via the alias; hashing a directory itself only makes sense with the hardened version below, which enumerates its files.
  • CSV rows: Import-Csv .\files.csv where the header contains Path, FullName, or LiteralPath.

Because of [Alias('FullName','LiteralPath')], the function binds property names you already have on typical objects, with no extra Select-Object calculated properties or wrapper objects just to rename things.
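
A quick way to see the alias binding in isolation is to hand-build an object that only exposes FullName and pipe it in (reusing the example path from above):

# FullName matches the alias on -Path, so this binds with no mapping code:
[pscustomobject]@{ FullName = 'C:\app\data.txt' } | Get-Checksum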

Hardening the Pattern: Directories, Recursion, and Verbose Diagnostics

The previous version is great for files. Let’s evolve it to handle directories, optional recursion, and richer diagnostics, while still honoring pipeline-first design.

function Get-Checksum {
  [CmdletBinding(SupportsShouldProcess=$true, ConfirmImpact='Low')]
  [OutputType([pscustomobject])]
  param(
    [Parameter(Mandatory, ValueFromPipeline, ValueFromPipelineByPropertyName)]
    [Alias('FullName','LiteralPath')]
    [string]$Path,

    [ValidateSet('SHA256','SHA1','MD5')]
    [string]$Algorithm = 'SHA256',

    [switch]$Recurse,
    [switch]$Force
  )

  Begin {
    Write-Verbose "Algorithm: $Algorithm"
  }

  Process {
    try {
      $resolved = Resolve-Path -LiteralPath $Path -ErrorAction Stop
      $target = $resolved.ProviderPath

      # Identify whether the path is a directory or file.
      $isDir = Test-Path -LiteralPath $target -PathType Container

      if ($isDir) {
        Write-Verbose "Enumerating directory: $target (Recurse=$Recurse, Force=$Force)"
        $files = Get-ChildItem -LiteralPath $target -File -Recurse:$Recurse -Force:$Force -ErrorAction Stop
        foreach ($f in $files) {
          if ($PSCmdlet.ShouldProcess($f.FullName, "Hash ($Algorithm)")) {
            try {
              $h = Get-FileHash -LiteralPath $f.FullName -Algorithm $Algorithm -ErrorAction Stop
              [pscustomobject]@{
                Path      = $h.Path
                Algorithm = $h.Algorithm
                Hash      = $h.Hash
              }
            } catch {
              Write-Warning ("Skip {0}: {1}" -f $f.FullName, $_.Exception.Message)
            }
          }
        }
      } else {
        if ($PSCmdlet.ShouldProcess($target, "Hash ($Algorithm)")) {
          $h = Get-FileHash -LiteralPath $target -Algorithm $Algorithm -ErrorAction Stop
          [pscustomobject]@{
            Path      = $h.Path
            Algorithm = $h.Algorithm
            Hash      = $h.Hash
          }
        }
      }
    } catch {
      Write-Warning ("Skip {0}: {1}" -f $Path, $_.Exception.Message)
    }
  }
}

# Examples
# Single file (string):
# 'C:\app\data.txt' | Get-Checksum
# Directory (non-recursive):
# Get-Item 'C:\app' | Get-Checksum -Algorithm SHA1
# Directory (recursive):
# [IO.DirectoryInfo]'C:\logs' | Get-Checksum -Recurse -Verbose
# CSV rows with a Path column:
# Import-Csv .\files.csv | Get-Checksum -Algorithm MD5

Why These Choices?

  • ValueFromPipeline + ValueFromPipelineByPropertyName: lets you accept strings, FileInfo/DirectoryInfo, and CSV rows without wrappers.
  • Aliases: FullName and LiteralPath match properties from FileInfo/DirectoryInfo and many cmdlets.
  • Resolve-Path and Test-Path: early validation, explicit path semantics, and safer handling of wildcards (use -LiteralPath to avoid unexpected expansions).
  • SupportsShouldProcess: opt into -WhatIf/-Confirm for safer automation (see the short sketch after this list).
  • PSCustomObject output: consistent schema that chains into Export-Csv, Group-Object, Sort-Object, or Where-Object seamlessly.
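
Because the hardened version declares SupportsShouldProcess, dry runs come for free:

# Preview which files would be hashed without doing any work:
Get-Item 'C:\app' | Get-Checksum -Recurse -WhatIf

# Prompt before each file is hashed:
Get-Item 'C:\app' | Get-Checksum -Recurse -Confirm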

Actionable Tips for Robust Pipelines

  1. Restrict inputs with validation: use ValidateSet, ValidateScript, or ValidatePattern when feasible to fail fast with helpful messages.
  2. Prefer -LiteralPath over -Path for security and predictability; avoid accidental wildcard expansion, especially with user-supplied inputs.
  3. Stream results in Process: don’t accumulate into arrays; emit objects as you go for responsiveness and lower memory use.
  4. Return rich objects, not strings: PSCustomObject with stable property names (Path, Algorithm, Hash) is easy to sort, group, or export.
  5. Emit warnings per-item: catch per-file errors and continue, instead of failing the whole pipeline.
  6. Honor common parameters: Write-Verbose and Write-Debug provide context; ShouldProcess enables dry-runs with -WhatIf.
  7. Keep parameter names canonical: Path is a common convention—binds from many producers automatically. Add aliases for compatibility.
  8. Write help: at least a synopsis and examples via comment-based help to make discoverability and onboarding easier (a minimal sketch follows this list).
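
For the last tip, a minimal comment-based help block just inside the function is enough for Get-Help Get-Checksum to return a synopsis and examples:

function Get-Checksum {
  <#
  .SYNOPSIS
    Computes checksums for files received from the pipeline.
  .EXAMPLE
    Get-ChildItem -Path 'C:\app' -File | Get-Checksum -Algorithm SHA1
  .EXAMPLE
    Import-Csv .\files.csv | Get-Checksum   # needs a Path, FullName, or LiteralPath column
  #>
  [CmdletBinding()]
  param(
    # ...parameters as defined earlier...
  )
  # ...body as defined earlier...
}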

Putting It to Work: Real-World Use Cases

1) CI/CD Artifact Verification

After a build, verify artifact integrity before publishing:

$artifacts = Get-ChildItem -Path .\dist -File
$report = $artifacts | Get-Checksum -Algorithm SHA256
$report | Export-Csv -Path .\dist\checksums.csv -NoTypeInformation

Later, validate a download:

$expected = Import-Csv .\dist\checksums.csv | Where-Object Path -Like '*myapp.zip'
$actual = 'C:\downloads\myapp.zip' | Get-Checksum -Algorithm SHA256
if ($expected.Hash -ne $actual.Hash) { throw "Checksum mismatch!" }
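
To verify every artifact against the manifest rather than a single download, a small loop over the CSV is enough (a sketch reusing the columns exported above):

# Re-hash each recorded artifact and flag anything that no longer matches.
Import-Csv .\dist\checksums.csv | ForEach-Object {
  $actual = $_.Path | Get-Checksum -Algorithm $_.Algorithm
  if ($actual.Hash -ne $_.Hash) {
    Write-Warning ("Checksum mismatch: {0}" -f $_.Path)
  }
}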

2) Inventory and Drift Detection

Detect changes between two directories:

# Compare by relative path so the differing roots (C:\golden vs C:\prod) don't flag every file.
$baseline = Get-Item 'C:\golden' | Get-Checksum -Recurse |
  Select-Object @{ n = 'RelativePath'; e = { $_.Path.Substring('C:\golden'.Length) } }, Hash
$current  = Get-Item 'C:\prod' | Get-Checksum -Recurse |
  Select-Object @{ n = 'RelativePath'; e = { $_.Path.Substring('C:\prod'.Length) } }, Hash

$diff = Compare-Object -ReferenceObject $baseline -DifferenceObject $current -Property RelativePath,Hash -PassThru
$diff | Format-Table SideIndicator, RelativePath, Hash

3) CSV-Driven Operations

Security or audit teams love CSVs. Drive your function from a spreadsheet without writing glue:

# files.csv must have a header named Path, FullName, or LiteralPath
Import-Csv .\files.csv | Get-Checksum
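
If you don't have such a file yet, a minimal sketch for generating one (the Path column name is the convention the function binds to):

# Build files.csv with a Path column from a directory listing.
Get-ChildItem -Path 'C:\app' -File |
  Select-Object @{ n = 'Path'; e = { $_.FullName } } |
  Export-Csv -Path .\files.csv -NoTypeInformation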

4) Cloud and Containers

In container builds, checksum files to lock down SBOMs or verify layer contents:

# Dockerfile example snippet (PowerShell 7 layer).
# Note: the image must already have Get-Checksum available (dot-source the script or import a module before this step).
# RUN pwsh -NoProfile -Command \
#   "Get-ChildItem -Recurse -File /app | Get-Checksum | ConvertTo-Json -Depth 3 > /app/checksums.json"

Performance and Security Notes

  • Prefer SHA256 for integrity checks; MD5/SHA1 are fine for quick deduping, not for cryptographic trust.
  • On massive trees, avoid per-item console output; route diagnostics through Write-Verbose so they only render when -Verbose is requested, because formatting thousands of lines slows the pipeline.
  • If you need maximum speed, compute hashes with a reusable System.Security.Cryptography algorithm instance and buffered streams (a sketch follows this list). For most scenarios, Get-FileHash is clear and fast enough.
  • When accepting user-supplied paths, stick to -LiteralPath and treat untrusted inputs carefully to avoid wildcard or path injection surprises.
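
If you do go down that road, a minimal sketch looks like the following; the reusable SHA256 instance and the 1 MB buffer size are assumptions to tune for your workload:

# Reuse one SHA256 instance across files instead of creating one per call.
$sha256 = [System.Security.Cryptography.SHA256]::Create()
try {
  Get-ChildItem -Path 'C:\app' -File -Recurse | ForEach-Object {
    # A 1 MB buffered FileStream avoids reading large files in tiny chunks.
    $stream = [System.IO.FileStream]::new($_.FullName, 'Open', 'Read', 'Read', 1MB)
    try {
      [pscustomobject]@{
        Path = $_.FullName
        Hash = [System.BitConverter]::ToString($sha256.ComputeHash($stream)).Replace('-', '')
      }
    } finally {
      $stream.Dispose()
    }
  }
} finally {
  $sha256.Dispose()
}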

Build pipeline-first commands you can trust: cleaner pipelines, fewer wrappers, predictable inputs, easier reuse. For deeper patterns, recipes, and production hardening guidance, see the PowerShell Advanced CookBook.
