
Object-First Output in PowerShell: Build Pipelines You Can Trust

PowerShell shines when you treat data as objects, not text. An object-first approach makes your scripts predictable, testable, and reusable across pipelines, CI/CD, and automation. The core rule: return objects from your commands and format only at the edges. That means you emit PSCustomObject (not strings), use Write-Host only for status, and apply Format-Table (or any Format-*) only at the very end when rendering to a console.

Why Object-First Matters

  • Composability: Objects flow cleanly through pipelines. You can filter, sort, group, join, and export without parsing brittle strings.
  • Predictable output: Upstream scripts and external tools (CI, monitoring, log shippers) can reliably consume structured data like JSON or CSV.
  • Testability: Pester tests can assert types and properties. It’s far easier than matching strings.
  • Performance and reliability: Avoiding string munging reduces errors and culture/locale pitfalls. Calculated properties enforce consistent types.
  • DevOps-ready: Objects serialize naturally to JSON for artifacts, dashboards, and APIs.

Patterns and Anti-Patterns

1) Emit PSCustomObject, not strings

Anti-pattern: turning data into lines of text early.

# Bad: downstream consumers have to parse strings
Get-Process | ForEach-Object { "{0},{1},{2}" -f $_.Name, $_.Id, $_.CPU }

Pattern: shape data with Select-Object or build it explicitly, then emit objects.

# Good: return structured data shaped with Select-Object and calculated properties
Get-Process |
  Select-Object Name, Id, @{N='CPU';E={[math]::Round($_.CPU,1)}}

# Equally good: build the objects explicitly
Get-Process | ForEach-Object {
  [pscustomobject]@{
    Name = $_.Name
    Id   = $_.Id
    CPU  = [math]::Round($_.CPU, 1)
  }
}

Objects can be exported or serialized anywhere:

$procs = Get-Process | Select-Object Name, Id, CPU
$procs | Export-Csv -Path procs.csv -NoTypeInformation
$procs | ConvertTo-Json -Depth 3 | Out-File procs.json
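
And the round trip back into objects is just as direct. One caveat worth knowing: Import-Csv returns every property as a string, while ConvertFrom-Json preserves numbers and booleans.

# Read the data back in as objects later
$fromCsv  = Import-Csv procs.csv                            # all properties come back as [string]
$fromJson = Get-Content procs.json -Raw | ConvertFrom-Json  # numbers stay numbers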

2) Keep Write-Host for status only

Anti-pattern: using Write-Host for data you expect to pipe or capture.

# Bad: this data cannot be piped/exported
Write-Host ($procs | Format-Table -AutoSize)

Pattern: return objects for data, and use the right channel for diagnostics:

  • Write-Verbose for developer diagnostics (-Verbose opt-in)
  • Write-Information for structured informational messages
  • Write-Progress for progress bars
  • Write-Warning and Write-Error for issues
Write-Verbose "Querying processes..."
Write-Information "Collecting top CPU consumers" -Tags 'metrics','cpu'

3) Apply Format-Table only at the very end

Anti-pattern: formatting mid-pipeline. Format-Table and friends produce formatting objects, not your original objects. Downstream commands can no longer consume structured data.

# Bad: breaks pipelines
Get-Process | Format-Table Name, Id | Export-Csv broken.csv
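
You can see the problem directly: what Format-Table emits is a stream of internal formatting records, not your process objects.

# Inspect what Format-* actually produces
Get-Process | Format-Table Name, Id | Get-Member | Select-Object TypeName -Unique
# TypeNames like Microsoft.PowerShell.Commands.Internal.Format.FormatEntryData,
# not System.Diagnostics.Process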

Pattern: keep objects all the way through; format only when rendering to console.

# Good: keep objects; format for display at the edge
$procs = Get-Process | Select-Object Name, Id, CPU
$procs | Export-Csv procs.csv -NoTypeInformation
$procs | Format-Table -AutoSize  # console-only

Example: Object-First Process Info

The following function returns structured process data as PSCustomObject. It uses calculated properties to emit consistently rounded numbers (CPU seconds and working set in MB).

function Get-ProcInfo {
  [CmdletBinding()]
  [OutputType([pscustomobject])]
  param([int]$Top = 5)
  Get-Process |
    Sort-Object CPU -Descending |
    Select-Object -First $Top Name, Id,
      @{N='CPU';E={[math]::Round($_.CPU,1)}},
      @{N='WS_MB';E={[math]::Round($_.WorkingSet64/1MB,1)}} |
    ForEach-Object {
      [pscustomobject]@{ Name=$_.Name; Id=$_.Id; CPU=$_.CPU; WS_MB=$_.WS_MB }
    }
}

# Usage: return objects, format only at the end
$procs = Get-ProcInfo -Top 8
$procs |
  Where-Object { $_.CPU -gt 0 } |
  Sort-Object CPU -Descending |
  Format-Table -AutoSize

Because Get-ProcInfo returns objects, you can also serialize and reuse it across tools:

# Persist and share
$procs | ConvertTo-Json -Depth 3 | Out-File .\top-procs.json
$procs | Export-Csv .\top-procs.csv -NoTypeInformation

# Filter and join with other datasets later
$hot = $procs | Where-Object CPU -gt 30
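
The "join" part deserves a concrete sketch. The $owners lookup table below is hypothetical; the point is that enrichment stays a pure object transformation:

# Enrich process objects with data from another dataset (hypothetical owner map)
$owners = @{ pwsh = 'platform-team'; svchost = 'windows' }
$report = $procs | ForEach-Object {
  [pscustomobject]@{
    Name  = $_.Name
    CPU   = $_.CPU
    Owner = $owners[$_.Name]   # $null when there is no match
  }
}
$report | Export-Csv .\owned-procs.csv -NoTypeInformation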

A Reusable Function Template

Use this pattern to write object-first advanced functions with type-safe parameters, pipeline support, and structured output.

function Test-Port {
  [CmdletBinding()]
  [OutputType([pscustomobject])]
  param(
    [Parameter(Mandatory, ValueFromPipeline, ValueFromPipelineByPropertyName)]
    [Alias('Host','HostName')]
    [string]$ComputerName,   # avoid naming this $Host: it is a reserved automatic variable

    [Parameter(ValueFromPipelineByPropertyName)]
    [ValidateRange(1,65535)]
    [int]$Port = 443,

    [int]$TimeoutMs = 2000
  )
  process {
    $sw = [System.Diagnostics.Stopwatch]::StartNew()
    $client = New-Object System.Net.Sockets.TcpClient
    try {
      $async = $client.BeginConnect($ComputerName, $Port, $null, $null)
      if (-not $async.AsyncWaitHandle.WaitOne($TimeoutMs)) {
        throw "Timeout after $TimeoutMs ms"
      }
      $client.EndConnect($async)
      $open = $true
      $errorMsg = $null
    } catch {
      $open = $false
      $errorMsg = $_.Exception.Message
    } finally {
      $sw.Stop(); $client.Dispose()
    }

    [pscustomobject]@{
      ComputerName = $ComputerName
      Port         = $Port
      Open         = $open
      LatencyMs    = [math]::Round($sw.Elapsed.TotalMilliseconds,1)
      Error        = $errorMsg
    }
  }
}

# Compose with other commands
'api.mycorp.local','db.mycorp.local' | Test-Port -Port 5432 |
  Where-Object Open -eq $false |
  Select-Object ComputerName, Port, Error |
  Format-Table -AutoSize  # edge-only formatting
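
Because ComputerName and Port also bind from the pipeline by property name, any object source with those properties can drive the check. For example, with a hypothetical endpoints.csv containing ComputerName and Port columns:

# endpoints.csv:
#   ComputerName,Port
#   api.mycorp.local,443
#   db.mycorp.local,5432
Import-Csv .\endpoints.csv |
  Test-Port |
  Export-Csv .\port-report.csv -NoTypeInformation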

DevOps and CI/CD Integration

Because you emit objects, integration with CI/CD is straightforward. Serialize to JSON for artifacts or dashboards, and keep console formatting separate.

GitHub Actions example

- name: Collect process info
  shell: pwsh
  run: |
    . .\scripts\Get-ProcInfo.ps1
    $data = Get-ProcInfo -Top 10
    New-Item -ItemType Directory -Path artifact -Force | Out-Null
    $data | ConvertTo-Json -Depth 3 | Out-File artifact\procs.json
    $data | Format-Table -AutoSize  # human-readable console output

Azure DevOps logging

Write-Host "##vso[task.setvariable variable=TopCpuJson]$($data | ConvertTo-Json -Depth 3)"

By separating data from display, you can both present readable output in the console and emit machine-friendly artifacts for pipeline steps, alerts, and audits.

Testing and Reliability

Pester tests for object output

Describe 'Get-ProcInfo' {
  It 'emits PSCustomObject with expected properties' {
    $result = Get-ProcInfo -Top 2
    $result | Should -Not -BeNullOrEmpty
    foreach ($item in $result) {
      $item | Should -BeOfType 'System.Management.Automation.PSCustomObject'
      $item.PSObject.Properties.Name | Should -Contain 'Name'
      $item.PSObject.Properties.Name | Should -Contain 'Id'
      $item.PSObject.Properties.Name | Should -Contain 'CPU'
      $item.PSObject.Properties.Name | Should -Contain 'WS_MB'
    }
  }

  It 'remains consumable after serialization' {
    $json = (Get-ProcInfo -Top 1) | ConvertTo-Json -Depth 3
    $json | Should -Match '"Name"'
  }
}

Error handling and channels

  • Use Write-Error for recoverable, non-terminating errors; throw (or call with -ErrorAction Stop) when you need to fail fast. See the sketch after this list.
  • Prefer Write-Verbose and Write-Information for diagnostics; they won’t pollute data streams.
  • Avoid Out-String and mid-pipeline Format-* unless you’re done processing.
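
A minimal sketch of those channels working together (Get-ConfigValue and app.json are hypothetical names, for illustration only):

function Get-ConfigValue {
  [CmdletBinding()]
  param([Parameter(Mandatory)][string]$Path)
  Write-Verbose "Reading $Path"                   # diagnostics, opt-in via -Verbose
  if (-not (Test-Path $Path)) {
    Write-Error "Config file not found: $Path"    # non-terminating by default
    return
  }
  Get-Content $Path -Raw | ConvertFrom-Json       # data goes to the success stream
}

# Recoverable by default, fail-fast on demand:
$config = Get-ConfigValue -Path .\app.json -ErrorAction Stop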

Actionable Checklist

  • Return PSCustomObject with stable property names and types.
  • Use Select-Object (and calculated properties) to shape data.
  • Reserve Write-Host for progress/status; never for data.
  • Apply Format-Table or Format-List only at the end.
  • Serialize with ConvertTo-Json or Export-Csv for artifacts.
  • Write advanced functions with [CmdletBinding()], validate parameters, and declare [OutputType()].
  • Cover behavior with Pester tests that assert object types and properties.

Adopting an object-first mindset yields cleaner pipelines, predictable output, easier reviews, and reusable code across automation and CI/CD. Build object-focused scripts you can trust—format only at the edges, and let structured data do the heavy lifting.
