Structured Logging in PowerShell with Write-Information: Clean Output, Rich Diagnostics
When your PowerShell scripts need to pipe clean objects to the next command while still emitting rich, searchable diagnostics, the Information stream is the right tool. With Write-Information, tags, and a tiny bit of stream redirection, you get structured logs in a file and predictable data on stdout. This post shows how to implement the pattern, strengthen your CI/CD pipelines, and keep logs useful yet safe.
The Pattern: Data on stdout, logs on the Information stream
Minimal working example
The core idea is simple: write data to the success output stream (stdout in shell terms), and write diagnostics to the Information stream (stream 6). Then append the Information stream to a log file with 6>>. You can toggle logging with $InformationPreference or -InformationAction.
$InformationPreference = 'Continue'
function Get-ProjectStats {
[CmdletBinding()]
param([string[]]$Names)
foreach ($n in $Names) {
Write-Information -MessageData @{ Event='Start'; Name=$n; Time=(Get-Date) } -Tags 'STEP'
Start-Sleep -Milliseconds (Get-Random -Minimum 50 -Maximum 120)
[pscustomobject]@{ Name=$n; Length=$n.Length }
Write-Information -MessageData @{ Event='End'; Name=$n; Time=(Get-Date) } -Tags 'STEP'
}
}
$names = @('alpha','beta','gamma')
$log = './run.log'
# Log information stream to file, keep objects on stdout
Get-ProjectStats -Names $names 6>> $log | Sort-Object Name

What you get:
- Predictable stdout: structured objects suitable for Sort-Object, Export-Csv, ConvertTo-Json, etc.
- Diagnostics in a file: information records with tags and timestamps, rendered as text by the host (the JSON-lines section below shows how to keep them fully structured).
- Safer pipelines: downstream consumers receive only the data they expect.
Controlling when logs emit
You have two easy switches for enabling/disabling Information messages:
- Preference variable: $InformationPreference = 'SilentlyContinue' or 'Continue'
- Per-command: -InformationAction SilentlyContinue or Continue
# Global/session toggle
$InformationPreference = 'SilentlyContinue' # quiet by default
# One-off override for a single invocation
Get-ProjectStats -Names $names -InformationAction Continue 6>> $log | Export-Csv -NoTypeInformation out.csv

Tip: In modules and functions, prefer -InformationAction (or accept the common parameters) so callers control verbosity without you mutating global state.
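For illustration (Invoke-Build is a hypothetical module function, not part of the example above), [CmdletBinding()] is all it takes: the common parameters arrive for free and scope to that single call:

function Invoke-Build {
    [CmdletBinding()]  # exposes -InformationAction and -InformationVariable automatically
    param([string]$Target)
    # Honors whatever the caller passed; no session-wide preference is mutated
    Write-Information -MessageData @{ Event='BuildStart'; Target=$Target } -Tags 'STEP'
}

Invoke-Build -Target 'release'                              # silent under the default preference
Invoke-Build -Target 'release' -InformationAction Continue  # emits for this call only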
Make logs structured and useful
Use MessageData and Tags
Write-Information accepts any object as -MessageData. Pair that with -Tags for selective filtering downstream (e.g., performance metrics, steps, network, security).
Write-Information -MessageData @{ Event='CacheHit'; Key=$key; Size=$bytes; Time=Get-Date } -Tags 'CACHE','PERF'
Write-Information -MessageData @{ Event='ApiCall'; Url=$url; Status=$status; Ms=$ms } -Tags 'HTTP','PERF'
Write-Information -MessageData @{ Event='Auth'; User=$env:USERNAME; Outcome='Success' } -Tags 'SECURITY'

Because -MessageData stores an object, not just text, you can later parse and filter by properties and tags without regex brittleness.
JSON lines for log files
When you redirect the Information stream to a file, many hosts format output as strings. If you want log files that are line-delimited JSON (great for Splunk/ELK/OpenSearch), serialize the object yourself:
function Get-ProjectStatsJsonLog {
[CmdletBinding()]
param([string[]]$Names)
foreach ($n in $Names) {
$start = @{ Event='Start'; Name=$n; Time=Get-Date }
Write-Information ($start | ConvertTo-Json -Compress) -Tags 'STEP','json'
Start-Sleep -Milliseconds (Get-Random -Minimum 50 -Maximum 120)
[pscustomobject]@{ Name=$n; Length=$n.Length }
$end = @{ Event='End'; Name=$n; Time=Get-Date }
Write-Information ($end | ConvertTo-Json -Compress) -Tags 'STEP','json'
}
}
Get-ProjectStatsJsonLog -Names $names -InformationAction Continue 6>> $log

Now run.log contains one JSON object per line. Most log shippers can tail and ingest this format with minimal configuration.
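To sanity-check the file locally before shipping it, you can parse it straight back into objects. A minimal sketch, assuming run.log contains only the JSON lines produced above:

# Parse the JSON-lines log back into objects for ad-hoc queries
Get-Content $log |
    ForEach-Object { ConvertFrom-Json -InputObject $_ } |
    Where-Object Event -eq 'End' |
    Format-Table Name, Time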
Capture information records in memory
If you want to post-process logs (aggregate, compute durations, redact) before writing, capture the Information stream with -InformationVariable:
$iv = @()
Get-ProjectStats -Names $names -InformationAction Continue -InformationVariable iv | Out-Null
# Inspect shape
$iv | Select-Object -First 1 | Format-List *
# Each record is an InformationRecord with TimeGenerated, Source, Tags, and MessageData properties
# Filter by tag and export structured JSON
$iv.Where({ $_.Tags -contains 'STEP' }) |
Select-Object TimeGenerated, Tags, MessageData |
ConvertTo-Json -Depth 5 |
Set-Content -Encoding UTF8 $log

Derive metrics from logs
Because you logged Start/End events with timestamps, you can compute durations without polluting stdout:
$steps = $iv.Where({ $_.Tags -contains 'STEP' })
$groups = $steps | Group-Object { $_.MessageData.Name }
$metrics = foreach ($g in $groups) {
$start = $g.Group | Where-Object { $_.MessageData.Event -eq 'Start' } | Select-Object -First 1
$end = $g.Group | Where-Object { $_.MessageData.Event -eq 'End' } | Select-Object -Last 1
if ($start -and $end) {
[pscustomobject]@{
Name = $g.Name
DurationMs = [int]([datetime]$end.MessageData.Time - [datetime]$start.MessageData.Time).TotalMilliseconds
}
}
}
$metrics | Sort-Object DurationMs | Format-Table

This keeps the pipeline pure: objects continue downstream, while you retain full, structured telemetry for analysis.
Production integration: CI/CD, containers, and safety
GitHub Actions
Emit structured logs to a file and upload as an artifact, while keeping stdout clean for consumers like jq, ConvertFrom-Json, or report generators.
- name: Run build with structured logs
shell: pwsh
run: |
$InformationPreference = 'Continue'
./build.ps1 6>> "$env:RUNNER_TEMP/build.log" | Out-Null
- name: Upload logs
uses: actions/upload-artifact@v4
with:
name: build-logs
path: ${{ runner.temp }}/build.log

Tip: You can emit JSON lines, as shown earlier, to make log ingestion trivial in ELK/OpenSearch.
Containers
When running in containers, mount a volume for logs and call PowerShell with Information redirection:
# Build-time: keep script small and deterministic
# Runtime: send info stream to a file under /logs
# Single quotes keep the calling shell from expanding $InformationPreference before pwsh sees it
docker run --rm -v "${PWD}/logs:/logs" mcr.microsoft.com/powershell pwsh -NoProfile -Command '
    $InformationPreference = "Continue"
    ./script.ps1 6>> /logs/run.log | Out-Null
'

In orchestrators (Kubernetes, Nomad), you can also log to stdout for collection by the runtime. If you prefer a single combined stream for centralized logging, temporarily merge streams with 6>&1 to send Information records into stdout:
./script.ps1 6>&1 | your-shipper

Note: Merging streams loses the separation benefit. Prefer distinct files or sidecars when possible.
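If you do merge, note that inside PowerShell the merged items are still typed: regular output objects flow alongside InformationRecord instances, so a downstream step can re-separate them. A minimal sketch (the routing below is illustrative, not a shipper integration):

./script.ps1 6>&1 | ForEach-Object {
    if ($_ -is [System.Management.Automation.InformationRecord]) {
        # Diagnostic record: append its payload to the log file
        Add-Content -Path './run.log' -Value $_.MessageData
    } else {
        $_  # data object: pass through to the next pipeline stage
    }
}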
Security, performance, and error handling best practices
- Do not log secrets. Redact tokens and passwords before logging. Tag sensitive areas so they can be double-checked by linters.
$tokenHash = (Get-FileHash -InputStream ([IO.MemoryStream]::new([Text.Encoding]::UTF8.GetBytes($token)))).Hash
Write-Information -MessageData @{ Event='Auth'; User=$env:USERNAME; Token='[REDACTED]'; TokenHash=$tokenHash } -Tags 'SECURITY'

- Prefer Write-Error for failures; reserve Write-Information for diagnostics. This allows try/catch with -ErrorAction Stop while keeping telemetry clean.
try {
Invoke-RestMethod -Uri $url -ErrorAction Stop
} catch {
Write-Error -ErrorRecord $_
Write-Information -MessageData @{ Event='ApiError'; Url=$url; Message=$_.Exception.Message } -Tags 'HTTP','ERROR'
throw # fail the pipeline when appropriate
}

- Control verbosity for speed. Logging has a cost. Expose a parameter to toggle it:
param([switch]$Log)
$ia = if ($Log) { 'Continue' } else { 'SilentlyContinue' }
Get-ProjectStats -Names $names -InformationAction $ia 6>> $log | Export-Csv stats.csv -NoTypeInformation

- Normalize timestamps to UTC (e.g., (Get-Date).ToUniversalTime()) for cross-region CI/CD analytics.
- Use tags consistently (e.g., 'STEP', 'PERF', 'IO', 'SECURITY') to enable simple, fast filtering without parsing.
- Rotate logs or use a shippable location; large log files can fill disks in ephemeral runners. A rotation sketch follows this list.
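A minimal rotation sketch, assuming a single log file; the 10MB threshold and archive naming are illustrative, and the timestamp is normalized to UTC as recommended above:

$log = './run.log'
if ((Test-Path $log) -and (Get-Item $log).Length -gt 10MB) {
    # Roll the current log aside with a UTC timestamp before the next run appends
    $stamp = (Get-Date).ToUniversalTime().ToString('yyyyMMddTHHmmssZ')
    Move-Item -Path $log -Destination "./run.$stamp.log"
}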
By adopting these patterns, you keep functional output clean and machine-readable, while building a robust, structured diagnostic trail you can filter, aggregate, and ship to any observability stack.
Strengthen your PowerShell logging today. For deeper patterns and recipes, explore the PowerShell Advanced Cookbook: https://www.amazon.com/PowerShell-Advanced-Cookbook-scripting-advanced-ebook/dp/B0D5CPP2CQ/