Non-Disruptive Pipeline Logging in PowerShell with Tee-Object and NDJSON
Logging every object that flows through a PowerShell pipeline is incredibly useful for debugging, auditing, and reproducibility. But logging should never change the behavior of your pipeline. In this post, you'll learn a clean, non-disruptive pattern that tees compact JSON lines (NDJSON) to a file and then continues processing by rehydrating those lines back into objects, keeping your pipeline pure, predictable, and debuggable.
Here's the core pattern in action, which logs each object as a single compressed JSON line, appends safely, and then converts the line back into an object for downstream commands:
$log = Join-Path -Path (Get-Location) -ChildPath 'files.ndjson'
$cutoffMB = 10
Get-ChildItem -Path . -File -Recurse -ErrorAction Stop |
Select-Object FullName, Length, LastWriteTime |
ForEach-Object { $_ | ConvertTo-Json -Compress } |
Tee-Object -FilePath $log -Append |
ForEach-Object { $_ | ConvertFrom-Json } |
Where-Object { $_.Length -gt ($cutoffMB * 1MB) } |
Sort-Object Length -Descending |
Select-Object -First 5 @{N='SizeMB';E={[math]::Round($_.Length/1MB,2)}}, FullName

What you get: clean, line-delimited logs for every object, intact downstream behavior via ConvertFrom-Json, and a predictable output shape for analysis and replay.
Why NDJSON + Tee-Object Keeps Pipelines Pure
The pattern at a glance
- Shape the data: Use Select-Object to choose exactly what you want to log and process. This avoids logging secrets or overly complex types.
- Serialize per object: ConvertTo-Json -Compress turns each object into a single-line JSON string (NDJSON). Newlines in property values are escaped, so each object remains one line.
- Tee to file, pass through unchanged: Tee-Object writes the JSON line to disk and passes the same line down the pipeline. Your pipeline's control flow remains unchanged.
- Rehydrate: ConvertFrom-Json turns each line back into a PowerShell object ([pscustomobject]) so the rest of your commands work as usual.
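Condensed, the four steps look like this. This is a minimal sketch using Get-Process as a stand-in data source; the log path under the temp directory is illustrative:

```powershell
# Minimal sketch of the four-step pattern: shape, serialize, tee, rehydrate.
# The log path is illustrative; any writable location works.
$log = Join-Path ([IO.Path]::GetTempPath()) 'demo.ndjson'

Get-Process -Id $PID |
Select-Object Name, Id |                             # 1. shape: log only what you need
ForEach-Object { $_ | ConvertTo-Json -Compress } |   # 2. serialize: one JSON line per object
Tee-Object -FilePath $log -Append |                  # 3. tee: write the line, pass it through
ForEach-Object { $_ | ConvertFrom-Json } |           # 4. rehydrate: back to objects
Select-Object -ExpandProperty Name
```

Everything downstream of step 4 sees ordinary objects again, exactly as if the tee were not there.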
Predictability and replayability
Because every object is serialized to a single line, you can:
- Tail and grep using standard tools (tail, Select-String, rg) without parsing multi-line JSON.
- Replay a run by feeding the log back: Get-Content files.ndjson | ForEach-Object { $_ | ConvertFrom-Json }.
- Analyze later with batch processing: use Get-Content -ReadCount 1000 to chunk lines for higher throughput.
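As a concrete example of the grep-then-rehydrate workflow, here is a self-contained sketch that builds a tiny NDJSON file, greps it with Select-String, and rehydrates only the matching lines (the file name and sample records are illustrative):

```powershell
# Build a small sample NDJSON log, then grep and rehydrate matches.
$log = Join-Path ([IO.Path]::GetTempPath()) 'grep-demo.ndjson'
@(
    [pscustomobject]@{ FullName = 'a.log'; Length = 10 }
    [pscustomobject]@{ FullName = 'b.txt'; Length = 20 }
) | ForEach-Object { $_ | ConvertTo-Json -Compress } |
Set-Content -Path $log

# One object per line means ordinary text tools work: match the .txt entry,
# then turn only the hits back into objects.
Select-String -Path $log -Pattern '\.txt' |
ForEach-Object { $_.Line | ConvertFrom-Json }
```

Because each object is exactly one line, the match object's Line property is always a complete, parseable JSON document.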
Data shape and type fidelity
Note that JSON round-trips the data, not the original .NET types. That's intentional here. In the sample, you project file info into simple properties (FullName, Length, LastWriteTime), then deserialize into pscustomobject. If you need nested properties, increase depth: ConvertTo-Json -Compress -Depth 5.
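You can see both effects in a few lines. This sketch shows that rehydration always yields pscustomobject, and that nesting beyond ConvertTo-Json's default depth of 2 is flattened unless you raise -Depth:

```powershell
# 1. Round-tripping data, not types: everything comes back as pscustomobject.
$round = [pscustomobject]@{ Name = 'x'; Length = 123 } |
    ConvertTo-Json -Compress | ConvertFrom-Json
$round.GetType().Name            # PSCustomObject, whatever the input type was

# 2. Default -Depth is 2, so deeper nesting is flattened to strings;
#    raising -Depth preserves the full structure.
$nested = [pscustomobject]@{ a = @{ b = @{ c = 1 } } }
($nested | ConvertTo-Json -Compress -Depth 5 | ConvertFrom-Json).a.b.c   # 1
```

If you need behavior from the original type (say, FileInfo methods), apply it before serialization; after rehydration only the projected data remains.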
Production-Ready Pattern: Rotation, Encoding, and Reuse
Log rotation: keep files small and safe to ship
NDJSON files can grow fast. Rotate proactively to keep logs manageable and keep tailing and shipping them fast:
function Rotate-NdjsonLog {
param(
[Parameter(Mandatory)][string]$Path,
[long]$MaxBytes = 50MB
)
if (Test-Path -LiteralPath $Path) {
$len = (Get-Item -LiteralPath $Path).Length
if ($len -ge $MaxBytes) {
$ts = Get-Date -Format 'yyyyMMdd_HHmmss'
$base = [IO.Path]::ChangeExtension($Path, $null)
$archive = "${base}.${ts}.ndjson"
Move-Item -LiteralPath $Path -Destination $archive -Force
}
} else {
$dir = Split-Path -Path $Path -Parent
if ($dir -and -not (Test-Path -LiteralPath $dir)) {
New-Item -ItemType Directory -Path $dir | Out-Null
}
New-Item -ItemType File -Path $Path -Force | Out-Null
}
}

Invoke this before your pipeline starts:
$log = Join-Path -Path (Get-Location) -ChildPath 'files.ndjson'
Rotate-NdjsonLog -Path $log -MaxBytes 50MB

Controlling encoding for cross-platform tooling
For consistent interop, prefer UTF-8 without BOM. In PowerShell 7+, Tee-Object supports -Encoding:
# PowerShell 7+
... |
Tee-Object -FilePath $log -Append -Encoding utf8NoBOM |
ForEach-Object { $_ | ConvertFrom-Json } | ...

In Windows PowerShell 5.1, Tee-Object does not expose -Encoding (the parameter arrived in PowerShell 6); consider running PowerShell 7+ for consistent encoding. If you must stay on 5.1, be aware that default encodings differ and validate that your downstream tools can read them.
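If you are stuck on 5.1 and need BOM-less UTF-8, one workaround is to do the file write yourself with the .NET File API and pass the line through manually, mimicking Tee-Object. This is a sketch, not a drop-in replacement, and the log path is illustrative:

```powershell
# Windows PowerShell 5.1 sketch: append BOM-less UTF-8 manually, then pass
# the same line downstream, imitating Tee-Object's pass-through behavior.
$log = Join-Path ([IO.Path]::GetTempPath()) 'bom-demo.ndjson'

Get-Process -Id $PID |
Select-Object Name, Id |
ForEach-Object { $_ | ConvertTo-Json -Compress } |
ForEach-Object {
    # UTF8Encoding($false) means "no BOM"; AppendAllText creates the file if needed.
    [IO.File]::AppendAllText($log, $_ + [Environment]::NewLine,
        [Text.UTF8Encoding]::new($false))
    $_    # emit the unchanged line, as Tee-Object would
} |
ForEach-Object { $_ | ConvertFrom-Json }
```

The extra ForEach-Object stage keeps the pipeline contract identical: one JSON line in, the same JSON line out, with the write as a side effect.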
Reusable helper for NDJSON tee
If you find yourself repeating the pattern, factor it into a helper that serializes each incoming object, tees the JSON line to disk, and rehydrates it for the next stage:
function Invoke-NdjsonTee {
    param(
        [Parameter(Mandatory)][string]$Path,
        [int]$Depth = 5,
        [switch]$Utf8NoBom
    )
    process {
        # In a function's process block, $_ is the current pipeline object.
        $json = $_ | ConvertTo-Json -Compress -Depth $Depth
        $teeArgs = @{ FilePath = $Path; Append = $true }
        if ($Utf8NoBom) { $teeArgs['Encoding'] = 'utf8NoBOM' }   # PowerShell 7+
        $json | Tee-Object @teeArgs | ForEach-Object { $_ | ConvertFrom-Json }
    }
}

Usage:
$log = Join-Path (Get-Location) 'files.ndjson'
Rotate-NdjsonLog -Path $log -MaxBytes 50MB
Get-ChildItem -Path . -File -Recurse -ErrorAction Stop |
Select-Object FullName, Length, LastWriteTime |
Invoke-NdjsonTee -Path $log -Depth 3 -Utf8NoBom |
Where-Object { $_.Length -gt 10MB } |
Sort-Object Length -Descending |
Select-Object -First 5 @{N='SizeMB';E={[math]::Round($_.Length/1MB,2)}}, FullName

Debugging, Observability, and Real-World Tips
Capture to a variable and to disk simultaneously
Tee-Object can also store the exact lines to a variable for immediate inspection:
$rawLines = @()
Get-Process |
Select-Object Name, Id, CPU |
ForEach-Object { $_ | ConvertTo-Json -Compress } |
Tee-Object -FilePath $log -Append -Variable rawLines |
ForEach-Object { $_ | ConvertFrom-Json } |
Where-Object CPU
$rawLines | Select-Object -First 3

Read NDJSON back efficiently
Process large logs in chunks to reduce overhead:
Get-Content -Path $log -ReadCount 1000 |
ForEach-Object { $_ | ConvertFrom-Json } |
Group-Object { [IO.Path]::GetExtension($_.FullName) } |
Sort-Object Count -Descending |
Select-Object -First 10 Name, Count

Note that the logged objects carry only FullName, Length, and LastWriteTime, so the extension is derived from FullName at grouping time rather than read from a property that was never logged.

Performance considerations
- Serialization cost: ConvertTo-Json per object is CPU work. Keep your projection lean and set -Depth only as high as needed.
- Streaming advantages: The pipeline remains streaming-friendly: you serialize, write, and rehydrate one object at a time, avoiding large in-memory buffers.
- Batch rehydration: When post-processing logs, use -ReadCount to batch lines and improve throughput.
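To get a rough feel for the batching effect on your own machine, you can time line-at-a-time reads against -ReadCount on a synthetic log; absolute numbers will vary, so this is a measurement sketch rather than a benchmark:

```powershell
# Build a synthetic 5,000-line NDJSON file, then compare read strategies.
$log = Join-Path ([IO.Path]::GetTempPath()) 'perf-demo.ndjson'
1..5000 |
ForEach-Object { [pscustomobject]@{ n = $_ } | ConvertTo-Json -Compress } |
Set-Content -Path $log

$oneAtATime = Measure-Command {
    Get-Content -Path $log | ForEach-Object { $_ | ConvertFrom-Json } | Out-Null
}
$batched = Measure-Command {
    # -ReadCount emits arrays of 1,000 lines; piping the array still
    # converts each line individually, but with far fewer pipeline hops.
    Get-Content -Path $log -ReadCount 1000 |
        ForEach-Object { $_ | ConvertFrom-Json } | Out-Null
}
'{0:N0} ms vs {1:N0} ms' -f $oneAtATime.TotalMilliseconds, $batched.TotalMilliseconds
```

Measure with your real log shapes before tuning; the win from -ReadCount grows with line count.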
Security and privacy
- Don't log secrets: Place Select-Object before logging to exclude tokens, passwords, or PII.
- Redact where needed: Map sensitive fields to fixed markers (e.g., "token":"REDACTED").
- Access control: Store logs in directories with restricted ACLs; rotate and archive to secured storage.
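A calculated property on Select-Object is a simple way to redact before anything touches disk. In this sketch, Token is a hypothetical sensitive field:

```powershell
# Redact a sensitive field at the Select-Object stage, before serialization,
# so the secret never reaches the NDJSON log.
$records = @(
    [pscustomobject]@{ User = 'alice'; Token = 'super-secret' }
)

$records |
Select-Object User, @{ N = 'Token'; E = { 'REDACTED' } } |
ForEach-Object { $_ | ConvertTo-Json -Compress }
```

Because redaction happens in the projection step, the downstream pipeline and the log see exactly the same sanitized shape.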
Encoding, line endings, and tooling
- UTF-8 without BOM is the safest for cross-platform parsing. Prefer -Encoding utf8NoBOM where available.
- Line endings: Tee-Object writes one line per input item, so your NDJSON stays one-object-per-line, which works with tailing tools.
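You can verify that a log really is BOM-free by inspecting its first bytes (EF BB BF is the UTF-8 BOM). The file written here is just a throwaway sample:

```powershell
# Write a sample file without a BOM (PowerShell 7+), then check its first bytes.
$log = Join-Path ([IO.Path]::GetTempPath()) 'bomcheck-demo.ndjson'
'{"ok":true}' | Out-File -FilePath $log -Encoding utf8NoBOM

$bytes  = [IO.File]::ReadAllBytes($log) | Select-Object -First 3
$hasBom = $bytes.Count -ge 3 -and
          $bytes[0] -eq 0xEF -and $bytes[1] -eq 0xBB -and $bytes[2] -eq 0xBF
"BOM present: $hasBom"
```

Running the same check against logs produced by your real pipeline is a quick sanity test before handing files to non-PowerShell tooling.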
Concurrency and file contention
- One writer per file: Multiple concurrent writers can interleave lines unpredictably. Prefer per-run filenames (timestamped) or a simple queue (one pipeline writes; others hand off).
- Atomicity: Renames during rotation are atomic on most filesystems, but coordinate rotation so you don't rotate while other jobs are appending.
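Per-run filenames sidestep contention entirely; a timestamp plus the process ID is usually unique enough. The naming scheme here is only a suggestion:

```powershell
# One writer per file: derive a per-run log name from a timestamp and the PID,
# so concurrent runs never append to the same file.
$stamp = Get-Date -Format 'yyyyMMdd_HHmmss'
$log   = Join-Path ([IO.Path]::GetTempPath()) ("files_{0}_{1}.ndjson" -f $stamp, $PID)
$log
```

Downstream analysis can still treat the runs as one stream: Get-Content accepts a wildcard such as files_*.ndjson.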
Round-tripping caveats
- Type loss: JSON rehydrates to pscustomobject, not the original .NET types. If you need the original methods or properties, keep that logic before serialization or map explicitly after rehydration.
- Depth and enums: Use -Depth for nested objects; in PowerShell 7+, -EnumsAsStrings can improve readability if you log enums.
- DateTimes: PowerShell JSON uses ISO 8601; be explicit about time zones where it matters.
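The enum caveat is easy to demonstrate. By default PowerShell 7+ serializes enums as their numeric values; -EnumsAsStrings swaps in the names, which is far friendlier for grepping logs:

```powershell
# PowerShell 7+: enums serialize as numbers by default; -EnumsAsStrings
# emits their names instead. DateTimes serialize as ISO 8601 strings.
$rec = [pscustomobject]@{
    When = [datetime]'2024-01-02T03:04:05'
    Kind = [System.IO.FileAttributes]::ReadOnly
}

$rec | ConvertTo-Json -Compress                     # Kind appears as a number
$rec | ConvertTo-Json -Compress -EnumsAsStrings     # Kind appears as "ReadOnly"
```

If you search logs by enum name, turn -EnumsAsStrings on at write time; converting numbers back to names after the fact means re-mapping every value.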
End-to-End Example and Use Cases
Inventory large files (top 5)
The opening snippet is a practical inventory that logs every file scanned, then filters and reports the top 5 by size. The NDJSON log lets you audit what was scanned and re-run analysis later without rescanning the disk.
API processing with replay
$log = Join-Path (Get-Location) 'responses.ndjson'
Rotate-NdjsonLog -Path $log -MaxBytes 100MB
1..5 |
ForEach-Object { Invoke-RestMethod "https://api.example.com/items?page=$_" -ErrorAction Stop } |
Select-Object -ExpandProperty items |
ForEach-Object { $_ | ConvertTo-Json -Compress -Depth 5 } |
Tee-Object -FilePath $log -Append -Encoding utf8NoBOM |
ForEach-Object { $_ | ConvertFrom-Json } |
Where-Object { $_.status -eq 'active' } |
Select-Object id, name, status

Later, replay without hitting the API:
Get-Content $log | ForEach-Object { $_ | ConvertFrom-Json } |
Where-Object { $_.status -eq 'active' } |
Select-Object id, name, status

CI/CD transforms with traceability
When transforming configuration (e.g., YAML or JSON) during builds, log each intermediate object as NDJSON to reproduce any environment-specific issues locally. Since the pipeline remains pure, you can run the same steps with or without logging enabled.
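One way to keep the same steps runnable with or without logging is a small conditional tee stage. Invoke-MaybeTee is a hypothetical helper name, and the toggle-by-path design is just one option:

```powershell
# Hypothetical helper: tee NDJSON lines only when a log path is supplied;
# otherwise pass objects through untouched, so both modes share one pipeline.
function Invoke-MaybeTee {
    param([string]$Path)
    process {
        if ($Path) {
            $_ | ConvertTo-Json -Compress |
                Tee-Object -FilePath $Path -Append |
                ForEach-Object { $_ | ConvertFrom-Json }
        } else {
            $_    # logging disabled: no serialization cost at all
        }
    }
}

# Same pipeline, logging toggled by a single argument:
$quiet  = [pscustomobject]@{ n = 1 } | Invoke-MaybeTee
$logged = [pscustomobject]@{ n = 1 } |
    Invoke-MaybeTee -Path (Join-Path ([IO.Path]::GetTempPath()) 'ci-demo.ndjson')
```

Because the disabled branch skips serialization entirely, and the enabled branch rehydrates, both branches hand identical data shapes to the next stage.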
Wrap-Up
By combining ConvertTo-Json -Compress, Tee-Object -Append, and ConvertFrom-Json, you get non-disruptive, line-oriented logging that preserves pipeline semantics. Add rotation, pick a sane encoding, and you've got a production-friendly pattern that's easy to tail, replay, and analyze.
Further reading: PowerShell Advanced Cookbook.
#PowerShell #TeeObject #Logging #Pipelines #Scripting #Automation #PowerShellCookbook