Faster Pipelines with .Where() and .ForEach() in PowerShell
You can make many everyday PowerShell tasks faster and clearer by leaning on the built-in .Where() and .ForEach() methods on collections. These methods remove much of the per-object pipeline overhead you get with Where-Object and ForEach-Object, while remaining highly readable and expressive. In this post, you'll learn when and how to use them, what the different modes mean (like First and Split), and how to apply them to real-world automation scenarios.
Why .Where() and .ForEach() beat the pipeline for in-memory work
When you run a pipeline like ... | Where-Object { ... } | ForEach-Object { ... }, PowerShell constructs a pipeline and invokes your scriptblocks per object. That flexibility is great, but it adds overhead for each item. If you already have your data in memory (arrays, lists, or the output of a command you've captured to a variable), using .Where() and .ForEach() methods keeps the operation inside the engine and avoids pipeline plumbing.
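As a quick illustration, here's the same filter-and-transform written both ways over a captured collection (Get-Process stands in for any in-memory data):

$procs = Get-Process

# Pipeline form: one scriptblock invocation per object, per stage
$names = $procs | Where-Object { $_.WorkingSet64 -gt 100MB } | ForEach-Object { $_.Name }

# Method form: same result, evaluated directly over the in-memory collection
$names = $procs.Where({ $_.WorkingSet64 -gt 100MB }).ForEach({ $_.Name })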
How they work
- .Where({ predicate }, [mode], [count]) filters a collection by a condition. It enumerates the collection in a single pass and supports useful modes like First, Last, Until, SkipUntil, and Split.
- .ForEach({ transform }) projects each input item to something else and returns the transformed sequence. It's similar in intent to ForEach-Object (or a Select-Object projection), but executes with less overhead on in-memory collections.
These methods are available in Windows PowerShell 4.0+ and all modern PowerShell (Core) versions. They operate over any enumerable collection (arrays, lists, results from cmdlets you've captured into a variable, etc.).
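A minimal sketch on a plain array; note that both methods return a collection (System.Collections.ObjectModel.Collection`1) rather than a bare array, so wrap the result in @() if you need array semantics:

$nums = 1..10

$evens   = $nums.Where({ $_ % 2 -eq 0 })   # 2, 4, 6, 8, 10
$squares = $nums.ForEach({ $_ * $_ })      # 1, 4, 9, ..., 100

$evens.GetType().FullName   # System.Collections.ObjectModel.Collection`1[...]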
Baseline example
Here's a typical cleanup/reporting task over log files. The Split mode partitions the collection into two arrays in a single pass, and .ForEach() cleanly formats the output.
$root = 'C:\Logs'
$cutoff = (Get-Date).AddDays(-30)
$files = Get-ChildItem -Path $root -File -Recurse
$old, $recent = $files.Where({ $_.LastWriteTime -lt $cutoff }, 'Split')
$old.ForEach({
'{0} {1} MB' -f $_.FullName, [math]::Round($_.Length/1MB, 2)
})

What you get: faster loops, lower CPU, clearer code, and predictable results.
Practical patterns: filter, transform, partition
1) Filter with intent: First, Last, Until, SkipUntil
Use mode to tell PowerShell exactly what you want and avoid unnecessary work:
# First N matching items (stops early)
$firstLarge = $files.Where({ $_.Length -gt 50MB }, 'First', 5)
# Last N matching items (walks the whole input but only keeps last N)
$lastErrors = (Get-Content 'C:\Logs\app.log').Where({ $_ -match 'ERROR' }, 'Last', 10)
# Until: take items until the condition becomes true
$linesUntilHeader = (Get-Content 'C:\data.txt').Where({ $_ -match '^#\s*HEADER' }, 'Until')
# SkipUntil: drop items until the condition becomes true, then take the rest
$afterMarker = (Get-Content 'C:\data.txt').Where({ $_ -match '^-- START --$' }, 'SkipUntil')These modes express intent and reduce work. For example, First returns early and avoids scanning the entire list when you only need a handful of matches.
2) Transform cleanly with .ForEach()
Project exactly what you need without pipeline overhead:
# Extract a few properties with a computed one
$summary = $files.
    Where({ $_.Extension -in '.log', '.txt' }).
    ForEach({
[pscustomobject]@{
Name = $_.Name
Path = $_.FullName
AgeDays = [int]((Get-Date) - $_.LastWriteTime).TotalDays
SizeKB = [math]::Round($_.Length/1KB, 1)
}
})

The [pscustomobject] projection produces structured results you can sort, export, or feed into other tooling.
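Because each element is a real object, the projection drops straight into standard tooling; for example (the output path is illustrative):

$summary |
    Sort-Object SizeKB -Descending |
    Export-Csv -Path '.\log-summary.csv' -NoTypeInformation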
3) Partition in one pass with Split
When you want both the matching and non-matching items, Split saves you an extra pass over the data:

# Partition the $summary objects from the previous example (the age threshold is illustrative)
$stale, $fresh = $summary.Where({ $_.AgeDays -gt 30 }, 'Split')

# Take action on each group
$stale.ForEach({
    # Example: archive or delete
    # Remove-Item -Path $_.Path -Force
    'Stale: {0} ({1} days)' -f $_.Path, $_.AgeDays
})
$fresh.ForEach({ 'Recent: {0}' -f $_.Path })
4) Real-world DevOps examples
- CI logs triage: Quickly pick top offenders by size and only show the first N lines that matter.
- Service health: Categorize and act on services without pipeline overhead.
- Inventory: Filter and shape asset data from CSV for reports and alerts.
# CI logs: keep only the largest 10 logs and show first 50 lines of each
$largestLogs = (Get-ChildItem 'C:\CI\logs' -File -Recurse).
    Where({ $_.Extension -eq '.log' }) |
    Sort-Object Length -Descending
$largestLogs = @($largestLogs)[0..([math]::Min(9, $largestLogs.Count-1))]
$largestLogs.ForEach({
'--- {0} ({1} MB) ---' -f $_.FullName, [math]::Round($_.Length/1MB,2)
(Get-Content $_.FullName -TotalCount 50) -join [Environment]::NewLine
})
# Services: fast categorization
$svc = Get-Service
$running, $stopped = $svc.Where({ $_.Status -eq 'Running' }, 'Split')
$stopped.ForEach({ 'Stopped: ' + $_.Name })
$running.ForEach({ 'Running: ' + $_.Name })
# Inventory: shape CSV into a compact schema
$assets = Import-Csv '.\assets.csv'
$servers = $assets.Where({ $_.Type -eq 'server' })
$projection = $servers.ForEach({
[pscustomobject]@{
Host = $_.Hostname
OS = $_.OS
CPU = [int]$_.CPU
MemoryGB = [int]$_.MemoryGB
}
})

Tips, benchmarks, and gotchas for production scripts
Measure your wins
Performance claims are workload-dependent. Use Measure-Command on a representative dataset.
$data = 1..5e5 | ForEach-Object { [pscustomobject]@{ N = $_; Odd = ($_ % 2) } }
# Pipeline form
Measure-Command {
$x = $data | Where-Object { $_.Odd } | ForEach-Object { $_.N * 2 }
}
# Method form
Measure-Command {
$x = $data.Where({ $_.Odd }).ForEach({ $_.N * 2 })
}

On large in-memory collections, the method form typically shows noticeable reductions in elapsed time and CPU due to lower dispatch overhead.
Stream vs. materialize: choose wisely
- Use .Where()/.ForEach() when your input is already in memory (e.g., you need to reuse it, sort it, or you fetched a bounded dataset).
- Prefer pipeline filters when the source produces a huge stream and you don't want to hold everything in memory. For example, prefer Get-WinEvent -FilterHashtable to prefilter at the source; then optionally use .Where() for fine-tuning on the smaller set (a sketch follows below).
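Here's a sketch of that pattern; the log name, level, and provider filter are illustrative:

# Prefilter at the source: only Error-level (2) events from the last day
$events = Get-WinEvent -FilterHashtable @{
    LogName   = 'System'
    Level     = 2
    StartTime = (Get-Date).AddDays(-1)
}

# Fine-tune the now-bounded set in memory
$diskEvents = $events.Where({ $_.ProviderName -like '*disk*' })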
Combine with smarter sources
Always filter as early as possible. The faster pattern often looks like: limit at the source, then fine-tune in memory.
# Good: narrow at source, then refine
$files = Get-ChildItem 'C:\Logs' -File -Recurse -Filter '*.log'
$recentErrors = $files.
    Where({ $_.LastWriteTime -gt (Get-Date).AddDays(-7) }).
    ForEach({ Select-String -Path $_.FullName -Pattern 'ERROR' -SimpleMatch })

Readability techniques
- Chain sparingly: Two or three method calls read well. If you have many steps, split to variables with meaningful names.
- Name your data: for complex expressions, capture $_ in a named variable at the top of the scriptblock. These methods pass the current item only as $_, not as a bound parameter, so the param() form won't receive it.
$recentInfo = $files.
    Where({ $_.Extension -eq '.log' }).
    Where({ $f = $_; $f.LastWriteTime -gt (Get-Date).AddDays(-3) })

Error handling
Wrap risky operations inside the transform block and emit structured results.
$results = $files.Where({ $_.Extension -eq '.json' }).ForEach({
    $file = $_  # capture the item: inside catch, $_ becomes the error record
    try {
        $obj = Get-Content $file.FullName -Raw | ConvertFrom-Json
        [pscustomobject]@{ Path = $file.FullName; Ok = $true; Items = $obj.Count }
    }
    catch {
        [pscustomobject]@{ Path = $file.FullName; Ok = $false; Error = $_.Exception.Message }
    }
})

Migrate existing scripts safely
- Replace a simple | Where-Object with .Where() only when the data is already collected in memory or comfortably bounded (see the sketch after this list).
- Use First/Last/Split to convey intent and reduce extra passes.
- Benchmark on your real data. Keep the clearer version if performance is similar; otherwise choose the faster one.
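A minimal before/after sketch of that kind of migration (the size threshold is arbitrary):

# Before: pipeline filter over data already captured in a variable
$big = $files | Where-Object { $_.Length -gt 100MB }

# After: same intent in method form, without per-object pipeline dispatch
$big = $files.Where({ $_.Length -gt 100MB })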
End-to-end example: logs by age with sized summary
$root = 'C:\Logs'
$cutoff = (Get-Date).AddDays(-30)
# Enumerate once
$files = Get-ChildItem -Path $root -File -Recurse
# Partition in a single pass
$old, $recent = $files.Where({ $_.LastWriteTime -lt $cutoff }, 'Split')
# Shape output for reporting
$report = $old.ForEach({
[pscustomobject]@{
Path = $_.FullName
AgeDays = [int]((Get-Date) - $_.LastWriteTime).TotalDays
SizeMB = [math]::Round($_.Length/1MB, 2)
}
}) | Sort-Object SizeMB -Descending
$top10 = @($report)[0..([math]::Min(9, $report.Count-1))]
$top10.ForEach({ '{0} {1} MB ({2} days)' -f $_.Path, $_.SizeMB, $_.AgeDays })

You'll notice how few passes you take over the data and how precise the intent is at each step.
Build practical performance habits
- Filter early and exactly (source filters, then .Where()).
- Prefer .Where()/.ForEach() for in-memory transforms.
- Use modes like First and Split to stop early or partition in one pass.
- Benchmark with Measure-Command on your data.
- Keep transformations small and readable; extract bigger logic into functions.
If you want more advanced patterns, explore community resources and cookbooks for production-grade PowerShell practices.
Hashtags: #PowerShell #Performance #Scripting #BestPractices #Automation