
MoppleIT Tech Blog


Pipeline-Friendly PowerShell: ValueFromPipeline and ValueFromPipelineByPropertyName Done Right

You write PowerShell to automate real work, and that work rarely lives in a single command. The best functions compose cleanly: they stream, they accept objects from other tools, and they return structured objects for the next step. In this post, you'll learn how to design pipeline-friendly commands using ValueFromPipeline and ValueFromPipelineByPropertyName, plus Alias for common field names, all wrapped in the Begin/Process/End pattern so your tools stay fast and memory-light.

Why Pipeline-Friendly Commands Matter

Composability over glue code

When parameters bind directly from pipeline values or object properties, you skip adapter scripts. That means fewer temporary variables, fewer foreach loops, and fewer brittle Select-Object renames. Your functions can flow together naturally:

Import-Csv users.csv | Get-UserInfo | Where-Object Found | Set-UserStatus -Disabled

Streaming and memory efficiency

The Begin/Process/End blocks let you initialize resources once, handle one input at a time, and dispose at the end. You avoid collecting everything into arrays, which keeps memory low and lets you handle large inputs gracefully.
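A minimal skeleton makes the lifecycle concrete (ConvertTo-Upper is a hypothetical name, chosen only to show where each block runs):

```powershell
function ConvertTo-Upper {
  [CmdletBinding()]
  param(
    [Parameter(Mandatory=$true, ValueFromPipeline=$true)]
    [string]$Text
  )
  Begin   { $count = 0 }                  # runs once, before the first item arrives
  Process { $count++; $Text.ToUpper() }   # runs once per pipeline item; output streams immediately
  End     { Write-Verbose "Processed $count items" }  # runs once, after the last item
}

# 'alpha','beta' | ConvertTo-Upper   # emits ALPHA, then BETA, one at a time
```

Each result is emitted as soon as its item is processed, so downstream commands start working before the upstream source is exhausted.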

Predictable, discoverable inputs

With ValueFromPipelineByPropertyName and helpful [Alias()] attributes, your function reads properties like UPN, SamAccountName, or Id without extra mapping. That's a big win when integrating with AD, Azure, REST APIs, or CSVs that don't agree on field names.

Build a Pipeline-Friendly Function

Core pattern

Start with an advanced function that accepts input by value and by property name. Use aliases for common field names and implement Begin/Process/End for streaming:

function Get-UserInfo {
  [CmdletBinding()]
  param(
    [Parameter(Mandatory=$true, ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true, Position=0)]
    [Alias('User','UPN','SamAccountName')]
    [ValidateNotNullOrEmpty()]
    [string]$UserName
  )
  Begin {
    Write-Verbose 'Initializing user lookup'
  }
  Process {
    try {
      $id = $UserName.ToLower()
      # Simulate lookup here. Replace with AD/Graph/API call in real code.
      [pscustomobject]@{
        UserName = $UserName
        Id       = $id
        Found    = $true
      }
    } catch {
      Write-Warning ("Lookup failed for {0}: {1}" -f $UserName, $_.Exception.Message)
    }
  }
  End {
    Write-Verbose 'Lookup complete'
  }
}

# Examples
# By value (strings bind to -UserName)
# 'alice','bob' | Get-UserInfo

# By property name via aliases (UPN / SamAccountName map to -UserName)
# @(
#   [pscustomobject]@{ UPN='carol@example.com' },
#   [pscustomobject]@{ SamAccountName='dave' }
# ) | Get-UserInfo

# From CSV (column named UPN or SamAccountName binds automatically)
# Import-Csv users.csv | Get-UserInfo

Notes:

  • ValueFromPipeline binds direct pipeline values (e.g., strings like "alice") to $UserName.
  • ValueFromPipelineByPropertyName inspects objects for matching properties. [Alias()] expands the match set so UPN or SamAccountName bind to -UserName without Select-Object renames.
  • Begin/Process/End enables streaming and resource reuse. Use Begin to create clients/connections, Process for per-item logic, and End to dispose.

Accepting multiple identifiers without adapters

If your function can resolve users by any of several fields, bind by property name on multiple parameters and choose the best available at runtime:

function Resolve-User {
  [CmdletBinding()]
  param(
    [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true, Position=0)]
    [Alias('User','UserName','SamAccountName','UPN')]
    [string]$Identity,

    [Parameter(ValueFromPipelineByPropertyName=$true)]
    [Alias('Mail','EmailAddress')]
    [string]$Email,

    [Parameter(ValueFromPipelineByPropertyName=$true)]
    [Alias('ObjectId','Guid','Id')]
    [string]$UserId
  )
  Begin { }
  Process {
    $key = $null
    if ($PSBoundParameters.ContainsKey('Identity')) { $key = $Identity }
    elseif ($PSBoundParameters.ContainsKey('Email')) { $key = $Email }
    elseif ($PSBoundParameters.ContainsKey('UserId')) { $key = $UserId }

    if (-not $key) {
      Write-Error 'No usable identity on input. Provide Identity/UserName/UPN, Email/Mail, or UserId/Id.'
      return
    }

    # Do the lookup
    [pscustomobject]@{ Key=$key; Source=(($PSBoundParameters.Keys -join ',')); Resolved=$true }
  }
}

# Works with different shapes without glue code
# [pscustomobject]@{ UPN='eve@contoso.com' } | Resolve-User
# [pscustomobject]@{ Mail='frank@contoso.com' } | Resolve-User
# [pscustomobject]@{ Id='f47ac10b-58cc-4372-a567-0e02b2c3d479' } | Resolve-User

Tip: Avoid marking multiple positional parameters with ValueFromPipeline=$true unless you separate them into distinct parameter sets. Binding the same input to two parameters can create ambiguous sets. Pick a single primary pipeline parameter (-Identity above), then add additional ValueFromPipelineByPropertyName parameters as optional fallbacks.
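A sketch of the parameter-set approach (Get-Thing and its parameters are hypothetical names): each set owns its own pipeline input, so a bare string and an object with an Id property resolve to different sets without ambiguity.

```powershell
function Get-Thing {
  [CmdletBinding(DefaultParameterSetName='ByName')]
  param(
    # Primary pipeline parameter for the ByName scenario
    [Parameter(ParameterSetName='ByName', Mandatory=$true, ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)]
    [string]$Name,

    # ById binds only by property name, so it never competes for raw pipeline values
    [Parameter(ParameterSetName='ById', Mandatory=$true, ValueFromPipelineByPropertyName=$true)]
    [Alias('Id')]
    [guid]$ThingId
  )
  Process {
    switch ($PSCmdlet.ParameterSetName) {
      'ByName' { [pscustomobject]@{ Set = 'ByName'; Key = $Name } }
      'ById'   { [pscustomobject]@{ Set = 'ById';   Key = $ThingId } }
    }
  }
}

# 'widget' | Get-Thing                                   # resolves to ByName
# [pscustomobject]@{ Id = [guid]::NewGuid() } | Get-Thing  # resolves to ById
```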

Streaming with external resources

Initialize clients once and reuse them per item:

function Get-Widget {
  [CmdletBinding()] 
  param(
    [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true, Position=0)]
    [Alias('Id')]
    [string]$WidgetId
  )
  Begin {
    $baseUri = 'https://api.example.com'
    $client = [System.Net.Http.HttpClient]::new()
    $client.BaseAddress = [Uri]$baseUri
  }
  Process {
    try {
      $resp = $client.GetAsync("/widgets/$WidgetId").GetAwaiter().GetResult()
      if ($resp.IsSuccessStatusCode) {
        $json = $resp.Content.ReadAsStringAsync().GetAwaiter().GetResult()
        $obj = $json | ConvertFrom-Json
        $obj
      } else {
        Write-Error ("API returned {0} for {1}" -f [int]$resp.StatusCode, $WidgetId)
      }
    } catch {
      Write-Error ("Request failed for {0}: {1}" -f $WidgetId, $_.Exception.Message)
    }
  }
  End {
    $client.Dispose()
  }
}

# Example
# (1..3) | ForEach-Object { [pscustomobject]@{ Id = $_ } } | Get-Widget

Patterns for Production-Ready Pipelines

Support common field names with Alias

  • Directory: [Alias('UPN','SamAccountName','UserPrincipalName')]
  • API IDs: [Alias('Id','ObjectId','ResourceId','Guid')]
  • Email: [Alias('Mail','EmailAddress')]
  • Names: [Alias('Name','DisplayName')]

These aliases let your function accept objects from AD, AzureAD, Exchange, Graph, custom REST APIs, and CSV exports without pre-processing.

Emit objects, not text

Return structured [pscustomobject] results so callers can filter, sort, and export easily. Avoid Write-Host. Use Write-Verbose and Write-Debug for diagnostics.
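The difference is easy to see side by side (function names and the hard-coded values are illustrative only):

```powershell
# Text output dead-ends the pipeline; objects keep it composable.
function Get-DiskReportBad {
  Write-Host "C: 42 GB free"   # paints the screen; nothing downstream can filter or export this
}

function Get-DiskReportGood {
  [pscustomobject]@{ Drive = 'C'; FreeGB = 42 }   # Where-Object, Sort-Object, Export-Csv all just work
}

# (Get-DiskReportGood).FreeGB   # 42
# Get-DiskReportGood | Where-Object FreeGB -lt 50 | Export-Csv report.csv
```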

Handle errors predictably

  • Use Write-Error for failures so callers can control behavior with -ErrorAction and -ErrorVariable.
  • For recoverable per-item issues, catch and continue; for unrecoverable initialization problems, throw in Begin.
  • Include the identity in error messages to aid triage.
try {
  # lookup...
} catch {
  Write-Error -Category InvalidOperation -TargetObject $UserName -Message (
    "Lookup failed for '{0}': {1}" -f $UserName, $_.Exception.Message)
}

Use ShouldProcess for changes

Any command that modifies state should support -WhatIf/-Confirm via SupportsShouldProcess:

function Disable-User {
  [CmdletBinding(SupportsShouldProcess=$true, ConfirmImpact='Medium')]
  param(
    [Parameter(Mandatory=$true, ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)]
    [Alias('User','SamAccountName','UPN')]
    [string]$Identity
  )
  Process {
    if ($PSCmdlet.ShouldProcess($Identity, 'Disable account')) {
      # Call directory/API here
      [pscustomobject]@{ Identity=$Identity; Disabled=$true }
    }
  }
}

Performance tips

  • Don't accumulate: Avoid $acc += $item in a loop; just Write-Output or return each item from Process.
  • Reuse clients: Create HTTP/DB clients in Begin, reuse in Process, dispose in End.
  • Batch when possible: If the backend supports batch APIs, collect a small window (e.g., 50-200 items) then submit in chunks while still streaming results.
  • Type validation: Narrow parameter types (e.g., [string], [int], [guid]) to avoid expensive conversions and surprise bindings.
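The batching tip can be sketched like this, still inside Begin/Process/End (Send-Item and Submit-Batch are hypothetical; a real Submit-Batch would call the backend's batch API, and the window would be larger than 3):

```powershell
function Send-Item {
  [CmdletBinding()]
  param(
    [Parameter(Mandatory=$true, ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)]
    [Alias('Id')]
    [string]$ItemId,

    [int]$BatchSize = 3
  )
  Begin {
    $buffer = [System.Collections.Generic.List[string]]::new()
    function Submit-Batch([string[]]$Items) {
      # Placeholder: a real implementation would POST $Items in one request.
      [pscustomobject]@{ Submitted = $Items.Count; Items = $Items }
    }
  }
  Process {
    $buffer.Add($ItemId)
    if ($buffer.Count -ge $BatchSize) {   # flush a full window, keep streaming results
      Submit-Batch $buffer.ToArray()
      $buffer.Clear()
    }
  }
  End {
    if ($buffer.Count -gt 0) { Submit-Batch $buffer.ToArray() }   # flush the remainder
  }
}

# (1..7) | ForEach-Object { "$_" } | Send-Item   # emits batches of 3, 3, then 1
```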

Testing and CI

  • Pester: Unit test binding paths (by value, by property name, via each alias) and error behavior.
  • PSScriptAnalyzer: Lint for common issues (Invoke-ScriptAnalyzer).
  • Contract tests: Validate output shape with [OutputType()] and tests that ensure consistent properties for downstream tools.
# Pester snippet
Describe 'Get-UserInfo' {
  It 'binds by value' {
    ('alice' | Get-UserInfo).UserName | Should -Be 'alice'
  }
  It 'binds by property name via alias' {
    ([pscustomobject]@{ UPN='bob@contoso.com' } | Get-UserInfo).UserName | Should -Be 'bob@contoso.com'
  }
}

Common pitfalls (and fixes)

  • Ambiguous parameter sets: If two parameters accept the same type from the pipeline, PowerShell may not resolve a set. Solution: have exactly one primary parameter with ValueFromPipeline=$true, or separate scenarios into distinct ParameterSetNames.
  • Silent property mismatches: If your CSV says UserUPN but you only aliased UPN, nothing binds. Fix: add the alias or use a Select-Object calculated property (@{Name='UPN';Expression={$_.UserUPN}}) as a temporary adapter.
  • Throwing inside Process: Unhandled throws terminate the whole pipeline. Prefer Write-Error for per-item failures unless termination is intended.
  • Returning text: Strings are valid pipeline inputs and might bind unexpectedly down the line. Return objects with stable property names instead.
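The throw-versus-Write-Error pitfall is easy to demonstrate (Test-Item is a hypothetical name; the 'bad' input simulates a per-item failure):

```powershell
function Test-Item {
  [CmdletBinding()]
  param(
    [Parameter(Mandatory=$true, ValueFromPipeline=$true)]
    [string]$Name
  )
  Process {
    if ($Name -eq 'bad') {
      Write-Error "Could not process '$Name'"   # non-terminating: later pipeline items still flow
      return
    }
    $Name
  }
}

# 'one','bad','two' | Test-Item 2>$null   # emits one and two; a throw here would have stopped at 'bad'
```

With Write-Error, the caller decides: -ErrorAction Stop turns it terminating, -ErrorVariable captures the failures for later triage.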

Putting It All Together

Design each function to accept input by value and by property name, provide intuitive aliases, and stream with Begin/Process/End. You get cleaner integrations, fewer adapters, predictable pipelines, and easier reuse across scripts, CI/CD jobs, and operational runbooks.

  1. Pick a single primary pipeline parameter (e.g., -Identity or -UserName), mark it ValueFromPipeline and ValueFromPipelineByPropertyName, and add helpful aliases.
  2. Initialize expensive resources in Begin, process one item at a time in Process, and clean up in End.
  3. Return objects, not text; surface diagnostics with Write-Verbose and Write-Error.
  4. For state-changing commands, implement SupportsShouldProcess and test -WhatIf.
  5. Validate with Pester and PSScriptAnalyzer to keep your contract stable.

Once you adopt this pattern, your modules will snap into existing pipelines with minimal glue code: easy to compose, easy to test, and easy to maintain.
