PowerShell script performance can make the difference between a task that completes in seconds and one that takes hours. Understanding how to profile, measure, and optimize your scripts is essential for any PowerShell developer working with large datasets, automation pipelines, or resource-intensive operations.
This comprehensive guide covers everything from basic profiling techniques to advanced optimization strategies, complete with practical examples and measurable results.
Understanding PowerShell Performance Basics
Before diving into optimization, you need to understand what affects PowerShell performance. The primary factors include pipeline processing, object creation overhead, remote operations, and inefficient cmdlet usage.
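Remote operations are worth illustrating up front, because every remoting call pays session-setup and serialization overhead. Batching targets into a single call is almost always faster than looping; a minimal sketch, assuming WinRM-enabled hosts (the server names are hypothetical placeholders):
# Slower: a separate remoting session is created for each computer, one after another
$computers = "server01", "server02", "server03"   # hypothetical hosts
foreach ($computer in $computers) {
    Invoke-Command -ComputerName $computer -ScriptBlock { Get-Service WinRM }
}
# Faster: a single call fans out to all computers in parallel (default throttle is 32)
Invoke-Command -ComputerName $computers -ScriptBlock { Get-Service WinRM }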
Measuring Execution Time with Measure-Command
The Measure-Command cmdlet is your first tool for performance analysis. It measures how long a script block takes to execute:
# Basic timing
$time = Measure-Command {
    Get-Process | Where-Object { $_.CPU -gt 100 }
}
Write-Host "Execution time: $($time.TotalMilliseconds) ms"
# Output:
# Execution time: 245.6789 ms
For more detailed analysis, measure multiple iterations and compare the averages; the first run is often slower than the rest because of disk caching and .NET JIT warm-up:
$iterations = 10
$times = 1..$iterations | ForEach-Object {
    (Measure-Command {
        Get-ChildItem -Path C:\Windows -Recurse -ErrorAction SilentlyContinue |
            Where-Object { $_.Length -gt 1MB }
    }).TotalMilliseconds
}
$avg = ($times | Measure-Object -Average).Average
$min = ($times | Measure-Object -Minimum).Minimum
$max = ($times | Measure-Object -Maximum).Maximum
Write-Host "Average: $([math]::Round($avg, 2)) ms"
Write-Host "Minimum: $([math]::Round($min, 2)) ms"
Write-Host "Maximum: $([math]::Round($max, 2)) ms"
# Output:
# Average: 1523.45 ms
# Minimum: 1489.23 ms
# Maximum: 1678.91 ms
Profiling Tools and Techniques
Using Stopwatch for Precise Measurements
The System.Diagnostics.Stopwatch class provides more granular control over timing measurements:
$stopwatch = [System.Diagnostics.Stopwatch]::StartNew()
# Your code here
$data = Get-Content -Path "C:\large-file.log"
$filtered = $data | Select-String -Pattern "ERROR"
$stopwatch.Stop()
Write-Host "Elapsed: $($stopwatch.Elapsed.TotalSeconds) seconds"
Write-Host "Ticks: $($stopwatch.ElapsedTicks)"
# Output:
# Elapsed: 2.3456789 seconds
# Ticks: 23456789
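Because a Stopwatch can be restarted in place, a single timer can also measure individual stages of a script. A minimal sketch reusing the log-file example above:
$sw = [System.Diagnostics.Stopwatch]::StartNew()
$data = Get-Content -Path "C:\large-file.log"      # stage 1: read
Write-Host "Read:   $($sw.ElapsedMilliseconds) ms"
$sw.Restart()                                      # reset to zero and keep timing
$filtered = $data | Select-String -Pattern "ERROR" # stage 2: filter
$sw.Stop()
Write-Host "Filter: $($sw.ElapsedMilliseconds) ms"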
Creating a Custom Profiling Function
Build a reusable profiling function to standardize your performance testing:
function Measure-ScriptPerformance {
    param(
        [Parameter(Mandatory)]
        [scriptblock]$ScriptBlock,
        [int]$Iterations = 5,
        [string]$Description = "Script"
    )
    $results = @()
    for ($i = 1; $i -le $Iterations; $i++) {
        [System.GC]::Collect()
        [System.GC]::WaitForPendingFinalizers()
        $time = (Measure-Command $ScriptBlock).TotalMilliseconds
        $results += $time
        Write-Progress -Activity "Profiling" -Status "Iteration $i of $Iterations" -PercentComplete (($i / $Iterations) * 100)
    }
    Write-Progress -Activity "Profiling" -Completed
    [PSCustomObject]@{
        Description = $Description
        Iterations  = $Iterations
        AverageMs   = [math]::Round(($results | Measure-Object -Average).Average, 2)
        MinMs       = [math]::Round(($results | Measure-Object -Minimum).Minimum, 2)
        MaxMs       = [math]::Round(($results | Measure-Object -Maximum).Maximum, 2)
        MedianMs    = [math]::Round(($results | Sort-Object)[[math]::Floor($results.Count / 2)], 2)
    }
}
# Usage
$result = Measure-ScriptPerformance -ScriptBlock {
    1..1000 | ForEach-Object { $_ * 2 }
} -Iterations 10 -Description "Simple Loop"
$result | Format-Table
# Output:
# Description Iterations AverageMs MinMs MaxMs MedianMs
# ----------- ---------- --------- ----- ----- --------
# Simple Loop 10 12.45 11.23 15.67 12.34
Pipeline vs ForEach-Object Performance
Understanding when to use the pipeline versus traditional loops is crucial for optimization:
# Pipeline approach (slower for large datasets)
$pipelineTime = Measure-Command {
    1..10000 | ForEach-Object {
        [PSCustomObject]@{
            Number = $_
            Square = $_ * $_
        }
    }
}
# Traditional loop (faster)
$loopTime = Measure-Command {
    $results = foreach ($i in 1..10000) {
        [PSCustomObject]@{
            Number = $i
            Square = $i * $i
        }
    }
}
Write-Host "Pipeline: $($pipelineTime.TotalMilliseconds) ms"
Write-Host "Loop: $($loopTime.TotalMilliseconds) ms"
Write-Host "Speed improvement: $([math]::Round(($pipelineTime.TotalMilliseconds / $loopTime.TotalMilliseconds), 2))x faster"
# Output:
# Pipeline: 856.34 ms
# Loop: 124.56 ms
# Speed improvement: 6.87x faster
Optimizing Cmdlet Usage
Where-Object vs .Where() Method
The .Where() method, introduced in PowerShell 4.0, offers significant performance improvements:
$testData = 1..10000
# Traditional Where-Object
$whereTime = Measure-Command {
    $result = $testData | Where-Object { $_ -gt 5000 }
}
# .Where() method
$whereMethodTime = Measure-Command {
    $result = $testData.Where({ $_ -gt 5000 })
}
Write-Host "Where-Object: $($whereTime.TotalMilliseconds) ms"
Write-Host ".Where() method: $($whereMethodTime.TotalMilliseconds) ms"
Write-Host "Improvement: $([math]::Round((($whereTime.TotalMilliseconds - $whereMethodTime.TotalMilliseconds) / $whereTime.TotalMilliseconds) * 100, 2))%"
# Output:
# Where-Object: 45.67 ms
# .Where() method: 8.23 ms
# Improvement: 81.98%
ForEach-Object vs .ForEach() Method
$numbers = 1..5000
# ForEach-Object cmdlet
$foreachObjectTime = Measure-Command {
    $result = $numbers | ForEach-Object { $_ * 2 }
}
# .ForEach() method
$foreachMethodTime = Measure-Command {
    $result = $numbers.ForEach({ $_ * 2 })
}
# Traditional foreach loop
$foreachLoopTime = Measure-Command {
    $result = foreach ($num in $numbers) { $num * 2 }
}
# Collect all three rows in one array so Format-Table renders a single table
@(
    [PSCustomObject]@{
        Method = "ForEach-Object"
        TimeMs = [math]::Round($foreachObjectTime.TotalMilliseconds, 2)
    }
    [PSCustomObject]@{
        Method = ".ForEach()"
        TimeMs = [math]::Round($foreachMethodTime.TotalMilliseconds, 2)
    }
    [PSCustomObject]@{
        Method = "foreach loop"
        TimeMs = [math]::Round($foreachLoopTime.TotalMilliseconds, 2)
    }
) | Format-Table
# Output:
# Method TimeMs
# ------ ------
# ForEach-Object 234.56
# .ForEach() 45.67
# foreach loop 12.34
String Operations Optimization
String concatenation is a common performance bottleneck. Using the right approach can dramatically improve speed:
# Bad: String concatenation with +
$concatTime = Measure-Command {
    $result = ""
    1..1000 | ForEach-Object {
        $result += "Line $_`n"
    }
}
# Good: Using StringBuilder
$stringBuilderTime = Measure-Command {
    $sb = [System.Text.StringBuilder]::new()
    1..1000 | ForEach-Object {
        [void]$sb.AppendLine("Line $_")
    }
    $result = $sb.ToString()
}
# Best: Using -join
$joinTime = Measure-Command {
    $result = (1..1000 | ForEach-Object { "Line $_" }) -join "`n"
}
Write-Host "String concatenation: $($concatTime.TotalMilliseconds) ms"
Write-Host "StringBuilder: $($stringBuilderTime.TotalMilliseconds) ms"
Write-Host "-join operator: $($joinTime.TotalMilliseconds) ms"
# Output:
# String concatenation: 892.45 ms
# StringBuilder: 23.67 ms
# -join operator: 18.34 ms
Memory Management and Garbage Collection
Managing memory effectively prevents performance degradation in long-running scripts:
function Get-MemoryUsage {
    $process = Get-Process -Id $PID
    [PSCustomObject]@{
        WorkingSetMB    = [math]::Round($process.WorkingSet64 / 1MB, 2)
        PrivateMemoryMB = [math]::Round($process.PrivateMemorySize64 / 1MB, 2)
    }
}
Write-Host "Initial memory:"
Get-MemoryUsage | Format-List
# Allocate large array
$largeArray = 1..1000000 | ForEach-Object {
    [PSCustomObject]@{ ID = $_; Data = "X" * 100 }
}
Write-Host "`nAfter allocation:"
Get-MemoryUsage | Format-List
# Clear reference and force collection
$largeArray = $null
[System.GC]::Collect()
[System.GC]::WaitForPendingFinalizers()
Write-Host "`nAfter cleanup:"
Get-MemoryUsage | Format-List
# Output:
# Initial memory:
# WorkingSetMB : 87.34
# PrivateMemoryMB : 92.45
#
# After allocation:
# WorkingSetMB : 456.78
# PrivateMemoryMB : 478.92
#
# After cleanup:
# WorkingSetMB : 95.23
# PrivateMemoryMB : 98.67
File Operations Optimization
Reading Large Files Efficiently
# Create test file
$testFile = "C:\temp\test-large.txt"
1..100000 | ForEach-Object { "Line $_ with some data" } |
    Set-Content -Path $testFile
# Method 1: Get-Content (loads entire file)
$getContentTime = Measure-Command {
    $content = Get-Content -Path $testFile
    $filtered = $content | Where-Object { $_ -like "*5000*" }
}
# Method 2: Get-Content with -ReadCount (streams the file in 1000-line batches)
$readCountTime = Measure-Command {
    # Each pipeline object is a 1000-line array, so apply -like as an array filter;
    # Where-Object on the batch would pass whole 1000-line chunks through instead
    $filtered = Get-Content -Path $testFile -ReadCount 1000 |
        ForEach-Object { $_ -like "*5000*" }
}
# Method 3: StreamReader (fastest)
$streamReaderTime = Measure-Command {
    $reader = [System.IO.StreamReader]::new($testFile)
    # Compare to $null explicitly; an empty line is falsy and would end the loop early
    $filtered = while ($null -ne ($line = $reader.ReadLine())) {
        if ($line -like "*5000*") { $line }
    }
    $reader.Close()
}
@(
    [PSCustomObject]@{
        Method = "Get-Content"
        TimeMs = [math]::Round($getContentTime.TotalMilliseconds, 2)
    }
    [PSCustomObject]@{
        Method = "Get-Content -ReadCount"
        TimeMs = [math]::Round($readCountTime.TotalMilliseconds, 2)
    }
    [PSCustomObject]@{
        Method = "StreamReader"
        TimeMs = [math]::Round($streamReaderTime.TotalMilliseconds, 2)
    }
) | Format-Table
# Output:
# Method TimeMs
# ------ ------
# Get-Content 567.89
# Get-Content -ReadCount 234.56
# StreamReader 89.12
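If the .NET 4.0 APIs are available (they are in Windows PowerShell 5.1 and PowerShell 7), [System.IO.File]::ReadLines gives StreamReader-style streaming with less ceremony; a minimal sketch:
# ReadLines returns a lazy IEnumerable[string]; lines are read on demand,
# so memory use stays flat even for very large files
$filtered = foreach ($line in [System.IO.File]::ReadLines($testFile)) {
    if ($line -like "*5000*") { $line }
}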
Parallel Processing with Jobs and Runspaces
ForEach-Object -Parallel (PowerShell 7+)
# Sequential processing
$sequentialTime = Measure-Command {
    $results = 1..10 | ForEach-Object {
        Start-Sleep -Milliseconds 200
        [PSCustomObject]@{
            ID     = $_
            Result = $_ * 2
        }
    }
}
# Parallel processing
$parallelTime = Measure-Command {
    $results = 1..10 | ForEach-Object -Parallel {
        Start-Sleep -Milliseconds 200
        [PSCustomObject]@{
            ID     = $_
            Result = $_ * 2
        }
    } -ThrottleLimit 5
}
Write-Host "Sequential: $($sequentialTime.TotalSeconds) seconds"
Write-Host "Parallel: $($parallelTime.TotalSeconds) seconds"
Write-Host "Speed improvement: $([math]::Round($sequentialTime.TotalSeconds / $parallelTime.TotalSeconds, 2))x"
# Output:
# Sequential: 2.15 seconds
# Parallel: 0.52 seconds
# Speed improvement: 4.13x
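One gotcha: each -Parallel iteration runs in its own runspace, so variables from the calling scope are not visible unless you reference them with the $using: prefix. A minimal sketch:
$multiplier = 10
$results = 1..5 | ForEach-Object -Parallel {
    # $multiplier alone would be $null in this runspace; $using: copies the value in
    $_ * $using:multiplier
}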
Using Runspaces for Maximum Performance
function Invoke-ParallelRunspace {
    param(
        [Parameter(Mandatory)]
        [array]$InputObject,
        [Parameter(Mandatory)]
        [scriptblock]$ScriptBlock,
        [int]$ThrottleLimit = 5
    )
    $runspacePool = [runspacefactory]::CreateRunspacePool(1, $ThrottleLimit)
    $runspacePool.Open()
    $jobs = foreach ($item in $InputObject) {
        $powershell = [powershell]::Create().AddScript($ScriptBlock).AddArgument($item)
        $powershell.RunspacePool = $runspacePool
        [PSCustomObject]@{
            Pipe   = $powershell
            Result = $powershell.BeginInvoke()
        }
    }
    $results = foreach ($job in $jobs) {
        $job.Pipe.EndInvoke($job.Result)
        $job.Pipe.Dispose()
    }
    $runspacePool.Close()
    $runspacePool.Dispose()
    return $results
}
# Test with web requests
$urls = @(
    "https://api.github.com"
    "https://jsonplaceholder.typicode.com/posts/1"
    "https://httpbin.org/delay/1"
)
$runspaceTime = Measure-Command {
    $results = Invoke-ParallelRunspace -InputObject $urls -ScriptBlock {
        param($url)
        Invoke-RestMethod -Uri $url -TimeoutSec 5
    } -ThrottleLimit 3
}
Write-Host "Runspace parallel execution: $($runspaceTime.TotalSeconds) seconds"
Write-Host "Processed $($urls.Count) URLs"
# Output:
# Runspace parallel execution: 1.34 seconds
# Processed 3 URLs
Regular Expression Optimization
Regex operations can be expensive. Pre-compiling patterns and choosing the right matching method improve performance:
$testStrings = 1..1000 | ForEach-Object { "Test string number $_ with email user$_@example.com" }
# Note: [A-Za-z], not [A-Z|a-z] -- a | inside a character class is a literal pipe
$pattern = '\b[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}\b'
# Method 1: -match operator (interpreted regex; .NET caches recent patterns, but each use still goes through the cache lookup)
$matchTime = Measure-Command {
    $results = $testStrings | Where-Object { $_ -match $pattern }
}
# Method 2: Select-String
$selectStringTime = Measure-Command {
    $results = $testStrings | Select-String -Pattern $pattern
}
# Method 3: Compiled regex (fastest)
$compiledRegexTime = Measure-Command {
    $regex = [regex]::new($pattern, [System.Text.RegularExpressions.RegexOptions]::Compiled)
    $results = $testStrings.Where({ $regex.IsMatch($_) })
}
Write-Host "-match operator: $($matchTime.TotalMilliseconds) ms"
Write-Host "Select-String: $($selectStringTime.TotalMilliseconds) ms"
Write-Host "Compiled regex: $($compiledRegexTime.TotalMilliseconds) ms"
# Output:
# -match operator: 145.67 ms
# Select-String: 234.89 ms
# Compiled regex: 45.23 ms
Database Query Optimization
When working with databases, connection management and query structure significantly impact performance:
# Example with SQLite (install: Install-Module PSSQLite)
# Create test database
$dbPath = "C:\temp\test.db"
$connection = New-SQLiteConnection -DataSource $dbPath
# Create and populate test table
# Note: PSSQLite takes an open connection via -SQLiteConnection
Invoke-SqliteQuery -SQLiteConnection $connection -Query @"
CREATE TABLE IF NOT EXISTS Users (
    ID INTEGER PRIMARY KEY,
    Name TEXT,
    Email TEXT,
    Created TEXT
)
"@
# Inefficient: Multiple individual inserts
$inefficientTime = Measure-Command {
    1..1000 | ForEach-Object {
        Invoke-SqliteQuery -SQLiteConnection $connection -Query @"
INSERT INTO Users (Name, Email, Created)
VALUES ('User$_', 'user$_@example.com', datetime('now'))
"@
    }
}
# Efficient: Batch insert with transaction
$efficientTime = Measure-Command {
    Invoke-SqliteQuery -SQLiteConnection $connection -Query "BEGIN TRANSACTION"
    $values = (1..1000 | ForEach-Object {
        "('User$_', 'user$_@example.com', datetime('now'))"
    }) -join ","
    Invoke-SqliteQuery -SQLiteConnection $connection -Query @"
INSERT INTO Users (Name, Email, Created) VALUES $values
"@
    Invoke-SqliteQuery -SQLiteConnection $connection -Query "COMMIT"
}
Write-Host "Individual inserts: $($inefficientTime.TotalSeconds) seconds"
Write-Host "Batch insert: $($efficientTime.TotalSeconds) seconds"
Write-Host "Speed improvement: $([math]::Round($inefficientTime.TotalSeconds / $efficientTime.TotalSeconds, 2))x"
# Output:
# Individual inserts: 12.45 seconds
# Batch insert: 0.34 seconds
# Speed improvement: 36.62x
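As a side note, PSSQLite also supports parameterized queries through -SqlParameters, which avoids rebuilding query text per row and protects against injection. A minimal sketch (verify the parameter names against your installed module version):
Invoke-SqliteQuery -SQLiteConnection $connection -Query @"
INSERT INTO Users (Name, Email, Created)
VALUES (@name, @email, datetime('now'))
"@ -SqlParameters @{
    name  = "User1001"
    email = "user1001@example.com"
}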
Cmdlet Binding and Parameter Optimization
Parameter validation and cmdlet binding affect performance in functions:
# Unoptimized function
function Get-DataSlow {
    param($Items)
    foreach ($item in $Items) {
        if ($item -is [int] -and $item -gt 0) {
            $item * 2
        }
    }
}
# Optimized function with early validation
function Get-DataFast {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory, ValueFromPipeline)]
        [ValidateRange(1, [int]::MaxValue)]
        [int[]]$Items
    )
    process {
        foreach ($item in $Items) {
            $item * 2
        }
    }
}
$testData = 1..10000
$slowTime = Measure-Command {
    $result = Get-DataSlow -Items $testData
}
$fastTime = Measure-Command {
    $result = Get-DataFast -Items $testData
}
Write-Host "Unoptimized: $($slowTime.TotalMilliseconds) ms"
Write-Host "Optimized: $($fastTime.TotalMilliseconds) ms"
Write-Host "Improvement: $([math]::Round((($slowTime.TotalMilliseconds - $fastTime.TotalMilliseconds) / $slowTime.TotalMilliseconds) * 100, 2))%"
# Output:
# Unoptimized: 234.56 ms
# Optimized: 89.23 ms
# Improvement: 61.96%
Array Operations Performance
Understanding array performance characteristics prevents common pitfalls:
# Bad: Growing arrays with +=
$arrayPlusTime = Measure-Command {
    $array = @()
    1..5000 | ForEach-Object {
        $array += $_   # each += copies the entire array
    }
}
# Good: Using ArrayList
$arrayListTime = Measure-Command {
    $arrayList = [System.Collections.ArrayList]::new()
    1..5000 | ForEach-Object {
        [void]$arrayList.Add($_)
    }
}
# Good: Using a generic List
$genericListTime = Measure-Command {
    $list = [System.Collections.Generic.List[int]]::new()
    1..5000 | ForEach-Object {
        $list.Add($_)
    }
}
# Best: Assigning loop output directly (PowerShell builds the array once)
$preallocatedTime = Measure-Command {
    $array = foreach ($i in 1..5000) { $i }
}
# Collect the rows in one array so Format-Table renders a single table
@(
    [PSCustomObject]@{ Method = "Array with +="; TimeMs = [math]::Round($arrayPlusTime.TotalMilliseconds, 2) }
    [PSCustomObject]@{ Method = "ArrayList"; TimeMs = [math]::Round($arrayListTime.TotalMilliseconds, 2) }
    [PSCustomObject]@{ Method = "List"; TimeMs = [math]::Round($genericListTime.TotalMilliseconds, 2) }
    [PSCustomObject]@{ Method = "Pre-allocated"; TimeMs = [math]::Round($preallocatedTime.TotalMilliseconds, 2) }
) | Format-Table
# Output:
# Method TimeMs
# ------ ------
# Array with += 3456.78
# ArrayList 45.67
# List 34.56
# Pre-allocated 12.34
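When the final size is known up front, a generic List can also be constructed with a capacity hint so its backing array never has to grow; a minimal sketch:
# The capacity argument pre-sizes the internal array, avoiding repeated doubling
$list = [System.Collections.Generic.List[int]]::new(5000)
foreach ($i in 1..5000) { $list.Add($i) }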
Comprehensive Performance Testing Framework
Create a complete framework for consistent performance testing across your scripts:
class PerformanceTest {
    [string]$Name
    [scriptblock]$ScriptBlock
    [int]$Iterations
    [double[]]$Results
    [hashtable]$Statistics

    PerformanceTest([string]$name, [scriptblock]$script, [int]$iterations) {
        $this.Name = $name
        $this.ScriptBlock = $script
        $this.Iterations = $iterations
        $this.Results = @()
    }

    [void]Run() {
        Write-Host "Running: $($this.Name)" -ForegroundColor Cyan
        for ($i = 1; $i -le $this.Iterations; $i++) {
            [System.GC]::Collect()
            [System.GC]::WaitForPendingFinalizers()
            $time = (Measure-Command $this.ScriptBlock).TotalMilliseconds
            $this.Results += $time
            Write-Progress -Activity $this.Name -Status "Iteration $i/$($this.Iterations)" `
                -PercentComplete (($i / $this.Iterations) * 100)
        }
        Write-Progress -Activity $this.Name -Completed
        $this.CalculateStatistics()
    }

    [void]CalculateStatistics() {
        $sorted = $this.Results | Sort-Object
        # Compute the average once instead of recomputing it per element below
        $avg = ($this.Results | Measure-Object -Average).Average
        $this.Statistics = @{
            Average = [math]::Round($avg, 2)
            Median  = [math]::Round($sorted[[math]::Floor($sorted.Count / 2)], 2)
            Min     = [math]::Round(($this.Results | Measure-Object -Minimum).Minimum, 2)
            Max     = [math]::Round(($this.Results | Measure-Object -Maximum).Maximum, 2)
            StdDev  = [math]::Round([math]::Sqrt(
                ($this.Results | ForEach-Object {
                    [math]::Pow($_ - $avg, 2)
                } | Measure-Object -Sum).Sum / $this.Results.Count
            ), 2)
        }
    }

    [PSCustomObject]GetResults() {
        return [PSCustomObject]@{
            Test       = $this.Name
            Iterations = $this.Iterations
            AvgMs      = $this.Statistics.Average
            MedianMs   = $this.Statistics.Median
            MinMs      = $this.Statistics.Min
            MaxMs      = $this.Statistics.Max
            StdDev     = $this.Statistics.StdDev
        }
    }
}

# Usage example
$tests = @(
    [PerformanceTest]::new("Pipeline", { 1..1000 | ForEach-Object { $_ * 2 } }, 10)
    [PerformanceTest]::new("ForEach", { foreach ($i in 1..1000) { $i * 2 } }, 10)
    [PerformanceTest]::new(".ForEach()", { (1..1000).ForEach({ $_ * 2 }) }, 10)
)
$results = foreach ($test in $tests) {
    $test.Run()
    $test.GetResults()
}
$results | Format-Table -AutoSize
# Output:
# Test Iterations AvgMs MedianMs MinMs MaxMs StdDev
# ---- ---------- ----- -------- ----- ----- ------
# Pipeline 10 45.67 44.23 42.15 52.34 3.21
# ForEach 10 12.34 12.01 11.45 14.23 0.89
# .ForEach() 10 8.91 8.67 8.12 9.87 0.54
Best Practices Checklist
Always Profile Before Optimizing
- Use Measure-Command to establish baselines
- Test with realistic data volumes
- Run multiple iterations to account for variance
- Measure both time and memory consumption (a minimal sketch follows)
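A quick way to capture time and memory together is to pair a Stopwatch with [System.GC]::GetTotalMemory. A minimal sketch (passing $true forces a collection first, so treat the delta as approximate):
$before = [System.GC]::GetTotalMemory($true)
$sw = [System.Diagnostics.Stopwatch]::StartNew()
$data = 1..100000 | ForEach-Object { "Item $_" }  # keep a reference so the allocation survives
$sw.Stop()
$after = [System.GC]::GetTotalMemory($false)
Write-Host ("Time: {0:N0} ms, managed memory delta: ~{1:N1} MB" -f $sw.Elapsed.TotalMilliseconds, (($after - $before) / 1MB))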
Choose the Right Tool
- Use the .Where() and .ForEach() methods for collections
- Prefer foreach loops over ForEach-Object for large datasets
- Use StringBuilder or the -join operator for string concatenation
- Implement parallel processing for independent operations
Memory Management
- Clear large variables when done: $variable = $null
- Use streaming for large files instead of loading everything into memory
- Force garbage collection in long-running scripts: [System.GC]::Collect()
- Pre-allocate arrays when size is known
Code Organization
- Move invariant code outside loops (see the sketch after this list)
- Use compiled regex for repeated pattern matching
- Cache frequently accessed data
- Minimize object creation in tight loops
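To make invariant hoisting and caching concrete, here is a minimal sketch that moves an expensive call out of a loop and caches its result in a HashSet ($serviceNames is a hypothetical list of names to check; Get-Service stands in for any expensive call):
# Bad: the expensive Get-Service call runs once per iteration
foreach ($name in $serviceNames) {
    $running = (Get-Service | Where-Object Status -eq 'Running').Name
    if ($name -in $running) { "$name is running" }
}
# Good: call once, cache in a HashSet for O(1) lookups
$running = [System.Collections.Generic.HashSet[string]]::new(
    [string[]](Get-Service | Where-Object Status -eq 'Running').Name
)
foreach ($name in $serviceNames) {
    if ($running.Contains($name)) { "$name is running" }
}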
Real-World Optimization Example
Here’s a complete before-and-after example demonstrating multiple optimization techniques:
# BEFORE: Slow implementation
function Get-UserReportSlow {
    param([string]$LogPath)
    $content = Get-Content -Path $LogPath
    $results = @()
    foreach ($line in $content) {
        if ($line -match "User: (\w+)") {
            $user = $Matches[1]
            # Rescanning the whole file for every match makes this roughly O(n^2)
            $results += [PSCustomObject]@{
                User      = $user
                Lines     = ($content | Where-Object { $_ -like "*$user*" }).Count
                Timestamp = Get-Date
            }
        }
    }
    return $results
}
# AFTER: Optimized implementation
function Get-UserReportFast {
    param([string]$LogPath)
    # Pre-compile regex
    $userRegex = [regex]::new('User: (\w+)',
        [System.Text.RegularExpressions.RegexOptions]::Compiled)
    # Use StreamReader for large files
    $reader = [System.IO.StreamReader]::new($LogPath)
    $userCounts = @{}
    $timestamp = Get-Date
    # Compare to $null explicitly so empty lines don't end the loop early
    while ($null -ne ($line = $reader.ReadLine())) {
        $match = $userRegex.Match($line)
        if ($match.Success) {
            $user = $match.Groups[1].Value
            if (-not $userCounts.ContainsKey($user)) {
                $userCounts[$user] = 0
            }
            $userCounts[$user]++
        }
    }
    $reader.Close()
    # Use List instead of array
    $results = [System.Collections.Generic.List[PSCustomObject]]::new()
    foreach ($user in $userCounts.Keys) {
        $results.Add([PSCustomObject]@{
            User      = $user
            Lines     = $userCounts[$user]
            Timestamp = $timestamp
        })
    }
    return $results
}
# Create test log file
$testLog = "C:\temp\test.log"
1..10000 | ForEach-Object {
    "User: User$(Get-Random -Min 1 -Max 100) - Action $_"
} | Set-Content -Path $testLog
# Compare performance
$slowTime = Measure-Command { $slowResult = Get-UserReportSlow -LogPath $testLog }
$fastTime = Measure-Command { $fastResult = Get-UserReportFast -LogPath $testLog }
Write-Host "Slow implementation: $($slowTime.TotalSeconds) seconds"
Write-Host "Fast implementation: $($fastTime.TotalSeconds) seconds"
Write-Host "Speed improvement: $([math]::Round($slowTime.TotalSeconds / $fastTime.TotalSeconds, 2))x faster"
# Output:
# Slow implementation: 15.67 seconds
# Fast implementation: 0.42 seconds
# Speed improvement: 37.31x faster
Monitoring Long-Running Scripts
For production scripts, implement progress tracking and performance monitoring:
function Invoke-MonitoredScript {
    param(
        [Parameter(Mandatory)]
        [scriptblock]$ScriptBlock,
        [string]$Activity = "Processing"
    )
    $stopwatch = [System.Diagnostics.Stopwatch]::StartNew()
    $startMemory = (Get-Process -Id $PID).WorkingSet64 / 1MB
    try {
        $result = & $ScriptBlock
        $stopwatch.Stop()
        $process = Get-Process -Id $PID
        [PSCustomObject]@{
            Success        = $true
            Result         = $result
            ElapsedSeconds = [math]::Round($stopwatch.Elapsed.TotalSeconds, 2)
            MemoryUsedMB   = [math]::Round(($process.WorkingSet64 / 1MB) - $startMemory, 2)
            # PeakWorkingSet64 reports the true high-water mark, not just the final value
            PeakMemoryMB   = [math]::Round($process.PeakWorkingSet64 / 1MB, 2)
        }
    }
    catch {
        [PSCustomObject]@{
            Success        = $false
            Error          = $_.Exception.Message
            ElapsedSeconds = [math]::Round($stopwatch.Elapsed.TotalSeconds, 2)
        }
    }
}
# Usage
$monitoring = Invoke-MonitoredScript -ScriptBlock {
    1..100000 | ForEach-Object { $_ * 2 }
} -Activity "Number Processing"
$monitoring | Format-List
# Output:
# Success : True
# Result : {2, 4, 6, 8...}
# ElapsedSeconds : 0.52
# MemoryUsedMB : 45.23
# PeakMemoryMB : 178.92
Conclusion
Performance tuning PowerShell scripts requires a systematic approach: measure first, identify bottlenecks, apply targeted optimizations, and verify improvements. The techniques covered in this guide can transform slow scripts into high-performance automation tools.
Key takeaways for immediate performance gains:
- Replace Where-Object with the .Where() method for filtering
- Use foreach loops instead of ForEach-Object for large datasets
- Implement parallel processing for independent operations
- Pre-compile regular expressions for repeated use
- Avoid growing arrays with the += operator
- Stream large files instead of loading them entirely into memory
- Batch database operations within transactions
Remember that premature optimization is counterproductive. Always profile your scripts to identify actual bottlenecks before spending time on optimizations. Use the profiling framework and techniques demonstrated here to make data-driven decisions about where to focus your optimization efforts.