Administration and Governance: Premium Capacity Management
Introduction
Power BI Premium provides dedicated cloud compute resources for your organization, delivering consistent performance, larger model sizes, and enhanced capabilities. Effective administration and governance of Premium capacity is crucial for maximizing ROI, ensuring compliance, and delivering optimal performance across your enterprise BI landscape.
This comprehensive guide covers capacity planning, workload configuration, tenant governance, monitoring strategies, cost optimization, security compliance, and troubleshooting techniques for Power BI Premium administrators.
Prerequisites
- Power BI Premium or Premium Per User (PPU) license
- Power BI Administrator or Capacity Administrator role
- Understanding of Power BI workspace architecture
- Familiarity with PowerShell for automation
Understanding Premium Capacity
Capacity SKUs and Resources
Power BI Premium capacities come in different SKUs with varying resources:
Premium Capacity SKUs:
P1: 8 v-cores, 25 GB RAM (~$4,995/month)
P2: 16 v-cores, 50 GB RAM (~$9,995/month)
P3: 32 v-cores, 100 GB RAM (~$19,995/month)
P4: 64 v-cores, 200 GB RAM (~$39,995/month)
P5: 128 v-cores, 400 GB RAM (~$79,995/month)
Premium Per User (PPU):
- $20/user/month
- Similar capabilities to P1 capacity
- Ideal for smaller teams (<300 users)
EM SKUs (Embedded):
EM1, EM2, EM3 - For embedding scenarios
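The figures above are approximate list prices and change over time; verify current rates on the Power BI pricing page. To see which SKUs are actually deployed in your tenant, the MicrosoftPowerBIMgmt module can enumerate capacities. A minimal sketch, assuming you hold the Power BI administrator or capacity administrator role:
# List all capacities in the tenant with their SKU, state, and region
Connect-PowerBIServiceAccount
Get-PowerBICapacity -Scope Organization |
    Select-Object DisplayName, Sku, State, Region |
    Format-Table -AutoSize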
Capacity Architecture
┌──────────────────────────────────────────┐
│       Premium Capacity (P1/P2/P3)        │
├──────────────────────────────────────────┤
│                                          │
│  ┌────────────┐      ┌────────────┐      │
│  │ Workspace  │      │ Workspace  │      │
│  │  (Sales)   │      │(Marketing) │      │
│  ├────────────┤      ├────────────┤      │
│  │ Dataset 1  │      │ Dataset 1  │      │
│  │ Dataset 2  │      │ Dataset 2  │      │
│  │ Reports    │      │ Reports    │      │
│  │ Dashboards │      │ Dashboards │      │
│  └────────────┘      └────────────┘      │
│                                          │
│  Shared Resources:                       │
│  - CPU Cores                             │
│  - Memory                                │
│  - Background Operations (Refresh)       │
│  - Interactive Queries                   │
└──────────────────────────────────────────┘
Capacity Planning and Sizing
Sizing Calculations
# Calculate required capacity based on workload
# Illustrative heuristic (not an official formula; validate against the
# Capacity Metrics app before committing to a SKU):
#   Required v-cores = (Dataset Size GB × 0.5) + (Concurrent Users × 0.02) + (Refreshes/Hour × 0.1)
$datasets = @(
    @{Name="Sales";      SizeGB=15; Users=200; RefreshesPerDay=8},
    @{Name="Finance";    SizeGB=8;  Users=50;  RefreshesPerDay=4},
    @{Name="Operations"; SizeGB=12; Users=100; RefreshesPerDay=12}
)

$totalVCores = 0
foreach ($ds in $datasets) {
    $datasetCores = $ds.SizeGB * 0.5
    $userCores    = $ds.Users * 0.02
    $refreshCores = ($ds.RefreshesPerDay / 24) * 0.1   # convert per-day to per-hour

    $required = $datasetCores + $userCores + $refreshCores
    Write-Host "$($ds.Name): $([math]::Round($required, 2)) v-cores required"
    $totalVCores += $required
}

Write-Host "`nTotal v-cores needed: $([math]::Round($totalVCores, 2))"
Write-Host "Recommended SKU: $(
    if ($totalVCores -le 8) { 'P1' }
    elseif ($totalVCores -le 16) { 'P2' }
    elseif ($totalVCores -le 32) { 'P3' }
    else { 'P4 or higher' }
)"
Capacity Assessment Checklist
☐ Total dataset size across all workspaces
☐ Number of concurrent users (peak hours)
☐ Refresh frequency and duration
☐ Number of paginated reports
☐ Dataflow refresh operations
☐ AI workload requirements
☐ Growth projection (next 12 months)
☐ Geographic distribution (multi-geo needs)
☐ Disaster recovery requirements
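Several of these checklist inputs can be gathered with the admin cmdlets; dataset sizes and peak concurrency still come from the Capacity Metrics app or the scanner API. A minimal inventory sketch:
# Count datasets and refreshable datasets per workspace as sizing inputs
Connect-PowerBIServiceAccount
$inventory = Get-PowerBIWorkspace -Scope Organization | ForEach-Object {
    $datasets = Get-PowerBIDataset -WorkspaceId $_.Id
    [PSCustomObject]@{
        Workspace    = $_.Name
        DatasetCount = $datasets.Count
        Refreshable  = ($datasets | Where-Object { $_.IsRefreshable }).Count
    }
}
$inventory | Sort-Object DatasetCount -Descending | Format-Table -AutoSize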
Workload Configuration
Configuring Workload Settings
# Connect to Power BI Service
Connect-PowerBIServiceAccount
# Get capacity details
$capacity = Get-PowerBICapacity -Scope Organization | Where-Object {$_.DisplayName -eq "Production Capacity"}
# Configure workload settings using the REST API
# Note: the Patch Workload endpoint takes the workload name in the URL;
# verify the exact names against the Get Workloads response for your capacity
$headers = Get-PowerBIAccessToken
$capacityId = $capacity.Id

$workloads = @(
    @{ Name = "dataflows";        MaxMemoryPercent = 20 },
    @{ Name = "paginatedReports"; MaxMemoryPercent = 20 },
    @{ Name = "aiInsights";       MaxMemoryPercent = 20 }
)

foreach ($workload in $workloads) {
    $body = @{
        state = "Enabled"
        maxMemoryPercentageSetByUser = $workload.MaxMemoryPercent
    } | ConvertTo-Json

    Invoke-RestMethod -Uri "https://api.powerbi.com/v1.0/myorg/capacities/$capacityId/Workloads/$($workload.Name)" `
        -Headers $headers -Method Patch -Body $body -ContentType "application/json"
}
Workload Distribution Guidelines
Recommended Memory Allocation:
Datasets (Interactive): 40-50% (default, automatic)
Dataflows: 10-20%
Paginated Reports: 15-25%
AI Insights: 10-20%
Example P2 Capacity (50 GB RAM):
├─ Datasets: 20-25 GB
├─ Dataflows: 5-10 GB
├─ Paginated Reports: 7.5-12.5 GB
└─ AI Insights: 5-10 GB
Note: Total should not exceed 100%
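As a sanity check, the percentages translate into absolute memory per workload. A small sketch using the SKU RAM figures from the table above (the percentages are mid-points of the recommended ranges and purely illustrative):
# Translate workload memory percentages into GB for a given SKU
$skuRamGB    = @{ P1 = 25; P2 = 50; P3 = 100; P4 = 200; P5 = 400 }
$allocations = @{ Dataflows = 15; PaginatedReports = 20; AIInsights = 15 }

$sku = "P2"
foreach ($workload in $allocations.Keys) {
    $gb = $skuRamGB[$sku] * $allocations[$workload] / 100
    Write-Host ("{0}: {1}% of {2} = {3} GB" -f $workload, $allocations[$workload], $sku, $gb)
}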
Tenant Settings Governance
Critical Tenant Settings
# Export current tenant settings for audit
$response = Invoke-PowerBIRestMethod -Url 'admin/tenantsettings' -Method Get | ConvertFrom-Json
$tenantSettings = $response.tenantSettings   # the API returns a flat list of settings

# Save to file for compliance review
$tenantSettings | ConvertTo-Json -Depth 10 | Out-File "PowerBI-TenantSettings-$(Get-Date -Format 'yyyy-MM-dd').json"

# Key settings to review (settingName values below are illustrative;
# confirm the exact names in the exported JSON for your tenant):
function Get-TenantSettingEnabled([string]$Name) {
    ($tenantSettings | Where-Object { $_.settingName -eq $Name }).enabled
}

Write-Host "Current Tenant Settings Status:"
Write-Host "================================"
Write-Host "Export to Excel: $(Get-TenantSettingEnabled 'ExportToExcel')"
Write-Host "Publish to Web: $(Get-TenantSettingEnabled 'PublishToWeb')"
Write-Host "Share Content External: $(Get-TenantSettingEnabled 'AllowGuestUserToAccessSharedContent')"
Write-Host "Developer Settings: $(Get-TenantSettingEnabled 'DeveloperSettings')"
Write-Host "Custom Visuals: $(Get-TenantSettingEnabled 'CustomVisualsTenant')"
Governance Policy Matrix
| Setting | Recommended State | Justification | Exceptions |
|---|---|---|---|
| Export to Excel | Enabled (specific groups) | Allow data analysts, restrict general users | Finance team (full access) |
| Export to CSV | Enabled (specific groups) | Control data exfiltration | Data science team |
| Publish to Web | Disabled | Prevent public data exposure | Marketing (pre-approved content) |
| Share External | Enabled (specific groups) | Control guest access | Partner collaboration workspace |
| Custom Visuals | Enabled (certified only) | Security risk mitigation | None |
| Developer Mode | Enabled (specific groups) | Limit API access | Development team |
| Dataflow Storage | Azure Data Lake | Centralized data governance | None |
| Template Apps | Enabled (specific groups) | Control app distribution | None |
| Workspace Creation | Enabled (specific groups) | Prevent workspace sprawl | Department leads |
| Dataset Scale-out | Enabled (Premium workspaces) | Improve query performance | None |
Implementing Tenant Settings
# Example: Restrict "Publish to Web" to a specific security group
# Note: tenant settings are normally managed in the admin portal
# (Settings > Admin portal > Tenant settings). Programmatic updates are only
# exposed through newer Fabric admin APIs, so treat the calls and payload
# shapes below as illustrative rather than a stable contract.
$publishToWebSetting = @{
    settingsName = "PublishToWeb"
    enabled = $true
    tenantSettingGroup = @(
        @{
            name     = "Marketing-Approved"
            type     = "SecurityGroup"
            objectId = "12345678-1234-1234-1234-123456789012"
        }
    )
    canSpecifySecurityGroups = $true
} | ConvertTo-Json -Depth 5

Invoke-PowerBIRestMethod -Url 'admin/tenantsettings' -Method Put -Body $publishToWebSetting

# Example: Enable custom visuals (certified only)
$customVisualsSetting = @{
    settingsName = "CustomVisualsTenant"
    enabled = $true
    allowVisualDataPointAppliedThemes = $false
    addCustomVisualsToBlockList = $false
    certifiedCustomVisualsOnly = $true
} | ConvertTo-Json

Invoke-PowerBIRestMethod -Url 'admin/tenantsettings' -Method Put -Body $customVisualsSetting
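Tenant settings also drift as admins make ad-hoc changes, so it is worth diffing the daily exports produced above. A sketch comparing two snapshot files (file names hypothetical):
# Compare two tenant-settings snapshots and report drift
$baseline = Get-Content "PowerBI-TenantSettings-baseline.json" | ConvertFrom-Json
$current  = Get-Content "PowerBI-TenantSettings-$(Get-Date -Format 'yyyy-MM-dd').json" | ConvertFrom-Json

foreach ($setting in $current) {
    $old = $baseline | Where-Object { $_.settingName -eq $setting.settingName }
    if ($old -and $old.enabled -ne $setting.enabled) {
        Write-Host "DRIFT: $($setting.settingName) changed from $($old.enabled) to $($setting.enabled)"
    }
}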
Monitoring and Performance Management
Capacity Metrics App Setup
# Install and configure the Premium Capacity Metrics app
# Navigate to: https://app.powerbi.com/groups/me/getapps
# Key metrics to monitor:
# 1. CPU utilization (target: <80% sustained)
# 2. Memory utilization (target: <90%)
# 3. Query duration (P50, P95, P99)
# 4. Query wait times (target: <100ms)
# 5. Refresh duration and success rate
# 6. Active datasets and user sessions
# Create custom monitoring dashboard
$monitoringQuery = @"
// KQL query for Log Analytics workspace
PowerBIActivity
| where TimeGenerated > ago(24h)
| where CapacityName == "Production Capacity"
| summarize
AvgCPU = avg(CpuPercentage),
MaxCPU = max(CpuPercentage),
AvgMemory = avg(MemoryPercentage),
MaxMemory = max(MemoryPercentage),
QueryCount = count(),
AvgDuration = avg(DurationMs)
by bin(TimeGenerated, 1h)
| render timechart
"@
Real-Time Monitoring Script
# Monitor capacity health in real-time
function Monitor-CapacityHealth {
    param(
        [string]$CapacityName,
        [int]$IntervalSeconds = 300,
        [int]$DurationMinutes = 60
    )

    $endTime = (Get-Date).AddMinutes($DurationMinutes)

    while ((Get-Date) -lt $endTime) {
        $capacity = Get-PowerBICapacity -Scope Organization | Where-Object {$_.DisplayName -eq $CapacityName}

        # List refreshables on the capacity (admin REST API)
        $metrics = Invoke-PowerBIRestMethod -Url "admin/capacities/$($capacity.Id)/refreshables" -Method Get | ConvertFrom-Json

        $timestamp = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
        Write-Host "[$timestamp] Capacity: $CapacityName"
        Write-Host "  Active Refreshes: $($metrics.value.Count)"
        Write-Host "  Status: $($capacity.State)"
        Write-Host "  Region: $($capacity.Region)"
        Write-Host "  SKU: $($capacity.Sku)"
        Write-Host "  ---"

        Start-Sleep -Seconds $IntervalSeconds
    }
}
# Run monitoring
Monitor-CapacityHealth -CapacityName "Production Capacity" -IntervalSeconds 300 -DurationMinutes 60
Alert Configuration
# Create alerts for capacity issues
$alertConfig = @{
    Name        = "Capacity-CPU-High"
    Description = "Alert when capacity CPU exceeds 85% for 10 minutes"
    Severity    = "Critical"
    Threshold   = 85
    Duration    = 600 # seconds
    Recipients  = @("admin@contoso.com", "biops@contoso.com")
    Actions     = @(
        @{Type = "Email"; Template = "CapacityCPUAlert"},
        @{Type = "Webhook"; Url = "https://hooks.slack.com/services/YOUR/WEBHOOK/URL"}
    )
}
# Monitoring automation with Azure Logic Apps or Power Automate
# Trigger: HTTP request from monitoring system
# Conditions:
# - CPU > 85% for 10+ minutes → Scale up or redistribute workspaces
# - Memory > 90% → Investigate large datasets or failed refreshes
# - Query wait > 500ms → Optimize reports or add capacity
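The webhook action can be exercised with a minimal polling sketch; the CPU reading itself is assumed to come from your monitoring source (the Capacity Metrics dataset, Log Analytics, etc.):
# Post a Slack alert when a CPU reading crosses the threshold
$cpuPercent = 92   # example reading from your monitoring source
$threshold  = 85

if ($cpuPercent -gt $threshold) {
    $payload = @{ text = "Power BI capacity CPU at $cpuPercent% (threshold $threshold%)" } | ConvertTo-Json
    Invoke-RestMethod -Uri "https://hooks.slack.com/services/YOUR/WEBHOOK/URL" `
        -Method Post -Body $payload -ContentType "application/json"
}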
Cost Optimization Strategies
Capacity Utilization Analysis
# Analyze workspace utilization across capacity
function Get-CapacityUtilization {
    param([string]$CapacityId)

    # Get all workspaces on the capacity
    $workspaces = Get-PowerBIWorkspace -Scope Organization -Filter "capacityId eq '$CapacityId'"

    $utilizationReport = foreach ($workspace in $workspaces) {
        $datasets = Get-PowerBIDataset -WorkspaceId $workspace.Id

        # Get-PowerBIDataset does not expose dataset size or last-used dates.
        # Populate these from the admin scanner API (workspaces/getInfo) or the
        # Capacity Metrics app; they are left as placeholders here.
        [PSCustomObject]@{
            WorkspaceName       = $workspace.Name
            DatasetCount        = $datasets.Count
            TotalSizeGB         = $null   # from scanner API / Capacity Metrics
            RefreshableDatasets = ($datasets | Where-Object { $_.IsRefreshable }).Count
            LastActivity        = $null   # from Get-PowerBIActivityEvent
        }
    }
    return $utilizationReport | Sort-Object -Property DatasetCount -Descending
}

# Run analysis
$utilization = Get-CapacityUtilization -CapacityId "12345678-1234-1234-1234-123456789012"
$utilization | Export-Csv "CapacityUtilization-$(Get-Date -Format 'yyyy-MM-dd').csv" -NoTypeInformation

# Identify candidates for removal or shared-capacity migration by joining in
# usage data (for example, workspaces with no ViewReport events in the last
# 30 days from Get-PowerBIActivityEvent) before freeing capacity
Autoscale Configuration
# Enable autoscale for Premium capacity (P1-P3)
# Note: Autoscale temporarily adds v-cores during high demand, billed per hour
# Configure via the Azure portal or ARM REST API
# (the endpoint below is illustrative; autoscale is a property of the
# Microsoft.PowerBIDedicated capacity resource, so check the current ARM API reference)
$autoscaleConfig = @{
    isEnabled          = $true
    maxAutoscaleVCores = 8 # Maximum additional v-cores
    notifyOnSuccess    = $true
    notifyOnFailure    = $true
    emails             = @("admin@contoso.com")
} | ConvertTo-Json

# Acquire an ARM token (Get-AzAccessToken returns the token, not ready-made headers)
$armToken = (Get-AzAccessToken -ResourceUrl "https://management.azure.com").Token
$armHeaders = @{ Authorization = "Bearer $armToken" }

# Apply to capacity
Invoke-RestMethod -Uri "https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroup}/providers/Microsoft.PowerBIDedicated/capacities/{capacityName}/autoscale?api-version=2021-01-01" `
    -Method Put -Body $autoscaleConfig -Headers $armHeaders -ContentType "application/json"

# Monitor autoscale events
$autoscaleEvents = Get-AzActivityLog -ResourceId "/subscriptions/{subscriptionId}/resourceGroups/{resourceGroup}/providers/Microsoft.PowerBIDedicated/capacities/{capacityName}" `
    -StartTime (Get-Date).AddDays(-7) | Where-Object {$_.OperationName -like "*autoscale*"}
Cost Reduction Checklist
☑ Decommission Strategies:
├─ Archive datasets unused for >90 days
├─ Migrate low-usage workspaces to shared capacity
├─ Consolidate duplicate datasets
└─ Remove orphaned reports and dashboards
☑ Efficiency Improvements:
├─ Implement aggregations for large datasets
├─ Enable incremental refresh
├─ Optimize DAX measures (avoid row-context iterations)
├─ Schedule refreshes during off-peak hours (see the schedule-audit sketch after this checklist)
└─ Implement query caching
☑ License Optimization:
├─ Evaluate PPU vs Premium for smaller teams
├─ Right-size capacity SKU (P1→P2 if consistently <50% utilized)
├─ Consider reserved instances for long-term commitments
└─ Leverage autoscale instead of over-provisioning
☑ Governance Controls:
├─ Require workspace approval workflow
├─ Implement dataset certification process
├─ Set workspace retention policies
└─ Regular access reviews (quarterly)
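For the off-peak scheduling item above, refresh schedules are readable through the REST API; a sketch that flags business-hour slots (workspace ID hypothetical):
# Flag datasets whose scheduled refresh falls inside business hours (09:00-16:59)
$workspaceId = "workspace-id"
Get-PowerBIDataset -WorkspaceId $workspaceId | Where-Object { $_.IsRefreshable } | ForEach-Object {
    $schedule = Invoke-PowerBIRestMethod `
        -Url "groups/$workspaceId/datasets/$($_.Id)/refreshSchedule" -Method Get | ConvertFrom-Json
    $peakSlots = $schedule.times | Where-Object { [int]($_.Split(':')[0]) -in 9..16 }
    if ($peakSlots) {
        Write-Host "$($_.Name) refreshes during business hours: $($peakSlots -join ', ')"
    }
}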
Security and Compliance
Row-Level Security (RLS) Governance
-- Implement dynamic RLS based on user email
[RegionFilter] =
VAR UserEmail = USERPRINCIPALNAME()
VAR UserRegion =
    LOOKUPVALUE(
        'UserRegionMapping'[Region],
        'UserRegionMapping'[Email], UserEmail
    )
RETURN
    'Sales'[Region] = UserRegion

-- Manager hierarchy RLS
-- (IN requires a single-column table, so project EmployeeID with SELECTCOLUMNS)
[ManagerHierarchy] =
VAR CurrentUser = USERPRINCIPALNAME()
VAR UserEmployeeId =
    LOOKUPVALUE(
        'Employees'[EmployeeID],
        'Employees'[Email], CurrentUser
    )
VAR ManagedEmployeeIds =
    SELECTCOLUMNS(
        FILTER(
            'Employees',
            PATHCONTAINS('Employees'[ManagerPath], UserEmployeeId)
        ),
        "EmployeeID", 'Employees'[EmployeeID]
    )
RETURN
    'Sales'[EmployeeID] IN ManagedEmployeeIds
Object-Level Security (OLS)
# Configure OLS using the Tabular Object Model (TOM)
# Requires the Analysis Services client libraries (Microsoft.AnalysisServices.Tabular)
# and an XMLA endpoint enabled on the capacity; Tabular Editor or SSMS work as alternatives
Add-Type -AssemblyName "Microsoft.AnalysisServices.Tabular"  # load from the installed client libraries

$server = New-Object Microsoft.AnalysisServices.Tabular.Server
$server.Connect("powerbi://api.powerbi.com/v1.0/myorg/[WorkspaceName]")
$database = $server.Databases["DatasetName"]
$model = $database.Model

# Hide the Salary column from the "General Users" role
$table = $model.Tables["Employees"]
$role = $model.Roles["General Users"]
$tablePermission = $role.TablePermissions["Employees"]  # assumes a table permission already exists

# ColumnPermissions take a ColumnPermission object, not a (name, value) pair
$columnPermission = New-Object Microsoft.AnalysisServices.Tabular.ColumnPermission
$columnPermission.Column = $table.Columns["Salary"]
$columnPermission.MetadataPermission = [Microsoft.AnalysisServices.Tabular.MetadataPermission]::None
$tablePermission.ColumnPermissions.Add($columnPermission)

$model.SaveChanges()
$server.Disconnect()
Sensitivity Labels Integration
# Apply sensitivity labels to datasets
# Requires Microsoft Information Protection (MIP) labels configured in Microsoft Purview.
# Note: Get-Label comes from Security & Compliance PowerShell (Connect-IPPSSession);
# the Power BI cmdlets do not set labels, so apply them in the service UI or via the
# Power BI information protection admin REST APIs.

# List available labels (Security & Compliance session)
$labels = Get-Label
$labels | Format-Table DisplayName, Guid

# Audit label usage over the past day (the activity API only accepts windows
# within a single UTC day; loop day-by-day for longer ranges, as shown below)
$labeledContent = Get-PowerBIActivityEvent `
        -StartDateTime (Get-Date).Date.ToString('yyyy-MM-ddT00:00:00') `
        -EndDateTime (Get-Date).Date.ToString('yyyy-MM-ddT23:59:59') |
    ConvertFrom-Json |
    Where-Object { $_.Activity -eq "ApplySensitivityLabel" } |
    Group-Object -Property SensitivityLabelName

$labeledContent | Format-Table Name, Count
Audit Log Analysis
# Extract audit logs for compliance reporting
function Export-PowerBIAuditLogs {
    param(
        [datetime]$StartDate,
        [datetime]$EndDate,
        [string]$OutputPath
    )

    $activities = @()
    $currentDate = $StartDate.Date

    while ($currentDate -lt $EndDate) {
        Write-Host "Fetching logs for $($currentDate.ToString('yyyy-MM-dd'))..."

        # The activity event API only accepts windows within a single UTC day
        $dailyActivities = Get-PowerBIActivityEvent `
            -StartDateTime $currentDate.ToString('yyyy-MM-ddT00:00:00') `
            -EndDateTime $currentDate.ToString('yyyy-MM-ddT23:59:59')

        $activities += $dailyActivities | ConvertFrom-Json
        $currentDate = $currentDate.AddDays(1)
    }

    # Export to CSV
    $activities | Export-Csv -Path $OutputPath -NoTypeInformation

    # Generate summary report
    $summary = @{
        TotalActivities = $activities.Count
        UniqueUsers     = ($activities.UserId | Sort-Object -Unique).Count
        TopActivities   = $activities | Group-Object Activity | Sort-Object Count -Descending | Select-Object -First 10
        DataExports     = ($activities | Where-Object {$_.Activity -like "*Export*"}).Count
        SharedReports   = ($activities | Where-Object {$_.Activity -eq "ShareReport"}).Count
    }

    return $summary
}
# Run audit export
$auditSummary = Export-PowerBIAuditLogs `
-StartDate (Get-Date).AddMonths(-1) `
-EndDate (Get-Date) `
-OutputPath "C:\Audits\PowerBI-Audit-$(Get-Date -Format 'yyyy-MM').csv"
$auditSummary
Advanced Optimization Techniques
Dataset Optimization
-- Use variables to avoid recalculating the same aggregation
AverageSellingPrice =
VAR TotalAmount = SUM('Sales'[Amount])
VAR TotalQuantity = SUM('Sales'[Quantity])
RETURN
    -- each aggregation is evaluated once, then reused in both the test and the division
    IF(TotalQuantity > 0, DIVIDE(TotalAmount, TotalQuantity))

-- Implement aggregations for large fact tables
-- In Power BI Desktop: Manage Aggregations
-- Create aggregation table:
SalesAggregated =
SUMMARIZECOLUMNS(
    'Date'[Year],
    'Date'[Month],
    'Product'[Category],
    'Customer'[Country],
    "TotalSales", SUM('Sales'[Amount]),
    "TotalQuantity", SUM('Sales'[Quantity]),
    "OrderCount", COUNTROWS('Sales')
)
-- Incremental refresh policy (via XMLA endpoint)
-- Configure in Power BI Desktop under dataset settings
-- RangeStart and RangeEnd parameters required
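Once a policy is published, refreshes triggered through the refresh API honor it. A sketch using the enhanced refresh request shape (IDs hypothetical; verify the body parameters against the current API reference):
# Trigger an on-demand refresh that applies the incremental refresh policy
$refreshBody = @{
    type               = "Full"
    commitMode         = "transactional"
    applyRefreshPolicy = $true
    retryCount         = 1
} | ConvertTo-Json

Invoke-PowerBIRestMethod -Url "groups/workspace-id/datasets/dataset-id/refreshes" `
    -Method Post -Body $refreshBody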
Query Performance Tuning
# Analyze slow queries using DAX Studio
# Connect to dataset via XMLA endpoint: powerbi://api.powerbi.com/v1.0/myorg/[WorkspaceName]
# Export session data for slow queries
# (single-quoted here-string so PowerShell does not try to expand $SYSTEM;
# DMV SQL supports only simple SELECT/WHERE, no ORDER BY, so sort client-side)
$slowQueries = @'
SELECT
    [SESSION_SPID],
    [SESSION_USER_NAME],
    [SESSION_LAST_COMMAND],
    [SESSION_CPU_TIME_MS],
    [SESSION_ELAPSED_TIME_MS]
FROM $SYSTEM.DISCOVER_SESSIONS
WHERE [SESSION_ELAPSED_TIME_MS] > 5000 -- Sessions running >5 seconds
'@
# Analyze query plans
# Look for:
# - Table scans (should use storage engine when possible)
# - Formula engine bottlenecks
# - Unnecessary CALCULATE/FILTER iterations
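With the SqlServer module installed, the DMV above can also run directly against the XMLA endpoint from PowerShell; a sketch:
# Execute the DMV against the workspace XMLA endpoint
Import-Module SqlServer
$resultXml = Invoke-ASCmd -Server "powerbi://api.powerbi.com/v1.0/myorg/[WorkspaceName]" `
    -Database "DatasetName" -Query $slowQueries

# The result comes back as XML; parse and sort client-side
[xml]$parsed = $resultXml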
Troubleshooting Guide
Common Issues and Resolutions
Issue 1: Capacity Overload
Symptoms:
- High CPU utilization (>90% sustained)
- Query wait times increasing
- Refresh failures due to resource constraints
Diagnosis:
# Check capacity metrics
$capacity = Get-PowerBICapacity -Scope Organization | Where-Object {$_.DisplayName -eq "Production Capacity"}
# Get workspaces with the most datasets
# (dataset size is not exposed by Get-PowerBIDataset; use the Capacity
# Metrics app or the admin scanner API for actual memory footprints)
$resourceIntensiveWorkspaces = Get-PowerBIWorkspace -Scope Organization -Filter "capacityId eq '$($capacity.Id)'" |
    ForEach-Object {
        $datasets = Get-PowerBIDataset -WorkspaceId $_.Id
        [PSCustomObject]@{
            Workspace    = $_.Name
            DatasetCount = $datasets.Count
        }
    } | Sort-Object DatasetCount -Descending

$resourceIntensiveWorkspaces | Format-Table
Resolution:
- Immediate: Enable autoscale or temporarily upgrade SKU
- Short-term: Redistribute workspaces across multiple capacities (a reassignment sketch follows this list)
- Long-term: Optimize datasets (aggregations, incremental refresh, remove unused columns)
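A minimal redistribution sketch using the Assign To Capacity API (workspace and capacity IDs hypothetical):
# Move a workspace to a different capacity to relieve pressure
$assignBody = @{ capacityId = "target-capacity-id" } | ConvertTo-Json
Invoke-PowerBIRestMethod -Url "groups/workspace-id/AssignToCapacity" -Method Post -Body $assignBody
# Passing capacityId = "00000000-0000-0000-0000-000000000000" moves the
# workspace back to shared capacity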
Issue 2: Refresh Failures
Symptoms:
- Datasets failing to refresh
- Error: "Out of memory" or "Timeout"
Diagnosis:
# Get refresh history for all datasets (via the REST refreshes endpoint)
$refreshFailures = Get-PowerBIWorkspace -Scope Organization | ForEach-Object {
    $workspace = $_
    $datasets = Get-PowerBIDataset -WorkspaceId $workspace.Id
    foreach ($dataset in $datasets) {
        if ($dataset.IsRefreshable) {
            $refreshHistory = (Invoke-PowerBIRestMethod `
                -Url "groups/$($workspace.Id)/datasets/$($dataset.Id)/refreshes?`$top=5" `
                -Method Get | ConvertFrom-Json).value
            $failures = $refreshHistory | Where-Object { $_.status -eq "Failed" }
            if ($failures) {
                [PSCustomObject]@{
                    Workspace    = $workspace.Name
                    Dataset      = $dataset.Name
                    FailureCount = $failures.Count
                    LastError    = $failures[0].serviceExceptionJson
                    LastAttempt  = $failures[0].endTime
                }
            }
        }
    }
}
$refreshFailures | Sort-Object FailureCount -Descending | Format-Table
Resolution:
- Memory issues: Reduce dataset size, implement incremental refresh, schedule refreshes during off-peak
- Timeout issues: Optimize source queries, add indexing to source database, split large refreshes
- Connectivity issues: Check gateway health, verify credentials, review firewall rules
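For the connectivity branch, gateway and datasource health can be checked through the gateways API; a sketch (requires permissions on the gateways in question):
# Check gateway clusters and datasource connectivity
$gateways = (Invoke-PowerBIRestMethod -Url "gateways" -Method Get | ConvertFrom-Json).value
foreach ($gw in $gateways) {
    Write-Host "Gateway: $($gw.name)"
    $datasources = (Invoke-PowerBIRestMethod -Url "gateways/$($gw.id)/datasources" -Method Get | ConvertFrom-Json).value
    foreach ($ds in $datasources) {
        try {
            # Get Datasource Status errors out when the connection is broken
            Invoke-PowerBIRestMethod -Url "gateways/$($gw.id)/datasources/$($ds.id)/status" -Method Get | Out-Null
            Write-Host "  $($ds.datasourceType): OK"
        } catch {
            Write-Host "  $($ds.datasourceType): FAILED - $($_.Exception.Message)"
        }
    }
}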
Issue 3: Unauthorized Data Access
Symptoms:
- Users accessing data outside their permissions
- Compliance violations
Diagnosis:
# Audit data access patterns (the activity API only accepts a single UTC day
# per call, so loop day-by-day across the week)
$accessAudit = 0..6 | ForEach-Object {
    $day = (Get-Date).Date.AddDays(-$_)
    Get-PowerBIActivityEvent `
        -StartDateTime $day.ToString('yyyy-MM-ddT00:00:00') `
        -EndDateTime $day.ToString('yyyy-MM-ddT23:59:59') |
        ConvertFrom-Json
} | Where-Object { $_.Activity -in @("ViewReport", "ExportReport", "ExportArtifact") }

# Identify unusual access patterns
$suspiciousAccess = $accessAudit |
    Group-Object UserId, ReportName |
    Where-Object { $_.Count -gt 100 } |   # more than 100 accesses in 7 days
    Select-Object Name, Count

$suspiciousAccess | Format-Table
Resolution:
- Review and enforce RLS/OLS policies
- Tighten tenant settings (disable exports for unauthorized groups)
- Implement periodic access reviews
- Enable MIP sensitivity labels
Issue 4: Cost Overruns
Symptoms:
- Higher-than-expected Azure bills
- Autoscale frequently triggered
Diagnosis:
# Calculate capacity cost breakdown
function Get-CapacityCostAnalysis {
    param([string]$CapacityName, [decimal]$HourlyCost)

    # For P2: ~$9,995/month ≈ $13.88/hour
    # Autoscale: ~$1.74/v-core/hour (verify current pricing)
    $capacity = Get-PowerBICapacity -Scope Organization | Where-Object {$_.DisplayName -eq $CapacityName}

    $baseVCores = switch ($capacity.Sku) {
        "P1" { 8 }
        "P2" { 16 }
        "P3" { 32 }
        "P4" { 64 }
        "P5" { 128 }
        default { 0 }
    }

    # Get autoscale usage (from the Azure billing API or portal)
    # This is a simplified example
    [PSCustomObject]@{
        CapacityName           = $CapacityName
        SKU                    = $capacity.Sku
        BaseVCores             = $baseVCores
        BaseMonthlyCost        = $HourlyCost * 730 # hours per month
        EstimatedAutoscaleCost = 0 # Calculate from actual usage
        TotalEstimatedCost     = $HourlyCost * 730
    }
}
Get-CapacityCostAnalysis -CapacityName "Production Capacity" -HourlyCost 13.88
Resolution:
- Disable autoscale if consistently triggered (indicates under-provisioned capacity)
- Implement cost allocation tags by department/workspace
- Enforce workspace approval process
- Archive or delete unused content monthly
- Consider PPU for smaller teams (<300 users)
Deployment and CI/CD
Deployment Pipelines Setup
# Create a deployment pipeline programmatically
# (Create Pipeline takes only displayName/description; the service creates the
# stages, and workspaces are assigned per stage with a separate call)
$pipelineConfig = @{
    displayName = "Sales Reports Pipeline"
    description = "Dev → Test → Production pipeline"
} | ConvertTo-Json

$pipeline = Invoke-PowerBIRestMethod -Url "pipelines" -Method Post -Body $pipelineConfig
$pipelineId = ($pipeline | ConvertFrom-Json).id

# Assign a workspace to each stage (0 = Dev, 1 = Test, 2 = Production)
$stageWorkspaces = @{ 0 = "dev-workspace-id"; 1 = "test-workspace-id"; 2 = "prod-workspace-id" }
foreach ($stage in $stageWorkspaces.Keys) {
    Invoke-PowerBIRestMethod -Url "pipelines/$pipelineId/stages/$stage/assignWorkspace" -Method Post -Body (@{
        workspaceId = $stageWorkspaces[$stage]
    } | ConvertTo-Json)
}

# Deploy everything from Dev to Test (returns an async operation descriptor)
$deployResult = Invoke-PowerBIRestMethod -Url "pipelines/$pipelineId/deployAll" -Method Post -Body (@{
    sourceStageOrder = 0 # Dev
    options = @{
        allowOverwriteArtifact = $true
        allowCreateArtifact    = $true
    }
} | ConvertTo-Json) | ConvertFrom-Json
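Deployments run asynchronously, so poll the pipeline operations endpoint to see when they finish. A short sketch building on $deployResult above (assuming the deploy response exposes the operation id):
# Poll the deployment operation until it completes
do {
    Start-Sleep -Seconds 10
    $operation = Invoke-PowerBIRestMethod `
        -Url "pipelines/$pipelineId/operations/$($deployResult.id)" -Method Get | ConvertFrom-Json
    Write-Host "Deployment status: $($operation.status)"
} while ($operation.status -in @("NotStarted", "Executing"))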
Automated Governance Script
# Daily governance automation script
# Schedule via Azure Automation or Task Scheduler
function Invoke-DailyGovernanceChecks {
    # $cred and $tenantId are assumed to be supplied by the automation account
    Connect-PowerBIServiceAccount -ServicePrincipal -Credential $cred -Tenant $tenantId

    $report = @{
        Date   = Get-Date -Format "yyyy-MM-dd"
        Checks = @()
    }

    # Check 1: production datasets without an effective-identity (RLS) requirement.
    # Note: endorsement/certification status is not exposed by Get-PowerBIDataset;
    # use the admin scanner API for a true certification check.
    $unprotectedDatasets = Get-PowerBIWorkspace -Scope Organization -Filter "name eq 'Production'" | ForEach-Object {
        Get-PowerBIDataset -WorkspaceId $_.Id | Where-Object { -not $_.IsEffectiveIdentityRequired }
    }
    $report.Checks += @{
        Name  = "Production Datasets Without RLS Identity"
        Count = $unprotectedDatasets.Count
        Items = $unprotectedDatasets.Name
    }

    # Check 2: workspaces without an admin
    $orphanedWorkspaces = Get-PowerBIWorkspace -Scope Organization | Where-Object {
        $users = (Invoke-PowerBIRestMethod -Url "admin/groups/$($_.Id)/users" -Method Get | ConvertFrom-Json).value
        ($users | Where-Object { $_.groupUserAccessRight -eq "Admin" }).Count -eq 0
    }
    $report.Checks += @{
        Name  = "Workspaces Without Owners"
        Count = $orphanedWorkspaces.Count
        Items = $orphanedWorkspaces.Name
    }

    # Check 3: datasets exceeding the refresh-time SLA (>2 hours)
    $longRefreshes = Get-PowerBIWorkspace -Scope Organization | ForEach-Object {
        $workspaceId = $_.Id
        Get-PowerBIDataset -WorkspaceId $workspaceId | ForEach-Object {
            $lastRefresh = (Invoke-PowerBIRestMethod `
                -Url "groups/$workspaceId/datasets/$($_.Id)/refreshes?`$top=1" `
                -Method Get | ConvertFrom-Json).value | Select-Object -First 1
            if ($lastRefresh -and $lastRefresh.endTime -and
                ([datetime]$lastRefresh.endTime - [datetime]$lastRefresh.startTime).TotalHours -gt 2) {
                [PSCustomObject]@{
                    Dataset  = $_.Name
                    Duration = [math]::Round(([datetime]$lastRefresh.endTime - [datetime]$lastRefresh.startTime).TotalHours, 2)
                }
            }
        }
    }
    $report.Checks += @{
        Name  = "Long-Running Refreshes (>2 hours)"
        Count = $longRefreshes.Count
        Items = $longRefreshes
    }

    # Send email report
    $emailBody = $report | ConvertTo-Json -Depth 5
    Send-MailMessage -To "biops@contoso.com" -Subject "Power BI Governance Report - $($report.Date)" -Body $emailBody -SmtpServer "smtp.contoso.com"

    return $report
}
# Execute daily checks
Invoke-DailyGovernanceChecks
Security and Compliance Checklist
Enterprise Security Hardening
Identity and Access Management:
☑ Azure AD Integration
- Enable conditional access policies for Power BI
- Require MFA for all admin accounts
- Implement just-in-time (JIT) admin access
- Regular access reviews (quarterly minimum)
☑ Service Principal Management
- Dedicated service principal per application/pipeline
- Certificate-based authentication (not client secrets)
- Regular credential rotation (90 days)
- Audit service principal permissions monthly
☑ Guest User Controls
- Restrict guest invitations to specific domains
- Limit guest permissions (view only)
- Implement external sharing policies
- Monthly guest user access audits
☑ Workspace Security
- Minimum of 2 admins per workspace (no single point of failure)
- Role-based access aligned with business functions
- Separate DEV/TEST/PROD workspaces
- Document access request and approval workflow
Data Protection:
# Information protection and mandatory-label policies are enabled in the admin
# portal (Tenant settings > Information protection); there is no official
# cmdlet for this, so the two lines below are illustrative pseudo-commands only.
# Set-PowerBITenantSetting -SettingName "InformationProtectionPolicies" -Enabled $true
# Set-PowerBITenantSetting -SettingName "MandatoryLabelPolicy" -Enabled $true -AppliesTo @("Organization")

# Audit data protection compliance
# (Get-PowerBIDataset does not return sensitivity labels; pull them from the
# admin scanner API in practice; the projection below is illustrative)
$labelAudit = Get-PowerBIWorkspace -Scope Organization | ForEach-Object {
    $workspaceName = $_.Name
    Get-PowerBIDataset -WorkspaceId $_.Id | Select-Object @{
        Name = "Workspace"; Expression = { $workspaceName }
    }, Name, @{
        Name = "HasSensitivityLabel"; Expression = { $null -ne $_.SensitivityLabel }
    }
}

# Identify datasets without labels
$unlabeledDatasets = $labelAudit | Where-Object { -not $_.HasSensitivityLabel }
Write-Host "Unlabeled Datasets: $($unlabeledDatasets.Count)"
Encryption and Key Management:
☑ Data at Rest Encryption
- Premium capacities: BYOK (Bring Your Own Key) via Azure Key Vault
- Rotate encryption keys annually
- Document key management procedures
- Test key recovery process quarterly
☑ Data in Transit
- Enforce TLS 1.2 minimum
- Disable legacy protocols (TLS 1.0/1.1)
- Certificate pinning for on-premises gateway
☑ Gateway Security
- Dedicated machine for gateway (not domain controller)
- Windows Defender enabled and updated
- Restrict network access (firewall rules)
- Log all gateway activities
- Monthly security patching
Compliance Frameworks
GDPR Compliance:
☑ Data Subject Rights
- Document process for access requests (DSR)
- Implement data export functionality
- Define data retention policies (7 years financial, 3 years operational)
- Establish right to deletion workflow
☑ Data Minimization
- Remove PII from datasets when not required
- Use hashing/pseudonymization for analytics
- Implement column-level security (OLS)
- Regular PII audit (quarterly)
☑ Consent Management
- Track consent for data processing
- Implement consent withdrawal process
- Document legal basis for processing
SOX Compliance (Financial Reporting):
☑ Change Management
- Version control for all production reports/datasets
- Change approval workflow (documented and enforced)
- Quarterly certification of financial reports
- Audit trail for all modifications
☑ Access Controls
- Separation of duties (developer ≠ publisher)
- Read-only access for auditors
- Disable ad-hoc data export for financial reports
- Monthly access certification
☑ Data Integrity
- Source system reconciliation (automated daily; see the sketch after this list)
- Refresh failure alerting (immediate)
- Data validation rules (automated)
- Exception reporting and resolution tracking
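The reconciliation item above can be automated with the Execute Queries REST API, comparing a model row count to the source system; a sketch with hypothetical IDs and a hypothetical source count:
# Reconcile model row count against the source system
$daxBody = @{
    queries = @(@{ query = 'EVALUATE ROW("SalesRows", COUNTROWS(Sales))' })
} | ConvertTo-Json -Depth 5

$result = Invoke-PowerBIRestMethod -Url "datasets/dataset-id/executeQueries" `
    -Method Post -Body $daxBody | ConvertFrom-Json
$modelRows = $result.results[0].tables[0].rows[0].'[SalesRows]'

$sourceRows = 1048576   # from your source-system query (hypothetical)
if ($modelRows -ne $sourceRows) {
    Write-Host "RECONCILIATION FAILED: model=$modelRows source=$sourceRows"
}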
HIPAA Compliance (Healthcare):
☑ PHI Protection
- Enable row-level security for all PHI datasets
- Encrypt all data at rest and in transit
- Implement BAA (Business Associate Agreement) with Microsoft
- Disable external sharing for workspaces with PHI
☑ Audit Requirements
- Log all access to PHI reports (retain 6 years)
- Monitor for unusual access patterns
- Quarterly risk assessments
- Annual third-party security audit
☑ Breach Notification
- Documented incident response plan
- Notification procedures (within 60 days)
- Remediation tracking
- Post-incident review process
Audit and Monitoring
Activity Monitoring PowerShell:
# Comprehensive 30-day activity audit
function Get-PowerBISecurityAudit {
    param([int]$Days = 30)

    $startDate = (Get-Date).Date.AddDays(-$Days)
    $allActivities = @()

    # Fetch activities in 24-hour chunks (the API only accepts windows within a single UTC day)
    for ($i = 0; $i -lt $Days; $i++) {
        $current = $startDate.AddDays($i)
        Write-Host "Fetching activities for $($current.ToString('yyyy-MM-dd'))..."

        $activities = Get-PowerBIActivityEvent `
            -StartDateTime $current.ToString('yyyy-MM-ddT00:00:00') `
            -EndDateTime $current.ToString('yyyy-MM-ddT23:59:59') |
            ConvertFrom-Json

        $allActivities += $activities
    }

    # Security-relevant activity analysis
    $securityEvents = $allActivities | Where-Object {
        $_.Activity -in @(
            "ViewReport", "ExportReport", "ExportArtifact",
            "ShareReport", "ShareDataset", "CreateWorkspace",
            "AddWorkspaceMembers", "UpdateWorkspace",
            "SetScheduledRefresh", "UpdateDatasourceCredentials"
        )
    }

    # Generate security metrics
    $metrics = @{
        TotalActivities = $allActivities.Count
        SecurityEvents  = $securityEvents.Count
        UniqueUsers     = ($securityEvents | Select-Object -Unique -ExpandProperty UserId).Count
        TopActivities   = $securityEvents | Group-Object Activity | Sort-Object Count -Descending | Select-Object -First 10
        ExternalSharing = ($securityEvents | Where-Object {$_.Activity -like "*Share*"}).Count
        DataExports     = ($securityEvents | Where-Object {$_.Activity -like "*Export*"}).Count
    }

    return [PSCustomObject]$metrics
}
# Run monthly security audit
$securityAudit = Get-PowerBISecurityAudit -Days 30
$securityAudit | Format-List
Compliance Reporting Dashboard:
Create Power BI dashboard tracking:
1. Security Metrics:
- Failed login attempts by user
- Access denied events
- Unusual access patterns (time, geography)
- Service principal activity
2. Compliance Metrics:
- % of datasets with sensitivity labels
- % of reports with RLS enabled
- Refresh success rate
- SLA compliance (refresh duration)
3. Governance Metrics:
- Workspaces without owners
- Uncertified datasets in production
- Orphaned content (unused >90 days)
- Cost per workspace/department
4. Capacity Health:
- CPU utilization trend
- Memory utilization trend
- Query duration percentiles (p50, p95, p99)
- Refresh queue depth
Best Practices Summary
Capacity Management
- Right-size from the start: Use the sizing calculator and monitor for 30 days before committing to SKU
- Enable autoscale strategically: Use for unpredictable spikes, not as a substitute for proper sizing
- Implement geographic distribution: Use multi-geo capacities for global organizations to reduce latency
- Reserve capacity commitments: Reserved pricing for 1-3 year terms is typically well below pay-as-you-go rates; verify current discounts before committing
Workload Optimization
- Balance workload allocation: Don't exceed 100% total memory allocation across workloads
- Separate production and development: Use different capacities or at minimum different workspaces
- Schedule refreshes intelligently: Stagger refreshes, avoid peak business hours, use incremental refresh
- Monitor and adjust: Review capacity metrics weekly, adjust workload settings based on usage patterns
Governance
- Principle of least privilege: Grant minimum necessary permissions, use RLS/OLS extensively
- Certification workflow: Require approval for production dataset promotion
- Audit regularly: Weekly access reviews, monthly compliance reports, quarterly capacity audits
- Document everything: Maintain runbooks, decision logs, exception approvals
Cost Control
- Implement showback/chargeback: Allocate costs to business units by workspace tags
- Quarterly cleanup: Archive unused content, decommission orphaned datasets
- Optimize aggressively: Aggregations, incremental refresh, query performance tuning
- Evaluate alternatives: Consider PPU for teams <300 users, Embedded for external scenarios
Key Takeaways
- Premium capacity requires proactive management: monitoring, optimization, and governance are ongoing activities
- Proper sizing prevents most performance issues: invest time upfront in capacity planning
- Security and compliance are non-negotiable: implement RLS, OLS, sensitivity labels, and audit logging
- Cost optimization is continuous: regular reviews, optimization, and cleanup prevent budget overruns
- Automation is essential: use PowerShell, REST API, and deployment pipelines for consistency
Next Steps
- Complete capacity assessment and sizing exercise
- Configure capacity metrics app and alerting
- Implement core tenant governance policies
- Establish deployment pipeline for production content
- Schedule quarterly governance reviews
- Document standard operating procedures
- Train capacity administrators on monitoring tools
- Implement cost allocation and showback reporting
Additional Resources
- Power BI Premium Capacity Planning
- Capacity Metrics App
- Admin Portal Documentation
- Power BI REST API Reference
- Power BI Governance Whitepaper
- Premium Capacity Optimization
Govern. Monitor. Optimize. Scale.