IoT Solution: Azure IoT Hub, Stream Analytics, and Real-Time Dashboards
Introduction
Connected-device scenarios share a common set of requirements: devices must connect securely and be manageable at scale, telemetry has to be ingested reliably, the stream needs to be analyzed in real time, anomalies and threshold breaches must raise alerts, and operators need live visualizations of what the fleet is doing. This guide builds a reference solution on Azure that covers each of these requirements using IoT Hub for connectivity, Stream Analytics for processing, Event Grid and Azure Functions for alerting, and Power BI for dashboards.
Solution Architecture
flowchart TB
DEVICES[IoT Devices] -->|MQTT/AMQP| IOTHUB[Azure IoT Hub]
IOTHUB --> ASA[Stream Analytics]
ASA --> POWERBI[Power BI Real-Time Dataset]
ASA --> STORAGE[(Azure Storage\nCold Path)]
ASA --> COSMOS[(Cosmos DB\nWarm Path)]
ASA --> EVENTGRID[Event Grid]
EVENTGRID --> FUNCTIONS[Azure Functions]
FUNCTIONS --> ALERTS[Alert Notifications]
POWERBI --> DASHBOARD[Live Dashboard]
Components Overview
| Component | Role | Key Features |
|---|---|---|
| IoT Hub | Device registry & messaging | Bidirectional comms, device twins |
| Stream Analytics | Real-time processing | Windowing, joins, anomaly detection |
| Event Grid | Event routing | Pub/sub, filtering |
| Cosmos DB | Time-series storage | Global distribution, low latency |
| Power BI | Visualization | Streaming datasets, auto-refresh |
Step-by-Step Guide
Step 1: Provision IoT Hub
az iot hub create \
--resource-group rg-iot \
--name contoso-iothub \
--sku S1 \
--location eastus
Step 2: Register Devices
az iot hub device-identity create \
--hub-name contoso-iothub \
--device-id sensor-01
# Get connection string
az iot hub device-identity connection-string show \
--hub-name contoso-iothub \
--device-id sensor-01
Step 3: Simulate Device Telemetry
Node.js Device Code:
const { Client, Message } = require('azure-iot-device');
const { Mqtt } = require('azure-iot-device-mqtt');
// Device connection string from Step 2 (device-identity connection-string show)
const connectionString = 'HostName=...';
const client = Client.fromConnectionString(connectionString, Mqtt);
client.open((err) => {
if (err) {
console.error('Connection error:', err);
} else {
setInterval(() => {
const telemetry = {
deviceId: 'sensor-01',
temperature: 20 + Math.random() * 10,
humidity: 60 + Math.random() * 20,
timestamp: new Date().toISOString()
};
const message = new Message(JSON.stringify(telemetry));
client.sendEvent(message, (err) => {
if (err) console.error('Send error:', err);
else console.log('Message sent:', telemetry);
});
}, 5000);
}
});
Step 4: Configure Stream Analytics Job
Input: IoT Hub
{
"name": "iothub-input",
"type": "Microsoft.StreamAnalytics/streamingjobs/inputs",
"properties": {
"type": "Stream",
"datasource": {
"type": "Microsoft.Devices/IotHubs",
"properties": {
"iotHubNamespace": "contoso-iothub",
"sharedAccessPolicyName": "service",
"sharedAccessPolicyKey": "...",
"endpoint": "messages/events",
"consumerGroupName": "$Default"
}
},
"serialization": {
"type": "Json",
"properties": { "encoding": "UTF8" }
}
}
}
Query: Aggregate Telemetry
SELECT
deviceId,
System.Timestamp() AS WindowEnd,
AVG(temperature) AS AvgTemperature,
AVG(humidity) AS AvgHumidity,
MAX(temperature) AS MaxTemperature,
MIN(temperature) AS MinTemperature
INTO [powerbi-output]
FROM [iothub-input]
GROUP BY deviceId, TumblingWindow(minute, 5);
-- Anomaly Detection (SpikeAndDip returns a record; extract the Score before filtering)
WITH AnomalyScores AS (
    SELECT deviceId, temperature,
        AnomalyDetection_SpikeAndDip(temperature, 95, 120, 'spikesanddips')
            OVER (LIMIT DURATION(minute, 10)) AS scores
    FROM [iothub-input]
)
SELECT deviceId, temperature,
    CAST(GetRecordPropertyValue(scores, 'Score') AS FLOAT) AS anomalyScore
INTO [alerts-output]
FROM AnomalyScores
WHERE CAST(GetRecordPropertyValue(scores, 'Score') AS FLOAT) > 0.8;
Output: Power BI
{
"name": "powerbi-output",
"type": "Microsoft.StreamAnalytics/streamingjobs/outputs",
"properties": {
"datasource": {
"type": "PowerBI",
"properties": {
"dataset": "IoTTelemetry",
"table": "SensorData",
"groupId": "...",
"refreshToken": "..."
}
}
}
}
Step 5: Azure Function for Alerts
[FunctionName("ProcessAlert")]
public static async Task Run(
[EventGridTrigger] EventGridEvent eventGridEvent,
ILogger log)
{
var alert = JsonConvert.DeserializeObject<IoTAlert>(eventGridEvent.Data.ToString());
if (alert.AnomalyScore > 0.8)
{
// Send email via SendGrid
await SendAlertEmail(alert.DeviceId, alert.Temperature);
// Log to Application Insights
log.LogWarning($"High temperature alert: Device {alert.DeviceId}, Temp: {alert.Temperature}°C");
}
}
Step 6: Power BI Real-Time Dashboard
Create Streaming Dataset:
- Power BI Service → Workspace → New → Streaming Dataset
- Source: Azure Stream Analytics
- Fields: deviceId, WindowEnd, AvgTemperature, AvgHumidity
Dashboard Tiles:
- Line chart: AvgTemperature over time
- Card: Current AvgTemperature
- Gauge: MaxTemperature (threshold 30°C)
- Table: Top 5 devices by temperature
Step 7: Device Twin Management
Update Desired Properties:
az iot hub device-twin update \
--hub-name contoso-iothub \
--device-id sensor-01 \
--set properties.desired='{"telemetryInterval": 10000}'
Device Code (Handle Twin Update):
client.getTwin((err, twin) => {
  if (err) {
    console.error('Could not get twin:', err);
    return;
  }
  // React to desired-property changes pushed from the cloud
  twin.on('properties.desired', (delta) => {
    if (delta.telemetryInterval) {
      telemetryInterval = delta.telemetryInterval;
      console.log('Updated interval:', telemetryInterval);
      // Restart the send loop here so the new interval takes effect
    }
  });
});
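Devices usually acknowledge a desired-property change by writing the value back as a reported property, so the back end can confirm the configuration was applied. A minimal sketch using the same twin object as above (the property name telemetryInterval mirrors the example):
Device Code (Report Applied Configuration):
twin.properties.reported.update({ telemetryInterval: telemetryInterval }, (err) => {
  if (err) console.error('Error reporting properties:', err);
  else console.log('Reported telemetryInterval back to IoT Hub');
});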
Step 8: Warm Path Storage (Cosmos DB)
Stream Analytics Output:
SELECT *
INTO [cosmosdb-output]
FROM [iothub-input];
Cosmos DB Configuration:
{
"datasource": {
"type": "Microsoft.Storage/DocumentDb",
"properties": {
"accountId": "contoso-cosmos",
"accountKey": "...",
"database": "IoTData",
"collectionNamePattern": "telemetry",
"partitionKey": "/deviceId"
}
}
}
Advanced Patterns
Edge Computing with IoT Edge
Deploy Module to Edge Device:
az iot edge set-modules \
--device-id edge-device-01 \
--hub-name contoso-iothub \
--content deployment.json
Module: Local Anomaly Detection
{
"modulesContent": {
"$edgeAgent": { ... },
"$edgeHub": { ... },
"anomalyDetectionModule": {
"version": "1.0",
"type": "docker",
"settings": {
"image": "mcr.microsoft.com/azureml/anomaly-detection:latest",
"createOptions": "{}"
}
}
}
}
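The module image above is a placeholder for whatever analytics you run at the edge. A custom module follows the same pattern regardless of its logic: $edgeHub routes device messages into the module, the module processes them locally, and only the results are sent upstream. A minimal Node.js sketch (the input name input1, the output name alertOutput, and the simple threshold standing in for a real anomaly model are all assumptions that must match your $edgeHub routes):
Module Code (Local Filtering):
const { ModuleClient, Message } = require('azure-iot-device');
const { Mqtt } = require('azure-iot-device-mqtt');

// Module identity and credentials come from the IoT Edge runtime environment
ModuleClient.fromEnvironment(Mqtt, (err, client) => {
  if (err) return console.error('Module init error:', err);
  client.open((err) => {
    if (err) return console.error('Connection error:', err);
    // Messages routed to this module by $edgeHub arrive as input messages
    client.on('inputMessage', (inputName, msg) => {
      if (inputName !== 'input1') return;
      const telemetry = JSON.parse(msg.getBytes().toString('utf8'));
      // Simple threshold as a stand-in for the anomaly model: forward upstream only when exceeded
      if (telemetry.temperature > 30) {
        client.sendOutputEvent('alertOutput', new Message(JSON.stringify(telemetry)), (err) => {
          if (err) console.error('Send error:', err);
        });
      }
    });
  });
});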
Predictive Maintenance
ML Model Integration:
- Train model (Azure Machine Learning)
- Deploy as Azure Container Instance
- Stream Analytics calls the scoring endpoint (see the sketch after this list)
- Route predictions to maintenance workflow
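Whether the call is made from Stream Analytics (via a registered Azure Machine Learning function) or from a downstream Azure Function, the scoring endpoint is ultimately an HTTPS request. A minimal Node.js sketch of that request, where the endpoint URL, authentication, and payload schema are hypothetical and depend on how the model was deployed:
// Hypothetical scoring endpoint exposed by the container running the model (Node 18+ for global fetch)
const SCORING_URL = 'http://contoso-scoring.eastus.azurecontainer.io/score';

async function scoreTelemetry(telemetry) {
  const response = await fetch(SCORING_URL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    // Feature order is an assumption; it must match what the model was trained on
    body: JSON.stringify({ data: [[telemetry.temperature, telemetry.humidity]] })
  });
  if (!response.ok) throw new Error(`Scoring failed: ${response.status}`);
  return response.json(); // e.g. a failure probability
}
A prediction above an agreed threshold can then be routed to the maintenance workflow, for example by raising a work order through a downstream API.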
Digital Twin Integration
Azure Digital Twins:
// Authenticate with Azure AD via Azure.Identity; the URI is the Digital Twins instance host name
var credential = new DefaultAzureCredential();
var client = new DigitalTwinsClient(new Uri("https://contoso-dt.api.weu.digitaltwins.azure.net"), credential);
var patch = new JsonPatchDocument();
patch.AppendReplace("/Temperature", telemetry.Temperature);
await client.UpdateDigitalTwinAsync("sensor-01-twin", patch);
Security & Compliance
Device Authentication
- Use X.509 certificates for production devices (see the connection sketch after this list)
- Rotate device keys via DPS (Device Provisioning Service)
- Implement device attestation
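For reference, a minimal sketch of what an X.509 connection looks like with the Node.js device SDK, assuming the device certificate and private key are available as PEM files on the device (file names are illustrative):
Device Code (X.509 Authentication):
const fs = require('fs');
const { Client } = require('azure-iot-device');
const { Mqtt } = require('azure-iot-device-mqtt');

// x509=true tells the SDK to authenticate with a certificate instead of a shared access key
const connectionString = 'HostName=contoso-iothub.azure-devices.net;DeviceId=sensor-01;x509=true';
const client = Client.fromConnectionString(connectionString, Mqtt);

client.setOptions({
  cert: fs.readFileSync('sensor-01-cert.pem', 'utf-8'),
  key: fs.readFileSync('sensor-01-key.pem', 'utf-8')
});

client.open((err) => {
  if (err) console.error('Connection error:', err);
  else console.log('Connected with X.509 authentication');
});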
Data Encryption
- In-transit: TLS 1.2+
- At-rest: Azure Storage encryption, Cosmos DB encryption
Access Control
# Create custom IoT Hub policy
az iot hub policy create \
--hub-name contoso-iothub \
--name device-telemetry \
--permissions RegistryRead DeviceConnect
Monitoring & Troubleshooting
IoT Hub Metrics:
AzureMetrics
| where ResourceProvider == "MICROSOFT.DEVICES"
| where MetricName == "d2c.telemetry.ingress.success"
| summarize TotalMessages = sum(Total) by bin(TimeGenerated, 5m)
Stream Analytics Diagnostics:
AzureDiagnostics
| where ResourceType == "STREAMANALYTICS"
| where Level == "Error"
| project TimeGenerated, OperationName, ResultDescription
Cost Optimization
| Service | Cost Driver | Optimization |
|---|---|---|
| IoT Hub | Messages/day | Use S1 tier for <400K msgs/day |
| Stream Analytics | Streaming units | Right-size SU allocation |
| Cosmos DB | RU/s + storage | Use serverless for variable workloads |
| Power BI | Pro licenses | Use embedded for external sharing |
Best Practices
- Batch device messages when possible
- Use device twins for configuration management
- Implement retry logic with exponential backoff (see the sketch after this list)
- Monitor IoT Hub throttling metrics
- Archive historical data to cold storage
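As an illustration of the retry guidance above, a minimal sketch of wrapping the client.sendEvent call from Step 3 in an exponential backoff loop (attempt counts and delays are arbitrary starting points):
// Retry a send with exponential backoff; tune maxAttempts and delayMs for your workload
function sendWithRetry(client, message, maxAttempts = 5, delayMs = 1000) {
  client.sendEvent(message, (err) => {
    if (!err) {
      console.log('Message sent');
    } else if (maxAttempts > 1) {
      console.warn(`Send failed (${err.name}); retrying in ${delayMs} ms`);
      setTimeout(() => sendWithRetry(client, message, maxAttempts - 1, delayMs * 2), delayMs);
    } else {
      console.error('Send failed after all retries:', err);
    }
  });
}
Note that the Azure IoT device SDKs also ship with built-in retry policies, so a hand-rolled loop like this is mainly useful when you need custom behavior, such as dead-lettering a message after the final attempt.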
Key Takeaways
- IoT Hub simplifies device connectivity and management.
- Stream Analytics enables real-time windowed aggregations.
- Power BI streaming datasets provide instant dashboards.
- Event Grid decouples alerting from core data flow.
Next Steps
- Implement predictive maintenance workflows
- Add Azure Digital Twins for spatial intelligence
- Explore Time Series Insights for historical analysis
Additional Resources
What IoT scenario will you build first?