Analytics Builder empowers you to turn raw data into actionable insights by visually connecting blocks and wires. However, as you build more advanced models, you might encounter scenarios where the logic doesn’t behave exactly as expected.
To help you “open the box” and improve your user experience, this article outlines practical strategies for debugging your models, ensuring you can implement your use cases with confidence.
Introduction: Debugging, Testing, and Simulation
Before we look at the tools, it is helpful to distinguish between three distinct activities in the model development lifecycle. While they often overlap, they serve different purposes:
- Debugging: The process of inspecting the internal state and flow of data to understand why a model is acting in a specific way. This is about visibility into the logic as it executes to fix defects.
- Testing (Test mode): Analytics Builder provides a built-in “Test” deployment mode. This runs your model against live incoming data, but safely redirects the final output to a virtual test device. This ensures you can verify correctness on real-time data without accidentally triggering production alarms or altering actual device states.
- Simulation: A specific mode in Analytics Builder that allows you to replay historical data through your model. It provides a safe environment to verify logic against past real-world events without affecting production devices (see the Model Simulation documentation).
This guide focuses exclusively on Debugging—giving you the techniques to find and fix issues when your logic isn’t working as expected.
Option 1: Creating Intermediate Outputs
One of the most effective ways to debug is to “tap the wire.” This involves temporarily introducing output blocks inside your model to expose intermediate values that are passing between blocks. Instead of guessing what the value is after a certain block or calculation, you can persist that value to Cumulocity to verify it visually.
To demonstrate this, let’s look at a common predictive maintenance scenario: monitoring the differential pressure of a pump and triggering an alarm if the difference exceeds a predefined threshold, as this might be an early warning of clogging.
- Goal: Calculate the difference between Pressure A and Pressure B. If the difference exceeds 2 mbar, trigger a “High Deviation” alert.
- The Challenge: If the alarm doesn’t trigger, is the calculation wrong, or is the threshold logic failing?
- The Resolution: We can use Measurement and Event output blocks simultaneously to answer this.
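Before wiring up the debug outputs, it helps to be explicit about the logic we expect the model to implement. Here is a minimal plain-Python sketch of that logic (the function names and sample readings are illustrative, not Analytics Builder APIs):

```python
THRESHOLD_MBAR = 2.0  # predefined deviation threshold from the scenario

def differential_pressure(pressure_a: float, pressure_b: float) -> float:
    """Mirror of the Difference block: absolute difference of the two series."""
    return abs(pressure_a - pressure_b)

def should_alert(pressure_a: float, pressure_b: float) -> bool:
    """Mirror of the Threshold block: True when the deviation exceeds 2 mbar."""
    return differential_pressure(pressure_a, pressure_b) > THRESHOLD_MBAR

# Sample readings: (Pressure A, Pressure B) in mbar
readings = [(101.0, 100.5), (103.5, 100.9), (100.2, 100.1)]
alerts = [should_alert(a, b) for a, b in readings]
print(alerts)  # [False, True, False] — only the 2.6 mbar deviation breaches
```

If the model misbehaves, one of these two stages is wrong, and the debug outputs below tell you which.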
Step 1: Visualizing the Calculation (Measurement Output)
First, we want to see the “analog” value evolving over time. We need to verify that the difference calculation is correct before it hits the threshold.
- Locate the Wire: Find the wire coming out of your Difference (or Expression) block.
- Add the Block: Connect this wire to a Measurement Output block.
- Configure: Set the device to your input device and name the series Debug_Difference.
- Why we do this: This allows you to open the Measurements tab in Device Management. You will see a graph of the calculated difference. If the value stays flat or looks wrong, you know the issue is in the calculation logic, not the alarm trigger.
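For reference, the Measurement Output block ultimately persists a measurement document to Cumulocity, which you could also inspect via the REST API. A hedged sketch of that document is below; the field names follow Cumulocity’s standard measurement shape, while the device ID and the fragment/series naming are assumptions based on the Debug_Difference series configured above:

```python
import json
from datetime import datetime, timezone

# Hypothetical measurement document as written by the output block.
# "12345" is a placeholder device ID; fragment/series names are assumptions.
debug_measurement = {
    "source": {"id": "12345"},
    "time": datetime(2024, 1, 1, 10, 0, tzinfo=timezone.utc).isoformat(),
    "type": "Debug_Difference",
    "Debug_Difference": {
        "Debug_Difference": {"value": 2.6, "unit": "mbar"},
    },
}
print(json.dumps(debug_measurement, indent=2))
```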
Step 2: Verifying the Trigger (Event Output)
Next, we want to see the precise moment the logic decides to act.
- Locate the Wire: Find the wire coming out of your Threshold block (the one that should trigger the alarm).
- Add the Block: Connect this wire to an Event Output block in parallel with your alarm block.
- Configure: Set the Event type to Debug and the Message to Threshold_Breached.
- Why we do this: This allows you to open the Events tab. You will see a timestamped list of every time the threshold was crossed.
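Conceptually, the Event Output block turns each threshold breach into a timestamped Cumulocity event. The sketch below models that mapping; the field names follow Cumulocity’s standard event shape, and the device ID is a placeholder:

```python
from datetime import datetime, timezone

THRESHOLD_MBAR = 2.0  # threshold configured in the model

def debug_event(difference: float, device_id: str = "12345"):
    """Return the debug event the Event Output block would create,
    or None when the threshold is not breached (no event is written)."""
    if difference <= THRESHOLD_MBAR:
        return None
    return {
        "source": {"id": device_id},
        "type": "Debug",
        "text": "Threshold_Breached",
        "time": datetime.now(timezone.utc).isoformat(),
    }

print(debug_event(2.6))  # event document created
print(debug_event(1.2))  # None: no breach, no event
```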
Step 3: Activating the model
In the Model overview of Analytics Builder, change the Mode to Production and switch the toggle from Inactive to Active.
Step 4: Debugging via Device Management or Cockpit interface
Finally, navigate to Device Management, select the input device, and:
- Go to the Measurements Tab: To confirm the calculated difference is fluctuating as expected.
- You might need to add the Debug_Difference series as a data point to the graph. Do this by clicking “+ Add data point” in the right sidebar.
- Pro-tip: You can even visualize the debug event in this view together with the measurements by clicking “+ Add event” in the right sidebar, to confirm the event appears exactly when the measurement graph crosses the 2 mbar mark.
- Go to the Events Tab: To confirm that an event appears each time the measurement graph crosses the 2 mbar mark.
Alternative: Verifying via the Cockpit App
The steps above use the preconfigured tabs in the Device Management application for quick navigation. However, you can achieve the exact same visibility using the Cockpit application. Instead of using the individual Measurement or Event tabs, simply navigate to your device or asset in Cockpit and use the Data Explorer to visualize your intermediate debug values over time.
A Quick Housekeeping Tip: Since these output blocks are meant for debugging only, you should keep your production model clean and avoid filling your device history with test data. When removing them before going live, take care to reconnect your main logic wires correctly after deleting the debug blocks.
Pro-Tip: When you “tap the wire” using standard output blocks, you are writing actual Measurements and Events (MEAs) to the database. Over time, this debug data can clutter your tenant and unnecessarily increase your storage costs. You can delete that data manually, but a best practice is to configure a Data Retention Rule for the specific fragment types you use for debugging (for example, setting a 1-day retention for Debug_Difference measurements or Debug events). This ensures the platform automatically purges your temporary test data once your debugging session is over. You can learn how to configure these rules in the Managing Data documentation.
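As an illustration, such retention rules could look like the payloads below. The field names follow Cumulocity’s retention rule API (POST /retention/retentions), but treat the exact values as assumptions and verify them against your own tenant:

```python
import json

# Hedged sketch: one-day retention rules for the debug data described above.
retention_rules = [
    {   # purge Debug_Difference measurements after one day
        "dataType": "MEASUREMENT",
        "fragmentType": "Debug_Difference",
        "type": "*",
        "source": "*",
        "maximumAge": 1,
        "editable": True,
    },
    {   # purge Debug events after one day
        "dataType": "EVENT",
        "fragmentType": "*",
        "type": "Debug",
        "source": "*",
        "maximumAge": 1,
        "editable": True,
    },
]
print(json.dumps(retention_rules, indent=2))
```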
Option 2: Using a Logger Block
The Logger block is currently available in Public Preview, or as an open-source contribution in the GitHub repository Cumulocity-IoT/analytics-builder-blocks-contrib (unsupported, non-productized blocks for use with Apama Analytics Builder).
Because the Logger block writes to the microservice logs, it is important to check first whether you have sufficient permissions to access the Administration application and the logs. Additionally, the logs are only directly accessible for a single-tenant Apama-ctrl microservice; with a multi-tenant Apama-ctrl microservice, the logs are hosted on the parent tenant and are therefore not directly accessible.
Another powerful way to “tap the wire” is using the Logger Block. While the previous method persists data to the database (Measurements/Events), the Logger block writes information to the microservice logs. This is often faster for transient debugging where you don’t need or want to store the data history.
Key Features:
- Structured Tracing: The block outputs logs in a structured format (value=… properties=…), allowing you to see exactly what payload and properties are moving through a specific wire.
- Tagging: You can assign a unique “Tag” (e.g., Log_output_DiffBlock) to the block. This makes it easy to filter the log file and find exactly the entries relevant to the specific logic you are debugging.
- Log Levels: You can categorize messages using standard levels like INFO, DEBUG, or WARN.
Let’s see it in action!
Step 1: Configuring the logging block
- In the Utility section of the block library, select the Logger block and drag it onto the canvas
- Configure the “Logger tag” to be Log_output_DiffBlock for easy recognition in the logs
- Connect the output of the Difference block to the input of the Logger block
Step 2: Activate the model
In the Model overview of Analytics Builder, change the Mode to Production and switch the toggle from Inactive to Active.
Step 3: Debugging via Microservice logs
To access these logs, you need access to the Administration application and Application Management permissions.
- Access the microservice logs via Administration > Ecosystem > Microservices, select Apama-ctrl-*, and open the Logs tab.
- Navigate to the end of the logs by clicking the “Fast forward” button in the bottom right corner.
- Scan the logs for Log_output_DiffBlock to inspect the type, value, and properties of the item passed through the wire to which the Logger block is connected.
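If you download the log file, you can also filter it programmatically instead of scanning by eye. The sketch below extracts the logged values for one tag; the exact log line layout is an assumption modelled on the “value=… properties=…” structure described above, so adjust the pattern to what your logs actually contain:

```python
import re

# Sample log excerpt; the precise format written by the Logger block
# is an assumption for illustration purposes.
log = """\
2024-01-01 10:00:00.001 INFO  Log_output_DiffBlock: value=2.6 properties={}
2024-01-01 10:00:01.002 INFO  SomeOtherBlock: value=7 properties={}
2024-01-01 10:00:02.003 INFO  Log_output_DiffBlock: value=1.2 properties={}
"""

def values_for_tag(log_text: str, tag: str) -> list[float]:
    """Extract the numeric value= field from lines carrying the given tag."""
    pattern = re.compile(rf"{re.escape(tag)}.*?value=([-\d.]+)")
    return [float(m.group(1)) for line in log_text.splitlines()
            if (m := pattern.search(line))]

print(values_for_tag(log, "Log_output_DiffBlock"))  # [2.6, 1.2]
```

This is exactly what the tag is for: a unique string like Log_output_DiffBlock makes the relevant lines trivially greppable.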
Note: Unlike the standard output blocks, the Logger block is designed to stay in your model. You can map the “Enable Logging” toggle to a Template Parameter. This allows you to keep the debugging logic inside your model permanently. When you deploy to production, you simply untick the parameter to disable logging. If an issue arises later, you can re-enable it instantly without editing the model canvas—making the switch between debugging and production seamless.
What is your go-to strategy?
While these two methods currently cover the majority of debugging scenarios, the Cumulocity community is always finding innovative ways to build and verify models. Do you have a custom approach, a clever workaround, or a different block combination you use for testing? Let us know in the comments below, and we may feature your strategy in the next update to this guide!
Summary
Analytics Builder does not need to remain a black box. This How-To showed you practical, built-in options to look under the hood and understand exactly how your data is flowing.
When you need visual, graphable feedback to see a calculation evolve or pinpoint a trigger in real-time, “tapping the wire” with temporary Measurement and Event blocks gives you immediate clarity. On the other hand, when you want a structured, persistent trace that can be seamlessly toggled off for production, the Logger block is a perfect fit. By leveraging these tools together, you can move from guessing why a model isn’t working to confidently verifying your logic step-by-step.