In this video, I show a great method for getting started with Telemetry for Business Central using Jupyter Notebooks.

In this video, Erik walks through the complete setup process for getting started with Business Central telemetry using Application Insights, Azure Data Studio, and Jupyter Notebooks. He covers everything from configuring Application Insights in the Azure portal, to installing the necessary tools, to running pre-built troubleshooting queries created by Microsoft — and even shows how you can query Application Insights directly from AL code.
Why Telemetry Matters Now
Erik opens with an observation that many Business Central professionals will relate to: we’ve had this software for decades, and for most of that time, the best we could hope for when something went wrong was maybe finding an event in an event log. Now that Microsoft is hosting and running their own software in the cloud, they’ve built out extensive telemetry tooling — because they need it. And fortunately, that means we get to use it too.
The challenge, however, is that there’s no simple menu item you click to access telemetry. The Admin Center has some telemetry features, but they’re limited and difficult to use for anything beyond quick diagnostics. Microsoft has made a significant effort to make telemetry more accessible, and this video demonstrates the practical steps to get up and running.
What Are Jupyter Notebooks?
Before diving into the setup, Erik explains the concept of Jupyter Notebooks. The name “Jupyter” comes from the programming languages Julia, Python, and R — languages commonly used in scientific communities for data manipulation, AI, and similar work.
The core idea behind Jupyter Notebooks is that you have a document — think of it like a Word document — that contains interspersed sections of:
- Formatted text (explanations, documentation)
- Executable code blocks
- Tabulated data, graphs, and charts
This format is perfect for telemetry work because you can combine documentation explaining what a query does with the actual runnable query and its results, all in one place.
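The on-disk format makes this concrete: a .ipynb file is just JSON containing a list of cells, each marked as markdown or code. A minimal sketch in Python (the cell contents here are invented for illustration; real notebooks carry more metadata):

```python
import json

# A minimal, hand-written .ipynb structure: markdown and code cells interleaved.
# Cell contents are invented for illustration.
notebook = {
    "nbformat": 4,
    "nbformat_minor": 2,
    "metadata": {"kernelspec": {"name": "python3", "display_name": "Python 3"}},
    "cells": [
        {"cell_type": "markdown", "metadata": {},
         "source": ["## Trace volume\n", "Counts traces emitted in the last day."]},
        {"cell_type": "code", "metadata": {}, "execution_count": None, "outputs": [],
         "source": ["%kql traces | where timestamp > ago(1d) | count"]},
    ],
}

# Serialize it the way Jupyter or Azure Data Studio would save it to disk
print(json.dumps(notebook)[:60])
```

Tools like Azure Data Studio render the markdown cells as formatted text and put a play button next to each code cell, which is all a notebook really is.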
Prerequisites: Setting Up Application Insights
The first step is to have an Application Insights resource in the Azure portal and connect it to your Business Central environment. Erik prepared for this video by:
- Creating an Application Insights resource in the Azure portal
- Copying the connection string from Application Insights and entering it in the Business Central Admin Center for the target environment
- Scrolling down in the Application Insights menu to find API Access
- Noting the Application ID
- Creating an API Key for programmatic access
Once the connection string is configured on the environment, telemetry data will start flowing into Application Insights. You’ll need both the Application ID and the API Key for the next steps.
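Those two values end up in calls to the Application Insights REST API, which is what the notebooks use under the hood. As a sketch of where each one goes (the IDs below are placeholders, not real credentials):

```python
# Sketch: how the Application ID and API Key are used when querying
# Application Insights over REST. Both values below are placeholders.
APP_ID = "00000000-0000-0000-0000-000000000000"   # Application ID from API Access
API_KEY = "your-api-key"                          # API Key created under API Access

# The Application ID is part of the URL; the API Key goes in a header
url = f"https://api.applicationinsights.io/v1/apps/{APP_ID}/query"
headers = {"x-api-key": API_KEY, "Content-Type": "application/json"}
body = {"query": "traces | where timestamp > ago(1d) | count"}

print(url)
```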
Installing Azure Data Studio
The main tool for working with the telemetry notebooks is Azure Data Studio. If you’re familiar with SQL Server Management Studio (SSMS), think of Azure Data Studio as the next-generation tool that supports much more.
When you first open Azure Data Studio, you’ll notice it looks very familiar — it’s essentially a specialized version of Visual Studio Code. You’ll recognize the search icon, the extensions marketplace, and the general layout. Azure Data Studio can work with SQL Servers, Azure resources, and — most importantly for this video — Notebooks.
Setting Up Python and KQL Magic
When you first open a notebook in Azure Data Studio, you’ll see it references a Python 3 kernel. The first time, you may be prompted to install Python — follow those prompts to get it set up.
However, installing Python alone isn’t enough. You also need to install the KQL Magic package:
- In Azure Data Studio, look for the Manage Packages icon (in the right corner of the notebook toolbar)
- Click Add New
- Search for kqlmagic
- Select a stable version (Erik recommends version 0.1.114.post4 — avoid dev versions)
- Install the package
- Restart Azure Data Studio
You’ll know everything is installed correctly when you see the KQL Magic banner appear when opening a notebook.
Adding the Microsoft Troubleshooting Notebooks
Microsoft (with significant credit to Kenny and his team) has published a comprehensive set of Jupyter Notebooks with telemetry examples. To add them:
- In Azure Data Studio, go to the Notebooks section
- Click Add Remote Jupyter Book
- Choose GitHub as the source
- Search for repos/microsoft/bctech
- Select D365 Troubleshooting Guides TSG – The Next Generation
- Choose the available book (Guides), version (1.2), and language (English)
- Click Add
The notebook will be downloaded and you’ll see a Troubleshooting Guide based on telemetry, containing multiple pages covering different areas like performance overview, login issues, extension problems, and more.
Pro tip: Save the notebook locally so you don’t have to re-add the remote book every time you open Azure Data Studio. Use Save All and then next time just open the local folder instead of adding the remote book again.
Running Your First Telemetry Queries
Each notebook page follows a consistent structure:
- Introduction text explaining the purpose
- Load KQL Magic — a code block that initializes the KQL module
- Connect to Application Insights — where you enter your Application ID and API Key
- Define filters — optional filtering by environment, date range, etc.
- Query blocks — individual runnable queries with explanations
You can either click Run All to execute every code block in the document, or run them one at a time by clicking the play button next to each block. The results appear inline — tables, graphs, and charts rendered right in the document.
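Behind the scenes, the Connect block feeds your credentials to KQL Magic as a connection string. Its general shape, to the best of my reading of Kqlmagic's appinsights scheme (treat the exact syntax as an assumption; the IDs are placeholders), looks like this:

```python
# Sketch of the Kqlmagic connection cell the notebook's Connect block runs.
# The appinsights scheme syntax is an assumption; IDs are placeholders.
app_id = "00000000-0000-0000-0000-000000000000"
api_key = "your-api-key"

connect_cell = f"%kql appinsights://appid='{app_id}';appkey='{api_key}'"
print(connect_cell)
```

Once that cell has run, every subsequent %kql block in the notebook executes against your Application Insights resource.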
Erik demonstrated the Login notebook, which showed queries for:
- Pre-open company authorization attempts
- Authorization failures
- Successful logins (broken down by client type — background, web client, web services, etc.)
He also demonstrated the Performance Overview notebook, which revealed page view timings and identified that a role center was consuming significant time due to calculated content.
Since the notebooks contain live code, you can modify queries on the fly. For example, Erik changed limit 100 to limit 10 to reduce the result set — and immediately got updated results.
Querying Application Insights from AL Code
Beyond using notebooks, you can also query Application Insights directly from AL code using the Application Insights REST API. Here’s an example of a codeunit that sends a KQL query and parses the JSON response:
codeunit 50100 "Telemetry Query thing"
{
    procedure RunQuery()
    var
        Client: HttpClient;
        Request: HttpRequestMessage;
        Response: HttpResponseMessage;
        Headers: HttpHeaders;
        Content: HttpContent;
        RequestJson: JsonObject;
        RequestTxt: Text;
        ResponseTxt: Text;
        RowTxt: Text;
        ResponseJson: JsonObject;
        Tables: JsonArray;
        T: JsonToken;
        FirstTable: JsonObject;
        TableRows: JsonArray;
        Row: JsonArray;
    begin
        Request.Method := 'POST';
        Request.SetRequestUri('https://api.applicationinsights.io/v1/apps/fe398af0-439e-481b-9033-5c3e2a04c5dc/query');
        // Authenticate with the API key created under API Access
        Request.GetHeaders(Headers);
        Headers.Add('x-api-key', 'zjfiij0j5xagnx4qu677j61rj2h1jvyvyp7nsjie');
        // The body is a JSON object with a single 'query' property holding the KQL
        RequestJson.Add('query', 'traces| where timestamp > ago(1d)| project customDimensions.eventId, customDimensions.companyName, message');
        RequestJson.WriteTo(RequestTxt);
        Content.WriteFrom(RequestTxt);
        // Swap the default Content-Type for application/json
        Content.GetHeaders(Headers);
        Headers.Remove('Content-Type');
        Headers.Add('Content-Type', 'application/json');
        Request.Content := Content;
        if Client.Send(Request, Response) then begin
            if Response.IsSuccessStatusCode() then begin
                Response.Content().ReadAs(ResponseTxt);
                ResponseJson.ReadFrom(ResponseTxt);
                // Drill into the response: tables -> first table -> rows -> first row
                ResponseJson.Get('tables', T);
                Tables := T.AsArray();
                Tables.Get(0, T);
                FirstTable := T.AsObject();
                FirstTable.Get('rows', T);
                TableRows := T.AsArray();
                TableRows.Get(0, T);
                Row := T.AsArray();
                Row.WriteTo(RowTxt);
                Message('%1', RowTxt);
            end else
                Error('We got %1 errors', Response.HttpStatusCode());
        end else
            Error('Deep trouble!');
    end;
}
This codeunit makes a POST request to the Application Insights REST API, sending a KQL query that retrieves the event ID, company name, and message from traces in the last day. The response is parsed through the nested JSON structure (tables → first table → rows → first row) and displayed in a message dialog.
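That tables → first table → rows → first row drill-down is easy to sanity-check outside AL. A small Python sketch with a hand-made payload in the documented response shape (all values invented):

```python
import json

# The Application Insights /query response nests results as
# tables -> [0] -> rows -> [row, ...]. This sample payload is hand-made
# to mimic that shape; the values are invented.
sample_response = json.dumps({
    "tables": [{
        "name": "PrimaryResult",
        "columns": [{"name": "eventId"}, {"name": "companyName"}, {"name": "message"}],
        "rows": [["AB0001", "CRONUS", "Sample trace message"]],
    }]
})

def first_row(response_text: str) -> list:
    """Mirror the AL parsing: tables -> first table -> rows -> first row."""
    doc = json.loads(response_text)
    return doc["tables"][0]["rows"][0]

print(first_row(sample_response))
```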
The corresponding page extension adds a button to trigger this query:
pageextension 50100 ItemList extends "Item List"
{
    actions
    {
        addfirst(processing)
        {
            action(Telemetry)
            {
                Caption = 'Telemetry';
                ApplicationArea = All;

                trigger OnAction()
                var
                    t: Codeunit "Telemetry Query thing";
                begin
                    t.RunQuery();
                end;
            }
        }
    }
}
This approach opens up interesting possibilities — you could build dashboards or monitoring tools directly within Business Central that pull telemetry data in real-time.
Building Your Own Notebooks
One of the best aspects of this approach is that the notebooks are fully editable. You can:
- Right-click (or use the toolbar) to insert new code or text blocks
- Modify existing queries to suit your needs
- Build your own custom notebooks combining your most-used queries
- Configure notebooks for different Application Insights accounts or multiple environments
If you break something, you can simply download the remote book again and start fresh. But over time, you’ll build up a personal collection of the queries that matter most for your environments.
Summary
Getting started with Business Central telemetry involves a few setup steps, but once you’re up and running, you have powerful diagnostic capabilities at your fingertips. Here’s the quick checklist:
- Create an Application Insights resource in the Azure portal
- Connect it to your Business Central environment via the connection string
- Note your Application ID and create an API Key under API Access
- Install Azure Data Studio
- Install Python and the KQL Magic package
- Add the Microsoft troubleshooting notebooks from GitHub
- Configure your credentials and start running queries
Now that we no longer have the ability to walk down to the server room and put our hands on the CPU to feel if it’s running too warm, telemetry is clearly the best way to understand what’s happening on a tenant. The Microsoft-provided notebooks give you an enormous amount of starting material — enough to keep you busy for days — and from there, you can build out your own custom monitoring toolkit.