SQL support in Business Central? Yes, also in the cloud!

Here’s a fun one. What if you could use SQL inside cloud Business Central? In this video, we do just that. Check out the video if you don’t believe me:

https://youtu.be/6xjBCbm5vss

In this video, Erik demonstrates a fully functional SQL workbench built entirely inside Business Central — yes, even in the cloud. What started as a years-long idea finally came to life with a little help from AI coding assistance. Erik walks through the architecture (lexer, parser, runner), then demonstrates SELECT, INSERT, UPDATE, DELETE, JOINs, aggregate functions, and more — all running against live Business Central data.

The Idea Behind SQL in Business Central

One of the most common complaints in the Business Central community is the lack of SQL access, especially in the cloud. While Erik doesn’t fully share that frustration — he’s comfortable working with Business Central’s native data system and filtering capabilities — he understands the appeal. For a couple of years, he’d been thinking about building some form of SQL support inside Business Central but never got around to it.

The catalyst came from Steve Endow’s “vibe coding” project. Steve’s approach of diving in and figuring things out publicly inspired Erik to take the reverse approach: he already knew exactly how he’d architect a SQL engine (having built the AL compiler inside Business Central), so he used AI assistance to accelerate the actual implementation.

The SQL Workbench in Action

The result is a SQL Workbench page right inside Business Central. You type SQL, click “Execute SQL,” and get results. Here’s what a basic session looks like:

SELECT Number, Name, Address FROM Customer WHERE Number < '30000'

This returns filtered customer data just as you’d expect. The results can be exported to Excel as well.

JOINs Across Tables

Things get more interesting with joins. Erik demonstrates joining the Customer table with the Customer Posting Group table:

SELECT Customer.Number, Customer.Name, Customer."Customer Posting Group", 
       "Customer Posting Group".Description 
FROM Customer 
JOIN "Customer Posting Group" 
  ON Customer."Customer Posting Group" = "Customer Posting Group".Code

This returns customer records enriched with the posting group description — Domestic, Foreign, etc. — pulled from the related table.

INSERT, UPDATE, and DELETE

The workbench supports full DML operations. Here’s an insert:

INSERT INTO Customer (Number, Name) VALUES (70000, 'YouTube')

Result: “Rows affected: 1.” A subsequent SELECT confirms the new customer exists. Erik notes that the implementation is forgiving — if you pass a number to a Code field, it accepts it gracefully.

Update works as expected:

UPDATE Customer SET Name = 'Auto Nielson' WHERE Number = 70000

And delete:

DELETE FROM Customer WHERE Number = 70000

Aggregate Functions and More

The SQL engine also supports aggregate functions and grouping:

SELECT "Customer Posting Group", COUNT(*) FROM Customer GROUP BY "Customer Posting Group"

This returns counts per posting group. You can add SUM as well:

SELECT "Customer Posting Group", COUNT(*), SUM("Sales (LCY)") 
FROM Customer 
GROUP BY "Customer Posting Group"

Now you’re getting sales totals by customer group — something that starts to feel genuinely useful for ad-hoc analysis.

DISTINCT, TOP, IN, LIKE, and OR

Erik runs through several additional SQL features:

SELECT DISTINCT City FROM Customer
SELECT TOP 3 * FROM Customer
SELECT * FROM Customer WHERE City IN ('Winnipeg', 'Ottawa', 'Berlin')

DISTINCT, TOP, IN, LIKE, and OR all work. Erik notes that ORDER BY isn’t fully working yet — it’s still a work in progress.

How It Works: Lexer, Parser, Runner

The architecture follows the classic three-stage approach used in compiler design — the same approach Erik used when building the AL compiler inside Business Central:

  1. Lexer — Takes raw SQL text and slices it into tokens
  2. Parser — Consumes tokens and produces a structured representation (AST)
  3. Runner — Takes the parsed structure and executes it against Business Central data

The Lexer

The lexer breaks input into typed tokens. For example, SELECT * FROM Customer produces five tokens:

  • SELECT — a keyword token
  • * — named “multiple” (the asterisk/wildcard token)
  • FROM — a keyword token
  • Customer — an identifier token (a name of some sort)
  • EOF — end-of-file token

The EOF token is a practical design choice: since parsing often requires looking ahead to the next token, having a guaranteed EOF token at the end eliminates the need for null checks at every lookahead point.
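The video doesn’t show the AL source, but the lexer stage described above can be sketched in a few lines of Python. This is an illustrative approximation, not Erik’s implementation: the token kinds (keyword, multiple, identifier, eof) mirror the ones he names in the demo, and the trailing EOF sentinel is the design choice he explains.

```python
import re
from dataclasses import dataclass

# Keywords recognized in this sketch; the real engine supports many more.
KEYWORDS = {"SELECT", "FROM", "WHERE", "INSERT", "UPDATE", "DELETE", "JOIN"}

@dataclass
class Token:
    kind: str   # "keyword", "multiple", "identifier", or "eof"
    text: str

def lex(sql: str) -> list[Token]:
    """Slice raw SQL text into typed tokens, always ending with EOF."""
    tokens = []
    # Quoted names like "Customer Posting Group", bare words, *, or punctuation.
    for match in re.finditer(r'"[^"]+"|\w+|\*|\S', sql):
        text = match.group()
        if text == "*":
            tokens.append(Token("multiple", text))
        elif text.upper() in KEYWORDS:
            tokens.append(Token("keyword", text.upper()))
        else:
            tokens.append(Token("identifier", text))
    # Guaranteed sentinel: lookahead code never has to null-check.
    tokens.append(Token("eof", ""))
    return tokens

print(lex("SELECT * FROM Customer"))  # five tokens, as in the video
```

Running this on SELECT * FROM Customer yields exactly the five tokens listed above, with EOF closing the stream.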

The Parser

The parser reads the token stream and builds a structured object. For a SELECT statement, this structure includes the statement type (select, insert, update, delete), the list of columns, the FROM clause, WHERE conditions, JOINs, and so on. This structured output becomes the input for the SQL runner.
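As a rough illustration of that step (again a sketch in Python, not the actual AL code), a recursive-descent parser for a tiny SELECT subset might look like this. The statement object it returns — type, columns, FROM table, optional WHERE triple — corresponds to the structure Erik describes:

```python
import re

def tokenize(sql):
    # Minimal inline tokenizer for this sketch: quoted names, words, punctuation.
    return re.findall(r'"[^"]+"|\w+|\S', sql) + ["<EOF>"]

def parse(sql):
    """Build a structured statement object from SQL text.
    Supports only: SELECT col[, col...] FROM table [WHERE col = value]."""
    toks = tokenize(sql)
    pos = 0

    def peek():          # the EOF sentinel guarantees this never runs off the end
        return toks[pos]

    def advance():
        nonlocal pos
        pos += 1
        return toks[pos - 1]

    assert advance().upper() == "SELECT"
    columns = [advance()]
    while peek() == ",":
        advance()
        columns.append(advance())
    assert advance().upper() == "FROM"
    stmt = {"type": "select", "columns": columns, "from": advance(), "where": None}
    if peek().upper() == "WHERE":
        advance()
        stmt["where"] = (advance(), advance(), advance())  # (field, op, value)
    return stmt

print(parse("SELECT Number, Name FROM Customer WHERE City = Berlin"))
```

The full engine would extend this with JOIN, GROUP BY, and the other clauses, but the shape — consume tokens, emit a structured object — is the same.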

The SQL Runner

The runner takes the parsed output and translates it into actual Business Central record operations — applying filters, iterating records, performing joins, computing aggregates, and returning results. Erik mentioned that AI even helped generate documentation for this component.
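To make the runner’s job concrete, here is a hedged Python sketch. In the real extension the runner applies AL filters (SetFilter/SetRange) and iterates records; this stand-in uses lists of dicts in place of Business Central tables, and the function name run_select is mine, not Erik’s:

```python
def run_select(stmt, tables):
    """Execute a parsed SELECT statement against in-memory tables
    (lists of dicts standing in for Business Central records):
    filter on the WHERE clause, then project the requested columns."""
    rows = tables[stmt["from"]]
    if stmt.get("where"):
        field, op, value = stmt["where"]
        ops = {"=": lambda a, b: a == b, "<": lambda a, b: a < b}
        rows = [r for r in rows if ops[op](r[field], value)]
    if stmt["columns"] == ["*"]:
        return rows
    return [{c: r[c] for c in stmt["columns"]} for r in rows]

customers = [
    {"Number": "10000", "Name": "Adatum", "City": "Berlin"},
    {"Number": "20000", "Name": "Trey Research", "City": "Ottawa"},
]
stmt = {"type": "select", "columns": ["Name"], "from": "Customer",
        "where": ("City", "=", "Berlin")}
print(run_select(stmt, {"Customer": customers}))  # [{'Name': 'Adatum'}]
```

Joins and aggregates extend the same pattern: iterate the filtered records, look up or accumulate, and return the result set.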

Future Plans

Erik shared several ideas for where this could go:

  • Integration into the Toolbox — The SQL Workbench could become a feature in Erik’s Business Central Toolbox extension
  • SQL IntelliSense — Replace the standard multiline text input with a custom editor control that provides SQL IntelliSense, similar to how the Toolbox already provides AL IntelliSense
  • Better output formatting — Instead of raw JSON output, display results in a proper table structure
  • Continued SQL support — Fix ORDER BY and expand the supported SQL syntax

Conclusion

This is a compelling proof of concept: a real SQL engine running inside Business Central’s AL runtime, complete with lexer, parser, and execution engine. It supports SELECT, INSERT, UPDATE, DELETE, JOINs, aggregates (COUNT, SUM), DISTINCT, TOP, IN, LIKE, WHERE clauses, and GROUP BY. While some features like ORDER BY still need work, the foundation is solid — built on the same compiler design principles Erik used for the AL language itself. Whether you see it as a gimmick or a genuinely useful tool for ad-hoc data exploration in the cloud, it’s an impressive demonstration of what’s possible within Business Central’s AL environment.