In this video, I travel down a rabbit hole while exporting BC data as JSON. Some operations in AL have not been modernized, so to accomplish my goal I need to get a bit creative. Check out the video:

In this video, Erik explores a surprisingly tricky corner of AL development: getting the output of a Query object as JSON. What seems like it should be a simple task turns into an interesting journey through undocumented parameters, character manipulation, and type inference. Along the way, Erik uncovers some hidden behaviors of the SaveAsCsv method and builds a complete solution for converting query output to proper JSON.
The Problem: No SaveAsJson
The task was straightforward: grab some data from Business Central and output it as JSON. This is something developers do all the time — there are well-known patterns for converting records to JSON and back. The interesting twist was that the data request naturally lent itself to being solved with a Query object, due to the volume of data and the relationships involved.
But when you look at what a Query object can actually do in AL, you find two output methods:
- SaveAsCsv
- SaveAsXml
There is no SaveAsJson. If there were, this would be a two-minute task and there’d be no video. But there isn’t, so Erik set out to build a workaround.
Exploring SaveAsCsv: The Undocumented Parameters
The SaveAsCsv method accepts three parameters: an OutStream, a Format integer, and a FormatArgument text. The documentation explains that when Format is set to 0, the output is a fixed-position file — not a comma-separated file at all. You pass column starting positions as the format argument (e.g., '1,040').
Erik notes with some amusement that the documentation for a method called “Save As CSV” describes outputting a fixed-format file. The name is somewhat misleading.
So what happens when you pass 1 as the format parameter? This is where things get interesting — and undocumented. Setting Format to 1 actually produces a proper comma-separated file. And the FormatArgument parameter? It lets you override the delimiter character. Pass '$$$' and you get dollar signs as delimiters. This behavior is not documented anywhere.
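A minimal sketch of the two modes, assuming a hypothetical query variable MyQuery and a Temp Blob to hold the output (the two calls are alternatives, shown side by side for contrast):

```al
var
    MyQuery: Query "My Query"; // hypothetical query object
    TempBlob: Codeunit "Temp Blob";
    OutStr: OutStream;
begin
    TempBlob.CreateOutStream(OutStr);

    // Format = 0 (documented): fixed-position output;
    // FormatArgument holds the column starting positions
    MyQuery.SaveAsCsv(OutStr, 0, '1,040');

    // Format = 1 (undocumented): true comma-separated output;
    // FormatArgument overrides the delimiter character
    MyQuery.SaveAsCsv(OutStr, 1, '$$$');
end;
```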
Choosing a Safe Delimiter
Since the CSV output doesn’t wrap values in quotes, you need a delimiter that will never appear in your actual data. Dollar signs or commas could easily show up in business data. Erik’s solution is to use a control character — specifically SOH (Start of Heading), which is ASCII character 1:
```al
var
    Separator: Text[1];
begin
    Separator := ' ';  // give the text a length of 1 so the index assignment below is valid
    Separator[1] := 1; // SOH character - extremely unlikely to appear in data
end;
```
By using the text variable as a character array and assigning the integer value 1 to the first position, you get an SOH character that is virtually impossible to produce in a normal text field. Erik mentions that character 9 (tab) could also work for a tab-separated file, but SOH is safer.
Building the Complete Solution
With a safe delimiter chosen, the approach becomes:
- Run the query and save the output as CSV to a blob using the SOH delimiter
- Read the blob back line by line via an InStream
- Parse the first line as field names (headers)
- Parse each subsequent line as data, splitting on the delimiter
- Build a JSON array with proper typing
Here’s how the core logic works. First, save the query output to a blob:
```al
var
    TempBlob: Codeunit "Temp Blob";
    OutStr: OutStream;
    InStr: InStream;
    Separator: Text[1];
begin
    Separator := ' ';  // ensure length 1 before the index assignment
    Separator[1] := 1; // SOH character
    TempBlob.CreateOutStream(OutStr);
    MyQuery.SaveAsCsv(OutStr, 1, Separator);
    TempBlob.CreateInStream(InStr);
end;
```
Then read the header line to get field names:
```al
var
    FieldNames: List of [Text];
    Parts: List of [Text];
    Line: Text;
begin
    InStr.ReadText(Line);
    FieldNames := Line.Split(Separator);
end;
```
Then iterate through data lines and build JSON:
```al
var
    JArray: JsonArray;
    DataEntry: JsonObject;
    i: Integer;
    Dec: Decimal;
    DT: DateTime;
    Bool: Boolean;
begin
    while InStr.ReadText(Line) > 0 do begin
        Clear(DataEntry);
        Parts := Line.Split(Separator);
        for i := 1 to Parts.Count() do
            if Evaluate(DT, Parts.Get(i)) then
                DataEntry.Add(FieldNames.Get(i), DT)
            else if Evaluate(Dec, Parts.Get(i)) then
                DataEntry.Add(FieldNames.Get(i), Dec)
            else if Evaluate(Bool, Parts.Get(i)) then
                DataEntry.Add(FieldNames.Get(i), Bool)
            else
                DataEntry.Add(FieldNames.Get(i), Parts.Get(i));
        JArray.Add(DataEntry);
    end;
end;
```
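Once the array is built, handing it to an integration is a one-liner: WriteTo serializes a JsonArray into a text variable (a standard JsonArray method in AL):

```al
var
    JsonText: Text;
begin
    // serialize the completed array to a JSON string
    JArray.WriteTo(JsonText);
end;
```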
Type Inference: Making JSON Types Correct
A significant challenge with CSV data is that everything is a string — there are no types. Without type handling, amounts would be JSON strings like "1,234.56" instead of proper numbers like 1234.56. Dates would be locale-formatted strings instead of proper ISO date formats.
Erik uses AL’s Evaluate function to attempt type conversion in a specific order:
- DateTime — If it can be evaluated as a DateTime, store it as DateTime (which JSON serializes in ISO format)
- Decimal — If it can be evaluated as a Decimal, store it as a number (no locale-specific comma formatting)
- Boolean — If it can be evaluated as a Boolean, store it as a proper JSON boolean (true/false without quotes)
- Text — Everything else remains a string
This ordering matters. You could also add Integer detection if you need to distinguish between integers and decimals in your JSON output.
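If that distinction matters, an Integer check could be slotted in before the Decimal one. This is a sketch, not code from the video, reusing the Parts, FieldNames, DataEntry, and i variables from the loop above:

```al
var
    Int: Integer;
begin
    // try Integer before Decimal so whole numbers serialize without a fraction
    if Evaluate(Int, Parts.Get(i)) then
        DataEntry.Add(FieldNames.Get(i), Int)
    else if Evaluate(Dec, Parts.Get(i)) then
        DataEntry.Add(FieldNames.Get(i), Dec);
end;
```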
An important side note: SaveAsCsv uses your regional settings to format values, which means decimals might come out with commas instead of periods. By evaluating and re-serializing through the JSON object, you get properly formatted numbers regardless of locale.
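For reference, the same locale effect is visible with AL's Format function: the default overload is region-dependent, while format number 9 produces the locale-independent XML representation, which is essentially what the JSON serializer emits:

```al
var
    Dec: Decimal;
begin
    Dec := 1234.56;
    Message(Format(Dec));       // locale-dependent, e.g. "1.234,56" in some regions
    Message(Format(Dec, 0, 9)); // XML/invariant format: "1234.56"
end;
```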
Bonus: Memory Consumption Considerations
Erik raises an important architectural concern about memory duplication in AL development. In this solution, the data exists in memory multiple times: once in the blob containing the CSV output, and again in the JSON object. This pattern of memory duplication is pervasive in Business Central development.
Common scenarios where memory gets duplicated:
- Receiving Base64 data → decoding it → converting to JSON (three copies)
- Getting HTTP content → converting to string → parsing into a JSON object
- Query output → blob → parsed lines → JSON structure
Erik suggests that Microsoft could reduce memory usage significantly by creating streaming/connected functions that pipe data through transformations without materializing intermediate copies. Given that Business Central runs at scale with potentially thousands of tenants per server, even modest memory savings per operation could have a meaningful impact on density and profitability.
Summary
Converting a Query’s output to JSON in AL requires some creative workarounds due to the absence of a native SaveAsJson method. The key discoveries and techniques covered:
- SaveAsCsv with Format = 1 produces actual comma-separated output (undocumented)
- The FormatArgument parameter lets you override the delimiter (also undocumented)
- Using a control character (SOH, ASCII 1) as a delimiter avoids conflicts with real data
- AL's Evaluate function enables runtime type inference to produce properly typed JSON
- The List of [Text] type combined with Split() makes parsing straightforward
It would certainly be simpler if Microsoft added SaveAsJson to the Query object. Until then, this approach provides a reliable way to bridge the gap between Query output and JSON, which is the format most modern integrations expect.