Dealing with binary data in Business Central AL is a job for the Temp Blob. In this video, I show how to use the Temp Blob codeunit.

In this video, Erik walks through how to work with Blob storage in Business Central’s AL language. He covers the history of BLOBs, demonstrates how to use the Temp Blob codeunit to hold binary data in memory, shows how Persistent Blob stores data in the database across sessions, and briefly touches on the Temp Blob List for handling arrays of binary data. Along the way, he builds a working demo with streams, upload functions, and copy operations.
A Brief History of BLOBs
BLOB stands for Binary Large Object: a chunk of binary data of variable length. The term was coined by Jim Starkey at DEC (the same person who later created InterBase). He named it after the 1958 Steve McQueen movie "The Blob." The backronym "Basic Large Object" was coined later at Apollo Computer, before the industry settled on today's "Binary Large Object."
Compared to the other field and data types in Business Central, a BLOB has variable length. It can store anything binary: photos, JSON, XML, PDFs.
The Evolution from Temp Blob Table to Codeunit
BLOBs have been part of NAV and Business Central for a long time. At some point, developers needed to manipulate blob data — for example, grabbing the Item table’s picture field and creating a temporary version of it, or using the Customer table’s blob fields. This turned into a pattern, and Microsoft helped that pattern along by creating the Temp Blob table. That has now evolved into the Temp Blob codeunit, along with the broader Blob Storage module.
If you haven’t explored the Business Central source code on GitHub, it’s worth a visit. The System Application source is there, including support for OAuth, password management, printers, math, data classification, confirm management, and more. These are the building blocks you’d need to build any kind of modern application — not just ERP. One of those building blocks is the Blob Storage module.
Setting Up the Demo Table and Page
Erik creates a simple table with a primary key and a blob field, plus a card page to display it:
table 59100 "Blob Demo"
{
    fields
    {
        field(1; PKEY; Integer)
        {
        }
        field(2; BLOB; Blob)
        {
            Subtype = Bitmap;
        }
    }
    keys
    {
        key(PK; PKEY)
        {
        }
    }
}
The page displays these fields and includes several actions for importing blob data. The UI actually gives you a built-in way to import values into a blob field, but the goal here is to understand how to do it programmatically using streams and the Temp Blob codeunit.
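For reference, a minimal card page over this table might look like the sketch below. The object name and id are assumptions (the full page from the video is not shown), and note that Blob fields cannot be placed directly in a page layout, so only the key is displayed:

```al
// Illustrative sketch of the demo card page; id and names are assumptions.
page 59100 "Blob Demo Card"
{
    PageType = Card;
    SourceTable = "Blob Demo";
    ApplicationArea = All;
    UsageCategory = Administration;

    layout
    {
        area(Content)
        {
            // Blob fields cannot be shown directly on a page, so only the key appears here.
            field(PKEY; Rec.PKEY)
            {
                ApplicationArea = All;
            }
        }
    }
}
```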
Working with Temp Blob and Streams
As soon as you work with blobs, you need streams. Streams are your friend. Here’s the core Import action that demonstrates the Temp Blob pattern:
action(Import)
{
    Caption = 'Import';
    ApplicationArea = All;
    Promoted = true;
    PromotedCategory = Process;
    PromotedIsBig = true;
    PromotedOnly = true;

    trigger OnAction()
    var
        InS: InStream;
        OutS: OutStream;
        FileName: Text;
    begin
        if UploadIntoStream('Select file', '', '', FileName, InS) then begin
            // Pump the upload into the Temp Blob (TempBlob is a global: Codeunit "Temp Blob")
            TempBlob.CreateOutStream(OutS);
            CopyStream(OutS, InS);
            // The universe does something different!
            Rec.CalcFields(BLOB);
            Rec.BLOB.CreateOutStream(OutS);
            TempBlob.CreateInStream(InS);
            CopyStream(OutS, InS);
            Rec.Modify(); // persist the blob to the database
        end;
    end;
}
How It Works Step by Step
- UploadIntoStream takes a file from the client (the PC running the browser) and uploads it into an InStream. Note: the function takes more parameters than are needed for the web client; these are leftovers from the Windows client days, and there's no overload to simplify it.
- When UploadIntoStream returns true, the InStream is attached to a hidden blob in Business Central that holds the uploaded data.
- We create an OutStream on the Temp Blob (because we want to write into it), then use CopyStream to pump data from the InStream to the OutStream. Erik thinks of CopyStream(OutS, InS) like an assignment: OutStream := InStream.
- At this point, whatever we uploaded is sitting in the Temp Blob in memory. This is where "the universe does something different": you could process, transform, or inspect the data before writing it to its final destination.
- We call CalcFields on the blob field (Erik notes he always does this early because he can never remember the optimal time to call it), create an OutStream on the record's blob field, create an InStream on the Temp Blob, and use CopyStream again to move the data from the Temp Blob into the actual table field.
Of course, you could skip the Temp Blob entirely and pipe the uploaded stream directly into the record’s blob field. But the Temp Blob pattern is invaluable when you need to do something with the data in between — hold it in memory, transform it, send it somewhere else, or use it in multiple places.
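A sketch of that direct approach, assuming the same page and fields as above, would simply pump the uploaded stream into the record:

```al
// Hypothetical variant of the Import action without the Temp Blob middleman.
trigger OnAction()
var
    InS: InStream;
    OutS: OutStream;
    FileName: Text;
begin
    if UploadIntoStream('Select file', '', '', FileName, InS) then begin
        Rec.BLOB.CreateOutStream(OutS);
        CopyStream(OutS, InS); // pump straight from the upload into the table field
        Rec.Modify();          // persist the change
    end;
end;
```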
The Temp Blob Pattern in Practice
Erik references his earlier video on HTML email as another example of the same pattern: a report was saved to an OutStream obtained from a Temp Blob, and then an InStream was created on the Temp Blob to read the data back out into a text field for formatting an email. The Temp Blob was used to hold data temporarily while it was processed. This is a very common pattern.
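That report-to-text pattern can be sketched roughly as follows. The report id 50000 and the HTML format are placeholders, not from the video; the shape of the pattern is what matters:

```al
// Sketch: save a report into a Temp Blob, then read it back out as text.
var
    TempBlob: Codeunit "Temp Blob";
    InS: InStream;
    OutS: OutStream;
    Line: Text;
    Body: TextBuilder;
begin
    // Write the report output into the Temp Blob...
    TempBlob.CreateOutStream(OutS);
    Report.SaveAs(50000, '', ReportFormat::Html, OutS);
    // ...then read it back line by line for further processing.
    TempBlob.CreateInStream(InS);
    while not InS.EOS do begin
        InS.ReadText(Line);
        Body.AppendLine(Line);
    end;
end;
```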
If you come from the C#/.NET world, the Temp Blob is essentially the same as a MemoryStream — same use case, same functionality, just expressed through AL’s language constructs.
Persistent Blob
The second part of the Blob Storage module is the Persistent Blob. Microsoft describes it as “the interface for storing blob data between sessions.” But as Erik discovers by looking at the source code, it’s actually a table in the database — so the data persists forever, not just between sessions. It’s essentially a key-value blob storage in the database.
Here’s the action that stores data into a Persistent Blob:
action(Import2)
{
    Caption = 'Import into Persistent';
    ApplicationArea = All;
    Promoted = true;
    PromotedCategory = Process;
    PromotedIsBig = true;
    PromotedOnly = true;

    trigger OnAction()
    var
        InS: InStream;
        FileName: Text;
        PNo: BigInteger;
    begin
        if UploadIntoStream('Select file', '', '', FileName, InS) then begin
            // Persistent is a global: Codeunit "Persistent Blob"
            PNo := Persistent.Create();
            Persistent.CopyFromInStream(PNo, InS);
            Message('P Number = %1', PNo);
        end;
    end;
}
You call Persistent.Create() to get a new ID (a BigInteger), then use CopyFromInStream to write data into it. Notice that the Persistent Blob API has built-in “pipe with a pump” methods — you don’t need to use CopyStream separately.
And here’s the action that reads data back from a Persistent Blob into a record’s blob field:
action(Import3)
{
    Caption = 'From Persistent';
    ApplicationArea = All;
    Promoted = true;
    PromotedCategory = Process;
    PromotedIsBig = true;
    PromotedOnly = true;

    trigger OnAction()
    var
        OutS: OutStream;
    begin
        Rec.CalcFields(BLOB);
        Rec.BLOB.CreateOutStream(OutS);
        // 1 is a hard-coded demo key; in real code, use the number returned by Create()
        Persistent.CopyToOutStream(1, OutS);
        Rec.Modify(); // persist the blob to the database
    end;
}
Important Security Note
Persistent Blobs do not have the same protection features as Isolated Storage. There is no app-level isolation — any app can access any persistent blob just by knowing (or guessing) the number. You should not use Persistent Blobs to store secrets. For secrets, use Isolated Storage instead.
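For comparison, a minimal Isolated Storage call for a secret might look like the sketch below; the key name and value are purely illustrative:

```al
// Secrets belong in Isolated Storage, which is isolated per extension.
var
    Secret: Text;
begin
    IsolatedStorage.Set('MyApiKey', 'super-secret-value', DataScope::Module);
    if IsolatedStorage.Get('MyApiKey', DataScope::Module, Secret) then
        Message('Retrieved a secret of length %1', StrLen(Secret));
end;
```

Unlike a Persistent Blob, other extensions cannot read this value, because DataScope::Module scopes it to the app that wrote it.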
Temp Blob List
The third component is the Temp Blob List, which is essentially an array of Temp Blobs. If you need a single Temp Blob, you declare a variable of type Codeunit "Temp Blob". If you need multiple, you could declare several variables, but if you need something closer to an array of binary data, the Temp Blob List is the way to go:
var
    TempBlob: Codeunit "Temp Blob";
    TempBlob2: Codeunit "Temp Blob";
    BlobList: Codeunit "Temp Blob List";
    Persistent: Codeunit "Persistent Blob";
The Temp Blob List works similarly to the Temp Blob — you can get and set individual blobs within the list. It’s particularly useful when working with Media Set types or any scenario where you need to process a collection of binary objects.
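A minimal sketch of filling and reading a Temp Blob List might look like this; the text payload is just for illustration:

```al
// Build a list of three blobs, then fetch one back by index.
var
    TempBlob: Codeunit "Temp Blob";
    BlobList: Codeunit "Temp Blob List";
    InS: InStream;
    OutS: OutStream;
    Txt: Text;
    i: Integer;
begin
    for i := 1 to 3 do begin
        Clear(TempBlob); // start each iteration with a fresh Temp Blob
        TempBlob.CreateOutStream(OutS);
        OutS.WriteText(StrSubstNo('Blob number %1', i));
        BlobList.Add(TempBlob);
    end;
    BlobList.Get(2, TempBlob); // fetch the second blob in the list
    TempBlob.CreateInStream(InS);
    InS.ReadText(Txt);
end;
```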
Bonus: Record Ref Support
While browsing the Temp Blob’s functions, Erik highlights two noteworthy methods: FromRecordRef and ToRecordRef. There is no “blob ref” data type in AL, but you can assign the value of a blob through a RecordRef using these methods on the Temp Blob. This is how it’s done internally in configuration packages, for example — you create a Temp Blob, use FromRecordRef to get blob data from a record reference, and then do whatever processing you need.
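A hedged sketch of that RecordRef route, reusing the demo table's blob field (the Get(1) lookup assumes a record with PKEY = 1 exists):

```al
// Move blob data between a RecordRef and a Temp Blob.
var
    TempBlob: Codeunit "Temp Blob";
    RecRef: RecordRef;
    BlobDemo: Record "Blob Demo";
begin
    BlobDemo.Get(1); // assumes a record with PKEY = 1 exists
    RecRef.GetTable(BlobDemo);
    // Pull the blob field's contents out of the record reference into the Temp Blob.
    TempBlob.FromRecordRef(RecRef, BlobDemo.FieldNo(BLOB));
    // ...process the data, then optionally write it back:
    TempBlob.ToRecordRef(RecRef, BlobDemo.FieldNo(BLOB));
    RecRef.Modify();
end;
```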
Summary
Here’s a quick recap of the three Blob Storage components in Business Central’s System Application:
- Temp Blob (Codeunit) — Your go-to for holding binary data in memory. Use it whenever you need to process anything that isn’t a regular field value: uploading files, generating reports for web services, preparing data for API calls, or any intermediate binary processing. Think of it as AL’s equivalent of a .NET MemoryStream.
- Persistent Blob (Codeunit) — Key-value blob storage in the database. Data persists across sessions and deployments. No security isolation, so don’t use it for secrets.
- Temp Blob List (Codeunit) — An array of Temp Blobs for when you need to work with collections of binary data.
Understanding streams and the Temp Blob pattern is fundamental to working with binary data in Business Central. The key mental model is: create streams (in for reading, out for writing), connect them with CopyStream as the pump, and use Temp Blob as your intermediate holding area when you need to do processing between source and destination.