What is a Project Madeira Account?

A Madeira account is an Office 365 account, also known as a “work or school account”.

If you are using your Microsoft Account to sign up for Madeira, you are actually adding a “new” account to your email address. So the next time you log in with that address, you might see this:

madeira login

The top one is the new “Office 365” account, and the bottom one is your old Microsoft Account. Confusing 🙂

So, if you have an Office 365 subscription, use that account for Madeira; that will give you the option of trying out the Outlook integration pieces and avoid creating yet another account.

Project Madeira – First Look

Today we got the first public preview of “Project Madeira” – a cloud-based ERP system, highly integrated with the Microsoft cloud world.

Madeira has its roots in Dynamics NAV, an ERP system with 30+ years of accounting knowledge built in and an impressive technology stack. NAV is used by thousands and thousands of companies all over the world, ranging from quite big to small single-person shops.

NAV has always been able to separate business logic from technology, and this has enabled Microsoft to move the NAV application into the cloud; the result is Madeira. This is not quite NAV, but a version of NAV tailored to a “cloud first, mobile first” world. It runs in the browser, and as apps on Windows 10 (UWP), iOS and Android phones and tablets.

Maderia 0

Getting started is a normal web signup, and after a few minutes the complete system is provisioned and ready to go.

Maderia 1

Compared to standard NAV, Madeira has received a very nice and very sleek onboarding experience that will take you through the system and familiarize you with navigation and usage.

Maderia 2 Maderia 3

Apart from the onboarding experience, Madeira will also help you get the setup done quickly and correctly.

Maderia 6

A big part of NAV’s success is the possibility for customization. Madeira can also be customized; this is done through extensions, and already in this preview version there are three extensions installed:

  • Bank statement integration
  • Paypal payment integration
  • QuickBooks data migration

The last one is interesting, since it indicates that Microsoft knows that people are coming from other systems, so getting your data into the system is a critical step for success.

My guess is that many of the existing extensions for NAV will either work directly with Madeira or very quickly get updated to support it.

Maderia 4

Madeira also has NAV’s ability to be a connected system, ranging from connections to the rest of the Microsoft cloud world to external services like invoice OCR.

Maderia 5

In NAV it’s possible to use Word to create custom layouts for reports, effectively turning Word into a report designer – you can do the same in Madeira, so you are not “stuck” with the stock design.

Maderia 7

The great thing about Madeira is that you get a mature, deep and complete ERP system. It is new, but still a very mature system that will get things done, the right way.

I’m looking forward to playing some more with Madeira; catch me on Twitter @eHougaard if you have any questions on Madeira or NAV.

Power BI Embedded – A minimal example

Now we have an embedded version of Power BI. Power BI is a great reporting/BI tool, and the possibilities of embedding it into your own products or sites are great.

Microsoft has even gone the whole way and created a beautiful example application website; it can be found here:

https://azure.microsoft.com/en-us/documentation/articles/power-bi-embedded-get-started-sample/

The problem (for me at least) is that my world is different from the example. So I really just want a minimal, bare-bones example that I can extend and build a prototype from.

Here is my minimal recipe for an embedded Power BI example:

First you need a Power BI Workspace Collection. Go to your Azure Portal and create one (search for Power BI after pressing New). This is your “empty” report collection and the starting point for a Power BI Embedded installation. You need to remember the name and copy one of the Access Keys; we are going to need that later. After this, you’re done in Azure for now. (Presently Power BI Embedded is only available in South Central US. I was told at /BUILD that within this month it should be available in all Azure data centers.)

To continue, we must create a workspace inside our collection. I’m using C#, and there are pre-release NuGet packages available. The first thing we need is to log in and get a valid PowerBIClient – to do that, we need the Access Key we just got from the Azure Portal.

static IPowerBIClient CreateClient(PowerBIToken token)
{
    var jwt = token.Generate(<access key from azure portal>);
    var credentials = new TokenCredentials(jwt, "AppToken");
    var client = new PowerBIClient(credentials);
    client.BaseUri = new Uri("https://api.powerbi.com");
    return client;
}        
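
For completeness, these are the using directives the snippets in this post assume. The exact namespaces come from the pre-release NuGet packages (Microsoft.PowerBI.Core, Microsoft.PowerBI.Api and Microsoft.Rest.ClientRuntime at the time of writing), so treat this as a sketch and verify against the versions you actually install:

using System;
using System.Collections.Generic;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.PowerBI.Api.V1;         // PowerBIClient and its operation groups
using Microsoft.PowerBI.Api.V1.Models;  // Workspace, Import, GatewayDatasource, ...
using Microsoft.PowerBI.Security;       // PowerBIToken
using Microsoft.Rest;                   // TokenCredentials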

Next we can use the client to create a workspace; you need to supply the name of the Azure Workspace Collection just created.

Workspace workspace;
var provisionToken = PowerBIToken.CreateProvisionToken(<name of workspace collection>);
using (var client = CreateClient(provisionToken))
{
     workspace = await client.Workspaces.PostWorkspaceAsync(<name of workspace collection>);
}

The newly created workspace has an ID – Get it from workspace.WorkspaceId or from the Azure portal.

The next task is to create a report. Reports are in a file format called PBIX and can be created with Power BI Desktop (download).

When creating a report, you need to decide where the data is stored. My world is SQL and Azure SQL, so I’m using “Direct Query” – meaning that the data is not bundled with the report. Your mileage may vary. There is a lot of material on creating reports in Power BI, so I’ll skip that part in this minimal example.

After that, we need to upload the report to our workspace:

static async Task<Import> ImportPbix(string workspaceCollectionName, string workspaceId, string datasetName, string filePath)
{
    using (var fileStream = File.OpenRead(filePath))
    {
        // Create a dev token for import
        var devToken = PowerBIToken.CreateDevToken(workspaceCollectionName, workspaceId);
        using (var client = CreateClient(devToken))
        {

            // Import PBIX file from the file stream
            var import = await client.Imports.PostImportWithFileAsync(workspaceCollectionName, workspaceId, fileStream, datasetName);

            // Example of polling the import to check when the import has succeeded.
            while (import.ImportState != "Succeeded" && import.ImportState != "Failed")
            {
                import = await client.Imports.GetImportByIdAsync(workspaceCollectionName, workspaceId, import.Id);
                Console.WriteLine("Checking import state... {0}", import.ImportState);
                Thread.Sleep(1000);
            }

            return import;
        }
    }
}

After the import, the report also gets an ID.
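
If you prefer to look the Report ID up from code instead of the portal, something along these lines should work (the Reports.GetReportsAsync call is my assumption, based on the official sample application):

static async Task<string> GetFirstReportId(string workspaceCollectionName, string workspaceId)
{
    var devToken = PowerBIToken.CreateDevToken(workspaceCollectionName, workspaceId);
    using (var client = CreateClient(devToken))
    {
        // List the reports in the workspace and return the ID of the first one
        var reports = await client.Reports.GetReportsAsync(workspaceCollectionName, workspaceId);
        return reports.Value[0].Id;
    }
}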

In my case, I’m using a dynamic back end where I have multiple databases, so I need to point the report at a specific database that is different from the database I used when I created the report:

static async Task UpdateConnection(string workspaceCollectionName, string workspaceId)
{
    string connectionString = "data source = <Azure SQL Server>.database.windows.net; initial catalog = <database>; persist security info = True; encrypt = True; trustservercertificate = False";

    var devToken = PowerBIToken.CreateDevToken(workspaceCollectionName, workspaceId);
    using (var client = CreateClient(devToken))
    {
        // Get the newly created dataset from the previous import process
        var datasets = await client.Datasets.GetDatasetsAsync(workspaceCollectionName, workspaceId);

        // Optionally update the connection string details if present
        if (!string.IsNullOrWhiteSpace(connectionString))
        {
            var connectionParameters = new Dictionary<string, object>
            {
                { "connectionString", connectionString }
            };
            await client.Datasets.SetAllConnectionsAsync(workspaceCollectionName, 
                                                            workspaceId, 
                                                            datasets.Value[datasets.Value.Count - 1].Id, connectionParameters);
        }

        // Get the datasources from the dataset
        var datasources = await client.Datasets.GetGatewayDatasourcesAsync(workspaceCollectionName, workspaceId, datasets.Value[datasets.Value.Count - 1].Id);

        // Reset your connection credentials
        var delta = new GatewayDatasource
        {
            CredentialType = "Basic",
            BasicCredentials = new BasicCredentials
            {
                Username = <Azure SQL User Name>,
                Password = <Azure SQL Password>                        
            }
        };

        // Update the datasource with the specified credentials
        await client.Gateways.PatchDatasourceAsync(workspaceCollectionName, workspaceId, datasources.Value[datasources.Value.Count - 1].GatewayId, datasources.Value[datasources.Value.Count - 1].Id, delta);
    }
}
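
A minimal sketch of calling the two helpers above in order; the collection name, workspace ID, dataset name and PBIX path are placeholders you substitute yourself:

// Upload the report, then point its dataset at the right database
var import = await ImportPbix(<name of workspace collection>,
                              <workspace ID>,
                              "MyDataset",                  // hypothetical dataset name
                              @"C:\Reports\MyReport.pbix"); // hypothetical PBIX path
Console.WriteLine("Import finished with state: {0}", import.ImportState);

await UpdateConnection(<name of workspace collection>, <workspace ID>);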

Now we are ready, and the report can be used. Let’s recap all the “keys”, “tokens” and “IDs” that are needed:

  • Access Key to the Azure Power BI Workspace Collection
  • ID of the workspace
  • ID of the report

To get an active, working instance of the report, we need an “Access Token”. This token, together with the Report ID, will be visible to users. The token will expire after a defined period of time.

This function will create a report Access Token that we need to execute the report.

public static async Task<string> Report(string reportId)
{
    var devToken = PowerBIToken.CreateDevToken(<Azure WorkSpace Collection>, <Workspace ID>);
    using (var client = CreateClient(devToken))
    {
        var embedToken = PowerBIToken.CreateReportEmbedToken(<Azure WorkSpace Collection>, <Workspace ID>,reportId);

        return embedToken.Generate(<Access Key>);
    }
}
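
Calling it is straightforward; the resulting string is what goes into the powerbi-access-token attribute in the HTML below:

// Generate the embed token for the report and print it,
// ready to paste into the HTML file
var accessToken = await Report(<The Report ID>);
Console.WriteLine(accessToken);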

Then, instead of an entire website, I’ve just created one HTML file that will show the report; there is a bit of JavaScript at the beginning and one div section with an iframe. The good pieces are at the bottom: the HTML needs the Report ID (three times) and the Access Token we just generated.

<html>
<script>
(function (powerbi) {
    'use strict';

    powerbi.Embed = Embed;

    function Embed() { }

    Embed.prototype = {
        init: function () {
            var embedUrl = this.getEmbedUrl();
            var iframeHtml = '<iframe style="width:100%;height:100%;" src="' + embedUrl + '" scrolling="no" allowfullscreen="true"></iframe>';
            this.element.innerHTML = iframeHtml;
            this.iframe = this.element.childNodes[0];
            this.iframe.addEventListener('load', this.load.bind(this), false);
        },
        load: function () {
            var computedStyle = window.getComputedStyle(this.element);
            var accessToken = this.getAccessToken();
            
            var initEventArgs = {
                message: {
                    action: this.options.loadAction,
                    accessToken: accessToken,
                    width: computedStyle.width,
                    height: computedStyle.height
                }
            };

            powerbi.utils.raiseCustomEvent(this.element, 'embed-init', initEventArgs);
            this.iframe.contentWindow.postMessage(JSON.stringify(initEventArgs.message), '*');
        },
        getAccessToken: function () {
            var accessToken = this.element.getAttribute('powerbi-access-token');

            if (!accessToken) {
                accessToken = powerbi.accessToken;
                
                if (!accessToken) {
                    throw new Error("No access token was found for element. You must specify an access token directly on the element using attribute 'powerbi-access-token' or specify a global token at: powerbi.accessToken.");
                }
            }

            return accessToken;
        },
        getEmbedUrl: function () {
            return this.element.getAttribute('powerbi-embed');
        },
        fullscreen: function () {
            var elem = this.iframe;

            if (elem.requestFullscreen) {
                elem.requestFullscreen();
            } else if (elem.msRequestFullscreen) {
                elem.msRequestFullscreen();
            } else if (elem.mozRequestFullScreen) {
                elem.mozRequestFullScreen();
            } else if (elem.webkitRequestFullscreen) {
                elem.webkitRequestFullscreen();
            }
        },
        exitFullscreen: function () {
            if (!this.isFullscreen()) {
                return;
            }

            if (document.exitFullscreen) {
                document.exitFullscreen();
            } else if (document.mozCancelFullScreen) {
                document.mozCancelFullScreen();
            } else if (document.webkitExitFullscreen) {
                document.webkitExitFullscreen();
            } else if (document.msExitFullscreen) {
                document.msExitFullscreen();
            }
        },
        isFullscreen: function () {
            var options = ['fullscreenElement', 'webkitFullscreenElement', 'mozFullscreenScreenElement', 'msFullscreenElement'];
            for (var i = 0; i < options.length; i++) {
                if (document[options[i]] === this.iframe) {
                    return true;
                }
            }

            return false;
        }
    };
} (window.powerbi = window.powerbi || {}));
(function(powerbi){
    'use strict';
    
    powerbi.Tile = Tile;
    powerbi.Tile.prototype = powerbi.Embed.prototype;
    
    function Tile(element, options) {
        var me = this;
        
        this.element = element;
        this.options = options || {};
        this.options.loadAction = 'loadTile';
        this.getEmbedUrl = getEmbedUrl;

        this.init();

        ///////////////////////////////

        function getEmbedUrl() {
            var embedUrl = powerbi.Embed.prototype.getEmbedUrl.call(me);
            if (!embedUrl) {
                var dashboardId = me.element.getAttribute('powerbi-dashboard');
                var tileId = me.element.getAttribute('powerbi-tile');

                if (!(dashboardId && tileId)) {
                    throw new Error('dashboardId & tileId are required');
                }

                embedUrl = 'https://app.powerbi.com/embed?dashboardId=' + dashboardId + '&tileId=' + tileId;
            }

            return embedUrl;
        }
    }
}(window.powerbi = window.powerbi || {}));
(function(powerbi){
    'use strict';
    
    powerbi.Report = Report;
    powerbi.Report.prototype = powerbi.Embed.prototype;
    
    function Report(element, options) {
        var me = this;
        
        this.element = element;
        this.options = options || {};
        this.options.loadAction = 'loadReport';
        this.getEmbedUrl = getEmbedUrl;

        this.init();

        ///////////////////////////////

        function getEmbedUrl() {
            var embedUrl = powerbi.Embed.prototype.getEmbedUrl.call(me);
            if (!embedUrl) {
                var reportId = me.element.getAttribute('powerbi-report');

                if (!reportId) {
                    throw new Error('reportId is required');
                }

                embedUrl = 'https://app.powerbi.com/reportEmbed?reportId=' + reportId;
            }

            return embedUrl;
        }
    }
}(window.powerbi = window.powerbi || {}));
(function (powerbi) {
    'use strict';
    
    powerbi.utils = {
        raiseCustomEvent: raiseCustomEvent
    };

    function raiseCustomEvent(element, eventName, eventData) {
        var customEvent;
        if (typeof (window.CustomEvent) === 'function') {
            customEvent = new CustomEvent(eventName, {
                detail: eventData,
                bubbles: true,
                cancelable: true
            });
        } else {
            customEvent = document.createEvent('CustomEvent');
            customEvent.initCustomEvent(eventName, true, true, eventData);
        }

        element.dispatchEvent(customEvent);
        if (customEvent.defaultPrevented || !customEvent.returnValue) {
            return;
        }

        var inlineEventAttr = 'on' + eventName.replace('-', '');
        var inlineScript = element.getAttribute(inlineEventAttr);
        if (inlineScript) {
            eval.call(element, inlineScript);
        }
    }
} (window.powerbi = window.powerbi || {}));
(function (powerbi) {
    'use strict';

    var embeds = [];
    var componentTypes = [];

    powerbi.get = get;
    powerbi.embed = embed;
    powerbi.init = init;

    activate();
    
    //////////////////////////////////    
    
    function activate() {
        window.addEventListener('DOMContentLoaded', init, false);
        window.addEventListener('message', onReceiveMessage, false);

        componentTypes = [
            { type: 'powerbi-tile', component: powerbi.Tile },
            { type: 'powerbi-report', component: powerbi.Report }
        ];
    }

    var EmbedEventMap = {
        'tileClicked': 'tile-click',
        'tileLoaded': 'tile-load',
        'reportPageLoaded': 'report-load'
    };

    function init(container) {
        container = (container && container instanceof HTMLElement) ? container : document.body;
        
        var components = container.querySelectorAll('[powerbi-embed]');
        for (var i = 0; i < components.length; i++) {
            embed(components[i]);
        }
    }

    function get(element) {
        return element.powerBIEmbed || embed(element);
    }

    function embed(element) {
        var instance;

        if (element.powerBIEmbed) {
            return element.powerBIEmbed;
        }

        for (var j = 0; j < componentTypes.length; j++) {
            var componentType = componentTypes[j];

            if (element.getAttribute(componentType.type) !== null) {
                instance = new componentType.component(element);
                element.powerBIEmbed = instance;
                embeds.push(instance);
                break;
            }
        }

        return instance;
    }

    function onReceiveMessage(event) {
        if (!event) {
            return;
        }

        try {
            var messageData = JSON.parse(event.data);
            for (var i = 0; i < embeds.length; i++) {
                var embed = embeds[i];

                // Only raise the event on the embed that matches the post message origin
                if (event.source === embed.iframe.contentWindow) {
                    powerbi.utils.raiseCustomEvent(embed.element, EmbedEventMap[messageData.event], messageData);
                }
            }
        }
        catch (e) {
            if (typeof (window.powerbi.onError) === 'function') {
                window.powerbi.onError.call(window, e);
            }
        }
    }
} (window.powerbi = window.powerbi || {}));
</script>
<body>
<div id="Report" 
     powerbi-access-token="<The access token just generated>" 
     powerbi-embed="https://embedded.powerbi.com/appTokenReportEmbed?reportId=<The Report ID>" 
     powerbi-report="<The Report ID>" 
     style="height:85vh">
     <iframe style="width:100%;height:100%;" src="https://embedded.powerbi.com/appTokenReportEmbed?reportId=<The Report ID>" scrolling="no" allowfullscreen="true"></iframe></div>
</body>
</html>

Open the HTML file in a browser, and the report is available to you:

powerbi-2.

Hope this will help you get going faster with Power BI Embedded.

Let’s clean up NAV – Telex

Back in the old days, Telex was the most advanced form of electronic communication. But since the 80s and 90s it has been overtaken by fax and email.

NAV still has fields for Telex No. on Customers, Vendors and Contacts; perhaps it’s time to lay this technology to rest.

The fields are not on the UI, but still in the database.

telex

telex2

And at the same time, modernize the current communication setup, so it could be possible to have a cell phone field or perhaps other fields for newer forms of communication.


This month’s CUs are out…

Grab ’em while they’re hot 🙂

2016 CU5 – https://blogs.msdn.microsoft.com/nav/2016/03/07/cumulative-update-5-for-microsoft-dynamics-nav-2016-has-been-released/

2015 CU17 – https://blogs.msdn.microsoft.com/nav/2016/03/07/cumulative-update-17-for-microsoft-dynamics-nav-2015-has-been-released/

2013R2 CU29 – https://blogs.msdn.microsoft.com/nav/2016/03/07/cumulative-update-29-for-microsoft-dynamics-nav-2013-r2-has-been-released/

2013 CU36 – https://blogs.msdn.microsoft.com/nav/2016/03/07/cumulative-update-36-for-microsoft-dynamics-nav-2013-has-been-released/

And a very nice surprise: hotfixes are no longer distributed as a ZIP inside an EXE, just a regular self-extracting EXE – nice!
Not true; sometimes I get fooled by my own cleverness 🙂 (But it would be nice)

Dynamics NAV Server cannot start after enabling SOAP or OData

Sometimes when you turn on SOAP or OData services, the service tier starts and stops again, and gives you this error:

The service MicrosoftDynamicsNavServer$DynamicsNAV90 failed to start. This could be caused by a configuration error. Detailed error information: System.ServiceModel.AddressAccessDeniedException: HTTP could not register URL http://+:9047/DynamicsNAV90/. Your process does not have access rights to this namespace (see http://go.microsoft.com/fwlink/?LinkId=70353 for details). ---> System.Net.HttpListenerException: Access is denied

This is WCF failing to reserve the URL inside http.sys.

The solution is quite easy: just perform the registration yourself. Open a command line and execute the following command:

netsh http add urlacl url=http://+:9047/DynamicsNAV90/ user="NT AUTHORITY\NETWORK SERVICE"

Make sure to specify the correct port number, instance name and the user name of the user running the service tier (in my case here, “NETWORK SERVICE”).


The new code editor in Dynamics NAV

I have really tried to like, even love, this editor. But every time I use it, it keeps fighting the C/AL syntax and the way C/AL is usually typed. This is actually a step back in productivity for me.

The editor is missing several context-aware features that are essential for programming in C/AL – and it even changes correctly typed code into wrong code.

1. Inside a field name with multiple name parts.
Try to type:
GLEntry.SETFILTER("Dimension Set ID");
This is what you’ll get if you just type it:
GLEntry.SETFILTER("Dimension SETDEFAULTTABLECONNECTION ID");

Try to type:
rec.SETRANGE("Account No.");
This is what you’ll get if you just type it:
rec.SETRANGE("Account NORMALDATE");
because the dot in "No." gets completed to NORMALDATE.

2. Field name parameters to functions like Rec.SETRANGE().

3. Parameters to functions that take multiple field names, like SETCURRENTKEY, CALCSUMS and CALCFIELDS.

4. You can place the cursor on the line of a function definition, but you cannot do anything more. Accepting an Enter key press there, to insert a new line at the top of a code section, would be very nice.

5. You can DOT yourself via IntelliSense into the options on an option field, and the source will be written with a dot separator instead of the required :: separator. If you type a colon, the system does this correctly.

6. If a field in Rec has multiple name parts and you start typing with the double quote ", IntelliSense will not show the field names.

7. Visual Studio is clever enough to figure out that typing a name or symbol should un-indent the line. NAV does not do that, so I end up spending more time fixing indentation than I would if the editor didn’t try to “help” me.

8. When indented, pressing Enter twice to get a blank line will often result in the cursor being placed in column 1 (leftmost).

The TODO list to salvage this editor.

Learn the language – the C/AL parser needs to understand the Pascal-based syntax; this is the basis for any “IntelliSense” functionality to be intelligent.

If I just type code, letter by letter, IntelliSense should NEVER change my correctly typed code into wrong code; this is the first rule for any “smart editor”.

Be aware of the open " (double quote) state – this would solve most of my issues. Either figure out which field list to show as “IntelliSense”, or simply don’t show anything.

There is an option to use the old editor with FINSQL.EXE:

[useoldeditor=<yes|no>]

But as I said at the start, I really want to use the new one, so here’s hoping for the next cumulative updates 🙂

Export NAV license from SQL Server

Microsoft Dynamics NAV does not give you an easy option to export the NAV license from a SQL Server.

exportlicense1

You can upload it to the server, or temporarily change the license used in the development client, but you cannot download it again. Usually that is not a big deal, you have your license file anyway, but in some cases it is practical to be able to export the license.

With NAV, the license can be placed in two different places: in the master database or in the individual databases.

exportlicense3

If you check “Save license in database”, the license goes into the database; otherwise it is stored on the server, in the master database.
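
Under the hood, the license is just a BLOB in a SQL table. A minimal sketch of reading it with plain ADO.NET could look like the snippet below; the table names ($ndo$srvproperty in master, $ndo$dbproperty in the individual database) are my understanding of how NAV stores the license, so verify them against your own server before relying on this:

using System.Data.SqlClient;
using System.IO;

static void ExportNavLicense(string connectionString, bool fromMaster, string outputPath)
{
    // The license BLOB lives in [$ndo$srvproperty] (master) or [$ndo$dbproperty] (per database)
    var table = fromMaster ? "[master].[dbo].[$ndo$srvproperty]" : "[dbo].[$ndo$dbproperty]";
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand("SELECT [license] FROM " + table, connection))
    {
        connection.Open();
        var license = (byte[])command.ExecuteScalar();
        File.WriteAllBytes(outputPath, license);
    }
}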

Introducing a simple tool to perform the export

exportlicense2

This tool will export the license file from the SQL Server and database specified. If you leave the database field empty, the tool will export the license stored in the master database.

Download from here: ExportLicense

(13/2/2016: Updated to better handle SQL servers with different collations)

Or grab the source from github:
https://github.com/hougaard/ExportLicense

Change your way – Functional Programming In Dynamics NAV

I’m always ranting about “do not use global variables” in Dynamics NAV; this article is an attempt to give some background for the ranting.

In Microsoft Dynamics NAV we program in a language called C/AL. This language is related to Pascal, a language created by Niklaus Wirth in the late 60s. Pascal belongs to a language family with Modula-2, Oberon, Delphi and many others.

Fun fact: one of the first Pascal compilers was written in Pascal and translated by hand into machine code by Wirth and his students.

And there is a fun relation between Dynamics and the history of Pascal: the man behind some of the most successful Pascal compilers, PolyPascal, Turbo Pascal and Delphi, is Anders Hejlsberg, and his brother is Thomas Hejlsberg, one of the architects behind Dynamics NAV.

In the late 80s, when PC-Plus (later Navision Software, later Microsoft) designed the first version of AL, Pascal was the most popular programming language in Denmark and a natural language choice for a new development system.

The first version of AL (the character-based version 3.00) worked with objects just as we know them today, but it was very limited compared to what we have now. The name later changed to C/AL with the release of the first Windows version, called Financials.

One of the main limitations in AL was the lack of local variables. This of course led to heavy use of global variables (just called variables, since there was no local/global distinction). This continued into Financials, even though Financials did have locals. The reason was that parts of Financials were actually coded in the old character-based version and then converted to Financials with the auto-converter.

So the reason we are still fighting with the less than optimally designed codeunits 80 and 90 is that they were programmed in an environment with no local variables, limited call stack levels and other limitations.

So why do local variables matter?

First we need to look at another branch of programming, a branch called functional programming. Its origin is math and a language called Lisp. Lisp is older than Pascal, and actually one of the very first programming languages. Lisp is often ridiculed for its massive usage of parentheses, and its beauty is overlooked.

I had my first encounter with Lisp at Copenhagen University, where they decided that we should learn a language that no one had ever used; that turned out to be Lisp, followed up with ML. I learned a lot 🙂

In (the purest form of) Lisp you don’t have global variables, and the only local variables are parameters to functions. So everything needs to be functions with parameters and return values.

This can be much harder to program, but is much easier to test.

There are plenty of other languages that follow the tradition started by Lisp – my current favorite is F#, but I do have fond memories of ML (if only I could have found a compiler that worked back in the 90s).

The style of programming in these languages is called Functional Programming.

So what are some of the features in Functional Programming?

First-class and higher-order functions

A function is a value, just like an Int or a String. You can pass a function as a parameter to another function. This is not really possible in NAV, except through some clever .NET tricks for creating delegates.
A classic use case for this could be a sorting function that takes a comparison function as a parameter, as sketched below.
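
To make the idea concrete, here is the classic example in C# (used only because C/AL cannot express it without the .NET delegate tricks mentioned above):

using System;
using System.Collections.Generic;

class HigherOrderExample
{
    static void Main()
    {
        var numbers = new List<int> { 5, 3, 8, 1 };

        // The comparison function is a value we pass to the sorting function
        Comparison<int> descending = (a, b) => b.CompareTo(a);
        numbers.Sort(descending);

        Console.WriteLine(string.Join(", ", numbers)); // 8, 5, 3, 1
    }
}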

Pure functions

Think of a function as a mathematical function: it takes parameters and returns a value, nothing more, nothing less. No side effects. When we throw a database into the mix, this gets more complicated.
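
A sketch of the difference, again in C# purely for illustration:

static class PurityExample
{
    // Pure: the result depends only on the parameters, no side effects
    public static decimal AddVat(decimal amount, decimal vatPercent)
    {
        return amount * (1 + vatPercent / 100m);
    }

    // Not pure: it reads and writes state outside its own scope (a "global"),
    // so the same call can return different results over time
    private static decimal runningTotal;
    public static decimal AddToTotal(decimal amount)
    {
        runningTotal += amount;
        return runningTotal;
    }
}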

Recursion

Recursion is not an exclusively functional programming feature; recursion was actually invented before Lisp, in the language ALGOL, by another Dane, the late Peter Naur. But recursion does not work well in a system dominated by global variables. Recursion can also place a bigger strain on memory usage unless the system supports tail call optimization, a method to reduce the amount of call stack memory needed for keeping each recursion level in memory.

No side-effects

A side effect is when a piece of code modifies anything outside its own scope. A global variable is outside the scope of a function. The database is also very much outside.

In a pure functional environment, there are no side effects.

Classic code mistakes in Dynamics NAV

So if we start looking at NAV code with the knowledge of functional programming, what “mistakes” could we try to avoid?

Global variables should be avoided. Global variables, if used, should be immutable, a fancy computer science word for “read-only after create”: once written, never overwritten.

“Hang-over filters” – this is actually globals-on-globals: people reuse the same Record variable, and the properties on the Record stay on the Record. So every time you use the .RESET function on a record variable, you’re probably doing something wrong. If Microsoft removed the .RESET method, your code would actually improve, because you would be forced to think about scope and, in a lot of places, move the inner code of loops into separate functions.

Real recursion only works if the globals are reduced to immutable usage.

Big functions – the only reason functions can get so big is globals (again!). So if you reduce the number of globals, your functions will shrink in size.

What about testing?

A function that only takes parameters and returns a value is very easy to test: you can just call the function and verify the return value with a test codeunit. But if you need to set up global variables and environment, it suddenly becomes difficult to test the function, and usually you start your testing at a higher level. And if the higher level has the same issues, testing becomes hard.
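
The pay-off shows up in the tests. A pure function like the AddVat sketch above needs no setup at all; with xUnit in C# (again just for illustration) the whole test is:

using Xunit;

public class PurityExampleTests
{
    [Fact]
    public void AddVat_AddsTwentyFivePercent()
    {
        // No globals to prepare, no environment to set up:
        // call the function and check the return value
        Assert.Equal(125m, PurityExample.AddVat(100m, 25m));
    }
}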

Conclusion – TL;DR

These ideas have been an integrated part of how I program for many years. Not that I’m “holy” or even close to that. But I do believe in “borrowing” good ideas from all corners of the computer science world. And functional programming has had, for me, a very positive influence on how programs get structured and how easy they are to test and maintain.

There are lots of other aspects of functional programming that I didn’t cover in this already too long blog post, so reach me on Twitter if you want to talk functional programming in Dynamics NAV.

(*) Thanks to www.xkcd.org for the comics to liven up this boring tale.