NAVUG Focus 2016

This week I’ll be talking at NAVUG Focus 2016 in Chicago. My talk is called:

12 Things You Always Wanted to Know About the Cloud
But Were Afraid to Ask

If you have a question about Microsoft Dynamics NAV and “The Cloud”, send me a tweet at @eHougaard.

I will also be part of the “Ask the Experts” panel for questions on everything else.

What is a Project Madeira Account?

A Madeira account is an Office 365 account, also known as a “work or school account”.

If you are using your Microsoft Account to sign up for Madeira, you are actually adding a “new” account to your email address. So the next time you log in with that email, you might see this:

[Screenshot: the sign-in account picker]

The top one is the new “Office 365” account, and the bottom one is your old Microsoft Account. Confusing 🙂

So, if you have an Office 365 subscription, use that account for Madeira; that will give you the option of trying out the Outlook integration pieces and avoids creating another account.

Project Madeira – First Look

Today we got the first public preview of “Project Madeira” – a cloud-based ERP system, highly integrated into the Microsoft cloud world.

Madeira has its roots in Dynamics NAV, an ERP system with 30+ years of accounting knowledge built in and an impressive technology stack. NAV is used by thousands and thousands of companies all over the world, ranging from quite big to small single-person shops.

NAV has always been able to detach business logic from technology, and this has enabled Microsoft to move the NAV application into the cloud; the result is Madeira. This is not quite NAV, but a version of NAV tailored to a “cloud first, mobile first” world. It runs in the browser, and as apps on Windows 10 (UWP), iOS and Android phones and tablets.

Getting started is a normal web signup, and after a few minutes, the complete system is provisioned and ready to go.

Compared to standard NAV, Madeira has received a very nice and very sleek onboarding experience that will take you through the system and familiarize you with navigation and usage.

Apart from the onboarding experience, Madeira will also help you get the setup done quickly and correctly.

A big part of NAV’s success is the possibility for customization. Madeira can also be customized; this is done through extensions, and already in this preview version there are three extensions installed:

  • Bank statement integration
  • PayPal payment integration
  • QuickBooks data migration

The last one is interesting, since it indicates that Microsoft knows people are coming from other systems, and that getting your data into the system is a critical step for success.

My guess is that many of the already existing extensions for NAV will either work directly with Madeira or will very quickly be updated to support it.

Madeira also has NAV’s ability to be a connected system, from connecting to the rest of the Microsoft cloud world to external services like invoice OCR.

In NAV it’s possible to use Word to create custom layouts for reports, effectively turning Word into a report designer – you can do the same in Madeira, so you are not “stuck” with the stock design.

The great thing about Madeira is that you get a mature, deep and complete ERP system. The product is new, but built on a very mature foundation that will get things done the right way.

I’m looking forward to playing some more with Madeira. Catch me on Twitter @eHougaard if you have any questions on Madeira or NAV.

Power BI Embedded – A minimal example

Now we have an embedded version of Power BI. Power BI is a great reporting/BI tool, and the possibilities of embedding it into your own products or sites are great.

Microsoft has even gone the whole way and created a beautiful example application website; it can be found here:

https://azure.microsoft.com/en-us/documentation/articles/power-bi-embedded-get-started-sample/

The problem (for me at least) is that my world is different from the example. So I really just want a minimal, barebones example that I can extend and build a prototype from.

Here is my minimal recipe for a Power BI Embedded example:

First you need a Power BI Workspace Collection. Go to your Azure Portal and create one (search for Power BI after pressing New). This is your “empty” report collection and the starting point for a Power BI Embedded installation. You need to remember the name and copy one of the Access Keys; we are going to need both later. After this, you’re done in Azure for now. (Presently Power BI Embedded is only available in South Central US. I was told at /BUILD that within this month it should be available in all Azure data centers.)
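To keep the following snippets readable, here is a tiny sketch of the two values we will carry along – the constant names are my own placeholders, not part of any SDK:

// Hypothetical constants for the values noted down in the Azure Portal
const string WorkspaceCollectionName = "<name of workspace collection>";
const string AccessKey = "<access key from azure portal>";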

To continue, we must create a workspace inside our collection. I’m using C#, and there are NuGet packages (pre-release) available. The first thing we need is to log in and get a valid PowerBIClient – to do that, we need the Access Key we just got from the Azure Portal.

static IPowerBIClient CreateClient(PowerBIToken token)
{
    // Sign the token with the Access Key from the Azure Portal to get a JWT
    var jwt = token.Generate(<access key from azure portal>);
    // Wrap the JWT as an "AppToken" credential and point the client at the Power BI API
    var credentials = new TokenCredentials(jwt, "AppToken");
    var client = new PowerBIClient(credentials);
    client.BaseUri = new Uri("https://api.powerbi.com");
    return client;
}

Next, we can use the client to create a workspace; you need to supply the name of the Azure Workspace Collection just created.

Workspace workspace;
var provisionToken = PowerBIToken.CreateProvisionToken(<name of workspace collection>);
using (var client = CreateClient(provisionToken))
{
     workspace = await client.Workspaces.PostWorkspaceAsync(<name of workspace collection>);
}

The newly created workspace has an ID – get it from workspace.WorkspaceId or from the Azure portal.

The next task is to create a report. Reports are stored in a file format called PBIX and can be created with Power BI Desktop (download).

When creating a report, you need to decide where the data is stored. My world is SQL and Azure SQL, so I’m using “Direct Query” – meaning that the data is not bundled with the report. Your mileage may vary. There is a lot of material on creating reports in Power BI, so I’ve skipped that part in this minimal example.

After that, we need to upload the report to our workspace:

static async Task<Import> ImportPbix(string workspaceCollectionName, string workspaceId, string datasetName, string filePath)
{
    using (var fileStream = File.OpenRead(filePath))
    {
        // Create a dev token for import
        var devToken = PowerBIToken.CreateDevToken(workspaceCollectionName, workspaceId);
        using (var client = CreateClient(devToken))
        {

            // Import PBIX file from the file stream
            var import = await client.Imports.PostImportWithFileAsync(workspaceCollectionName, workspaceId, fileStream, datasetName);

            // Example of polling the import to check when the import has succeeded.
            while (import.ImportState != "Succeeded" && import.ImportState != "Failed")
            {
                import = await client.Imports.GetImportByIdAsync(workspaceCollectionName, workspaceId, import.Id);
                Console.WriteLine("Checking import state... {0}", import.ImportState);
                Thread.Sleep(1000);
            }

            return import;
        }
    }
}

After the import, the report also gets an ID.
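If you did not note the ID down, you can find it again by listing the reports in the workspace. A minimal sketch, assuming the preview SDK’s Reports.GetReportsAsync call:

// List the reports in the workspace and print name and ID
var devToken = PowerBIToken.CreateDevToken(<name of workspace collection>, <Workspace ID>);
using (var client = CreateClient(devToken))
{
    var reports = await client.Reports.GetReportsAsync(<name of workspace collection>, <Workspace ID>);
    foreach (var report in reports.Value)
        Console.WriteLine("{0}: {1}", report.Name, report.Id);
}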

In my case, I’m using a dynamic back end where I have multiple databases, so I need to point the report at a specific database that is different from the one I used when creating the report:

static async Task UpdateConnection(string workspaceCollectionName, string workspaceId)
{
    string connectionString = "data source = <Azure SQL Server>.database.windows.net; initial catalog = <database>; persist security info = True; encrypt = True; trustservercertificate = False";

    var devToken = PowerBIToken.CreateDevToken(workspaceCollectionName, workspaceId);
    using (var client = CreateClient(devToken))
    {
        // Get the newly created dataset from the previous import process
        var datasets = await client.Datasets.GetDatasetsAsync(workspaceCollectionName, workspaceId);

        // Optionally update the connection string details if present
        if (!string.IsNullOrWhiteSpace(connectionString))
        {
            var connectionParameters = new Dictionary<string, object>
            {
                { "connectionString", connectionString }
            };
            await client.Datasets.SetAllConnectionsAsync(workspaceCollectionName, 
                                                            workspaceId, 
                                                            datasets.Value[datasets.Value.Count - 1].Id, connectionParameters);
        }

        // Get the datasources from the dataset
        var datasources = await client.Datasets.GetGatewayDatasourcesAsync(workspaceCollectionName, workspaceId, datasets.Value[datasets.Value.Count - 1].Id);

        // Reset your connection credentials
        var delta = new GatewayDatasource
        {
            CredentialType = "Basic",
            BasicCredentials = new BasicCredentials
            {
                Username = <Azure SQL User Name>,
                Password = <Azure SQL Password>                        
            }
        };

        // Update the datasource with the specified credentials
        await client.Gateways.PatchDatasourceAsync(workspaceCollectionName, workspaceId, datasources.Value[datasources.Value.Count - 1].GatewayId, datasources.Value[datasources.Value.Count - 1].Id, delta);
    }
}

Now we are ready and the report can be used. Let’s recap all the “keys”, “tokens” and “IDs” that are needed:

  • Access Key to the Azure Power BI Workspace Collection
  • ID of the workspace
  • ID of the report

To get an active working instance of the report, we need an “Access Token”. This token, together with the Report ID, will be visible to users. The token will expire after a defined period of time.

This function will create a report Access Token that we need to execute the report.

public static async Task<string> Report(string reportId)
{
    var devToken = PowerBIToken.CreateDevToken(<Azure WorkSpace Collection>, <Workspace ID>);
    using (var client = CreateClient(devToken))
    {
        var embedToken = PowerBIToken.CreateReportEmbedToken(<Azure WorkSpace Collection>, <Workspace ID>,reportId);

        return embedToken.Generate(<Access Key>);
    }
}
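A minimal call site, just to show the flow – the returned string is what goes into the HTML file below:

// Generate the embed token for the report and print it, ready to paste into the HTML
var accessToken = await Report(<The Report ID>);
Console.WriteLine(accessToken);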

Then, instead of an entire website, I’ve just created one HTML file that will show the report. There is a bit of JavaScript at the beginning and one div section with an iframe. The good pieces are at the bottom; the HTML needs the Report ID (three times) and the Access Token we just generated.

<html>
<script>
(function (powerbi) {
    'use strict';

    powerbi.Embed = Embed;

    function Embed() { }

    Embed.prototype = {
        init: function () {
            var embedUrl = this.getEmbedUrl();
            var iframeHtml = '<iframe style="width:100%;height:100%;" src="' + embedUrl + '" scrolling="no" allowfullscreen="true"></iframe>';
            this.element.innerHTML = iframeHtml;
            this.iframe = this.element.childNodes[0];
            this.iframe.addEventListener('load', this.load.bind(this), false);
        },
        load: function () {
            var computedStyle = window.getComputedStyle(this.element);
            var accessToken = this.getAccessToken();
            
            var initEventArgs = {
                message: {
                    action: this.options.loadAction,
                    accessToken: accessToken,
                    width: computedStyle.width,
                    height: computedStyle.height
                }
            };

            powerbi.utils.raiseCustomEvent(this.element, 'embed-init', initEventArgs);
            this.iframe.contentWindow.postMessage(JSON.stringify(initEventArgs.message), '*');
        },
        getAccessToken: function () {
            var accessToken = this.element.getAttribute('powerbi-access-token');

            if (!accessToken) {
                accessToken = powerbi.accessToken;
                
                if (!accessToken) {
                    throw new Error("No access token was found for element. You must specify an access token directly on the element using attribute 'powerbi-access-token' or specify a global token at: powerbi.accessToken.");
                }
            }

            return accessToken;
        },
        getEmbedUrl: function () {
            return this.element.getAttribute('powerbi-embed');
        },
        fullscreen: function () {
            var elem = this.iframe;

            if (elem.requestFullscreen) {
                elem.requestFullscreen();
            } else if (elem.msRequestFullscreen) {
                elem.msRequestFullscreen();
            } else if (elem.mozRequestFullScreen) {
                elem.mozRequestFullScreen();
            } else if (elem.webkitRequestFullscreen) {
                elem.webkitRequestFullscreen();
            }
        },
        exitFullscreen: function () {
            if (!this.isFullscreen()) {
                return;
            }

            if (document.exitFullscreen) {
                document.exitFullscreen();
            } else if (document.mozCancelFullScreen) {
                document.mozCancelFullScreen();
            } else if (document.webkitExitFullscreen) {
                document.webkitExitFullscreen();
            } else if (document.msExitFullscreen) {
                document.msExitFullscreen();
            }
        },
        isFullscreen: function () {
            var options = ['fullscreenElement', 'webkitFullscreenElement', 'mozFullScreenElement', 'msFullscreenElement'];
            for (var i = 0; i < options.length; i++) {
                if (document[options[i]] === this.iframe) {
                    return true;
                }
            }

            return false;
        }
    };
} (window.powerbi = window.powerbi || {}));
(function(powerbi){
    'use strict';
    
    powerbi.Tile = Tile;
    powerbi.Tile.prototype = powerbi.Embed.prototype;
    
    function Tile(element, options) {
        var me = this;
        
        this.element = element;
        this.options = options || {};
        this.options.loadAction = 'loadTile';
        this.getEmbedUrl = getEmbedUrl;

        this.init();

        ///////////////////////////////

        function getEmbedUrl() {
            var embedUrl = powerbi.Embed.prototype.getEmbedUrl.call(me);
            if (!embedUrl) {
                var dashboardId = me.element.getAttribute('powerbi-dashboard');
                var tileId = me.element.getAttribute('powerbi-tile');

                if (!(dashboardId && tileId)) {
                    throw new Error('dashboardId & tileId are required');
                }

                embedUrl = 'https://app.powerbi.com/embed?dashboardId=' + dashboardId + '&tileId=' + tileId;
            }

            return embedUrl;
        }
    }
}(window.powerbi = window.powerbi || {}));
(function(powerbi){
    'use strict';
    
    powerbi.Report = Report;
    powerbi.Report.prototype = powerbi.Embed.prototype;
    
    function Report(element, options) {
        var me = this;
        
        this.element = element;
        this.options = options || {};
        this.options.loadAction = 'loadReport';
        this.getEmbedUrl = getEmbedUrl;

        this.init();

        ///////////////////////////////

        function getEmbedUrl() {
            var embedUrl = powerbi.Embed.prototype.getEmbedUrl.call(me);
            if (!embedUrl) {
                var reportId = me.element.getAttribute('powerbi-report');

                if (!reportId) {
                    throw new Error('reportId is required');
                }

                embedUrl = 'https://app.powerbi.com/reportEmbed?reportId=' + reportId;
            }

            return embedUrl;
        }
    }
}(window.powerbi = window.powerbi || {}));
(function (powerbi) {
    'use strict';
    
    powerbi.utils = {
        raiseCustomEvent: raiseCustomEvent
    };

    function raiseCustomEvent(element, eventName, eventData) {
        var customEvent;
        if (typeof (window.CustomEvent) === 'function') {
            customEvent = new CustomEvent(eventName, {
                detail: eventData,
                bubbles: true,
                cancelable: true
            });
        } else {
            customEvent = document.createEvent('CustomEvent');
            customEvent.initCustomEvent(eventName, true, true, eventData);
        }

        element.dispatchEvent(customEvent);
        if (customEvent.defaultPrevented || !customEvent.returnValue) {
            return;
        }

        var inlineEventAttr = 'on' + eventName.replace('-', '');
        var inlineScript = element.getAttribute(inlineEventAttr);
        if (inlineScript) {
            eval.call(element, inlineScript);
        }
    }
} (window.powerbi = window.powerbi || {}));
(function (powerbi) {
    'use strict';

    var embeds = [];
    var componentTypes = [];

    powerbi.get = get;
    powerbi.embed = embed;
    powerbi.init = init;

    activate();
    
    //////////////////////////////////    
    
    function activate() {
        window.addEventListener('DOMContentLoaded', init, false);
        window.addEventListener('message', onReceiveMessage, false);

        componentTypes = [
            { type: 'powerbi-tile', component: powerbi.Tile },
            { type: 'powerbi-report', component: powerbi.Report }
        ];
    }

    var EmbedEventMap = {
        'tileClicked': 'tile-click',
        'tileLoaded': 'tile-load',
        'reportPageLoaded': 'report-load'
    };

    function init(container) {
        container = (container && container instanceof HTMLElement) ? container : document.body;
        
        var components = container.querySelectorAll('[powerbi-embed]');
        for (var i = 0; i < components.length; i++) {
            embed(components[i]);
        }
    }

    function get(element) {
        return element.powerBIEmbed || embed(element);
    }

    function embed(element) {
        var instance;

        if (element.powerBIEmbed) {
            return element.powerBIEmbed;
        }

        for (var j = 0; j < componentTypes.length; j++) {
            var componentType = componentTypes[j];

            if (element.getAttribute(componentType.type) !== null) {
                instance = new componentType.component(element);
                element.powerBIEmbed = instance;
                embeds.push(instance);
                break;
            }
        }

        return instance;
    }

    function onReceiveMessage(event) {
        if (!event) {
            return;
        }

        try {
            var messageData = JSON.parse(event.data);
            for (var i = 0; i < embeds.length; i++) {
                var embed = embeds[i];

                // Only raise the event on the embed that matches the post message origin
                if (event.source === embed.iframe.contentWindow) {
                    powerbi.utils.raiseCustomEvent(embed.element, EmbedEventMap[messageData.event], messageData);
                }
            }
        }
        catch (e) {
            if (typeof (window.powerbi.onError) === 'function') {
                window.powerbi.onError.call(window, e);
            }
        }
    }
} (window.powerbi = window.powerbi || {}));
</script>
<body>
<div id="Report" 
     powerbi-access-token="<The access token just generated>"
     powerbi-embed="https://embedded.powerbi.com/appTokenReportEmbed?reportId=<The Report ID>" 
     powerbi-report="<The Report ID>" 
     style="height:85vh">
     <iframe style="width:100%;height:100%;" src="https://embedded.powerbi.com/appTokenReportEmbed?reportId=<The Report ID>" scrolling="no" allowfullscreen="true"></iframe></div>
</body>
</html>

Open the HTML file in a browser, and the report is available to you:

[Screenshot: the report rendered in the browser]

Hope this will help you get going faster with Power BI Embedded.

Let’s clean up NAV – Telex

Back in the old days, Telex was the most advanced form of electronic communication. But since the ’80s and ’90s it has been taken over by fax and email.

NAV still has fields for Telex No. on Customers, Vendors and Contacts; perhaps it’s time to lay this technology to rest.

The fields are not in the UI, but they are still in the database.

And at the same time, modernize the current communication setup so it could have a cell phone field or perhaps other fields for newer forms of communication.

This month’s CUs are out…

Grab ’em while they’re hot 🙂

2016 CU5 – https://blogs.msdn.microsoft.com/nav/2016/03/07/cumulative-update-5-for-microsoft-dynamics-nav-2016-has-been-released/

2015 CU17 – https://blogs.msdn.microsoft.com/nav/2016/03/07/cumulative-update-17-for-microsoft-dynamics-nav-2015-has-been-released/

2013R2 CU29 – https://blogs.msdn.microsoft.com/nav/2016/03/07/cumulative-update-29-for-microsoft-dynamics-nav-2013-r2-has-been-released/

2013 CU36 – https://blogs.msdn.microsoft.com/nav/2016/03/07/cumulative-update-36-for-microsoft-dynamics-nav-2013-has-been-released/

And a very nice surprise: hotfixes are no longer distributed as a ZIP inside an EXE, just a regular self-extracting EXE – nice!
Not true, sometimes I get fooled by my own cleverness 🙂 (But it would be nice)

Dynamics NAV Server cannot start after enabling SOAP or OData

Sometimes when you turn on SOAP or OData services, the service tier starts and stops again, and gives you this error:

The service MicrosoftDynamicsNavServer$DynamicsNAV90 failed to start. This could be caused by a configuration error. Detailed error information: System.ServiceModel.AddressAccessDeniedException: HTTP could not register URL http://+:9047/DynamicsNAV90/. Your process does not have access rights to this namespace (see http://go.microsoft.com/fwlink/?LinkId=70353 for details). ---> System.Net.HttpListenerException: Access is denied

This is WCF failing to reserve the URL inside http.sys.

The solution is quite easy: just perform the registration yourself. Open an elevated command prompt and execute the following command:

netsh http add urlacl url=http://+:9047/DynamicsNAV90/ user="NT AUTHORITY\NETWORK SERVICE"

Make sure to specify the correct port number, instance name and the user name of the user running the service tier (in my case here, “NETWORK SERVICE”).
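If you want to check which reservations already exist, or remove one you got wrong, http.sys can be inspected and cleaned with the same tool:

netsh http show urlacl
netsh http delete urlacl url=http://+:9047/DynamicsNAV90/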

The new code editor in Dynamics NAV

I have really tried to like, even love, this editor. But every time I use it, it keeps fighting the C/AL syntax and the way C/AL is usually typed. This is actually a step back in productivity for me.

The editor is missing several context-aware features that are essential for programming in C/AL – and it even changes correctly typed code into wrong code.

1. Inside a field name with multiple name parts.
Try to type:
GLEntry.SETFILTER(“Dimension Set ID”);
This is what you’ll get if you just type:
GLEntry.SETFILTER(“Dimension SETDEFAULTTABLECONNECTION ID”);

Try to type:
rec.SETRANGE(“Account No.”);
This is what you’ll get if you just type:
rec.SETRANGE(“Account NORMALDATE”);
Because the dot in No. gets completed to NORMALDATE.

2. Field Name parameters to functions like Rec.SETRANGE()

3. Parameters to functions that take multiple field names, like SETCURRENTKEY, CALCSUMS and CALCFIELDS.

4. You can place the cursor on the line of a function definition, but you cannot do anything more. Accepting an Enter key press to insert a new line at the top of a code section would be very nice.

5. You can DOT yourself via IntelliSense into the options on an option field, and the source will be written with a dot separator instead of the required :: separator. If you type a colon, the system does this correctly.

6. If a field in Rec has multiple name parts and you start typing with the double quote (”), the IntelliSense will not show the field names.

7. Visual Studio is clever enough to figure out that typing certain names or symbols should un-indent the line. NAV does not do that, so I end up spending more time fixing indentation than I would if the editor didn’t try to “help” me.

8. When indented, if I want a blank line and press Enter twice, that will often place the cursor in column 1 (leftmost).

The TODO list to salvage this editor.

Learn the language – the C/AL parser needs to understand the Pascal-based syntax; this is the basis for any “IntelliSense” functionality to be intelligent.

If I just type code, letter by letter, the IntelliSense should NEVER change my correctly typed code into wrong code; this is the first rule for any “smart editor”.

Be aware of the open ” (double quote) state – this would solve most of my issues. Either figure out what field list to show as “intellisense” or simply don’t show anything.

There is an option to use the old editor with FINSQL.EXE

[useoldeditor=<yes|no>]
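So starting the development environment like this brings back the classic editor:

finsql.exe useoldeditor=yes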

But as I started out, I really want to use the new one, so here’s hoping for the next cumulative updates 🙂

Export NAV license from SQL Server

Microsoft Dynamics NAV does not give you an easy option to export the NAV license from a SQL Server.

You can upload a license to the server, or temporarily change the license that is used in the development client, but you cannot download it again. Usually that is not a big deal – you have your license file anyway – but in some cases it’s practical to be able to export the license.

With NAV, the license can be placed in two different places: in the master database or in the individual databases.

[Screenshot: the “Save license in database” option]

If you check “Save license in database”, the license goes into that database; otherwise it is stored on the server.

Introducing a simple tool to perform the export

[Screenshot: the ExportLicense tool]

This tool will export the license file from the SQL server and database specified. If you leave the database field empty, the tool will export the license stored in the master database.
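For the curious, the tool is not magic. Here is a minimal C# sketch of the idea, assuming the license blob lives in the [$ndo$srvproperty] table in master (or in [$ndo$dbproperty] in the individual database) – the table and column names are assumptions from my own digging, so verify them against your server:

using System.Data.SqlClient;
using System.IO;

static void ExportLicense(string server, string database, string outputFile)
{
    // An empty database name means: read the server-wide license from master
    var db = string.IsNullOrEmpty(database) ? "master" : database;
    var table = string.IsNullOrEmpty(database) ? "[$ndo$srvproperty]" : "[$ndo$dbproperty]";

    var connectionString = "Server=" + server + ";Database=" + db + ";Integrated Security=true";
    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();
        using (var cmd = new SqlCommand("SELECT [license] FROM " + table, conn))
        {
            // Assumed: the license column holds the raw .flf content as a binary blob
            var blob = (byte[])cmd.ExecuteScalar();
            File.WriteAllBytes(outputFile, blob);
        }
    }
}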

Download from here: ExportLicense

(13/2/2016: Updated to better handle SQL servers with different collations)

Or grab the source from github:
https://github.com/hougaard/ExportLicense