
Power BI and Dynamics AX: Part 2: Extract and Transform

 

This is part two continuing from Power BI and Dynamics AX: Part 1: Introduction.

This post will focus on the steps involved in extracting data from Dynamics AX into PowerQuery and transforming that data ready to be exposed in your visualisations, or used in PowerQ&A.

Extracting Data via OData

The first few steps of this process would typically be driven by an IT Professional within the business. It requires some knowledge of the structure of data within Dynamics AX to identify the correct query.

To extract data from Dynamics AX, we will be using OData (Open Data Protocol) services. OData allows us to expose a specific query from Dynamics AX for use in external applications. Which queries are available through OData is managed from within AX in Document Data Sources.


Organisational Administration \ Setup \ Document Management \ Document Data Sources

A detailed guide to setting up your Data source can be found here.

Once you have set up your data source, you can view all your published feeds at this URL (replace <Server Name> with your AOS server):

http://<Server Name>:8101/DynamicsAx/Services/ODataQueryService/

Note: some queries, because of the way they have been developed or the joins they contain, do not expose properly through OData. The best test is a quick connection from Excel after publishing the OData service to confirm the query works properly.
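You can run the same quick test from the Power Query side. A minimal sketch in Power Query's formula language (M), entered via the Advanced Editor – replace <Server Name> with your AOS server, as above:

    let
        // Lists every query published through Document Data Sources;
        // a "bad" query will usually surface an error at this point too.
        Source = OData.Feed("http://<Server Name>:8101/DynamicsAx/Services/ODataQueryService/")
    in
        Source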

Some things to keep in mind when selecting your query:

  • Invest the time to find the right query; a huge number of queries ship with standard AX, so review them before trying to extract multiple tables and join them externally.
  • OData protocol filters are not supported, so if you require a filtered dataset to be published, you need to apply the filter from within the AX query or from within PowerQuery (see the sketch after this list). In AX you can do this through the AOT, or by selecting the "Custom Query" option on the Document Data Sources form when creating a new data source.
  • Each record in the response must have a unique primary key. AOT View objects which don't have a unique key will not be presented when you try to query the OData service.
  • If you try to access the URL from a web browser and you receive an "Internal Server Error", you may have published a 'bad' query; try setting your data sources to inactive and reactivating them one by one to find the problem query.
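If you choose to filter from within PowerQuery, a sketch like the following works. ProjTransPostingCube is simply the example query used later in this post, and TransDate is an assumed column name – substitute your own query and columns:

    let
        Source = OData.Feed("http://<Server Name>:8101/DynamicsAx/Services/ODataQueryService/"),
        // Navigate to one published query (entity set) by name
        ProjTrans = Source{[Name="ProjTransPostingCube"]}[Data],
        // Keep only the rows you need before loading them into the data model
        Filtered = Table.SelectRows(ProjTrans, each [TransDate] >= #date(2015, 1, 1))
    in
        Filtered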

Once you have your OData service ready to go, we are ready to connect to the data from PowerQuery. PowerQuery has two main functions when working with data: extraction – from Dynamics AX as well as other data sources – and transformation – removing unwanted columns, renaming columns and tidying up the data to make it as user-friendly as possible.


PowerQuery is accessed through a ribbon within Microsoft Excel. If you don't have PowerQuery installed, you can get it here.

A detailed guide of how to connect to your OData source from PowerQuery can be found here.

Important Note: If you plan to use the scheduled refresh functionality within Power BI, you need to ensure the correct case has been used in the OData URL when it is entered into PowerQuery. At the time of writing, the authentication process for Power BI refresh looks up credentials for the OData service using the following case:

http://<Server Name>:8101/DynamicsAx/Services/ODataQueryService/

If the case of any character differs from the above, authentication will fail on refresh.

Transform your Data

After you've connected to your OData source and pulled it into PowerQuery, you can now leverage the tools within PowerQuery to transform your data ready for end users and visualisations.

The data exposed from Dynamics will come out with technical names and often unwanted columns. Below is an example of the ProjTransPostingCube query from Dynamics AX 2012 R3 CU8.


A detailed guide of how to perform transformations can be found here.

The key transformations to implement when working with Dynamics AX data (a combined Power Query example follows this list):

  • Remove unwanted columns.
  • Rename columns to user-friendly names.
    • Example: "ProjTable_Name" to "Project Name".
    • This step is key for PowerQ&A to support natural language queries.
  • Change DateTime formatted fields to Data type "Date"
    • Example "10/02/2015 00:00:00:00" to "10/02/2015"
  • Merge with other Dynamics AX or Internal Data sources to provide a combined dataset to end users.
    • More details on a merge example can be found here.
  • Insert Year, Quarter and Month Columns to be used in Visualisations.
    • If you select a date field in the query you can add these columns by using the Date option on the "Add Column" ribbon.
    • Once added, be sure to change the data type to "Text"; otherwise, when you include these columns in visualisations, they will try to total years as numeric values.
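Power Query records each of these steps as you click through the ribbon, but it can help to see them together. Below is a combined sketch in M based on the ProjTransPostingCube example; apart from ProjTable_Name, the column names are illustrative assumptions, so adjust them to your own query:

    let
        Source = OData.Feed("http://<Server Name>:8101/DynamicsAx/Services/ODataQueryService/"),
        ProjTrans = Source{[Name="ProjTransPostingCube"]}[Data],
        // Remove unwanted columns
        Removed = Table.RemoveColumns(ProjTrans, {"Partition", "RecId"}),
        // Rename technical names to user-friendly names
        Renamed = Table.RenameColumns(Removed, {{"ProjTable_Name", "Project Name"}, {"TransDate", "Transaction Date"}}),
        // Change DateTime fields to the Date type
        Typed = Table.TransformColumnTypes(Renamed, {{"Transaction Date", type date}}),
        // Add Year, Quarter and Month as text so visualisations don't try to total them
        WithYear = Table.AddColumn(Typed, "Year", each Text.From(Date.Year([Transaction Date])), type text),
        WithQuarter = Table.AddColumn(WithYear, "Quarter", each "Q" & Text.From(Date.QuarterOfYear([Transaction Date])), type text),
        WithMonth = Table.AddColumn(WithQuarter, "Month", each Date.ToText([Transaction Date], "MMMM"), type text)
    in
        WithMonth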

Once transformed, a data source is not only easier to work with when designing visualisations, it also allows PowerQ&A to work with the data in natural language queries. Below is the same ProjTransPostingCube query after transformation.


 

Enhancing your data with measures

Using PowerPivot within Excel, you can start to add calculated values and KPIs to your data set to use within your visualisations. This functionality is accessed from the PowerPivot tab within Excel; to open the PowerPivot data model, click Manage.


Using the calculation grid at the bottom of the pane you can create calculated measures which will then be available in your visualisations. In the example below we have a new measure for "Actual Cost" which is based on Project Transactions, filtered on "Project – Cost" transactions. A detailed guide of how to create measures can be found here.


Once you've created your measures and saved the data model, they will be available in the field list for PowerView and can be added to visualisations like in the example below.


 

If you would like to align your terminology and calculations with the standard Dynamics AX cubes, review Cube and KPI reference for Microsoft Dynamics AX [AX 2012] for a breakdown of the measures available in the standard cubes and the basis of their calculation.

Merging with Data from Azure Marketplace

One of the most powerful features of PowerQuery is leveraging data from other data sources, including the Azure Marketplace. The Marketplace has a collection of data ranging from population statistics to business and revenue information and reference data to help build your visualisations. One of the most helpful is a data source for date information. While this may sound basic, it's really helpful for presenting visualisations without having to reinvent the wheel in your query.

A great example, and one I have used, is DateStream (http://datamarket.azure.com/dataset/boyanpenev/datestream): a free data source containing a reference of the month name, day name, quarter, etc. for dates.


 

To use a data source from Azure, you first need to sign up on the Azure Marketplace with your Microsoft account (Live): https://datamarket.azure.com/home. Once you've signed up and found the data source you would like to use, subscribe to it through the marketplace; it will then be available when you log in through Excel.

In Excel, the process is similar to connecting to OData. From the PowerQuery tab, select "From Azure" > "From Microsoft Azure Marketplace". You will then be prompted to enter your credentials (the Microsoft account you used at the Azure Marketplace). After signing in you will be presented with a list of the data sources you have subscribed to online.

Once the data is loaded into your data model, you follow the same merge process described earlier to merge the new date fields with your data source. The result is additional columns in your primary query; in the example of the date reference query, we now have the year, quarter and month name to use in visualisations.
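As a sketch of that merge in M – assuming the transformed AX query is loaded as ProjectTransactions with a Transaction Date column, the DateStream feed is loaded as DateStream, and DateKey/YearName/QuarterName/MonthName are its column names (check the actual feed, as these names are assumptions):

    let
        // Left outer join the date reference table onto the transaction table
        Merged = Table.NestedJoin(ProjectTransactions, {"Transaction Date"}, DateStream, {"DateKey"}, "Dates", JoinKind.LeftOuter),
        // Expand only the columns needed for visualisations
        Expanded = Table.ExpandTableColumn(Merged, "Dates", {"YearName", "QuarterName", "MonthName"}, {"Year", "Quarter", "Month"})
    in
        Expanded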


Sharing your Transformed Query with others

After you've invested the time to transform your query into a nice end-user-ready data source, you can save it to your Office 365 Data Catalog. This will allow other users in your organisation to benefit from the time you've invested in transformation and work with the end result in Excel and in their visualisations. You can access the Data Catalog from the PowerQuery ribbon in Excel; you'll need your Office 365 credentials to log in.


A detailed guide to saving and sharing queries can be found here.

Now you should have a clean and friendly data source available within Excel. The next post will cover creating and publishing visualisations.

Thanks,

Clay.


Power BI and Dynamics AX: Part 3: Create and Publish Visualisations

 

This is Part Three, continuing from Part 2: Extract and Transform. This post will focus on the creation and publishing of visualisations.

At this point in the process we have a workbook with a transformed data model – we are ready to create a visualisation to present our data. This post isn't meant to be a detailed guide on how to create a visualisation; you can find one here.

To get started with a PowerView visualisation, open the workbook we built our query in and click PowerView on the Insert tab.


Excel will load a blank PowerView sheet for you to begin creating your visualisation. Assuming your query was created, you should now see your transformed data set in the "PowerView Fields" pane to the right of the screen. (If you don't, jump back here to see how to set up the query.)


You can now start by checking the fields you would like to present in your visualisations, or by dragging and dropping them from the field list onto the PowerView sheet. Once you start adding data you will notice a new "Design" ribbon is available; from here you can change visualisations and the presentation of the data.

Some key things to keep in mind when creating your PowerView visualisations:

  • Keep your visualisations simple
    • PowerView is designed to give high impact visualisations that allow a user to answer questions, or prompt action quickly and easily. Complex and detailed visualisations make this extremely difficult to do.
    • Don't try and answer every business question in one single PowerView sheet – don't be afraid to have multiple sheets within your workbook to present data in different ways.
  • Keep the end user device in mind
    • With mobile apps and HTML 5 support, PowerView can be viewed on a variety of devices, with different interfaces and screen sizes. If you create visualisations with a small font size or chart bars that are too small, users on small touch devices won't be able to interact with them.
    • Some visualisations aren't supported on all modes – web, mobile app, HTML 5. While basic visualisations are, there is limited support across devices for PowerMap – check on the PowerBI team site for the latest support.
  • Leverage Hierarchies for drill downs
    • In Chart, Bar and Pie visualisations you can add multiple fields to the Axis to create a dynamic hierarchy. The visualisation will then allow the user to drill into the values in the chart.


    • While this example is using date values, you can include any fields in the hierarchy – for example Client, Project, Worker to drill into more detail on project analysis.
  • Select your Axis and Legend Fields carefully
    • Keeping in mind the purpose of simple visualisations, if you select a field that has a high number of values (like 1,000 customers) and then add it to the legend on your chart, not only is this going to look terrible, but PowerView will also limit the number of customers it displays and provide you with a sample set. In most cases, you won't want to view only a sample set.
    • Leverage hierarchies to group values so a manageable number of values are presented at any one time. For example: Customer Group, Customer

Publishing your Visualisation

Now you should have a workbook, including your data model and visualisation, ready to share on PowerBI. The publishing process is quite simple; the main consideration at publishing time is security.

Once you publish your workbook, the Dynamics AX security model is no longer applied – data security is only applied at refresh time. For example, if the designer of the report has access to 10 Dynamics AX legal entities, once they publish the report anyone with access to the workbook will see its contents. This is because the information is uploaded into the Excel data model in Office 365 as part of the refresh process.

A key component of your PowerBI strategy needs to be focused on the plans for security related to your data. Security is managed by SharePoint security, based on the workbook. As an example, if you create a folder in SharePoint for "Sales – West Region" and provide access to your West Region team, then only this team will have access to the report once it's published to this folder.

The PowerQ&A service is also based on this security model: when a user asks a question in Q&A, PowerBI will look across all workbooks that user has access to within PowerBI. For this reason it's even more important to ensure the correct security has been set up from day one.

Note: To publish your workbook in PowerBI for Office 365, you will need an Office 365 subscription and an active licence for PowerBI.

Firstly, you will need to create a PowerBI site on your Office 365 tenant if you haven't already done so; to create a new PowerBI site, follow the detailed steps here.

Once you've created your new PowerBI site, you can upload your workbook by selecting "Add" > "Upload File" under the documents section.


Once the workbook is uploaded, you will notice it will automatically enable itself for PowerBI. If you uploaded your document through the SharePoint library, or had already loaded it before adding PowerBI to your Office 365 tenant, you will need to enable the workbook manually. You can do this by clicking "Enable" on the ellipsis menu on the workbook.


One last step is to enable the workbook for PowerQ&A; this is also done from the ellipsis menu on the workbook itself. Workbooks are not enabled automatically, mainly because you may have multiple workbooks with the same data source – you only need to enable one of them for Q&A, as long as they share the same underlying data source.

Now that it's enabled, you can click the workbook to see your visualisations on PowerBI. Done!


 

You'll notice an icon in the bottom right-hand corner that takes you to the HTML 5 view. It's recommended to always check your visualisations in HTML 5 to see how they are presented; depending on the device your users are on, you want to make sure the visualisations are clear and easy to understand. In some cases they render slightly differently in Excel versus HTML 5 and may require some tweaking.


 

You've now got a Dynamics AX data source extracted and published in PowerBI. The next post in this series will talk about PowerQ&A and the data refresh options available for the data source.

Thanks,

Clay.


Power BI and Dynamics AX: Part 4: Data Refresh and Q&A

 

This is part four, continuing from Part 3: Create and Publish Visualisations. This post will focus on Data Refresh within PowerBI and how to get started with Power Q&A.

At this point in the process we have extracted data from Dynamics AX using Power Query, transformed our data, created a visualisation using PowerView and published it to our Power BI site on Office 365. The next step is to set up a data refresh to ensure our data and visualisations stay current. If you don't schedule a data refresh, users will need to open the workbook in a desktop version of Excel and manually refresh the data, which also requires a connection to the AOS for the user refreshing the workbook.

The refresh process has a few components: the initial setup takes several steps to connect your AX instance, but after this initial process new workbooks can be set up quite easily.

Firstly, you need to install the Data Management Gateway (the download is available here). The gateway allows PowerBI to connect back to your on-premise data without a user being involved. You will need to deploy the gateway on a server with internet access and access to the Dynamics AX AOS. Once installed, follow the steps outlined here to configure the gateway with your Office 365 tenant.

Once configured, you can add a data source. This is done from the PowerBI Admin Centre. You will need to create a data source for each Dynamics AX instance (note: instance, not query – so you will need a separate data source for production, dev, test, etc.).

To create the new Data Source, open the PowerBI Admin Centre from the Settings (Gears) option in the top right hand corner from your PowerBI site. From the Admin centre, select Data Sources. Now click the plus sign to add a new data source and select "Power Query"


You'll now be asked to paste in the connection string used for the connection. You need to get this from your Excel workbook. Open your Excel workbook and open the "Connections" form from the Data tab.

Note: if the Connections option is greyed out, it may be because you're on a PowerView sheet. Insert a new blank Excel sheet and the option will become available. (Don't forget to delete the new sheet later)

You'll see your different data sources in the connections window; select one of your AX data sources and click "Properties". On the Definition tab you will see the connection string; copy and paste the entire connection string into the PowerBI Admin Centre.

It should now load the list of connections you have in your data source – you need to complete the details for each data source:

  • Name: Use something informative, example "Dynamics AX – Test Environment"
  • Description: For your internal purposes
  • Gateway: Select the gateway for the connection to use, you may have multiple gateways configured in your environment.
  • Set Credentials: These are the credentials used for the refresh; the account must have access to Dynamics AX to perform the data refresh. It is recommended to use a system account for this refresh, not a specific user's account.

You can now test the connection. The next two steps will allow you to specify users which have admin privileges over the data source and where notifications of errors should be sent.

Once the data source is configured, we can go and schedule our workbook to refresh. Return to your PowerBI site and navigate to the workbook. From the ellipsis menu on the workbook, select the "Schedule Data Refresh" option.


From here you can see the refresh history, as well as configure the refresh for this specific workbook. You must setup the data source in the Admin centre first, otherwise this step will fail. You can find detailed steps on the refresh here.

Using Power Q&A: Natural Language Queries

Q&A is an extremely powerful tool, allowing users to use natural language to explore the data that you've prepared in your data model. This is where your transformation steps really pay off, as Q&A can leverage the friendly names you've given fields to allow users to explore your data.

To use PowerQ&A, from your PowerBI site click "Ask with PowerBI Q&A" in the top right hand corner. You'll be presented with a blank canvas ready for questions.

As an example, using my data set I have asked for "Expense amount by client name" – PowerQ&A has prepared a visualisation of my project expense transactions sorted by client name.


Using the "Explore this result" pane on the right hand side you can start changing visualisations, filters and even the fields that are presented.

Q&A does a lot of work on its own to understand natural language: it identifies synonyms and deals with spelling mistakes, but you'll notice as you begin to explore that PowerQ&A doesn't always get it right. In the drop-down under your question you'll see how Q&A is interpreting it; in my example you can see it is showing "Show amount where transaction type is expense sorted by client name". You will also notice that sometimes Q&A can't understand what you're looking for and a word will be greyed out – this means Q&A didn't understand your phrasing.

Synonyms are one of the easiest ways to teach Q&A about your business. You can manage them through PowerPivot (detailed instructions here) or through PowerQ&A optimisation in Office 365. To manage this through Office 365, open the Power BI Site Settings from your PowerBI site. Within your Site Settings you'll see a Q&A tab containing all your workbooks enabled for Q&A. From the ellipsis menu, select "Optimize for Q&A".


You will be presented with a blank Q&A space in which to ask test questions; you'll also notice in the pane on the right-hand side a summary of the optimisation that has already taken place. The first time you load the workbook you'll see the synonyms and phrasings already generated automatically by Q&A.


 

Starting with the last tab, Usage: this is extremely helpful for understanding how your users are using Q&A, as well as which words or phrasings aren't being understood. IT administrators and/or data officers should regularly monitor this tab to ensure Q&A is providing the right results and is continuing to learn about the organisation.


Synonyms: you can use this tab to add synonyms to your column names. For example, you may internally refer to a "Product" as a "Part" – to teach Q&A, you can add Part as a synonym for Product; now when users use the word "Part" in their questions, Q&A will be able to provide a response.

Phrasing is extremely powerful; it allows you to teach complex terms or expressions used within your organisation. As an example, let's say I asked "Expenses by client" – in my mind I want the same result as my first example, but I didn't say "expense amount". You can see Q&A hasn't interpreted this correctly: what it is showing me is clients which have had expenses, not the actual amounts.


This is where optimisation comes into play: we can teach Q&A that when I say "Expenses" I actually mean the amount of transactions which are of type Expense. So now, under the Phrasing tab, I can add a new phrasing.


As soon as I click ok and ask the same question again "Expenses by client", the new result is shown below.


 

Some key things to keep in mind for Q&A:

  • Invest the time in PowerQuery transformation to make sure you start with a nice clean data set.
  • Plan for a pilot of Q&A before releasing it to your entire organisation, use this time to optimise your data and ensure you have your synonyms and phrasing worked out.
  • Remember access to data through Q&A is driven by your SharePoint security, so plan accordingly.
  • Q&A understands terms like more than, less than, last year, this month – explore with terminology and learn what works best for you.
  • Use the "featured questions" option to save questions and present them to users as they log in. This not only saves time in retyping questions, but also gives new users an introduction on what they can be asking.

Here are some great resources for Power BI; have a look at this content if you're starting out:

This is the last post in this series focused on PowerBI for Office 365. The next post will focus on an example of how to use the new PowerBI functionality with on-premise data by setting up a tabular SSAS database.

Thanks,

Clay.


Power BI and Dynamics AX: Part 5: PowerBI.com and On Premise Data (Preview)

 

This is the final part of the Power BI and Dynamics AX blog series for now; the earlier posts focused on the current functionality available within Power BI for Office 365. This post is going to talk about new functionality that, at the time of writing, is only available in preview. The topics in this post should not be applied to a production environment, as they relate to preview functionality.

The new version of PowerBI.com has introduced a lot of new functionality: dashboards, new visualisations, and new apps for iOS and Windows devices, all making for an extremely rich PowerBI experience on the web. This post, however, focuses on the new live connectivity functionality for on-premise data.


The new connector that has been released allows a live connection to an on-premise Analysis Services tabular database. Unfortunately, at the time of writing, the SSAS database shipped with Dynamics is a multidimensional database, so you can't connect to it directly; however, you can create a tabular database to "flatten" the multidimensional database for use with PowerBI.com. The latency of your data in PowerBI.com will be determined by how often you process your SSAS cube.

As an organisation you need to determine the best connectivity method for PowerBI; this may be through SSAS or through OData as previously described. There are limits to the on-premise option at the moment: given the nature of Q&A natural language queries and the processing required, Q&A is not currently supported on on-premise data, so you must still upload data into the data model for Q&A to work. For more information, start with the documentation from the PowerBI team – Which Power BI experience is right for me?

For this example we are going to use:

  • Dynamics AX 2012 R3 CU8
  • SQL Server 2014 Enterprise
    • Two SSAS Instances Deployed
      • Multidimensional with standard Dynamics AX Cubes Deployed (Find a detailed guide here.)
      • Tabular
  • PowerBI.com (Preview)
  • Power BI Analysis Services Connector (Preview)
  • Visual Studio 2013
  • Microsoft SQL Server Data Tools - Business Intelligence for Visual Studio 2013

Important Note: The SSAS Connector requires Azure Active Directory Sync to be running between your on-premise Active Directory and Azure Active Directory – otherwise you will receive an error when trying to connect to the data source from PowerBI. So for those of you using the Dynamics demo virtual machine, you won't be able to connect. Your SSAS instances will need to be deployed on a machine synced with your Azure Active Directory.

We are going to create our Tabular Database through Visual Studio using a Business Intelligence Project Template that is installed as part of the SQL Server Data Tools pack. For the example today we are going to create a Project analysis tabular model with basic project information along with some key measures.

To start, launch Visual Studio. We are going to create a new SSAS Tabular Project by clicking "New Project", and selecting "Business Intelligence", "Analysis Services Tabular Project"


After selecting the project and entering a name, you will be prompted to connect to an instance of SSAS; this is where you connect to your newly created SSAS tabular instance.

Once you have your new project, you have a few options for how to create the model, depending on your technical ability. For this example we are going to use the wizard, as it's the easiest option. To get started, select "Model" > "Import from Data Source". You'll be presented with the list of data sources you can import from. You have the option of connecting directly to the AX relational database, but I find the cubes easier, as fields like enums and references have been cleaned up and are a lot easier to work with. You also get the benefit of leveraging the calculations already in place in the standard cubes.


For our purposes today, we will use Microsoft Analysis Services. In the next screen you'll enter the connection details for the SSAS Multidimensional Database (Dynamics AX Standard Cubes).


After entering credentials, you'll be prompted for the MDX query that the tabular database should use for this model. You can start with MDX if you wish, or use the "Design" option to launch the visual designer. From this screen we will select the data we want from the cube to be available in our tabular model. You can drag and drop fields from the cubes on the left-hand side into the pane on the right. As you add fields you will see your tabular model refresh to give you a preview of the data that will be available.


Once you've selected the information you want available, click OK; you can now give your query a friendly name and click Finish. If you provided incorrect credentials, you will receive an error and will need to go back and update them with an account that has access to the cubes. When you click Finish, the MDX query will be processed; once it completes, close the window and you will see the results of your MDX query. You can take this time to change column names if you wish, to make the data a little friendlier once it's loaded into PowerBI.


You can now close and save your model. If you would like to double-check its deployment, open SQL Server Management Studio and you should see your newly created tabular database. The SSAS on-premise model uses SSAS security, so this is where you would apply role security to your SSAS data model. Users need access to this database to be able to explore the data in PowerBI (this is a key difference from the OData/workbook model previously discussed).


 

The last step On Premise is to install the PowerBI Analysis Services Connector (Preview). You can find a detailed guide of how to download and install the connector here. The installation will require your PowerBI.com login details (Your Office 365 credentials) as well as the details for the Tabular instance of SSAS.

Now we are ready to expose the new tabular database to PowerBI. You can log into the Preview here. At the time of writing this preview is only available to customers within the United States. Once you've logged in, select "Get Data" > "SQL Server Analysis Services" and then "Connect".


You will be presented with a list of all the SSAS connectors published within your organisation. Find and select yours in the list. You will then see a list of Models which are available on your instance. Click the model you had created earlier and click "Connect"


Once connected, you will have a new dataset available, which is your on-premise data source. (Note: the name will be your SSAS instance; you can rename it in PowerBI.com if required.)


Now your data is available to create reports and dashboards against like any other data source. From the ellipsis menu on the dataset, click "Explore" and you will be taken to a blank PowerView page to start building your report. If you're familiar with creating visualisations in PowerView you can follow the same process; if not, you can find a detailed guide here.

Below is an example of a Project Profitability analysis report based on On-Premise Data on Dynamics AX. The Invoiced Revenue, Cost, Hours and Gross Profit are all based on calculated measures defined in our standard Dynamics AX SSAS Cubes. You can find a detailed reference of the information available in the cube here.


One of the key benefits of the new PowerBI.com is the ability to create dashboards. Dashboards allow visualisations from multiple reports to be pinned to a single dashboard to give you a quick and easy overview of multiple sets of data. You can then drill into that specific report by clicking the visualisation from the dashboard.


This was a very simple example of exposing some Dynamics AX data to explore the preview; users of PowerBI should consider the best connection method for them, along with planning what data should and should not be exposed. The PowerBI technology is changing at a great pace at the moment; it's important to keep up to date with what is coming and how it can help shape your ongoing business intelligence strategy.

For information on the new PowerBI.com platform, try these resources:

Hopefully this has been a helpful insight into some of the new functionality currently in preview, and how it can be applied to Dynamics AX.

Thanks,

Clay.


Sprinkling a little bit of IoT around Contoso's Dynamics business process to make it more intelligent

I am increasingly hearing Dynamics customers asking about IoT solutions from Microsoft. This article describes a recent experience at a Dynamics AX customer; I hope you enjoy it.

The Internet of Things is growing at a phenomenal pace, and there are so many mind-boggling predictions about it that it can be hard to keep up. To make things simple, you can choose to remember just three predictions from Gartner; apparently these numbers will be valid for the next five years :-) so you should be OK barring major changes.

  • IoT will include 26 billion units by 2020
  • IoT and service suppliers will generate incremental revenue exceeding $300B, mostly in services by 2020
  • IoT will result in $1.9T in global economic value-add through sales into diverse end markets

As if talk of petabytes and terabytes of data wasn't enough, now you also have to remember billions of sensors and trillions of dollars! Doesn't it make you wonder where all this is going and how much of it is really true? Well, if it does, I invite you to look at this little sprinkling of IoT devices that we are currently doing at Contoso. If this article interests you and you want to touch and feel some of these devices yourself, please come and see us at Convergence in Atlanta, March 16-19 2015, at a general session and a concurrent session. If you miss those opportunities, you can always find us in the Customer Showcase in the EXPO hall.

I have covered in previous posts what Contoso's business is about. Contoso is the UK's leading foodservice delivery company, supplying a full range of foodstuffs across the UK. It is a leading supplier to restaurants, bars, pubs, cafés and schools. Forty-five thousand customers place over a million orders a year from a catalogue of about five thousand products. More is here.

Since they make deliveries, they have their own fleet to do this. As you can imagine, any delivery of temperature-sensitive products like food or medicine has its fair share of challenges. One of these challenges for the COO of Contoso is how not to feel terribly sorry while he sits and watches thousands of pounds of food rotting in the back of a truck with a broken cooling unit, stuck in snarling traffic through the streets of London on a hot summer day. Half of such an event is enough to justify the meagre cost of implementing the Microsoft Azure based temperature-monitoring solution that Contoso has come up with to address their scepticism about the Internet of Things in general!

On a more serious note, it has happened a few times in hot summer months, with products like ice cream, shrimp and peas, that customers have called up to complain about drivers delivering defrosted stock. There isn't much choice for Contoso other than to provide a credit note when such a thing happens. It's not only the lost revenue but, more importantly, the lost customer that is the real concern in such situations – not to mention the spoilt food. And there is the major risk of serious damage if a whole cooling unit breaks down. This is a nightmare scenario for every distributor engaged in any cold chain.

So, the genesis of what Contoso is building lies in a simple question to Paul Smith, the COO - "Paul, if this is a risk, why don’t you put temperature sensors in your trucks and hook them up to Azure and run some analytics on time series data to alert you before something breaks down? If results tell you something, it would mean there is some substance to all the hype"  

It was a simple enough question and a simple enough proposal. Paul was in as long as we agreed we are not going to hook up sheep pedometers to Azure to measure their health and hence quality of their meat or use automatic devices to feed pets when owners were away deeply engaged in a pub crawl! Convince yourself here - 1, 2, 3, 4, 5, 6

So we looked around for suitable temperature sensors. There are so many sensors on the market sensing so many things that at first it was hard for us, but we quickly established strict qualification requirements and narrowed it down to two: the TI sensor and the Zen measure sensor. The TI tag has more details here. You are welcome to browse the details of their capabilities using the links, but I will summarise here the tests we did and the comparisons we made.

Zen measure is pretty cool, pretty slick and seriously small - it is the size of an overcoat button. It was originally designed for consumer use, for measuring skin temperature of older patients and children in hospitals and care homes.

 
So you can imagine their surprise when they got a request from Contoso to test their sensors for installation in delivery trucks. In the early days (three months ago, that is) Zen didn't measure below-freezing temperatures, but this was quickly changed in an update a week later, allowing Contoso to measure the temperature of its chilled as well as frozen products on the road. (Note: in this market there are many vendors, and they move very quickly to publish changes to their products and apps.)

When you go for something like this, or any device of this nature, you will quickly realise that the most important things are signal strength and battery life. Contoso is still evaluating battery life and comparing it with TI. Contoso procured two sensors at first, and Paul installed them in his home refrigerator and freezer and used the iPhone app to test the features. He felt pretty excited about its capabilities. Now, three months into testing, several new Zen sensors have been procured and installed in various places, including a few in delivery trucks, and none so far indicate much loss in battery life. This is excellent! The TI sensor, on the other hand, is capable of sensing not only temperature but also humidity, motion, etc., making it very likely to have a shorter battery life than the Zen. Both communicate using BLE (Bluetooth Low Energy), both are very easily discoverable on an iPhone, and both meet the connection strength and connection range criteria.


Now for the major drawback: none of the sensor vendors we looked at have a Windows app. (Note: this is pretty common; vendors are building for iOS and Android, not so much for Windows. However, this is likely to change quickly with Windows 10, Raspberry Pi support and a number of other things.) We tapped into teams at Microsoft and found that Chris had already made a WP app for the TI sensor. He agreed to build a WP app for the Zen. He quickly got a Zen sensor for himself, and it took him a few hours, I guess, to have the app working. A number of issues were discovered and resolved; some are included here to give you a flavour of what's in store if you go down the IoT road.

Some mundane stuff: Contoso spent a good few weeks discussing where exactly to mount the sensors in the delivery trucks. They eventually called in refrigeration engineers and got their help to settle on the best location within the delivery truck to mount the sensors. The TI sensor comes with a little hook that can be used to hang it on the walls of the truck; the Zen doesn't come with anything, so a DIY casing was built to house the sensor so it could be safely stored and mounted.

But let's say we get a sensor, we get it working, and we have an app to see the reading from anywhere. Now what? After all, this is great for a hobbyist, but to be commercially useful to Contoso, this data has to feed into some business process. Isn't it?

It's important to recall from one of the previous posts that Contoso has a Windows CE based drivers' app that is used to make drop-offs to customers. Some customers ask for a temperature reading of the products being delivered. Today, drivers do this by walking to the back of the truck, reading the hard-wired thermometers manually and entering the reading by hand into the drivers' app, which then ensures that the printed invoice contains the reading.

Naturally, this brought up all the questions related to integration: how exactly do you integrate the Zen app and/or the data from the sensor with everything else going on in the business process? Contoso is also building a new WP app for drivers that will run on Microsoft Lumia 635 phones running WP 8.1. The backend is Dynamics AX, and it uses a number of Azure technologies; more on that is available in the links at the bottom of the main Contoso page. So this is what is happening with the Zen data: with the new drivers' app, when the driver has reached the drop-off destination and is ready to pick goods from the truck and drop them at the customer location, the app interrogates both the chiller and freezer Zen sensors for the current temperature reading. This reading is then stored in the SQLite database on the local Microsoft Lumia device. Chris also suggested keeping the interrogation to a minimum, as more interrogation means shorter battery life. Contoso has a regular maintenance schedule for its delivery trucks, which go through check-ups every ten weeks, and the goal is that the Zen sensors, with all the love and care showered on them, should not need battery changes on a cycle shorter than that. These are very important considerations from a business process point of view; disruption to the regular business process must be kept to a minimum to justify building what is essentially insurance against a future event of some likelihood.

Once the readings are in the local SQLite database on the Lumia device, the data is sent to SQL Azure storage along with all the other data from the drivers' app. The reading is then picked up and transferred to the Dynamics AX database, from where it is used further down in the business process. Note that this is critical for offline scenarios, since connectivity on the road is poor in most places and really poor in the UK. There is also a Bluetooth printer that drivers carry with them to print invoices at their drop-offs. This led to another challenge, as for a while we could not get both the Bluetooth printer and the BLE sensor to pair with the Lumia device at the same time; soon enough this issue was resolved. The drivers can now not only print the invoices using the Bluetooth printer, but also print the temperature at drop-off time on the invoice itself for customers who want that level of service. This helps customer service folks deal better with customer complaints about defrosted goods. Automated thermometer readings in the app ensure reliability – the reading is captured for each and every drop-off – and improve efficiency – drivers don't have to read the thermometers manually as in the old system.


There was no end to the surprising problems we encountered while building this solution. After full deployment, Contoso will have over 100 drivers using 100 Lumia phones in 100 vehicles, with 200 Zen sensors installed, on any given day. After a long period of contemplation, Paul decided to let his drivers keep the Lumia phones with them after the workday is over! So each driver now has his own phone, and each truck has two sensors. However, drivers and trucks are in no way fixed: any driver can drive any vehicle on any given day on any given route. This meant that after drivers were assigned routes (based on complicated math – a topic for another day!) and trucks, they would have to pair the two BLE sensors in that truck to their phones before setting off for the day. This is a step too complicated to add to an already complex and very well-oiled business process. After all, the whole purpose of this exercise is to make the process more intelligent, more efficient and more reliable. How could Paul leave this bit to chance? What if a driver couldn't pair properly, or the pairing itself failed, or he simply didn't want to? So Chris worked with the team to build something called promiscuous pairing, which ensures that all the Microsoft Lumia phones are permanently paired to all the Zen sensors.

The WP drivers' app also has two additional capabilities.

There are two background tasks running: one to capture battery life from the sensor, and another to capture a temperature reading every couple of minutes from every sensor in over a hundred delivery trucks. Keeping in mind the offline scenarios due to poor connectivity, these captured time series are also stored locally on the Lumia phones and then streamed into Azure Event Hub. There are two plans for these time series. One is to take the battery life time series from all trucks and feed it to an Azure ML model that does a simple time series forecast to predict when a battery is expected to run out. This gives Contoso a nice clean way to predict when the battery will run out and use this knowledge to proactively have the required batteries available for drivers to change before they head out for the day, so the sensors are always on while on the road. No excuses!

The temperature time series, on the other hand, is being used in a number of ways. Contoso is using Azure Stream Analytics to run simple temporal queries on the streaming data: show me the average freezer reading for the last 30 minutes for all vehicles on the road, alert me if the average reading for the last 30 minutes is beyond a certain threshold for any vehicle on the road, and so on. Since they already capture GPS coordinates for all vehicles, they are planning to run finer-grained, GPS-aware temporal queries. Contoso also has a Power BI dashboard (if you are keen to know more about how to use Power BI with AX, see Clay's great series here) in the transport office that displays current vehicle locations and superimposes the average temperature readings from chillers and freezers onto it. Since this is quite new, they are in the process of establishing policies on what to do when temperature readings cross certain thresholds – at what point to call drivers back to the base, and so on. Eventually they want to use the temperature time series data for a predictive maintenance Azure ML model that can predict when a particular cooling unit should be sent for an overhaul before it breaks down. There is also a plan in the works to use the anomaly detection Azure ML app to see if it reveals anything interesting and helps in any way.

All in all, it was somewhat surprising that a consumer device and a consumer app that worked pretty much out of the box for consumer scenarios had to go through rigorous testing and a number of modifications to reach a point where Contoso considers this a feasible solution to address their scepticism about the Internet of Things! Kudos to this team for their incessant focus on customer service, efficiency and technological innovation.

I hope this was an interesting live example of how an automated process can be made more intelligent, and of how many new Microsoft Azure tools and technologies can be connected to work with your business processes in Dynamics AX. If you are a Dynamics customer and want to learn more about this solution, or want to deploy another IoT solution, please write to me at aksheyg@microsoft.com.

Contributions: Chris Lovett, Akshey Gupta


AX Retail: Create a test site using ecommerce checkout controls

In Dynamics AX 2012 R3 CU8 we released checkout controls. In order to create and browse a test website using these integrated controls, please follow these steps:

Go to Demo VM

  • Go to "my documents" and then open the folder "retail SDK CU8"
  • Open the solution file located at "\Online Channel\Clients\Web\Storefront\ Web.Storefront.sln" in VS 2013
  • Create a strong name key named "strongnamekey.snk" and place it adjacent to the bldver.cs file
  • Compile your project and publish it to a folder
  • Grab that published folder and create a new website in IIS on the demo machine
  • Make sure the app-pool account is "contoso\administrator"
  • Now go to the folder where the site is running, open commerceruntime.config and change the defaultOperatingUnitNumber value to 068

  <storage defaultOperatingUnitNumber="068" />

  • Open web.config and change database connection string

   <add name="CommerceRuntimeConnectionString" connectionString="Server=localhost;Database=RetailContosoStore;Trusted_Connection=Yes"/>

  • In the same web.config, update the section below with the correct strong name key you created above:

<section name="ecommerceControls" type="Microsoft.Dynamics.Retail.Ecommerce.Sdk.Controls.ControlsSection,Microsoft.Dynamics.Retail.Ecommerce.Sdk.Controls, Version=6.3.0.0, Culture=neutral, PublicKeyToken=fd6b8d0172171ea7, processorArchitecture=MSIL"/>

  • Open a browser and browse to the new website you just created.

 


Announcement: Upcoming Advanced Solution Architect and Development workshops for AX2012 R3

Extensibility in Dynamics AX 2012 R3 CU8 (CRT, RetailServer, MPOS) Part 2 – New data entity

Overview

This blog expands on the knowledge you gained in part 1 of the series (http://blogs.msdn.com/b/axsa/archive/2015/02/17/extensibility-in-dynamics-ax-2012-r3-cu8-crt-retailserver-mpos-part-1.aspx). In case you get stuck, I recommend that you make yourself familiar with part 1 first; some of the information from there is required and assumed.

The steps are based on the Dynamics AX 2012 R3 CU8 VM image that can be requested via LCS (https://lcs.dynamics.com/Logon/Index) which most partners have access to (Contoso sample).  Alternatively, PartnerSource (https://mbs.microsoft.com/partnersource/northamerica/) can be used to download the VM as well. Make sure you get the CU8 version.

It is recommended to review some of the online resources around the Retail solution, either now or during the processes of following this blog (https://technet.microsoft.com/en-us/library/jj710398.aspx).

The areas this blog covers are:

- AX: Adding a new data entity, related to a retail store, and populating it by means of a job (no UI)
- CDX: configuring CDX in order to include the new table in data synchronizations
- CRT: adding a new data entity and new service
- RetailServer: exposing a new controller for the new entity; adding new ODATA action
- MPOS: adding plumbing to call RetailServer; updating UI to expose data

A future blog will cover topics and suggestions for changing existing CRT code.  Stay tuned for that.

The changed code is available in ZIP format and includes only the files that have been added or changed. It can be applied (after backing up your existing SDK) on top of the "Retail SDK CU8" folder. Note that the ZIP file includes the changes from part 1 as well.

This sample customization will update the MPOS terminal to show more detailed opening times for a store. Remember that a store worker can look up item availability across multiple stores. Imagine that as part of that flow, the worker would like to advise the customer whether a particular store is open or not. See the screenshots below for the UI flow:


 

Notes:

- The sample is intended to illustrate the process of a simple customization. It is not intended for production use.
- All changes are made under the login contoso\emmah. If you use a different account, or different demo data altogether, please adjust the steps below accordingly.

 

High-level steps

 

The following steps need to be carried out:

 

  1. Setup the Retail SDK CU8 for development (see part 1)
  2. Prepare MPOS to be run from Visual Studio from unchanged SDK code (see part 1)
  3. Activate the MPOS device (see part 1)
  4. Include new entity in AX
  5. Configure CDX to sync new entity
  6. Channel schema update and test
  7. Add CRT entity, service, request, response, datamanager and RetailServer controller with new action
  8. Update client framework to call RetailServer endpoint
  9. Update client framework channel manager with new functionality
  10. Update client framework’s view model
  11. Update MPOS’s view to consume updated view model
  12. Test

 

Detailed steps

 

Setup the Retail SDK CU8 for development

See part 1 at http://blogs.msdn.com/b/axsa/archive/2015/02/17/extensibility-in-dynamics-ax-2012-r3-cu8-crt-retailserver-mpos-part-1.aspx.

 

Prepare MPOS to be run from Visual Studio from unchanged SDK code

See part 1 at http://blogs.msdn.com/b/axsa/archive/2015/02/17/extensibility-in-dynamics-ax-2012-r3-cu8-crt-retailserver-mpos-part-1.aspx.

 

Activate the MPOS device

See part 1 at http://blogs.msdn.com/b/axsa/archive/2015/02/17/extensibility-in-dynamics-ax-2012-r3-cu8-crt-retailserver-mpos-part-1.aspx.

  

Include new entity in AX

 

To store the opening hours per store, we will be using a new table called ISVRetailStoreDayHoursTable. It stores the day and the opening and closing times for each store.

In the ZIP folder you can find the XPO file at SampleInfo\Sample2_StoreDayHours.xpo. Import this file into AX; it includes two items: the table and a simple job that populates sample data for the Houston store.

Run the job named Temp_InsertData at least once. Then inspect the table with SQL Server Management studio:


Configure CDX to sync new entity

In AX, add a new location table and the appropriate columns to the AX 2012 R3 schema (USRT/Retail/Setup/Retail Channel Schema)


 Create a new scheduler subjob (USRT/Retail/Setup/Scheduler subjobs)


Click transfer field list and make sure that the fields match as above.

Add the new subjob to the 1070 Channel configuration job (USRT/Retail/Setup/Scheduler Job)


Edit the table distribution XML to include the new table (USRT/Retail/Setup/Retail Channel Schema)

 

The easiest approach is to copy the XML out of the text box, edit it in an external XML editor, and then paste it back in. The change you need to make is to add this XML fragment:

  <Table name="ISVRETAILSTOREDAYHOURSTABLE">
    <LinkGroup>
      <Link type="FieldMatch" fieldName="RetailStoreTable" parentFieldName="RecId" />
    </LinkGroup>
  </Table>

in two places. Both times, it needs to be added inside the RetailStoreTable table XML node.

At the end, click Generate Classes (USRT/Retail/Setup/Retail Channel Schema/AX 2012 R3)

Channel schema update and test

The equivalent change to the table schema must be made on the channel side, and it has to be done for all channel databases. Use SQL Server Management Studio to create the table. Since this is a sample, we won't add stored procedures; we just do that in code. However, it is recommended to use stored procedures for performance and security reasons.


The file can be found in the ZIP folder at SampleInfo\ChannelSchemaUpdates.txt

Now, go back to AX and run the 1070 job (USRT/Retail/Periodic/Distribution Schedule/1070/Run now)

Then, verify in AX that the job succeeded (USRT/Retail/Inquiries/Download Sessions/Process status messages). You should see a status of “Applied” for the stores. 

 

Add CRT entity, service, request, response, datamanager and RetailServer controller with new action

Use the solution in the ZIP archive at SampleInfo\RSCRTExtension\RSCRTExtension.sln and inspect the code.

Since this part is based on part 1, I assume you have:

  • already configured the pre.settings file (for rapid deployment as part of the build into RetailServer's bin directory),
  • already configured RetailServer's version of commerceRuntime.config to include the new CRT extension dll, and
  • already configured RetailServer's web.config file to include our new extension dll.

Here is a code map view of the code changes required:

[Screenshot]

 You can see that we need a CRT request, response, service, dataaccessor and entity. Additionally, RetailServer is customized to include a new StoreDayHoursController that exposes a new ODATA endpoint, GetStoreDaysByStore.  That endpoint uses the CRT and the request object to get a response. It does not use the data service directly.

If you have configured everything correctly, compiled the solution and opened the OData metadata URL of RetailServer (http://ax2012r2a.contoso.com:35080/RetailServer/v1/$metadata), you should see the new action:

[Screenshot]

 

Update client framework to call RetailServer endpoint

 

The first step is to make MPOS aware of the new entity and the new endpoint. This is basically proxy code, similar to what tools like wsdl.exe would generate for .NET web services. The Retail team is investigating providing a tool for automatic regeneration in a future release.

CommerceTypes.ts

This file defines the new entity, both as an interface and as an implementing class.

    export interface StoreDayHours {
        DayOfWeek: number;
        OpenTime: number;
        CloseTime: number;
        ExtensionProperties?: Entities.CommerceProperty[];
    }
    export class StoreDayHoursClass implements StoreDayHours {
        public DayOfWeek: number;
        public OpenTime: number;
        public CloseTime: number;
        public ExtensionProperties: Entities.CommerceProperty[];

        /**
         * Construct an object from odata response.
         *
         * @param {any} odataObject The odata result object.
         */
        constructor(odataObject?: any) {
            odataObject = odataObject || {};
            this.DayOfWeek = odataObject.DayOfWeek ? odataObject.DayOfWeek : null;
            this.OpenTime = odataObject.OpenTime ? odataObject.OpenTime : null;
            this.CloseTime = odataObject.CloseTime ? odataObject.CloseTime : null;
            this.ExtensionProperties = undefined;
            if (odataObject.ExtensionProperties) {
                this.ExtensionProperties = [];
                for (var i = 0; i < odataObject.ExtensionProperties.length; i++) {
                    this.ExtensionProperties[i] = odataObject.ExtensionProperties[i] ? new CommercePropertyClass(odataObject.ExtensionProperties[i]) : null;
                }
            }
        }
    }
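
For illustration, here is a minimal usage sketch of the class above, showing how an OData result object is mapped onto the entity. The sample payload values are assumptions for illustration only, not output from the actual service:

    // Hedged sketch: construct the entity from an assumed OData payload.
    var odataObject: any = { DayOfWeek: 1, OpenTime: 800, CloseTime: 1700 };
    var hours: StoreDayHours = new StoreDayHoursClass(odataObject);
    // hours.DayOfWeek === 1, hours.OpenTime === 800, hours.CloseTime === 1700
    // hours.ExtensionProperties stays undefined because the payload carried none.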

 

CommerceContext.ts

This is a class that exposes the ODATA data service to the rest of MPOS.

 public storeDayHoursEntity(storeId?: string): StoreDayHoursDataServiceQuery {
  return new StoreDayHoursDataServiceQuery(this._dataServiceRequestFactory, "StoreDayHoursCollection", "StoreDayHours", Entities.StoreDayHoursClass, storeId);
 }

    export class StoreDayHoursDataServiceQuery extends DataServiceQuery<Entities.StoreDayHours> {

        constructor(dataServiceRequestFactory: IDataServiceRequestFactory, entitySet: string, entityType: string, returnType?: any, key?: any) {
            super(dataServiceRequestFactory, entitySet, entityType, returnType, key);
        }

        public getStoreDaysByStoreAction(storeId: string): IDataServiceRequest {
            var oDataActionParameters = new Commerce.Model.Managers.Context.ODataActionParameters();
            oDataActionParameters.parameters = { StoreNumber: storeId};

            return this.createDataServiceRequestForAction('GetStoreDaysByStore', Entities.StoreDayHoursClass, 'true', oDataActionParameters);
        }
    }

 

Update client framework channel manager with new functionality

Now that the low-level proxy code is done, we need to expose the new functionality in a more consumable way to the rest of the application framework. An appropriate location for the new functionality is the IChannelManager, as it already encompasses similar functionality of a more global, channel-related nature.

IChannelManager.ts:

    getStoreDayHoursAsync(storeId: string): IAsyncResult<Entities.StoreDayHours[]>;

ChannelManager.ts:

  public getStoreDayHoursAsync(storeId: string): IAsyncResult<Entities.StoreDayHours[]> {
  Commerce.Tracer.Information("ChannelManager.getStoreDayHoursAsync()");

  var query = this._commerceContext.storeDayHoursEntity();
  var action = query.getStoreDaysByStoreAction(storeId);

  return action.execute<Entities.StoreDayHours[]>(this._callerContext);
 }

 

Update client framework’s view model

The view model is an abstraction of the view that exposes public properties and commands for any view implementation to use. Here are the three things we need to do in order to customize the existing StoreDetailsViewModel:

  • a variable that holds the result for the view to bind to, and a variable called isStoreDayHoursVisible that the view can use to toggle the visibility of the UI:

        public storeDayHours: ObservableArray<Model.Entities.StoreDayHours>;
        public isStoreDayHoursVisible: Computed<boolean>;

  • data initialization in the constructor:

    // empty array
  this.storeDayHours = ko.observableArray([]);
  this.isStoreDayHoursVisible = ko.computed(() => {
   return ArrayExtensions.hasElements(this.storeDayHours());
  });

  • data retrieval function to be called by the view

        public getStoreDayHours(): IVoidAsyncResult {
            var asyncResult = new VoidAsyncResult(this.callerContext);
            Commerce.Tracer.Information("StoreDetailsViewModel.getStoreDayHours()");

            this.channelManager.getStoreDayHoursAsync(this._storeId)
                .done((foundStoreDayHours: Model.Entities.StoreDayHours[]) => {
                    this.storeDayHours(foundStoreDayHours);
                    Commerce.Tracer.Information("StoreDetailsViewModel.getStoreDayHours() Success");
                    asyncResult.resolve();
                })
                .fail((errors: Model.Entities.Error[]) => {
                    asyncResult.reject(errors);
                });

            return asyncResult;
        }
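
As a quick illustration of the computed observable defined above, here is a hedged sketch of how isStoreDayHoursVisible follows the contents of the array (the view model construction and the entity values are assumed for illustration):

    // Hedged sketch: visibility is driven purely by whether storeDayHours has elements.
    viewModel.isStoreDayHoursVisible();   // false - storeDayHours() starts as an empty observable array
    viewModel.storeDayHours.push({ DayOfWeek: 1, OpenTime: 800, CloseTime: 1700 });
    viewModel.isStoreDayHoursVisible();   // true - the ko.computed re-evaluates once the array has elements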

 

Update MPOS's view to consume the updated view model

StoreDetailsView.ts already calls into the view model to get the store distance. For simplicity, we just hook into its done() handler to call the new function:

                    this.storeDetailsViewModel.getStoreDistance()
                        .done(() => {
                            this._storeDetailsVisible(true);
                            this.indeterminateWaitVisible(false);

                            this.storeDetailsViewModel.getStoreDayHours()
                                .done(() => {
                                    this._storeDetailsVisible(true);
                                    this.indeterminateWaitVisible(false);
                                })
                                .fail((errors: Model.Entities.Error[]) => {
                                    this.indeterminateWaitVisible(false);
                                    NotificationHandler.displayClientErrors(errors);
                                });
                        });

Lastly, we update the html to expose the data:

[Screenshot]
 

Please use the sample code in the ZIP archive as mentioned above. It also includes a few other changes not detailed here, for example in resources.resjson and Converters.ts.
 

Issues and solutions:

If you cannot run MPOS from the Pos.sln file because it is already installed, uninstall the app first. This link may also be helpful: http://blogs.msdn.com/b/wsdevsol/archive/2013/01/28/registration-of-the-app-failed-another-user-has-already-installed-a-packaged-version-of-this-app-an-unpackaged-version-cannot-replace-this.aspx 

Happy coding,

Andreas

 

Original link: http://blogs.msdn.com/b/axsa/archive/2015/05/20/extensibility-in-dynamics-ax-2012-r3-cu8-crt-retailserver-mpos-part-2-new-data-entity.aspx (go back to it for zip file download...)

 


Announcement: Upcoming Advanced Solution Architect workshop for AX2012 R3 in Finland

We have an Advanced Solution Architect Workshop for Microsoft Dynamics AX 2012 R3 scheduled for November 9-11 in Finland.

This three-day instructor-led workshop is designed for solution architects working for partners that are engaged in large, complex or multi-location projects where systems integration is a key requirement.

The participants of this workshop are expected to have experience in projects where they have driven the definition of business, technical, and architectural requirements, and where they are accountable for validating the solution.

The objectives of the workshop include:

  • Learning how to articulate and architect a solution by using the key concepts of the foundational features in Microsoft Dynamics AX 2012 R3.

  • Developing an understanding of how to create a solution to a complex multi-national customer implementation by using the capabilities of AX 2012 R3.

  • Understanding the concepts behind architecting an enterprise solution with AX 2012 R3.

Training Dates: November 09, 2015 - November 11, 2015 
Training Venue: Microsoft Talo, CD-Building, Keilalahdentie 2-4, Espoo 02150, Finland  

 More information and registration links are here


Power BI and Dynamics AX: Part 2: Extract and Transform

 

This is part two continuing from Power BI and Dynamics AX: Part 1: Introduction.

This post will focus on the steps involved in extracting data from Dynamics AX into PowerQuery and transforming that data ready to be exposed in your visualisations, or used in PowerQ&A.

Extracting Data via OData

The first few steps of this process would typically be driven by an IT Professional within the business. It requires some knowledge of the structure of data within Dynamics AX to identify the correct query.

To extract data from Dynamics AX, we will be using OData (Open Data Protocol) services. OData allows us to expose a specific query from Dynamics AX for use in external applications. Which queries are available through OData is managed from within AX in Document Data Sources.


Organisational Administration \ Setup \ Document Management \ Document Data Sources

A detailed guide to setting up your Data source can be found here.

Once you have setup your data source, you can view all your published feeds from this URL (Replace <Server Name> with your AOS Server):

http://<Server Name>:8101/DynamicsAx/Services/ODataQueryService/

Note: some queries due to the way they have been developed or certain joins do not expose through OData Properly. The best way to test is a quick connection from Excel after publishing the OData service to see if the query has worked properly.

Some things to keep in mind when selecting your query:

  • Invest the time to find the right query, there are a huge number of Queries available from standard AX which you should review before trying to extract multiple tables and joining them externally.
  • OData Protocol filters are not supported, so if you require a filtered dataset to be published, you need to apply the filter from within the AX Query or from within PowerQuery. To do this in AX you can do this through the AOT, or by selecting the “Custom Query” option on the Document Data Sources form when creating a new data source.
  • Each record in the response must have a unique primary key. AOT View objects which don’t have a unique key will not be presented when you try and query the OData Service.
  • If you try and access the URL from a web browser and you receive a “Internal Server Error” you may have published a ‘bad’ query, try setting them to inactive and reactivating them one by one to find the problem query.

Once you have your OData service ready to go, we are ready to connect to the data from PowerQuery. PowerQuery will have two main functions when working with data; Extraction – from Dynamics AX as well as other data sources, and Transformation – to remove unwanted columns, rename columns and tidy up our data to make it as user friendly as possible.


PowerQuery is accessed through a ribbon within Microsoft Excel. If you don’t have PowerQuery installed, you can get it here.

A detailed guide of how to connect to your OData source from PowerQuery can be found here.

Important Note: If you plan to use the scheduled refresh functionality within Power BI, you need to ensure the correct case has been used on the OData URL when entered into PowerQuery. At the time of writing the authentication process for Power BI refresh lookups credentials for the OData service with the following case:

http://<Server Name>:8101/DynamicsAx/Services/ODataQueryService/

If you have any characters upper/lower case different to the above – the authentication will fail on refresh.

Transform your Data

After you’ve connected to your OData source and pulled it into PowerQuery, you can now leverage the tools within PowerQuery to transform your data ready for end users and visualisations.

The data exposed from Dynamics will come out with technical names and often unwanted data; below is an example of the ProjTransPostingCube query from Dynamics AX 2012 R3 CU8.

[Screenshot]

A detailed guide of how to perform transformations can be found here.

The key transformations to implement when working with Dynamics AX data:

  • Remove unwanted columns.
  • Rename column names to user-friendly names.
    • Example: "ProjTable_Name" to "Project Name".
    • This step is key for PowerQ&A to support natural language queries.
  • Change DateTime-formatted fields to the "Date" data type.
    • Example: "10/02/2015 00:00:00" to "10/02/2015".
  • Merge with other Dynamics AX or internal data sources to provide a combined dataset to end users.
    • More details on a merge example can be found here.
  • Insert Year, Quarter and Month columns to be used in visualisations.
    • If you select a date field in the query, you can add these columns by using the Date option on the "Add Column" ribbon.
    • Once added, ensure you change the data type to "Text"; otherwise, when you include it in visualisations, it will try to total years as real number values.

Once transformed, a data source is not only easier to work with when designing visualisations, it also allows PowerQ&A to work with the data in natural language queries. Below is the same ProjTransPostingCube query after transformation.

[Screenshot]

 

Enhancing your data with measures

Using PowerPivot within Excel, you can start to add calculated values and KPIs to your data set to use within your visualisations. This functionality is accessed from the PowerPivot tab within Excel; to open the PowerPivot data model, click Manage.


Using the calculation grid at the bottom of the pane, you can create calculated measures which will then be available in your visualisations. In the example below we have a new measure for "Actual Cost", which is based on project transactions filtered on "Project – Cost" transactions. A detailed guide of how to create measures can be found here.

[Screenshot]

Once you’ve created your measures and saved the data model, they will be available in the field list for PowerView and can be added to visualisations like in the example below.

[Screenshot]

 

If you would like to align your terminology and calculations to the standard Dynamics AX cubes, review the Cube and KPI reference for Microsoft Dynamics AX [AX 2012] for a breakdown of the measures available in the standard cube and the basis of the calculation.

Merging with Data from Azure Marketplace

One of the most powerful features of PowerQuery is leveraging data from other data sources, including the Azure Marketplace. The Marketplace has a collection of data from population statistics, business and revenue information and reference data to help build your visualisations. One of the most helpful is a data source for Date information. While this may sound basic, it’s really helpful in presenting visualisations without having to reinvent the wheel in your query.

A great example, and one I have used, is DateStream (http://datamarket.azure.com/dataset/boyanpenev/datestream). It is a free data source which contains a reference of the month name, day name, quarter, etc. for dates.


 

To use a data source from Azure, you first need to sign up for the Azure Marketplace with your Microsoft account (Live): https://datamarket.azure.com/home. Once you've signed up and found the data source you would like to use, you subscribe to it through the data market. Then, when you log in through Excel, it will be available to you.

In Excel, the process is similar to connecting to OData. From the PowerQuery tab select "From Azure" > "From Microsoft Azure Marketplace". You will then be prompted to enter your credentials (using the Microsoft account you used at the Azure Marketplace). After signing in you will be presented with a list of the data sources you have subscribed to online.

Once the data is loaded into your data model, you follow the same merge process described earlier to merge the new date fields with your data source. The result is additional columns in your primary query. In the example of the date reference query, we now have the Year, Quarter and Month name to use in visualisations.


Sharing your Transformed Query with others

After you've invested the time to transform your query into a nice, end-user-ready data source, you can save it to your Office 365 Data Catalog. This will allow other users in your organisation to benefit from the time you've invested in transformation and work with the end result in Excel and in their visualisations. You can access the Data Catalog from the PowerQuery ribbon in Excel; you'll need your Office 365 credentials to log in.


A detailed guide to saving and sharing queries can be found here.

Now you should have a clean and friendly data source available within Excel. The next post will talk about creating and publishing visualisations.

Thanks,

Clay.

Power BI and Dynamics AX: Part 3: Create and Publish Visualisations

 

This is Part Three, continuing from Part 2: Extract and Transform. This post will focus on the creation and publishing of visualisations.

At this point in the process we have a workbook with a transformed data model – we are ready to create a visualisation to present our data. This post isn't meant to be a detailed guide on how to create a visualisation; you can find one here.

To get started with a PowerView visualisation, from the workbook in which we built our query, click PowerView on the Insert tab.


Excel will load a blank PowerView sheet for you to begin creating your visualisation. Assuming your query was created correctly, you should now see your transformed data set in the "PowerView Fields" pane to the right of the screen. (If you don't, jump back here to see how to set up the query.)


You can now start by checking the fields you would like to present in your visualisations, or by dragging and dropping them from the field list onto the PowerView sheet. Once you start adding data you will notice a new "Design" ribbon is available; from here you can change visualisations and the presentation of the data.

Some key things to keep in mind when creating your PowerView visualisations:

  • Keep your visualisations simple
    • PowerView is designed to give high impact visualisations that allow a user to answer questions, or prompt action quickly and easily. Complex and detailed visualisations make this extremely difficult to do.
    • Don't try to answer every business question in one single PowerView sheet – don't be afraid to have multiple sheets within your workbook to present data in different ways.
  • Keep the end user device in mind
    • With mobile apps and HTML 5 support, PowerView can be viewed on a variety of devices, with different interfaces of different sizes. If you create visualisations with a small font size or with chart bars that are too small, users on small touch devices won't be able to interact with them.
    • Some visualisations aren't supported in all modes – web, mobile app, HTML 5. While basic visualisations are, there is limited support across devices for PowerMap – check the PowerBI team site for the latest support.
  • Leverage Hierarchies for drill downs
    • In Chart, Bar and Pie visualisations you can add multiple fields to the Axis to create a dynamic hierarchy. The visualisation will then allow the user to drill into the values in the chart.

      [Screenshot]

    • While this example is using date values, you can include any fields in the hierarchy – for example Client, Project, Worker to drill into more detail on project analysis.
  • Select your Axis and Legend Fields carefully
    • Keeping in mind the purpose of simple visualisations, if you select a field that has a high number of values (like 1,000 customers) and then add it to the legend on your chart, not only is this going to look terrible, but PowerView will also limit the number of customers it displays and provide you with a sample set. In most cases, you won't want to view only a sample set.
    • Leverage hierarchies to group values so a manageable number of values are presented at any one time. For example: Customer Group, Customer

Publishing your Visualisation

Now you should have a workbook, including your data model and visualisation, ready to share on PowerBI. The publishing process is quite simple; the main consideration at time of publishing is security.

Once you publish your workbook, the Dynamics AX security model is no longer applied – data security is only applied at the time of refresh. For example, if the designer of the report has access to 10 Dynamics AX legal entities, once they publish the report anyone with access to the workbook will see the contents. This is because the information is uploaded into the Excel data model in Office 365 as part of the refresh process.

A key component of your PowerBI strategy needs to be focused on the plans for security related to your data. The security will be managed by SharePoint security, based on the workbook. As an example, if you create a folder in SharePoint for "Sales – West Region" and provide access to your West Region team, then only this team will have access to the report once it's published to this folder.

The PowerQ&A service is also based on this security model, when a user asks a question on Q&A, PowerBI will look across all workbooks that user has access to within PowerBI. For this reason it’s even more important to ensure the correct security has been setup from day 1.

Note: To publish your workbook in PowerBI for Office 365, you will need an Office 365 subscription and an active licence for PowerBI.

Firstly, you will need to create a PowerBI site on your Office 365 tenant if you haven't already done so. To create a new PowerBI site, follow the detailed steps here.

Once you’ve created your new PowerBI site, you can upload your workbook by selecting “Add” > “Upload File” under the documents section.


Once the workbook is uploaded, you will notice it automatically enables itself for PowerBI. If you uploaded your document through the SharePoint library, or had already loaded it before adding PowerBI to your Office 365 tenant, you will need to enable the workbook manually. You can do this by clicking "Enable" on the ellipsis menu on the workbook.


One last step is to enable the workbook for PowerQ&A; this is also done from the ellipsis menu on the workbook itself. Workbooks are not automatically enabled, mainly because you may have multiple workbooks with the same data source – you don't need to enable all of them for Q&A, only one, as long as it's the same data source in the background.

Now that it's enabled, you can click the workbook to see your visualisations on PowerBI. Done!


 

You'll notice in the bottom right-hand corner there is an icon to take you to the HTML 5 view. It's recommended to always view your visualisations in HTML 5 to see how they are presented; depending on the device your users are using, you want to make sure the visualisations are clear and easy to understand. In some cases you will see it renders slightly differently in Excel vs HTML 5 and may require some tweaking.


 

You’ve now got a Dynamics AX data source extracted and published in PowerBI. The next post in this series will talk about PowerQ&A and the data refresh options available for the data source.

Thanks,

Clay.

Power BI and Dynamics AX: Part 4: Data Refresh and Q&A

 

This is part four, continuing from Part 3: Create and Publish Visualisations. This post will focus on Data Refresh within PowerBI and how to get started with Power Q&A.

At this point in the process we have extracted data from Dynamics AX using Power Query, transformed our data, created a visualisation using PowerView and published it to our Power BI site on Office 365. The next step is to setup a data refresh to ensure our data and visualisations stay current. If you don’t schedule a data refresh, users will need to open the workbook in a desktop version of Excel, and manually refresh the data. This will also require a connection to the AOS for the user refreshing the workbook.

The refresh process has a few components involved; the initial setup requires a few steps to connect your AX instance, but after this initial process new workbooks can be set up quite easily.

Firstly you need to install the Data Management Gateway (the download is available here). The gateway is used to allow PowerBI to connect back to your on-premise data without a user being involved. You will need to deploy the gateway on a server with internet access and access to the Dynamics AX AOS. Once installed, follow the steps outlined here to configure the gateway with your Office 365 tenant.

Once configured, you can now add a data source. This is done from the PowerBI Admin Centre. You will need to create a data source for each Dynamics AX Instance (Note Instance, not Query – so you will need a data source for Production, Dev, Test, etc).

To create the new Data Source, open the PowerBI Admin Centre from the Settings (Gears) option in the top right hand corner from your PowerBI site. From the Admin centre, select Data Sources. Now click the plus sign to add a new data source and select “Power Query”


You'll now be asked to paste in the connection string used for the connection. You need to get this from your Excel workbook. Open your Excel workbook and open the "Connections" form from the Data tab.

Note: if the Connections option is greyed out, it may be because you’re on a PowerView sheet. Insert a new blank Excel sheet and the option will become available. (Don’t forget to delete the new sheet later)

You'll see your different data sources in the connection window; select one of your AX data sources and click "Properties". On the Definition tab you will see the connection string; copy and paste the entire connection string into the PowerBI Admin Centre.

It should now load the list of connections you have in your data source – you need to complete the details for each data source:

  • Name: Use something informative, example “Dynamics AX – Test Environment”
  • Description: For your internal purposes
  • Gateway: Select the gateway for the connection to use, you may have multiple gateways configured in your environment.
  • Set Credentials: These are the credentials used for the refresh; this account must have access to Dynamics AX to perform the data refresh. It is recommended to use a system account for this refresh, not a specific user's account.

You can now test the connection. The next two steps will allow you to specify users which have admin privileges over the data source and where notifications of errors should be sent.

Once the data source is configured, we can schedule our workbook to refresh. Return to your PowerBI site and navigate to the workbook. From the ellipsis menu on the workbook, select the "Schedule Data Refresh" option.


From here you can see the refresh history, as well as configure the refresh for this specific workbook. You must setup the data source in the Admin centre first, otherwise this step will fail. You can find detailed steps on the refresh here.

Using Power Q&A: Natural Language Queries

Q&A is an extremely powerful tool, allowing users to use natural language to explore the data that you've prepared in your data model. This is where your transformation steps really pay off, as Q&A can leverage the friendly names you've given fields to allow users to explore your data.

To use PowerQ&A, from your PowerBI site click “Ask with PowerBI Q&A” in the top right hand corner. You’ll be presented with a blank canvas ready for questions.

As an example, using my data set I have asked for "Expense amount by client name" – PowerQ&A has prepared a visualisation of my project expense transactions sorted by client name.

[Screenshot]

Using the “Explore this result” pane on the right hand side you can start changing visualisations, filters and even the fields that are presented.

Q&A does a lot of work on its own to understand natural language: it identifies synonyms and deals with spelling mistakes, but you'll notice as you begin to explore that PowerQ&A doesn't always get it right. In the drop-down under your question you'll see how Q&A is interpreting your question. In my example you can see it is showing "Show amount where transaction type is expense sorted by client name". You will also notice that sometimes Q&A can't understand what you're looking for and a word will be greyed out – this means Q&A didn't understand your phrasing.

Synonyms are one of the easiest ways to teach Q&A about your business; you can do this through PowerPivot (detailed instructions here) or through PowerQ&A optimisation in Office 365. To manage this through Office 365 you need to open the Power BI Site Settings from your PowerBI site. From within your Site Settings you'll see a tab for Q&A which will contain all your workbooks enabled for Q&A. From the ellipsis menu, select "Optimize for Q&A".


You will be presented with a blank Q&A space for you to ask test questions; you'll also notice in the pane on the right-hand side a summary of the optimisation that has already taken place. The first time you load the workbook you'll notice the synonyms and phrasings already generated by Q&A automatically.


 

Starting with the last tab, Usage: this is extremely helpful for understanding how your users are using Q&A, as well as what words or phrasings aren't being understood. IT administrators and/or data officers should regularly monitor this tab to ensure Q&A is providing the right results and is continuing to learn about the organisation.


Synonyms: you can use this tab to add a synonym to your column names. For example, you may internally refer to a "Product" as a "Part" – to teach Q&A you can add Part as a synonym for Product; now when users use the word "Part" in their questions, Q&A will be able to provide a response.

Phrasing is extremely powerful; it allows you to teach complex terms or expressions which are used within your organisation. As an example, let's say I asked "Expenses by client" – in my mind I want the same result as my first example, but I didn't say "expense amount". You can see Q&A hasn't interpreted this correctly: what it is showing me is clients which have had expenses, not the actual amount.

[Screenshot]

This is where optimisation comes into play, where we can teach Q&A that when I say "Expenses" I actually mean the amount of transactions which are of type Expense. So now, under the Phrasing tab, I can add a new phrasing.


As soon as I click ok and ask the same question again “Expenses by client”, the new result is shown below.

[Screenshot]

 

Some key things to keep in mind for Q&A:

  • Invest the time in PowerQuery transformation to make sure you start with a nice clean data set.
  • Plan for a pilot of Q&A before releasing it to your entire organisation, use this time to optimise your data and ensure you have your synonyms and phrasing worked out.
  • Remember access to data through Q&A is driven by your SharePoint security, so plan accordingly.
  • Q&A understands terms like more than, less than, last year, this month – explore with terminology and learn what works best for you.
  • Use the “featured questions” option to save questions and present them to users as they log in. This not only saves time in retyping questions, but also gives new users an introduction on what they can be asking.

Here are some great resources for Power BI, have a look at this content if you’re starting out:

This is the last post in this series focused on PowerBI for Office 365. The next post will focus on an example of how to use the new PowerBI functionality with on-premise data by setting up a tabular SSAS database.

Thanks,

Clay.

Sprinkling a little bit of IoT around Contoso’s Dynamics business process to make it more intelligent

I am increasingly hearing Dynamics customers asking about IoT solutions from Microsoft. This article describes a recent experience at a Dynamics AX customer; I hope you enjoy it.

The Internet of Things is growing at a phenomenal pace and there are so many mind-boggling predictions about it that it can be hard to follow. To make things simple you can choose to remember just three predictions from Gartner; apparently these numbers will be valid for the next five years :-) so you should be OK barring major changes.

  • IoT will include 26 billion units by 2020
  • IoT and service suppliers will generate incremental revenue exceeding $300B, mostly in services by 2020
  • IoT will result in $1.9T in global economic value-add through sales into diverse end markets

As if talk about petabytes and terabytes of data wasn't sufficient, now you also have to remember billions of sensors and trillions of dollars! Doesn't it make you wonder where all this is going and how much is really true? Well, if it does, I invite you to look at this little bit of sprinkling of IoT devices that we are currently doing at Contoso. If this article interests you and you feel like you want to touch and feel some of these devices yourself, please come and see us at the Atlanta Convergence, March 16-19 2015, at a general session and a concurrent session. If you miss out on those opportunities you can always find us in the Customer Showcase in the EXPO hall.

So I have covered in previous posts what Contoso's business is about. Contoso is the UK's leading foodservice delivery company, supplying a full range of foodstuffs across the UK. It is a leading supplier to restaurants, bars, pubs, cafés and schools. Forty-five thousand customers place over a million orders a year from a catalogue of about five thousand products. More is here.

Since they make deliveries, they have their own fleet to do this. As you can imagine, any delivery of temperature-sensitive products like food or medicine has its fair share of challenges. One of these challenges for the COO of Contoso is how not to feel terribly sorry while he sits and watches thousands of pounds of food rotting in the back of a truck which has a broken cooling unit and is stuck in snarling traffic through the streets of London on a hot summer day. Half of such an event is enough to justify the meagre cost of implementing the Microsoft Azure based temperature-controlled solution that Contoso has come up with to address their scepticism about the internet of things in general!

On a more serious note, it has happened a few times in hot summer months, with products like ice cream, shrimps and peas, that customers have called up to complain about drivers delivering defrosted stuff. There isn't much choice for Contoso other than to provide a credit note when such a thing happens. It's not only the lost revenue but, more importantly, the lost customer that is the real concern in such situations – not to mention the spoilt food. And there is the major risk of big damage with a whole cooling unit breaking down. This is a nightmarish scenario for every distributor engaged in any cold chain.

So, the genesis of what Contoso is building lies in a simple question to Paul Smith, the COO – “Paul, if this is a risk, why don’t you put temperature sensors in your trucks and hook them up to Azure and run some analytics on time series data to alert you before something breaks down? If results tell you something, it would mean there is some substance to all the hype”  

It was a simple enough question and a simple enough proposal. Paul was in as long as we agreed we are not going to hook up sheep pedometers to Azure to measure their health and hence quality of their meat or use automatic devices to feed pets when owners were away deeply engaged in a pub crawl! Convince yourself here – 1, 2, 3, 4, 5, 6

So we looked around for suitable temperature sensors. There are so many sensors on the market sensing so many things that at first it was hard for us, but then we quickly established strict qualification requirements and narrowed it down to two sensors: a TI sensor and a ZenMeasure sensor from a vendor in China. ZenMeasure is called MiaoMiaoCe (in Chinese, this means "measure it every second"). The TI tag has more details here. You are welcome to browse details of their capabilities using the links, but I will describe here, in short, the tests we did and the comparisons we made.

ZenMeasure is pretty cool, pretty slick and seriously small – it is the size of an overcoat button. It was originally designed for consumer use, for measuring skin temperature of older patients and children in hospitals and care homes.

 
So you can imagine their surprise when they got a request from Contoso to test their sensors to install in delivery trucks. In the early days (three months ago that is) ZenMeasure didn’t measure below freezing temperatures but this was quickly changed in an update a week later to allow Contoso to measure temperature of its chilled as well as frozen products on the road. (Note: In this market, there are many vendors and they move very quickly to publish changes to their products/apps).

When you go for something like this, or any device of this nature, you will quickly realize that the most important things are signal strength and battery life. Contoso is still evaluating battery life and comparing it with TI. Contoso procured two sensors at first, and Paul installed them in his home refrigerator and freezer and used his iPhone app to test the features. He felt pretty excited about its capabilities. Now it's been three months into testing and several new ZenMeasure sensors have been procured and installed in various places, including a few in delivery trucks; none so far indicate much loss in battery life. This is excellent! The TI sensor, on the other hand, is capable of sensing not only temperature but also humidity, motion, etc., making it very likely to have a lower battery life than ZenMeasure. Both communicate using BLE (Bluetooth Low Energy), both are very easily discoverable on iPhone and both meet the connection strength and connection range criteria.


Now about the major drawback: none of the sensor vendors we looked at have a Windows app. (Note: this is pretty common; vendors are building for iOS and Android, not so much for Windows. However, this is likely to change quickly with Windows 10 and Raspberry Pi support and a number of other things.) We tapped into teams at Microsoft and found out Chris had already made a WP app for TI. He agreed to build a WP app for ZenMeasure. He quickly got a ZenMeasure sensor for himself and it took him, I guess, a few hours to have the app working. There were a number of issues discovered and resolved, some included here to give you a flavour of what's in store for you if you go down the IoT road.

Some mundane stuff – Contoso spent a decent few weeks discussing where exactly to mount the sensors in the delivery trucks. They eventually called refrigeration engineers and got their help to settle on the best location within the delivery truck to mount the sensors. TI sensor comes with a little hook that can be used to hang it on the walls of the truck however ZenMeasure doesn’t come with anything so a DIY sort of casing was built to house the sensor so it could be safely stored and mounted.  

But let's say we get a sensor, we get it working, and we have an app for it to see the reading from anywhere. Now what? After all, this is great for a hobbyist, but to be commercially useful to Contoso there should be some use of this data in some business process. Isn't that right?

It's important to recollect here, from one of the previous posts, that Contoso has a Windows CE based drivers' app that they use to make drop-offs to customers. Some customers ask for a temperature reading of the products being delivered. Today drivers do this by walking to the back of the truck, manually looking at the hard-wired thermometers and entering the reading by hand into the drivers' app, which then ensures that the printed invoice contains the reading.

So, naturally this brought up all the questions related to integration: how exactly to integrate this ZenMeasure app and/or the data from the sensor with the various other things going on in the business process. Contoso is also building a new WP app for drivers that will run on Microsoft Lumia 635 phones running WP 8.1. The backend is Dynamics AX and it uses a number of Azure technologies; more on that is available in the links at the bottom of the main Contoso page. So this is what is happening with the ZenMeasure data. With the new drivers' app, when the driver has reached his drop-off destination and is ready to pick goods from the truck and drop them at the customer location, the drivers' app interrogates both the chiller and freezer ZenMeasure sensors for the current temperature reading. This reading is then stored in the SQLite database on the local Microsoft Lumia device. Chris also made a suggestion to keep the interrogating to a minimum; more interrogation means lower battery life. Contoso has a regular maintenance schedule for its delivery trucks, which go through regular check-ups every ten weeks, and the goal is that the ZenMeasure sensors, with all the love and care showered onto them, should not need battery changes on a cycle shorter than that. These are very important considerations from a business process point of view; it's important that disruption to the regular business process is kept to a minimum to justify building this solution for what is essentially insurance against a future event of some likelihood.

Once the readings are in the local Lumia device's SQLite database, the data is sent to SQL Azure storage along with all the other data from the drivers' app. The reading is then picked up to be transferred to the Dynamics AX database, from where it is used further down in the business process. Note, this is critical for offline scenarios since connectivity on the road is poor in most places and really poor in the UK. There is also a Bluetooth printer that drivers carry with them to print invoices at their drop-offs. This led to another challenge, as for a while we could not get both the Bluetooth printer and the BLE sensor to pair with the Lumia device at the same time. Soon enough this issue was resolved. The drivers now have the ability not only to print the invoices using the Bluetooth printer but also to print the temperature at drop-off time on the invoice itself for customers who want that level of service. This helps customer service folks deal better with customer complaints about defrosted goods. Automated thermometer reading into the app ensures reliability – the reading is captured for each and every drop-off – and improves efficiency – drivers don't have to read the thermometers manually as in the old system.


There is no end to the surprising problems we encountered while building this solution. Now, after full deployment, Contoso will have over 100 drivers using 100 Lumia phones in 100 vehicles with 200 ZenMeasure sensors installed on any given day. After a long period of contemplation, Paul decided to let his drivers keep the Lumia phones with them after the workday is over! So each driver has his phone now. And each truck has two sensors. However, drivers and trucks are in no way fixed: any driver can drive any vehicle on any given day on any given route for that day. This meant that after drivers were assigned routes (based on complicated math, a topic for another day!) and trucks, they would have to pair the two BLE sensors in that truck to their phones before they set off for the day. This is a step too complicated to add to an already complex and very well oiled business process. After all, the whole purpose of this exercise is to make the process more intelligent, more efficient and more reliable. How can Paul leave this bit to chance? What if a driver couldn't pair properly, or pairing itself failed, or he just didn't want to, etc.? So, Chris is working with the team to build a solution which ensures that all Microsoft Lumia phones are already permanently paired to all ZenMeasure sensors.

The WP drivers’ app also has two additional capabilities.

There are two background tasks running: one to capture battery life from the sensor and another to capture a temperature reading every couple of minutes from every sensor in over a hundred delivery trucks. Keeping in mind the offline scenarios due to poor connectivity, these captured time series are also stored locally on the Lumia phones and then streamed into Azure Event Hub. There are two plans for these time series. One is to use the battery life time series from all trucks and feed it to an Azure ML model that does a simple time series forecast to predict when the battery is expected to run out. This will give Contoso a very nice clean way to predict when a battery will run out and use this knowledge to proactively have the required batteries available for drivers to change before they head out for the day, so sensors are always ON while on the road. No excuses!

The temperature time series on the other hand is being used in a number of ways. Contoso is using Azure Stream analytics to run simple temporal queries on the streaming data. Show me average freezer reading for last 30 minutes for all vehicles on the road, Alert me if average reading for last 30 min is beyond a certain threshold for any vehicles on the road etc. Since they already capture GPS co-ordinates for all vehicles they are planning to run finer grained GPS aware temporal queries. Contoso also has a Power BI (if you are keen to know more how to use power BI with AX, see Clay’s great series here) dashboard in the transport office that displays current vehicle locations and superimposes average temperature readings from chillers and freezers on to it. Since this is quite new, they are in the process of establishing policies on what to do when temperature readings cross certain thresholds – at what point to call drivers’ back to the base etc. Eventually they want to use the temperature time series data for a predictive maintenance Azure ML model that can predict when a particular cooling unit should be sent for an overhaul before it breaks down. There is also a plan in the works to use Anomaly detection Azure ML app to see if it reveals anything interesting and helps in any way.

All in all it was somewhat surprising to note that a consumer device and a consumer app that worked pretty much out-of-the-box for consumer scenarios had to go through pretty rigorous testing and a number of modifications to get to a point where Contoso considers this as a feasible solution to address their scepticism with internet of things! Kudos to this team for their incessant focus on customer service, efficiency and on technological innovation.

Hope this was an interesting, live example of how an automated process can be made more intelligent and how so many new Microsoft Azure tools and technologies can be connected to work with your business processes in Dynamics AX. If you are a Dynamics customer and want to learn more about this solution or want to deploy another IoT solution, please write to me at aksheyg@microsoft.com

Contributions: Chris Lovett, Akshey Gupta

Power BI and Dynamics AX: Part 5: PowerBI.com and On Premise Data (Preview)

 

This is the final part of the Power BI and Dynamics AX blog series for now; the earlier posts focused on the current functionality available within Power BI for Office 365. This blog post is going to talk about new functionality that, at the time of writing, is only available in preview. The topics in this post should not be applied to a production environment, as they relate to preview functionality.

The new version of PowerBI.com has introduced a lot of new functionality; Dashboards, new visualisations, new apps for iOS and Windows devices all making for an extremely rich PowerBI experience on the web. This post is focused however on the new Live Connectivity functionality for On Premise Data.


The new connector that has been released allows a live connection to an on-premise Analysis Services tabular database. Unfortunately, at the time of writing, the SSAS database shipped with Dynamics AX is a multidimensional database, so you can't connect to it directly; however, you can create a tabular database to "flatten" the multidimensional database into a tabular database for use with PowerBI.com. The latency of your data in PowerBI.com will be determined by how often you're processing your SSAS cube.

As an organisation you need to determine the best connectivity method for you and PowerBI; this may be through SSAS or through OData as previously described. There are limits to the on-premise option at the moment: given the nature of the Q&A natural language queries and the processing that is required, Q&A is not currently supported on on-premise data – you must still upload data into the data model for Q&A to work. For more information, start with the documentation from the PowerBI team – Which Power BI experience is right for me?

For this example we are going to use:

  • Dynamics AX 2012 R3 CU8
  • SQL Server 2014 Enterprise
    • Two SSAS Instances Deployed
      • Multidimensional with standard Dynamics AX Cubes Deployed (Find a detailed guide here.)
      • Tabular
  • PowerBI.com (Preview)
  • Power BI Analysis Services Connector (Preview)
  • Visual Studio 2013
  • Microsoft SQL Server Data Tools – Business Intelligence for Visual Studio 2013

Important Note: The SSAS Connector requires Azure Active Directory Sync to be running between your on Premise Active Directory and Azure Active Directory – otherwise you will receive an error when trying to connect to the Data source from PowerBI. So for those of you using the Dynamics Demo Virtual Machine, you won’t be able to connect. Your SSAS instances will need to be deployed on a machine synced with your Azure Active Directory.

We are going to create our Tabular Database through Visual Studio using a Business Intelligence Project Template that is installed as part of the SQL Server Data Tools pack. For the example today we are going to create a Project analysis tabular model with basic project information along with some key measures.

To start, launch Visual Studio. We are going to create a new SSAS Tabular Project by clicking “New Project”, and selecting “Business Intelligence”, “Analysis Services Tabular Project”


After selecting the project and entering a name, you will be prompted to connect to an instance of SSAS; this is where you connect to your newly created SSAS tabular instance.

Once you have your new project, you have a few options for how you would like to create the model, depending on your technical ability. For this example we are going to use the wizard, as it's the easiest option. To get started, select "Model" > "Import from Data Source". You'll be prompted with the list of data sources you can import from. You have the option of connecting directly to the AX relational DB, but I find the cubes easier, as fields like enums, references, etc. have been cleaned up and are a lot easier to work with. You also get the benefit of leveraging the calculations already in place in the standard cubes.


For our purposes today, we will use Microsoft Analysis Services. In the next screen you’ll enter the connection details for the SSAS Multidimensional Database (Dynamics AX Standard Cubes).


After entering credentials, you'll be prompted for the MDX query that the tabular database should use for this model. You can start with MDX if you wish, or use the "Design" option to launch the visual designer. From this screen we will select the data we want from the cube to be available in our tabular model. You can drag and drop fields from the cubes on the left-hand side into the pane on the right. As you add fields you will see your tabular model refresh to give you a preview of the data that will be available.


Once you've selected the information you want available, click OK – you can now give your query a friendly name and then click Finish. If you provided incorrect credentials, you will receive an error; go back to the credentials and update them with an account that has access to the cubes. Once you click Finish, the MDX query will be processed; when it completes, close the window and you will see the results of your MDX query. You can take this time to change column names if you wish, to make the data a little friendlier once we load it into PowerBI.


You can now close and save your model. If you would like to double-check its deployment, you can open SQL Server Management Studio and you should see your newly created tabular DB. The SSAS on-premise model uses security from SSAS, so this is where you would apply your role security to your SSAS data model. Users need to have access to this DB to be able to explore the data in PowerBI (this is a key difference from the OData/workbook model previously discussed).


 

The last step On Premise is to install the PowerBI Analysis Services Connector (Preview). You can find a detailed guide of how to download and install the connector here. The installation will require your PowerBI.com login details (Your Office 365 credentials) as well as the details for the Tabular instance of SSAS.

Now we are ready to expose the new tabular database to PowerBI. You can log into the Preview here. At the time of writing this preview is only available to customers within the United States. Once you’ve logged in, select “Get Data” > “SQL Server Analysis Services” and then “Connect”.


You will be presented with a list of all the SSAS connectors published within your organisation. Find and select yours in the list. You will then see a list of Models which are available on your instance. Click the model you had created earlier and click “Connect”

[Screenshot]

Once connected, you will have a new dataset available, which is your on-premise data source. (Note: the name will be your SSAS instance; you can rename it in PowerBI.com if required.)

[Screenshot]

Now your data is available to create reports and dashboards against like any other data source. From the ellipsis menu on the dataset click “Explore” and you will be taken to a blank PowerView page to start building your report. If you’re familiar with creating visualisations in PowerView you can follow the same process; if not, you can find a detailed guide here.

Below is an example of a Project Profitability analysis report based on on-premise Dynamics AX data. The Invoiced Revenue, Cost, Hours and Gross Profit are all based on calculated measures defined in the standard Dynamics AX SSAS cubes. You can find a detailed reference of the information available in the cube here.

[Screenshot]

One of the key benefits of the new PowerBI.com is the ability to create dashboards. Dashboards allow visualisations from multiple reports to be pinned to a single dashboard to give you a quick and easy overview of multiple sets of data. You can then drill into that specific report by clicking the visualisation from the dashboard.

[Screenshot]

This was a very simple example of exposing some Dynamics AX data to explore the preview; users of PowerBI should consider the best connection method for them, along with planning what data should and should not be exposed. The PowerBI technology is changing at a great pace at the moment, so it’s important to keep up to date with what is coming and how it can help shape your ongoing Business Intelligence strategy.

For information on the new PowerBI.com platform, try these resources:

Hopefully this has been a helpful insight into some of the new functionality out in preview at the moment, and how it can be applied to Dynamics AX.

Thanks,

Clay.

AX Retail: Create a test site using ecommerce checkout controls

In Dynamics AX 2012 R3 CU8 we released checkout controls. To create and browse a test website using these integrated controls, please follow these steps:

Go to the demo VM

  • Go to “my documents” and then open the folder “retail SDK CU8”
  • Open the solution file located at “\Online Channel\Clients\Web\Storefront\ Web.Storefront.sln” in VS 2013
  • Create a strong name key named “strongnamekey.snk” (for example with the Strong Name tool: sn -k strongnamekey.snk) and place it adjacent to the bldver.cs file
  • Compile your project and publish it to a folder
  • Grab that published folder and create a new website in IIS on the demo machine
  • Make sure the app-pool account is “contoso\administrator”
  • Now go to the folder where the site is running, open commerceruntime.config, and change the defaultOperatingUnitNumber value to 068

  <storage defaultOperatingUnitNumber="068" />

  • Open web.config and change the database connection string

   <add name="CommerceRuntimeConnectionString" connectionString="Server=localhost;Database=RetailContosoStore;Trusted_Connection=Yes"/>

  • In the same web.config, update the section declaration below so that the PublicKeyToken matches the strong name key you created above

<section name="ecommerceControls" type="Microsoft.Dynamics.Retail.Ecommerce.Sdk.Controls.ControlsSection, Microsoft.Dynamics.Retail.Ecommerce.Sdk.Controls, Version=6.3.0.0, Culture=neutral, PublicKeyToken=fd6b8d0172171ea7, processorArchitecture=MSIL"/>

  • Open a browser and browse to the new website you just created.

 


Announcement: Upcoming Advanced Solution Architect and Development workshops for AX2012 R3

Extensibility in Dynamics AX 2012 R3 CU8 (CRT, RetailServer, MPOS) Part 2 – New data entity

Overview

This blog expands on the knowledge you gained in part 1 of the series (http://blogs.msdn.com/b/axsa/archive/2015/02/17/extensibility-in-dynamics-ax-2012-r3-cu8-crt-retailserver-mpos-part-1.aspx). To avoid getting stuck, I recommend that you make yourself familiar with part 1 first; some of the information from there is required and assumed.

The steps are based on the Dynamics AX 2012 R3 CU8 VM image that can be requested via LCS (https://lcs.dynamics.com/Logon/Index) which most partners have access to (Contoso sample).  Alternatively, PartnerSource (https://mbs.microsoft.com/partnersource/northamerica/) can be used to download the VM as well. Make sure you get the CU8 version.

It is recommended to review some of the online resources around the Retail solution, either now or during the process of following this blog (https://technet.microsoft.com/en-us/library/jj710398.aspx).

The areas this blog covers are:

-         AX: Adding a new data entity, related to a retail store, and populating it by means of a job (no UI)

-         CDX: configuring CDX in order to include the new table in data synchronizations

-         CRT: adding a new data entity and new service

-         RetailServer: exposing a new controller for the new entity; adding new ODATA action

-         MPOS: adding plumbing to call RetailServer; updating UI to expose data

A future blog will cover topics and suggestions for changing existing CRT code.  Stay tuned for that.

The changed code is available in ZIP format, which includes only the files that have been added or changed. It can be applied (after backing up your existing SDK) on top of the “Retail SDK CU8” folder. Note that the ZIP file includes the changes from part 1 as well.

This sample customization will update the MPOS terminal to show more detailed opening times for a store. Remember that a store worker can look up item availability across multiple stores. Imagine that as part of that flow, the worker would like to advise the customer whether a particular store is open or not. See the screenshots below for the UI flow:

[Screenshots: UI flow]

 

Notes:

-         The sample is to illustrate the process of a simple customization. It is not intended for production use.

-         All changes are being made under the login contoso\emmah. If you use a different account, or different demo data altogether, please adjust the steps below accordingly.

 

High-level steps

 

The following steps need to be carried out:

 

  1. Setup the Retail SDK CU8 for development (see part 1)
  2. Prepare MPOS to be run from Visual Studio from unchanged SDK code (see part 1)
  3. Activate the MPOS device (see part 1)
  4. Include new entity in AX
  5. Configure CDX to sync new entity
  6. Channel schema update and test
  7. Add CRT entity, service, request, response, datamanager and RetailServer controller with new action
  8. Update client framework to call RetailServer endpoint
  9. Update client framework channel manager with new functionality
  10. Update client framework’s view model
  11. Update MPOS’s view to consume updated view model
  12. Test

 

Detailed steps

 

Setup the Retail SDK CU8 for development

See part 1 at http://blogs.msdn.com/b/axsa/archive/2015/02/17/extensibility-in-dynamics-ax-2012-r3-cu8-crt-retailserver-mpos-part-1.aspx.

 

Prepare MPOS to be run from Visual Studio from unchanged SDK code

See part 1 at http://blogs.msdn.com/b/axsa/archive/2015/02/17/extensibility-in-dynamics-ax-2012-r3-cu8-crt-retailserver-mpos-part-1.aspx.

 

Activate the MPOS device

See part 1 at http://blogs.msdn.com/b/axsa/archive/2015/02/17/extensibility-in-dynamics-ax-2012-r3-cu8-crt-retailserver-mpos-part-1.aspx.

  

Include new entity in AX

 

In order to store the opening hours per store, we will be using a new table called ISVRetailStoreDayHoursTable. It will store the day, and the opening and closing times, for each store.

In the ZIP folder you can find the XPO file at SampleInfo\Sample2_StoreDayHours.xpo. Import this file into AX. It includes two items: the table and a simple job that populates sample data for the Houston store.

Run the job named Temp_InsertData at least once. Then inspect the table with SQL Server Management Studio:

[Screenshot]

Configure CDX to sync new entity

In AX, add a new location table and the appropriate columns to the AX 2012 R3 schema (USRT/Retail/Setup/Retail Channel Schema)

[Screenshot]

 Create a new scheduler subjob (USRT/Retail/Setup/Scheduler subjobs)

[Screenshot]

Click Transfer field list and make sure that the fields match as shown above.

Add the new subjob to the 1070 Channel configuration job (USRT/Retail/Setup/Scheduler Job)

[Screenshot]

Edit the table distribution XML to include the new table (USRT/Retail/Setup/Retail Channel Schema)

[Screenshot]
 

The easiest approach is to copy the XML out of the text box, edit it in an external XML editor, and then paste it back in. The change you need to make is to add this XML fragment:

  <Table name="ISVRETAILSTOREDAYHOURSTABLE">
    <LinkGroup>
      <Link type="FieldMatch" fieldName="RetailStoreTable" parentFieldName="RecId" />
    </LinkGroup>
  </Table>

in two places. Both times, it needs to be added inside the RetailStoreTable table XML node.

At the end, click Generate Classes (USRT/Retail/Setup/Retail Channel Schema/AX 2012 R3)

Channel schema update and test

The equivalent change to the table schema must be made on the channel side. This has to be done for all channel databases. Use SQL Server Management Studio and create the table. Since this is a sample, we won’t add stored procedures; we just do the data access in code. However, it is recommended to use stored procedures for performance and security reasons.

[Screenshot]

The file can be found in the ZIP folder at SampleInfo\ChannelSchemaUpdates.txt

Now, go back to AX and run the 1070 job (USRT/Retail/Periodic/Distribution Schedule/1070/Run now)

Then, verify in AX that the job succeeded (USRT/Retail/Inquiries/Download Sessions/Process status messages). You should see a status of “Applied” for the stores. 

 

Add CRT entity, service, request, response, datamanager and RetailServer controller with new action

Use the solution in the ZIP file at SampleInfo\RSCRTExtension\RSCRTExtension.sln and inspect the code.

Since this part is based on part 1, I assume you have:

-         already configured the pre.settings file (for rapid deployment as part of the build into RetailServer’s bin directory),

-         already configured RetailServer’s version of commerceRuntime.config to include the new CRT extension dll, and

-         already configured RetailServer’s web.config file to include our new extension dll.

Here is a code map view of the code changes required:

[Screenshot]

You can see that we need a CRT request, response, service, data accessor and entity. Additionally, RetailServer is customized to include a new StoreDayHoursController that exposes a new OData endpoint, GetStoreDaysByStore. That endpoint uses the CRT and the request object to get a response. It does not use the data service directly.

If you have configured everything correctly, compiled the solution, and opened the OData metadata URL of RetailServer (http://ax2012r2a.contoso.com:35080/RetailServer/v1/$metadata), you should see the new action:

[Screenshot]
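If you prefer to script that check, the sketch below is a minimal, hypothetical helper (not part of the Retail SDK) that fetches the metadata document and looks for the action name. It assumes Node.js 18+ (for the global fetch) and that the metadata endpoint is reachable from wherever you run it, for example the demo VM:

    // Hypothetical sanity check, not part of the Retail SDK: request the RetailServer
    // OData metadata and confirm the new GetStoreDaysByStore action shows up in it.
    // Assumes Node.js 18+ (global fetch) and network access to the RetailServer URL.
    const metadataUrl: string = "http://ax2012r2a.contoso.com:35080/RetailServer/v1/$metadata";

    async function checkMetadata(): Promise<void> {
        const response = await fetch(metadataUrl);
        const xml = await response.text();
        if (xml.indexOf("GetStoreDaysByStore") !== -1) {
            console.log("GetStoreDaysByStore is exposed in the RetailServer metadata.");
        } else {
            console.log("GetStoreDaysByStore was not found - re-check the RetailServer customization.");
        }
    }

    checkMetadata().catch((err) => console.error("Metadata request failed:", err));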

 

Update client framework to call RetailServer endpoint

 

The first step is to make MPOS aware of the new entity and the new endpoint. This is basically proxy code, similar to what tools like wsdl.exe generate for .NET web services. The Retail team is investigating providing a tool for automatic generation in a future release.

CommerceTypes.ts

This file specifies the new entity, both as an interface and as a class.

    export interface StoreDayHours {
        DayOfWeek: number;
        OpenTime: number;
        CloseTime: number;
        ExtensionProperties?: Entities.CommerceProperty[];
    }
    export class StoreDayHoursClass implements StoreDayHours {
        public DayOfWeek: number;
        public OpenTime: number;
        public CloseTime: number;
        public ExtensionProperties: Entities.CommerceProperty[];

        /**
         * Construct an object from odata response.
         *
         * @param {any} odataObject The odata result object.
         */
        constructor(odataObject?: any) {
            odataObject = odataObject || {};
            this.DayOfWeek = odataObject.DayOfWeek ? odataObject.DayOfWeek : null;
            this.OpenTime = odataObject.OpenTime ? odataObject.OpenTime : null;
            this.CloseTime = odataObject.CloseTime ? odataObject.CloseTime : null;
            this.ExtensionProperties = undefined;
            if (odataObject.ExtensionProperties) {
                this.ExtensionProperties = [];
                for (var i = 0; i < odataObject.ExtensionProperties.length; i++) {
                    this.ExtensionProperties[i] = odataObject.ExtensionProperties[i] ? new CommercePropertyClass(odataObject.ExtensionProperties[i]) : null;
                }
            }
        }
    }
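
As a quick illustration of how this constructor maps an OData payload onto the entity, here is a minimal sketch. The payload values are invented, and treating OpenTime/CloseTime as seconds since midnight is an assumption; check it against the sample data your AX job inserted:

    // Invented sample payload, shaped like the OData response for the new entity.
    // Depending on the file, qualify the class with its namespace (e.g. Model.Entities).
    var odataObject = {
        DayOfWeek: 1,       // sample value
        OpenTime: 32400,    // assumed to be seconds since midnight (09:00)
        CloseTime: 61200    // assumed to be seconds since midnight (17:00)
    };

    var hours = new StoreDayHoursClass(odataObject);
    // hours.DayOfWeek === 1, hours.OpenTime === 32400, hours.CloseTime === 61200;
    // hours.ExtensionProperties stays undefined because the payload carries none.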

 

CommerceContext.ts

This file exposes the OData data service query for the new entity to the rest of MPOS.

    public storeDayHoursEntity(storeId?: string): StoreDayHoursDataServiceQuery {
        return new StoreDayHoursDataServiceQuery(this._dataServiceRequestFactory, "StoreDayHoursCollection", "StoreDayHours", Entities.StoreDayHoursClass, storeId);
    }

    export class StoreDayHoursDataServiceQuery extends DataServiceQuery<Entities.StoreDayHours> {

        constructor(dataServiceRequestFactory: IDataServiceRequestFactory, entitySet: string, entityType: string, returnType?: any, key?: any) {
            super(dataServiceRequestFactory, entitySet, entityType, returnType, key);
        }

        public getStoreDaysByStoreAction(storeId: string): IDataServiceRequest {
            var oDataActionParameters = new Commerce.Model.Managers.Context.ODataActionParameters();
            oDataActionParameters.parameters = { StoreNumber: storeId};

            return this.createDataServiceRequestForAction('GetStoreDaysByStore', Entities.StoreDayHoursClass, 'true', oDataActionParameters);
        }
    }

 

Update client framework channel manager with new functionality

Now that we have the low-level proxy code done, we need to expose the new functionality in a more consumable way to the rest of the application framework. An appropriate location for the new functionality is IChannelManager, as it already encompasses similar functionality of a more global, channel-related nature.

IChannelManager.ts:

    getStoreDayHoursAsync(storeId: string): IAsyncResult<Entities.StoreDayHours[]>;

ChannelManager.ts:

    public getStoreDayHoursAsync(storeId: string): IAsyncResult<Entities.StoreDayHours[]> {
        Commerce.Tracer.Information("ChannelManager.getStoreDayHoursAsync()");

        var query = this._commerceContext.storeDayHoursEntity();
        var action = query.getStoreDaysByStoreAction(storeId);

        return action.execute<Entities.StoreDayHours[]>(this._callerContext);
    }

 

Update client framework’s view model

The view model is an abstraction of the view that exposes public properties and commands for any view implementation to use. Here are the three things we need to do in order to customize the existing StoreDetailsViewModel:

  • a variable that holds the result for the view to bind to, and a variable called isStoreDayHoursVisible that the view can use to toggle visibility of the UI:

        public storeDayHours: ObservableArray<Model.Entities.StoreDayHours>;
        public isStoreDayHoursVisible: Computed<boolean>;

  • data initialization in the constructor:

        // empty array
        this.storeDayHours = ko.observableArray([]);
        this.isStoreDayHoursVisible = ko.computed(() => {
            return ArrayExtensions.hasElements(this.storeDayHours());
        });

  • data retrieval function to be called by the view

        public getStoreDayHours(): IVoidAsyncResult {
            var asyncResult = new VoidAsyncResult(this.callerContext);
            Commerce.Tracer.Information("StoreDetailsViewModel.getStoreDayHours()");

            this.channelManager.getStoreDayHoursAsync(this._storeId)
                .done((foundStoreDayHours: Model.Entities.StoreDayHours[]) => {
                    this.storeDayHours(foundStoreDayHours);
                    Commerce.Tracer.Information("StoreDetailsViewModel.getStoreDayHours() Success");
                    asyncResult.resolve();
                })
                .fail((errors: Model.Entities.Error[]) => {
                    asyncResult.reject(errors);
                });

            return asyncResult;
        }

 

Update MPOS’s view to consume updated view model

The StoreDetailsView.ts already calls into the view model to get store distance. For simplicity, we just hook into the done() event handler to call the new function:

                    this.storeDetailsViewModel.getStoreDistance()
                        .done(() => {
                            this._storeDetailsVisible(true);
                            this.indeterminateWaitVisible(false);

                            this.storeDetailsViewModel.getStoreDayHours()
                                .done(() => {
                                    this._storeDetailsVisible(true);
                                    this.indeterminateWaitVisible(false);
                                })
                                .fail((errors: Model.Entities.Error[]) => {
                                    this.indeterminateWaitVisible(false);
                                    NotificationHandler.displayClientErrors(errors);
                                });

Lastly, we update the HTML to expose the data:

[Screenshot]
 

Please use the sample code in the ZIP archive as mentioned above. This also includes a few other changes not detailed here, for example in resources.resjson and Converters.ts.
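
For reference, a converter for the opening hours could look roughly like the sketch below. The function name is hypothetical and it assumes OpenTime/CloseTime are stored as seconds since midnight; the Converters.ts in the ZIP archive is the authoritative version:

    // Hypothetical helper in the spirit of Converters.ts: format an OpenTime/CloseTime
    // value for display. Assumption: the value is seconds since midnight - verify this
    // against the sample data that the AX job inserted.
    export function formatStoreTime(secondsSinceMidnight: number | null): string {
        if (secondsSinceMidnight == null) {
            return "";
        }

        var hours: number = Math.floor(secondsSinceMidnight / 3600);
        var minutes: number = Math.floor((secondsSinceMidnight % 3600) / 60);
        var pad = (value: number): string => (value < 10 ? "0" + value : value.toString());

        // e.g. formatStoreTime(32400) === "09:00", formatStoreTime(61200) === "17:00"
        return pad(hours) + ":" + pad(minutes);
    }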
 

Issues and solutions:

If you cannot run MPOS from the Pos.sln file because it is already installed, uninstall the app first. This link may also be helpful: http://blogs.msdn.com/b/wsdevsol/archive/2013/01/28/registration-of-the-app-failed-another-user-has-already-installed-a-packaged-version-of-this-app-an-unpackaged-version-cannot-replace-this.aspx 

Happy coding,

Andreas

 

Original link: http://blogs.msdn.com/b/axsa/archive/2015/05/20/extensibility-in-dynamics-ax-2012-r3-cu8-crt-retailserver-mpos-part-2-new-data-entity.aspx (go back to it for zip file download…)

 

Retail SDK CU8 – Extensibility Sample 2.zip

Announcement: Upcoming Advanced Solution Architect workshop for AX2012 R3 in Finland

We have an Advanced Solution Architect Workshop for Microsoft Dynamics AX 2012 R3 scheduled for November 9-11 in Finland.

This three-day instructor-led workshop is designed for solution architects working for partners that are engaged in large, complex or multi-location projects where systems integration is a key requirement.

The participants of this workshop are expected to have experience in projects where they have driven the definition of business, technical, and architectural requirements, and where they are accountable for validating the solution.

The objectives of the workshop include:

  • Learning how to articulate and architect a solution by using the key concepts of the foundational features in Microsoft Dynamics AX 2012 R3.

  • Developing an understanding of how to create a solution to a complex multi-national customer implementation by using the capabilities of AX 2012 R3.

  • Understanding the concepts behind architecting an enterprise solution with AX 2012 R3.

Training Dates: November 09, 2015 – November 11, 2015 
Training Venue: Microsoft Talo, CD-Building, Keilalahdentie 2-4, Espoo 02150, Finland  

More information and registration links are available here.

Dynamics Technical Conference February 2016 Deep Dive Training

 

Happy New Year! I hope everyone is back refreshed and ready for an exciting year of Dynamics AX ahead. Over the break our readiness team announced training which will take place over the three days following the Technical Conference in February in Seattle, Washington.

 

If you haven’t yet registered for the Technical Conference (February 23 – 25, 2016) you can do so here:

https://www.microsoft.com/en-us/dynamics/techconference/

 

In addition to all the content, breakout sessions and labs at the Technical Conference, the Solution Architecture group and core R&D teams are also running three deep dive training sessions, all on the new Microsoft Dynamics AX (AX7).

 

Implementation Lifecycle Workshop for Microsoft Dynamics AX – Register Here

Training Details: February 26, 2016 – February 28, 2016

Training Time: Check-in 8:30am; Training 9:00am-5:00pm PST

Training Venue: Microsoft Conference Center, 16070 NE 36th Way, Building 33, Redmond, WA 98052, United States

Description

The Implementation Lifecycle Workshop for the new Dynamics AX (AX7) is designed for functional, technical and project managers to understand the technology and tools available through Dynamics Lifecycle Services and Visual Studio Online to support the implementation of Dynamics AX. The workshop is primarily hands-on, taking attendees through the deployment of Dynamics AX and the use of tools commonly used throughout an implementation to support the management of business processes, data, code and environments. The workshop follows an example case study to design, deploy, configure and test a solution on Dynamics AX.

 

Advanced Performance Workshop for Microsoft Dynamics AX – Register Here

Training Details: February 26, 2016 – February 29, 2016

Training Time: Check-in 8:30am; Training 9:00am-5:00pm PST

Training Venue: Microsoft Conference Center, 16070 NE 36th Way, Building 33, Redmond, WA 98052, United States

Description

The Advanced Performance Workshop for Microsoft Dynamics AX is designed to help solution architects and senior consultants plan, design, implement, stabilize and release a Dynamics AX implementation with a focus on performance. Participants will understand the different phases and steps of the performance lifecycle, from analysis to deployment, and the new tools required to complete the implementation.

 

Advanced Presales Workshop for Microsoft Dynamics AX – Register Here

Training Details: February 26, 2016 – February 28, 2016

Training Time: Check-in 7:30am; Training 8:00am-5:00pm PST

Training Venue: Microsoft Conference Center, 16070 NE 36th Way, Building 33, Redmond, WA 98052, United States

Description

The Presales Workshop for Microsoft Dynamics AX is designed to give a presales consultant the grounding needed to demonstrate Microsoft Dynamics AX as a solution superior to competitive products in meeting the needs of a customer’s business. The workshop is based on typical business scenarios, starting with foundational base knowledge, persona scenarios, exercises and case discussions that use real-world examples to enable consultants to apply and demonstrate the range of capabilities in Microsoft Dynamics AX. It also serves as a base for later specializing consultants in vertical industries.

 

We are extremely excited about the new Dynamics AX, and the new training around it. See you at the Technical Conference.

 

Thanks,

Clay.

Commerce Data Exchange and Shared AX DB/Channel DB in Dynamics 365 for Operations

Commerce Data Exchange is a component of Dynamics 365 for Operations and Retail that transfers data between Microsoft Dynamics AX and retail channels, such as online stores or brick-and-mortar stores. In Dynamics 365 for Operations, the Channel Database is part of the AX DB itself and can also be part of a Retail Store Scale Unit. A Retail Store Scale Unit consists of Retail Server, the CDX Async client and the Cloud POS server, with a Channel Database as an optional component. The Channel Database that is part of the AX DB is installed and configured as part of every installation of Dynamics 365 for Operations.

 

[Screenshot]

Retail Architecture Diagram

The Commerce Data Exchange component that moves data between AX and the Channel Database comprises two batch jobs. The first extracts data from the AX DB and writes files for each of the distribution schedule jobs to blob storage. The second batch job takes the extracted data files from blob storage and writes them to the Channel Database that is part of the AX DB (the tables starting with ax.*).

For stores that are set up with a Retail Store Scale Unit, the batch job on the AX HQ side extracts the data and writes a file to blob storage. The CDX Async client that is part of the RSSU picks up the file from blob storage and applies it to the channel database that is part of the RSSU. This part of the architecture remains similar to AX 2012 R3: stores/channels were grouped using channel data groups, and all channels/stores in a channel data group shared the same data. Channel databases that were part of an RSSU or store were defined using the Channel database form and grouped using the Channel database group form.

 

Key Learnings for implementations

  1. Multiple databases pointing to the same HQ-hosted Channel Database – The channel database that is created with deployment has an encrypted database connection string pointing to the HQ-hosted channel database. This cannot be configured for a newly created database unless it is for an RSSU. Further, there is no real need to push the same tables, such as CustTable, InventTable or EcoResProduct, more than once as part of different distribution schedule jobs. In customer implementations that are not using RSSU, there is no need to create multiple channel databases pointing to the same database, or additional channel data groups.
  2. Golden configuration database move and the Retail retargeting tool – It is also common practice for implementations to move configuration databases between environments. As part of this process, the Retail retargeting tool needs to run to modify this connection string, among other tasks. As of the date of publication of this blog, the Retail retargeting tool assumes that the only allowed channel database that is shared with the AX DB is named "Default".

 

 
