D365 Portals – application/xml

I keep forgetting how to do this and can’t remember where I saw it written down, so I am writing it down now. If you need to check the fetchXML that is being executed as part of a Web Template, you can return that fetchXML in a Web Page. This can be useful where you have fetchXML that is generated dynamically based on request parameters, or if you just want to check the values that are included in your fetchXML from the request.params collection.

  1. Create the Web Template with your fetchXML query.
  2. After the {% endfetchxml %} line, display the content of your fetchXML query using the xml property of your fetchXML object.
  3. Set the MIME Type of the Web Template to application/xml.
  4. Create a Page Template for your Web Template, with no headers or footers.
  5. Create a Page using the Page Template so you can get the XML content.

For example, my Web Template would look like this

{% fetchxml assetQuery %}
<fetch version='1.0' output-format='xml-platform' mapping='logical' page='1' count='200'>
  <entity name='msdyn_customerasset' >
    <attribute name='msdyn_customerassetid' />
    <attribute name='msdyn_account' />
    <attribute name='msdyn_parentasset' />
    <attribute name='msdyn_name' />
        ...
           ...
              ...
  </entity>
</fetch>
{% endfetchxml %}

{{ assetQuery.xml }}
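
If the query is built up dynamically, the same page shows you the final fetchXML with the request parameter values substituted in. A rough sketch of what that might look like (the accountid parameter and the filter element are illustrative additions, not part of the query above):

{% fetchxml assetQuery %}
<fetch version='1.0' output-format='xml-platform' mapping='logical'>
  <entity name='msdyn_customerasset' >
    <attribute name='msdyn_name' />
    <filter type='and'>
      <condition attribute='msdyn_account' operator='eq' value='{{ request.params.accountid }}' />
    </filter>
  </entity>
</fetch>
{% endfetchxml %}

{{ assetQuery.xml }}

Requesting the page with ?accountid=<some guid> then returns the XML with the resolved value inside the condition, which makes it easy to spot a parameter that isn’t being passed through correctly.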

XrmToolbox – Portal Records Mover – Folder Structures

Version 1.2019.10.9 of the Portal Records Mover tool in the XrmToolbox has introduced an excellent new feature – the ability to export the content as a folder structure, with the option to ZIP up the content.

[Screenshot: Portal Records Mover]

Previously, the configuration would be exported into a single XML file, which made it difficult to identify what items were included in your export once you closed down XrmToolbox and filed your configuration away in source control. With the new version, if you elect to Export as folder structure, not only are your exports separated into different folders but the configuration records are created as separate files, with the GUID of the configuration record as the filename.

[Screenshot: Portal Records Mover]

The other advantage this creates is that you can now be more selective about the configuration you want to import into your target environment. The source for your import can now be the entire root folder, individual folders or an individual file (ZIP or XML).

[Screenshot: Portal Records Mover]

D365 Portals : X-Content-Type-Options Header

In an earlier post I provided a few options for dealing with JavaScript code in your D365 portals. The second of those options was to modify the extension of the file that is attached to the Web File Note so it isn’t blocked as an attachment.

With a recent upgrade to the portal it appears that Microsoft have now closed the door on that particular option, and for good (security) reasons. They have added the X-Content-Type-Options header to the response with a value of nosniff. This means that when the browser detects a mismatch between the content type it expects for a script and the MIME type returned by the server, it generates an error and the script is not loaded.
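
For reference, the header that now comes back with the portal response is simply:

X-Content-Type-Options: nosniff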

This means that in order to use custom JavaScript files in your D365 portals you are left with either option #1 or option #3 from my previous post, or with using a CDN.

D365 – Field Service – Debugging Booking Rules

Booking Rules in D365 are implemented as JavaScript Web Resources that show warning or error messages to the scheduler. The problem with these web resources is that the files are loaded dynamically, which makes them a little harder to debug. When you have complex or numerous booking rules, you have a few options to assist with debugging.

  • console.log – this is the most basic option but allows you to write out values to the debug console of the browser. This can give you some information but may not pinpoint the exact line where an issue is occurring.
  • debugger statement – this will cause your web resource to load into the dev tools of the browser and stop on the line where the statement exists. This at least allows you to continue with debugging, but you need to remember to remove it before you publish the resource to a production environment.
  • sourceURL comment – if you place the following comment at the end of your script file – after the closing curly brace – the file will then show in the Sources tab.
//# sourceURL=filename.js

You can replace filename.js with any filename you like, e.g. BookingRules.js. When you browse your Sources in the Dev Tools you will find the file and be able to put in breakpoints as required. You will need to force the booking rules to run at least once for the file to be loaded. The other good point about this technique is that you don’t need to remove the comment when the file is moved to a production environment.
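
As a minimal sketch of how these options fit together in a script file (the function name, parameter and logic are illustrative only – this is not the actual code generated for a booking rule):

function bookingRuleExample(bookingContext) {
    // option 1 – write values out to the browser's debug console
    console.log("Booking rule fired", bookingContext);

    // option 2 – uncomment while debugging, remove before publishing to production
    // debugger;

    // ... your booking rule validation logic goes here ...
}

// option 3 – makes this dynamically loaded file appear in the Sources tab
//# sourceURL=BookingRules.js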

This works in Chrome, Firefox and the new Chromium-based Edge browser.

D365 Portals – Custom JS

There are a few ways you can host your custom JavaScript for your D365 portal within the portal configuration.

  1. Circumvent the System Settings and allow JS files to be uploaded. By default, the blocked file extensions in System Settings prevent JS files from being uploaded to D365. If you remove the JS extension from that list you can upload the files. However, this is not ideal because it exposes your whole environment to potentially malicious JS files being uploaded.
  2. Change the file extension on your JS files. Although D365 won’t allow JS files because of the reason mentioned above, you can change the extension of the file, for example, .AXD. You are now able to attach the file to the Notes in your Web File and you can still give your web file a Partial URL ending in .JS. The JavaScript file will still be accessible as normal through the SCRIPT tag.
  3. My preferred option uses a similar technique to that used to return JSON responses from FetchXML queries in a Web Template. If we enter the JavaScript code directly in a Web Template and set the MIME Type to application/javascript, we can use that Web Template in a page, with no header/footer, to serve up our custom JavaScript (see the sketch after these steps).
    Step 1. Create the Web Template

    Step 2. Create the Page Template with no header/footer

    Step 3. Create the JavaScript ‘Page’

    Step 4. Create a regular Portal Page and embed the SCRIPT tag to request the custom JavaScript

    The result is that the JavaScript file is served from D365 and the functions run as normal.

    Obviously, if you have access to a CDN then you can upload your files there and add the necessary tags to your portal Pages, Web Templates etc.
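
As a rough sketch of steps 1 and 4 (the partial URL /custom-js and the function below are illustrative assumptions, not fixed names), the Web Template body is just plain JavaScript, with the MIME Type set to application/javascript:

// Web Template content – served to the browser via the Page created in step 3
function showPortalMessage(message) {
    alert(message);
}

On the regular Portal Page (step 4), reference it with a normal SCRIPT tag:

<script src="/custom-js"></script>
<script>
    // functions defined in the Web Template are now available on the page
    showPortalMessage("Hello from a Web Template served as JavaScript");
</script>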

D365, Flow and Twilio – Part 2

In my previous post I walked through the process of setting up your Twilio account to provide two-factor authentication for registering with a Microsoft D365 portal. In this part I will show you how to check that the verification code entered by the user is correct. This post assumes you have created a page that can be used to enter the verification code you requested in the previous post, and that the entered code is saved to the D365 Contact record.

The first step is to create a trigger to perform the verification check when the D365 Contact record is updated. We only want the verification to be performed under specific circumstances; in this case we add a condition to check that the field we are storing the verification code in has a value and that the Mobile Phone Confirmed field is currently false. For the purposes of the demo I am using the Fax field to store the verification code, but you can create your own custom field to hold the code.

[Screenshot: Step 1]

If the Contact record is being updated for some other reason then the verification is not performed and the Flow just exits. If we are doing a mobile phone verification then we need to generate a GET request to the Twilio Verify API. For this API we need to provide a couple of header values. The first is the normal Content-Type (application/json) and the important one is X-Authy-API-Key, which is available in the General Settings of your Verify project.
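
As a rough sketch (based on the Twilio Verify/Authy phone verification check API – confirm the exact endpoint and parameters against the current Twilio documentation), the request generated by the HTTP action looks something like this:

GET https://api.authy.com/protected/json/phones/verification/check
    ?phone_number={Mobile Phone from the Contact}
    &verification_code={Fax from the Contact}
    &country_code=...   (may also be required, depending on how the number is stored)

Headers:
    Content-Type: application/json
    X-Authy-API-Key: <key from the Verify project General Settings>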

[Screenshot: Step 2]

The dynamic values we pass to the API are the Mobile Phone and Fax (verification code) from the Contact record. The GET request will return a response that we parse using the Parse JSON action. To fill out the Schema box I put the request we created (above) into Postman to get the response. Selecting the Use sample payload to generate schema option, I was able to paste in the response and generate the schema.
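
For reference, the sample payload looked roughly like the following (the message text is illustrative – use the actual response from your own Postman call):

{
    "message": "Verification code is correct.",
    "success": true
}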

[Screenshot: Step 3a]

For the Content field we just use the Body of the response we receive from the HTTP GET request.

[Screenshot: Step 3]

The next step is to check the response we have received, using a Flow Condition, so we can determine whether to update the record.

[Screenshot: Step 4]

Again, using the response from Postman you can determine what key/value pairs are returned from the GET request. The screenshot below shows an example of a failure (Status: 404 Not Found), but the structure of the response is the same as for a successful response.

[Screenshot: Step 4a]

If the Condition finds success equal to true, then we can update the Contact record to mark the mobile phone number as confirmed. Click on Show advanced options to find the relevant field.

[Screenshot: Step 5]

[Screenshot: Step 5a]

We have now completed the steps of using Microsoft Flow and Twilio to confirm the mobile phone number for a Contact within the D365 Portal.

D365 Product Bundles – max number of products

By default, D365 CE has a limit of 15 Products that can be added to a Product Bundle. When you try to add a 16th you are presented with a warning message. Fortunately, this is one of the many configurable items in D365. In the System Settings, navigate to the Sales tab and change the Set maximum products in a bundle setting to a higher value.