Minimise Attacks on Your Microsoft Account

One of the most common issues internet users face is the exposure of their email address following a data breach at a website. Although companies and website owners are becoming better at defending their websites from attacks by nefarious actors, even the largest companies are not immune.

Probably the most commonly leaked piece of information is your email address, since this is not generally stored in an encrypted format. The email address you used as a username for that breached website is probably also the same one you use to sign into the account provided by the company that manages your email, e.g. Microsoft, Google or Apple. Once attackers have your email address they can try to access your account by cross-referencing other data breaches involving the same address, hoping you have reused the same email/password combination for more than one account.

For major account providers like Microsoft, Google and Apple, access to your account doesn’t just mean access to your emails (bad enough in itself) but access to a whole range of other services linked to your account. Obviously, one way to minimise attacks on your account is to turn on 2-step verification, preferably using an authenticator app. Another approach that I have found works with personal Microsoft accounts is to use Account Aliases.

Account Aliases are unique email addresses that are linked to your main account and can be used as alternative email addresses to send and receive email. The other, more useful, purpose of Account Aliases is to create an email address that is ONLY used to sign into your Microsoft account. By keeping one email address that you NEVER use as a login for any other website, you reduce the likelihood of your account being attacked.

To set up an Account Alias for your personal Microsoft account, follow these steps:

  1. Navigate to https://account.microsoft.com/profile and go to the Account Info section.
  2. Click on the Sign-in preferences link at the bottom of that section which will take you to a page to manage how you sign into Microsoft.
  3. Under the list of existing Account Aliases click on the link to Add email address.
  4. Select a unique email address for your account. The system will only allow you to add an alias if it is unique within the chosen domain, e.g. outlook.com. Remember, this email address is ONLY going to be used to log in to your Microsoft account.
  5. Once the Account Alias has been added successfully, you will be returned to the page where you manage your sign-in preferences. The next step is to change your Primary Alias, as this will be the alias you use as your username. To do this, select the Make primary link next to your newly created alias.
  6. From here, click on the link at the bottom of the page to Change sign-in preference. This page allows you to define which aliases can be used to sign into your account.
  7. Your new alias should be checked by default. Make sure any other aliases, including your original account alias, are not checked and click the Save button.
  8. Your account is now accessible using this new alias as the username. Just remember NOT to use this email address for any other websites.
  9. If you want to know whether someone has been trying to access your personal Microsoft account, you can go to the Activity page (https://account.live.com/Activity) and see when/where access to your account was recently attempted. This will show the successful attempts (hopefully by you) and unsuccessful attempts (possibly by bad people). If you monitor this page after you set up an alias, you should see the number of unsuccessful attempts go to zero.
  10. Finally, if you try logging into your Microsoft account with your original email address, you should see an error message telling you that the account is not valid. Only your new unique alias will get you into your account.

D365 – Client Api, Sub-grids & Card Forms

The D365 Client API allows us to access the rows and columns of a sub-grid on a form (see the docs for more details). The screenshot below shows an example of the sub-grid, and using the Client API we can access the data in those specific columns.

Read-only sub-grid on a form

This is a regular Read-Only grid that is displayed on a normal sized form when viewed on a device large enough to display the content. However, what happens when your device is only large enough to show a more compact view of the data, i.e. a list view using a Card form?

Card form ‘list view’ of the sub-grid

If the default Card form does not contain the same fields as the normal Read-Only grid, then your JavaScript code will not function correctly, as it will not be able to access the same ‘cells’ of data in the list view format. To overcome this problem it’s important to make sure that the Card form associated with the sub-grid contains the same fields as the Read-Only sub-grid. An important consideration here is that a Card form can only hold a limited number of fields, so we have to ensure any cells we reference in the full-size Read-Only grid are also available to us on the Card form. If you are looking for details about configuring Card forms on sub-grids, there is a good resource here.
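The kind of code in question – reading column values from the sub-grid rows – follows the documented grid object model (getControl → getGrid → getRows, then each row’s data.entity.attributes). A minimal sketch is below; the control name subgrid_assets, the column msdyn_name and the stub formContext (standing in for the real one supplied by executionContext.getFormContext()) are all invented for illustration:

```javascript
// Sketch: reading one column from a read-only sub-grid via the Client API.
// In D365 the formContext comes from executionContext.getFormContext(); here a
// minimal stub stands in so the traversal pattern can be shown end to end.
function getSubgridColumn(formContext, gridName, attributeName) {
  var values = [];
  var rows = formContext.getControl(gridName).getGrid().getRows();
  rows.forEach(function (row) {
    var attribute = row.data.entity.attributes.get(attributeName);
    if (attribute) {
      values.push(attribute.getValue());
    }
  });
  return values;
}

// --- Minimal stub of the grid object model, for illustration only ---
function makeStubFormContext(rowsData) {
  function collection(items) {
    return {
      forEach: function (cb) { items.forEach(cb); },
      get: function (name) {
        return items.find(function (a) { return a.getName() === name; }) || null;
      }
    };
  }
  var rows = rowsData.map(function (r) {
    var attrs = Object.keys(r).map(function (k) {
      return { getName: function () { return k; }, getValue: function () { return r[k]; } };
    });
    return { data: { entity: { attributes: collection(attrs) } } };
  });
  return {
    getControl: function () {
      return { getGrid: function () { return { getRows: function () { return collection(rows); } }; } };
    }
  };
}

var ctx = makeStubFormContext([
  { msdyn_name: "Asset A" },
  { msdyn_name: "Asset B" }
]);
console.log(getSubgridColumn(ctx, "subgrid_assets", "msdyn_name")); // [ 'Asset A', 'Asset B' ]
```

If the Card form is missing msdyn_name, the lookup in the Card (list view) rendering returns nothing, which is exactly the failure described above.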

BONUS TIP – after making my changes to my Card form and sub-grids I found that my D365 Solution would not import into my TEST environment. Fortunately, I am not alone in experiencing this problem and found my resolution here.

Power Apps Portals – ‘Page Not Found’ goes missing

Power Apps (D365) Portals have a standard ‘Page Not Found’ feature that displays a customisable page when the user navigates to a URL that does not exist.

Page Not Found Page

What do you do if your ‘Page Not Found’ page is not found? I recently encountered this problem in an environment where an incorrect URL was displaying the dreaded 502 Error rather than the ‘Page Not Found’ page.

Portal 502 Error

This is where the Portal Checker in the Power Apps Portal Admin Centre is your friend. Run the Portal Checker and examine the results for the following entries.

Page Not Found Site Marker Configuration

Home Site Marker Configuration

If either of these has a Warning against it, check the respective Site Marker to ensure it is pointing to the correct page. In my case the Home Site Marker had been incorrectly modified and no longer pointed to the relevant Home page. Having made the adjustment, my ‘Page Not Found’ was operational again and incorrect URLs were being routed to the right page.

Power Apps Portals – Login By Email

Power Apps Portals support multiple authentication methods. For external accounts, managed through Contact records in D365, the registration process used to require the user to provide a separate Username and Email address. Although this is still the default scenario, it is now possible to require the user to provide only their email address for sign-in.

To modify the registration process, change the Authentication/Registration/LocalLoginByEmail Site Setting from false to true. No more having to use some crafty jQuery to populate a (hidden) Username field with the contents of an email address.

Business Process Flows and hidden dependencies

When trying to delete a Business Process Flow you may get conflicting messages about the dependencies that need to be removed before the BPF can be deleted.

For example, trying to delete the BPF through Power Apps I got the following message.

zero-dependencies

When I tried to view the dependencies for the BPF, it showed zero dependencies.

zero-dependencies-3

The problem is that the BPF also creates an entity with the same name, and we have to determine which dependencies are still linked to this entity. Details for completing this task can be found here – https://support.microsoft.com/en-in/help/4527365/business-process-flow-deletion-fails-due-to-an-unknown-dependency.

D365 Portals – application/xml

I keep forgetting how to do this and can’t remember where I saw it written down, so I am writing it down now. If you need to check the fetchXML that is being executed as part of a Web Template, you can return that fetchXML in a Web Page. This can be useful where your fetchXML is generated dynamically based on request parameters, or where you just want to check the values included in your fetchXML from the request.params collection.

  1. Create the Web Template with your fetchXML query
  2. After the {% endfetchxml %} line, display the content of your fetchXML query using the XML property of your fetchXML object.
  3. Set the MIME Type of the Web Template to application/xml
  4. Create a Page Template for your Web Template, with no headers or footers.
  5. Create a Page using the Page Template so you can get the XML content

For example, my Web Template would look like this:

{% fetchxml assetQuery %}
<fetch version='1.0' output-format='xml-platform' mapping='logical' page='1' count='200'>
  <entity name='msdyn_customerasset' >
    <attribute name='msdyn_customerassetid' />
    <attribute name='msdyn_account' />
    <attribute name='msdyn_parentasset' />
    <attribute name='msdyn_name' />
        ...
           ...
              ...
  </entity>
</fetch>
{% endfetchxml %}

{{ assetQuery.xml }}
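For the dynamic case, request parameters can be dropped straight into the query via the request.params collection. In this sketch the account query-string parameter and the filter on msdyn_account are invented for illustration:

```liquid
{% fetchxml assetQuery %}
<fetch version='1.0' output-format='xml-platform' mapping='logical'>
  <entity name='msdyn_customerasset' >
    <attribute name='msdyn_customerassetid' />
    <attribute name='msdyn_name' />
    <filter type='and'>
      <condition attribute='msdyn_account' operator='eq' value="{{ request.params['account'] }}" />
    </filter>
  </entity>
</fetch>
{% endfetchxml %}

{{ assetQuery.xml }}
```

Requesting the Page with ?account=&lt;guid&gt; then returns the fully resolved XML, so you can see exactly what value was substituted into the condition.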

XrmToolbox – Portal Records Mover – Folder Structures

Version 1.2019.10.9 of the Portal Records Mover tool in the XrmToolbox has introduced an excellent new feature – the ability to export the content as a folder structure and with the option to ZIP up the content.

Portal Records Mover

Previously, the configuration would be exported into a single XML file, which made it difficult to identify what items were included in your export once you had closed down XrmToolbox and filed the configuration away in source control. With the new version, if you elect to Export as folder structure, then not only are your exports separated into different folders but the configuration records are also created as separate files, with the GUID of the configuration record as the filename.

Portal Records Mover

Portal Records Mover

The other advantage this creates is that you can now be more selective about the configuration you want to import into your target environment. The source for your import can now be the entire root folder, individual folders or individual files (ZIP or XML).

Portal Records Mover

D365 Portals : X-Content-Type-Options Header

In an earlier post I provided a few options for dealing with JavaScript code in your D365 portals. The second of those options was to modify the extension of the file attached to the Web File Note so it isn’t blocked as an attachment.

With a recent upgrade to the portal it appears that Microsoft have now closed the door on that particular option, and for good (security) reasons. They have added the X-Content-Type-Options header to the response with a value of nosniff. This means that when the browser requests a script and the response’s MIME type is not a JavaScript type, the browser blocks the file rather than trying to sniff its content, and the script is not loaded.
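For reference, this is what the relevant part of the response looks like once the header is in place; the Content-Type shown is just an example of a non-JavaScript type that would now be blocked for a SCRIPT request:

```http
HTTP/1.1 200 OK
Content-Type: application/octet-stream
X-Content-Type-Options: nosniff
```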

This means that in order to use custom JavaScript files in your D365 portals you are left with either option #1 or option #3 from my previous post, or use a CDN.

D365 – Field Service – Debugging Booking Rules

Booking Rules in D365 are generated as Web Resources (JavaScript) that show warning or error messages to the scheduler. The problem with debugging these web resources is that the files are loaded dynamically, which makes them a little harder to debug. When you have complex rules, or a large number of them, there are a few options to assist you with debugging.

  • console.log – this is the most basic option but it allows you to write out values to the debug console of the browser. This can give you some information but may not pinpoint the exact line where an issue is occurring.
  • debugger statement – this will cause your web resource to load into the dev tools of the browser and stop on the line where the statement exists. This at least allows you to continue on with debugging but you need to remember to remove it before you publish the resource into a production environment.
  • sourceURL comment – if you place the following comment at the end of your script file – after the closing curly brace – the file will then show in the Sources tab.
//# sourceURL=filename.js

You can replace filename.js with any filename you like, e.g. BookingRules.js. When you browse your Sources in the Dev Tools you will find the file and be able to put in breakpoints as required. You will need to force the booking rules to run at least once for the file to be loaded. The other good point about this technique is that you don’t need to remove the comment when the file is moved to a production environment.

This works for Chrome, Firefox and the new Chromium Edge browsers.
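Putting the pieces together, a trimmed booking-rule skeleton with the comment in place might look like this. The function name, the rule logic and the { IsValid, ErrorMessage } return shape are simplified placeholders for illustration, not the code D365 actually generates:

```javascript
// Illustrative booking-rule style web resource; the validation logic is invented.
// The sourceURL comment at the end makes this dynamically loaded file appear
// in the browser's Sources tab under the name BookingRules.js.
function validateBooking(booking) {
  if (!booking.resource) {
    return { IsValid: false, ErrorMessage: "A resource must be selected." };
  }
  return { IsValid: true, ErrorMessage: "" };
}

console.log(validateBooking({ resource: null }).IsValid); // false

//# sourceURL=BookingRules.js
```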

D365 Portals – Custom JS

There are a few ways you can host your custom JavaScript for your D365 portal within the portal configuration.

  1. Circumvent the System Settings and allow JS files to be uploaded to D365. The default system settings disallow JS files from being uploaded to D365. If you remove the JS extension from this list you can upload the files to D365. However, this is not ideal because it exposes your whole environment to malicious JS files potentially being uploaded.
  2. Change the file extension on your JS files. Although D365 won’t allow JS files because of the reason mentioned above, you can change the extension of the file, for example, .AXD. You are now able to attach the file to the Notes in your Web File and you can still give your web file a Partial URL ending in .JS. The JavaScript file will still be accessible as normal through the SCRIPT tag.
  3. My preferred option uses a similar technique to that used to return JSON responses from FetchXML queries in a Web Template. If we enter the JavaScript code directly in a Web Template and set the MIME Type to application/javascript we can use that Web Template in a page, with no header/footer, to serve up our custom JavaScript.
    Step 1. Create the Web Template

    Step 2. Create the Page Template with no header/footer

    Step 3. Create the JavaScript ‘Page’

    Step 4. Create a regular Portal Page and embed the SCRIPT tag to request the custom JavaScript

    The result is that the JavaScript file is served from D365 and the functions run as normal.
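As a minimal sketch of Step 1, the Web Template body is nothing but the JavaScript itself (the greetPortalUser function below is a placeholder), with the template’s MIME Type set to application/javascript so the portal serves it as a script:

```javascript
// Placeholder contents of the Web Template; served as application/javascript.
// Any functions defined here become available to pages that reference the
// JavaScript 'Page' via a SCRIPT tag.
function greetPortalUser(name) {
  return "Hello, " + name;
}

console.log(greetPortalUser("portal user")); // Hello, portal user
```

The page created in Step 4 then pulls this in with a SCRIPT tag whose src is the Partial URL of the ‘Page’ created in Step 3.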

    Obviously, if you have access to a CDN then you can upload your files there and add the necessary tags to your portal Pages, Web Templates etc.