D365 HR FinOps CDS Integration: Resolving the date time conversion issue

The integration between Microsoft Dynamics 365 Human Resources (Talent) and Finance & Operations (FinOps) works largely out of the box. Talent entities are synchronised with the Power Platform entities in near real time (read more about it here), and the integration from Talent to FinOps can be configured from the Power Platform admin portal (more info in the official Microsoft docs). As of April 2020 (FinOps v10.0.9), however, there is an issue that causes date fields to be formatted incorrectly in FinOps if you are not working with the en-US date format and the UTC time zone.

Date and Time in UTC

When Talent writes a date and time value to the database, it stores the data in UTC and in the en-US format. This allows users to see any date and time data relative to the time zone defined in their user options. As per the Microsoft docs (HR > Understand Date and Time fields), when using external sources such as DMF to view or author data, it is important to keep in mind that date and time values are by default considered to be in UTC, regardless of the time zone of the user's computer or the current user time zone settings.
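To see how a UTC value stored in the database maps to what a user sees, you can apply a time zone offset in X++ with the standard DateTimeUtil class. This is a minimal sketch; the time zone enum value below is only an example and the method name is the standard X++ API:

```x++
//Sketch: convert a stored UTC value to a user's local time zone (UTC+1 here)
static void showUtcOffsetExample()
{
    utcdatetime storedUtc = DateTimeUtil::newDateTime(2\3\2020, 0); //2 Mar 2020 00:00 UTC

    //Apply a UTC+1 time zone, as in the example later in this post
    utcdatetime userLocal = DateTimeUtil::applyTimeZoneOffset(
        storedUtc,
        Timezone::GMTPLUS0100AMSTERDAM_BERLIN_BERN_ROME_STOCKHOLM_VIENNA);

    info(strFmt("Stored (UTC): %1, Displayed (UTC+1): %2", storedUtc, userLocal));
}
```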

The date conversion issue

The en-US date format and UTC time zone cause a problem if you are using a date format or time zone other than en-US/UTC. The data integration between CDS and FinOps takes the UTC date and converts it to the locale format, and the FinOps DMF settings then convert it again. This double conversion is what causes the issue.

The following is an example using the en-GB date format and a UTC+1 time zone for the Job Position entity (activation and retirement date fields).

We first create job position 000729 in Human Resources with a duration of 2nd March 2020 to 7th May 2020, in UK date format.

Human Resources – Job Position 000729

The dates for job position 000729 appear one hour earlier because CDS stores them in UTC; the user therefore sees them as 1st March 23:00 to 6th May 22:59.

The problem occurs in FinOps: after the integration with CDS, the resulting date range is 4th January to 6th June 2020. Not only is there a time zone difference, but the day and month values are also swapped, which produces completely wrong dates.
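The day/month swap can be reproduced with the standard X++ str2Date function, whose second argument describes the expected positions of day (1), month (2) and year (3) in the text:

```x++
//123 = day-month-year (en-GB), 213 = month-day-year (en-US)
date ukDate = str2Date("02/03/2020", 123); //2nd March 2020
date usDate = str2Date("02/03/2020", 213); //3rd February 2020

info(strFmt("en-GB reading: %1, en-US reading: %2", ukDate, usDate));
```

The same text parsed under the two formats yields two different dates, which is exactly what happens when DMF applies a second conversion to an already-converted value.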

FinOps – Job Position 000729 showing wrong dates in UK format after integration with CDS entities

Issue resolution

Until there is an official fix for this, you can fortunately resolve it with a configuration change in the DMF data source properties. If you look at the DMFEntityWriter class, it uses the DMFDataSourceProperties table in FinOps to convert the date and time. We therefore need to tell DMF that the incoming data is in the en-US format and UTC time zone, so the second conversion is avoided.

Let’s first check which source data format the CDS integration is using. By looking at the import project’s job history you can see that the source data format used for the standard CDS to FinOps integration is CSV-Unicode.

By double-clicking on the source data format (or from the Data management workspace > Configure data source tile > select the CSV-Unicode source data format) you will open the DMF data source properties form. Change the time zone to Coordinated Universal Time and the language to En-US to avoid the second conversion from the regional settings, as shown below.

In summary, to resolve the issue:

  • Open D365 FinOps
  • Open the Data Management Workspace
  • Click on the “Configure data source” tile
  • Select and click on CSV-Unicode
  • Change the time zone to Coordinated Universal Time and the language to En-US in the Regional settings tab

Data Entity Export from code

This post explains how to implement the same data entity export functionality provided in the Data management workspace from a form or a menu item.

Below is a form dialog that I created to research how to create export data projects and run the export functionality directly from X++ code using the standard DMF classes.

Creating an export data project

The first code sample shows how I created the export data project. The definition group name can be anything you want. For the source name (the data format: EXCEL, CSV, etc.) you can use the standard lookup method DMFQuickImportExportFormHelper.buildSourceNameControlLookup(), and for the entity name there is another standard lookup, DMFUtil::lookupEntity(this).

/// <summary>
/// Create DMF Processing group to export 'Customer Groups'
/// </summary>
/// <param name = "_groupName">Definition group name</param>
/// <param name = "_sourceName">Export file type e.g. CSV</param>
/// <param name = "_entityName">Entity Name</param>
public static void createNewDefinitionGroup(DMFDefinitionGroupName _groupName, DMFSourceName _sourceName, DMFEntityName _entityName)
{
    DMFEntity dmfEntity = DMFEntity::find(_entityName);
    DMFDataSource dmfDataSource = DMFDataSource::find(_sourceName);

    if (dmfEntity && dmfDataSource)
    {
        DMFDefinitionGroupName dmfDefinitionGroupName = _groupName;
        Description description = _groupName;

        //Create the data project. The helper call that takes these arguments was
        //truncated in the original post; the commented argument list below documents
        //the options it expects.
        //                                  "", //Sample file
        //                                  "@SYS6047", //"All fields"
        //                                  NoYes::Yes, //Skip staging
        //                                  NoYes::Yes, //Export to File. Set Yes to create a definition group of operation type Export
        //                                  "", //Sheet lookup
        //                                  NoYes::No, //Truncate Entity Data
    }
}

Exporting the data definition group

To export the definition group, we use the DMFPackageExporter class after creating a new DMF definition group execution.

DMFDefinitionGroup definitionGroup = DMFDefinitionGroup::find("My demo export");

if (definitionGroup)
{
    DMFDefinitionGroupEntity    dmfDefinitionGroupEntity;
    DMFDefinitionGroupExecution dmfDefinitionGroupExecution;

    select firstOnly dmfDefinitionGroupEntity
        where dmfDefinitionGroupEntity.DefinitionGroup == definitionGroup.DefinitionGroupName;

    DMFExecutionId dmfExecutionId = DMFUtil::setupNewExecution(definitionGroup.DefinitionGroupName);
    Description description = strFmt('Execution - %1 for Definition Group - %2', dmfExecutionId, definitionGroup.DefinitionGroupName);

    //Create new dmf execution group for the new execution
    if (DMFDefinitionGroupExecution::insertOrDisplay(definitionGroup, dmfExecutionId, description, false))
    {
        dmfDefinitionGroupExecution = DMFDefinitionGroupExecution::find(definitionGroup.DefinitionGroupName, dmfDefinitionGroupEntity.Entity, dmfExecutionId);
        if (dmfDefinitionGroupExecution)
        {
            DMFExportImportDataContract dmfDataContract = new DMFExportImportDataContract();
            DMFPackageExporter packageExporter = new DMFPackageExporter();

            //Download file for the newly created dmf execution group.
            //The exportToPackageV2 call was truncated in the original post;
            //verify its full parameter list in your application version.
            packageExporter.exportToPackageV2(definitionGroup.DefinitionGroupName, dmfExecutionId, dmfDataContract);
        }
    }
}

Download file

To send the file to the user from the DMF definition group execution table, we use DMFEntityExportDetails to get the download URL.

public static void downloadFile(DMFDefinitionGroupExecution dmfDefinitionGroupExecution)
{
    DMFEntityExportDetails  exportDetails;
    DMFEntityName           entityName;
    DMFEntity               dmfEntity = DMFEntity::find(dmfDefinitionGroupExecution.Entity);

    if (dmfEntity.ParentEntityName != '')
    {
        entityName = dmfEntity.ParentEntityName;
    }
    else
    {
        entityName = dmfDefinitionGroupExecution.Entity;
    }

    select firstOnly SampleFilePath, DefinitionGroup from exportDetails
        where exportDetails.DefinitionGroup == dmfDefinitionGroupExecution.DefinitionGroup
           && exportDetails.ExecutionId == dmfDefinitionGroupExecution.ExecutionId
           && exportDetails.Entity == entityName;

    if (exportDetails.RecId && exportDetails.SampleFilePath)
    {
        str downloadUrl = DMFDataPopulation::getAzureBlobReadUrl(str2Guid(exportDetails.SampleFilePath));
        System.IO.Stream stream = File::UseFileFromURL(downloadUrl);

        File::SendFileToUser(stream, strFmt('%1-%2.%3', exportDetails.DefinitionGroup, exportDetails.Entity, DMFDataPopulation::getFileExtensionFromURL(downloadUrl)));
    }
    else
    {
        warning(strFmt("@DMF:DMFNoExportFileFoundForEntity", dmfDefinitionGroupExecution.Entity, dmfDefinitionGroupExecution.ExecutionId));
    }
}

Changing the query on the export definition group

Another useful class is DMFFormBrowser. It opens the entity query dialog and lets the user add filters. You can also add your own ranges to the query in code and then pack it and attach it to the DMF definition group execution. You would use this code before calling the exportToPackageV2 method shown previously.

container   queryData;

if (dmfDefinitionGroupEntity)
{
    queryData = DMFUtil::getDefaultQueryForEntityV3(dmfDefinitionGroupEntity.Entity, dmfDefinitionGroupEntity.DefinitionGroup);

    //Show query dialog
    queryData = DMFFormBrowser::showQuery(queryData);
}

//Update the query filter on the DMF definition group execution
if (queryData != conNull())
{
    dmfDefinitionGroupExecution.QueryData = queryData;

    if (dmfDefinitionGroupEntity)
    {
        //Save the updated query for next time
        //(select the records forUpdate and call update() inside a transaction to persist the change)
        dmfDefinitionGroupEntity.QueryData = queryData;
    }
}
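Instead of showing the dialog, you can also set a range purely in code by unpacking the container into a Query object. This is a minimal sketch; the GroupId field used in the range is only an illustrative example, not something from the original post:

```x++
//Unpack the DMF query container into a Query object and add a range in code
container queryData = DMFUtil::getDefaultQueryForEntityV3(dmfDefinitionGroupEntity.Entity, dmfDefinitionGroupEntity.DefinitionGroup);
Query     query     = new Query(queryData);

//'GroupId' is illustrative; substitute a field of the entity you are exporting
query.dataSourceNo(1)
     .addRange(fieldName2Id(query.dataSourceNo(1).table(), 'GroupId'))
     .value('10');

//Pack the modified query back so DMF can use it
dmfDefinitionGroupExecution.QueryData = query.pack();
```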

DMF Blob Temporary Storage

This section gives insight into how the Data Management Framework uses Azure Blob storage to temporarily store exported files, and includes a code sample showing how to delete these files from the Blob containers.


When you import or export a file using the Data Management Framework in Microsoft Dynamics 365 Finance and Operations (D365FO), a temporary copy of the file is created in Azure Blob storage. It is temporary because the file has an expiration date, currently hard-coded to 7 days (10,080 minutes), and is re-generated with a different GUID every time a user downloads the file.

To demonstrate this on a developer VM we will use the Azure Storage Emulator and the Azure Storage Explorer, both of which you can download for free. In the following demo we will export and download the Customer groups data entity.

Export Data Project

In this example, we create an export data project to export two entities (Customer groups and Vendor groups) in Excel format.

There are two options for exporting the data: we can either click on the download action pane button or click on the export menu item button.

Both the export menu item and the download button will create the Excel files in the blob storage “dmf” folder as shown below.

Send File to User

The download button goes a step further by packaging the files in a single zip file and sending it to the user's browser. To send the file to the user, a temporary file is created in the blob storage together with a download link (URL).

You can see the file in the temporary-file blob container of the Azure Emulator.

The download URL has an expiration time in minutes, which you can specify from System administration > Setup > System parameters > Blob link expiration timespan. If this is left at zero, a default expiration of 10 minutes is applied.

Download Link

When the export finishes executing, it creates the files in the “dmf” container but does not send them to the user, and therefore does not yet create any files in the “temporary-file” blob container. When the DMFExecutionHistoryList form opens, a “Download file” button generates the file in the “temporary-file” blob container and a download URL is provided.

The two Blob containers, “dmf” and “temporary-file”, are defined in the #DMF macro. The temporary container name is also exposed as a public constant string (AzureStorageCategory) in the FileUploadTemporaryStorageStrategy class.
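A quick way to confirm the container names in your environment is to print the macro value and the class constant (both symbols also appear in the deletion sample later in this post):

```x++
//Sketch: print the DMF blob container names
static void printDmfContainers()
{
    #DMF
    info(strFmt("Export container: %1", #DmfExportContainer)); //"dmf"
    info(strFmt("Temporary container: %1", FileUploadTemporaryStorageStrategy::AzureStorageCategory)); //"temporary-file"
}
```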

Deleting the temporary files

In a recent project we had to export files using the Data Management Framework by executing the data project export functionality from code. One of the requirements was to delete the files from the Blob containers once the file had been sent to the user's browser via the download link. Below is an extract of the code to achieve this.

Code Sample

First of all, the file identifiers that are downloaded, and therefore created in the “temporary-file” blob container, are not stored anywhere in the D365FO database: they are created on the fly by generating a new GUID every time (refer to the uploadFile method in the FileUploadTemporaryStorageStrategy class). We therefore have to create our own log table and extend the uploadFile method that generates the GUID, so that these file ids are saved in our custom table.

[ExtensionOf(classStr(FileUploadTemporaryStorageStrategy))]
public final class BFTFileUploadTemporaryStorageStrategy_Extension
{
    public FileUploadResultBase uploadFile(System.IO.Stream _stream, str _fileName, str _contentType, str _fileExtension, str _metaData)
    {
        FileUploadResultBase fileUploadResult = next uploadFile(_stream, _fileName, _contentType, _fileExtension, _metaData);

        if (fileUploadResult is FileUploadTemporaryStorageResult
            && fileUploadResult.getUploadStatus())
        {
            BFTFileUploadResult fileUpload; //Custom table to store uploaded file ids to the temporary blob
            FileUploadTemporaryStorageResult fileUploadResultTempStorage = fileUploadResult as FileUploadTemporaryStorageResult;

            fileUpload.Filename = fileUploadResultTempStorage.getFileName();
            fileUpload.FileId   = fileUploadResultTempStorage.getFileId();
            fileUpload.insert();
        }

        return fileUploadResult;
    }
}

The next step is to create a class (which can be executed nightly in batch) that deletes these temporary files. In the example below, a runnable class loops over all the files in both the “dmf” and “temporary-file” containers that were exported and/or downloaded.

using Microsoft.DynamicsOnline.Infrastructure.Components.SharedServiceUnitStorage;

class DeleteDMFBlobFiles
{
    public static void main(Args _args)
    {
        #DMF
        var blobStorageService = new SharedServiceUnitStorage(SharedServiceUnitStorage::GetDefaultStorageContext());
        str azureStorageCategory = #DmfExportContainer;
        DMFEntityExportDetails entityExportDetails;
        BFTFileUploadResult bftFileUploadResultTemp;

        //Delete the files created in the "dmf" container by the export
        while select entityExportDetails
        {
            if (entityExportDetails.SampleFilePath)
            {
                blobStorageService.DeleteData(entityExportDetails.SampleFilePath, azureStorageCategory);
            }
            if (entityExportDetails.PackageFilePath)
            {
                blobStorageService.DeleteData(entityExportDetails.PackageFilePath, FileUploadTemporaryStorageStrategy::AzureStorageCategory);
            }
        }

        //Delete the downloaded copies logged in our custom table ("temporary-file" container)
        while select bftFileUploadResultTemp
        {
            blobStorageService.DeleteData(bftFileUploadResultTemp.FileId, FileUploadTemporaryStorageStrategy::AzureStorageCategory);
        }
    }
}

Article written for Bluefort Malta