D365 HR FinOps CDS Integration: Resolving a date time conversion issue

The integration between Microsoft Dynamics 365 Human Resources (Talent) and Finance & Operations (FinOps) is pretty much out of the box. Talent entities are instantly synced with the Power Platform entities, and the integration from Talent to FinOps can be configured from the Power Platform admin portal (more info in the official Microsoft docs). As of April 2020 (FinOps v10.0.9), however, there is an issue that causes date fields to be formatted incorrectly in FinOps if you are not working with the en-US date format and the UTC time zone.

Date and Time in UTC

When Talent writes a date and time value to the database, it stores the data in UTC and in the en-US format. This allows users to see any date and time data relative to the time zone defined in their user options. As the Microsoft docs note (HR > Understand Date and Time fields), when using external sources such as DMF to view or author data, it is important to keep in mind that date and time values are by default considered to be in UTC, regardless of the time zone of the user's computer or their current user time zone settings.
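To make this concrete, here is a minimal X++ sketch (using the standard DateTimeUtil API) of how a stored UTC value relates to what a user sees in their own time zone:

// The stored value is always UTC; the client shifts it for display.
utcdatetime storedUtc = DateTimeUtil::utcNow();

// Shift the UTC value to the time zone from the user's options,
// which is what the user sees on screen.
utcdatetime userView = DateTimeUtil::applyTimeZoneOffset(
    storedUtc,
    DateTimeUtil::getUserPreferredTimeZone());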

The date conversion issue

The en-US date format and UTC time zone cause an issue if you are using a date format and time zone other than en-US/UTC. The data integration between CDS and FinOps takes the UTC date and converts it to the locale format, and the FinOps DMF settings then convert it again. This double conversion causes the issue.
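To see the day/month swap in isolation, here is a minimal X++ sketch (a hypothetical runnable class) that parses the same text with two different date sequences using the standard str2date function:

class DateSequenceDemo
{
    public static void main(Args _args)
    {
        str rawDate = '02/03/2020'; // 2nd March 2020 in en-GB format

        date dmy = str2date(rawDate, 123); // day-month-year: 2nd March 2020
        date mdy = str2date(rawDate, 213); // month-day-year: 3rd February 2020

        info(strFmt("Parsed as DMY: %1 - parsed as MDY: %2", dmy, mdy));
    }
}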

The following is an example using the en-GB date format and UTC+1 time zone for the Job Position entity (activation and retirement date fields).

We first create job position 000729 in Human Resources with a duration of 2nd March 2020 to 7th May 2020 in UK date format.

Human Resources – Job Position 000729

The dates for job position 000729 appear 1 hour earlier due to the UTC time zone in CDS, so the user sees them as 1st March 23:00 to 6th May 22:59.

The problem occurs in FinOps, because the resulting date range after the integration with CDS is 4th January to 6th June 2020. Not only is there a time zone difference, but the day and month values are also swapped, which produces completely wrong dates.

FinOps – Job Position 000729 showing wrong dates in UK format after integration with CDS entities

Issue resolution

Until there is an official fix, you can fortunately resolve this with a configuration in the DMF data source properties. If you look at the DMFEntityWriter class, it uses the DMFDataSourceProperties table in FinOps to convert the date time. We therefore need to tell DMF that the source date format is en-US with the GMT time zone, to avoid the second conversion.

Let's first check which source data format the CDS integration is using. By looking at the import project's job history, you can see that the source data format used for the standard CDS to FinOps integration is CSV-Unicode.

Double-click the source data format (or go to Data Management workspace > Configure Data Source tile > select the CSV-Unicode source data format) to open the DMF data source properties form. Change the time zone to Coordinated Universal Time and the language to En-US to avoid the second conversion from regional settings, as shown below.

In summary, to resolve the issue:

  • Open D365 FinOps
  • Open the Data Management Workspace
  • Click on the “Configure data source” tile
  • Select and click on CSV-Unicode
  • Change the time zone to Coordinated Universal Time and the language to En-US in the Regional settings tab
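If you prefer to apply the same setting through code (for example as part of a deployment script), the following is a minimal, untested sketch. The DMFDataSourceProperties table is the one mentioned above, but the field names SourceName, LanguageLocale and TimeZone are assumptions based on the form labels, so verify them in the AOT before using this:

// Sketch only: the field names below are assumed from the form labels
// and must be verified in the AOT.
class SetCsvUnicodeRegionalSettings
{
    public static void main(Args _args)
    {
        DMFDataSourceProperties dataSourceProperties;

        ttsbegin;
        select forupdate dataSourceProperties
            where dataSourceProperties.SourceName == 'CSV-Unicode'; // assumed field name

        if (dataSourceProperties)
        {
            dataSourceProperties.LanguageLocale = 'En-US'; // assumed field name
            dataSourceProperties.TimeZone = Timezone::GMT_COORDINATEDUNIVERSALTIME; // assumed field name
            dataSourceProperties.update();
        }
        ttscommit;
    }
}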

Split purchase order line delivery schedules in X++ code

The schedule lines functionality of the Purchase Order details form is needed when the quantity received from the supplier (vendor) is less than the quantity ordered on the purchase line. This is normally due to multiple shipments, so you need to split the original line in two: (i) the actual received quantity and (ii) the remaining quantity to be received with the next shipment(s). You might need to replicate this functionality in X++ code, for example in a D365 custom interface that receives items from a 3PL. The following post explains how you can easily achieve this.

The Delivery Schedule form can be opened from the Purchase Order details form in Line view, by clicking "Purchase Order Line" > "Delivery Schedule" in the lines grid action pane. If you look at the code of the form that is opened, PurchDeliverySchedule, there are a number of methods being called in the init, active, create, and write methods of the data source. The form actually creates two temporary purchase lines and only splits the original line in the closeOk method of the form.

If you try to replicate the code, it might be a bumpy ride until you get it right. What really helped me was discovering that there is an ATL (Acceptance test library) class, AtlCommandPurchaseOrderDeliverySchedule, that already has this functionality in X++ code. You will find this class from D365 FO version 10.0.2 onwards. Below are some extracts of this code, including some additional code and comments that I added myself.

// First create an instance of the PurchTableForm_DeliverySchedule class
// using the original purchase line record.
PurchTableForm_DeliverySchedule purchTableForm_DeliverySchedule = new PurchTableForm_DeliverySchedule(originalPurchLine);

// Then add the two lines (or as many as you need) with the different
// quantities for the various shipments.
// This method is in class AtlCommandPurchaseOrderDeliverySchedule.
public final AtlCommandPurchaseOrderDeliverySchedule addDeliveryScheduleLine(
        PurchQty    _qty = 1,
        TransDate   _confirmDate = dateNull(),
        TransDate   _deliveryDate = DateTimeUtil::getToday(DateTimeUtil::getUserPreferredTimeZone()))
{
        tmpPurchaseLine.setTmp();

        if (isFirstScheduleLineFlag)
        {
            purchTableForm_DeliverySchedule.purchLine_Init(tmpPurchaseLine);
            isFirstScheduleLineFlag = false;
        }
        else
        {
            tmpPurchaseLine.clear();
            purchTableForm_DeliverySchedule.purchLine_InitValue(tmpPurchaseLine);
        }
        purchTableForm_DeliverySchedule.interCompanySetLineAccess(tmpPurchaseLine);
        purchTableForm_DeliverySchedule.directDeliverySetLineAccess(tmpPurchaseLine);

        purchTableForm_DeliverySchedule.purchLine_CreatePreSuper(tmpPurchaseLine);

        tmpPurchaseLine.PurchQty = _qty;
        PurchLine::modifyPurchQty(tmpPurchaseLine, tmpPurchaseLine.inventDim());

        var inputContractPre = PurchLineWritePreSuperInputContract::construct();
        var inputContractPost = PurchLineWritePostSuperInputContract::construct();
        inputContractPre.parmPurchLine(tmpPurchaseLine);
        purchTableForm_DeliverySchedule.purchLine_WritePreSuper(inputContractPre);

        if (tmpPurchaseLine.RecId)
        {
            tmpPurchaseLine.doUpdate();
        }
        else
        {
            var numberSeq = NumberSeq::newGetNum(InventParameters::numRefInventTransId());
            tmpPurchaseLine.InventTransId = numberSeq.num();
            tmpPurchaseLine.sourceDocumentLine = this.getNextSouceDocumentNumber();

            tmpPurchaseLine.doInsert();
        }

        inputContractPost.parmPurchLine(tmpPurchaseLine);
        purchTableForm_DeliverySchedule.purchLine_WritePostSuper(inputContractPost);

        return this;
}
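For example, assuming the original line ordered a quantity of 10 and the vendor shipped 6, you could add the two schedule lines like this (hypothetical quantities and dates, with atlCommand being an instance of the class above):

// Split the ordered quantity of 10 into 6 received and 4 outstanding.
atlCommand
    .addDeliveryScheduleLine(6, dateNull(), today())      // received shipment
    .addDeliveryScheduleLine(4, dateNull(), today() + 7); // expected next week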

// Finally, execute the method to create the new lines and update the original line.
// 1. First add the temporary lines to a list
List scheduleLines = new List(Types::Record);

while select tmpPurchaseLine
{
        scheduleLines.addEnd(tmpPurchaseLine);
}

// 2. Set the parameters and execute updatePurchLineTable on purchTableForm_DeliverySchedule

purchTableForm_DeliverySchedule.parmScheduleLines(scheduleLines);
purchTableForm_DeliverySchedule.parmMarkupConversionMode(DlvScheduleMarkupConversionMode::Copy);
purchTableForm_DeliverySchedule.updatePurchLineTable(false);

When the process runs successfully, you will see the original line together with the two new lines. The original line will have an inventory deliver remainder quantity of zero.
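If you want to assert this in code, a simple check on the deliver remainder could look like the following sketch (assuming PurchLine::findRecId is available in your version; RemainInventPhysical is the field behind the "Deliver remainder" inventory quantity):

// Re-read the original line and verify the split zeroed its remainder.
PurchLine updatedLine = PurchLine::findRecId(originalPurchLine.RecId);

if (updatedLine.RemainInventPhysical != 0)
{
    throw error("The delivery schedule split did not zero the deliver remainder.");
}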

DMF Blob Temporary Storage

Insight into how the Data Management Framework uses Azure Blob storage to temporarily store exported files, including a code sample that deletes these files from the Blob containers.

Background

When you import or export a file using the Data Management Framework in Microsoft Dynamics 365 Finance and Operations (D365FO), a temporary copy of the file is created in Azure Blob storage. It's temporary because the file has an expiration date, currently hard-coded to 7 days (10,080 minutes), and it is re-generated with a different GUID every time a user downloads the file.

To demonstrate this on a developer VM, we will use the Azure Storage Emulator and the Azure Storage Explorer, both of which you can download for free. In the following demo, we will export and download the Customer groups data entity.

Export Data Project

In this example, we create an export data project to export two entities (Customer groups and Vendor groups) in Excel format.

There are two options for exporting the data: we can either click the download button in the action pane or click the export menu item button.

Both the export menu item and the download button create the Excel files in the blob storage "dmf" container, as shown below.

Send File to User

The download button takes this a step further by packaging the files into a single zip file and sending it to the user's browser. To send the file to the user, a temporary file is created in the blob storage with a download link (URL).

Example download URL = http://127.0.0.1:10000/devstoreaccount1/temporary-file/%7BACCE3CFE-3EC5-431A-9245-003F298D3C01%7D/Export%20Demo_525CDCF8-5EE9-4D87-89E2-49F9023BA96F_DMFPackage.zip?sv=2014-02-14&sr=b&sig=2unM4CQ8jDUjQ5QWHV0vGsx9OJBHnLa22ekqUj4ZXEA%3D&st=2019-09-26T12%3A54%3A51Z&se=2019-09-26T13%3A59%3A51Z&sp=r

You can see the file in the temporary-file blob container of the Azure Emulator.

The download URL has an expiration time in minutes, which you can specify from System Administration > Setup > System parameters > Blob link expiration timespan. If this is left at zero, a default expiration of 10 minutes is applied.

Download Link

When the export finishes executing, it creates the files in the "dmf" storage but does not send them to the user, and therefore does not create the files in the "temporary-file" blob container yet. When the DMFExecutionHistoryList form opens, you will have a "Download file" button that generates the file in the "temporary-file" blob container and provides a download URL.

The two Blob containers “dmf” and “temporary-file” can be found in the #DMF macro as:

#define.DmfExportContainer('dmf')

#define.TemporaryStorageContainer('temporary-file')

The temporary container is also a public constant string in the FileUploadTemporaryStorageStrategy class.
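This means that in code you can reference both container names without hard-coding the strings, for example:

// Fragment: reference the two DMF container names from their definitions.
#DMF
str exportContainer = #DmfExportContainer; // the 'dmf' container
str temporaryContainer = FileUploadTemporaryStorageStrategy::AzureStorageCategory; // 'temporary-file'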

Deleting the temporary files

In a recent project, we had to export files using the Data Management Framework by executing the data project export functionality from code. One of the requirements was to delete the files from the Blob container once the file had been sent to the user's browser via the download link. Below is an extract of the code to achieve this.

Code Sample

First of all, the identifiers of downloaded files, which are created in the "temporary-file" blob container, are not stored anywhere in the D365FO database, because they are generated on the fly with a new GUID every time (refer to the uploadFile method in the FileUploadTemporaryStorageStrategy class). We therefore have to create our own log table and extend the uploadFile method that generates the GUID, so that these file ids are saved in our custom table.

[ExtensionOf(classStr(FileUploadTemporaryStorageStrategy))]
public final class BFTFileUploadTemporaryStorageStrategy_Extension
{
    public FileUploadResultBase uploadFile(System.IO.Stream _stream, str _fileName, str _contentType, str _fileExtension, str _metaData)
    {
        FileUploadResultBase fileUploadResult = next uploadFile(_stream, _fileName, _contentType, _fileExtension, _metaData);
        
        if (fileUploadResult is FileUploadTemporaryStorageResult
            && fileUploadResult.getUploadStatus())
        {
            BFTFileUploadResult fileUpload; //Custom table to store uploaded file ids to the temporary blob
            FileUploadTemporaryStorageResult fileUploadResultTempStorage = fileUploadResult as FileUploadTemporaryStorageResult;
            fileUpload.Filename     = fileUploadResultTempStorage.getFileName();
            fileUpload.FileId       = fileUploadResultTempStorage.getFileId();
            fileUpload.insert();
        }
        return fileUploadResult;
    }
}

The next step is to create a class (which can be executed nightly in batch) that deletes these temporary files. In the example below, a runnable class loops over all files in both the "dmf" and "temporary-file" containers that were exported and/or downloaded.

using Microsoft.DynamicsOnline.Infrastructure.Components.SharedServiceUnitStorage;
class DeleteDMFBlobFiles
{        
    public static void main(Args _args)
    {
        #DMF        
        // Blob storage service used to delete files from the DMF containers
        var blobStorageService = new SharedServiceUnitStorage(SharedServiceUnitStorage::GetDefaultStorageContext());
        str azureStorageCategory = #DmfExportContainer; // the 'dmf' container
        DMFEntityExportDetails entityExportDetails;

        // Delete the export files referenced in the DMF export details
        while select entityExportDetails
        {
            if (entityExportDetails.SampleFilePath)
            {
                blobStorageService.deletedata(entityExportDetails.SampleFilePath, azureStorageCategory);
            }
                
            if (entityExportDetails.PackageFilePath)
            {
                blobStorageService.deletedata(entityExportDetails.PackageFilePath, FileUploadTemporaryStorageStrategy::AzureStorageCategory);
            }
        }
        // Delete the downloaded files logged by our uploadFile extension
        BFTFileUploadResult bftFileUploadResultTemp;
        while select bftFileUploadResultTemp
        {
            blobStorageService.deletedata(bftFileUploadResultTemp.FileId, FileUploadTemporaryStorageStrategy::AzureStorageCategory);
        }
    }
}

Article written for Bluefort Malta