Split purchase order line delivery schedules in X++ code

The delivery schedule functionality of the Purchase order details form is needed when the quantity received from the supplier (vendor) is less than the quantity ordered on the purchase line. This is normally due to multiple shipments, so you need to split the original line in two: (i) the quantity actually received and (ii) the remaining quantity to be received with the next shipment(s). You might need to replicate this functionality in X++ code, for example in a D365 custom interface that receives items from a 3PL (third-party logistics provider). The following post explains how you can achieve this.

The Delivery Schedule form is opened from the Purchase Order Details form in Line view: click "Purchase Order Line" > "Delivery Schedule" in the lines grid action pane. If you look at the code of the form that opens (PurchDeliverySchedule), there are a number of methods being called in the init, active, create and write methods of the data source. The form actually creates two temporary purchase lines and only splits the original line in the closeOk method of the form.

If you try to replicate the code it might be a bumpy ride until you get it right. What really helped me was discovering that there is an ATL (Acceptance test library) class, AtlCommandPurchaseOrderDeliverySchedule, that already has this functionality in X++ code. You will find this class from D365 FO version 10.0.2 onwards. Below are some extracts of this code, including some additional code and comments that I added myself.

// First create an instance of the PurchTableForm_DeliverySchedule class
// using the original purchase line record.
PurchTableForm_DeliverySchedule purchTableForm_DeliverySchedule = new PurchTableForm_DeliverySchedule(originalPurchLine);

// Then add the two lines (or as many as you need) with the different quantity
// for the various shipments.
// This method is in class AtlCommandPurchaseOrderDeliverySchedule (extract).
public final AtlCommandPurchaseOrderDeliverySchedule addDeliveryScheduleLine(
        PurchQty    _qty = 1,
        TransDate   _confirmDate = dateNull(),
        TransDate   _deliveryDate = DateTimeUtil::getToday(DateTimeUtil::getUserPreferredTimeZone()))
{
    if (isFirstScheduleLineFlag)
    {
        isFirstScheduleLineFlag = false;
        // ...
    }

    tmpPurchaseLine.PurchQty = _qty;
    PurchLine::modifyPurchQty(tmpPurchaseLine, tmpPurchaseLine.inventDim());

    var inputContractPre  = PurchLineWritePreSuperInputContract::construct();
    var inputContractPost = PurchLineWritePostSuperInputContract::construct();

    if (tmpPurchaseLine.RecId)
    {
        var numberSeq = NumberSeq::newGetNum(InventParameters::numRefInventTransId());
        tmpPurchaseLine.InventTransId = numberSeq.num();
        tmpPurchaseLine.sourceDocumentLine = this.getNextSouceDocumentNumber();
        // ...
    }

    return this;
}

// Finally execute the method to create the new lines and update the original line.

// 1. First add the temporary lines to a list
List scheduleLines = new List(Types::Record);

while select tmpPurchaseLine
{
    scheduleLines.addEnd(tmpPurchaseLine);
}

// 2. Set the parameters and execute updatePurchLineTable on purchTableForm_DeliverySchedule

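The extract stops short of the final call. As a hedged sketch only — the parameter method name below is an assumption, so check the actual signature of updatePurchLineTable in PurchTableForm_DeliverySchedule for your version:

```xpp
// Hedged sketch: hand the schedule lines over and run the split.
// parmScheduleLines() is a hypothetical method name; verify the real API.
ttsbegin;
purchTableForm_DeliverySchedule.parmScheduleLines(scheduleLines); // assumption
purchTableForm_DeliverySchedule.updatePurchLineTable();
ttscommit;
```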

When the process runs successfully you will see the original line together with the two new lines. The original line will have an inventory delivery remainder quantity of zero.

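If you want to verify the split from code, something along these lines should work; RemainInventPhysical holds the deliver remainder of the line, and origInventTransId is assumed to be a variable holding the lot ID of the original purchase line:

```xpp
// Sanity check after the split: the original line's deliver remainder is zero.
PurchLine originalLine = PurchLine::findInventTransId(origInventTransId); // origInventTransId: assumed variable
if (originalLine && originalLine.RemainInventPhysical == 0)
{
    info("Original purchase line split successfully.");
}
```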
DMF Blob Temporary Storage

Insight into how the Data Management Framework uses Azure Blob storage to temporarily store exported files, including a code sample on how to delete these files from the Blob containers.


When you import or export a file using the Data Management Framework in Microsoft Dynamics 365 Finance and Operations (D365FO), a temporary copy of the file is created in Azure Blob storage. It is temporary because the file has an expiration date, currently hard-coded to 7 days (10080 minutes), and it is re-generated with a different GUID every time a user downloads the file.

To demonstrate this on a developer VM we will use the Azure Storage Emulator and the Azure Storage Explorer, both of which you can download for free. In the following demo we will export and download the Customer groups data entity.

Export Data Project

In this example, we create an export data project to export two entities (Customer groups and Vendor groups) in Excel format.

There are two options for exporting the data: we can either click the download button on the action pane or click the export menu item button.

Both the export menu item and the download button will create the Excel files in the blob storage “dmf” folder as shown below.

Send File to User

The download button goes a step further by packaging the files into a single zip file and sending it to the user's browser. To send the file to the user, a temporary file is created in the blob storage together with a download link (URL).

Example download URL =

You can see the file in the temporary-file blob container of the Azure Emulator.
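The same send-to-browser mechanism can be triggered from X++ with the File class; a minimal sketch (the stream content here is a placeholder):

```xpp
// Sending a stream to the user creates the file in the temporary-file
// container and serves it to the browser via an expiring download URL.
System.IO.MemoryStream stream = new System.IO.MemoryStream();
// ... write the file content to the stream (placeholder) ...
stream.Seek(0, System.IO.SeekOrigin::Begin);
File::SendFileToUser(stream, 'CustomerGroups.xlsx');
```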

The download URL has an expiration time in minutes, which you can specify from System Administration > Setup > System parameters > Blob link expiration timespan. If this is left at zero, a default expiration of 10 minutes is applied.

Download Link

When the export finishes executing, it creates the files in the "dmf" storage but does not send them to the user, and therefore does not create the files in the "temporary-file" blob container yet. When the DMFExecutionHistoryList form opens, you will have a "Download file" button that generates the file in the "temporary-file" blob container and provides a download URL.

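From code, the equivalent of the "Download file" step is to push a stream to the temporary store and get back the expiring URL; a minimal sketch (the stream content is a placeholder):

```xpp
// Upload a stream to the temporary-file container and obtain the
// time-limited download URL.
System.IO.MemoryStream stream = new System.IO.MemoryStream();
// ... write the package content to the stream (placeholder) ...
stream.Seek(0, System.IO.SeekOrigin::Begin);
str downloadUrl = File::SendFileToTempStore(stream, 'ExportPackage.zip');
info(downloadUrl);
```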
The "dmf" container name is defined in the #DMF macro (the #DmfExportContainer definition used in the code sample below), while the "temporary-file" container name is a public constant string (AzureStorageCategory) in the FileUploadTemporaryStorageStrategy class.

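For reference, both container names can be resolved from code; the literal values in the comments reflect what is described above:

```xpp
#DMF
str exportContainer = #DmfExportContainer;                                      // "dmf"
str tempContainer   = FileUploadTemporaryStorageStrategy::AzureStorageCategory; // "temporary-file"
```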
Deleting the temporary files

In a recent project we had to export files using the Data Management Framework by executing the data project export functionality from code. One of the requirements was to delete the files from the Blob container once the file was sent to the user browser via the download link. Below is an extract of the code to achieve this.

Code Sample

First of all, the file identifiers that are downloaded (and therefore created in the "temporary-file" blob container) are not stored anywhere in the D365FO database, because they are created on the fly by generating a new GUID every time (refer to method uploadFile in class FileUploadTemporaryStorageStrategy). We therefore have to create our own log table and extend the uploadFile method that generates the GUID, to save these file ids in our custom table.

[ExtensionOf(classStr(FileUploadTemporaryStorageStrategy))]
public final class BFTFileUploadTemporaryStorageStrategy_Extension
{
    public FileUploadResultBase uploadFile(System.IO.Stream _stream, str _fileName, str _contentType, str _fileExtension, str _metaData)
    {
        FileUploadResultBase fileUploadResult = next uploadFile(_stream, _fileName, _contentType, _fileExtension, _metaData);

        if (fileUploadResult is FileUploadTemporaryStorageResult
            && fileUploadResult.getUploadStatus())
        {
            BFTFileUploadResult fileUpload; // Custom table to store uploaded file ids to the temporary blob
            FileUploadTemporaryStorageResult fileUploadResultTempStorage = fileUploadResult as FileUploadTemporaryStorageResult;

            fileUpload.Filename = fileUploadResultTempStorage.getFileName();
            fileUpload.FileId   = fileUploadResultTempStorage.getFileId();
            fileUpload.insert(); // persist the file id so it can be deleted later
        }

        return fileUploadResult;
    }
}

The next step is to create a class (which can be executed nightly in batch) that deletes these temporary files. In the example below we have a runnable class that loops over all files in both the "dmf" and "temporary-file" containers that were exported and/or downloaded.

using Microsoft.DynamicsOnline.Infrastructure.Components.SharedServiceUnitStorage;

class DeleteDMFBlobFiles
{
    #DMF
    public static void main(Args _args)
    {
        var blobStorageService = new SharedServiceUnitStorage(SharedServiceUnitStorage::GetDefaultStorageContext());
        str azureStorageCategory = #DmfExportContainer;
        DMFEntityExportDetails entityExportDetails;
        BFTFileUploadResult bftFileUploadResultTemp;

        // Delete the exported files from the "dmf" container
        while select entityExportDetails
        {
            if (entityExportDetails.SampleFilePath)
            {
                blobStorageService.deletedata(entityExportDetails.SampleFilePath, azureStorageCategory);
            }
            if (entityExportDetails.PackageFilePath)
            {
                blobStorageService.deletedata(entityExportDetails.PackageFilePath, FileUploadTemporaryStorageStrategy::AzureStorageCategory);
            }
        }

        // Delete the downloaded files (logged by our uploadFile extension) from the "temporary-file" container
        while select bftFileUploadResultTemp
        {
            blobStorageService.deletedata(bftFileUploadResultTemp.FileId, FileUploadTemporaryStorageStrategy::AzureStorageCategory);
        }
    }
}

Article written for Bluefort Malta

AX2012 command cheat sheet


Stop the AOS service, then from CMD (Run as administrator) change directory to "[AOS Name]\bin" and execute:

axbuild.exe xppcompileall /s=01 /altbin="C:\Program Files (x86)\Microsoft Dynamics AX\60\Client\Bin"

/s = the AOS server instance number from the Server Configuration utility (e.g. 01)

Refer to: https://docs.microsoft.com/en-us/dynamicsax-2012/developer/axbuild-exe-for-parallel-compile-on-aos-of-x-to-p-code


On AOS server change directory to C:\Program Files\Microsoft Dynamics AX\60\ManagementUtilities.

//List Models
axutil list /config:

//Export ModelStore
axutil exportstore /file: [/verbose] /config:

//Export Model
axutil export /model: /file:

//View Model AOT elements
axutil view /model:

//Import modelstore
axutil importstore /file:"specify the location from where you need to import the file" /config:

//Import model
axutil import /file:ModelName.axmodel

//View model properties
axutil manifest /model: /xml
axutil manifest /file:

Keyboard shortcuts

  • Filter Grid: Shift + G
  • Filter Column: Shift + K
  • Filter selected cell: Alt + F3
  • Remove filters: Ctrl + Shift + F3
  • Edit form: Ctrl + Shift + E
  • Development Workspace: Ctrl + D
  • Element Open: Ctrl + O