F&O development with multiple version control workspaces

The usual setup of version control for F&O (the one documented by Microsoft) is to use Team Foundation Version Control and configure the workspace mapping like this:

$/MyProject/Trunk/Main/Metadata → k:\AosService\PackagesLocalDirectory
$/MyProject/Trunk/Main/Projects → c:\Users\Admin123456789\Documents\Visual Studio 2019\Projects

This means that PackagesLocalDirectory contains both standard packages (such as ApplicationSuite) and custom packages, i.e. your own development and any ISV solutions you’ve installed.

This works fine if you always want to work with just a single version of your application, but the need to switch to a different branch or a different version is quite common. For example:

  • You merged code to a test branch and want to compile it and test it.
  • You found a bug in production or a test environment and you want to debug code of that version, or even make a fix.
  • As a VAR, you want to develop code for several clients on the same DEV box.

It’s possible to change the workspace to use a different version of code, but it’s a lot of work. For example, let’s say you want to switch from the Dev branch to Test to debug something. You have to:

  1. Suspend pending changes
  2. Delete workspace mapping
  3. Ideally, delete all custom files from PackagesLocalDirectory (otherwise you later have to deal with extra files that exist in the Dev branch but not in Test)
  4. Create a new workspace mapping
  5. Get latest code
  6. Compile custom packages (because source control contains just source code, not runnable binaries)

When you’re done and want to continue with the Dev branch, you need to repeat all the steps to switch back.

This is very inflexible and time-consuming, and it doesn’t allow you to have multiple sets of pending changes. Therefore I use a different approach: I utilize multiple workspaces and switch between them.

I never put any custom packages into PackagesLocalDirectory. Instead, I create a separate folder for my workspace, e.g. k:\Repo\Dev. Then PackagesLocalDirectory contains only code from Microsoft, the k:\Repo\Dev folder contains code from the Dev branch, k:\Repo\Test contains code from the Test branch and so on.

Each workspace has not just its own version of the application, but also its own list of pending changes. It also contains binaries: I need to compile the code once when creating the workspace, but not again when switching between workspaces.

F&O doesn’t have to always use PackagesLocalDirectory, and Visual Studio doesn’t have to keep projects in the default location (such as Documents\Visual Studio 2019\Projects). The paths can be changed in configuration files. Therefore if I want to use, say, the Test workspace, I tell F&O to take packages from k:\Repo\Test\Metadata and Visual Studio to use k:\Repo\Test\Projects for projects. Doing this manually would be time-consuming and error-prone, therefore I do it with a script (see more about the scripts at the end).
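
My scripts are linked at the end of this post, but to illustrate the idea for the F&O side: on a usual one-box DEV environment, the AOS reads the package location from its web.config, so the switching part may boil down to something like this (a simplified sketch; the key names and paths are what I’d expect on such a box, so verify them on yours; the Visual Studio projects folder is changed separately in its own settings):

# Point the AOS at a different metadata folder (simplified sketch)
$webConfig = 'k:\AosService\WebRoot\web.config'
$newFolder = 'k:\Repo\Test\Metadata'

$xml = [xml](Get-Content $webConfig)
foreach ($key in 'Aos.MetadataDirectory', 'Aos.PackageDirectory')
{
    $node = $xml.SelectSingleNode("//add[@key='$key']")
    $node.value = $newFolder
}
$xml.Save($webConfig)

iisreset  # restart IIS so the AOS reloads its configuration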

If my workspace folder contains just custom packages, I can use it for things like code merge, but I can’t run the application or even open Application Explorer, because custom code depends on standard code, which isn’t there. I could copy the standard packages to each workspace folder, but I like the separation; copying would also require a lot of time and disk space, and I would have to update each folder after every update of the standard application.

For example, I could take k:\AosService\PackagesLocalDirectory\ApplicationPlatform and copy it to k:\Repo\Dev\Metadata\ApplicationPlatform. But I can achieve the same goal by creating a symbolic link to the folder in PackagesLocalDirectory. Of course, no one wants to add 150 symbolic links manually, but a simple script can iterate over the folders and create a symbolic link for each of them.
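
For illustration, the core of such a script can look like this (a sketch with hard-coded example paths, not my actual implementation):

# Link every standard package from PackagesLocalDirectory into the workspace.
# Run in an elevated session; creating symbolic links requires admin rights.
$source = 'k:\AosService\PackagesLocalDirectory'
$target = 'k:\Repo\Dev\Metadata'

Get-ChildItem $source -Directory | ForEach-Object {
    $link = Join-Path $target $_.Name
    # Skip folders that already exist in the workspace (custom packages)
    if (-not (Test-Path $link))
    {
        New-Item -ItemType SymbolicLink -Path $link -Target $_.FullName | Out-Null
    }
}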

A few more remarks

  • We’re usually interested in the single workspace used by F&O, but note that some actions can be done without switching workspaces. For example, we can use Get Latest to download the latest code from source control (it’s even better if the folder isn’t used by F&O/VS, because no files are locked), merge code and commit changes, or even build the application from the command line.
  • If you use Git, branch switching is easier, but you’ll still likely want to keep standard and custom packages in separate folders.
  • Before applying an application update from Microsoft, it’s better to tell F&O to use PackagesLocalDirectory. If you don’t, and the update contains a new package, the package will be created in the active workspace and the other workspaces won’t see it. You’d have to identify the problem and move the new package to PackagesLocalDirectory.
  • If a new package is added, you’ll also need to regenerate symbolic links for your workspaces.
  • You can have multiple workspaces for a single branch. For example, I use a separate workspace for code reviews, so I don’t have to suspend the development I’m working on.

Scripts

The scripts are available on GitHub: github.com/goshoom/d365fo-workspace. Use them as a base or inspiration for your own scripts; don’t expect them to cover all possible ways of working.

When you create a new workspace folder, open it in PowerShell (with elevated permissions) and run Add-FOPackageSymLinks. Then you can tell F&O to use this folder by running Switch-FOWorkspace. If you want to see which folder is currently in use, call Get-FOWorkspace.
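
Put together, a session for a new Test workspace might look like this (I’m assuming default parameters; see the documentation comments mentioned below for details):

# In an elevated PowerShell session with the module imported:
cd k:\Repo\Test            # the new workspace folder
Add-FOPackageSymLinks      # create symbolic links to the standard packages
Switch-FOWorkspace         # tell F&O (and Visual Studio) to use this workspace
Get-FOWorkspace            # show which workspace is currently active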

You can also see documentation comments, and the actual implementation, inside D365FOWorkspace.psm1.

Acceptance Test Library

Acceptance Test Library (ATL) in F&O isn’t a new feature, but many people aren’t aware of it, therefore let me try to raise awareness a bit.

ATL is used in automated tests written by developers, and its purpose is to make it easy to create test data and verify results.

Here is an example of such a test (the entity-node variables at the top are shortcuts; the exact navigation calls may differ between versions):

// Create the data root node
var data = AtlDataRootNode::construct();

// Entity-node shortcuts used below (assumed navigation; verify against your ATL version)
var items = data.products().items();
var onHand = data.invent().onHand();
var invent = data.invent();
 
// Get a reference to a well-known warehouse 
var warehouse = data.invent().warehouses().default();
 
// Create a new item with the "default" setup using the item creator class. Adjust the default warehouse before saving the item.
var item = items.defaultBuilder().setDefaultWarehouse(warehouse).create();
 
// Add on-hand (information about availability of the item in the warehouse) by using the on-hand adjustment command.
onHand.adjust().forItem(item).forInventDims([warehouse]).setQty(100).execute();
 
// Create a sales order with one line using the sales order entity
var salesOrder = data.sales().salesOrders().createDefault();
var salesLine = salesOrder.addLine().setItem(item).setQuantity(10).save();
 
// Reserve 3 units of the item using the reserve() command that is exposed directly on the sales line entity
salesLine.reserve().setQty(3).execute();
 
// Verify inventory transactions that are associated with the sales line using the inventoryTransactions query and specifications
salesLine.inventoryTransactions().assertExpectedLines(
    invent.trans().spec().withStatusIssue(StatusIssue::OnOrder).withInventDims([warehouse]).withQty(-7),
    invent.trans().spec().withStatusIssue(StatusIssue::ReservPhysical).withInventDims([warehouse]).withQty(-3));

These few lines do a lot of things: create an item and ensure that it has quantity on hand, create a sales order, run quantity reservation and so on. At the end, they verify that the expected set of inventory transactions has been created; the test will fail if more or fewer lines are created, or if they don’t have the expected field values. Writing code for all that without ATL would require a lot of work.

AX/F&O has a framework for unit tests (SysTest), and that’s where you’ll use Acceptance Test Library; you’ll just create acceptance tests rather than unit tests. Unit tests should test just a single code unit, be very fast and so on, which isn’t the case with ATL, but ATL has other benefits. It allows you to test complete processes, and it may be used to test code that wasn’t written with unit testing in mind (which is basically all X++ code…). The disadvantages are slower execution, more things (unrelated to what you’re testing) that can break, more difficult identification of the cause of a test failure, and so on.

If you’ve never seen the SysTest framework, a simple test class may look like this:

public class MyTest extends SysTestCase
{
    [SysTestMethod]
    public void demo()
    {
        int calculationResult = 1 + 2;
        this.assertEquals(3, calculationResult);
    }
}

ATL adds special assertion methods such as assertExpectedLines(), but you can utilize the usual assertions of the SysTest framework (such as assertEquals()) as well.

You write code in classes and then execute it in Test Explorer, where you can see the results and easily navigate to a particular test or start debugging it.

You can learn more about ATL in the documentation, but let me share my real-world experience and a few tips.

Development time

These tests surely require time to write, especially if you’re new to them. Usually the first test for a given use case takes a lot of time, but adding more tests is much easier, because they’re just variations of the same thing.

It’s not just about what the test does, but you also need to set up the system correctly, which typically isn’t trivial.

Like any other code, test code may contain bugs, and debugging them will take time.

Isolation and performance

A great feature of SysTest is data isolation. When you run a test, a new partition is created and your tests run there. Therefore your tests can’t be broken by wrong existing data (including data from previous tests), nor can the tests destroy any data you use for manual testing.

But it means that there is no data at all (unless you give up this isolation) and you must prepare everything inside your test case. Of course, the Acceptance Test Library is there to help you. On the other hand, it’s easy to forget some important setup.
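
For illustration, a test class (with a hypothetical name) might prepare its shared data in setUp(), reusing the ATL calls from the example above:

public class MySalesOrderTest extends SysTestCase
{
    AtlDataRootNode data;

    public void setUp()
    {
        super();

        // Runs in the isolated partition, so no application data exists yet;
        // everything the test needs has to be created here or in the test method.
        data = AtlDataRootNode::construct();
    }
}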

Creating the partition and setting up test data takes time; therefore running these tests takes a few minutes. It’s a bit annoying when you have a single test, but the more tests you have, the more time you’re saving.

Number sequences

One of the things you typically need to set up is number sequences. Fortunately, there is a surprisingly easy solution: decorate your test case class with the SysTestCaseAutomaticNumberSequences attribute and the system will create number sequences as needed.
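
Applied to the hypothetical test class from above, it looks like this:

[SysTestCaseAutomaticNumberSequences]
public class MySalesOrderTest extends SysTestCase
{
    // Tests can now create orders, journals etc. without setting up
    // the related number sequences first.
}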

Code samples

F&O comes with a model called Acceptance Test Library – Sample Tests, where you’ll find a few tests that you can review and execute. Seeing what complete test cases look like is very useful for learning.

Documentation

Documentation exists: Acceptance test library resources.

You don’t need to read the whole thing to use ATL, but it’s very beneficial if you familiarize yourself with things like Navigation concepts.

You’ll need to go a bit deeper if you decide to create ATL classes for your own entities, or for those in the standard application that aren’t covered well by Microsoft. For example, I added ATL classes for trade agreements, because we made significant changes to pricing and utilizing ATL was beneficial.

From another point of view, tests also work as a kind of executable documentation. Not only do I document my own code by showing others how it’s supposed to be called and what behaviour we expect, but I also sometimes look into ATL to see how Microsoft performs certain actions that I need in my real code.

Models and pipelines

You can’t put tests into the same model as your normal code. You’ll need a reference to ATL models (Acceptance Test Library Foundation, at least), which aren’t available in Tier 2+ environments; therefore you’ll have to configure your build pipeline not to add your test model to deployable packages.

Feeling safe

It’s not specific to tests with ATL, but a great thing about automated tests in general is the level of certainty that my recent changes didn’t break anything. Without automated tests, you either have to spend a lot of time on manual testing (and hope that all tests were executed and interpreted correctly), or you just hope for the best…

DynamicsMinds conference speakers

I’ve just checked the list of sessions proposed for the DynamicsMinds conference (22–24 May 2023, Slovenia), where I’ll also have a few, and recognized many familiar names. It’ll be great not only to listen to their sessions, but also to finally meet them again in person.

The list is long, but to mention at least some names: there are going to be several fellow MVPs (such as André Arnaud de Calavon, Paul Heisterkamp or Adrià Ariste Santacreu), ex-MVPs now working for Microsoft (but we still love them :)) like Rachel Profitt, Ludwig Reinhard and Tommy Skaue, the author of d365fo.tools Mötz Jensen, my former colleague Laze Janev and many more.

This is gonna be big.

HcmWorkerV2

I was asked to investigate why some changes disappeared from the Employees form in F&O. If I open Human resources > Workers > Employees and right-click the form, it shows that its name is HcmWorker.

That’s expected. But it’s a lie.

There is a feature called Streamlined employee entry (HcmWorkerV2Feature class) and if it’s enabled, another form (HcmWorkerV2) opens instead of HcmWorker.

Microsoft implemented the logic in the init() method of the HcmWorker form. If the feature is enabled, HcmWorkerV2 opens and HcmWorker is closed:

if (HcmFeatureStateProvider::isFeatureEnabled(HcmWorkerV2Feature::instance()))
{
    this.redirectToHcmWorkerV2();
    this.lifecycleHelper().exitForm();
}

But the logic that shows Form information isn’t aware of this for some reason; it knows that we’re opening HcmWorker and it believes that form has really been opened.

By the way, if I press F5 and reload the form, Form information starts showing the right form name.

If we want our customizations to work in both cases (with the feature enabled or disabled), we need to change both forms.
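
For example, if our customization is attached through form event handlers, we need to subscribe to both forms; the class and method names below are just an illustration of the pattern:

public final class MyHcmWorkerFormEventHandlers
{
    [FormEventHandler(formStr(HcmWorker), FormEventType::Initialized)]
    public static void HcmWorker_OnInitialized(xFormRun sender, FormEventArgs e)
    {
        MyHcmWorkerFormEventHandlers::applyCustomization(sender);
    }

    [FormEventHandler(formStr(HcmWorkerV2), FormEventType::Initialized)]
    public static void HcmWorkerV2_OnInitialized(xFormRun sender, FormEventArgs e)
    {
        MyHcmWorkerFormEventHandlers::applyCustomization(sender);
    }

    // Shared logic, so the actual customization isn't duplicated
    private static void applyCustomization(xFormRun formRun)
    {
    }
}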

By the way, if you want to read more about the new form, here is a documentation page: Streamlined employee navigation and entry.