When you overlayered an application element (e.g. a method or a form) in Dynamics AX, a copy was saved in a higher layer. You modified the object there, therefore you ended up with two copies of the same element – the original one in a lower layer (such as SYS) and your modified one in a layer like CUS.
A problem occurred when Microsoft updated the element, either by directly changing its code or by introducing an additional copy in another layer, such as SYP or GLS. Your copy based on the older version effectively hid those changes, unless you upgraded your code and incorporated them. The upgrade could be relatively difficult, especially if developers didn’t think about upgrades in advance.
In F&O, you can’t use overlayering anymore, therefore your changes can’t hide standard code (in most cases). Microsoft gives you a new version, your extensions get applied and everything is fine. The difficult and expensive code upgrade process isn’t needed anymore.
Almost.
There are still breaking changes and new features that you need to take into account. For example, if you have extensions of the Sales order header V2 entity and Microsoft introduces V3, you need to add your extensions to the new version too.
But sometimes we introduce exactly the same problem that we used to have in Dynamics AX. We can’t create a copy by overlayering, but we can still duplicate an element manually (a method, a data entity etc.). For example, we want to use a Microsoft internal class, so we duplicate the class in our model and use it from our code. Then Microsoft changes their code, but we keep using the old version until we notice the problem and apply the same changes to our copy.
It’s the same problem as in Dynamics AX, but now we’re in an even worse position. Back then we had tools to detect code conflicts for us (and even to fix some of them automatically), but we don’t have them in F&O.
With layers, we knew that our element was a copy of a standard one. For example, it was clear that our CustTable form in the CUS layer was related to the CustTable form in the SYP layer. There was a tool that could compare two versions of an application, notice that the CustTable form had changed and that we had overlayered it, and therefore flag an upgrade conflict there. It could tell us how Microsoft changed the form and which changes we had made to the older version. Then we had to merge these two sets of changes.
Without layers, duplicating an element means creating a new one with a different name. For example, I could duplicate CustTransEntity and create XYZCustTransEntity. There isn’t anything linking these two entities. Without a careful examination, we can’t say whether XYZCustTransEntity was created as a copy of CustTransEntity or not. Therefore even knowing which elements we should check is problematic.
Let’s assume for now that we’re able to do it and we have a list of standard elements and our copies.
When upgrading the application, we have to compare the old and the new version to identify changed elements. It’s not particularly difficult – it means comparing the files, and if they differ, we can also check individual elements in the XML files. We’re interested just in those objects that we’ve duplicated.
This would allow us to create a report showing which custom elements we need to upgrade (and possibly what has changed in the standard application).
Implementation
I’m not aware of any tools for this purpose; please let me know if you know of any. I don’t have one either, but let me consider how it could be done.
To be able to compare two versions of the application, we simply need to have both sets of files somewhere. For example, we can copy standard packages to a separate folder, install an application update and compare the folders. It would be nice to have such a repository already provided by Microsoft. A similar process is needed for ISV solutions as well.
Comparing the sets of files isn’t difficult. For example, PowerShell offers the Compare-Object cmdlet, which can be used to compare file contents or hashes.
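To illustrate the idea, here is a minimal Python sketch (the function names and folder layout are my own assumptions, not an existing tool): it hashes every file under two package folders and lists the files present in both whose contents differ.

```python
import hashlib
from pathlib import Path

def file_hashes(root):
    """Map each file's path (relative to root) to a SHA-256 hash of its contents."""
    hashes = {}
    for path in Path(root).rglob("*"):
        if path.is_file():
            rel = path.relative_to(root).as_posix()
            hashes[rel] = hashlib.sha256(path.read_bytes()).hexdigest()
    return hashes

def changed_files(old_root, new_root):
    """Return files present in both versions whose contents differ."""
    old, new = file_hashes(old_root), file_hashes(new_root)
    return sorted(p for p in old if p in new and old[p] != new[p])
```

Hashing instead of comparing raw contents keeps memory usage low even for large packages; files added or removed between versions are ignored here, since only elements that exist in both versions can conflict.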
Knowing that a file changed is insufficient if we’ve duplicated just something like an individual method. But that can be addressed easily; we just need to extract the particular elements from the XML files and compare the values.
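That element-level comparison could be sketched in Python like this, assuming the shape of F&O metadata XML where each method is a Method element with Name and Source children (the helper names are mine):

```python
import xml.etree.ElementTree as ET

def method_source(xml_text, method_name):
    """Return the <Source> text of the named <Method> element, or None if absent."""
    root = ET.fromstring(xml_text)
    for method in root.iter("Method"):
        if method.findtext("Name") == method_name:
            return method.findtext("Source")
    return None

def method_changed(old_xml, new_xml, method_name):
    """True if the method's source differs between two versions of the file."""
    return method_source(old_xml, method_name) != method_source(new_xml, method_name)
```

For example, `method_changed(old, new, "recalculateDatesForDirectDelivery")` would tell us whether that particular method changed, even if the rest of the file did too.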
The key problem I see is the identification of our copies of standard objects. There isn’t any reliable automatic way; the best we could get is information that some objects are very similar, but that doesn’t necessarily mean that one is a duplicate of the other and should be kept in sync. I believe we need to explicitly establish the relation when making a copy. Potentially, Visual Studio could help us with that.
There are many ways to store the information; one of the simplest is an XML file. For instance:
<Copies>
  <Entry
    Orig="dynamics://Table/SalesLine/Method/recalculateDatesForDirectDelivery"
    OrigVersion="10.0.41"
    Copy="dynamics://Class/SalesLineXYZ_Extension/Method/xyzRecalculateDatesForDirectDelivery" />
</Copies>
This is easy to process by tools; the disadvantage I see is that it isn’t automatically maintained with the code. For example, if I decide to change the prefix of the method (and don’t update the file), the link will break. Of course, we can have a process that checks the file and reports invalid references (e.g. during CI builds).
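Such a CI check could look roughly like this in Python; the dynamics:// parsing and the Type/Name.xml folder layout are my assumptions, not an actual F&O API:

```python
import xml.etree.ElementTree as ET
from pathlib import Path

def parse_ref(ref):
    """Split 'dynamics://Class/Foo/Method/bar' into (type, element, method-or-None)."""
    parts = ref.removeprefix("dynamics://").split("/")
    method = parts[3] if len(parts) > 3 and parts[2] == "Method" else None
    return parts[0], parts[1], method

def broken_references(copies_xml, model_root):
    """Return Copy references whose target file or method no longer exists."""
    broken = []
    for entry in ET.parse(copies_xml).getroot().iter("Entry"):
        ref = entry.get("Copy")
        elem_type, elem_name, method = parse_ref(ref)
        path = Path(model_root) / elem_type / (elem_name + ".xml")
        if not path.is_file():
            broken.append(ref)  # the whole element is gone (renamed or deleted)
        elif method is not None:
            names = {m.findtext("Name") for m in ET.parse(path).getroot().iter("Method")}
            if method not in names:
                broken.append(ref)  # the element exists, but the method doesn't
    return broken
```

Failing the build when this returns a non-empty list would catch the broken links early, before anyone relies on the file during an upgrade.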
Another approach could, for instance, involve a special attribute for code changes:
[CodeCopy('dynamics://Table/SalesLine/Method/recalculateDatesForDirectDelivery', '10.0.41')]
internal void xyzRecalculateDatesForDirectDelivery()
{
    ...
}
This can’t be used everywhere; e.g. you can’t put an attribute on an SSRS report. We do have tags there, although putting a lot of information in them would look ugly. If Microsoft got involved, they could create a new property for this purpose.
Let’s keep it simple and consider the XML file for now.
The process of upgrade conflict detection could look like this:
- Read an entry from the XML file.
- Identify the source element and the source file (e.g. the class file for a class method).
- Get both versions of the file: the original version and the current one.
- Compare files. If they’re identical, stop processing of this entry.
- If the source element is a method, compare just the method element in both files. If they’re identical, end processing.
- Report the upgrade conflict.
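The steps above could be sketched in Python roughly like this; the dynamics:// parsing, the Type/Name.xml file layout and all function names are my assumptions:

```python
import xml.etree.ElementTree as ET
from pathlib import Path

def parse_ref(ref):
    """Split 'dynamics://Class/Foo/Method/bar' into (type, element, method-or-None)."""
    parts = ref.removeprefix("dynamics://").split("/")
    method = parts[3] if len(parts) > 3 and parts[2] == "Method" else None
    return parts[0], parts[1], method

def method_source(xml_text, method_name):
    """Return the <Source> text of the named <Method> element, or None if absent."""
    for m in ET.fromstring(xml_text).iter("Method"):
        if m.findtext("Name") == method_name:
            return m.findtext("Source")
    return None

def detect_conflicts(copies_xml, old_root, new_root):
    """List the copies whose originals changed between two application versions."""
    conflicts = []
    for entry in ET.parse(copies_xml).getroot().iter("Entry"):
        elem_type, elem_name, method = parse_ref(entry.get("Orig"))
        rel = Path(elem_type) / (elem_name + ".xml")
        old_text = (Path(old_root) / rel).read_text()
        new_text = (Path(new_root) / rel).read_text()
        if old_text == new_text:
            continue  # the whole file is unchanged
        if method and method_source(old_text, method) == method_source(new_text, method):
            continue  # the file changed, but not the method we copied
        conflicts.append(entry.get("Copy"))
    return conflicts
```

The output is just the list of copies to review; a real report would also include the OrigVersion and a diff of the two versions of the source.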
It doesn’t address the actual upgrade, but it at least tells us which elements must be upgraded. We would then compare the files to see the changes.
I don’t have such a process in place, but I’m thinking about building something like that. I rarely copy existing application elements myself, but I see a lot of problems in my clients’ codebases. There are many duplicated objects that no one maintains at all; they include bugs fixed by Microsoft years ago, they refer to obsolete objects and so on.
If we aren’t able to detect problems caused by our outdated code, the number of issues increases with every update. With a process like this, we could significantly reduce the degradation. I don’t expect to be able to identify all duplication in legacy codebases, but we could still make a big difference if we address at least some (the more important ones) and cover all copies introduced in new code.
Hi Martin, we addressed this problem a few years ago. For that, we developed a way to tag methods or whole objects. We export all the elements of the current version, and then, after installing the new version, we export the elements again. Then we use an external tool like BeyondCompare to see what changes were made in the standard and reimplement them in our copies of the code. It’s far from perfect, but it has been working for us for years. It’s good to know that many people are aware of this problem!