March 06, 2026
WorthKnowing: Adding system fields in data entity
March 02, 2026
[Solved] The ‘VSProjectPackage’ package did not load correctly. UDE VS setup issue
The ‘VSProjectPackage’ package did not load correctly.
The problem may have been caused by a configuration change or by the installation of another extension. You can get more information by examining the file C:\Users\deepak.agarwal\AppData\Roaming\Microsoft\VisualStudio\17.0_0dc3asdf9\ActivityLog.xml.
Restarting Visual Studio could help resolve this issue.
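If restarting doesn't help, the ActivityLog.xml mentioned in the error usually names the failing package. Here is a minimal Python sketch that scans the log for error entries (the element names follow the typical ActivityLog schema — verify against your own file, which is often UTF-16 encoded):

```python
import xml.etree.ElementTree as ET

def find_errors(activity_log_xml: str):
    """Return (source, description) pairs for every error entry in an ActivityLog."""
    root = ET.fromstring(activity_log_xml)
    errors = []
    for entry in root.iter("entry"):
        # Error entries are marked with <type>Error</type> in typical logs
        if entry.findtext("type", default="").strip().lower() == "error":
            errors.append((entry.findtext("source", default="").strip(),
                           entry.findtext("description", default="").strip()))
    return errors

# Usage sketch: find_errors(open(r"...\ActivityLog.xml", encoding="utf-16").read())
```

Look for the package named in the description column; that tells you which extension to repair or remove.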
Cheers!!!
January 26, 2026
Practical Ways to Use GitHub Copilot for X++ Development (Tips, Tricks, Hidden Gems & What to Avoid)
Hi Folks,
In my last post about GitHub Copilot, we covered what GitHub Copilot is, how to enable
it, and the licensing options. Now let's move into the part everyone actually cares
about: Practical Ways to Use GitHub Copilot for X++ Development.
Copilot isn’t magic, but when used well, it feels pretty
close. Below are practical examples, prompt ideas, best practices, and a few
“don’ts” that will save you time and frustration.
1. X++ Boilerplate Generation (Your New Time Saver)
X++ is full of repetitive patterns — table methods, service
classes, form event handlers, data entities, and more. Copilot can generate
60–80% of this instantly.
Try prompts like:
- “Create a validateWrite method for this table following best practices.”
- “Generate a runnable class that updates sales orders with status ‘Open’.”
- “Create a SysOperation framework class with contract, controller, and service.”
You still review and refine — but the heavy lifting is done. (I would still recommend reviewing every single time.)
2. Documentation & XML Comments (The Most Underrated Feature)
Copilot is brilliant at generating XML documentation
for methods, classes, and services.
Prompt idea:
- “Add XML documentation to this method explaining parameters and return values.”
Why it matters:
- Better readability
- Faster onboarding for new developers
- Cleaner code reviews
3. OData & Integration Summaries
When working with integrations, Copilot can help you quickly
draft:
- OData endpoint descriptions
- Request/response examples
- Error-handling notes
- API documentation
Prompt idea:
- “Explain how this data entity will behave when called via OData.”
This is especially useful when writing technical design documents.
4. Best Practice Suggestions on Selected Code
This is where Copilot becomes a mini code reviewer.
Highlight a block of X++ code → open Inline Chat (Alt + /) → ask:
- “Suggest best practices for this code.”
- “Is there any performance issue here?”
- “Rewrite this using recommended patterns.”
Example: Summarize Sales Order
Your example:
public void summarizeSalesOrder(str salesOrderId)
{
    SalesTable salesTable = SalesTable::find(salesOrderId);

    info(strFmt("Sales Order %1 for customer %2 has total amount %3",
        salesTable.SalesId, salesTable.CustAccount, salesTable.TotalAmount));
}
Copilot might suggest:
- Add null checks
- Use ttsbegin/ttscommit if modifying data
- Avoid info() in production code
- Consider using the SalesTotals class for accurate totals
This is where Copilot shines — it nudges you toward better patterns.
5. Plugin Development with Copilot Studio
If you’re building Copilot plugins for D365FO,
Copilot can help with:
- API schema generation
- JSON payload examples
- C# wrapper classes
- Error-handling templates
This is still evolving, but it’s already a huge productivity boost.
6. Hidden Gems Most Developers Miss
- Use #filename to give Copilot context
Example: #SalesTable Explain how validateWrite works for this table.
- Use /intent to guide Copilot
Example: /refactor Rewrite this method to improve readability.
- Keep related files open
Copilot reads open tabs to understand your project structure.
- Use natural language
You don’t need to be formal. “Make this faster” works surprisingly well.
7. What NOT to Do with Copilot (Important!)
Copilot is powerful, but not perfect. Avoid:
❌ Blindly accepting code
Always review for:
- Security issues
- Performance problems
- Deprecated APIs
❌ Using Copilot for business logic decisions
Copilot doesn’t know your customer’s requirements.
❌ Expecting Copilot to understand custom frameworks
It learns from your codebase over time — but not instantly.
❌ Using vague prompts
“Fix this” → not helpful
“Optimize this query to reduce DB calls” → much better
While all this feels like magic, GitHub Copilot won’t replace your X++ expertise — but it will absolutely amplify it. So don't be afraid of using it; make it your companion. The more you use it, the better it understands your patterns, naming conventions, and coding style.
January 06, 2026
Streamlining D365FO Access with Entra ID Security Groups
Let’s see some quick steps to set up Entra security groups:
- Enable the Feature: In D365FO, navigate to Feature Management and enable Microsoft Entra ID Security Groups.
- Create Security Groups in Entra ID: Use the Microsoft Entra admin center to create groups. You can choose:
- Assigned groups (manual membership)
- Dynamic groups (rule-based membership based on user attributes)
- Assign Roles to Groups in D365FO: Go to System Administration > Security Configuration > Entra ID Security Groups.
- Import your Entra groups.
- Assign D365FO roles to each group.
- User Provisioning: When a user logs in, D365FO checks their group membership and automatically assigns roles based on the group configuration. This supports just-in-time (JIT) provisioning.
Of course, there are advantages over traditional role-based access, like:
- Centralized Management: Admins can manage access across multiple apps from Entra ID.
- Dynamic Membership: Automatically assign users to groups based on attributes (e.g., department, location).
- Bulk Provisioning: Assign roles to many users at once—ideal for onboarding.
- Lifecycle Automation: Role changes happen automatically when user attributes change.
- Just-in-time access
- Centralized onboarding and offboarding of users
And yes, there are some limitations compared to traditional role assignments, like:
- No Role Visibility in User Profile: Roles assigned via groups don’t appear in the user’s security role list in D365FO.
- Audit Complexity: Harder to trace exact role assignments for individual users. A few out-of-box reports don't support these users.
- Limited Granularity: Cannot assign roles based on task-level needs unless you create many groups.
- External users in Entra don't get access automatically.
- Complex workflows may not work as expected.
To get the most out of this setup, a few best practices:
- Use Dynamic Groups for Automation: Define rules like user.department -eq "Finance" to auto-assign users to finance roles.
- Combine with Direct Role Assignments: For exceptions or sensitive roles, assign them directly in D365FO to maintain visibility.
- Document Group-to-Role Mapping: Maintain a clear mapping of which Entra groups correspond to which D365FO roles.
- Audit Regularly: Use PowerShell or Graph API to extract group membership and validate access.
- Avoid Overlapping Assignments: Ensure users don’t get conflicting roles from multiple groups.
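For the regular audits mentioned above, Microsoft Graph exposes a group-members endpoint. A minimal Python sketch that builds the request (the group ID and access token are placeholders; send the request with any HTTP client and follow @odata.nextLink for paging):

```python
def graph_members_request(group_id: str, access_token: str):
    """Build the URL and headers to list an Entra group's members via Microsoft Graph."""
    url = (f"https://graph.microsoft.com/v1.0/groups/{group_id}/members"
           "?$select=displayName,userPrincipalName")
    headers = {"Authorization": f"Bearer {access_token}"}
    return url, headers
```

Comparing the returned members against your documented group-to-role mapping is a quick way to spot overlapping or stale assignments.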
Entra ID security groups simplify access management in
D365FO, especially for large-scale or dynamic environments. However, they’re
best used in tandem with traditional role assignments to balance automation
with control. By following best practices, you can streamline provisioning
while maintaining auditability and compliance.
December 11, 2025
OData Authentication for On-Premises D365FO
- AD FS Configuration
  - AD FS must be properly configured and integrated with D365FO.
  - The AOS (Application Object Server) uses AD FS metadata to validate tokens.
  - Ensure the AD FS XML configuration file is accessible to AOS.
- Client Application Setup
  - External apps (e.g., Postman, .NET clients) must be registered in AD FS.
  - You’ll need:
    - Client ID (from AD FS or Azure App Registration)
    - Resource URI (typically the D365FO base URL)
    - Token Endpoint (AD FS OAuth2 endpoint)
- Token Acquisition
  - Use the OAuth2 protocol to acquire a bearer token.
  - The token request includes:
    - grant_type=password
    - client_id
    - username and password
    - resource (D365FO URL)
  - AD FS returns a JWT token if credentials are valid.
- Calling OData
  - Include the token in the Authorization header: Authorization: Bearer <access_token>
  - Use standard OData URLs like: https://<your-d365fo-url>/data/Customers
- Get Token
  - POST to the AD FS token endpoint: https://<adfs-url>/adfs/oauth2/token
  - Body (x-www-form-urlencoded):
    client_id=<your-client-id>
    username=<your-username>
    password=<your-password>
    grant_type=password
    resource=https://<your-d365fo-url>
- Use Token
  - Add the Authorization: Bearer <token> header to your OData request.
- Test Endpoint
  - GET: https://<your-d365fo-url>/data/Customers
Token Expiry: Tokens typically expire after 1 hour. Refresh or reacquire as needed.
AD FS Clock Skew: Ensure time sync between AD FS and AOS servers.
SSL Certificates: AD FS endpoints must be secured with valid SSL certs.
User Permissions: The authenticated user must have access to the data entities.
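The token and OData steps above can be sketched in Python using only the standard library (all URLs and credentials are placeholders for your own environment):

```python
import json
import urllib.parse
import urllib.request

def build_token_request(adfs_url, client_id, username, password, resource):
    """Assemble the AD FS OAuth2 password-grant request: endpoint URL + form body."""
    url = f"{adfs_url}/adfs/oauth2/token"
    body = urllib.parse.urlencode({
        "client_id": client_id,
        "username": username,
        "password": password,
        "grant_type": "password",
        "resource": resource,
    })
    return url, body

def get_customers(d365fo_url, token_url, token_body):
    """Acquire a bearer token from AD FS, then call the Customers OData entity."""
    with urllib.request.urlopen(token_url, data=token_body.encode()) as resp:
        token = json.load(resp)["access_token"]
    req = urllib.request.Request(
        f"{d365fo_url}/data/Customers",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Remember the notes above: tokens expire (typically after 1 hour), so reacquire as needed, and the AD FS endpoint must present a valid SSL certificate or urlopen will refuse the connection.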
November 13, 2025
Working with Bicep – Part 2: Azure Function App Deployment
Hi Folks,
In Part 1, we explored how to use Azure Bicep to deploy Logic Apps and integrate with Dynamics 365 for Finance and Operations (D365FO). Now, in Part 2, we’ll extend that architecture by deploying Azure Function Apps using Bicep—enabling custom logic, token handling, and deeper orchestration capabilities.
Azure Functions complement Logic Apps with:
- Custom logic execution (e.g., token parsing, data shaping)
- Event-driven triggers (e.g., D365FO business events)
- Scalable backend processing without managing infrastructure
October 29, 2025
WorthKnowing: Measuring Code Quality: Beyond Subjective Judgment
- Defect count: Fewer bugs mean higher reliability. Static analysis tools can help identify defects early.
- Availability: Measured using metrics like Mean Time Between Failures (MTBF), which indicates how often the system fails.
- Codebase size and structure
- Consistency and complexity
- Testability and understandability
- Stylistic warnings from linters
- Halstead complexity measures, which quantify code readability and effort
- Control and observability of components
- Ability to isolate and automate tests
- Testing on multiple platforms throughout development—not just at the end
- Using multiple compilers with strict warning levels
- Enforcing consistent coding standards
- Modularity: Components are self-contained
- Loose coupling: Minimal dependencies between modules
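The Halstead measures mentioned above are derived from four token counts: distinct operators (n1), distinct operands (n2), and their totals (N1, N2). A minimal Python sketch of the classic formulas (token extraction itself is language-specific and left out here):

```python
import math

def halstead(operators, operands):
    """Basic Halstead measures from operator and operand token lists."""
    n1, n2 = len(set(operators)), len(set(operands))   # distinct tokens
    N1, N2 = len(operators), len(operands)             # total occurrences
    vocabulary = n1 + n2
    length = N1 + N2
    # Volume: program length weighted by the information content of the vocabulary
    volume = length * math.log2(vocabulary) if vocabulary > 1 else 0.0
    difficulty = (n1 / 2) * (N2 / n2) if n2 else 0.0
    return {"vocabulary": vocabulary, "length": length,
            "volume": volume, "difficulty": difficulty,
            "effort": difficulty * volume}
```

Higher volume and effort values flag code that readers will find harder to understand, which is exactly the kind of objective signal this post argues for.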
September 30, 2025
Working with Bicep – Part 1: Setup, Licensing, and D365FO Integration
As enterprise solutions evolve,
Infrastructure as Code (IaC) is becoming a cornerstone of modern DevOps
practices. While ARM templates have long been the standard for Azure resource
deployment, Bicep—a domain-specific language (DSL) for ARM—offers a cleaner,
more maintainable alternative.
In this two-part blog series, we’ll explore how to use Bicep to deploy Azure
Logic Apps, with a focus on integrating with Dynamics 365 for Finance and
Operations (D365FO). In Part 2, we’ll extend this by incorporating Azure
Functions into the mix.
Why Bicep Over ARM Templates?
ARM templates are powerful but verbose and
error-prone. Bicep simplifies this by offering:
- Simplified syntax: No more deeply nested JSON.
- Modularity: Reusable components via modules.
- Better tooling: IntelliSense, type safety, and validation in VS Code.
- Native support: Compiles directly to ARM templates.
Tools Required
1. Azure CLI (v2.20.0+)
2. Bicep CLI (or use `az bicep install`)
3. Visual Studio Code with the Bicep extension
4. Access to Azure Subscription with permissions to deploy resources
5. Logic Apps Standard or Consumption Plan
Licensing Considerations
- Logic Apps Consumption: Pay-per-action; no upfront cost.
- Logic Apps Standard: Fixed pricing per integration service environment (ISE) or App Service Plan.
- D365FO: Ensure the user/service principal has appropriate OData API permissions (typically via Azure AD App Registration).
Sample Bicep Template for Logic App Deployment
Here’s a minimal Bicep file to deploy a Logic App (Consumption Plan):

param logicAppName string = 'd365fo-integration-app'
param location string = resourceGroup().location

resource logicApp 'Microsoft.Logic/workflows@2019-05-01' = {
  name: logicAppName
  location: location
  properties: {
    definition: loadJsonContent('./logicapp-definition.json')
    parameters: {}
  }
}
Sample Logic App: Calling a D365FO Service
Let’s say you want to call a custom service
in D365FO via OData. Your Logic App might include:
1. HTTP Trigger
2. HTTP Action to call D365FO
Example HTTP Action Configuration:
{
  "method": "GET",
  "uri": "https://<your-env>.cloudax.dynamics.com/data/CustomService",
  "headers": {
    "Authorization": "Bearer @{body('Get_Token')?['access_token']}",
    "Accept": "application/json"
  }
}
To authenticate, use Azure AD OAuth 2.0
with a client credential flow. You can retrieve the token using a separate HTTP
action or Azure Function.
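A minimal sketch of that client-credential token request in Python (the v1 Azure AD endpoint with a `resource` parameter is one common choice for D365FO; the tenant ID, client ID, and secret are placeholders from your App Registration):

```python
import urllib.parse

def client_credentials_request(tenant_id, client_id, client_secret, d365fo_url):
    """Token endpoint URL and form body for an Azure AD client-credentials grant."""
    token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": d365fo_url,  # the D365FO environment base URL
    })
    return token_url, body
```

POST the body (x-www-form-urlencoded) to the token URL and pull access_token from the JSON response, exactly as the Logic App's Get_Token action does.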
Folder Structure Recommendation
/logicapps
├── main.bicep
├── logicapp-definition.json
└── parameters.json
Deployment Command
az deployment group create \
  --resource-group my-rg \
  --template-file main.bicep
In the next post, we’ll explore how to
extend Logic Apps with Azure Functions, enabling more complex orchestration and
custom logic—especially useful when working with D365FO’s business events or
data transformations.
September 01, 2025
[Solved] DVT script for service model: AOSService on machine: D365FO_DEVBox
- Go to the Environment Details page in LCS.
- Abort the update process.
- Click Maintain → Rotate secrets.
- Select Rotate the SSL certificates and confirm.
August 22, 2025
Github copilot for D365FO development
GitHub Copilot is transforming how
developers write code. For those working with Dynamics 365 Finance and
Operations (D365FO) in Visual Studio, Copilot can be a powerful assistant—helping
with boilerplate code, documentation, and even plugin development. In today’s
post let’s understand all about GitHub Copilot and how this can make your day-to-day
development experience awesome.
Licensing & Pricing
First things first, let’s understand:
- How much will it cost me?
- What licenses are there?
- What license do I need to use?
(One more license to buy 😊)
GitHub Copilot offers several tiers:
- Copilot Free: $0 - Limited completions and chat (2,000 completions, 50
chats/month)
- Copilot Pro: $10 - Unlimited completions and chat, access to GPT-4o
- Copilot Business: $19/user - Team management, advanced features
- Copilot Enterprise: $39/user - Enterprise-grade AI features and integrations
(See the official Pricing Guide for more information.)
How to Enable GitHub Copilot in Visual Studio
To install Copilot in Visual Studio 2022
(v17.10+):
1. Open Visual Studio Installer
2. Select your installed version → Click Modify
3. Choose any workload (e.g., .NET desktop development)
4. Under Optional Components, check GitHub Copilot
5. Click Modify to install
Sign in with your GitHub account
Another option: in the top right corner of Visual Studio you will see the option to sign in with GitHub Copilot.
Here is a detailed Installation Guide.
Copilot for D365FO Development
Copilot can assist with many things, below
are some areas which I explored so far,
- X++ boilerplate generation
- Documentation comments
- Plugin development using Copilot Studio
- OData integration summaries
- Best-practice suggestions on a selected object or piece of code
Example: Summarize Sales Order in X++
public void summarizeSalesOrder(str salesOrderId)
{
    SalesTable salesTable = SalesTable::find(salesOrderId);

    info(strFmt("Sales Order %1 for customer %2 has total amount %3",
        salesTable.SalesId, salesTable.CustAccount, salesTable.TotalAmount));
}
Best Practices (From Microsoft Learn)
Here are some tips to maximize Copilot’s
effectiveness:
- Use clear comments to guide Copilot
- Keep relevant files open—Copilot uses them to infer context
- Use Inline Chat (Alt + /)
- Refer to files using #filename or /intent
- Always review suggestions for accuracy and security
Copilot Fundamentals: https://learn.microsoft.com/en-us/training/paths/copilot/
Prompt Engineering Module:
https://learn.microsoft.com/en-us/training/modules/introduction-prompt-engineering-with-github-copilot/
August 18, 2025
WorthKnowing: WHS vs WMS Prefixes in D365FO Warehouse Management
Hi folks,
If you've worked with Warehouse Management in Dynamics 365
Finance and Operations (D365FO), you might have come across two different
prefixes in the system: WHS and WMS. Ever wondered
why both exist and what they signify? Let’s break it down.
The Basics: WHS vs WMS
Both WHS and WMS stand
for Warehouse Management System, but they refer to different
generations of warehouse management solutions within Microsoft
Dynamics.
WMS – The Legacy System
Objects prefixed with WMS belong to
the legacy warehouse management system, which was part of earlier
AX versions. This system supports basic warehouse operations, such
as:
- Inventory location tracking
- Simple picking and receiving
- Basic location control
Examples of WMS objects:
- WMSLocation – Stores warehouse locations
- WMSOrderTrans, WMSJournalTable, etc.
While functional, the legacy WMS lacks the flexibility and
scalability required for complex warehouse scenarios. It’s generally not
recommended for new implementations.
WHS – The Advanced System
With the release of AX 2012 R3, Microsoft
introduced a more robust and feature-rich warehouse management solution, which
continues to evolve in D365FO. Objects prefixed with WHS are
part of this advanced warehouse management system, designed to
handle complex warehouse processes, including:
- Mobile device integration
- Work creation and execution (e.g., picking, putaway)
- Location directives, work templates, and wave processing
Examples of WHS objects:
- WHSWorkTable – Stores work headers
- WHSWorkLine – Stores work lines
- WHSInventEnabled, WHSParameters, etc.
This system is highly configurable and supports modern
warehouse operations across industries.
Summary
- Use WHS objects for advanced, scalable warehouse management in D365FO.
- WMS objects are legacy and suitable only for basic scenarios or backward compatibility.
Understanding the distinction between WHS and WMS helps
ensure you're building solutions on the right foundation, especially when
designing or extending warehouse functionality in D365FO.
August 13, 2025
[Solved]The specified module 'C:\Program Files\Microsoft Security Client\MpProvider' was not loaded because no valid module file was found in any module directory.
- Do a get latest on your dev box
- Clear any of your pending changes, if feasible. Or at least make them error free.
- Do a full build and synch, make sure there are no errors.
- Close the VS and any other application running on your machine.
- Close the RDP.
- Try to update your VM now.
July 11, 2025
WorthKnowing: Where to add new fields on Item master
When customizing the item master in Dynamics 365
Finance and Operations (D365FO), one common question is:
Should I extend InventTable or EcoResProduct?
Here’s a WorthKnowing guide to help you decide:
1. InventTable
- Purpose: Holds inventory-specific data for released products.
- Scope: Company-specific (per legal entity).
- Use When: You need to add fields related to:
  - Inventory management
  - Warehousing
  - Sales or purchasing
  - Any data that varies by company
- Examples: Item group, inventory dimensions, default warehouse, sales price.
2. EcoResProduct
- Purpose: Stores the core product definition (product master).
- Scope: Shared across all companies (global).
- Use When: You need to add fields related to:
  - Product identity
  - Technical specifications
  - Data that should be consistent across all legal entities
- Examples: Product name, product type, product number, technical specs.
Knowing the difference between these two tables can save you
time and ensure your customizations align with D365FO’s architecture. Choose
wisely based on the scope and nature of your data.
July 08, 2025
New post series 'WorthKnowing'
- Hidden gems that have been around for a while but often go unnoticed
- New features and their best use cases
- Common functionalities you might be using differently than intended
- Similar looking things with their core differences
July 01, 2025
How to Remove a Model or ISV Solution from Dynamics 365 Finance and Operations (D365FO)
Hi Folks,
Removing a model or ISV solution from a Dynamics 365 Finance
and Operations (D365FO) environment can be a necessary step during system
cleanup, decommissioning unused features, or resolving conflicts. However, it
must be done carefully to avoid breaking dependencies or corrupting the
application metadata.
In this post, I’ll walk through the safe and structured process of removing a model or ISV solution from D365FO.
Before you begin, ensure:
- You have access to a development environment (Tier 1).
- You have admin rights in Visual Studio and Lifecycle Services (LCS).
- You’ve taken a full backup of the model you want to delete/remove.
- You’ve identified dependencies and verified that no other models rely on the one you’re removing.
1. Identify the Model
Open Visual Studio in your D365FO development environment:
- Go to Dynamics 365 > Model Management > Model Management.
- Use the Model Util tool or PowerShell to list installed models:
Get-AXModel -Details
Identify the model name, publisher, and layer (e.g., ISV, VAR, CUS).
2. Check for Dependencies
Use the Model Dependencies Report in Visual Studio:
- Go to Dynamics 365 > Model Management > Model Dependencies.
- Select the model and generate the report.
- Ensure no other models depend on the one you want to remove.
3. Uninstall the Model
If the model was installed via a deployable package (e.g., from an ISV), you’ll need to:
- Remove the package from the AOSService\PackagesLocalDirectory.
- Delete the model folder manually.
- Optionally, use PowerShell (this command only works for models installed via XPO or model store, not for package-based deployments):
Uninstall-AXModel -Model <ModelName>
4. Rebuild and Synchronize
After removal:
- Open Visual Studio.
- Rebuild the solution to ensure no broken references.
- Run Database Synchronization to update the schema.
5. Check in the changes
Check in the deleted objects; make sure you select all deleted objects from your ‘Pending Changes’ tab.
6. Prepare package for Sandbox
- Create DefaultModelDelete.txt. This file tells the system which models to remove, and it must contain the exact model name as registered in the model manifest. If there are multiple models, add each one on its own line, for example:
  - Model1
  - MyISVModel
- Create a new deployable package using your build pipeline or manually.
- Place the DefaultModelDelete.txt file in the root of the deployable package folder (same level as AXDeployablePackage).
- Ensure the model binaries and metadata are not included in the package.
7. Package deployment
- Upload the package to the LCS Asset Library.
- Apply it to your sandbox or production environment.
- Monitor the deployment logs for confirmation that the model was removed.
Removing a model—especially in production—requires precision and planning. Using the DefaultModelDelete.txt method ensures a clean and supported way to decommission ISV or custom solutions without manual intervention in higher-tier environments.