January 26, 2026

Practical Ways to Use GitHub Copilot for X++ Development (Tips, Tricks, Hidden Gems & What to Avoid)

Hi Folks, 

In my last post about GitHub Copilot, we covered what it is, how to enable it, and the licensing options. Now let’s move into the part everyone actually cares about: practical ways to use GitHub Copilot for X++ development.

Copilot isn’t magic, but when used well, it feels pretty close. Below are practical examples, prompt ideas, best practices, and a few “don’ts” that will save you time and frustration.

1. X++ Boilerplate Generation (Your New Time Saver)

X++ is full of repetitive patterns — table methods, service classes, form event handlers, data entities, and more. Copilot can generate 60–80% of this instantly.

Try prompts like:

  • “Create a validateWrite method for this table following best practices.”
  • “Generate a runnable class that updates sales orders with status ‘Open’.”
  • “Create a SysOperation framework class with contract, controller, and service.”

You still review and refine — but the heavy lifting is done. (I would still recommend doing this review every single time.)

2. Documentation & XML Comments (The Most Underrated Feature)

Copilot is brilliant at generating XML documentation for methods, classes, and services.

Prompt idea:

  • “Add XML documentation to this method explaining parameters and return values.”

Why it matters

  • Better readability
  • Faster onboarding for new developers
  • Cleaner code reviews

3. OData & Integration Summaries

When working with integrations, Copilot can help you quickly draft:

  • OData endpoint descriptions
  • Request/response examples
  • Error-handling notes
  • API documentation

Prompt idea:

  • “Explain how this data entity will behave when called via OData.”

This is especially useful when writing technical design documents.

4. Best Practice Suggestions on Selected Code

This is where Copilot becomes a mini code reviewer.

Highlight a block of X++ code → open Inline Chat (Alt + /) → ask:

  • “Suggest best practices for this code.”
  • “Is there any performance issue here?”
  • “Rewrite this using recommended patterns.”

Example: Summarize Sales Order

Take this example:

public void summarizeSalesOrder(str salesOrderId)
{
    SalesTable salesTable = SalesTable::find(salesOrderId);

    info(strFmt("Sales Order %1 for customer %2 has total amount %3",
        salesTable.SalesId, salesTable.CustAccount, salesTable.TotalAmount));
}

Copilot might suggest:

  • Add null checks
  • Use ttsbegin/ttscommit if modifying data
  • Avoid info() in production code
  • Consider using SalesTotals class for accurate totals

This is where Copilot shines — it nudges you toward better patterns.

5. Plugin Development with Copilot Studio

If you’re building Copilot plugins for D365FO, Copilot can help with:

  • API schema generation
  • JSON payload examples
  • C# wrapper classes
  • Error-handling templates

This is still evolving, but it’s already a huge productivity boost.

6. Hidden Gems Most Developers Miss

  • Use #filename to give Copilot context

Example:

#SalesTable Explain how validateWrite works for this table.

  • Use /intent to guide Copilot

Example:

/refactor Rewrite this method to improve readability.

  • Keep related files open

Copilot reads open tabs to understand your project structure.

  • Use natural language

You don’t need to be formal.
“Make this faster” works surprisingly well.

7. What NOT to Do with Copilot (Important!)

Copilot is powerful, but not perfect. Avoid:

Blindly accepting code

Always review for:

  • Security issues
  • Performance problems
  • Deprecated APIs

Using Copilot for business logic decisions

Copilot doesn’t know your customer’s requirements.

Expecting Copilot to understand custom frameworks

It learns from your codebase over time — but not instantly.

Using vague prompts

“Fix this” → not helpful
“Optimize this query to reduce DB calls” → much better

While all of this feels magical, GitHub Copilot won’t replace your X++ expertise — but it will absolutely amplify it. So don’t be afraid of using it; make it your companion. The more you use it, the better it understands your patterns, naming conventions, and coding style.


-Harry 
 Follow us on Facebook to keep in rhythm with us: https://fb.com/theaxapta

January 06, 2026

Streamlining D365FO Access with Entra ID Security Groups

Hi Friends,

Happy New Year! Let’s continue learning...

In this post, I will share some insight on how Entra ID security groups can be used to streamline D365FO access management.

Managing user access in Dynamics 365 Finance and Operations can be complex—especially in large organizations with frequent onboarding (or offboarding), role changes, and compliance needs. Microsoft Entra ID security groups provide an alternative to traditional role-based access control (RBAC), offering centralized, scalable access management. In this post let’s explore how to set them up, their pros and cons, and best practices for implementation.

Let’s look at the quick steps to set up Entra security groups:

  1. Enable the Feature:  In D365FO, navigate to Feature Management and enable Microsoft Entra ID Security Groups.
  2. Create Security Groups in Entra ID: Use the Microsoft Entra admin center to create groups. You can choose:
    • Assigned groups (manual membership)
    • Dynamic groups (rule-based membership based on user attributes)
  3. Assign Roles to Groups in D365FO: Go to System Administration > Security Configuration > Entra ID Security Groups.
    • Import your Entra groups.
    • Assign D365FO roles to each group.
  4. User Provisioning: When a user logs in, D365FO checks their group membership and automatically assigns roles based on the group configuration. This supports just-in-time (JIT) provisioning.
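Step 2 can also be scripted instead of clicked through in the admin center. Below is a minimal sketch of building the request body for the Microsoft Graph `POST /groups` endpoint to create a dynamic-membership security group — the group name and nickname are hypothetical placeholders, and the membership rule is the department-based example used later in this post:

```python
import json

GRAPH_GROUPS_URL = "https://graph.microsoft.com/v1.0/groups"  # POST target for group creation

def build_dynamic_group_payload(display_name: str, mail_nickname: str, rule: str) -> dict:
    """JSON body for creating a dynamic-membership security group via Graph."""
    return {
        "displayName": display_name,
        "mailNickname": mail_nickname,
        "mailEnabled": False,
        "securityEnabled": True,
        "groupTypes": ["DynamicMembership"],
        "membershipRule": rule,
        "membershipRuleProcessingState": "On",  # start evaluating the rule immediately
    }

payload = build_dynamic_group_payload(
    "D365FO-Finance-Users",              # hypothetical group name
    "d365fo-finance-users",              # hypothetical mail nickname
    'user.department -eq "Finance"',     # rule-based membership on a user attribute
)
print(json.dumps(payload, indent=2))
# POST `payload` (with a bearer token holding Group.ReadWrite.All) to GRAPH_GROUPS_URL
# to create the group; an assigned (manual) group would simply omit the rule fields.
```

Once the group exists, it can be imported into D365FO under Security Configuration exactly as in step 3.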

Of course, there are advantages over traditional role-based access, like:

  • Centralized Management: Admins can manage access across multiple apps from Entra ID.
  • Dynamic Membership: Automatically assign users to groups based on attributes (e.g., department, location).
  • Bulk Provisioning: Assign roles to many users at once—ideal for onboarding.
  • Lifecycle Automation: Role changes happen automatically when user attributes change.
  • Just-in-time access
  • Centralized onboarding and offboarding of users

And yes, there are some limitations compared to traditional role assignments, like:

  • No Role Visibility in User Profile: Roles assigned via groups don’t appear in the user’s security role list in D365FO.
  • Audit Complexity: Harder to trace exact role assignments for individual users, and a few out-of-the-box reports don’t support these users.
  • Limited Granularity: Cannot assign roles based on task-level needs unless you create many groups.
  • External (guest) users in Entra ID don’t get access automatically.
  • Complex workflows may not work as expected.

Now let’s talk about a few best practices for using Entra ID groups in D365FO:

  • Use Dynamic Groups for Automation: Define rules like user.department -eq "Finance" to auto-assign users to finance roles.
  • Combine with Direct Role Assignments: For exceptions or sensitive roles, assign them directly in D365FO to maintain visibility.
  • Document Group-to-Role Mapping: Maintain a clear mapping of which Entra groups correspond to which D365FO roles.
  • Audit Regularly: Use PowerShell or Graph API to extract group membership and validate access.
  • Avoid Overlapping Assignments: Ensure users don’t get conflicting roles from multiple groups.
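The regular audit mentioned above can be done with a small script against Microsoft Graph instead of PowerShell. Here is a minimal Python sketch (standard library only) that pages through a group’s members — it assumes an app registration with Group.Read.All and a token acquired separately:

```python
import json
import urllib.request

GRAPH = "https://graph.microsoft.com/v1.0"  # Microsoft Graph base URL

def transitive_members_url(group_id: str) -> str:
    """URL listing all direct and nested members of a group."""
    return f"{GRAPH}/groups/{group_id}/transitiveMembers?$select=displayName,userPrincipalName"

def fetch_members(group_id: str, token: str) -> list:
    """Page through the member list; requires a token with Group.Read.All."""
    url, members = transitive_members_url(group_id), []
    while url:
        req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
        with urllib.request.urlopen(req) as resp:
            page = json.load(resp)
        members.extend(page.get("value", []))
        url = page.get("@odata.nextLink")  # Graph returns this link while more pages remain
    return members
```

Dumping these lists per group and comparing them against your documented group-to-role mapping makes it easy to spot overlapping or stale assignments.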

My view:

Entra ID security groups simplify access management in D365FO, especially for large-scale or dynamic environments. However, they’re best used in tandem with traditional role assignments to balance automation with control. By following best practices, you can streamline provisioning while maintaining auditability and compliance.

-Harry
Follow us on Facebook to keep in rhythm with us: https://fb.com/theaxapta

December 11, 2025

OData Authentication for On-Premises D365FO

Hi Folks, 

Integrating with D365FO via OData is a powerful way to enable external systems to interact with ERP data. While cloud-hosted environments use Microsoft Entra ID (formerly Azure Active Directory) for authentication, on-premises deployments require a different approach—primarily relying on Active Directory Federation Services (AD FS). This post walks through the essentials of authenticating OData requests in an on-prem D365FO setup.

OData in D365FO exposes data entities over RESTful endpoints, enabling CRUD operations. In on-prem environments, authentication is handled by AD FS, which issues security tokens based on user credentials. These tokens are then used to authorize access to the OData endpoints.

Below are the key components of this process:

  1. AD FS Configuration
    • AD FS must be properly configured and integrated with D365FO.
    • The AOS (Application Object Server) uses AD FS metadata to validate tokens.
    • Ensure the AD FS XML configuration file is accessible to AOS.
  2. Client Application Setup
    • External apps (e.g., Postman, .NET clients) must be registered in AD FS.
    • You’ll need:
      • Client ID (from AD FS or Azure App Registration)
      • Resource URI (typically the D365FO base URL)
      • Token Endpoint (AD FS OAuth2 endpoint)
  3. Token Acquisition
    • Use the OAuth2 protocol to acquire a bearer token.
    • The token request includes:
      • grant_type=password
      • client_id
      • username and password
      • resource (the D365FO URL)
    • AD FS returns a JWT token if the credentials are valid.
  4. Calling OData
    • Include the token in the Authorization header: Authorization: Bearer <access_token>
    • Use standard OData URLs like: https://<your-d365fo-url>/data/Customers


Let’s walk through an example of authentication via Postman:

  1. Get Token
    • POST to the AD FS token endpoint: https://<adfs-url>/adfs/oauth2/token
    • Body (x-www-form-urlencoded):

      client_id=<your-client-id>
      username=<your-username>
      password=<your-password>
      grant_type=password
      resource=https://<your-d365fo-url>

  2. Use Token
    • Add the Authorization: Bearer <token> header to your OData request.
  3. Test Endpoint
    • GET: https://<your-d365fo-url>/data/Customers
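The same flow can be scripted outside Postman. Below is a minimal Python sketch (standard library only) that builds the token request and calls the Customers entity — the angle-bracket placeholders are the same ones used above and must be replaced with your real values:

```python
import json
import urllib.parse
import urllib.request

# Same placeholders as the Postman example - substitute your real values.
ADFS_TOKEN_URL = "https://<adfs-url>/adfs/oauth2/token"
RESOURCE = "https://<your-d365fo-url>"

def build_token_request(client_id: str, username: str, password: str) -> bytes:
    """x-www-form-urlencoded body for the OAuth2 password grant."""
    return urllib.parse.urlencode({
        "grant_type": "password",
        "client_id": client_id,
        "username": username,
        "password": password,
        "resource": RESOURCE,
    }).encode()

def get_token(client_id: str, username: str, password: str) -> str:
    """POST to AD FS and return the bearer token from the JSON response."""
    req = urllib.request.Request(ADFS_TOKEN_URL,
                                 data=build_token_request(client_id, username, password))
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]

def get_customers(token: str) -> dict:
    """Call the Customers entity with the token in the Authorization header."""
    req = urllib.request.Request(f"{RESOURCE}/data/Customers",
                                 headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The same pattern works from any client that can send HTTP requests, which is handy for quick smoke tests before wiring up a full integration.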


Please be aware:

  • Token Expiry: Tokens typically expire after 1 hour. Refresh or reacquire as needed.
  • AD FS Clock Skew: Ensure time sync between the AD FS and AOS servers.
  • SSL Certificates: AD FS endpoints must be secured with valid SSL certs.
  • User Permissions: The authenticated user must have access to the data entities.

-Harry
Follow us on Facebook to keep in rhythm with us: https://fb.com/theaxapta

November 13, 2025

Working with Bicep – Part 2: Azure Function App Deployment

Hi Folks, 

In Part 1, we explored how to use Azure Bicep to deploy Logic Apps and integrate with Dynamics 365 for Finance and Operations (D365FO). Now, in Part 2, we’ll extend that architecture by deploying Azure Function Apps using Bicep—enabling custom logic, token handling, and deeper orchestration capabilities.

Just for a recap, let’s understand why Azure Functions are so useful. Azure Functions are serverless compute services ideal for lightweight processing, transformations, and integrations. When paired with Logic Apps, they offer:
  • Custom logic execution (e.g., token parsing, data shaping)
  • Event-driven triggers (e.g., D365FO business events)
  • Scalable backend processing without managing infrastructure
In today’s post, let’s take a real-world scenario. To authenticate Logic Apps with D365FO, you often need to retrieve an OAuth token. Here’s how an Azure Function can help: it can be called from the Logic App to retrieve and inject the token dynamically.
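The core of such a function is just an OAuth2 client-credentials request against Entra ID. Here is a minimal sketch of that core logic in Python (the URLs are placeholders and the function names are illustrative; in a real Function App you would wrap fetch_token in an HTTP-triggered function, read the secrets from application settings or Key Vault, and return the token as JSON for the Logic App to parse):

```python
import json
import urllib.parse
import urllib.request

# Placeholder values - in a real Function App these come from app settings / Key Vault.
TOKEN_URL = "https://login.microsoftonline.com/<tenant-id>/oauth2/token"
D365FO_RESOURCE = "https://<your-d365fo-url>"

def build_token_body(client_id: str, client_secret: str) -> bytes:
    """Client-credentials grant body for service-to-service auth against D365FO."""
    return urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": D365FO_RESOURCE,
    }).encode()

def fetch_token(client_id: str, client_secret: str) -> str:
    """POST the token request and return the access token for the Logic App."""
    req = urllib.request.Request(TOKEN_URL, data=build_token_body(client_id, client_secret))
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]

# In the HTTP-triggered handler, call fetch_token(...) and return the token as a
# JSON response so the Logic App can inject it into its subsequent D365FO calls.
```

The Logic App then calls this function with an HTTP action and uses the returned token in the Authorization header of its D365FO requests.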




Here are some known issues and their possible fixes (at least, they worked for me ;) )



-Harry
Follow us on Facebook to keep in rhythm with us: https://fb.com/theaxapta