March 20, 2026

Creating multiple records in D365FO using OData batch

Hi Folks, 

In today's post I will explain how to create multiple records in Dynamics 365 Finance & Operations (D365FO) using a single OData batch request. The example uses Postman and demonstrates how each part of the batch payload works, including boundaries, headers, and common pitfalls.

[Please read and understand the full content first before using the examples.]

1. Overview

OData batch allows you to send several operations in one HTTP request. This is useful when you want to create multiple records at once, such as multiple Ledger Journal Headers. Instead of sending separate POST requests, you wrap them inside a batch and a changeset.

  • The batch groups everything together.
  • The changeset groups write operations (POST, PATCH, DELETE) and makes them transactional.
  • Each record creation is an application/http part inside the changeset.

2. Full Batch Payload Example
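The original post showed the payload here as a screenshot. A representative payload creating two Ledger Journal Headers might look like the following; the boundary GUIDs match those discussed in the next section, and the JSON field values are illustrative placeholders, not the exact values from the original example:

```http
--batch_7bf57939-a923-4e49-92d3-20fb4f2c8435
Content-Type: multipart/mixed; boundary=changeset_8a6f6ebe-e9c9-44a2-b980-c33a69370eb4

--changeset_8a6f6ebe-e9c9-44a2-b980-c33a69370eb4
Content-Type: application/http
Content-Transfer-Encoding: binary
Content-ID: 1

POST /data/LedgerJournalHeaders HTTP/1.1
Content-Type: application/json;odata.metadata=minimal
Accept: application/json;odata.metadata=minimal
Company: usmf
Prefer: return=representation

{"JournalName": "GenJrn", "Description": "Batch created journal 1"}
--changeset_8a6f6ebe-e9c9-44a2-b980-c33a69370eb4
Content-Type: application/http
Content-Transfer-Encoding: binary
Content-ID: 2

POST /data/LedgerJournalHeaders HTTP/1.1
Content-Type: application/json;odata.metadata=minimal
Accept: application/json;odata.metadata=minimal
Company: usmf
Prefer: return=representation

{"JournalName": "GenJrn", "Description": "Batch created journal 2"}
--changeset_8a6f6ebe-e9c9-44a2-b980-c33a69370eb4--
--batch_7bf57939-a923-4e49-92d3-20fb4f2c8435--
```

Note how every operation part opens with the changeset boundary, and how blank lines separate the part headers, the inner request headers, and the JSON body.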



3. Understanding Each Part of the Payload

Batch Boundary: The batch begins with:

    --batch_7bf57939-a923-4e49-92d3-20fb4f2c8435

This boundary must match the boundary declared in the main HTTP header:

    Content-Type: multipart/mixed; boundary=batch_7bf57939-a923-4e49-92d3-20fb4f2c8435

The batch ends with:

    --batch_7bf57939-a923-4e49-92d3-20fb4f2c8435--


Changeset Boundary

Inside the batch, you declare a changeset:

    Content-Type: multipart/mixed; boundary=changeset_8a6f6ebe-e9c9-44a2-b980-c33a69370eb4

Each POST request is wrapped inside this changeset. The changeset ends with:

    --changeset_8a6f6ebe-e9c9-44a2-b980-c33a69370eb4--


Individual Operations

Each operation starts with:

    --changeset_...
    Content-Type: application/http
    Content-Transfer-Encoding: binary
    Content-ID: <unique number>

Content-ID is used to identify the operation. It becomes important when referencing results between operations.
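As an illustrative sketch only: in the standard OData v4 batch convention, a later operation inside the same changeset can address the result of an earlier one through "$" followed by its Content-ID. D365FO's support for this referencing varies by entity and version, and the navigation property below is hypothetical:

```http
POST $1/SomeNavigationProperty HTTP/1.1
Content-Type: application/json;odata.metadata=minimal

{ "...": "..." }
```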

Inner HTTP Request

Each operation contains a full HTTP request:

    POST /data/LedgerJournalHeaders HTTP/1.1
    Content-Type: application/json;odata.metadata=minimal
    Accept: application/json;odata.metadata=minimal
    Company: usmf
    Prefer: return=representation

Key headers:

  • Company: Specifies the legal entity.
  • Prefer: return=representation: Returns the created record in the response.

The JSON body follows after a blank line.

4. How to Send This in Postman

  1. Set method to POST.
  2. URL: https://<your-environment>.cloudax.dynamics.com/data/$batch
  3. Add headers:
    • Authorization: Bearer <token>
    • Content-Type: multipart/mixed; boundary=batch_7bf57939-a923-4e49-92d3-20fb4f2c8435
    • Accept: application/json
  4. Paste the entire batch payload into the body (raw text).

Important:

  • Do not add extra spaces before boundary lines.
  • Ensure blank lines exist where required.
  • Boundary names must match exactly.
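Outside Postman, the same request can be assembled from a script, which eliminates boundary-mismatch mistakes because the boundary strings are generated once and reused. Below is a minimal Python sketch; the environment URL, bearer token, and JSON bodies are placeholders you must supply:

```python
import uuid

def build_batch_body(operations, batch_id, changeset_id):
    """Assemble a multipart/mixed OData batch body with one changeset.

    operations: list of (relative_url, json_body_string) tuples,
    each becoming a POST inside the changeset.
    """
    lines = []
    lines.append(f"--batch_{batch_id}")
    lines.append(f"Content-Type: multipart/mixed; boundary=changeset_{changeset_id}")
    lines.append("")  # blank line after changeset declaration
    for content_id, (url, body) in enumerate(operations, start=1):
        lines.append(f"--changeset_{changeset_id}")
        lines.append("Content-Type: application/http")
        lines.append("Content-Transfer-Encoding: binary")
        lines.append(f"Content-ID: {content_id}")
        lines.append("")  # blank line before the inner HTTP request
        lines.append(f"POST {url} HTTP/1.1")
        lines.append("Content-Type: application/json;odata.metadata=minimal")
        lines.append("Company: usmf")
        lines.append("")  # blank line before the JSON body
        lines.append(body)
    lines.append(f"--changeset_{changeset_id}--")
    lines.append(f"--batch_{batch_id}--")
    return "\r\n".join(lines)

batch_id = str(uuid.uuid4())
changeset_id = str(uuid.uuid4())
body = build_batch_body(
    [("/data/LedgerJournalHeaders", '{"JournalName": "GenJrn"}')],
    batch_id, changeset_id)
# To send: POST this body to https://<your-environment>.cloudax.dynamics.com/data/$batch
# with headers "Content-Type: multipart/mixed; boundary=batch_<batch_id>" and
# "Authorization: Bearer <token>" (e.g. via the requests library).
```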

5. Common Pitfalls

  • Boundary mismatch: The most common cause of errors.
  • Missing blank lines: Required between headers and body.
  • Incorrect Company header: May cause data to be created in the wrong legal entity.
  • Changeset errors: If one operation fails, all operations in the changeset fail.

6. Summary

Using OData batch in D365FO allows you to create multiple records efficiently in a single request. Understanding boundaries, headers, and formatting is essential for successful execution. The example provided can be used as a template for creating multiple Ledger Journal Headers or any other entity records.

Hope you find this post useful; please drop a comment if you need some more examples.


-Harry Follow us on Facebook to keep in rhythm with us. https://fb.com/theaxapta

March 06, 2026

WorthKnowing: Adding system fields in data entity

Hi Folks, 

System fields like CreatedBy, CreatedDateTime, ModifiedBy, and ModifiedDateTime appear in a D365FO entity’s metadata, but they don’t show up in exports or OData. This happens because system fields aren’t automatically exposed through the staging table.

To make them available, create wrapper fields with different names—such as CreatedBy1, CreatedDateTime1, ModifiedBy1, and ModifiedDateTime1—and map them to the system fields.

Steps
- Add the new fields to the staging table or its extension.
- Add them to the data entity and map each one to the corresponding system field.
- Build and synchronize the project.
- Export or call the entity via OData to confirm the fields appear.

This simple approach ensures audit fields are available for integrations, exports, and reporting.


-Harry Follow us on Facebook to keep in rhythm with us. https://fb.com/theaxapta

March 02, 2026

[Solved] The ‘VSProjectPackage’ package did not load correctly. UDE VS setup issue

Hi Folks, 

After configuring Visual Studio for UDE development, I was getting the error below when accessing anything in the Dynamics 365 menu.


Error message: 

The ‘VSProjectPackage’ package did not load correctly.

The problem may have been caused by a configuration change or by the installation of another extension. You can get more information by examining the file
C:\Users\deepak.agarwal\AppData\Roaming\Microsoft\VisualStudio\17.0_0dc3asdf9\ActivityLog.xml.

Restarting Visual Studio could help resolve this issue.




Solution: 

Note: Please save your work and close Visual Studio before performing these steps.

Most likely Visual Studio was not installed with all the required components. Check for and install the missing components below. You can do this from
Control Panel > Uninstall a program > Select Visual Studio 2022 > Click Change at the top

Now check for these three components:

1. Under Workloads > .NET desktop development



2. Navigate to Individual Components > Search for Model and select Model SDK



3. Within Individual Components, search for DGML and select 'DGML editor'




With these three components selected, modify/install Visual Studio. It may take a few minutes to download and install.

Once the installation completes successfully, you should be able to access all the options from the Dynamics 365 menu.

Cheers!!!

-Harry Follow us on Facebook to keep in rhythm with us. https://fb.com/theaxapta

January 26, 2026

Practical Ways to Use GitHub Copilot for X++ Development (Tips, Tricks, Hidden Gems & What to Avoid)

Hi Folks, 

In my last post about GitHub Copilot we covered what it is, how to enable it, and the licensing options. Now let’s move into the part everyone actually cares about: practical ways to use GitHub Copilot for X++ development.

Copilot isn’t magic, but when used well, it feels pretty close. Below are practical examples, prompt ideas, best practices, and a few “don’ts” that will save you time and frustration.

1. X++ Boilerplate Generation (Your New Time Saver)

X++ is full of repetitive patterns — table methods, service classes, form event handlers, data entities, and more. Copilot can generate 60–80% of this instantly.

Try prompts like:

  • “Create a validateWrite method for this table following best practices.”
  • “Generate a runnable class that updates sales orders with status ‘Open’.”
  • “Create a SysOperation framework class with contract, controller, and service.”

You still review and refine — but the heavy lifting is done. (I would still recommend doing this every single time.)

2. Documentation & XML Comments (The Most Underrated Feature)

Copilot is brilliant at generating XML documentation for methods, classes, and services.

Prompt idea:

  • “Add XML documentation to this method explaining parameters and return values.”

Why it matters

  • Better readability
  • Faster onboarding for new developers
  • Cleaner code reviews

3. OData & Integration Summaries

When working with integrations, Copilot can help you quickly draft:

  • OData endpoint descriptions
  • Request/response examples
  • Error-handling notes
  • API documentation

Prompt idea:

  • “Explain how this data entity will behave when called via OData.”

This is especially useful when writing technical design documents.

4. Best Practice Suggestions on Selected Code

This is where Copilot becomes a mini code reviewer.

Highlight a block of X++ code → open Inline Chat (Alt + /) → ask:

  • “Suggest best practices for this code.”
  • “Is there any performance issue here?”
  • “Rewrite this using recommended patterns.”

Example: Summarize Sales Order

Your example:

    public void summarizeSalesOrder(str salesOrderId)
    {
        SalesTable salesTable = SalesTable::find(salesOrderId);

        info(strFmt("Sales Order %1 for customer %2 has total amount %3",
            salesTable.SalesId, salesTable.CustAccount, salesTable.TotalAmount));
    }

Copilot might suggest:

  • Add null checks
  • Use ttsbegin/ttscommit if modifying data
  • Avoid info() in production code
  • Consider using SalesTotals class for accurate totals

This is where Copilot shines — it nudges you toward better patterns.

5. Plugin Development with Copilot Studio

If you’re building Copilot plugins for D365FO, Copilot can help with:

  • API schema generation
  • JSON payload examples
  • C# wrapper classes
  • Error-handling templates

This is still evolving, but it’s already a huge productivity boost.

6. Hidden Gems Most Developers Miss

  • Use #filename to give Copilot context

Example:

    #SalesTable Explain how validateWrite works for this table.

  • Use /intent to guide Copilot

Example:

    /refactor Rewrite this method to improve readability.

  • Keep related files open

Copilot reads open tabs to understand your project structure.

  • Use natural language

You don’t need to be formal. “Make this faster” works surprisingly well.

7. What NOT to Do with Copilot (Important!)

Copilot is powerful, but not perfect. Avoid:

Blindly accepting code

Always review for:

  • Security issues
  • Performance problems
  • Deprecated APIs

Using Copilot for business logic decisions

Copilot doesn’t know your customer’s requirements.

Expecting Copilot to understand custom frameworks

It learns from your codebase over time — but not instantly.

Using vague prompts

“Fix this” → not helpful
“Optimize this query to reduce DB calls” → much better

While all this feels like magic, GitHub Copilot won’t replace your X++ expertise — but it will absolutely amplify it. So don’t be afraid of using it; make it your companion. The more you use it, the better it understands your patterns, naming conventions, and coding style.


-Harry 
 Follow us on Facebook to keep in rhythm with us. https://fb.com/theaxapta

January 06, 2026

Streamlining D365FO Access with Entra ID Security Groups

Hi Friends,

Happy New Year!!! Let’s continue learning...

In this post, I will share some insight on how Entra ID security groups can be used to streamline D365FO access management.

Managing user access in Dynamics 365 Finance and Operations can be complex—especially in large organizations with frequent onboarding (or offboarding), role changes, and compliance needs. Microsoft Entra ID security groups provide an alternative to traditional role-based access control (RBAC), offering centralized, scalable access management. In this post let’s explore how to set them up, their pros and cons, and best practices for implementation.

Let’s look at some quick steps to set up Entra security groups:

  1. Enable the Feature:  In D365FO, navigate to Feature Management and enable Microsoft Entra ID Security Groups.
  2. Create Security Groups in Entra ID: Use the Microsoft Entra admin center to create groups. You can choose:
    • Assigned groups (manual membership)
    • Dynamic groups (rule-based membership based on user attributes)
  3. Assign Roles to Groups in D365FO: Go to System Administration > Security Configuration > Entra ID Security Groups.
    • Import your Entra groups.
    • Assign D365FO roles to each group.
  4. User Provisioning: When a user logs in, D365FO checks their group membership and automatically assigns roles based on the group configuration. This supports just-in-time (JIT) provisioning.

Of course, there are advantages over traditional role-based access, like:

  • Centralized Management: Admins can manage access across multiple apps from Entra ID.
  • Dynamic Membership: Automatically assign users to groups based on attributes (e.g., department, location).
  • Bulk Provisioning: Assign roles to many users at once—ideal for onboarding.
  • Lifecycle Automation: Role changes happen automatically when user attributes change.
  • Just-in-time access
  • Centralized onboarding and offboarding of users

And yes, there are some limitations compared to traditional role assignments, like:

  • No Role Visibility in User Profile: Roles assigned via groups don’t appear in the user’s security role list in D365FO.
  • Audit Complexity: Harder to trace exact role assignments for individual users. A few out-of-box reports don't support these users.
  • Limited Granularity: Cannot assign roles based on task-level needs unless you create many groups.
  • External users in Entra don't get access automatically.
  • Complex workflows may not work as expected. 

Now let’s talk about a few best practices for using Entra ID groups in D365FO:

  • Use Dynamic Groups for Automation: Define rules like user.department -eq "Finance" to auto-assign users to finance roles.
  • Combine with Direct Role Assignments: For exceptions or sensitive roles, assign them directly in D365FO to maintain visibility.
  • Document Group-to-Role Mapping: Maintain a clear mapping of which Entra groups correspond to which D365FO roles.
  • Audit Regularly: Use PowerShell or Graph API to extract group membership and validate access.
  • Avoid Overlapping Assignments: Ensure users don’t get conflicting roles from multiple groups.
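The last two practices above can be partially automated. As a sketch only, with an entirely hypothetical group-to-role mapping (in practice you would pull group membership from the Graph API and the role mapping from your own documentation), a few lines of Python can flag users who receive the same D365FO role from more than one group:

```python
from collections import defaultdict

def find_overlapping_roles(group_roles, user_groups):
    """Return {user: {role: [groups]}} for roles granted by two or more groups.

    group_roles: dict mapping Entra group name -> list of D365FO roles.
    user_groups: dict mapping user -> list of Entra groups they belong to.
    """
    overlaps = {}
    for user, groups in user_groups.items():
        role_sources = defaultdict(list)  # role -> groups that grant it
        for group in groups:
            for role in group_roles.get(group, []):
                role_sources[role].append(group)
        # keep only roles granted by more than one group
        dupes = {role: srcs for role, srcs in role_sources.items() if len(srcs) > 1}
        if dupes:
            overlaps[user] = dupes
    return overlaps

# Hypothetical mapping for illustration:
group_roles = {
    "FIN-Clerks": ["Accounts payable clerk"],
    "FIN-All": ["Accounts payable clerk", "Employee"],
}
user_groups = {"alice": ["FIN-Clerks", "FIN-All"], "bob": ["FIN-All"]}
print(find_overlapping_roles(group_roles, user_groups))
# alice is flagged: "Accounts payable clerk" comes from two groups
```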

My view:

Entra ID security groups simplify access management in D365FO, especially for large-scale or dynamic environments. However, they’re best used in tandem with traditional role assignments to balance automation with control. By following best practices, you can streamline provisioning while maintaining auditability and compliance.

-Harry Follow us on Facebook to keep in rhythm with us. https://fb.com/theaxapta