November 13, 2025

Working with Bicep – Part 2: Azure Function App Deployment

Hi Folks, 

In Part 1, we explored how to use Azure Bicep to deploy Logic Apps and integrate with Dynamics 365 for Finance and Operations (D365FO). Now, in Part 2, we’ll extend that architecture by deploying Azure Function Apps using Bicep—enabling custom logic, token handling, and deeper orchestration capabilities.

Just for a recap, let's look at why Azure Functions are so useful. Azure Functions is a serverless compute service ideal for lightweight processing, transformations, and integrations. When paired with Logic Apps, it offers:
  • Custom logic execution (e.g., token parsing, data shaping)
  • Event-driven triggers (e.g., D365FO business events)
  • Scalable backend processing without managing infrastructure
In today's post, let's take a real-world scenario. To authenticate Logic Apps with D365FO, you often need to retrieve an OAuth token. Here's how an Azure Function can help: below is sample code that can be called from Logic Apps to retrieve and inject the token dynamically.
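A minimal sketch of such a token-retrieval function, assuming an Azure AD app registration with client-credentials access to the environment (the function names, tenant, client, and environment values below are placeholders of mine, not values from this post). It uses only the Python standard library, so it can run inside an HTTP-triggered Azure Function:

```python
import json
import urllib.parse
import urllib.request

# Azure AD v1 token endpoint; D365FO uses the environment URL as the OAuth 'resource'.
AUTHORITY = "https://login.microsoftonline.com/{tenant}/oauth2/token"

def build_token_request(tenant_id, client_id, client_secret, environment_url):
    """Build the client-credentials token request (URL + form-encoded body)."""
    url = AUTHORITY.format(tenant=tenant_id)
    payload = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": environment_url,
    }).encode("utf-8")
    return url, payload

def get_d365fo_token(tenant_id, client_id, client_secret, environment_url):
    """POST the request and return the access_token for the Logic App to inject."""
    url, payload = build_token_request(tenant_id, client_id, client_secret, environment_url)
    req = urllib.request.Request(url, data=payload, method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]
```

The Logic App can then call this function and read the returned token into its Authorization header.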
  Here are some known issues and their possible fixes (at least they worked for me ;) ):



-Harry Follow us on Facebook to keep in rhythm with us. https://fb.com/theaxapta

October 29, 2025

WorthKnowing: Measuring Code Quality: Beyond Subjective Judgment

Hi Folks, 

There are five main measurements for any code review:

1. Reliability

Reliability reflects the likelihood that software will operate without failure over a defined period. It hinges on two factors:
  • Defect count: Fewer bugs mean higher reliability. Static analysis tools can help identify defects early.
  • Availability: Measured using metrics like Mean Time Between Failures (MTBF), which indicates how often the system fails.
A reliable codebase is foundational to building robust software systems.
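As a quick illustration of the availability metric (my own sketch, not tied to any particular tool), MTBF is simply total operating time divided by the number of failures observed in that window:

```python
def mtbf_hours(total_uptime_hours, failure_count):
    """Mean Time Between Failures = operating time / number of failures."""
    if failure_count == 0:
        return float("inf")  # no observed failures in the measurement window
    return total_uptime_hours / failure_count

# e.g. 720 hours of operation with 3 failures gives an MTBF of 240 hours
```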

2. Maintainability

Maintainability assesses how easily code can be updated, fixed, or extended. It depends on:
  • Codebase size and structure
  • Consistency and complexity
  • Testability and understandability
No single metric can capture maintainability, but useful indicators include:
  • Stylistic warnings from linters
  • Halstead complexity measures, which quantify code readability and effort
Both automated tools and human reviewers play vital roles in maintaining clean, adaptable code.
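For context, Halstead volume is derived from simple operator/operand counts. A minimal sketch (the formulas follow Halstead's standard definitions; the function name is mine):

```python
import math

def halstead_volume(n1, n2, N1, N2):
    """Halstead volume V = N * log2(n).

    n1/n2: distinct operators/operands; N1/N2: total operators/operands.
    """
    vocabulary = n1 + n2   # n: distinct tokens
    length = N1 + N2       # N: total tokens
    return length * math.log2(vocabulary)

# e.g. 4 distinct operators, 4 distinct operands, 10 total tokens -> V = 30.0
```

Higher volume suggests more mental effort is needed to read and modify the code.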

3. Testability

Testability measures how effectively software can be tested. It’s influenced by:
  • Control and observability of components
  • Ability to isolate and automate tests
One way to assess testability is by evaluating how many test cases are needed to uncover faults. Tools like cyclomatic complexity analysis can help identify overly complex code that’s harder to test.
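To make that concrete, here is a rough sketch of cyclomatic complexity for Python source using the standard ast module (my own simplification; real analyzers count more decision-point types):

```python
import ast

# Decision points counted in this simplified sketch.
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler)

def cyclomatic_complexity(source: str) -> int:
    """Approximate McCabe complexity: 1 + number of decision points."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, BRANCH_NODES) for node in ast.walk(tree))

# A function containing a single `if` scores 2: one linear path plus one branch.
```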

4. Portability

Portability gauges how well software performs across different environments. While there’s no universal metric, best practices include:
  • Testing on multiple platforms throughout development—not just at the end
  • Using multiple compilers with strict warning levels
  • Enforcing consistent coding standards
These steps help ensure your code isn’t locked into a single ecosystem.

5. Reusability

Reusability determines whether existing code assets can be repurposed. Reusable code typically exhibits:
  • Modularity: Components are self-contained
  • Loose coupling: Minimal dependencies between modules
Static analysis tools can identify interdependencies that hinder reuse, helping teams refactor for better modularity.

Code quality isn’t a one-size-fits-all concept. But by focusing on measurable traits like reliability, maintainability, testability, portability, and reusability, teams can build software that’s not only functional—but also resilient, scalable, and future-proof.


-Harry Follow us on Facebook to keep in rhythm with us. https://fb.com/theaxapta

September 30, 2025

Working with Bicep – Part 1: Setup, Licensing, and D365FO Integration


As enterprise solutions evolve, Infrastructure as Code (IaC) is becoming a cornerstone of modern DevOps practices. While ARM templates have long been the standard for Azure resource deployment, Bicep—a domain-specific language (DSL) for ARM—offers a cleaner, more maintainable alternative.

In this two-part blog series, we’ll explore how to use Bicep to deploy Azure Logic Apps, with a focus on integrating with Dynamics 365 for Finance and Operations (D365FO). In Part 2, we’ll extend this by incorporating Azure Functions into the mix.

Why Bicep Over ARM Templates?

ARM templates are powerful but verbose and error-prone. Bicep simplifies this by offering:
- Simplified syntax: No more deeply nested JSON.
- Modularity: Reusable components via modules.
- Better tooling: IntelliSense, type safety, and validation in VS Code.
- Native support: Compiles directly to ARM templates.

Tools Required

1. Azure CLI (v2.20.0+)
2. Bicep CLI (or use `az bicep install`)
3. Visual Studio Code with the Bicep extension
4. Access to Azure Subscription with permissions to deploy resources
5. Logic Apps Standard or Consumption Plan

Licensing Considerations

- Logic Apps Consumption: Pay-per-action; no upfront cost.
- Logic Apps Standard: Fixed pricing per integration service environment (ISE) or App Service Plan.
- D365FO: Ensure the user/service principal has appropriate OData API permissions (typically via Azure AD App Registration).

Sample Bicep Template for Logic App Deployment

Here’s a minimal Bicep file to deploy a Logic App (Consumption Plan):

param logicAppName string = 'd365fo-integration-app'
param location string = resourceGroup().location

resource logicApp 'Microsoft.Logic/workflows@2019-05-01' = {
  name: logicAppName
  location: location
  properties: {
    definition: loadJsonContent('./logicapp-definition.json')
    parameters: {}
  }
}

Sample Logic App: Calling a D365FO Service

Let’s say you want to call a custom service in D365FO via OData. Your Logic App might include:
1. HTTP Trigger
2. HTTP Action to call D365FO

Example HTTP Action Configuration:

{
  "method": "GET",
  "uri": "https://<your-env>.cloudax.dynamics.com/data/CustomService",
  "headers": {
    "Authorization": "Bearer @{body('Get_Token')?['access_token']}",
    "Accept": "application/json"
  }
}

To authenticate, use Azure AD OAuth 2.0 with the client credentials flow. You can retrieve the token using a separate HTTP action or an Azure Function.
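The Get_Token action referenced in the Authorization header above could itself be a plain HTTP action along these lines (a sketch with placeholder tenant and client values; adjust to your own app registration):

```json
{
  "method": "POST",
  "uri": "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
  "headers": {
    "Content-Type": "application/x-www-form-urlencoded"
  },
  "body": "grant_type=client_credentials&client_id=<client-id>&client_secret=<client-secret>&resource=https://<your-env>.cloudax.dynamics.com"
}
```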

Folder Structure Recommendation

/logicapps
  ├── main.bicep
  ├── logicapp-definition.json
  └── parameters.json

Deployment Command

az deployment group create \
  --resource-group my-rg \
  --template-file main.bicep

In the next post, we’ll explore how to extend Logic Apps with Azure Functions, enabling more complex orchestration and custom logic—especially useful when working with D365FO’s business events or data transformations.

-Harry Follow us on Facebook to keep in rhythm with us. https://fb.com/theaxapta

September 01, 2025

[Solved] DVT script for service model: AOSService on machine: D365FO_DEVBox

Hi Folks, 

Sharing a quick fix for an issue I encountered while updating a Tier-1 D365FO environment to version 10.0.44. This environment hadn’t been used for a while, and the update process failed near the final stages—specifically around steps 65 and 66 (note: step numbers may vary).


Error:

DVT script for service model: AOSService on machine: <MachineName>
Reason:

When attempting to log in via the front end, I received an “unsafe connection” error. This indicates that the SSL certificate for the environment has expired or become invalid.
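One quick way to confirm this diagnosis before touching LCS is to inspect the certificate the environment is serving. A sketch using only the Python standard library (the hostname is a placeholder; an already-expired certificate will surface as an SSLCertVerificationError during the handshake):

```python
import ssl
import socket
from datetime import datetime, timezone

def days_remaining(not_after, now=None):
    """Days until a certificate's notAfter timestamp (OpenSSL text format)."""
    expires = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    expires = expires.replace(tzinfo=timezone.utc)
    now = now or datetime.now(timezone.utc)
    return (expires - now).days

def check_environment_cert(hostname, port=443):
    """Handshake with the server and report days until its cert expires."""
    ctx = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    return days_remaining(cert["notAfter"])

# e.g. check_environment_cert("<your-env>.cloudax.dynamics.com")
```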


To fix this, you need to rotate the SSL certificates via LCS:

  1. Go to the Environment Details page in LCS.
  2. Abort the update process.
  3. Click Maintain → Rotate secrets.
  4. Select Rotate the SSL certificates and confirm.

Once the rotation completes successfully, reinitiate the update. It should now proceed without errors.

Please check out other posts on similar issues.



-Harry Follow us on Facebook to keep in rhythm with us. https://fb.com/theaxapta