
Batch calls with Business Central (2) – Error handling


This is the second post in a series about batch calls with Business Central APIs. You can find the first one, about the basic operations with batch calls, here. In this post, I want to cover one of the most frequently asked questions about batch calls: what happens if one of the operations runs into an error? Will it stop further execution, and will the already processed operations be rolled back? Let’s have a look.

It’s very important to understand that with batch requests the response status is available on two levels: on the overall batch request and on each individual operation in the batch. For a single, direct API call you get the status back on the response level. Then you check if the status is in the 200 series to see if the call was successful. The response status of the batch request, however, does not indicate whether the individual operations in the batch were successful. You will get back status 200 if the $batch endpoint received the request and was able to read it. A response status other than 200 indicates a malformed request body, a failed authorization, etc. But if the $batch endpoint was able to parse the request body, then you will find the results of each individual operation in the response body. See the example response body in the first post, where each individual response has status 201 (created).

So, what happens if there is an error? The default behavior for Business Central APIs is that further execution stops and the returned response contains the results up to and including the operation that failed. Let’s take the same example batch request from the first post to create three journal lines, but now with an invalid date in the second operation.

{
	"requests": [
		{
			"method": "POST",
			"id": "r1",
			"url": "companies({{companyId}})/journals({{journalId}})/journalLines",
			"headers": {
				"Content-Type": "application/json"
			},
			"body": {
			    "accountId": "{{accountId}}",
			    "postingDate": "2020-10-20",
			    "documentNumber": "SALARY2020-12",
			    "amount": -3250,
			    "description": "Salary to Bob"
			}
		},
		{
			"method": "POST",
			"id": "r2",
			"url": "companies({{companyId}})/journals({{journalId}})/journalLines",
			"headers": {
				"Content-Type": "application/json"
			},
			"body": {
			    "accountId": "{{accountId}}",
			    "postingDate": "2020-10-20x",
			    "documentNumber": "SALARY2020-12",
			    "amount": -3500,
			    "description": "Salary to John"
	        }
		},
        {
			"method": "POST",
			"id": "r3",
			"url": "companies({{companyId}})/journals({{journalId}})/journalLines",
			"headers": {
				"Content-Type": "application/json"
			},
			"body": {
			    "accountId": "{{accountId2}}",
			    "postingDate": "2020-10-20",
			    "documentNumber": "SALARY2020-12",
			    "amount": 6750,
			    "description": "Salaries December 2020"
	        }
		}	
	]
}

The response looks like this:

HTTP/1.1 200 OK
Transfer-Encoding: chunked
Content-Type: application/json
Content-Encoding: gzip
Server: Microsoft-HTTPAPI/2.0
OData-Version: 4.0
Access-Control-Allow-Origin: *
Access-Control-Allow-Credentials: true
Access-Control-Expose-Headers: Date, Content-Length, Server, OData-Version
request-id: ae550433-524b-40bc-96d1-6f4efc8651dc
Date: Mon, 21 Dec 2020 22:46:36 GMT
{
    "responses": [
        {
            "id": "r1",
            "status": 201,
            "headers": {
                "location": "https://bcsandbox.docker.local:7048/bc/api/v2.0/companies(9f161476-1d3d-eb11-bb72-000d3a2b9218)/journals(f91409ba-1d3d-eb11-bb72-000d3a2b9218)/journalLines(3c9b67d0-0c41-eb11-a853-d0e7bcc597da)",
                "content-type": "application/json; odata.metadata=minimal",
                "odata-version": "4.0"
            },
            "body": {
                "@odata.context": "https://bcsandbox.docker.local:7048/bc/api/v2.0/$metadata#companies(9f161476-1d3d-eb11-bb72-000d3a2b9218)/journals(f91409ba-1d3d-eb11-bb72-000d3a2b9218)/journalLines/$entity",
                "@odata.etag": "W/\"JzQ0O0RwczFRK2dKNlhPZlhINUl5bGgzR3ZvVUdhRnYrZUZvTU4wUzVVeU54QWM9MTswMDsn\"",
                "id": "3c9b67d0-0c41-eb11-a853-d0e7bcc597da",
                "journalId": "f91409ba-1d3d-eb11-bb72-000d3a2b9218",
                "journalDisplayName": "DEFAULT",
                "lineNumber": 130000,
                "accountType": "G_x002F_L_x0020_Account",
                "accountId": "ae4110b4-1d3d-eb11-bb72-000d3a2b9218",
                "accountNumber": "60700",
                "postingDate": "2020-10-20",
                "documentNumber": "SALARY2020-12",
                "externalDocumentNumber": "",
                "amount": -3250.00,
                "description": "Salary to Bob",
                "comment": "",
                "taxCode": "NONTAXABLE",
                "balanceAccountType": "G_x002F_L_x0020_Account",
                "balancingAccountId": "00000000-0000-0000-0000-000000000000",
                "balancingAccountNumber": "",
                "lastModifiedDateTime": "2020-12-18T08:41:26.68Z"
            }
        },
        {
            "id": "r2",
            "status": 400,
            "headers": {
                "content-type": "application/json; odata.metadata=minimal",
                "odata-version": "4.0"
            },
            "body": {
                "error": {
                    "code": "BadRequest",
                    "message": "Cannot convert the literal '2020-10-20x' to the expected type 'Edm.Date'.  CorrelationId:  f6123128-1db7-4ee8-9a79-469d713e1e54."
                }
            }
        }
    ]
}

The response status of the batch call itself was 200, indicating a success! Obviously, you need to read the response body to figure out whether an operation failed and which operations weren’t executed at all (they are missing from the response body). There might be scenarios where this is perfectly fine, but in most cases you want to either continue to process the other operations or completely roll back the whole batch. Let’s explore these two options.

Continue on error

The first option is to continue to process the additional requests in the batch. This can be requested by means of the Prefer: odata.continue-on-error header.

POST {{baseurl}}/api/v2.0/$batch
Content-Type: application/json
Accept: application/json
Prefer: odata.continue-on-error

Now the response body contains a result for all operations in the batch:

{
    "responses": [
        {
            "id": "r1",
            "status": 201,
            "headers": {
                "location": "https://bcsandbox.docker.local:7048/bc/api/v2.0/companies(9f161476-1d3d-eb11-bb72-000d3a2b9218)/journals(f91409ba-1d3d-eb11-bb72-000d3a2b9218)/journalLines(6fea4f2b-1041-eb11-a853-d0e7bcc597da)",
                "content-type": "application/json; odata.metadata=minimal",
                "odata-version": "4.0"
            },
            "body": {
                "@odata.context": "https://bcsandbox.docker.local:7048/bc/api/v2.0/$metadata#companies(9f161476-1d3d-eb11-bb72-000d3a2b9218)/journals(f91409ba-1d3d-eb11-bb72-000d3a2b9218)/journalLines/$entity",
                "@odata.etag": "W/\"JzQ0O05nSVI2ZVU1NVpXazQySUlBS2dUbHVlK0dhUElJb2hRYjZqbk12ZUxrbk09MTswMDsn\"",
                "id": "6fea4f2b-1041-eb11-a853-d0e7bcc597da",
                "journalId": "f91409ba-1d3d-eb11-bb72-000d3a2b9218",
                "journalDisplayName": "DEFAULT",
                "lineNumber": 140000,
                "accountType": "G_x002F_L_x0020_Account",
                "accountId": "ae4110b4-1d3d-eb11-bb72-000d3a2b9218",
                "accountNumber": "60700",
                "postingDate": "2020-10-20",
                "documentNumber": "SALARY2020-12",
                "externalDocumentNumber": "",
                "amount": -3250.00,
                "description": "Salary to Bob",
                "comment": "",
                "taxCode": "NONTAXABLE",
                "balanceAccountType": "G_x002F_L_x0020_Account",
                "balancingAccountId": "00000000-0000-0000-0000-000000000000",
                "balancingAccountNumber": "",
                "lastModifiedDateTime": "2020-12-18T09:05:27.69Z"
            }
        },
        {
            "id": "r2",
            "status": 400,
            "headers": {
                "content-type": "application/json; odata.metadata=minimal",
                "odata-version": "4.0"
            },
            "body": {
                "error": {
                    "code": "BadRequest",
                    "message": "Cannot convert the literal '2020-10-20x' to the expected type 'Edm.Date'.  CorrelationId:  b3bd539c-729f-4a20-9c8e-3868917d0283."
                }
            }
        },
        {
            "id": "r3",
            "status": 201,
            "headers": {
                "location": "https://bcsandbox.docker.local:7048/bc/api/v2.0/companies(9f161476-1d3d-eb11-bb72-000d3a2b9218)/journals(f91409ba-1d3d-eb11-bb72-000d3a2b9218)/journalLines(70ea4f2b-1041-eb11-a853-d0e7bcc597da)",
                "content-type": "application/json; odata.metadata=minimal",
                "odata-version": "4.0"
            },
            "body": {
                "@odata.context": "https://bcsandbox.docker.local:7048/bc/api/v2.0/$metadata#companies(9f161476-1d3d-eb11-bb72-000d3a2b9218)/journals(f91409ba-1d3d-eb11-bb72-000d3a2b9218)/journalLines/$entity",
                "@odata.etag": "W/\"JzQ0O3ZqVHpoVm12UGd2NEQ0Tk1NM0lYSnVMUlorZDlpS2dYUk5xWDNyQmpJSzQ9MTswMDsn\"",
                "id": "70ea4f2b-1041-eb11-a853-d0e7bcc597da",
                "journalId": "f91409ba-1d3d-eb11-bb72-000d3a2b9218",
                "journalDisplayName": "DEFAULT",
                "lineNumber": 150000,
                "accountType": "G_x002F_L_x0020_Account",
                "accountId": "844110b4-1d3d-eb11-bb72-000d3a2b9218",
                "accountNumber": "20700",
                "postingDate": "2020-10-20",
                "documentNumber": "SALARY2020-12",
                "externalDocumentNumber": "",
                "amount": 6750.00,
                "description": "Salaries December 2020",
                "comment": "",
                "taxCode": "NONTAXABLE",
                "balanceAccountType": "G_x002F_L_x0020_Account",
                "balancingAccountId": "00000000-0000-0000-0000-000000000000",
                "balancingAccountNumber": "",
                "lastModifiedDateTime": "2020-12-18T09:05:27.737Z"
            }
        }
    ]
}

As you can see, the second operation has status 400, while operations r1 and r3 have the expected status 201.

Let’s move on to transactional behavior.

Transactional

The other scenario is to roll back all operations: the batch is handled as one transaction. The OData standard has a feature for this, but it is not implemented by Business Central. Luckily, there is an alternative. Let’s first look at the standard OData feature, just in case you come across it and wonder why it doesn’t work. The standard feature is called ‘atomicity group’ or ‘changeset’. This is an additional property of the operation to indicate multiple operations that must be processed as one atomic operation and must either all succeed or all fail. Here is an example of the batch request body with all operations in one atomicity group (the OData specification allows for multiple atomicity groups in a batch request):

{
	"requests": [
		{
			"method": "POST",
            "atomicityGroup": "group1",
			"id": "r1",
			"url": "companies({{companyId}})/journals({{journalId}})/journalLines",
			"headers": {
				"Content-Type": "application/json"
			},
			"body": {
			    "accountId": "{{accountId}}",
			    "postingDate": "2020-10-20",
			    "documentNumber": "SALARY2020-12",
			    "amount": -3250,
			    "description": "Salary to Bob"
			}
		},
		{
			"method": "POST",
            "atomicityGroup": "group1",
			"id": "r2",
			"url": "companies({{companyId}})/journals({{journalId}})/journalLines",
			"headers": {
				"Content-Type": "application/json"
			},
			"body": {
			    "accountId": "{{accountId}}",
			    "postingDate": "2020-10-20",
			    "documentNumber": "SALARY2020-12",
			    "amount": -3500,
			    "description": "Salary to John"
	        }
		},
        {
			"method": "POST",
            "atomicityGroup": "group1",
			"id": "r3",
			"url": "companies({{companyId}})/journals({{journalId}})/journalLines",
			"headers": {
				"Content-Type": "application/json"
			},
			"body": {
			    "accountId": "{{accountId2}}",
			    "postingDate": "2020-10-20",
			    "documentNumber": "SALARY2020-12",
			    "amount": 6750,
			    "description": "Salaries December 2020"
	        }
		}
    ]
}

Unfortunately, when you send this call to Business Central, the response has status 500 and this response body:

{
    "error": {
        "code": "BadRequest_NotSupported",
        "message": "Multiple requests within the same change set are not supported by Microsoft Dynamics 365 Business Central OData web services.  CorrelationId:  0b74f9b1-f9f1-42fa-bcf3-6fc8879d6bb8."
    }
}

The reason this feature is not supported is that it’s quite hard to implement. Each operation is individually processed and committed to the database. Rolling back multiple committed transactions is not easy, especially when you consider that other processes can make modifications simultaneously.

Alternative: Isolation

The alternative is to enable transactional batch behavior with the header Isolation: snapshot. If one of the operations fails, then all committed changes from the operations in the same batch will be rolled back. Microsoft documented this feature here: https://docs.microsoft.com/en-us/dynamics365/business-central/dev-itpro/developer/devenv-connect-apps-tips#batch

What is quite strange is that the Isolation header seems to be a non-standard header. I couldn’t find any official documentation about an OData header with the name Isolation. However, there is a standard OData header called OData-Isolation, with snapshot as the only supported value. And guess what, this official header has the same effect. I really wonder why Microsoft came up with a non-standard header while there is a standard header available.
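
For reference, the header goes on the batch request itself, next to the usual content headers. Both variants below had the same effect in my tests; only the header name differs:

POST {{baseurl}}/api/v2.0/$batch
Content-Type: application/json
Accept: application/json
Isolation: snapshot

POST {{baseurl}}/api/v2.0/$batch
Content-Type: application/json
Accept: application/json
OData-Isolation: snapshot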

The Isolation header is officially not designed for implementing transactional behavior. So what’s the deal with isolation, why does it work? Snapshot isolation makes sure that the API request only returns data that is a result of the API call itself; no data from other processes can be included. That could theoretically happen with a batch request, because the operations are processed and committed individually: consecutive operations in a batch request could return data that has been modified by another process. Snapshot isolation prevents that; all operations work on the database as it was at the start of the request, plus all modifications from the request itself. By the way, this also works for single requests, not only for batch requests.

Apparently, this behavior has been taken as an opportunity to roll back the isolated modifications if any operation in the batch fails. Again, it’s not what it was intended for, and I would definitely like to see the atomicity group instead of this workaround. It’s the next best option, but not ideal.

What does the response look like when snapshot isolation has been applied? Surprisingly, there is no difference from the previous examples. The batch request will either fail on the first error or continue, depending on the Prefer: odata.continue-on-error header. But if there was an error, then all committed changes will be rolled back.

Recommendation

My recommendation is to combine those two headers. With snapshot isolation, you get transactional behavior. And with continue-on-error you will get a full list of all failing operations instead of only the first one.
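
Putting it all together, the request headers for a batch call with both transactional behavior and a complete error list look like this:

POST {{baseurl}}/api/v2.0/$batch
Content-Type: application/json
Accept: application/json
Isolation: snapshot
Prefer: odata.continue-on-error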


Batch calls with Business Central APIs (3) – Tips and Tricks


This is the third post in a series about batch calls with Business Central. If you haven’t read the other posts, then I recommend doing so. You’ll find the first post about basic operations with batch calls here. The second post about error handling and transactions can be found here. This third and final post will cover these topics:

  • Reduce the size of the response payload
  • Combine multiple operation types, like POST and GET
  • Define the order of operations
  • Batch calls with amounts running into an error or timeout

Reduce the size of the response payload

This tip works for both batch calls and normal API calls. As you have seen in the first post about basic operations, the response of the batch call contains a response for every single operation. What if you don’t do anything with this information? Maybe the only thing you want to know is whether the operations were successful, but you don’t really use the returned data.

An example could be when you create multiple sales order lines, but in the end you want to retrieve the full sales order with header and lines, instead of getting back each individual line. Or maybe you want to post the sales order after creating the lines and then retrieve the posted invoice.

Another scenario could be that you create multiple records, but you don’t want to get the full details in the result. Only the created id and number could be sufficient.

Not returning the full data of the individual operations greatly reduces the JSON payload that goes over the wire and has a positive effect on the performance.

Let’s first look at only retrieving those details we are interested in. In the following example, three items are being created and we only want to get back the created number (from the number series) and the id.

POST {{baseurl}}/api/v2.0/$batch
Content-Type: application/json
Accept: application/json
{
	"requests": [
		{
			"method": "POST",
		    "id": "1",
			"url": "companies({{companyId}})/items?$select=id,number",
			"headers": {
				"Content-Type": "application/json"
			},
			"body": {
                "displayName" : "Item 1"
			}
		},
		{
			"method": "POST",
			"id": "2",
			"url": "companies({{companyId}})/items?$select=id,number",
			"headers": {
				"Content-Type": "application/json"
			},
			"body": {
                "displayName" : "Item 2"
	        }
		},
        {
			"method": "POST",
			"id": "3",
			"url": "companies({{companyId}})/items?$select=id,number",
			"headers": {
				"Content-Type": "application/json"
			},
			"body": {
                "displayName" : "Item 3"
	        }
		}
    ]
}

As you can see, the URLs of the operations have the parameter $select=id,number. Compare this to a SQL SELECT query. If you don’t use $select, you are in fact saying SELECT * FROM. But with $select, you retrieve a reduced dataset of only the fields you are interested in. That works not only for the GET command, but also for the POST command.
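
As a side note, the same $select also works on a direct call outside a batch, for example:

GET {{baseurl}}/api/v2.0/companies({{companyId}})/items?$select=id,number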

The result of the batch call looks like:

{
    "responses": [
        {
            "id": "1",
            "status": 201,
            "headers": {
                "location": "https://bcsandbox.docker.local:7048/bc/api/v2.0/companies(eaf06bb2-de49-eb11-bb51-000d3a25738b)/items(356f53bb-874f-eb11-a856-8ee7d7617d9e)",
                "content-type": "application/json; odata.metadata=minimal",
                "odata-version": "4.0"
            },
            "body": {
                "@odata.context": "https://bcsandbox.docker.local:7048/bc/api/v2.0/$metadata#companies(eaf06bb2-de49-eb11-bb51-000d3a25738b)/items/$entity",
                "@odata.etag": "W/\"JzQ0O3kzdGdHUEgzZThHU3lTZU5BVlNjM3JXR0lFSnNGWE1uQTh2MmczRU1WcVU9MTswMDsn\"",
                "id": "356f53bb-874f-eb11-a856-8ee7d7617d9e",
                "number": "1001"
            }
        },
        {
            "id": "2",
            "status": 201,
            "headers": {
                "location": "https://bcsandbox.docker.local:7048/bc/api/v2.0/companies(eaf06bb2-de49-eb11-bb51-000d3a25738b)/items(376f53bb-874f-eb11-a856-8ee7d7617d9e)",
                "content-type": "application/json; odata.metadata=minimal",
                "odata-version": "4.0"
            },
            "body": {
                "@odata.context": "https://bcsandbox.docker.local:7048/bc/api/v2.0/$metadata#companies(eaf06bb2-de49-eb11-bb51-000d3a25738b)/items/$entity",
                "@odata.etag": "W/\"JzQ0O0VCRUZGRlFuS1VWb0xGUFl2NkhSQmowUi9PUlBqMC9sdG96UnRTekNuRVE9MTswMDsn\"",
                "id": "376f53bb-874f-eb11-a856-8ee7d7617d9e",
                "number": "1002"
            }
        },
        {
            "id": "3",
            "status": 201,
            "headers": {
                "location": "https://bcsandbox.docker.local:7048/bc/api/v2.0/companies(eaf06bb2-de49-eb11-bb51-000d3a25738b)/items(396f53bb-874f-eb11-a856-8ee7d7617d9e)",
                "content-type": "application/json; odata.metadata=minimal",
                "odata-version": "4.0"
            },
            "body": {
                "@odata.context": "https://bcsandbox.docker.local:7048/bc/api/v2.0/$metadata#companies(eaf06bb2-de49-eb11-bb51-000d3a25738b)/items/$entity",
                "@odata.etag": "W/\"JzQ0OzBCdG1WRVA2NGpqZEJaSjJiaW5VYkNnbXhJYTZ1ckphdGdUMWdvOFNBalE9MTswMDsn\"",
                "id": "396f53bb-874f-eb11-a856-8ee7d7617d9e",
                "number": "1003"
            }
        }
    ]
}

What if you are not even interested in a single field, but only want to see the status? The solution is easy: add the header Prefer: return-no-content to the operation, which tells the server not to include a body. This header also works on direct calls!

{
	"requests": [
		{
			"method": "POST",
		    "id": "1",
			"url": "companies({{companyId}})/items?$select=id,number",
			"headers": {
				"Content-Type": "application/json",
                "Prefer": "return-no-content"
			},
			"body": {
                "displayName" : "Item 1"
			}
		},
		{
			"method": "POST",
			"id": "2",
			"url": "companies({{companyId}})/items?$select=id,number",
			"headers": {
				"Content-Type": "application/json",
                "Prefer": "return-no-content"
			},
			"body": {
                "displayName" : "Item 2"
	        }
		},
        {
			"method": "POST",
			"id": "3",
			"url": "companies({{companyId}})/items?$select=id,number",
			"headers": {
				"Content-Type": "application/json",
                "Prefer": "return-no-content"
			},
			"body": {
                "displayName" : "Item 3"
	        }
		}
    ]
}

The result now looks like this:

{
    "responses": [
        {
            "id": "1",
            "status": 204,
            "headers": {
                "preference-applied": "return-no-content",
                "location": "https://bcsandbox.docker.local:7048/bc/api/v2.0/companies(eaf06bb2-de49-eb11-bb51-000d3a25738b)/items(83830e14-8a4f-eb11-a856-8ee7d7617d9e)"
            }
        },
        {
            "id": "2",
            "status": 204,
            "headers": {
                "preference-applied": "return-no-content",
                "location": "https://bcsandbox.docker.local:7048/bc/api/v2.0/companies(eaf06bb2-de49-eb11-bb51-000d3a25738b)/items(85830e14-8a4f-eb11-a856-8ee7d7617d9e)"
            }
        },
        {
            "id": "3",
            "status": 204,
            "headers": {
                "preference-applied": "return-no-content",
                "location": "https://bcsandbox.docker.local:7048/bc/api/v2.0/companies(eaf06bb2-de49-eb11-bb51-000d3a25738b)/items(87830e14-8a4f-eb11-a856-8ee7d7617d9e)"
            }
        }
    ]
}

As you can see, the bodies of the operations are missing, and instead a location header has been added with a direct URL to the created record. Also notice that the status of the operation is 204 No Content, not 201 Created as in the previous example. It still means OK, because it is a 2xx status code. If there was an error, then the operation will have the usual 400 status code and the body will still include the error message, ignoring the no-content preference.

Combine multiple operation types

A batch call doesn’t have to consist of only similar operations to the same API or of the same type. Each operation can be for a different API. For example, it is no problem to create a new customer, vendor, item, etc., all in one batch. The operations also don’t have to be of the same type; it is perfectly possible to have any combination of POST, GET, PATCH and DELETE operations in one batch. Some scenarios where that might be useful are (spoiler alert: not all of these are possible):

  • Create a new customer and a new sales order for this new customer
  • Create a new sales order, post it and retrieve the created invoice
  • Create multiple journal lines and post them

The last scenario is the simplest one. But before we look at the batch request for this scenario, we have to think about a potential problem. Although it seems that all operations in a batch request are executed one after another, that does not mean they wait for each other to complete. The operations may be processed in parallel. But you don’t want the posting of the journal lines to start before the creation of the lines has been completed. That’s what the dependsOn property is for. With this property you can define that an operation in the batch should not start before the operations it depends on have successfully completed.

Let’s look at the body of this batch request, in particular the dependsOn property of the last operation with id 4. It specifies that operations 1, 2 and 3 must be completed first. Also note that the Prefer: return-no-content header is used to reduce the resulting payload.

{
	"requests": [
		{
			"method": "POST",
		    "id": "1",
			"url": "companies({{companyId}})/journals({{journalId}})/journalLines",
			"headers": {
				"Content-Type": "application/json",
                "Prefer": "return-no-content"
			},
			"body": {
			    "accountId": "{{accountId}}",
			    "postingDate": "2020-10-20",
			    "documentNumber": "SALARY2020-12",
			    "amount": -3250,
			    "description": "Salary to Bob"
			}
		},
		{
			"method": "POST",
			"id": "2",
			"url": "companies({{companyId}})/journals({{journalId}})/journalLines",
			"headers": {
				"Content-Type": "application/json",
                "Prefer": "return-no-content"
			},
			"body": {
			    "accountId": "{{accountId}}",
			    "postingDate": "2020-10-20",
			    "documentNumber": "SALARY2020-12",
			    "amount": -3500,
			    "description": "Salary to John"
	        }
		},
        {
			"method": "POST",
			"id": "3",
			"url": "companies({{companyId}})/journals({{journalId}})/journalLines",
			"headers": {
				"Content-Type": "application/json",
                "Prefer": "return-no-content"
			},
			"body": {
			    "accountId": "{{accountId2}}",
			    "postingDate": "2020-10-20",
			    "documentNumber": "SALARY2020-12",
			    "amount": 6750,
			    "description": "Salaries December 2020"
	        }
		},
        {
            "method": "POST",
			"id": "4",
            "dependsOn": ["1","2","3"],
			"url": "companies({{companyId}})/journals({{journalId}})/Microsoft.NAV.post",
			"headers": {
				"Content-Type": "application/json"
			},
			"body": { }
        }
    ]
}

The result looks like:

{
    "responses": [
        {
            "id": "1",
            "status": 204,
            "headers": {
                "preference-applied": "return-no-content",
                "location": "https://bcsandbox.docker.local:7048/bc/api/v2.0/companies(eaf06bb2-de49-eb11-bb51-000d3a25738b)/journals(875626ff-de49-eb11-bb51-000d3a25738b)/journalLines(116001c3-994f-eb11-a856-8ee7d7617d9e)"
            }
        },
        {
            "id": "2",
            "status": 204,
            "headers": {
                "preference-applied": "return-no-content",
                "location": "https://bcsandbox.docker.local:7048/bc/api/v2.0/companies(eaf06bb2-de49-eb11-bb51-000d3a25738b)/journals(875626ff-de49-eb11-bb51-000d3a25738b)/journalLines(126001c3-994f-eb11-a856-8ee7d7617d9e)"
            }
        },
        {
            "id": "3",
            "status": 204,
            "headers": {
                "preference-applied": "return-no-content",
                "location": "https://bcsandbox.docker.local:7048/bc/api/v2.0/companies(eaf06bb2-de49-eb11-bb51-000d3a25738b)/journals(875626ff-de49-eb11-bb51-000d3a25738b)/journalLines(136001c3-994f-eb11-a856-8ee7d7617d9e)"
            }
        },
        {
            "id": "4",
            "status": 204,
            "headers": {
                "location": "https://bcsandbox.docker.local:7048/bc/api/v2.0/companies(eaf06bb2-de49-eb11-bb51-000d3a25738b)/journals(875626ff-de49-eb11-bb51-000d3a25738b)"
            }
        }
    ]
}

As you can see, no content is included for the inserts of the lines. They are gone after posting anyway, so it wouldn’t be useful to have them. And the final call to post the journal also completed successfully.

Let’s now look at a slightly more complicated scenario. I want to create a new sales order, post it and retrieve the posted invoice, all in one go. Look at the body below. The first operation is a POST command to create the sales order including two lines (using a deep insert). The second operation posts the created sales order (standard ship and invoice) and the third operation is a GET to retrieve the just posted invoice. Can you see where this goes wrong?

{
    "requests": [
        {
            "method": "POST",
            "id": "1",
            "url": "companies({{companyId}})/salesOrders",
            "headers": {
                "Content-Type": "application/json; odata.metadata=minimal; odata.streaming=true",
                "Prefer": "return-no-content"
            },
            "body": {
                "customerId": "{{customerId}}",
                "externalDocumentNumber": "1234",
                "orderDate": "2021-01-05",
                "salesOrderLines": [
                    {
                        "itemId": "{{itemId}}",
                        "quantity": 1
                    },
                    {
                        "itemId": "{{itemId2}}",
                        "quantity": 6
                    }
                ]
            }
        },
        {
            "method": "POST",
            "id": "2",
            "url": "{{url}}/companies({{companyId}})/salesOrders({{salesOrderId}})/Microsoft.NAV.Post",
            "dependsOn": ["1"],
            "headers": {
                "Content-Type": "application/json; odata.metadata=minimal; odata.streaming=true"
            },
            "body": {}
        },
        {
            "method": "GET",
            "id": "3",
            "url": "{{url}}/companies({{companyId}})/salesInvoices({{salesOrderId}})",
            "dependsOn": ["2"]
        }
    ]
}

The problem is that the second and third operations need the id of the sales order that was created with the first operation. However, the id is unknown at the moment we create the batch call. The question is now: would it be possible to get the id of the created record and use it in the next operation? The answer is that the OData specification officially supports this in a change set (aka atomicity group). But as I explained in the previous post, Business Central does not support change sets. Microsoft came up with the Isolation header to support transactions, while the change set feature is really designed for transactions and also supports referencing new entities in the same change set.

Bummer…

OData v4.01 even supports referencing new entities without change sets. However, it appears that Business Central is not using OData v4.01; the response headers include an OData-Version: 4.0 header.

For OData v4.01, the batch request would look like this (note the URLs of operations 2 and 3):

{
    "requests": [
        {
            "method": "POST",
            "id": "1",
            "url": "companies({{companyId}})/salesOrders",
            "headers": {
                "Content-Type": "application/json; odata.metadata=minimal; odata.streaming=true",
                "Prefer": "return-no-content"
            },
            "body": {
                "customerId": "{{customerId}}",
                "externalDocumentNumber": "1234",
                "orderDate": "2021-01-05",
                "salesOrderLines": [
                    {
                        "itemId": "{{itemId}}",
                        "quantity": 1
                    },
                    {
                        "itemId": "{{itemId2}}",
                        "quantity": 6
                    }
                ]
            }
        },
        {
            "method": "POST",
            "id": "2",
            "url": "$1/Microsoft.NAV.Post",
            "dependsOn": ["1"],
            "headers": {
                "Content-Type": "application/json; odata.metadata=minimal; odata.streaming=true"
            },
            "body": {}
        },
        {
            "method": "GET",
            "id": "3",
            "url": "$2",
            "dependsOn": ["2"]
        }
    ]
}

But the response comes back with an error:

{
    "responses": [
        {
            "id": "1",
            "status": 204,
            "headers": {
                "preference-applied": "return-no-content",
                "location": "https://bcsandbox.docker.local:7048/bc/api/v2.0/companies(eaf06bb2-de49-eb11-bb51-000d3a25738b)/salesOrders(6a2c7a9a-a94f-eb11-a856-8ee7d7617d9e)"
            }
        },
        {
            "id": "",
            "status": 500,
            "headers": {
                "content-type": "application/json; charset=utf-8"
            },
            "body": {
                "error": {
                    "code": "Unknown",
                    "message": "This operation is not supported for a relative URI.  CorrelationId:  07ececa3-e062-46b1-aedd-ad4f5919f0cd."
                }
            }
        }
    ]
}

I’ve tried many different scenarios, but unfortunately I was not able to find a way to do a batch call that creates a new entity and then uses the created entity in a subsequent operation. To be honest, this really reduces the possible scenarios for batch calls with Business Central. I can only hope Microsoft will support this in a future version of Business Central.
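
Until then, the workaround is to split the scenario over multiple requests: create the sales order in the first call, read the id of the new order from the returned location header, and use it in a follow-up call. A sketch, where {{createdOrderId}} is a placeholder for the id parsed from that location header:

POST {{baseurl}}/api/v2.0/companies({{companyId}})/salesOrders({{createdOrderId}})/Microsoft.NAV.Post
Content-Type: application/json

{}

It costs an extra round trip, but it reliably chains the post to the newly created order.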

Batch calls with decimals run into an error or timeout

Recently, there was an issue reported on Twitter about batch calls that ran into a timeout and finally returned a 500 error. It appears that this happens on the SaaS platform; on-prem environments, like a docker container, return an error immediately. Consider this batch request:

{
	"requests": [
		{
			"method": "POST",
			"id": "1",
			"url": "companies({{companyId}})/journals({{journalId}})/journalLines",
			"headers": {
				"Content-Type": "application/json"
			},
			"body": {
			    "accountId": "{{accountId}}",
			    "postingDate": "2020-10-20",
			    "documentNumber": "SALARY2020-12",
			    "amount": -3250.25,
			    "description": "Salary to Bob"
			}
		},
		{
			"method": "POST",
			"id": "2",
			"url": "companies({{companyId}})/journals({{journalId}})/journalLines",
			"headers": {
				"Content-Type": "application/json"
			},
			"body": {
			    "accountId": "{{accountId}}",
			    "postingDate": "2020-10-20",
			    "documentNumber": "SALARY2020-12",
			    "amount": -3500.75,
			    "description": "Salary to John"
	        }
		},
        {
			"method": "POST",
			"id": "3",
			"url": "companies({{companyId}})/journals({{journalId}})/journalLines",
			"headers": {
				"Content-Type": "application/json"
			},
			"body": {
			    "accountId": "{{accountId2}}",
			    "postingDate": "2020-10-20",
			    "documentNumber": "SALARY2020-12",
			    "amount": 6751,
			    "description": "Salaries December 2020"
	        }
		}
    ]
}

The amounts have decimals, which is perfectly fine for single API calls. But when you do this in a batch call, it results in this error message:

{
    "responses": [
        {
            "id": "1",
            "status": 400,
            "headers": {
                "content-type": "application/json; odata.metadata=minimal",
                "odata-version": "4.0"
            },
            "body": {
                "error": {
                    "code": "BadRequest",
                    "message": "Cannot convert a value to target type 'Edm.Decimal' because of conflict between input format string/number and parameter 'IEEE754Compatible' false/true.  CorrelationId:  8afaf53f-e0df-4101-8000-768aaefdb603."
                }
            }
        }
    ]
}

For unknown reasons, the SaaS environment doesn’t even respond with this error. It just runs for minutes and then finally returns a 500 error.

The solution for this error is to add IEEE754Compatible=true to the Content-Type header. More information about this header can be found here. The header can be specified on the batch request level; it does not have to be on the individual operations in the batch request. The headers will then look like:

POST {{baseurl}}/api/v2.0/$batch
Content-Type: application/json;IEEE754Compatible=true
Accept: application/json

With this Content-Type header the batch request will work as normal.

This was the last post in the series about batch calls. But certainly not the last one about APIs!

Deprecation of Basic Auth for SaaS has been postponed to 2022


Good news: Microsoft decided to postpone the deprecation of Web Service Access Keys (Basic Authentication) until version 2022 wave 1. See confirmation here: https://docs.microsoft.com/en-us/dynamics365/business-central/dev-itpro/upgrade/deprecated-features-w1#web-service-access-keys-basic-auth-for-saas

Originally the plan was to remove basic authentication with version 2021 wave 1. Integrations with Business Central APIs should be using OAuth instead. However, currently it’s only possible to use a user-interactive OAuth flow, the Authorization Code flow. But basic authentication provided a way to call APIs without any user interaction, and that’s how many partners have been using it.

OAuth does support a way to authenticate without any user interaction. It is called the Client Credentials flow, aka service-to-service authentication. This is the best scenario for processes that run in the background without any UI to authenticate against external APIs. Business Central supports this flow for the automation APIs, as I’ve explained here. This should also become available for the standard APIs. Many partners were eagerly waiting for this because they wanted to switch to OAuth before basic authentication was deprecated.
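
For reference, the Client Credentials flow boils down to a single, unattended token request against Azure AD. A sketch of such a request (the endpoint and the scope shown here are the ones used for the automation APIs; whether the standard APIs will use the same scope is my assumption):

POST https://login.microsoftonline.com/{{tenantId}}/oauth2/v2.0/token
Content-Type: application/x-www-form-urlencoded

grant_type=client_credentials&client_id={{clientId}}&client_secret={{clientSecret}}&scope=https://api.businesscentral.dynamics.com/.default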

Now that this has been postponed, partners will have more time to get prepared. There is no information yet on when the service-to-service authentication will become available. As far as I know, it’s not for a technical reason that it hasn’t been enabled yet; it’s a matter of licensing. Because those service accounts are created in a different way, they can’t be assigned a Business Central license as you do with normal users. It’s my educated guess that this needs to be solved first before we get our hands on it.

Actually, I was preparing a blog post about implementing the Resource Owner Password Credentials flow. I got it to work; the only user interaction is the AAD app registration in Business Central. However, it does not support Multi-Factor Authentication (MFA). Because this OAuth flow is less secure and requires more setup compared to using the web service access key, I think we should stick with basic authentication for the time being. Well, only for background services of course. If you have an integration that has any form of user interaction, then you should implement the Authorization Code flow!

So, this blog post is completely different than I was planning for and way shorter than my usual blog posts. 😄

Extending standard APIs (1)


I’ve got this question so many times: how can we extend the standard APIs with extra fields? So I decided to write a blog post about it. Actually, it’s going to be two blog posts, because one would become too long. So here it goes…

This could be a very short blog post. 😉 Microsoft disabled the option to customize the standard APIs in Business Central. Full stop…

No, of course not!!!

In this first blog post I want to cover the easy scenario: adding a missing field to an API for master data. By master data, I mean for example the customers or items API. Those API pages are directly based on the table, while transaction APIs like sales orders are based on a buffer table. That makes adding extra fields more complex. But not impossible! Just a little more complex. That will be for the next blog post. Let’s first focus on adding an extra field to the customers API.

The scenario

Let’s say we have a table extension for the customer table with two fields: Shoe Size and Hair Color. And we want these two fields to be added to the customers API. This is the table extension for the Customer table:

tableextension 50100 CustomerAJK extends Customer
{
    fields
    {
        field(50100; ShoeSizeAJK; Integer)
        {
            Caption = 'Shoe Size';
        }
        field(50101; HairColorAJK; Text[30])
        {
            Caption = 'Hair Color';
        }
    }
}

Creating the API page

Because it is not possible to create a page extension for the standard customers API, the only option is to copy it and make it our own. The downside is that we always need to keep it in sync with the standard API as released by Microsoft, if we want to keep it exactly the same of course. But we should not forget that APIs never get breaking changes. Instead, a new API version will be released alongside the existing API. And that does not happen with every release of Business Central. In other words, there is not a big risk involved.

How can we get the code of the standard APIs? That’s pretty simple: just clone the GitHub repository https://github.com/microsoft/ALAppExtensions. The app with the standard APIs v2 is in the folder Apps/W1/APIV2/app. If you open that folder in VS Code, you will get many errors because the symbol files are missing. Just download the symbol files to get rid of those errors.

Find the file APIV2Customers.Page.al in the APIV2 app and copy it to the app that has the extra fields. The next step is to make some corrections:

  • Change the object ID so it fits in your object range
  • Change the object name and the file name according to your naming conventions
  • Add the APIPublisher and APIGroup properties
  • Optionally, modify the APIVersion property. This also depends on how you solve the remaining errors in the page about missing page parts (see below)
  • Optionally, reorder the page properties (that’s just my preference, to order them more logically)
  • Optionally, fix the implicit with warnings (to get rid of warnings during the build)

The properties of the API page now look like this:

page 50100 CustomersAPIv1AJK
{
    PageType = API;
    APIPublisher = 'ajk';
    APIGroup = 'demo';
    APIVersion = 'v2.0';
    EntitySetName = 'customers';
    EntityName = 'customer';
    EntitySetCaption = 'Customers';
    EntityCaption = 'Customer';
    ChangeTrackingAllowed = true;
    DelayedInsert = true;
    ODataKeyFields = SystemId;
    SourceTable = Customer;
    Extensible = false;

Fixing compilation errors

The page also shows some compilation errors about missing page parts. There are a number of related entities, like customerFinancialDetails, that are missing. These parts are in the APIV2 app. There are three options to solve these errors.

Option 1 is to just remove the related parts. If you don’t need them, then that’s the simplest solution of course.

Option 2 is to copy the missing page parts. That means that you have to do the same fixes as with the customers API. In fact, you are copying a complete set of API pages that belong to each other.

Option 3 is my preference: set a dependency on the APIV2 app. Then you don’t have to copy the missing page parts. If you choose this option, there is one requirement: the APIVersion property of your custom API page must be the same as in the related API pages from the app you depend on. That’s why I left it at v2.0 in the example above. The dependencies part in the app.json looks like this:

  "dependencies": [
    {
      "id":  "10cb69d9-bc8a-4d27-970a-9e110e9db2a5",
      "name":  "_Exclude_APIV2_",
      "publisher":  "Microsoft",
      "version":  "17.4.0.0"
    }
  ]

Adding the custom fields

Now we have a new custom API page that is an exact copy of the original one. The only step left is to add the two new fields. This is done by adding these lines to the field list:

                field(shoeSize;Rec.ShoeSizeAJK)
                {
                    Caption = 'Shoe Size';

                    trigger OnValidate()
                    begin
                        RegisterFieldSet(Rec.FieldNo(ShoeSizeAJK));
                    end;
                }
                field(hairColor;Rec.HairColorAJK)
                {
                    Caption = 'Hair Color';

                    trigger OnValidate()
                    begin
                        RegisterFieldSet(Rec.FieldNo(HairColorAJK));
                    end;
                }

Note the extra code line in the OnValidate trigger, with the call to RegisterFieldSet. This is necessary because of a pattern that is used in some standard API pages. After the record has been inserted, the API page calls the function ProcessNewRecordFromAPI in the codeunit “Graph Mgt – General Tools”. This function searches for templates that need to be applied. To not overwrite any value that was set through the API, it needs to know which fields were set with the API call. Those fields will be skipped when applying the template. Not all API pages use this pattern, so make sure to apply the same pattern as used in the standard API page you copied.

That’s it! Now you can use your new custom fields with an API that is based on the original API. Of course this also works for any standard fields that you are missing in the API.
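
For example, with the page properties shown above (APIPublisher 'ajk', APIGroup 'demo'), the custom API gets its own route and a create request could look like this; the customer data in the body is of course just an example:

POST {{baseurl}}/api/ajk/demo/v2.0/companies({{companyId}})/customers
Content-Type: application/json

{
    "displayName": "Cronus Test Customer",
    "shoeSize": 43,
    "hairColor": "Brown"
}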

Next part will be about extending the sales order and sales line API. That’s going to be fun. 😁

Extending standard APIs (2)


In the previous blog post, I’ve demonstrated how to extend standard APIs by using them as a template for a new custom API that contains the customizations you want to do. While that approach works for most APIs, there are some APIs that are more complicated. I’m talking about APIs for sales and purchase documents. These APIs are special because they are not based on the standard tables Sales Header, Sales Line, etc.

I take the sales documents as examples here, but as you can imagine this also applies to purchase documents.

API salesOrders

The API page for Sales Orders is based on the table “Sales Order Entity Buffer”. This table holds a copy of all records in the table “Sales Header” of type Order. This is managed from codeunit “Graph Mgt – Sales Order Buffer”. I’d recommend taking a look at this codeunit to see how this is managed. Similarly, the API page for Sales Order Lines is based on table “Sales Invoice Line Aggregate”. The word Invoice in this name is not a typo; this is really the name. The same table is also used for the Sales Invoice Line API. But unlike the “Sales Order Entity Buffer” table, the lines are not copied over. Instead, the API pages use it as a temporary source table and fill it on demand.

Here is a high-level dataflow diagram for the salesOrders API.

As you can see, the salesOrders API reads data from the table “Sales Order Entity Buffer”. It does not insert or update data in that table, at least not directly. The API page is based on that table, so when you insert a new order or update an existing one, the API page uses the same table to store the values. But right before it inserts or updates the record in the database, it calls the codeunit “Graph Mgt – Sales Order Buffer” to propagate the values to the real “Sales Header” table. This happens in the OnInsert / OnModify / OnDelete triggers of the API page. See the code of these triggers below, and notice the exit(false) which cancels the operation on the page’s source table.

    trigger OnDeleteRecord(): Boolean
    begin
        GraphMgtSalesOrderBuffer.PropagateOnDelete(Rec);

        exit(false);
    end;

    trigger OnInsertRecord(BelowxRec: Boolean): Boolean
    begin
        CheckSellToCustomerSpecified();

        GraphMgtSalesOrderBuffer.PropagateOnInsert(Rec, TempFieldBuffer);
        SetDates();

        UpdateDiscount();

        SetCalculatedFields();

        exit(false);
    end;

    trigger OnModifyRecord(): Boolean
    begin
        if xRec.Id <> Id then
            Error(CannotChangeIDErr);

        GraphMgtSalesOrderBuffer.PropagateOnModify(Rec, TempFieldBuffer);
        UpdateDiscount();

        SetCalculatedFields();

        exit(false);
    end;

The codeunit “Graph Mgt – Sales Order Buffer” transfers the values from “Sales Order Entity Buffer” to the “Sales Header” table. Below is the code of PropagateOnInsert. PropagateOnModify works similarly, but it gets the record first before updating it, or inserts it if it doesn’t exist.

    procedure PropagateOnInsert(var SalesOrderEntityBuffer: Record "Sales Order Entity Buffer"; var TempFieldBuffer: Record "Field Buffer" temporary)
    var
        SalesHeader: Record "Sales Header";
        TypeHelper: Codeunit "Type Helper";
        TargetRecordRef: RecordRef;
        DocTypeFieldRef: FieldRef;
    begin
        if SalesOrderEntityBuffer.IsTemporary or (not GraphMgtGeneralTools.IsApiEnabled) then
            exit;

        TargetRecordRef.Open(DATABASE::"Sales Header");

        DocTypeFieldRef := TargetRecordRef.Field(SalesHeader.FieldNo("Document Type"));
        DocTypeFieldRef.Value(SalesHeader."Document Type"::Order);

        TypeHelper.TransferFieldsWithValidate(TempFieldBuffer, SalesOrderEntityBuffer, TargetRecordRef);

        // SetTable does not transfer globals, which will affect the logic in OnInsert trigger. We have to insert here and modify latter.
        TargetRecordRef.Insert(true);

        SalesHeader.Get(TargetRecordRef.RecordId());
        SalesHeader.CopySellToAddressToBillToAddress;
        SalesHeader.SetDefaultPaymentServices;
        SalesHeader.Modify(true);

        SalesOrderEntityBuffer."No." := SalesHeader."No.";
        SalesOrderEntityBuffer.Get(SalesOrderEntityBuffer."No.");
    end;

As you can see, the field values are actually transferred with the generic function TransferFieldsWithValidate in codeunit “Type Helper”. This function uses the TempFieldBuffer as the list of field values that need to be transferred. It is important to understand that this table has a field Order which is automatically filled by the function RegisterFieldSet in the API page. In short, without copying a lot of source code here: the fields are transferred in the order in which they appear in the API page. Furthermore, the fields are mapped by their field ID only, not by name.

When a record is inserted or updated in the “Sales Header” table, then the very same codeunit synchronizes the data to the table “Sales Order Entity Buffer”. By the way, the same pattern exists for the Sales Quote.

API salesInvoices

The API page for Sales Invoices, on the other hand, is based on the table “Sales Invoice Entity Aggregate”. From the API point of view, the data flow is exactly the same as with sales orders. However, there is a difference from a functional perspective. The table “Sales Invoice Entity Aggregate” not only contains unposted sales invoices from table “Sales Header”, it also contains posted invoices from table “Sales Invoice Header”. And that makes a huge difference of course. The idea behind this is to have one endpoint that returns both unposted and posted sales invoices. On top of that, the posted sales invoice keeps the same id. As a result, the sales invoice keeps the same resource URL throughout the whole lifecycle from unposted to posted document.

Here is the high-level data flow diagram for the salesInvoices API:

Status field

There is one thing about the salesOrders and salesInvoices APIs that can be confusing, and that is the status field. We all know the status field in the table “Sales Header”. But the value Open in the sales APIs has a different meaning than the same value on the sales document in the Business Central UI. In the API it actually means that the sales document is Released. In other words, status Open in the API means that the sales document is not open in terms of the Business Central UI. 😵 Instead, the status Draft in the APIs is equivalent to Open on the sales document. My educated guess for this difference is that Microsoft tried to align the APIs with common status values as used by other solutions.

Sales Document status

Sales orders and invoices have a status based on the enum “Sales Document Status” which has these values:

  • Open – The sales document can be modified
  • Released – The sales document is ready to be shipped and cannot be modified
  • Pending Approval – The sales document is waiting to be approved before being released
  • Pending Prepayment – The sales document is awaiting upfront payment from the customer before being released

salesOrder API status

The salesOrders API has a status field, which is based on the enum “Sales Order Entity Buffer Status”. This enum has these values:

  • Draft – The sales order status is Open
  • In Review – The sales order status is Pending Approval
  • Open – The sales order status is Released or Pending Prepayment

salesInvoices API status

The salesInvoices API has yet another status field, based on the enum “Invoice Entity Aggregate Status” with these values:

  • " " (blank) – No idea…
  • Draft – Unposted sales invoice with status Open
  • In Review – Unposted sales invoice with status Pending Approval
  • Open – If unposted: status Released or Pending Prepayment. If posted: waiting for payment
  • Paid – Posted invoice is closed
  • Canceled – Posted invoice is canceled and a corrective credit memo has been created for the posted invoice
  • Corrective – Posted invoice is canceled and a new sales invoice has been created

For the canceled and corrected invoice, see https://docs.microsoft.com/en-us/dynamics365/business-central/sales-how-correct-cancel-sales-invoice

Sales Lines

Now that we have that out of the way, let’s look at the sales lines before we dive into modifying the API. While the sales header values are actually duplicated in the buffer tables, the sales lines are not. Instead, all APIs for sales lines use the same table, “Sales Invoice Line Aggregate”, as a temporary source table. The lines are loaded into the temporary table by calling the function LoadLines in codeunit “Sales Invoice Aggregator”. It is basically a TransferFields, followed by some extra calls to update line amounts.

Inserting or updating sales lines is done in a similar way as the sales headers. The codeunit “Sales Invoice Aggregator” has functions to propagate the field values from the temporary record to the table “Sales Line”. Again, the function TransferFieldsWithValidate from codeunit “Type Helper” is used. No magic here, just the same story again.

Adding a custom field to the table and API

Let’s now look at how we can customize those sales APIs. We start with the salesInvoices API as an example. The scenario is to add a custom field Webshop Reference Id to the table “Sales Header” and to the API.

The tableextension for the “Sales Header” table looks like:

tableextension 50100 SalesHeaderAJK extends "Sales Header"
{
    fields
    {
        field(50100; WebshopReferenceIdAJK; Text[30])
        {
            Caption = 'Webshop Reference Id';
        }
    }
}

And of course we want to have this field on the posted sales invoice as well:

tableextension 50101 SalesInvoiceHeaderAJK extends "Sales Invoice Header"
{
    fields
    {
        field(50100; WebshopReferenceIdAJK; Text[30])
        {
            Caption = 'Webshop Reference Id';
        }
    }
}

We start with copying the standard API as explained in the previous blog post. In order to add the field to the copied API page, it needs to be added to the table “Sales Invoice Entity Aggregate” as well. So you can guess the other tableextension:

tableextension 50102 SalesInvoiceEntityAggregateAJK extends "Sales Invoice Entity Aggregate"
{
    fields
    {
        field(50100; WebshopReferenceIdAJK; Text[30])
        {
            Caption = 'Webshop Reference Id';
        }
    }
}

Now we can add the field to the API page. Don’t forget to include the call to RegisterFieldSet! And if the order is important, then make sure to place the field in the correct position as well. Otherwise you can just add it at the end, or before the first page part.

                field(webshopReferenceId; Rec.WebshopReferenceIdAJK)
                {
                    Caption = 'Webshop Reference Id';

                    trigger OnValidate()
                    begin
                        RegisterFieldSet(Rec.FieldNo(WebshopReferenceIdAJK));
                    end;
                }

Now this field can be used to create a new sales invoice.
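With the field in place, the custom API accepts it in the request body. Here is a sketch of such a call; the route segment api/ajk/sales/v1.0 is a made-up example, so use the APIPublisher, APIGroup and APIVersion of your copied page, and the customer number is just sample data:

POST {{baseurl}}/api/ajk/sales/v1.0/companies({{companyId}})/salesInvoices
Content-Type: application/json

{
    "customerNumber": "10000",
    "webshopReferenceId": "WEB-2021-00042"
}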

So far so good. This works for both headers and lines in the same way. To summarize: in order to add a new field to the API:

  • Add the field with a tableextension to Sales Header or Sales Line
  • Optionally add the field with a tableextension to the posted document tables
  • Add the field with a tableextension to the buffer table used by the API page (see the overview below)
  • Add the field to the API and call RegisterFieldSet from the OnValidate

The buffer tables for all sales and purchase document APIs:

salesQuotes: Sales Quote Entity Buffer
salesQuoteLines: Sales Invoice Line Aggregate
salesOrders: Sales Order Entity Buffer
salesOrderLines: Sales Invoice Line Aggregate
salesInvoices: Sales Invoice Entity Aggregate
salesInvoiceLines: Sales Invoice Line Aggregate
salesCreditMemos: Sales Cr. Memo Entity Buffer
salesCreditMemoLines: Sales Invoice Line Aggregate
purchaseInvoices: Purch. Inv. Entity Aggregate
purchaseInvoiceLines: Purch. Inv. Line Aggregate

Adding an existing field to the API

The second scenario is less obvious. Not all fields in the table are available in the API. How can we add these fields to the API? Let’s say we want to add the fields “Blanket Order No.” and “Blanket Order Line No.” to the salesOrderLines API. I didn’t take these fields randomly, I’ve actually got this question a couple of times and it was actually the reason for writing this blog post. 😊

Let’s start by creating the custom salesOrderLines API. That’s just the same procedure as described in the previous blog post. But the salesOrderLines API is linked to the salesOrders API (the header). We need to create a custom API page for the headeras well. Follow the same procedure to create the custom salesOrders API and then make this change the salesOrderLines part to point to the custom salesOrderLines API page.

                part(salesOrderLines; SalesOrderLinesAPIAJK)
                {
                    Caption = 'Lines';
                    EntityName = 'salesOrderLine';
                    EntitySetName = 'salesOrderLines';
                    SubPageLink = "Document Id" = Field(Id);
                }

Now we are ready to add the fields to our custom salesOrderLines API. Since these fields are already available in the "Sales Line" table, we don't have to create a tableextension for it. But unfortunately, they are not available in the table "Sales Invoice Line Aggregate". And that is a BIG miss from Microsoft. Apparently, only the fields that are used in the standard APIs have been added. The same goes for the "Sales Header" table and the corresponding buffer tables. I'm not sure why. In an attempt to save some space because the data is being duplicated? That wouldn't make a big difference I would say, and for the sales line it makes no difference at all because that table is only used as a temporary table. I think that the scenario of extending the standard API with fields from the base app just wasn't considered. So, if Microsoft is reading this: can you please make sure that the API buffer tables contain all fields from the document tables? That would be very helpful!

The only option we have now is to add the fields to the buffer table in our own number range and then somehow copy the values to the target table. So let’s start with the tableextension:

tableextension 50103 SalesInvoiceLineAggregateAJK extends "Sales Invoice Line Aggregate"
{
    fields
    {
        field(50100; BlanketOrderNoAJK; Code[20])
        {
            AccessByPermission = TableData "Sales Shipment Header" = R;
            Caption = 'Blanket Order No.';
            TableRelation = "Sales Header"."No." WHERE("Document Type" = CONST("Blanket Order"));
        }
        field(50101; BlanketOrderLineNoAJK; Integer)
        {
            AccessByPermission = TableData "Sales Shipment Header" = R;
            Caption = 'Blanket Order Line No.';
            TableRelation = "Sales Line"."Line No." WHERE("Document Type" = CONST("Blanket Order"),
                                                           "Document No." = FIELD(BlanketOrderNoAJK));
        }
    }
}

And these fields need to be added to the custom salesOrderLines API page as well. The position is not really important because we will be validating those fields at the end.

                field(blanketOrderNo; Rec.BlanketOrderNoAJK)
                {
                    Caption = 'Blanket Order No.';
                }

                field(blanketOrderLineNo; Rec.BlanketOrderLineNoAJK)
                {
                    Caption = 'Blanket Order Line No.';
                }
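For reference, a client request against the custom API could then look like this sketch (same made-up api/ajk/sales/v1.0 route as before; the blanket order and item values are sample data):

POST {{baseurl}}/api/ajk/sales/v1.0/companies({{companyId}})/salesOrders({{orderId}})/salesOrderLines
Content-Type: application/json

{
    "itemId": "{{itemId}}",
    "quantity": 5,
    "blanketOrderNo": "1001",
    "blanketOrderLineNo": 10000
}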

Next question is how to get the values from these fields to the target fields in the base table. First thing to look at would be events. But unfortunately, there are no events at all in the codeunits “Sales Invoice Aggregator” or the other codeunits used by the API pages. Bummer…. Now we need to write custom code for something that could have been solved out-of-the-box.

The best way I could think of is by using event subscribers to the events of the “Sales Line” table. And these event subscribers should be in a manual subscriber codeunit to avoid unnecessary event hits. Here is an example of the code for this manual event subscriber codeunit:

codeunit 50100 SalesLineAPISubscriberAJK
{
    EventSubscriberInstance = Manual;

    var
        SalesInvoiceLineAggregate: Record "Sales Invoice Line Aggregate";

    procedure SetSalesLineInvoiceLineAggregate(Rec: Record "Sales Invoice Line Aggregate")
    begin
        SalesInvoiceLineAggregate := Rec;
    end;

    [EventSubscriber(ObjectType::Table, Database::"Sales Line", 'OnBeforeInsertEvent', '', false, false)]
    local procedure OnBeforeInsertSalesLine(var Rec: Record "Sales Line")
    begin
        if Rec.IsTemporary then
            exit;
        UpdateSalesLine(Rec);
    end;

    [EventSubscriber(ObjectType::Table, Database::"Sales Line", 'OnBeforeModifyEvent', '', false, false)]
    local procedure OnBeforeModifySalesLine(var Rec: Record "Sales Line")
    begin
        if Rec.IsTemporary then
            exit;
        UpdateSalesLine(Rec);
    end;

    local procedure UpdateSalesLine(var SalesLine: Record "Sales Line")
    begin
        SalesLine.Validate("Blanket Order No.", SalesInvoiceLineAggregate.BlanketOrderNoAJK);
        SalesLine.Validate("Blanket Order Line No.", SalesInvoiceLineAggregate.BlanketOrderLineNoAJK);
    end;
}

Don’t forget to test for Rec.IsTemporary! During the validation of fields, temporary lines are being created, and they will hit this event subscriber as well, resulting in a potential recursive overflow.

As you can see, the UpdateSalesLine function just validates the fields that I need. This could be more generic by using a similar field buffer table, but I decided to keep the code readable.

Now we need to bind this codeunit at the right moment. That's done in the custom salesOrderLines API page:

    var
        SalesLineAPISubscriber: Codeunit SalesLineAPISubscriberAJK;

    trigger OnInsertRecord(BelowxRec: Boolean): Boolean
    var
        GraphMgtSalesOrderBuffer: Codeunit "Graph Mgt - Sales Order Buffer";
    begin
        SalesLineAPISubscriber.SetSalesLineInvoiceLineAggregate(Rec);
        BindSubscription(SalesLineAPISubscriber);
        GraphMgtSalesOrderBuffer.PropagateInsertLine(Rec, TempFieldBuffer);
        UnbindSubscription(SalesLineAPISubscriber);
        SetCustomFields();
    end;

    trigger OnModifyRecord(): Boolean
    var
        GraphMgtSalesOrderBuffer: Codeunit "Graph Mgt - Sales Order Buffer";
    begin
        SalesLineAPISubscriber.SetSalesLineInvoiceLineAggregate(Rec);
        BindSubscription(SalesLineAPISubscriber);
        GraphMgtSalesOrderBuffer.PropagateModifyLine(Rec, TempFieldBuffer);
        UnbindSubscription(SalesLineAPISubscriber);
        SetCustomFields();
    end;

There is one thing left that we can't solve with these event subscribers. The propagate functions return the buffer record, which has been updated with all values from the sales line. It does that by first clearing all values and then using TransferFields. As a result, our custom fields have been cleared, so we need to put them back. For that I have created the function SetCustomFields. The call is already in the code above. The function looks like this:

    local procedure SetCustomFields()
    var
        SalesLine: Record "Sales Line";
    begin
        SalesLine.GetBySystemId(Rec.SystemId);
        Rec.BlanketOrderNoAJK := SalesLine."Blanket Order No.";
        Rec.BlanketOrderLineNoAJK := SalesLine."Blanket Order Line No.";
    end;

The final step is to get the values into the buffer table when the record is read with an HTTP GET. For that, we can use the same function SetCustomFields. Just call it from the OnAfterGetRecord trigger.

    trigger OnAfterGetRecord()
    begin
        SetCustomFields();
    end;

This works the same for the salesOrder and salesInvoice APIs (the headers). Of course you could also create event subscribers to keep the field values synchronized. Below is the test result that proves all of this is actually working. 😊

I hope that this long post helps you in creating custom APIs for the sales documents. Maybe I should do a video about this as well, just to demonstrate all the steps above. Let me know in the comments! And if you found another way to do it, then please feel free to share.

Happy coding!

Service to service authentication in Business Central 18.3 – Usage and license terms


Business Central 18.3 is just around the corner, and it comes with a long-awaited feature: support for OAuth client credentials flow, aka service-to-service authentication.

Many API integrations with Business Central SaaS are using the web service access key for basic authentication. But the 2022 release wave 1 (version 20) will remove this feature in favor of OAuth2. Since 2020 release wave 2 (version 17) a warning is displayed on the User card that the web service access key will be deprecated. This is only for SaaS and NOT for on-premise! For on-prem environments, the web service access key will still be an option.

The upcoming update 18.3 closes a gap in the OAuth story. At this moment Business Central SaaS already supports OAuth for delegated permissions. Version 18.3 adds application permissions as another option. An option many were waiting for.

The official documentation can be found here: https://docs.microsoft.com/en-us/dynamics365/business-central/dev-itpro/administration/automation-apis-using-s2s-authentication.

In this blog post, I want to explain the difference between delegated permissions and application permissions in relation to the usage scenarios and license terms. The next blog posts will show how to set it up and how to use it, including code examples.

Delegated permissions

Delegated permissions are used by apps that have a signed-in user (a human being). The user must have a user account in Business Central and consents to the permissions the app requests. As a result, the app is delegated to act as the signed-in user when it makes calls to Business Central APIs. In other words, the app itself does not have permissions, but the permissions of the app user are used.

The process to consent to the permissions requires a user interface. The user needs to log in and he will be asked if he trusts the application to access Business Central on his behalf. There are different ways to achieve this user interaction, known as authorization flows. A flow is simply a series of steps to let the user log in, request the permission, consent to the permission and finally retrieve the required authorization grant.

Most people struggle with the user interaction. The series of steps involves forwarding the user to Azure to log in and providing a redirect URI to catch the authorization code returned by Azure after the login procedure. This is way more complicated compared to just a username and password.

Application permissions

Application permissions are used by apps that run without a signed-in user, for example background services or websites that serve non-Business Central users. The application has its own account in Business Central, with its own set of permissions. The process to request the permissions is known as the client credentials flow, where client stands for the external application or background service.

Usage scenarios

It’s quite important to understand when to use delegated permissions or application permissions.

As a rule of thumb, delegated permissions should be used if the external application is used by Business Central users. Access to Business Central should take place with respect to the permissions of that user. But this is also the biggest hurdle towards OAuth. The application needs to implement one of the delegated permissions flows and apply them for every single user.

Many external applications use a single user to get access to Business Central. With the web service access key it was quite easy to set up a connection with basic authentication. Moving to OAuth, where every user needs to log in with extra user interaction, proved to be a challenge for many Business Central partners. Also because the OAuth2 implementation in Azure Active Directory has so many options that it's quite hard to find the correct settings.

That’s the reason why many were waiting for the client credentials flow because that flow is much easier to implement. No user interaction required, just one call to retrieve that access token, and that’s it.

But then… license terms…

If an application connects to Business Central with a single account, then that’s considered multiplexing. In itself, this is not prohibited. However, it does not reduce the number of required licenses. All users accessing Business Central need to be properly licensed. No matter if they access Business Central with their own credentials or by sharing a single user.

The Dynamics 365 Licensing Guide distinguishes between internal and external users.

Internal users are all users who are providing business processes, like employees, contractors, agents, etc. They need to be licensed for Business Central.

External users are described as “customers and suppliers not performing business processes on behalf of the organization”. These external users do not require a user license to access Dynamics 365 (including Business Central). In fact, they come for free with the internal user license.

The licensing guide also describes multiplexing as pooled connections, using a non-interactive Dynamics 365 user account. A non-interactive user account can access the system only via the web service layer. Internal users accessing Dynamics 365 indirectly, using a pooled connection, must also be properly licensed. Even if they don’t have their own user account! External users however are free to use a pooled connection.

This opens many scenarios, for example the web shop scenario where the web shop application needs to access Business Central, but there is no Business Central user.

Another scenario would be a background service uploading files to Business Central to be processed by users.

In these scenarios, a non-interactive user account should be used rather than a normal user account. But Business Central did not have such a non-interactive user account. And that is going to change.

In Business Central 2020 release wave 2 (version 17) the Azure Active Directory Application page was introduced. This page is used to configure a non-interactive user account. Behind the scenes, it creates a new record in the user table with license type Application.

Permissions and limitations

In the AAD Application card, you can set up the permissions for this application user. Try to limit the permissions to the very minimum that the external application requires to perform its tasks. It is not possible to assign SUPER permissions to an application account. The application account will also not be able to start a background process. These are two limitations that make sense considering the role of the external user that is supposed to connect to Business Central using this application account.

Final thoughts

The application account enables the OAuth client credentials flow, making it much easier to retrieve an access token and call Business Central APIs.

But be careful!

The permissions assigned to the application account are shared by all sessions using this account to access Business Central. This could potentially lead to the situation that an internal user gets more permissions than he should. Or more permissions than he has with his personal user account, if he has one.

It’s recommended to create specific apps with the least required permissions. Do not share or mix application accounts for different purposes. There is no limit as to how many application accounts you can create. And they are for free, there is no license cost involved.

As I explained above, using the application account does not reduce the number of required licenses for internal users. The application account enables the scenario for external users, that is what it has been designed for in the first place.

I would like to finish with a list of usage scenarios:

  • Webshop integration
    • Retrieving price information and actual inventory
    • Creating sales quotes and orders
  • Uploading files to Business Central with a background process like Azure Functions
    • Attachments from incoming emails
    • Bank statement files
    • New files in a SharePoint document library
  • Vendor portal
    • Creating purchase quotes
  • Production portal
    • Register finished items
    • Register item usage
  • Timesheet app
    • Entering time spent on a job

The Production portal and Timesheet app are examples of scenarios where the users are considered internal users because they are working for the organization. However, it might be useful to use an application account to connect to Business Central and not bother the user with the OAuth authorization flow. The timesheet app could even be used by contractors who don't have an Azure Active Directory account at all. But what they have in common is that they need one set of permissions to enter the information. In that case an application account with proper permissions would be much easier to use. BUT… because these users are internal, the organization needs to buy licenses for them anyway!

I hope the above was clear enough to give you the information you need to choose the proper OAuth authorization flow while still complying with the license terms!

Service to service authentication in Business Central 18.3 – How to set up


In the previous blog post, I’ve described the usage scenarios around OAuth client credentials flow for Business Central. In this post, I want to show how to set up this new feature. The next blog post will contain code examples of how to use it.

Please note that the steps explained below already work in the current version, but you need to wait for version 18.3 to be able to actually call the APIs.

The official documentation can be found here, which includes similar information. I’ll just try to clarify some of the steps and provide some screenshots.

The process consists of three steps:

  • Register the external application in Azure Active Directory
  • Create the external application account in Business Central
  • Grant consent

Note: In the text below “application” means “the external application, accessing Business Central APIs”.

Step 1: Register the external application in Azure Active Directory

Business Central uses Azure AD for Identity and Access Management. This means that users accessing Business Central are stored and managed in Azure AD. If an external application needs to access Business Central, it also needs its own identity. When you register your application with Azure AD, you’re creating an identity configuration for your application that allows it to integrate with Azure AD.

An Azure AD application is defined by its one and only application object, which resides in the Azure AD tenant where the application was registered (known as the application’s “home” tenant). An application object is used as a template or blueprint to create one or more service principal objects. A service principal is created in every tenant where the application is used. What a service principal is will be explained later in this article.

Registering the application is done in their home tenant by the organization owning the application that will call the Business Central APIs. The registration consists of these steps:

  • Create the application in Azure
  • Set the required permissions
  • Create a secret

Create the application in Azure

An important choice you need to make here is if the application will be single tenant or multitenant.

If the application will only be used inside the same organization, then you should choose single tenant. For example an in-house developed portal, or a self-hosted webshop. But if you develop an application that will be used by other organizations to integrate with their Business Central environment, then you should choose multitenant.

Navigate to Azure portal and open Azure Active Directory. Click on App Registrations in the menu and then on New registration. Fill in a name and choose the supported account type. Only choose from single tenant or multitenant, don’t look at the options that include personal Microsoft accounts as these are not supported by Business Central.

Under Redirect URI choose Web and then fill in this URL: https://businesscentral.dynamics.com/OAuthLanding.htm. Please note that this property is case-sensitive!

Click on Register to create the application. Copy the Application (client) ID from the overview screen to a text file. You will need this later when registering the application in Business Central and when calling the APIs.

Set the required permissions

The next step is to set the API permissions that the external application needs. Click on API permissions in the menu and then on Add a permission. From the list of commonly used Microsoft APIs, you select Dynamics 365 Business Central. Since the app is going to have its own account in Business Central, you must select Application permissions. As the description says, this is for applications that run as a background service without a signed-in user. See the previous blog post for more information about the difference between delegated permissions and application permissions.

There are three permissions available:

To access all Business Central APIs, we need to select the API.ReadWrite.All permission. Please note that this does not mean that the application will be able to read and write data with all APIs without restriction. The actual access to data is limited by the permissions that are assigned to the application account in Business Central.

Click Add permissions to save the settings. You should now see a screen like this:

Under status, you can see that the newly added permission has not been granted for the current organization. If you are registering a single tenant application, then it would be an option to click on the Grant admin consent button. This also makes sense if you are registering a multitenant application that will be used in your own organization as well. In all other cases, when the application will be used by another organization, then they need to grant access from Business Central. Which will be explained below.

Create a secret

The last step in registering the app in Azure is to create a secret. Click on Certificates & secrets in the menu and then click on New client secret. Choose whatever expiration period you want and click on Add. Don’t forget to copy the created secret because this is the only time you will be able to see it.

Tip: it’s not possible to set an unlimited expiration period. The longest period is 24 months. This means that you need to update the secret every now and then.

That’s it, you are now ready with the first step to register the application in Azure. The next step is to create the application account in Business Central.

Step 2: Create the external application account in Business Central

There are two ways to create the external application account in Business Central: manually or automatically from code. In both cases, the app needs to be granted consent manually, but the option to create the record from code is of course much more convenient for end-users. Let’s first see how to manually create the application account.

Manually create the Azure Active Directory Application account

In Business Central search for Azure Active Directory Applications (or just AAD) and open the page. Click on New to add a new record. Fill in the Client Id that you copied in the previous step or received from the organization that owns the external application. The curly brackets will be added automatically. A description is also needed.

On this card page, we can also configure the permissions. These permissions will be applied to all API sessions of the application account. In the example below the external application is assigned the permission set D365 BASIC and D365 SALES DOC, EDIT.

Create the Azure Active Directory Account from code

Let’s now see how the Azure AD account can be created from code in Business Central. Creating the account, including permissions, is much easier for the customer of course. And it would make perfect sense to combine this with an app that also includes custom APIs. The code could be executed automatically, during installation, or from an action on a setup page.

The BC base app provides an interface to create the AAD application from code with codeunit "AAD Application Interface". This will create a record in the table "AAD Application". It's not a complete interface: it only supports creating the app and optionally enabling it. It does not set the Extension information automatically, so you need to add it after the record has been created. It also doesn't support adding permission sets, so we need to do that ourselves.

To assign permission sets, a corresponding user record must be created in the User table. If we use the function CreateAADApplication without setting the parameter EnableAADApplication to true, then no user record will be created. Because we can't assign permissions to a non-existing user, we should enable the AAD application first. Of course, you can set the status to disabled afterward, that is no problem.

If you call the code from an install codeunit, then make sure to use the OnInstallAppPerDatabase trigger because the table “AAD Application” is database-wide, and not per company.

Here is an example of an install codeunit and two codeunits that create the AAD Application record, update it with extra info, and assign permission sets.

codeunit 50100 InstallAJK
{
    Subtype = Install;

    trigger OnInstallAppPerDatabase()
    begin
        CreateAADApplications();
    end;

    local procedure CreateAADApplications()
    var
        WebshopIntegrationAADSetup: Codeunit WebshopIntegrationAADSetupAJK;
    begin
        WebshopIntegrationAADSetup.CreateWebshopIntegrationAADApplication();
    end;
}

codeunit 50101 WebshopIntegrationAADSetupAJK
{
    var
        WebshopIntegrationClientIdTok: Label '3870c15c-5700-4704-8b1b-e020052cc860', Locked = true;
        WebshopIntegrationDescriptionTxt: Label 'Integration with the award winning webshop';

    procedure CreateWebshopIntegrationAADApplication()
    var
        AADApplicationInterface: Codeunit AADApplicationInterfaceAJK;
        AppInfo: ModuleInfo;
        ClientDescription: Text[50];
        ContactInformation: Text[50];
    begin
        NavApp.GetCurrentModuleInfo(AppInfo);
        ClientDescription := CopyStr(WebshopIntegrationDescriptionTxt, 1, MaxStrLen(ClientDescription));
        ContactInformation := CopyStr(AppInfo.Publisher, 1, MaxStrLen(ContactInformation));

        AADApplicationInterface.CreateAADApplication(
            GetWebshopIntegrationClientId(),
            ClientDescription,
            ContactInformation,
            AppInfo,
            GetPermissionSets(),
            GetPermissionGroups());
    end;

    local procedure GetPermissionSets() PermissionSets: List of [Code[20]]
    begin
        PermissionSets.Add('D365 BASIC');
        PermissionSets.Add('D365 SALES DOC, EDIT');
    end;

    local procedure GetPermissionGroups() PermissionGroups: List of [Code[20]]
    begin
    end;


    local procedure GetWebshopIntegrationClientId() Id: Guid
    begin
        Id := WebshopIntegrationClientIdTok;
    end;
}

codeunit 50102 AADApplicationInterfaceAJK
{
    procedure CreateAADApplication(
        ClientId: Guid;
        ClientDescription: Text[50];
        ContactInformation: Text[50];
        AppInfo: ModuleInfo;
        PermissionSets: List of [Code[20]];
        PermissionGroups: List of [Code[20]])
    var
        AADApplication: Record "AAD Application";
    begin
        AADApplication := InsertAADApplication(ClientId, ClientDescription, ContactInformation, AppInfo);
        AssignUserGroupsToAADApplication(AADApplication, PermissionGroups);
        AssignPermissionsToAADApplication(AADApplication, PermissionSets);
    end;

    local procedure InsertAADApplication(
        ClientId: Guid;
        ClientDescription: Text[50];
        ContactInformation: Text[50];
        AppInfo: ModuleInfo) AADApplication: Record "AAD Application"
    var
        AADApplicationInterface: Codeunit "AAD Application Interface";
    begin
        AADApplicationInterface.CreateAADApplication(
            ClientId, ClientDescription, ContactInformation, true);

        AADApplication.Get(ClientId);

        AADApplication."App ID" := AppInfo.PackageId;
        AADApplication."App Name" := AppInfo.Name;
        AADApplication.Modify();
    end;

    local procedure AssignUserGroupsToAADApplication(var AADApplication: Record "AAD Application"; UserGroups: List of [Code[20]])
    var
        UserGroupCode: Text;
    begin
        if not UserExists(AADApplication) then
            exit;

        foreach UserGroupCode in UserGroups do
            AddUserToGroup(AADApplication."User ID", UserGroupCode, '')
    end;

    local procedure AssignPermissionsToAADApplication(var AADApplication: Record "AAD Application"; PermissionSets: List of [Code[20]])
    var
        PermissionSetName: Text;
    begin
        if not UserExists(AADApplication) then
            exit;

        foreach PermissionSetName in PermissionSets do
            AddPermissionSetToUser(AADApplication."User ID", PermissionSetName, '');
    end;

    local procedure AddUserToGroup(UserSecurityID: Guid; UserGroupCode: Code[20]; Company: Text[30])
    var
        UserGroupMember: Record "User Group Member";
    begin
        UserGroupMember.SetRange("User Security ID", UserSecurityID);
        UserGroupMember.SetRange("User Group Code", UserGroupCode);
        UserGroupMember.SetRange("Company Name", Company);

        if not UserGroupMember.IsEmpty() then
            exit;

        UserGroupMember.Init();
        UserGroupMember."User Security ID" := UserSecurityID;
        UserGroupMember."User Group Code" := UserGroupCode;
        UserGroupMember."Company Name" := Company;
        UserGroupMember.Insert(true);
    end;

    local procedure AddPermissionSetToUser(UserSecurityID: Guid; RoleID: Code[20]; Company: Text[30])
    var
        AccessControl: Record "Access Control";
    begin
        AccessControl.SetRange("User Security ID", UserSecurityID);
        AccessControl.SetRange("Role ID", RoleID);
        AccessControl.SetRange("Company Name", Company);

        if not AccessControl.IsEmpty() then
            exit;

        AccessControl.Init();
        AccessControl."User Security ID" := UserSecurityID;
        AccessControl."Role ID" := RoleID;
        AccessControl."Company Name" := Company;
        AccessControl.Insert(true);
    end;

    local procedure UserExists(var AADApplication: Record "AAD Application") Result: Boolean
    var
        User: Record User;
    begin
        if IsNullGuid(AADApplication."User ID") then
            exit;

        Result := User.Get(AADApplication."User ID");
    end;
}

The code above creates the AAD Application record as shown below. The only instruction for the user would be to open the record and click on Grant Consent.

Step 3: Grant consent

The external application must be represented in Azure AD by a service principal. The application object created in step 1 is the global representation of the application for use across all tenants, and the service principal is the local representation for use in a specific tenant. The application object serves as the template from which common and default properties are derived for use in creating corresponding service principal objects.

A service principal must be created in each tenant where the application is used, enabling it to establish an identity for sign-in and/or access to Business Central being secured by the tenant. The process is called “grant consent”. This means that we tell Azure AD that the application is allowed to access Business Central, which is then stored as the service principal representing the consented application.

Click on the Grant Consent action at the top in the Azure Active Directory Application card in Business Central. It will open a popup page that requires you to log in. The user that gives consent needs to be a Global Administrator, an Application Administrator, or a Cloud Application Administrator. If you don’t have that role, then let a user who has the required role log in here. This is only needed in this step of the registration process. After logging in you will get a page that asks to accept the permission requested by the application.

Please note the word unverified in the screenshot below, under the name of the application. That is because I didn’t associate an MPN account with the Azure app registration. It’s highly recommended to associate the MPN account of your organization under Branding in the Azure app registration to get your company displayed.


Click on Accept and the page will close and you will get a confirmation message in Business Central.

Now the external application has been fully set up to access Business Central APIs.

Overview

After finishing all these steps, several components have been created in Azure AD and Business Central. Because a picture says more than a thousand words, here is the story again:

In this picture:

Now that you've made it this far, it would also be a good idea to read the official documentation: https://docs.microsoft.com/en-us/azure/active-directory/develop/app-objects-and-service-principals. As you will see, I've used parts of that documentation in this article.

Revoke consent

What if the external application is not used anymore? How do you block access to Business Central? Well, that’s quite easy. Open the AAD Application card and set State to Disabled. That will also set the corresponding user account in the User table to disabled. Or you delete the complete AAD Application record, which will also disable the corresponding user account.

However, when you granted consent to the AAD Application, a service principal was also created in Azure AD. Deleting the AAD Application in Business Central does not delete the service principal in Azure AD.

To revoke consent in Azure, you need to find the service principal in Azure AD. This must be done by an administrator, either through Azure Active Directory PowerShell or the Microsoft Graph API, but the easiest way for the average administrator is right through the Azure portal.

Open the Azure Portal at https://portal.azure.com and navigate to the Enterprise Applications blade.


Then click on “All Applications” and search for the application you want to revoke consent for.


Click on the application to open the Overview section of the application. Then click on Permissions in the left menu to review the permissions for this application. It should show Dynamics 365 Business Central and Full access to web services API.


If you have reviewed and are sure that this is the application you want to revoke consent for, click on Properties in the left menu. On the properties page click on Delete. By deleting the application here you remove all its OAuth permission grants in this Active Directory. Think about it as uninstalling the application.


That’s it! The application and all consent associated with the application are now gone. Perform the Grant Consent procedure in Business Central to get it back, which will recreate the service principal in Azure AD.

The next blog post will contain code examples of how to call the Business Central APIs using client credentials flow.

Service to service authentication in Business Central 18.3 – How to test (REST Client & PowerShell)


In the previous blog posts I've described the usage scenarios around OAuth client credentials flow for Business Central and how to set it up. The next step is to test the APIs with OAuth authentication to see if everything works properly. In this post, I want to focus on the REST Client extension for VS Code and PowerShell. In the next posts, I will also cover Postman, Insomnia and C#.

Version 18.3 has been officially released and is available for the SaaS platform. So you can just go ahead and test it right away!

The official documentation can be found here, which includes similar information and also code samples for the REST Client. I'll just try to clarify some of the steps and give some explanation about the different parameters and why they must have specific values.

Before we dive into the code, it's important to understand the key parameters and their values: the tenant (the Azure AD tenant you want to access, specified as domain name or tenant ID), the client ID and client secret from the Azure app registration, the scope (which must be https://api.businesscentral.dynamics.com/.default), and the grant type (which must be client_credentials).

REST Client

The REST Client allows us to write raw HTTP requests: the very basic HTTP requests to get the OAuth access token and to call the Business Central APIs. And if you understand these, then you will be able to translate them to any other language or make use of the available libraries (as you will see below with the PowerShell example).

All passwords and secrets used in the examples are fake, replace them with your own values!

Getting the access token

The access token can be retrieved with a POST request to https://login.microsoftonline.com/{{tenant}}/oauth2/v2.0/token. The body of the request must contain the parameter grant_type=client_credentials plus the parameters described above.

One comment about the Tenant ID: You may have seen OAuth examples where the word common was used instead of tenant ID. The word common indicates that you are using a multitenant application and the logged-in user will determine the Azure AD tenant. But with the client credentials flow, there is no logged-in user, the application logs in by itself. It is a non-interactive authentication flow, and the application logs in as a service principal. The service principal represents the application account in an Azure AD. So, instead of using the word common, you need to specify which Azure AD tenant the application wants to log in to.

@clientid = 3870c15c-5700-4704-8b1b-e020052cc860
@clientsecret = ~FJRgS5q0YsAEefkW-_pA4ENJ_vIh-5RV9
@scope = https://api.businesscentral.dynamics.com/.default
@tenant = kauffmann.nl


###########################################################################
#    ____      _                                   _        _              
#   / ___| ___| |_    __ _  ___ ___ ___  ___ ___  | |_ ___ | | _____ _ __  
#  | |  _ / _ \ __|  / _` |/ __/ __/ _ \/ __/ __| | __/ _ \| |/ / _ \ '_ \ 
#  | |_| |  __/ |_  | (_| | (_| (_|  __/\__ \__ \ | || (_) |   <  __/ | | |
#   \____|\___|\__|  \__,_|\___\___\___||___/___/  \__\___/|_|\_\___|_| |_|
#
###########################################################################
# @name tokenrequest
POST https://login.microsoftonline.com/{{tenant}}/oauth2/v2.0/token
Content-Type: application/x-www-form-urlencoded

grant_type=client_credentials
&client_id={{clientid}}
&client_secret={{clientsecret}}
&scope={{scope}}

###
@token = {{tokenrequest.response.body.access_token}}
###

The response should look like this:

HTTP/1.1 200 OK
Cache-Control: no-store, no-cache
Pragma: no-cache
Content-Type: application/json; charset=utf-8
Expires: -1
Strict-Transport-Security: max-age=31536000; includeSubDomains
X-Content-Type-Options: nosniff
P3P: CP="DSP CUR OTPi IND OTRi ONL FIN"
x-ms-request-id: 2bdd9490-1b1c-4f3f-ace4-ad36c4fd4c04
x-ms-ests-server: 2.1.11829.9 - NCUS ProdSlices
Date: Mon, 12 Jul 2021 20:53:12 GMT
Connection: close
Content-Length: 1235

{
  "token_type": "Bearer",
  "expires_in": 3598,
  "ext_expires_in": 3598,
  "access_token": "eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiIsIng1dCI6Im5PbzNaRHJPRFhFSz...."
}

Inspecting the token

The access token is a so-called Json Web Token. It consists of three parts, delimited by a dot. The first two parts are Base64 encoded and can be easily inspected with the website https://jwt.io. Just copy and paste the access token on this website to see the results.

Below is an example of a decoded access token. Note the roles property which has value API.ReadWrite.All. This indicates that the application has been granted this permission in the Azure AD.

{
  "aud": "https://api.businesscentral.dynamics.com",
  "iss": "https://sts.windows.net/5ffa1f42-fc0f-4f71-a789-2e225ce70d09/",
  "iat": 1626123761,
  "nbf": 1626123761,
  "exp": 1626127661,
  "aio": "E2ZgYGgO8g8RMDI59vJwtkH8cvetAA==",
  "appid": "3870c15c-5700-4704-8b1b-e020052cc860",
  "appidacr": "1",
  "idp": "https://sts.windows.net/5ffa1f43-fc0f-4f71-a789-2e215ce70d09/",
  "idtyp": "app",
  "oid": "92640508-f706-49db-bcc6-1cc58543ca48",
  "rh": "0.AQwAQx_6Xw_8cU-niS4hXOcNCVzBcDgAVwRHixvgIAUsyGAMAAA.",
  "roles": [
    "API.ReadWrite.All"
  ],
  "sub": "92640508-f706-49db-bcc6-1cc58543ca48",
  "tid": "5ffa1f42-fc0f-4f71-a789-2e225ce70d09",
  "uti": "V8dCG6Em5EWBDxyjel0-AA",
  "ver": "1.0"
}
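If you prefer not to paste tokens into a website, you can also decode the payload locally. Here is a small PowerShell sketch (the token value is a placeholder):

# Decode the payload (the second, Base64url-encoded part) of a JWT
$jwt = "<paste your access token here>"
$payload = $jwt.Split('.')[1].Replace('-', '+').Replace('_', '/')
# Base64 strings must be padded to a multiple of 4 characters
switch ($payload.Length % 4) {
    2 { $payload += '==' }
    3 { $payload += '=' }
}
[Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($payload)) | ConvertFrom-Json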

Using the access token

With the resulting access token, we can call the Business Central API. The access token must be added to the Authorization header with the value Bearer <token>. Please note that in the code above the token was stored in a variable, using the feature of the REST client to give a request a name and then work with the response as a variable.

@baseurl = https://api.businesscentral.dynamics.com/v2.0/sandbox

#######################################################################
#    ____      _                                           _           
#   / ___| ___| |_    ___ ___  _ __ ___  _ __   __ _ _ __ (_) ___  ___ 
#  | |  _ / _ \ __|  / __/ _ \| '_ ` _ \| '_ \ / _` | '_ \| |/ _ \/ __|
#  | |_| |  __/ |_  | (_| (_) | | | | | | |_) | (_| | | | | |  __/\__ \
#   \____|\___|\__|  \___\___/|_| |_| |_| .__/ \__,_|_| |_|_|\___||___/
#                                       |_|                            
######################################################################
# @name companies
GET {{baseurl}}/api/v2.0/companies
Authorization: Bearer {{token}}

###
@companyid = {{companies.response.body.value[0].id}}


######################################################################
#    ____      _                    _                                
#   / ___| ___| |_    ___ _   _ ___| |_ ___  _ __ ___   ___ _ __ ___ 
#  | |  _ / _ \ __|  / __| | | / __| __/ _ \| '_ ` _ \ / _ \ '__/ __|
#  | |_| |  __/ |_  | (__| |_| \__ \ || (_) | | | | | |  __/ |  \__ \
#   \____|\___|\__|  \___|\__,_|___/\__\___/|_| |_| |_|\___|_|  |___/
#
######################################################################
GET {{baseurl}}/api/v2.0/companies({{companyid}})/customers
Authorization: Bearer {{token}}

If everything is correctly set up, then you will get the usual response.

Error messages

If the application does not have a service principal in the Azure AD (because it was not granted consent) then you will not receive an error message when you request the token. Instead, you will receive an access token with no permissions. Compare the access token below with the previous one, and note that it does not contain the roles property.

{
  "aud": "https://api.businesscentral.dynamics.com",
  "iss": "https://sts.windows.net/5ffa1f42-fc0f-4f71-a789-2e225ce70d09/",
  "iat": 1626123054,
  "nbf": 1626123054,
  "exp": 1626126954,
  "aio": "E2ZgYPDN3bhAgVnuuxGDlPE8HSNvAA==",
  "appid": "3870c15c-5700-4704-8b1b-e020052cc860",
  "appidacr": "1",
  "idp": "https://sts.windows.net/5ffa1f43-fc0f-4f71-a789-2e215ce70d09/",
  "idtyp": "app",
  "rh": "0.AQwAQx_6Xw_8cU-niS4hXOcNCVzBcDgAVwRHixvgIAUsyGAMAAA.",
  "tid": "5ffa1f42-fc0f-4f71-a789-2e225ce70d09",
  "uti": "hcjDPayjqUyOwYoldBjfAA",
  "ver": "1.0"
}

When you use this token to call the Business Central API, then you will get this error message. This means that the application account was not granted consent.

<error xmlns="http://docs.oasis-open.org/odata/ns/metadata">
  <code>Unauthorized</code>
  <message>The credentials provided are incorrect</message>
</error>

If the application has been granted consent, then it is still restricted by the assigned permissions in Business Central. In case the application makes a call that is not allowed, then the response will have status 400 Bad Request and the body contains the details. For example:

{
  "error": {
    "code": "Internal_ServerError",
    "message": "You do not have the following permissions on TableData Contact: Insert.\r\n\r\nTo view details about your permissions, see the Effective Permissions page. To report a problem, refer to the following server session ID: '1266'.  CorrelationId:  56a72e47-44db-4d8e-9608-826ca693bc0c."
  }
}

Automatically retrieve access token

Finally, I would like to mention a feature of the REST Client that allows you to automatically retrieve the token, cache it, and refresh it when it expires. This is done with a system variable $aadV2Token. It requires some configuration, but it's extremely easy to set and forget. It works as follows:

The system variable should be referenced as {{$aadV2Token appOnly}}. The option appOnly is needed to make use of the client credentials flow. The system variable requires some settings that must be provided as environment variables. The REST Client has a feature to set environment variables in the VS Code settings file (user or workspace level). You can group those variables under a name; together they form an environment. By creating multiple groups with similar variables in them, you can easily switch between those environments without manually changing the values or storing them in the source file.

Below is an example of a workspace settings file. Some of the aadV2 variables are defined in the $shared group. This makes the variables available to all other environments. The environment-specific variables are in their respective groups.

    "rest-client.environmentVariables": {
        "$shared": {
            "aadV2ClientId": "3870c15c-5700-4704-8b1b-e020052cc860",
            "aadV2ClientSecret": "YCA-K80B69XDY-4~e3M.zrn3P.BkUdj-4.",
            "aadV2AppUri": "https://api.businesscentral.dynamics.com/"
        },
        "cronus.company-demo": {
            "aadV2TenantId": "cronus.company",
            "baseurl": "https://api.businesscentral.dynamics.com/v2.0/demo"
        },
        "kauffmann.nl-sandbox": {
            "aadV2TenantId": "kauffmann.nl",
            "baseurl": "https://api.businesscentral.dynamics.com/v2.0/sandbox"
        }
    }

The names of these aadV2… variables must be defined exactly as in the example. The aadV2AppUri should be the value https://api.businesscentral.dynamics.com/. The REST Client will automatically append .default to it. After configuring these settings, the code in the REST Client does not require a separate request to retrieve the access token. That is now automatically managed for us. Note that I've also added the variable baseurl to the environment settings.

To select the environment you can either open the command palette and choose Rest Client: Switch Environment, or find the current environment in the bottom right corner and click on it.

#######################################################################
#    ____      _                                           _           
#   / ___| ___| |_    ___ ___  _ __ ___  _ __   __ _ _ __ (_) ___  ___ 
#  | |  _ / _ \ __|  / __/ _ \| '_ ` _ \| '_ \ / _` | '_ \| |/ _ \/ __|
#  | |_| |  __/ |_  | (_| (_) | | | | | | |_) | (_| | | | | |  __/\__ \
#   \____|\___|\__|  \___\___/|_| |_| |_| .__/ \__,_|_| |_|_|\___||___/
#                                       |_|                            
######################################################################
# @name companies
get {{baseurl}}/api/v2.0/companies
Authorization: Bearer {{$aadV2Token appOnly}}

###
@companyid = {{companies.response.body.value[0].id}}


######################################################################
#    ____      _                    _                                
#   / ___| ___| |_    ___ _   _ ___| |_ ___  _ __ ___   ___ _ __ ___ 
#  | |  _ / _ \ __|  / __| | | / __| __/ _ \| '_ ` _ \ / _ \ '__/ __|
#  | |_| |  __/ |_  | (__| |_| \__ \ || (_) | | | | | |  __/ |  \__ \
#   \____|\___|\__|  \___|\__,_|___/\__\___/|_| |_| |_|\___|_|  |___/
#
######################################################################
get {{baseurl}}/api/v2.0/companies({{companyid}})/customers
Authorization: Bearer {{$aadV2Token appOnly}}

Tip: if you start typing {{$aadV2Token}} and do not add appOnly, then you will get popups in VS Code to copy a code and log in via a browser. Just click cancel and add the option appOnly to get rid of these popups.

PowerShell

The raw request from the REST Client could be easily translated into PowerShell as follows:

#########################
# PowerShell example
#########################

$clientid     = "3870c15c-5700-4704-8b1b-e020052cc860"
$clientsecret = "~FJRgS5q0YsAEefkW-_pA4ENJ_vIh-5RV9"
$scope        = "https://api.businesscentral.dynamics.com/.default"
$tenant       = "kauffmann.nl"
$environment  = "sandbox"
$baseurl      = "https://api.businesscentral.dynamics.com/v2.0/$environment"

# Get access token
$body = @{grant_type="client_credentials";scope=$scope;client_id=$ClientID;client_secret=$ClientSecret}
$oauth = Invoke-RestMethod -Method Post -Uri $("https://login.microsoftonline.com/$tenant/oauth2/v2.0/token") -Body $body

# Get companies
$companies = Invoke-RestMethod `
             -Method Get `
             -Uri $("$baseurl/api/v2.0/companies") `
             -Headers @{Authorization='Bearer ' + $oauth.access_token}

$companyid = $companies.value[0].id

# Get customers
$customers = Invoke-RestMethod `
             -Method Get `
             -Uri $("$baseurl/api/v2.0/companies($companyid)/customers") `
             -Headers @{Authorization='Bearer ' + $oauth.access_token}

The script above needs to handle the expiration of the token itself, which happens after 60 minutes. Most probably this results in getting a new token every time the script runs. It's recommended to install the MSAL.PS module. This PowerShell module has a function Get-MsalToken which handles the call to Azure to acquire the token, and it uses a cache to securely save and reuse the token until it expires. To install the module run this command:

Install-Module -name MSAL.PS -Force -AcceptLicense

Now we can simplify the script by replacing the Invoke-RestMethod to get the access token with the command Get-MsalToken.

##############################
# PowerShell example with MSAL
##############################

$clientid     = "3870c15c-5700-4704-8b1b-e020052cc860"
$clientsecret = "~FJRgS5q0YsAEefkW-_pA4ENJ_vIh-5RV9"
$scope        = "https://api.businesscentral.dynamics.com/.default"
$tenant       = "kauffmann.nl"
$environment  = "sandbox"
$baseurl      = "https://api.businesscentral.dynamics.com/v2.0/$environment"

# Get access token
$token = Get-MsalToken `
         -ClientId $clientid `
         -TenantId $tenant `
         -Scopes $scope `
         -ClientSecret (ConvertTo-SecureString -String $clientsecret -AsPlainText -Force)

# Get companies
$companies = Invoke-RestMethod `
             -Method Get `
             -Uri $("$baseurl/api/v2.0/companies") `
             -Headers @{Authorization='Bearer ' + $token.AccessToken}

$companyid = $companies.value[0].id

# Get customers
$customers = Invoke-RestMethod `
             -Method Get `
             -Uri $("$baseurl/api/v2.0/companies($companyid)/customers") `
             -Headers @{Authorization='Bearer ' + $token.AccessToken}

If you are using PowerShell Core, then the script can be simplified further because of some new parameters for authentication. Instead of using the Headers parameter on Invoke-RestMethod, we can use the Authentication and Token parameters. The Token parameter is a secure string, created from the token that was retrieved with the Get-MsalToken command.

##############################
# PowerShell example with MSAL
##############################

$clientid     = "3870c15c-5700-4704-8b1b-e020052cc860"
$clientsecret = "~FJRgS5q0YsAEefkW-_pA4ENJ_vIh-5RV9"
$scope        = "https://api.businesscentral.dynamics.com/.default"
$tenant       = "kauffmann.nl"
$environment  = "sandbox"
$baseurl      = "https://api.businesscentral.dynamics.com/v2.0/$environment"

# Get access token
$token = Get-MsalToken `
         -ClientId $clientid `
         -TenantId $tenant `
         -Scopes $scope `
         -ClientSecret (ConvertTo-SecureString -String $clientsecret -AsPlainText -Force)

$secureToken = ConvertTo-SecureString -String $token.AccessToken -AsPlainText -Force

# Get companies
$companies = Invoke-RestMethod `
             -Method Get `
             -Uri $("$baseurl/api/v2.0/companies") `
             -Authentication OAuth `
             -Token $secureToken

$companyid = $companies.value[0].id

# Get customers
$customers = Invoke-RestMethod `
             -Method Get `
             -Uri $("$baseurl/api/v2.0/companies($companyid)/customers") `
             -Authentication OAuth `
             -Token $secureToken

The next blog post will discuss how to use Postman and Insomnia to get the OAuth token for Business Central. Stay tuned!


Service to service authentication in Business Central 18.3 – How to test (Postman & Insomnia)


This is the next blog post in a series about service to service authentication in Business Central. Please see the previous blog posts for more information:

In this post, I want to give some tips on how to test with Postman and Insomnia. Both are tools that are being used by Business Central developers, with most probably Postman the one that is mostly used.

Postman

Setting the access token

The access token must be entered on the Authorization tab of the request. First set Type to OAuth 2.0, and then you can enter the token under Current Token in the Access Token field. The default setting is to add the token to the request headers with prefix Bearer. Make sure that this setting is not changed.

Retrieving the access token

How to get the token with Postman? A simple way would be to prepare a separate request to retrieve the access token and then copy the token from the response and paste it into the authorization header of the other requests. The request body must be of type x-www-form-urlencoded and contain the same parameters as discussed in the previous blog posts. Here is an example:
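In raw form, that request is a sketch like this (apart from {{AADTenant}}, the variable names are my own):

POST https://login.microsoftonline.com/{{AADTenant}}/oauth2/v2.0/token
Content-Type: application/x-www-form-urlencoded

grant_type=client_credentials
&client_id={{ClientId}}
&client_secret={{ClientSecret}}
&scope=https://api.businesscentral.dynamics.com/.default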

Note that the URL of the request uses a variable {{AADTenant}}. This variable is set in the environment variables. Of course, you can also use variables for the values in the request body. As explained in the previous blog posts, AADTenant represents the Azure AD tenant for which you want to access Business Central.

Managing access tokens

But Postman offers a different way of managing OAuth tokens. The OAuth parameters can be specified directly under the Authorization tab of a request. Below the Current Token (where you enter the access token to be used) you will find another tab named Configure New Token. Here you can enter the details to retrieve an access token. Make sure to set the Grant Type to Client Credentials. Optionally you can give the token a name. Below is an example of the Authorization tab with exactly the same parameters as used in the request above.

Postman is not going to automatically retrieve the access token for us. First we need to click on the button Get New Access Token. If the values are correct, then you should get a popup like this:

After this screen closes, a new popup screen is displayed that shows the retrieved access tokens. This screen contains a history of access tokens from which you can select one to use for the request. The example below shows three available access tokens. Expired tokens are displayed as strikethrough. They can be deleted individually or you can use the Delete option in this screen to delete all expired tokens at once. I would recommend that you do that regularly.

By clicking on Use Token you select that token to be used for the request. Back in the request, under Current Token you will also find a dropdown named Available Tokens. This can be used to select another token or to reopen the manage access tokens window, where you can select another token or delete the expired ones.

Configuring Authorization on a parent level

So far, I’ve only talked about setting the authorization per request. However, that’s not ideal. Instead, I would recommend configuring the Authorization on folder or collection level. All requests inside a folder or collection can inherit the authorization settings from their parent. Then you only have to select the current token once on collection or folder level and you can use it for all requests inside the folder or collection until the token has expired. I assume further screenshots are not required to show how to open the collection or folder settings. It’s pretty straightforward.

Insomnia

To be honest, this tool was completely new to me. I got a question from somebody struggling with setting up the client credentials flow in Insomnia. So I decided to install the tool and give it a shot. And because I figured that more people might be using Insomnia, I wanted to include my findings here.

Using Environments

Insomnia also supports environments with variables to be used in the requests. They are basically JSON files and that works pretty easily. Here is an example of a very simple environment settings file.
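As a minimal sketch, such an environment file could look like this (the variable names are my own choice; the values are the same demo values as used throughout this series):

{
    "AADTenant": "kauffmann.nl",
    "ClientId": "3870c15c-5700-4704-8b1b-e020052cc860",
    "ClientSecret": "~FJRgS5q0YsAEefkW-_pA4ENJ_vIh-5RV9",
    "Scope": "https://api.businesscentral.dynamics.com/.default"
}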

In the screenshots below I’ve used those variables from the environment to illustrate it.

Setting Authorization

Setting the authorization type on a request is only slightly different from Postman. Just click on the Auth tab. Actually, you need to click twice, first to enable it and then a second time to get the dropdown menu to select the authorization type. Select OAuth 2.0 from the menu.

Select Client Credentials from the Grant Type options. Then fill in the Access Token URL, it should be the same as used in Postman. If you use an environment variable, then make sure to select the correct one. Just click on the variable and select the one you want to use from the popup. Also fill in the Client Id and Client Secret. Then expand the Advanced Options and enter the Scope. Just as a reminder, the scope value is https://api.businesscentral.dynamics.com/.default

I wonder why the scope is under the advanced options. I would expect it to be part of the standard options. It’s easily missed because it’s hidden by default. But for most OAuth flows it’s a mandatory setting. The other settings under the Advanced Options can be left as default.

Retrieving Access Token

One thing that Insomnia does better than Postman is that you don’t have to retrieve the token first. Just hit the Send button and the token will be retrieved automatically if it is not available. This is what the result looks like:

Configuring Authorization on a parent level

It seems that Insomnia does not support configuring the authorization on a parent level, like folders or environments. It’s a requested feature, being discussed here. In the discussion, somebody also mentioned a plug-in that he created for this purpose. See here for more information. I haven’t tried this myself, but if you are an Insomnia user you might want to give it a shot. Please let us know in the comments below if it worked.

Good luck with using Postman or Insomnia! The next post in this series will contain a C# example to use the Client Credentials Flow.

Service to service authentication in Business Central 18.3 – How to use in C#


We continue the series about Service to Service authentication, aka Client Credentials Flow, with some tips about getting and using the access token with C#.

Please see the previous posts in this series for more information about how to set up and test the Client Credentials Flow.

In this post, I want to look at C# and give some tips on retrieving an access token and using it to call a Business Central API. All example code below is just a .Net 5.0 Console application. It should be pretty easy to apply the code to an Azure Function or any other program you may have. The required values for Client Id, Client Secret, and AAD Tenant ID are defined as variables to keep the code as simple as possible, but you may consider putting them into a config file or maybe an Azure Key Vault.

Retrieve access token with raw request

Let’s first look at what I named the ‘raw request’. It’s basically a copy of the request from the PowerShell example. The HttpClient is used to call the Azure token endpoint. There is no third-party component involved; everything is standard .Net code.

using System;
using System.Threading.Tasks;
using System.Collections.Generic;
using System.Net.Http;
using System.Text.Json;

namespace OAuthClientCredentialsDemoRaw
{
    class Program
    {
        private const string ClientId = "3870c15c-5700-4704-8b1b-e020052cc860";
        private const string ClientSecret = "~FJRgS5q0YsAEefkW-_pA4ENJ_vIh-5RV9";
        private const string AadTenantId = "kauffmann.nl";
        private const string Authority = "https://login.microsoftonline.com/{AadTenantId}/oauth2/v2.0/token";

        static void Main(string[] args)
        {
            string accessToken = GetAccessToken(AadTenantId).Result;
        }

        static async Task<string> GetAccessToken(string aadTenantId)
        {
            string accessToken = string.Empty;

            using (HttpClient httpClient = new HttpClient())
            {
                Uri uri = new Uri(Authority.Replace("{AadTenantId}", aadTenantId));
                Dictionary<string, string> requestBody = new Dictionary<string, string>
                {
                    {"grant_type", "client_credentials" },
                    {"client_id" , ClientId },
                    {"client_secret", ClientSecret },
                    {"scope", @"https://api.businesscentral.dynamics.com/.default" }
                };

                FormUrlEncodedContent request = new FormUrlEncodedContent(requestBody);

                try
                {
                    HttpResponseMessage response = await httpClient.PostAsync(uri, request);
                    string content = await response.Content.ReadAsStringAsync();

                    if (response.IsSuccessStatusCode)
                    {
                        JsonDocument document = JsonDocument.Parse(content);
                        accessToken = document.RootElement.GetProperty("access_token").GetString();
                        
                        Console.ForegroundColor = ConsoleColor.Green;
                        Console.WriteLine("Token acquired");
                        Console.ResetColor();
                    }
                    else
                    {
                        Console.ForegroundColor = ConsoleColor.Red;
                        Console.WriteLine($"Failed to retrieve access token: {response.StatusCode} {response.ReasonPhrase}");
                        Console.WriteLine($"Content: {content}");
                        Console.ResetColor();
                    }
                }
                catch (HttpRequestException ex)
                {
                    Console.ForegroundColor = ConsoleColor.Red;
                    Console.WriteLine($"Error occurred while retrieving access token");
                    Console.WriteLine($"{ex.Message}");
                    Console.ResetColor();
                }

            }

            return accessToken;
        }
    }
}

Let’s now look at a different way to retrieve the access token. A much better way in my opinion.

Retrieve access token with MSAL

Similar to what I wrote in the PowerShell example, we can use Microsoft Authentication Library for .NET (MSAL.NET) to do the hard work. You can find the MSAL repository at GitHub: https://github.com/AzureAD/microsoft-authentication-library-for-dotnet. To add this to the code, we need to get the NuGet package. I assume that you know how to do this in Visual Studio or VS Code.

Here is a code example to retrieve the access token with the MSAL.NET library:

using System;
using System.Threading.Tasks;
using Microsoft.Identity.Client;

namespace OAuthClientCredentialsDemo
{
    class Program
    {
        private const string ClientId = "3870c15c-5700-4704-8b1b-e020052cc860";
        private const string ClientSecret = "~FJRgS5q0YsAEefkW-_pA4ENJ_vIh-5RV9";
        private const string AadTenantId = "kauffmann.nl";
        private const string Authority = "https://login.microsoftonline.com/{AadTenantId}/oauth2/v2.0/token";

        static void Main(string[] args)
        {
            AuthenticationResult authResult = GetAccessToken(AadTenantId).Result;
        }

        static async Task<AuthenticationResult> GetAccessToken(string aadTenantId)
        {
            Uri uri = new Uri(Authority.Replace("{AadTenantId}", aadTenantId));

            IConfidentialClientApplication app = ConfidentialClientApplicationBuilder.Create(ClientId)
                .WithClientSecret(ClientSecret)
                .WithAuthority(uri)
                .Build();

            string[] scopes = new string[] { @"https://api.businesscentral.dynamics.com/.default" };
            AuthenticationResult result = null;
            try
            {
                result = await app.AcquireTokenForClient(scopes).ExecuteAsync();
                
                Console.ForegroundColor = ConsoleColor.Green;
                Console.WriteLine("Token acquired");
                Console.ResetColor();
            }
            catch (MsalServiceException ex)
            {
                Console.ForegroundColor = ConsoleColor.Red;
                Console.WriteLine($"Error occurred while retrieving access token");
                Console.WriteLine($"{ex.ErrorCode} {ex.Message}");
                Console.ResetColor();
            }

            return result;
        }
    }
}

The ConfidentialClientApplication provides a method AcquireTokenForClient which uses the client credentials flow to retrieve the access token. This method is documented here: https://docs.microsoft.com/en-us/dotnet/api/microsoft.identity.client.confidentialclientapplication.acquiretokenforclient. No need to compose a request, make a call, and process the returned JSON. It’s all included!

Using the access token

Now that we have the access token it’s time to use it. The code below has been modified to call the Business Central API twice while reusing the access token. Please note that some variables have been added as well to compose the Business Central URL.

The most important part is how the access token is added to the request. That is done by setting the Authorization header in CallBusinessCentralAPI: client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", AuthResult.AccessToken);

using System;
using System.Threading.Tasks;
using System.Net.Http;
using System.Net.Http.Headers;
using Microsoft.Identity.Client;

namespace OAuthClientCredentialsDemo
{
    class Program
    {
        private const string ClientId = "3870c15c-5700-4704-8b1b-e020052cc860";
        private const string ClientSecret = "~FJRgS5q0YsAEefkW-_pA4ENJ_vIh-5RV9";
        private const string AadTenantId = "kauffmann.nl";
        private const string Authority = "https://login.microsoftonline.com/{AadTenantId}/oauth2/v2.0/token";
        private const string BCEnvironmentName = "sandbox";
        private const string BCCompanyId = "64d41503-fcd7-eb11-bb70-000d3a299fca";
        private const string BCBaseUrl = "https://api.businesscentral.dynamics.com/v2.0/{BCEnvironmentName}/api/v2.0/companies({BCCompanyId})";
        private static AuthenticationResult AuthResult = null;

        static void Main(string[] args)
        {
            string customers = CallBusinessCentralAPI(BCEnvironmentName, BCCompanyId, "customers").Result;
            string items = CallBusinessCentralAPI(BCEnvironmentName, BCCompanyId, "items").Result;
        }

        static async Task<AuthenticationResult> GetAccessToken(string aadTenantId)
        {
            Uri uri = new Uri(Authority.Replace("{AadTenantId}", aadTenantId));

            IConfidentialClientApplication app = ConfidentialClientApplicationBuilder.Create(ClientId)
                .WithClientSecret(ClientSecret)
                .WithAuthority(uri)
                .Build();

            string[] scopes = new string[] { @"https://api.businesscentral.dynamics.com/.default" };
            AuthenticationResult result = null;
            try
            {
                result = await app.AcquireTokenForClient(scopes).ExecuteAsync();
                Console.ForegroundColor = ConsoleColor.Green;
                Console.WriteLine("Token acquired");
                Console.ResetColor();
            }
            catch (MsalServiceException ex)
            {
                Console.ForegroundColor = ConsoleColor.Red;
                Console.WriteLine($"Error occurred while retrieving access token");
                Console.WriteLine($"{ex.ErrorCode} {ex.Message}");
                Console.ResetColor();
            }

            return result;
        }

        static async Task<string> CallBusinessCentralAPI(string bcEnvironmentName, string bcCompanyId, string resource)
        {
            string result = string.Empty;

            // Acquire a token only when there is none yet or when the cached one has expired
            if ((AuthResult == null) || (AuthResult.ExpiresOn < DateTime.Now))
            {
                AuthResult = await GetAccessToken(AadTenantId);
            }

            using (HttpClient client = new HttpClient())
            {
                client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", AuthResult.AccessToken);
                client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

                Uri uri = new Uri(GetBCAPIUrl(bcEnvironmentName, bcCompanyId, resource));
                HttpResponseMessage response = await client.GetAsync(uri);

                if (response.IsSuccessStatusCode)
                {
                    result = await response.Content.ReadAsStringAsync();
                }
                else
                {
                    Console.ForegroundColor = ConsoleColor.Red;
                    Console.WriteLine($"Call to Business Central API failed: {response.StatusCode} {response.ReasonPhrase}");
                    string content = await response.Content.ReadAsStringAsync();
                    Console.WriteLine($"Content: {content}");
                    Console.ResetColor();
                }
            }

            return result;
        }

        private static string GetBCAPIUrl(string bcEnvironmentName, string bcCompanyId, string resource)
        {
            return BCBaseUrl.Replace("{BCEnvironmentName}", bcEnvironmentName).Replace("{BCCompanyId}", bcCompanyId) + "/" + resource;
        }
    }
}

Please keep in mind that the code examples are not optimized in many ways. It’s only meant to get you started with client credentials flow for Business Central in C#. The result from the API call to Business Central needs to be parsed as a JSON document or deserialized as an object in order to work with the data. That depends on your scenario of course. Let me know if you want to see some examples of that as well!
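To give you a head start, here is a minimal sketch of that parsing step with System.Text.Json. It plugs into the example above and assumes the standard v2.0 customers response, where the records are wrapped in a value array:

using System.Text.Json;

// Minimal sketch: list the customer numbers and display names from the API response
string customers = CallBusinessCentralAPI(BCEnvironmentName, BCCompanyId, "customers").Result;
using (JsonDocument document = JsonDocument.Parse(customers))
{
    foreach (JsonElement customer in document.RootElement.GetProperty("value").EnumerateArray())
    {
        Console.WriteLine($"{customer.GetProperty("number").GetString()}: {customer.GetProperty("displayName").GetString()}");
    }
}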

Service to service authentication in Business Central 18.3 – How to use in AL


Because I got the question whether it is possible to use the client credentials flow from within Business Central, I decided to write a quick blog post about that as well. Codeunit OAuth2 provides a number of functions to acquire access tokens with different authorization flows, including the client credentials flow. The function AcquireTokenWithClientCredentials can be used for this purpose. Read more about the OAuth2 module here: https://github.com/microsoft/ALAppExtensions/tree/master/Modules/System/OAuth2.

Let’s just dive into the code. If you have read the PowerShell examples or C# example code, then the code below should be familiar.

codeunit 50100 BCConnector
{
    var
        ClientIdTxt: Label '3870c15c-5700-4704-8b1b-e020052cc860';
        ClientSecretTxt: Label '~FJRgS5q0YsAEefkW-_pA4ENJ_vIh-5RV9';
        AadTenantIdTxt: Label 'kauffmann.nl';
        AuthorityTxt: Label 'https://login.microsoftonline.com/{AadTenantId}/oauth2/v2.0/token';
        BCEnvironmentNameTxt: Label 'sandbox';
        BCCompanyIdTxt: Label '64d41503-fcd7-eb11-bb70-000d3a299fca';
        BCBaseUrlTxt: Label 'https://api.businesscentral.dynamics.com/v2.0/{BCEnvironmentName}/api/v2.0/companies({BCCompanyId})';
        AccessToken: Text;
        AccessTokenExpires: DateTime;

    trigger OnRun()
    var
        Customers: Text;
        Items: Text;
    begin
        Customers := CallBusinessCentralAPI(BCEnvironmentNameTxt, BCCompanyIdTxt, 'customers');
        Items := CallBusinessCentralAPI(BCEnvironmentNameTxt, BCCompanyIdTxt, 'items');
        Message(Customers);
        Message(Items);
    end;

    procedure CallBusinessCentralAPI(BCEnvironmentName: Text; BCCompanyId: Text; Resource: Text) Result: Text
    var
        Client: HttpClient;
        Response: HttpResponseMessage;
        Url: Text;
    begin
        // Get a new token if there is none yet or if the current one has expired
        if (AccessToken = '') or (AccessTokenExpires = 0DT) or (AccessTokenExpires < CurrentDateTime) then
            GetAccessToken(AadTenantIdTxt);

        Client.DefaultRequestHeaders.Add('Authorization', GetAuthenticationHeaderValue(AccessToken));
        Client.DefaultRequestHeaders.Add('Accept', 'application/json');

        Url := GetBCAPIUrl(BCEnvironmentName, BCCompanyId, Resource);
        if not Client.Get(Url, Response) then
            if Response.IsBlockedByEnvironment then
                Error('Request was blocked by environment')
            else
                Error('Request to Business Central failed\%1', GetLastErrorText());

        if not Response.IsSuccessStatusCode then
            Error('Request to Business Central failed\%1 %2', Response.HttpStatusCode, Response.ReasonPhrase);

        Response.Content.ReadAs(Result);
    end;

    local procedure GetAccessToken(AadTenantId: Text)
    var
        OAuth2: Codeunit OAuth2;
        Scopes: List of [Text];
    begin
        Scopes.Add('https://api.businesscentral.dynamics.com/.default');
        if not OAuth2.AcquireTokenWithClientCredentials(ClientIdTxt, ClientSecretTxt, GetAuthorityUrl(AadTenantId), '', Scopes, AccessToken) then
            Error('Failed to retrieve access token\%1', GetLastErrorText());
        // The OAuth2 codeunit does not return the token lifetime; assume the default of 3599 seconds
        AccessTokenExpires := CurrentDateTime + (3599 * 1000);
    end;

    local procedure GetAuthenticationHeaderValue(AccessToken: Text) Value: Text;
    begin
        Value := StrSubstNo('Bearer %1', AccessToken);
    end;

    local procedure GetAuthorityUrl(AadTenantId: Text) Url: Text
    begin
        Url := AuthorityTxt;
        Url := Url.Replace('{AadTenantId}', AadTenantId);
    end;

    local procedure GetBCAPIUrl(BCEnvironmentName: Text; BCCompanyId: Text; Resource: Text) Url: Text;
    begin
        Url := BCBaseUrlTxt;
        Url := Url.Replace('{BCEnvironmentName}', BCEnvironmentName)
                  .Replace('{BCCompanyId}', BCCompanyId);
        Url := StrSubstNo('%1/%2', Url, Resource);
    end;
}

Remarks

I’ve tried to keep the code close to the C# example. There is definitely room for improvement, and JSON handling should be added here as well. Secrets should not be stored in code like this; for the Business Central SaaS environment, I would definitely go with Azure Key Vault storage.

The only thing I was really missing is handling the lifetime of the access token. The OAuth2 codeunit just returns an access token without any information about its expiration. In the code above I’ve handled that myself by assuming the default lifetime of 60 minutes (access tokens are usually returned with expires_in = 3599 seconds).

Another small thing I noticed is the RedirectURL parameter of the function AcquireTokenWithClientCredentials. That parameter doesn’t make sense: there is no redirect URL in the client credentials flow. So I passed in an empty string, and luckily that worked. The parameter could be completely removed in my opinion.

That’s it! With this blog post I finish the series about using the client credentials flow with Business Central APIs. But I’m not done with OAuth, not by far! I’d like to write something about setting up OAuth for on-prem installations as well. And I’m open to suggestions, just shoot me a message!

Configuring Business Central for Azure Active Directory authentication and OAuth (1)


It’s a recurring question I get quite frequently: how to set up Business Central on-premises for Azure Active Directory authentication? And related to that, how to call APIs on Business Central on-premises with OAuth authentication? I know there are quite a few sources out there, like docs and blog posts about this topic. But I figured it would make sense to write it up myself as well, at least for my own record.

All information below can also be found at Microsoft docs in an excellent step through explanation: Authenticating Business Central Users with Azure Active Directory.

But there might be some pieces that I recommend differently… 😊 And I will add some background information here and there.

Prerequisites

There are a few prerequisites that are not in scope for this blog post.

  • Business Central needs to be configured with a security certificate. This is a standard requirement for the credential-type setting that supports Azure AD authentication. The certificate can be a self-signed certificate for development environments, but for a production environment, it should be obtained from a trusted certificate authority. It’s also possible to get a free certificate from Let’s Encrypt. For more information about setting up security certificates with Business Central see this Microsoft docs article.
  • An active Azure subscription is required. If you don’t have one, then it’s time to sign up for a free Azure subscription here. Then create an Azure Active Directory tenant as described here. Azure Active Directory is a free service and it will stay free forever, as indicated here.
  • The Azure AD tenant must contain users that will be mapped to the users in Business Central. Read here how to manage users in Azure Active Directory. The main difference with Business Central database authentication is that the authentication for these users will be managed by Azure AD. Passwords and other security settings like multi-factor authentication are also managed inside Azure AD and not in Business Central.
  • Microsoft also recommends to limit the lifetime of an access token to 10 minutes. This procedure is explained here. But let me also explain the why. The default lifetime of an access token is 1 hour, and an access token can’t be revoked. That means that if somebody were able to steal an access token, they could use it for up to an hour. Limiting the lifetime increases security. This doesn’t mean that a user has to log in again every 10 minutes. It’s just a matter of how long an access token can be used to start a session. After that, when a session is started, access is maintained for as long as the session is active. Later on you will also see how to extend the time that a user session may be inactive before the connection is dropped.

Business Central on-premises environment

On-premises environments with Business Central are quite diverse. Over the years I’ve seen different setups and it’s quite hard to describe an environment that fits everybody’s needs. But there are some common characteristics that apply to the majority of environments.

  • Customer environments with multiple BC servers, like production and test.
  • Partner environments with multiple BC servers, for different purposes like development, demo and test.
  • In most cases those environments have the same user accounts.
  • The BC servers are often configured with Windows authentication.

In many cases, companies already use Azure Active Directory. Some have configured synchronization or federation between on-premises Windows Active Directory and Azure Active Directory, allowing users to access cloud services like Office365 with their on-premises credentials. It makes total sense to configure the on-premises Business Central environments to work with Azure Active Directory rather than Windows authentication. It decouples the Business Central environment from the on-premises Windows Active Directory, while user accounts can still be shared across multiple Business Central environments. And for developers creating API integrations, it allows them to work with OAuth authentication against a local development environment.

In this article, I want to explain how to set up multiple Business Central environments, PROD and TEST. They will be accessible with these URLs:

  • https://prod.cronus.company/bc
  • https://test.cronus.company/bc

In case you are doing this for one environment, just skip the steps that mention the second URL.

Azure Active Directory

The first step is to register the Business Central environment as an application in Azure Active Directory. This is required to give the Business Central environment its own identity that accepts Azure AD logins.

You can choose to register each individual Business Central environment as an application in Azure Active Directory or to create one application registration that covers them all. The option to register one application for multiple Business Central environments is easier because it requires less configuration. It is even possible to register one application in Azure Active Directory that works for multiple Azure Tenants.

To make this easier to understand, it would help to consider the application registration in Azure Active Directory as an application rather than a Business Central service tier. All Business Central environments of one customer are (most probably) the same application, with the same set of users. It could be different versions of the application, but still the same application.

From the point of view of an ISV, it could be one application running in multiple environments for developers, for testing purposes, or for demos. In different versions, in different stages of development, but still the same application.

Actually, you could even decide to have just one Azure app registration covering all your Business Central environments. 😊

Why am I elaborating on this point? You will see when we get to the configuration settings. Let’s dive into it.

Open the Azure portal at https://portal.azure.com and open Azure Active Directory (direct link). Open App registrations in the left menu and then click on New registration.

The properties that need to be filled in are a name for the app registration, the supported account types, and the Redirect URI: select Web as the platform and enter the sign-in URL of the web client, in this example https://prod.cronus.company/bc/SignIn.

Click on Register to create the app registration.

The app registration now has one redirect URL, for the PROD environment. We need to add another one for the TEST environment.

Click on Authentication in the left menu. Then click on Add URI in the box with the title Web. Here we add the URL for the TEST environment. Don’t forget to click the Save button at the top.

This step must be repeated for every single Business Central environment for which you want to use Azure Active Directory authentication. They all have their own unique redirect URL, and they must all be registered here.

Tip: If you have multiple Business Central server tiers running on a single server, then you don’t need to specify a redirect URL for each one of them. Just specify one redirect URI with only the hostname, e.g. https://bcserver.cronus.company. That should work for all Business Central server instances on that server. I couldn’t find a way to work with wildcards (which is considered unsafe anyway), so if you use subdomains as I do in this example, then you need to specify a redirect URL for each subdomain. You can specify up to 256 redirect URLs.

The next step is to create an Application ID URI. Click on Expose an API in the left menu. Then at the top of the page, click on Set right after the text Application ID URI. A popup will be displayed with a suggested URI in the format api://[guid], where the guid is the application id that was assigned to the app registration.

You can choose to use the suggested URI or create your own. It must start with https, api, urn, or ms-appx and have no slash at the end. To use an https URI, the domain must be a verified domain. In this example, I choose to use https://businesscentral.cronus.company.

One may ask why I don’t use the full URL of the Business Central server instance, https://prod.cronus.company/bc. The reason is that I want to use this for multiple Business Central environments. So I’d rather go with a generic URI instead of one that is linked to a specific server. This is a URI, not a URL. A URI is an identifier of a resource and does not tell how to access it (that’s what a URL is for).

That’s it, no more settings are required in Azure. If you have been here before, you may have done extra steps like creating a scope or adding an app role in order to use OAuth authentication with APIs. As you will see in the next blog post, this is not required because there is an easier way.

Business Central server configuration

Let’s move over to Business Central and configure it for Azure Active Directory authentication. The first step is to set up at least one user for Azure AD authentication. Otherwise, you won’t be able to sign in after the switch to Azure AD.

A Business Central user account must be associated with an Azure AD account. This is done by specifying the user principal name in Azure AD (usually the email address) in the field Authentication Email on the User Card. This field is only editable in on-premises environments. Keep in mind, the field will not be editable in Docker sandbox containers because they behave like SaaS environments. For this reason, and because the other settings are also done with PowerShell, I prefer to set this field using a PowerShell command. See below for the full PowerShell script, including the command to configure the user.

Now we are ready to configure the Business Central server. The settings that must be configured are ClientServicesCredentialType, WSFederationLoginEndpoint, ClientServicesFederationMetadataLocation, DisableTokenSigningCertificateValidation and ExtendedSecurityTokenLifetime; see the full script below.

After configuring these settings, the Business Central server instance must be restarted.

The final step is to configure the web client to use AAD login. This can also be done with a simple PowerShell command. Of course, you can also change this setting manually in the navsettings.json file of the web client.

The full script to configure the server looks like this:

$ApplicationIDURI = "https://businesscentral.cronus.company"
$AADTENANTID = "97ff4abe-bcfe-4592-bb21-aa249ad4e83a"
$RedirectURL = "https://{HOSTNAME}/BC/SignIn"

#Configure user
Set-NAVServerUser -ServerInstance BC -User "ADMIN" -AuthenticationEmail "admin@cronus.company"

#Configure Business Central server instance
Set-NAVServerConfiguration -ServerInstance BC -KeyName ClientServicesCredentialType -KeyValue NavUserPassword
Set-NAVServerConfiguration -ServerInstance BC -KeyName WSFederationLoginEndpoint -KeyValue "https://login.microsoftonline.com/$AADTENANTID/wsfed?wa=wsignin1.0%26wtrealm=$ApplicationIDURI%26wreply=$RedirectUrl"
Set-NAVServerConfiguration -ServerInstance BC -KeyName ClientServicesFederationMetadataLocation -KeyValue "https://login.microsoftonline.com/$AADTENANTID/FederationMetadata/2007-06/FederationMetadata.xml"
Set-NAVServerConfiguration -ServerInstance BC -KeyName DisableTokenSigningCertificateValidation -KeyValue true
Set-NAVServerConfiguration -ServerInstance BC -KeyName ExtendedSecurityTokenLifetime -KeyValue "8"

Restart-NAVServerInstance -ServerInstance BC

#Configure Web Client for AAD login
Set-NAVWebServerInstanceConfiguration -WebServerInstance BC -KeyName ClientServicesCredentialType -KeyValue "AccessControlService"

This script can be executed for all Business Central server instances that must be configured for AAD login for the same Azure tenant. In my scenario, I run this for both PROD and TEST server instances.

Tip: for multi-tenant environments, the settings are slightly different. I’ve not included the differences here to keep this blog post easy to follow. The Microsoft docs mentioned above will give you the required information for configuring multi-tenant environments.

The above script does not enable OAuth authentication for calling API web services. That’s for the next blog post. Stay tuned!

Configuring Business Central for Azure Active Directory authentication and OAuth (2)


After configuring Business Central on-premises for Azure Active Directory authentication, as explained in the previous blog post, it’s now time to configure it for OAuth authentication with APIs and web services.

Two options

There are two options that you can choose from: an easy option that requires only one setting, and a more complex option that requires extra configuration in Azure. Let me first explain the more complex option; once you’ve seen that, I’m sure you will love the easy option.

Option 1: Custom scope

As you may know, calling Business Central APIs on the SaaS platform with OAuth authentication needs a so-called scope. The scopes we can choose from are:

  • https://api.businesscentral.dynamics.com/user_impersonation
  • https://api.businesscentral.dynamics.com/Financials.ReadWrite.All
  • https://api.businesscentral.dynamics.com/API.ReadWrite.All
  • https://api.businesscentral.dynamics.com/Automation.ReadWrite.All

The scope consists of two parts: a resource followed by a permission or role. The resource is the full Application ID URI that is defined in the Azure app registration. In the previous blog post the example Application ID URI was defined as https://businesscentral.cronus.company. To get the full scopes for the Business Central on-premises environment, we just need to replace the standard resource with our custom resource. The scopes will then be:

  • https://businesscentral.cronus.company/user_impersonation
  • https://businesscentral.cronus.company/API.ReadWrite.All
  • https://businesscentral.cronus.company/Automation.ReadWrite.All

Note that I’ve skipped the Financials.ReadWrite.All permission. That’s only to be used with the Microsoft Graph endpoint and doesn’t apply to an on-premises environment.

These scopes need to be defined in Azure before we can use them. This is done in two places. Delegated scopes are defined under Expose an API and application scopes are defined under App roles.

Configure delegated permissions

Go to the app registration in Azure and click on Expose an API in the left menu. At the top of this page, you will see the defined Application ID URI. Click on Add a scope and define a new scope user_impersonation as follows:

Don’t forget to click on Add scope to finish this step. The screen now shows the added scope.

Now we need to test this permission by retrieving an OAuth access token and calling an API on our Business Central server instance. Let’s do this with Postman.

The first step is to create another app registration that represents our Postman installation. We do this in the same Azure Active Directory tenant as where the Business Central service tier has been created. Open Azure Active Directory, click on App registrations and then on New registration at the top. Give the app a name, e.g. Postman. Leave the other settings at the default values. Click on Register to create the app registration.

Make a copy of the Application (client) ID on the overview page. We need this value for Postman.

Click on Authentication in the left menu. Click on Add a platform and then on Mobile and desktop applications. Select the first default redirect URI: https://login.microsoftonline.com/common/oauth2/nativeclient. Finally, click on Configure. The Authentication page should show the selected redirect URI.

Make a copy of the redirect URI because we need this in Postman.

This is all we need to do for the app registration of Postman. You may think, what about setting the permission? Well, that’s not required because we will specify the required permission in Postman. That’s called dynamic permissions. And what about creating a secret? Again, not required because we are using a public client that can’t hold a secret. If you want to know more about this, then you may want to watch this video.

Test delegated permissions with Postman

Open Postman and create a new request. The URL of the request looks like: https://prod.cronus.company:7048/BC/api/v2.0. Open the Authorization tab and choose type OAuth 2.0. Then under Configure New Token specify these values: set Grant Type to Authorization Code, Auth URL to https://login.microsoftonline.com/[your tenant]/oauth2/v2.0/authorize, Access Token URL to https://login.microsoftonline.com/[your tenant]/oauth2/v2.0/token, Client ID to the application (client) id of the Postman app registration, Callback URL to the redirect URI copied earlier, and Scope to https://businesscentral.cronus.company/user_impersonation.

Click on Get New Access Token and log in with your Azure AD account. Now you should get a window that asks for your permission to allow Postman to access Business Central on your behalf.

Click on Accept and then Postman will finish the flow by retrieving the access token. If that is successful, then you will get a window in Postman with the access token. Click on Use Token to select this token for the API request.

That’s it. Hit the Send button to call your Business Central environment with OAuth authentication!

As you probably noticed, we didn’t even touch the Business Central server instance. The settings we did in the previous blog post to enable Azure AD authentication were sufficient to enable the OAuth for delegated user permissions.

Service to service authentication

But we are not done yet. This was the delegated user permissions flow, officially called the Authorization Code grant flow. There is also service-to-service authentication, known as the Client Credentials flow. That flow lets an external application use its own account rather than the logged-in user. Let’s configure that as well. Fasten your seatbelts, this requires a little more configuration!

Open the Azure app registration for the Business Central application. In the left menu click on App roles. Click on Create app role at the top. Define the app role with a display name, set the Value to API.ReadWrite.All, and select Applications as the allowed member type.

Click on Apply to create the app role. The App roles page will now look like this.

The next step is to add this permission to the Postman app registration. Application permissions can’t be added dynamically as we did with the delegated permissions earlier. Application permissions must be specified as static permissions on the app registration.

Open the Postman app registration in Azure and click on API permissions in the left menu. This page should currently have one configured permission: User.Read. Click on Add a permission above the list of configured permissions. Then click on My APIs at the top. The list should contain the name of the app registration for the Business Central application. Select this entry.

Select Application permissions and then select API.ReadWrite.All.

Click on Add permissions to add the selected permission and close the dialog. The list of configured permissions should now include the API.ReadWrite.All permission for the Business Central application. Click on Grant admin consent to pre-consent the permissions. This saves us an extra step and configuration when the application is created in Business Central.

The final step is to create a secret. The Client Credentials flow always requires a secret, regardless of whether the application is public or confidential. Click on Certificates & secrets in the left menu, then click on New client secret. In the dialog you can optionally set a description and change the expiration date. Then click on Add to create the secret.

Click on the copy button right behind the secret value and paste the secret somewhere (e.g. Notepad) for later usage.

The next step is to create an application account in Business Central for the Postman application. Open Business Central and search for Azure Active Directory Applications. Click on New to add a new record. Fill in the client id of the Postman app registration. This is the same client id that was used earlier in Postman and can be found on the Overview page of the Azure app registration. Give it a description and then add permissions. For test purposes, I usually go with user group D365 FULL ACCESS.

It’s not required to click on Grant Consent because we already granted the permissions in the Azure app registration. That’s why it’s called pre-consent.

Test service-to-service authentication with Postman

Let’s test this with Postman! Open Postman again and go to the OAuth settings of the same request as before. We are going to make a few changes here: set Grant Type to Client Credentials, keep the Access Token URL, fill in the Client ID and the Client Secret created above, and change the Scope to https://businesscentral.cronus.company/.default.

That’s it! Try to get a new access token and use it to make a new request to Business Central. I’ll save you another screenshot because it would look exactly the same as earlier.

Note that we still didn’t configure anything in the Business Central server instance. There is one setting called AppIdUri. According to the documentation, this setting is used “to validate the security tokens that the server instance receives in SOAP and OData calls”. However, I was able to call APIs without configuring this setting. If I would have configured this setting, then the value would be the Application ID URI as specified in the Azure app registration for the Business Central application: https://businesscentral.cronus.company.

Phew… that was a lot of configuring… And what do we have at the end of this exercise? Our own custom scopes to call Business Central on-premises with OAuth authentication. Well, because the app registration can be used across multiple Business Central server instances, at least we don’t have to do this for every single Business Central server instance.

But still… we have a custom scope… wouldn’t it be easier if we could use the standard scope for Business Central SaaS? That would be a lot easier, right?

Let’s move to option 2, the easy option!

Option 2: Standard scope

Instead of creating custom scopes, it’s also possible to configure Business Central to accept scopes that start with https://api.businesscentral.dynamics.com. And if we do that, then we can skip the configuration of the custom scopes we did in step 1!

The configuration setting is called ValidAudiences. This is related to a property in the access token, called aud. This property, officially called “claim”, identifies the intended recipient of the token. The audience is your app’s Application ID, assigned to your app in the Azure portal. With the configuration setting ValidAudiences we tell our Business Central server instance to accept tokens with an alternative audience claim. The configuration setting can contain multiple values, separated by semicolons. To set this property, run this PowerShell command followed by a restart of the server instance:

Set-NAVServerConfiguration -ServerInstance BC -KeyName ValidAudiences -KeyValue "https://api.businesscentral.dynamics.com"
Restart-NAVServerInstance -ServerInstance BC
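You can verify the effect by decoding an access token, for example at https://jwt.ms. A token requested with the standard SaaS scope carries an audience claim like this (a trimmed-down sketch; all other claims omitted):

{
    "aud": "https://api.businesscentral.dynamics.com"
}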

After this, you can call Business Central on-premises APIs with exactly the same scopes as used for the SaaS environment! How convenient is that! I guess you’d agree that it is much easier to work with one set of scopes for both SaaS environments and on-premises environments! And this also enables publishing apps from VS Code with Azure AD authentication. That would not be possible when using the custom scopes from the first option.

To test Postman with the user delegated flow (Authorization Code) follow the steps explained earlier, but change the scope to https://api.businesscentral.dynamics.com/user_impersonation.

To test Postman with service-to-service authentication (Client Credentials) you need to assign the standard Business Central permissions in the Azure app registration for Postman and then change the scope in Postman to https://api.businesscentral.dynamics.com/.default. See also the blogs series about service-to-service authentication for more information about configuring and testing service-to-service authentication.

There is one configuration setting that we didn’t touch yet, but it works automatically if you are using a single-tenant environment and exactly followed all steps in the previous blog post. This setting can be reviewed by calling the Get-NavTenant command:
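For example (the exact set of properties in the output depends on your version):

Get-NAVTenant -ServerInstance BC | Format-List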

The AadTenantId is filled with the Azure Active Directory tenant ID. How did that happen? We didn’t run any command that configured this setting! This property is automatically set when you configure the WSFederationLoginEndpoint on the Business Central server instance. The URL starts with https://login.microsoftonline.com/[AAD Tenant Id]/wsfed and Business Central takes the Azure AD tenant id out of this URL and applies it to the Business Central tenant settings. That’s why I said in the previous blog post that this setting MUST be a guid and not the primary domain of the Azure Active Directory. If you used the primary domain, then the Business Central tenant configuration would also be set to the domain and the service-to-service authentication would fail! I’ve just saved you a couple of hours of frustration… 😊

What if you have a multi-tenant environment? Then you need to provide the AadTenantId with the Mount-NAVTenant PowerShell command. See https://docs.microsoft.com/en-us/powershell/module/microsoft.dynamics.nav.management/mount-navtenant for more information. This can’t be done dynamically; already mounted tenants need to be dismounted first and then mounted again with the AadTenantId parameter.

A final word about calling custom APIs with service-to-service authentication on on-premises environments. This is possible starting with version 19.5. In older versions, you will get an error message that you can’t access the object using application permissions. A workaround is to set the application user account to External user and then it will also work in older versions. That’s a workaround, not an official solution!

Sending email via SMTP in Business Central (online and on-premise)


It’s not a secret that Microsoft is forcing OAuth authentication for several products, including Exchange Online and Business Central Online. Over the last couple of days, I got multiple requests from partners who are now struggling with configuring SMTP in Business Central for Exchange Online.

Let me start by reassuring you: Exchange Online still supports SMTP with basic authentication! It is not required to move to OAuth. Moving is highly recommended though, but Business Central does not currently support all scenarios.

But wait… are you sure about that? Yes, I am! Here is a quote from the official documentation:

SMTP AUTH will still be available when Basic authentication is permanently disabled on October 1, 2022. The reason SMTP will still be available is that many multi-function devices such as printers and scanners can’t be updated to use modern authentication. However, we strongly encourage customers to move away from using Basic authentication with SMTP AUTH when possible. Other options for sending authenticated mail include using alternative protocols, such as the Microsoft Graph API.

The list of devices that can’t be updated to use modern authentication can be expanded with a list of applications that don’t support modern authentication. Unfortunately, Business Central on-premise is on that list, even when you are using the latest version. And even Business Central Online does not support all scenarios for SMTP with OAuth authentication.

I assume you know how to create email accounts in Business Central. If not, then please read this article first.

Business Central online

Let’s first see what we can do with Business Central Online. The SMTP Account card looks like:

Sending an email with these settings works fine, as long as you send emails from a user session, or from a background session that was started under the same user account. That includes API sessions when using user delegation. The access token that is used to authenticate against Exchange Online uses delegated permissions. As a result, API sessions that use the client credentials flow (aka service-to-service authentication) cannot send emails.

If you have an external application that calls Business Central APIs by using service-to-service authentication, then the API session will fail to send emails using the SMTP account.

The only solution for this is to fall back to basic authentication. See below how that works.

Business Central on-premise

Setting up SMTP with OAuth for an on-prem environment can be really cumbersome. You need to create an Azure app registration with specific permissions. After setting up Business Central, the configuration is almost carved in stone. If something fails, e.g. the permissions were not correct, then there is no way to correct the permissions. Business Central uses refresh tokens to get a new access token, and that simply doesn’t allow you to get a new access token with updated permissions. The only option would then be to recreate the Azure app registration and run page 6300 in the browser to enter the new app registration details.

So, what are the correct Azure app registration details for SMTP?

The official documentation can be found here: https://learn.microsoft.com/en-us/dynamics365/business-central/admin-how-setup-email#setting-up-email-for-business-central-on-premises. Because pictures are worth a thousand words, here is the complete process.

The first step is to open page 6300 “Azure AD App Setup Wizard” in Business Central. This page automatically opens when you set up SMTP for the first time. However, there is no action to open this page again if you need to enter new details. That’s a problem when, for example, the secret expires and needs to be updated. You can search left, right and center in Business Central, but you won’t find a way to update the secret. The only option is to run page 6300 directly by adding ?page=6300 to the URL.

The wizard page looks like this (after clicking on Next on the first page):

Make a copy of the Reply URL, you will need it in the next step. Open the Azure Portal and navigate to Azure Active Directory and then to App registrations. Here is a direct link: https://portal.azure.com/#view/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/~/RegisteredApps.

Click on New registration. Give your app a name, e.g. Business Central on-premise.

Under Supported account types select the second option: Accounts in any organizational directory (Any Azure AD directory – Multitenant). It is very important to select this option, otherwise it will not work!

Under Redirect URI (optional) select Web as platform and paste the copied Reply URL from the wizard page in Business Central.

Click on Register and in the next page copy the Application (client) ID. Paste this value in the Application ID field in Business Central.

Then click on Certificates & secrets in the left menu. Click on New client secret, select an expiration period and then click on Add.

Copy the value and paste it in the Key field in the wizard in Business Central.

Finally, you need to set the permissions. Click on API permissions in the left menu. The list will already contain one permission: User.Read for Microsoft Graph.

Click on Add a permission and then on Microsoft Graph and then on Delegated permissions. We need to add the following permissions:

  • offline_access
  • openid
  • SMTP.Send

The best way is to use the search box. Type in the permission name and then select the checkbox. Here is an example with the SMTP.Send permission:

Click on Add permissions to add the selected permissions to the list of configured permissions. Now we need to add one other permission:

  • Office 365 Exchange Online / User.Read

This permission is a little bit hard to find. Click on Add a permission. Then go to APIs my organization uses and type in the search box Office 365 Exchange Online.

Select this API and then click on Delegated permissions. Search for the User.Read permission, select it and click on Add permissions. The list of configured permissions should now contain offline_access, openid, SMTP.Send and User.Read for Microsoft Graph, plus User.Read for Office 365 Exchange Online.

That’s it! The Azure app registration is ready. In the wizard in Business Central click on Next and then Finish.

The final step in Business Central is to test the SMTP mail account. This will trigger the process to request an OAuth access token. Because it is the first time, during this process you will be asked to give consent to the requested permissions. This will happen in a window that looks like:

Now we have Business Central on-premise configured in exactly the same way as Business Central online. You can send emails using SMTP from a user session or a background session that runs under the user account. But sending emails from an API session that is using service-to-service authentication is not possible. So that brings us to the final part of this blog post.

By the way, for security reasons I’d strongly recommend blocking user access to pages 6300 and 6301. If someone runs them in the web client, they could just get their hands on the secret. Below is a screenshot of page 6301:

Setting up SMTP with basic authentication

According to the documentation, basic authentication will still be available for SMTP. However, that does not mean it will work out of the box. There are two potential issues that we may need to solve.

Enabling SMTP basic authentication

The first issue is that Microsoft may have disabled basic authentication for SMTP for an Exchange Online environment. This happens if they didn’t detect any usage of basic authentication. But we can turn it back on. This can be done for the whole organization, so all mail accounts can use it again. Or it can be switched on per individual mail account, which overrides the organization-wide setting. For more information, see this blog post under the FAQ section: https://techcommunity.microsoft.com/t5/exchange-team-blog/basic-authentication-and-exchange-online-september-2021-update/ba-p/2772210

To re-enable SMTP basic authentication follow the steps here: https://learn.microsoft.com/en-us/exchange/clients-and-mobile-in-exchange-online/authenticated-client-smtp-submission. It contains the steps for both enabling it for the whole organization or for an individual mailbox.

Working around multi-factor authentication

The second issue is called multi-factor authentication (MFA). If a user account has MFA enabled (which should be the case) then basic authentication is not going to work. The reason is that MFA requires user interaction, and basic authentication doesn’t support that.

The solution for this is to use a so-called app password. This is an option that allows creating a single password that can only be used for restricted scenarios. Read more about this feature and how to use it in this article: https://learn.microsoft.com/en-us/azure/active-directory/authentication/howto-mfa-app-passwords. The feature may need to be enabled first; that article also explains how that works.

A step-by-step instruction to create an app password can be found here: https://support.microsoft.com/en-us/account-billing/create-app-passwords-from-the-security-info-preview-page-d8bc744a-ce3f-4d4d-89c9-eb38ab9d4137.

After creating the app password you can use it in Business Central. On the SMTP Email Account page, select Basic in the Authentication field and paste the app password in the password field.

With this configuration, it will also be possible to send emails from sessions that are running under a different user account, including API sessions that are using service-to-service authentication!

Hope this saves you a lot of time!

PS. I believe the support of OAuth for SMTP can be improved in several ways. Using basic authentication is not recommended, but we have no other option at this moment for API sessions using service-to-service authentication. And I believe that the code behind it should use Microsoft Graph instead of the old Exchange APIs. Also, the way OAuth is implemented in Business Central (not using v2.0!) should be updated.

Sales APIs returns error in a read-only request


This Twitter message reminded me of an issue that can occur with the sales APIs in Business Central.

Apparently, the PowerBI connector is adding the header Data-Access-Intent: ReadOnly to requests when loading data from Business Central. While this is a good practice in general for GET requests, you may run into an error message with the sales APIs. This applies to salesQuotes, salesOrders and salesInvoices.

Below is an example from the salesOrders API, sketched with the same placeholder convention as the earlier examples ({{baseurl}} and {{token}} are placeholders):
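GET {{baseurl}}/companies({{companyId}})/salesOrders
Authorization: Bearer {{token}}
Data-Access-Intent: ReadOnly

The response to this request is an error. I won’t reproduce the exact message here because it varies per version, but it boils down to the platform complaining that a write operation was attempted in a read-only session.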

If you remove the Data-Access-Intent header, then it works fine. But after that, the Data-Access-Intent header can be added again and the call still works: after reading once without Data-Access-Intent: ReadOnly, it also succeeds with the header enabled.

You can try this test yourself by following these steps:

  1. Call the salesOrders API without Data-Access-Intent: ReadOnly.
  2. Turn on the header and make a second call. This will work.
  3. Change the quantity on a sales order.
  4. Call the API again (with the header turned on). It will fail.
  5. Go to step 1 and repeat…

So, what is going on? There is only one conclusion: sometimes the code behind the API wants to write to the database. But not always…

The reason is a little piece of code in the API page that calls the function RedistributeInvoiceDiscounts. The original screenshots are not available here, but below is a hedged sketch of what roughly happens; the buffer record name is an assumption on my part, and the real base app code differs in details:
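// In the API page, roughly (not the literal base app code):
if HasWritePermission then
    RedistributeInvoiceDiscounts(Rec);

// The function itself, sketched. "Sales Order Entity Buffer" is assumed to be
// the buffer record behind the salesOrders API:
local procedure RedistributeInvoiceDiscounts(var SalesOrderEntityBuffer: Record "Sales Order Entity Buffer")
var
    SalesLine: Record "Sales Line";
begin
    SalesLine.SetRange("Document Type", SalesLine."Document Type"::Order);
    SalesLine.SetRange("Document No.", SalesOrderEntityBuffer."No.");
    SalesLine.SetRange("Recalculate Invoice Disc.", true);
    // If any line is flagged for recalculation, the discount codeunit runs
    // and modifies the document, i.e. it writes to the database
    if SalesLine.FindFirst() then
        Codeunit.Run(Codeunit::"Sales - Calc Discount By Type", SalesLine);
end;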

The field “Recalculate Invoice Disc.” is automatically set to true after every change to a Sales Line that triggers an update of the amounts. It’s in the function UpdateAmounts, and also in the OnValidate trigger for Type and No.

So, after one request without ReadOnly, it works fine until a sales document is modified. Then a request with ReadWrite (the default) is required.

I won’t dive into the codeunit "Sales - Calc Discount By Type"; what happens there is quite straightforward: it calculates discounts and modifies the header and/or the lines, regardless of whether the new amount equals the existing amount.

Why does the API try to recalculate the invoice discount amount? I can only make an educated guess here. Usually, the recalculation of the invoice discount amount is done when a document is released. When the document is still open, only the flag “Recalculate Invoice Disc.” is set on the lines. But if an open sales document is read through the API, it may contain incorrect invoice discount amounts. Most probably that’s why the API calls the function to recalculate it.

But wait… there is a line if HasWritePermission then before the call. Doesn’t that work in a ReadOnly request? Apparently not… The boolean HasWritePermission is set in the function CheckPermissions in the API page. A hedged sketch (the error label and exact filtering are my assumptions):
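local procedure CheckPermissions()
var
    SalesHeader: Record "Sales Header";
    ReadPermissionErr: Label 'You do not have permission to read sales orders.';
begin
    SalesHeader.SetRange("Document Type", SalesHeader."Document Type"::Order);
    if not SalesHeader.ReadPermission() then
        Error(ReadPermissionErr);
    // HasWritePermission is a global boolean on the API page
    HasWritePermission := SalesHeader.WritePermission();
end;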

The method WritePermission checks if a user has permission to write, including the current security filters. It does not take the current data access intent into account.

Report objects do have a method IsReadOnly to get the current report’s data access intent. The page object is currently missing this method. That’s something for Microsoft to fix for us. To help Microsoft prioritize this, I’ve created an idea: https://experience.dynamics.com/ideas/idea/?ideaid=c1342c97-994b-ed11-97b2-0003ff45cecf. Please vote!

The best option we currently have is to change the default AccessIntent of the Sales APIs to ReadWrite. This can be done on the page Database Access Intent List. The setting is not company specific, it applies to the whole environment (or database for on-prem).

Hope this helps and don’t forget to vote for the idea!


Prepare for the new invoice posting engine!

Introduction

In version 2021 wave 2 (v19) a new feature was introduced, called “Extend general ledger posting aggregations”.

The reason for this new feature is that the table "Invoice Post. Buffer" could not easily be extended. This table is used to buffer the entries that will be created during a posting procedure. Entries are combined (amounts are added up) based on their values in the primary key, which consists of 15 different fields. A number of localizations include modifications to this table, resulting in significant changes. This includes changes to the primary key because of certain requirements to further break down the posting entries that will be created during the posting process.

The table will be replaced by a new table "Invoice Posting Buffer" (note that this table has almost the same name!) that has a different table structure. The primary key is a Text[1000] field that by default is composed of the same fields as before. But the function BuildPrimaryKey now features an event OnAfterBuildPrimaryKey that allows extending the value in the primary key with extra values.
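
For example, a subscriber along these lines could make the aggregation more fine-grained. This is a hedged sketch: "Group ID" is, as far as I know, the Text[1000] primary key field of the new table, the event signature is assumed, and "My Custom Code" is a hypothetical tableextension field:

[EventSubscriber(ObjectType::Table, Database::"Invoice Posting Buffer", 'OnAfterBuildPrimaryKey', '', false, false)]
local procedure ExtendAggregationKey(var InvoicePostingBuffer: Record "Invoice Posting Buffer")
begin
    // Lines with different values in "My Custom Code" now end up in separate
    // buffer records and therefore in separate G/L entries
    InvoicePostingBuffer."Group ID" += '|' + InvoicePostingBuffer."My Custom Code";
end;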

New Interface “Invoice Posting”

Together with the move to a new buffer table, a new interface Invoice Posting has been introduced. The codeunits "Sales-Post", "Purch.-Post" and "Serv-Documents Mgt." have been redesigned to make use of this new interface. The redesign of the code consists of moving pieces of code out of the mentioned codeunits to a new invoice posting codeunit that implements the new interface. The new posting codeunits are "Sales Post Invoice", "Purch. Post Invoice" and "Service Post Invoice".

The new invoice posting codeunits do not just copy over the code from the original posting codeunits. The code is partially redesigned as well to better align with current coding standards or to have a more logical flow. The original code in the posting codeunits has been marked as obsolete and will eventually be removed.

The interface "Invoice Posting" allows Microsoft to create localized versions of the invoice posting engine, and it also allows partners to create their own. Such an engine must implement the interface, but internally it may use a different flow, add additional checks or rules, etc.

Switching to the new invoice posting engine is controlled by Feature Management. Originally, a year ago in v19, this was a setting on the setup pages of the sales, purchase and service modules, but in v21 it has been replaced by Feature Management. According to the description of the feature, the new posting routines will be enabled by default in v23 (2023 wave 2).

The posting routines contain a check to determine whether the old code (referred to as ‘legacy’) should be used or whether the new interface is enabled. The legacy code uses the existing table "Invoice Post. Buffer", while the implementations of the interface use the new table "Invoice Posting Buffer".

The same check also shows that production environments currently cannot use the new implementation. A hedged sketch of what this check roughly looks like (the procedure name and feature key are assumptions):
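local procedure UseLegacyInvoicePosting(): Boolean
var
    EnvironmentInformation: Codeunit "Environment Information";
    FeatureManagementFacade: Codeunit "Feature Management Facade";
begin
    // The new engine is currently blocked in production environments
    if EnvironmentInformation.IsProduction() then
        exit(true);
    // Otherwise the feature switch decides; the feature key is an assumption
    exit(not FeatureManagementFacade.IsEnabled('InvoicePosting'));
end;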

WARNING 1 – No control over switching to the new invoice posting engine

Using a feature switch to enable the new invoice posting engine is quite risky in my opinion. When this feature becomes available for production environments, a customer could enable it before the code of their installed extensions is ready for it. This could lead to unpredictable results and potentially incorrect postings.

The default implementation codeunit is one of the above-mentioned new invoice posting codeunits. These codeunits are configured in an enum for invoice posting. Here is a sketch of enum "Sales Invoice Posting"; the enums for the Purchase and Service modules are similar (the object ID and value name are approximated):
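enum 58 "Sales Invoice Posting" implements "Invoice Posting"
{
    Extensible = true;

    value(0; "Invoice Posting (Default)")
    {
        Implementation = "Invoice Posting" = "Sales Post Invoice";
    }
}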

Inside the posting codeunits, code in the function GetInvoicePostingSetup controls which implementation codeunit will be used. A hedged reconstruction (the event name is an assumption on my part):
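local procedure GetInvoicePostingSetup()
begin
    // InvoicePostingInterface is a global of type Interface "Invoice Posting";
    // start with the default implementation configured in the enum
    InvoicePostingInterface := Enum::"Sales Invoice Posting"::"Invoice Posting (Default)";
    // Give subscribers a chance to swap in their own implementation
    OnAfterGetInvoicePostingSetup(InvoicePostingInterface);
end;

[IntegrationEvent(false, false)]
local procedure OnAfterGetInvoicePostingSetup(var InvoicePostingInterface: Interface "Invoice Posting")
begin
end;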

With an event subscriber, it would be possible to overwrite the standard implementation codeunit with your own invoice posting codeunit, as shown below. It is not necessary to use an enumextension for that; just a reference to a codeunit also works. The enum seems to be a leftover from the original setting in the setup tables before it was moved to Feature Management; in the current implementation, I don’t see any added value in using an enum.
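
A hypothetical subscriber could look like this (again, the event name is my assumption, and "My Sales Post Invoice" is a custom codeunit that implements the interface):

[EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnAfterGetInvoicePostingSetup', '', false, false)]
local procedure UseMyInvoicePosting(var InvoicePostingInterface: Interface "Invoice Posting")
var
    MySalesPostInvoice: Codeunit "My Sales Post Invoice";
begin
    // No enumextension needed: assigning a codeunit reference works as well
    InvoicePostingInterface := MySalesPostInvoice;
end;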

But be careful! Because it is now an event instead of a setting, there can theoretically be multiple subscribers trying to pass their own invoice posting codeunit. That could again lead to unpredictable situations. It’s not very likely for this to happen, but by changing to an event instead of a setting this became an option. One scenario when this may not be so hypothetical is when Microsoft introduces localized versions of the invoice posting codeunits. When that is done with a localization app, then it will most probably be using the same event. Nobody can predict what will happen if an ISV solution is installed that wants to use its own invoice posting codeunit.

In my opinion, it would be better if the choice for Feature Management were reverted to the original: a setting based on the selected value of an enum. You can argue that it does not belong on a setup page, but at least we would not run into the situation of multiple subscribers to such an important event. What’s more, why is this a feature that end users should be able to switch on? Usually, an application feature adds something to the user experience. But this feature will be totally hidden from the user: they will not see any difference during posting or in the resulting entries. If there were a difference, then something would be really wrong!

As I said, switching on the feature can lead to unpredictable results. Maybe it’s just a temporary feature, only for the period of one year. And when it is switched on by default, the feature will be removed anyway, together with the obsolete code. That will break all code currently using the obsolete objects and events, which is exactly what we need, because it forces us to move to the new invoice posting engine. But if the obsolete code is here to stay a little longer, then there will be no compile-time break and customers will run into trouble. If that is the plan, then I would recommend that Microsoft introduce a warning before enabling this feature, or maybe check for obsolete event subscribers and verify whether there are also event subscribers for the replacements.

Which leads me to events…

WARNING 2 – New home for events

The code that has been copied over to the new invoice posting codeunits also contains a lot of events. As long as the feature is not enabled, the existing events in the posting codeunits for Sales, Purchase and Service will be active. But when the switch has been made to the new invoice posting engine, then any code that depends on events in the posting routines will break.

The new invoice posting codeunits do not directly raise any events. Because an extension may implement an alternative invoice posting engine, overriding the default invoice posting codeunit, any event subscriber to that codeunit would automatically break. The alternative codeunit could raise similar events, or maybe other events as well.

To mitigate this problem, new codeunits have been introduced as a central home for the events. The new invoice posting codeunits raise the events from these central codeunits. There are three new codeunits: "Sales Post Invoice Events", "Purch. Post Invoice Events" and "Service Post Invoice Events".

The original events in the posting codeunits have been marked as obsolete with the comment ‘Moved to Sales Invoice Posting implementation’. The corresponding events can be found in the new codeunits for invoice posting events, and they are raised by calling a Run function: calling RunOnBeforePrepareLine, for example, results in the event OnBeforePrepareLine from codeunit "Sales Post Invoice Events". A minimal sketch of this pattern (the object ID and parameter lists are my assumption):
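codeunit 50130 "Sales Post Invoice Events" // object ID is for this sketch only
{
    procedure RunOnBeforePrepareLine(var SalesHeader: Record "Sales Header"; var SalesLine: Record "Sales Line")
    begin
        // The public Run function does nothing but raise the actual event
        OnBeforePrepareLine(SalesHeader, SalesLine);
    end;

    [IntegrationEvent(false, false)]
    local procedure OnBeforePrepareLine(var SalesHeader: Record "Sales Header"; var SalesLine: Record "Sales Line")
    begin
    end;
}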

In fact, the comment ‘Moved to Sales Invoice Posting implementation’ is not entirely accurate. Because the code has been redesigned, including new function names, many events have also been renamed. For example, the original function FillInvoicePostingBuffer has been renamed to PrepareLine, and events from this function have been renamed to reflect the new function name. Because of this, you can’t simply find the new events by searching for the old name. There is even a renamed event that got the same name as a different event in the original codeunit! So be careful when you rewrite your code to use the new events.

Not only have events been renamed; they can also have different parameters. Obviously, the new buffer table is used a lot as a parameter, but sometimes other parameters are missing or new parameters have been added.

Prepare for the change

To prepare for the change, all event subscribers to events that will move to the new invoice posting engine need to be rewritten to support the new events; a sketch follows below. But as long as the new invoice posting engine can’t be enabled in production environments, the old events must still be supported as well. We don’t know yet if there will be a period of time in which both can be used, or if it will be a big-bang scenario in 2023 wave 2 (v23). But in sandbox environments the new invoice posting engine can already be enabled, and it’s highly recommended to do this and test it before the switch is made. Mind you: on-prem environments are also recognized as production environments; Docker containers based on the sandbox image are fine.
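
For example, a subscriber migration could look like this sketch (the event names come from the overview below; the parameter lists are shortened and assumed):

[EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnBeforeFillInvoicePostingBuffer', '', false, false)]
local procedure OldOnBeforeFillInvoicePostingBuffer(SalesHeader: Record "Sales Header"; SalesLine: Record "Sales Line")
begin
    // Legacy subscriber: only raised as long as the feature is NOT enabled
end;

[EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales Post Invoice Events", 'OnBeforePrepareLine', '', false, false)]
local procedure NewOnBeforePrepareLine(SalesHeader: Record "Sales Header"; SalesLine: Record "Sales Line")
begin
    // New subscriber: only raised when the new invoice posting engine is enabled
end;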

The change does not only apply to events: any change to the table "Invoice Post. Buffer" must also be copied to the new table "Invoice Posting Buffer". This includes any event subscribers! The events in the new table look very similar, although they do have new names.

Overview of moved events

Here is an overview of the events from the "Sales-Post" codeunit and the corresponding events in the "Sales Post Invoice Events" codeunit, including comments where the parameters differ (apart from the buffer table). The events in the Purchase and Service routines are similar.

Old event → New event (comments in parentheses)

Procedure FillInvoicePostingBuffer
  OnBeforeFillInvoicePostingBuffer → OnBeforePrepareLine
  OnAfterInvoicePostingBufferAssignAmounts → split into two new events: OnAfterInitTotalAmounts and OnPrepareLineOnAfterAssignAmounts
  OnFillInvoicePostingBufferOnAfterCalcInvoiceDiscountPosting → OnPrepareLineOnAfterSetInvoiceDiscountPosting
  OnBeforeCalcInvoiceDiscountPosting → OnPrepareLineOnBeforeCalcInvoiceDiscountPosting
  OnFillInvoicePostingBufferOnBeforeSetInvDiscAccount → OnPrepareLineOnBeforeSetInvoiceDiscAccount
  OnFillInvoicePostingBufferOnAfterSetInvDiscAccount → OnPrepareLineOnAfterSetInvoiceDiscAccount (parameter Sales Header removed)
  OnFillInvoicePostingBufferOnAfterCalcLineDiscountPosting → OnPrepareLineOnAfterSetLineDiscountPosting
  OnBeforeCalcLineDiscountPosting → OnPrepareLineOnBeforeCalcLineDiscountPosting (NOTE: there is an event in Sales Post Invoice Events with the same name, but that is a different event!)
  OnFillInvoicePostingBufferOnBeforeSetLineDiscAccount → OnPrepareLineOnBeforeSetLineDiscAccount
  OnFillInvoicePostingBufferOnAfterSetLineDiscAccount → OnPrepareLineOnAfterSetLineDiscAccount
  OnFillInvoicePostingBufferOnBeforeDeferrals → OnPrepareLineOnBeforeAdjustTotalAmounts
  OnBeforeInvoicePostingBufferSetAmounts → OnPrepareLineOnBeforeSetAmounts (new parameter: SalesLineACY: Record "Sales Line"; new parameter: IsHandled: Boolean, if set to true then InvoicePostingBuffer.SetAmounts will be skipped)
  OnAfterInvoicePostingBufferSetAmounts → OnPrepareLineOnAfterSetAmounts
  OnFillInvoicePostingBufferOnBeforeSetAccount → OnPrepareLineOnBeforeSetAccount
  OnAfterFillInvoicePostBuffer → OnPrepareLineOnAfterFillInvoicePostingBuffer (removed parameter: CommitIsSuppressed)
  OnFillInvoicePostingBufferOnAfterUpdateInvoicePostBuffer → OnPrepareLineOnAfterUpdateInvoicePostingBuffer
  OnBeforeFillDeferralPostingBuffer → OnPrepareLineOnBeforePrepareDeferralLine (SalesLine not by var; InvoicePostingBuffer not by var; global TempInvoicePostBuffer missing; CommitIsSuppressed renamed to SuppressCommit)
  OnAfterFillDeferralPostingBuffer → OnPrepareLineOnAfterPrepareDeferralLine (SalesLine not by var; InvoicePostingBuffer not by var; global TempInvoicePostBuffer missing; CommitIsSuppressed renamed to SuppressCommit)

Procedure CalcInvoiceDiscountPosting
  OnBeforeCalcInvoiceDiscountPostingProcedure → OnBeforeCalcInvoiceDiscountPosting
  (no old event) → OnAfterCalcInvoiceDiscountPosting (new event in Sales Post Invoice Events)

Procedure CalcLineDiscountPosting
  OnCalcLineDiscountPostingProcedure → OnBeforeCalcLineDiscountPosting (NOTE: an event with the same name did exist in Sales-Post in procedure FillInvoicePostingBuffer, but this is a different event!)
  (no old event) → OnAfterCalcLineDiscountPosting (new event in Sales Post Invoice Events)

Procedure GetSalesAccount
  OnBeforeGetSalesAccount → OnBeforeGetSalesAccount
  OnAfterGetSalesAccount → OnAfterGetSalesAccount

Procedure CreatePostedDeferralScheduleFromSalesDoc
  OnAfterCreatePostedDeferralScheduleFromSalesDoc → OnAfterCreatePostedDeferralSchedule

Procedure PostInvoicePostBuffer
  OnBeforePostInvoicePostBuffer → OnBeforePostLines (parameters removed: TotalSalesLine, TotalSalesLineLCY)
  OnPostInvoicePostBufferOnAfterPostSalesGLAccounts → NO REPLACEMENT (the obsolete warning says it has been moved, but a new event is missing in the corresponding code in PostLines in codeunit Sales Post Invoice)
  OnPostInvoicePostBufferOnBeforeTempInvoicePostBufferDeleteAll → OnPostLinesOnBeforeTempInvoicePostingBufferDeleteAll (parameters GenJnlLineDocType, GenJnlLineDocNo, GenJnlLineExtDocNo and SrcCode moved to new parameter InvoicePostingParameters)

Procedure PostInvoicePostBufferLine
  OnPostInvoicePostBufferLineOnAfterCopyFromInvoicePostBuffer → OnPrepareGenJnlLineOnAfterCopyToGenJnlLine (the event has been moved a few lines later, which seems to be a better position; possible alternatives: OnAfterCopyGenJnlLineFromSalesHeader in table "Gen. Journal Line", OnAfterCopyToGenJnlLine in table "Invoice Posting Buffer")
  OnBeforePostInvPostBuffer → OnPostLinesOnBeforeGenJnlLinePost (removed var from parameter SalesHeader; CommitIsSuppressed renamed to SuppressCommit)
  OnAfterPostInvPostBuffer → OnPostLinesOnAfterGenJnlLinePost (SalesHeader not by var; parameters removed: SalesLine, GenJnlLineDocNo, GenJnlLineExtDocNo, GenJnlLineDocType; parameter added: PreviewMode; parameter CommitIsSuppressed renamed to SuppressCommit)

Procedure InitNewLineFromInvoicePostBuffer
  OnBeforeInitNewLineFromInvoicePostBuffer → OnBeforeInitGenJnlLine

Procedure RunGenJnlPostLine
  OnBeforeRunGenJnlPostLine → OnBeforeRunGenJnlPostLine (parameter removed: SalesInvHeader; parameter added: GenJnlPostLine: Codeunit "Gen. Jnl.-Post Line")

Procedure PostCustomerEntry
  OnBeforeRunPostCustomerEntry → OnBeforePostLedgerEntry (parameters DocType, DocNo, ExtDocNo and SourceCode moved to new parameter InvoicePostingParameters; CommitIsSuppressed renamed to SuppressCommit)
  OnBeforePostCustomerEntry → OnPostLedgerEntryOnBeforeGenJnlPostLine (new parameter: PreviewMode (Boolean); CommitIsSuppressed renamed to SuppressCommit)
  OnAfterPostCustomerEntry → OnPostLedgerEntryOnAfterGenJnlPostLine (new parameter: PreviewMode (Boolean); CommitIsSuppressed renamed to SuppressCommit)

Procedure PostBalancingEntry
  OnPostBalancingEntryOnBeforeFindCustLedgEntry → OnPostBalancingEntryOnBeforeFindCustLedgEntry (parameters DocType, DocNo, ExtDocNo and SourceCode moved to new parameter InvoicePostingParameters)
  OnPostBalancingEntryOnAfterFindCustLedgEntry → OnPostBalancingEntryOnAfterFindCustLedgEntry
  OnPostBalancingEntryOnAfterInitNewLine → NO REPLACEMENT (the obsolete warning says it has been moved, but a new event is missing in the corresponding code in PostLines in codeunit Sales Post Invoice)
  OnBeforePostBalancingEntry → OnPostBalancingEntryOnBeforeGenJnlPostLine (added var to parameter SalesHeader; new parameter: PreviewMode: Boolean; new parameter: GenJnlPostLine: Codeunit "Gen. Jnl.-Post Line"; CommitIsSuppressed renamed to SuppressCommit)
  OnAfterPostBalancingEntry → OnPostBalancingEntryOnAfterGenJnlPostLine (new parameter: PreviewMode (Boolean); CommitIsSuppressed renamed to SuppressCommit)

Procedure SetAmountsForBalancingEntry
  OnBeforeSetAmountsForBalancingEntry → OnBeforeSetAmountsForBalancingEntry

Procedure SetApplyToDocNo
  OnAfterSetApplyToDocNo → OnAfterSetApplyToDocNo

Procedure CalcDeferralAmounts
  OnBeforeTempDeferralLineInsert → OnBeforeTempDeferralLineInsert

Other events
  OnInsertInvoiceHeaderOnBeforeSetPaymentInstructions → NO REPLACEMENT (the event has not been in use since v19)

I hope this was useful. It was quite some work to compose the list of old and new events. In case I missed something, please let me know in the comments or contact me directly. Good luck with preparing for this breaking change!

Microsoft Dynamics 365 Business Central API v2.0 Reference

I’m delighted to introduce a new book I had the pleasure of co-authoring: a book that aims to be a full reference for all standard Business Central APIs. It contains a lot of information about APIs, both in general and in detail for every single API. And by details, I really mean details: not only examples of input and output, but also a mapping of each field to its corresponding field on the pages in Business Central.

A big shout out and thank you to Jeremy Vyska who asked me to help write this book. Together with Philip von Bahr, he did an incredible job exploring and describing all APIs in so much detail. My contribution to the book is mainly the chapter about OAuth and background information on the Sales and Purchase document APIs.

All information about the book can be found here, including a link to buy the eBook: https://sparebrained.com/books/business-central-api-reference/

I really hope you enjoy this book!

How to get a reliable xRec

This tweet reminded me of a small but beautiful trick to get a reliable xRec in all the Modify trigger events, and it resulted in a new Twitter thread.

The problem

When events were introduced in C/AL code, one of the most discussed was the OnAfterModifyEvent. Back in 2016, Vjeko wrote an article about it: https://vjeko.com/2016/05/17/onafter-table-event-subscriber-patterns-and-antipatterns/.

There are actually two problems that play a role here. The first is that the value of xRec can’t be trusted. The variable xRec is supposed to hold the original value of the variable Rec when a record is modified. However, this is only the case when the user modifies the record from the UI. When a record is modified by code with Rec.Modify(), xRec has the current value of Rec instead of the original value.

So, when you have code like this (a silly example, but it explains the problem pretty well), it doesn’t work for modifications done by code:

    [EventSubscriber(ObjectType::Table, Database::Customer, OnAfterModifyEvent, '', false, false)]
    local procedure OnAfterModifyCustomer(var Rec: Record Customer; var xRec: Record Customer; RunTrigger: Boolean)
    begin
        if (Rec.Name <> xRec.Name) then
            Message('Customer name changed from %1 to %2', xRec.Name, Rec.Name);
    end;

The second problem is that the event OnAfterModifyEvent occurs after the database write. As a result, it doesn’t help to read the record again from the database, because you will then still get the updated values. I wrote an article about the order of events back in 2018: https://www.kauffmann.nl/2018/03/24/table-trigger-events-in-dynamics-365-business-central/.

The tableextension does have an OnModify trigger that may help in some situations, but not all. Consider this code:

tableextension 50100 CustomerExt extends Customer
{    
    trigger OnModify()
    var
        OldRec: Record Customer;
    begin
        SelectLatestVersion();
        OldRec.Get(Rec."No.");
        if (Rec.Name <> OldRec.Name) then
            Message('Customer name changed from %1 to %2 (from tableextension)', OldRec.Name, Rec.Name);
    end;
}

First of all, this trigger only runs when you call Modify(true). And my tests also show that it doesn’t work when the record is changed in the web client: in that case, OldRec contains the new value! I didn’t expect that, and even adding SelectLatestVersion didn’t help. I’d be glad if someone can confirm that this works in older versions, but maybe I’m totally wrong here.

Anyway, as you can see, working with xRec is dangerous. So the question is: how do we solve this?

The solution

I’ve seen different solutions; most of them look like the code below, storing the old record in a global variable in a single-instance codeunit.

codeunit 50100 CustomerEvents
{
    SingleInstance = true;

    var
        OldRec: Record Customer;

    [EventSubscriber(ObjectType::Table, Database::Customer, OnBeforeModifyEvent, '', false, false)]
    local procedure OnBeforeModifyCustomer(var Rec: Record Customer; var xRec: Record Customer; RunTrigger: Boolean)
    begin
        SelectLatestVersion();
        OldRec.Get(Rec."No.");
    end;

    [EventSubscriber(ObjectType::Table, Database::Customer, OnAfterModifyEvent, '', false, false)]
    local procedure OnAfterModifyCustomer(var Rec: Record Customer; var xRec: Record Customer; RunTrigger: Boolean)
    begin
        if (Rec.Name <> OldRec.Name) then
            Message('Customer name changed from %1 to %2 (from event subscriber)', OldRec.Name, Rec.Name);
    end;
}

This is a working solution, but it only works inside this codeunit. Other event subscribers and the triggers in the tableextension do not benefit from it.

So I would like to introduce another solution that works quite well. Credit for this solution goes to a student in one of my AL development classes who came up with it. He said: ‘Look at the xRec parameter in the OnBeforeModifyEvent. It is passed by var; does that mean you can change it?’ And that was spot on! The code below refreshes xRec from the database, which makes it reliable for all triggers and events that come after it. Only other OnBeforeModifyEvent subscribers can’t rely on it, because you don’t know in which order subscribers to the same event are executed.

codeunit 50100 CustomerEvents
{
    [EventSubscriber(ObjectType::Table, Database::Customer, OnBeforeModifyEvent, '', false, false)]
    local procedure OnBeforeModifyCustomer(var Rec: Record Customer; var xRec: Record Customer; RunTrigger: Boolean)
    begin
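        // Refresh xRec from the database so that every event and trigger after this one sees the true original values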
        SelectLatestVersion();
        xRec.Get(xRec."No.");
    end;

    [EventSubscriber(ObjectType::Table, Database::Customer, OnAfterModifyEvent, '', false, false)]
    local procedure OnAfterModifyCustomer(var Rec: Record Customer; var xRec: Record Customer; RunTrigger: Boolean)
    begin
        if (Rec.Name <> xRec.Name) then
            Message('Customer name changed from %1 to %2 (from OnAfterModifyEvent subscriber)', xRec.Name, Rec.Name);
    end;
}

To be honest, I still consider this a dirty workaround for something that should be solved by the platform. In my opinion, xRec should be reliable in all situations.

And I also still hate the design choice that something called OnAfterModifyEvent occurs not immediately after the modification, but after the database write, with the excuse that modify here means ‘writing to the database’. We, as AL developers, think in terms of events, and then OnAfterModifyEvent should occur immediately after the event, not at the end of a chain of other events. Anyway, we are used to the sometimes not-so-logical design in AL code. And maybe I’m the only one with this opinion, what do I know…
