Step outside the Dynamics bubble for a moment and you may discover that Dynamics NAV and Business Central have a less-than-favorable reputation for one thing: database locks. Every customer can tell at least one story about them. In an otherwise good system, this can be a deal breaker – which is why Microsoft has invested in this topic recently.
To be honest, my current customer is aware of this, and from the beginning of our project we discussed the issue: How long does it take to import XXX records? What is the user experience like? What can we do to minimize the problem?
As you can imagine, it’s quite hard to come up with numbers that won’t backfire in six months. I also tend to value the relationship with my customer more than defending a product. That’s why I said straight from the beginning that we might face an issue, but that we can minimize it by selecting hardware and a network architecture that are fit for purpose. Fortunately, my customer’s overall confidence in the product outweighs the concerns about this potential problem – otherwise, I wouldn’t be writing about it now …
The good news…
… is that Microsoft actually did invest in this problem and released optimizations with the Spring ’19 release of Business Central. As you can read here, the product team in Copenhagen tackled several problems. The release notes sound like music to my ears:
- Analysis of long running SQL queries yielded key optimizations, and unexpected locking issues were discovered and removed.
- Locking issues in SIFT indexes on the Sales- and Purchase Line tables have been fixed.
- In the Edit Sales and Purchase Documents pages, recalculation of all lines for every field has been removed to speed up data entry.
As you can read, Microsoft did not release a universal solution to the problem – database locks are managed by SQL Server. Therefore, to reduce database locks you must smarten up the way Business Central communicates with the SQL back end. This is true not only for the Sales Document and Purchase Document tables, but for any kind of extension. As you may remember: sh*t in, sh*t out …
So what are the results of these optimizations, expressed in numbers? During my recent development work for my customer, I was able to generate some. I created a rather simple import: take an XML file and import its content into a selected table. I’m utilizing the Data Exchange Framework for this. The templates are configured to import into the Intermediate Data Import table first, and a codeunit then distributes the data into the selected fields using RecordRefs and FieldRefs. If you are interested in the technical details, just drop me a message.
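The actual implementation is an AL codeunit working with RecordRefs and FieldRefs, which I won’t reproduce here. As a language-agnostic illustration of the pattern only (all names here are hypothetical, not the actual codeunit), the “distribute intermediate data into target fields” step looks roughly like this in Python:

```python
# Hypothetical sketch of distributing intermediate-table rows into target
# fields via a field mapping, with a per-field validation hook – mimicking
# what RecordRef/FieldRef plus VALIDATE do in the real AL codeunit.

def distribute(rows, field_map, validate):
    """rows: parsed records from the intermediate table (list of dicts).
    field_map: maps source column name -> target field name.
    validate: called for every field, like an OnValidate trigger."""
    imported = []
    for row in rows:
        record = {}
        for source_col, target_field in field_map.items():
            value = row.get(source_col)
            # Field-by-field assignment with validation: this per-field
            # work is exactly what the import times below are measuring.
            record[target_field] = validate(target_field, value)
        imported.append(record)
    return imported

# Example usage with a trivial validator that just trims whitespace.
rows = [{"no": " 1001", "name": "Cronus "}]
field_map = {"no": "No.", "name": "Name"}
result = distribute(rows, field_map, lambda field, value: value.strip())
print(result)  # [{'No.': '1001', 'Name': 'Cronus'}]
```

The point of the indirection is that the same code serves any target table – only the field mapping changes.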
The good thing about this is that I can import into different tables with the exact same code. The only variable that differs is the number of fields per record. The table below shows my findings for importing into the Sales Header and Sales Line tables:
| Batch | Records in Sales Header | Total Fields (21 fields per SH record) | Records in Sales Line | Total Fields (10 fields per SL record) | Total Fields per Batch | Total Import Time | Avg. Duration per Field |
|---|---|---|---|---|---|---|---|
| #1 | 384 | 8064 | 1009 | 10090 | 18154 | 1 min 54 sec | 6.28 ms |
| #2 | 378 | 7938 | 1011 | 10110 | 18048 | 1 min 53 sec | 6.26 ms |
| #3 | 363 | 7623 | 1026 | 10260 | 17883 | 2 min 4 sec | 6.93 ms |
| #4 | 387 | 8127 | 951 | 9510 | 17637 | 1 min 56 sec | 6.58 ms |
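The last column is simply the total import time divided by the total number of fields in the batch. As a quick sanity check of that arithmetic (values taken from the table above):

```python
# Verify the "Avg. Duration per Field" column: total time / total fields.
batches = [
    ("#1", 18154, 114),  # 1 min 54 sec = 114 s
    ("#2", 18048, 113),  # 1 min 53 sec = 113 s
    ("#3", 17883, 124),  # 2 min  4 sec = 124 s
    ("#4", 17637, 116),  # 1 min 56 sec = 116 s
]
for name, total_fields, seconds in batches:
    ms_per_field = seconds / total_fields * 1000
    print(f"{name}: {ms_per_field:.2f} ms per field")
# -> 6.28, 6.26, 6.93, 6.58 – matching the table
```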
Creating approx. 380 orders (with an average of 10 lines each) in roughly 2 minutes is quite a good result – especially considering that this test ran on my laptop with SQL Express (see notes below).
And the conclusion is…
… that (at least for Sales and Purchase documents) problematic database locks might be a story of the past.
I think that with the right hardware in the background, Business Central can be scaled into a nice engine that pushes through vast amounts of data without interrupting other areas of the system. Of course, compared with dedicated systems for other applications like deep learning, these numbers are not competitive. But in the world of ERP, they are.
By the way, as I wrote about getting out of the Dynamics bubble: I recently heard that NetSuite needs approx. 3 seconds per field when importing data…
UPDATE: NetSuite needs 3 seconds per field when importing products. When importing orders, you have to calculate with 5–10 seconds per order.
– All imports ran in a docker container locally on my laptop.
– I’m using a database with Cronus data.
– I did not use my laptop during the processing of the imports. (When using my laptop, the import times vary strongly due to limited resources.)
– I’m using a SQL Express server with a maximum of 1 GB RAM and 20% of the CPU.
– The imports were executed through the Job Queue in a background process.
– For each field, the OnValidate trigger was executed.