This post is my shortlist of the most common beginners' mistakes in AL that affect application performance. These are not necessarily the worst performance killers - selecting records from a ledger table with a filter on a field without an index could be worse than all of my top 5 combined. These are rather the few things I've run into most frequently over the past years.
1. Checking if a record is empty before calling FindSet on the same record
if SomeRecord.IsEmpty() then
    exit;
SomeRecord.FindSet();
repeat
until SomeRecord.Next() = 0;
No need to overcomplicate things - IsEmpty adds no value in this code, unless the subsequent FindSet is called with the ForUpdate switch, or the ReadIsolation property on the SomeRecord variable is set to UpdLock or RepeatableRead. If the transaction isolation is higher than ReadCommitted, this additional check can prevent a long-living intent lock on the table - and even then, only if your code is running in an on-premises environment. Otherwise, just drop this line; it doesn't help.
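For illustration, here is a minimal sketch of the one situation where the check does carry weight: FindSet is called with the ForUpdate switch, so the cheap IsEmpty probe avoids taking update locks when there is nothing to process. SomeField and FilterValue are placeholders, not names from any real table.

```al
SomeRecord.SetRange(SomeField, FilterValue);
// Cheap probe: no update locks are taken if the set is empty
if SomeRecord.IsEmpty() then
    exit;
// FindSet(true) reads the records for update and takes update locks
SomeRecord.FindSet(true);
repeat
    // process and modify records under the lock
until SomeRecord.Next() = 0;
```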
Better way to do the same
Just call FindSet straight away.
if SomeRecord.FindSet() then
    repeat
    until SomeRecord.Next() = 0;
Note
There are still some cases when IsEmpty before FindSet can be useful to fine-tune performance in specific scenarios in on-prem configurations. More on this in one of the coming posts.
2. Calling FindSet before DeleteAll
SomeRecord.SetFilter(SomeField, FilterValue);
if SomeRecord.FindSet() then
    SomeRecord.DeleteAll();
When writing code in AL, we can choose one of two statements to delete database records. We can either delete one record at a time with Delete, or call DeleteAll to delete a whole batch with a single query. On the SQL side, there is practically no difference between them - both are translated into the same query:
DELETE FROM <TableName> WHERE <Conditions>
The subtle difference exists only in the condition part of the query - the clause which defines what exactly will be deleted. Delete always builds the selection conditions from the primary key fields of the current record, so that only one record satisfies the criteria.
DeleteAll, on the other hand, does not care about the current record (or whether a recordset has been received from the database at all). Record filters are all it takes into account when preparing the query.
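As a rough illustration of that difference (table and field names here are invented, and the real queries generated by the BC server are parameterised), the two calls translate along these lines:

```sql
-- Delete: conditions come from the primary key of the current record
DELETE FROM "Some Table" WHERE "Entry No." = 42

-- DeleteAll: conditions come from the filters on the record variable
DELETE FROM "Some Table" WHERE "Some Field" = 'FILTERVALUE'
```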
Better way to do the same
Apply all required filters and delete.
SomeRecord.SetFilter(SomeField, FilterValue);
SomeRecord.DeleteAll();
Note 1
If the filters are applied on the primary key, DeleteAll will delete a single record, just like Delete does. And without reading the record from the table.
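For example, assuming a table whose primary key is a single code field (the names below are placeholders), this deletes exactly one record with a single DELETE statement and no preceding SELECT:

```al
// Filter on the full primary key, then delete:
// one DELETE query, no record is read from the database first
SomeRecord.SetRange(PrimaryKeyField, 'KEY-001');
SomeRecord.DeleteAll();
```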
Note 2
Sometimes in BaseApp code, you can find IsEmpty being invoked before DeleteAll to check if there are any records to be deleted. Similar to the first example, it can be beneficial in some scenarios involving concurrent updates, but more on this topic in the following posts.
3. Attempting to insert a record to check if it already exists
InitializeRecord(SomeRecord);
if not SomeRecord.Insert() then
    SomeRecord.Modify();
First of all, even if all inserts succeed and no duplicate primary keys are encountered, this code structure prevents buffered inserts, knocking out a good piece of BC server performance optimisation.
Secondly, if a duplicate is found, the insert attempt raises a SQL error which is then sent back to Business Central, where it can be intercepted and analysed. This is neither the fastest nor the most elegant way of ensuring uniqueness of the key values.
The best way to avoid this structure is to make sure that the record you want to insert is unique - use number series or an incremental primary key. Sometimes the data is received from an external source and can contain records which already exist in BC and must be updated. In this case, IsEmpty is a cleaner and more efficient method of probing the database.
Better way to do the same
Try to restructure your code and data to ensure uniqueness of the records upfront, or use IsEmpty.
InitializeRecord(SomeRecord);
SomeRecord.SetRecFilter();
if SomeRecord.IsEmpty() then
    SomeRecord.Insert()
else
    SomeRecord.Modify();
4. Calculating totals in a loop
SomeRecord.SetFilter(SomeField, FilterValue);
if SomeRecord.FindSet() then
    repeat
        TotalAmount += SomeRecord.DecimalField;
    until SomeRecord.Next() = 0;
The CalcSums function does the same much more efficiently. Use it!
CalcSums is more efficient because it shifts all calculation to the SQL server. Instead of selecting the whole recordset to iterate over every record, CalcSums allows the database engine to do the calculation and return a single numeric value in the response.
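Roughly speaking, instead of a SELECT that returns every row for the loop to iterate on, CalcSums produces a single aggregate query along these lines (names invented, actual queries are parameterised):

```sql
SELECT SUM("Decimal Field")
FROM "Some Table"
WHERE "Some Field" = 'FILTERVALUE'
```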
Better way to do the same
Use CalcSums
SomeRecord.SetFilter(SomeField, FilterValue);
SomeRecord.CalcSums(DecimalField);
TotalAmount := SomeRecord.DecimalField;
5. Calculating FlowFields' values in each iteration of a loop
SomeRecord.SetFilter(SomeField, FilterValue);
if SomeRecord.FindSet() then
    repeat
        SomeRecord.CalcFields(DecimalField);
        TotalAmount += SomeRecord.DecimalField;
    until SomeRecord.Next() = 0;
Remember that each call to CalcFields sends a separate query to the SQL server. If your code iterates over a recordset, and some FlowFields must be calculated on each iteration, there is a more efficient way of doing this. SetAutoCalcFields does not do any calculation itself, but adds the given fields to the selection list of the data access provider. The following FindSet will then include the calculation of the selected FlowFields in the same SQL query that retrieves the table records. The reduced number of database queries has a positive effect on performance.
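Conceptually, the FlowField value then arrives as an extra column of the same result set, computed by a subquery in the SELECT list rather than by a separate round trip per record. This is only a sketch with invented names - the exact SQL the BC server emits differs in detail:

```sql
-- One query for the whole loop instead of one CalcFields query per record
SELECT st.*,
       (SELECT SUM(dt."Amount")
        FROM "Detail Table" dt
        WHERE dt."Parent No." = st."No.") AS "Decimal Field"
FROM "Some Table" st
WHERE st."Some Field" = 'FILTERVALUE'
```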
Better way to do the same
Call SetAutoCalcFields before the loop to select the fields that must be calculated.
SomeRecord.SetFilter(SomeField, FilterValue);
SomeRecord.SetAutoCalcFields(DecimalField);
if SomeRecord.FindSet() then
    repeat
        TotalAmount += SomeRecord.DecimalField;
    until SomeRecord.Next() = 0;