Slow Performance


Postby ashutoslenka426 » Tue Jul 12, 2016 8:20 am

Hi All ,

I have around 114,000 records in the database, and I am running this operation in a headless client. I loop through the foundset, find duplicate records based on some search criteria, move the related records of each duplicate to archive tables, and then merge the related records into the original (master) record. This operation runs very slowly: it takes around 48 hours to complete. I have been instrumenting the code to see what exactly is slow. The record with pk 23305, for example, took 101.628 seconds, and the majority of that time is spent in the functions that move records to the archive tables (25 and 24 seconds respectively). There I am doing databaseManager.copyMatchingFields() and databaseManager.saveData(recObj) operations. Why is this taking so much time? One thing I have noticed is that it is fast initially but gradually becomes slower. What might be the cause? Please provide some suggestions on this.
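
For reference, the archiving step boils down to something like this (a simplified sketch; the datasources db:/crm/orders and db:/crm/orders_archive are placeholders, not my real tables):

Code: Select all
// Sketch: move one related record to its archive table.
function archiveRecord(srcRecord) {
    var archiveFs = databaseManager.getFoundSet('db:/crm/orders_archive'); // placeholder
    // newRecord() returns the index of the freshly created record
    var destRecord = archiveFs.getRecord(archiveFs.newRecord());
    // Copy all columns whose names match between source and destination
    databaseManager.copyMatchingFields(srcRecord, destRecord, true);
    databaseManager.saveData(destRecord);
    // Remove the original now that it has been archived
    srcRecord.foundset.deleteRecord(srcRecord);
}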
AL
ashutoslenka426
 
Posts: 295
Joined: Thu Jan 26, 2012 3:38 pm

Re: Slow Performance

Postby paronne » Tue Jul 12, 2016 1:37 pm

Hi, if you need to loop over such a big foundset, you should not simply iterate over all the records in the foundset; you should process them in chunks. When you iterate over too many records (thousands), the time per record increases as the iteration proceeds, which is why it feels fast at the beginning and then gets slower.

Instead of loading all records into the foundset, load them in chunks of 1000. The foundset needs a sortable, indexed column for this, because each step loads the next 1000 records whose index is greater than the last index read. This keeps the iteration time linear.
saveData() is also called once per batch instead of once per record.

The pseudo-algorithm would be:
Code: Select all
// Load records in chunks of 1000, ordered by an indexed column (idx).
// 'my_table', 'pk' and 'idx' are placeholders for your own schema.
var index;
foundset.loadRecords('select pk from my_table order by idx limit 1000');
do {
    // getFoundsetCount asks the database for the real chunk size;
    // touching the last record forces the whole chunk to load,
    // so getSize() below covers all of it.
    var count = databaseManager.getFoundsetCount(foundset);
    foundset.getRecord(count);
    for (var i = 1; i <= foundset.getSize(); i++) {
        var record = foundset.getRecord(i);
        // ... process the record here ...
        index = record.idx;
        // Save once per 200 records instead of once per record
        if (i % 200 == 0) databaseManager.saveData();
    }
    databaseManager.saveData(); // flush the tail of the chunk
    // Load the next chunk: records after the last index read
} while (foundset.loadRecords(
    'select pk from my_table where idx > ? order by idx limit 1000', [index])
    && foundset.getSize() > 0);


Also, if you create/duplicate records in another foundset, make sure to clean that foundset once in a while; since you don't need the previously created records, clear the foundset after every 1000 records created, as sketched below.
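
For example (a sketch; the datasource is a placeholder):

Code: Select all
// Clear the working foundset periodically so it does not keep
// every record created so far in memory.
var archiveFs = databaseManager.getFoundSet('db:/crm/orders_archive'); // placeholder
var created = 0;
// ... inside the creation loop:
archiveFs.getRecord(archiveFs.newRecord());
// ... fill the new record, then:
if (++created % 1000 == 0) {
    databaseManager.saveData(archiveFs);
    archiveFs.clear(); // release the records created so far
}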

Using this pattern you can improve performance 5 to 10 times.
Still, for bulk updates foundset iteration is not really a suitable approach. For the best performance, iterate over chunks of a dataset (the same approach as before, but using a dataset instead of a foundset) and perform the inserts/updates using plugins.rawSQL. Since updates made with rawSQL are not broadcast via the application server, you should then force every foundset affected by the update to be refreshed.
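
A sketch of that pattern (server name 'crm', table 'orders' and the columns are placeholders; error handling omitted):

Code: Select all
// Iterate in chunks over a plain dataset and write via rawSQL,
// bypassing foundset/record overhead entirely.
var lastIdx = 0;
var ds;
do {
    // Next chunk of 1000 rows as a JSDataSet (placeholder query)
    ds = databaseManager.getDataSetByQuery('crm',
        'select pk, idx from orders where idx > ? order by idx', [lastIdx], 1000);
    for (var i = 1; i <= ds.getMaxRowIndex(); i++) {
        // Direct update in the database; not broadcast to other clients
        plugins.rawSQL.executeSQL('crm', 'orders',
            'update orders set status = ? where pk = ?', ['merged', ds.getValue(i, 1)]);
        lastIdx = ds.getValue(i, 2);
    }
} while (ds.getMaxRowIndex() > 0);
// Force all clients to flush their cached rows for the touched table
plugins.rawSQL.flushAllClientsCache('crm', 'orders');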
paronne
 
Posts: 202
Joined: Fri Nov 02, 2012 3:21 pm

Re: Slow Performance

Postby ashutoslenka426 » Tue Jul 12, 2016 3:07 pm

Thanks for your reply.
AL
ashutoslenka426
 
Posts: 295
Joined: Thu Jan 26, 2012 3:38 pm

