Progressive time to save

I’m in the process of setting up a demo website for a new database solution that lets users mess with a set of working records. Once an hour the users are bumped off, the working records are stripped out, and a fresh, identical set is installed, ready for service again.

To accomplish this, an identical routine runs through 13 tables. The working records for each table are found and deleted. Then the masters are found. A loop loads each record in turn into a second form; the second form duplicates the record and resets the primary key, then returns control to the loop in the first form. And so on to completion.

Most tables have only a handful of records, so the routine runs quickly. But one of them has exactly 400 records. Not only is it slower, it’s progressively slower each hour by several seconds. What starts out taking well under a minute gradually builds to several minutes to accomplish the same process.

I’ve set time traps at various points in the routine. Everything is lightning fast – until the saveData() call at the end of the loop. It’s at this one point that the time to save takes longer and longer and longer.

Anyone have an idea what’s going on here?

This is Servoy 2.2.4 with Sybase ASA.

Could you post the code that runs on the table with the 400 records? Processing 400 records can hardly ever take several minutes, whatever blobs or heavy calculations you have there. This should take seconds, not minutes.

Here’s some sample through times:

1st pass – 33 seconds
2nd pass – 40 seconds
3rd pass – 50 seconds
4th pass – 59 seconds
5th pass – 1 minute 9 seconds
6th pass – 1 minute 18 seconds
7th pass – 1 minute 29 seconds

The time to complete is increasing by about 10 seconds with each pass. Not acceptable when this routine will run hourly for days at a time.

Note what happens when I process the other 12 tables collectively, averaging 15 to 40 records each: the same slow, incremental increase in processing time. Why? I haven’t any idea.

1st pass – 13 seconds
2nd pass – 6 seconds
3rd pass – 7 seconds
4th pass – 7 seconds
5th pass – 7 seconds
6th pass – 8 seconds
7th pass – 8 seconds
8th pass – 8 seconds
9th pass – 9 seconds
10th pass – 10 seconds
11th pass – 11 seconds

Clearly there’s a residue from each hourly pass. What it could be, and how to clear it, I have no idea.

The code for processing each table is identical.

forms.catitemsUtility.resetDemos

controller.find();
sev_id = 2; // criterion: working (demo) records
controller.search();
var vSize = foundset.getSize();
if ( vSize > 0 )
{
	controller.deleteAllRecords(); // drop the current demo set
	controller.saveData();
	
	controller.find();
	sev_id = 1; // criterion: master records
	controller.search();
	
	vSize = foundset.getSize();

	if ( vSize > 0 )
	{
		for ( var i = 1 ; i <= foundset.getSize(); i++ ) // do a full loop of all records
		{
			var record = foundset.getRecord(i);			
			var vName = record.catkey;
			var vCatitemid = record.catitemid;
			forms.catitemUtility2.controller.loadRecords(record.catitemid$catitem_to_catitem);
			forms.catitemUtility2.makeDuplicate(); // duplicate this master as a fresh demo record
		}
		controller.saveData();
	}
}

===================================================

forms.catitemUtility2.makeDuplicate

var vSize = foundset.getSize(); // (debug only; not used below)
var vName = catkey; // (debug only; not used below)
controller.duplicateRecord();
catitemid = demoid; // overwrite the primary key with the demo key
sev_id = 2; // mark the copy as a working (demo) record

Hi Morley,

Just guessing: maybe your database tables have lots of empty space from previously deleted records?

In FileMaker we did a “save as compressed” now and then;
PostgreSQL has its vacuum…

Not sure about Sybase…

swingman:
Hi Morley,

Just guessing: maybe your database tables have lots of empty space from previously deleted records?

In FileMaker we did a “save as compressed” now and then;
PostgreSQL has its vacuum…

Not sure about Sybase…

In Sybase Central I use the “Unload a Database” wizard to “Unload and reload into a new database”. The resulting new database is much smaller (does not contain the “empty space”). Right-click the database from within Sybase Central to access the “Unload a Database” wizard.
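
If you ever need to script this rather than click through the wizard, SQL Anywhere also ships a command-line dbunload utility that should do the same unload-and-reload. A rough sketch (the connection parameters and file names below are only placeholders for your own):

	# Rebuild the database into a new, compacted file with ASA's dbunload.
	# -c  : connection parameters (placeholder credentials/file shown)
	# -an : unload and reload into a new database in one step
	dbunload -c "UID=dba;PWD=sql;DBF=demo.db" -an demo_compact.db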

Westy:
In Sybase Central I use the “Unload a Database” wizard to “Unload and reload into a new database”. The resulting new database is much smaller (does not contain the “empty space”). Right-click the database from within Sybase Central to access the “Unload a Database” wizard.

Westy, you may have identified the nub of the problem and a possible route to a solution.

This is a solution demo. Every hour I’m dumping the set of demo records and replacing them with fresh copies from masters, then re-opening the playground.

This hourly routine is run from a headless client via the batch processor. If automated unloading and reloading of the database were possible, wouldn’t this also interfere with the Servoy repository and the operation of the Servoy Server?

I just don’t know. Just trying to think this through.

Christian speculated the database may need to be compressed in the manner of FMP. Is this possible/available?

Morley,

Any routine like this, which really only involves the backend database, I would always, always process via SQL directly, either within Servoy with the RawSQL plugin or with a back-end tool like Aqua Data Studio. In this case I would do it with Servoy, because it sounds like you want to run it on the hour and that might be the best/easiest way for you to schedule it.
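
Sketched outside Servoy, the set-based idea looks like this: plain JavaScript with an in-memory array standing in for the catitem table. The column names come from your code above, and it assumes (as makeDuplicate does) that each master row carries the demo key in a demoid column; it's only an illustration of "delete the demo set, then re-copy from the masters in one pass", not Servoy API code:

	// Plain-JS sketch of the hourly reset; an array stands in for the table.
	// sev_id 1 = master rows, sev_id 2 = demo (working) rows.
	function resetDemos(rows) {
		// 1. Drop the current working set in one pass.
		var kept = rows.filter(function (r) { return r.sev_id !== 2; });

		// 2. Re-create the demo set from the masters, remapping the primary
		//    key (demoid becomes the new catitemid, as in makeDuplicate).
		var demos = kept
			.filter(function (r) { return r.sev_id === 1; })
			.map(function (m) {
				return { catitemid: m.demoid, demoid: m.demoid, catkey: m.catkey, sev_id: 2 };
			});

		return kept.concat(demos);
	}

Done as two SQL statements (a DELETE plus an INSERT … SELECT), the same logic touches the server once per table instead of once per record, which is why it shouldn’t degrade from pass to pass.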

Looking at your code, there is one thing that sticks out to me if you do want to do this using the usual methods without getting into RawSQL, and that is the ‘foundset.getRecord(i)’. I kind of forget, but I think that is the one that actually invokes/brings the UI into play. Isn’t it foundset.getSelectedIndex(i) that just ‘finds’ the record without actually showing it? If that’s the case, it would make a lot of sense that it would progressively slow down.

John