Inserting a new Record (bulk) inside transactions?

Hi all,

Is a raw sql insert possible during a transaction?

Is there a standard way to do this? I’ve not found a good reference, particularly with Servoy.

Updating many records incurs a large penalty because the foundset is updated as each record changes. A bulk update using raw SQL would work best, but there always seem to be gotchas, and the examples are singular in nature: they usually show the base case, not casting or other potential issues.

I’ve attempted an insertion and get the following error:

Raw SQL message org.postgresql.util.PSQLException:

Can't infer the SQL type to use for an instance of com.servoy.j2db.util.UUID. Use setObject() with an explicit Types value to specify the type to use.
sql query args [FS-53199859-D71F-11E5-8AFE-A292F4137E41,FS-546B3823-D71F-11E5-8AFE-A292F4137E41,B813A8C6-03A4-4659-9A3D-AE3C7F031BBE,1564947969533,1564947969000]
INSERT INTO import_guids (assem_guid,part_guid,import_table_id,edit_date,modification_date) VALUES (?,?,?,?,?)

Removing the GUID reference results in a different error:

Raw SQL message org.postgresql.util.PSQLException: ERROR: column "edit_date" is of type timestamp without time zone but expression is of type double precision
Hint: You will need to rewrite or cast the expression.
Position: 91
sql query args [FS-53199859-D71F-11E5-8AFE-A292F4137E41,FS-546B3823-D71F-11E5-8AFE-A292F4137E41,1564948086436,1564948086000]
INSERT INTO import_guids (assem_guid,part_guid,edit_date,modification_date) VALUES (?,?,?,?)

thanks,
–Joe.

Hi Joe,

joe26:
Is a raw sql insert possible during a transaction?

Yes, it is.
Please make sure that the time between opening and closing a transaction is as short as possible.
For example, using foundsets:

  1. turn autosave off
  2. make all of the changes you want
  3. open transaction
  4. call databaseManager.saveData()
  5. close transaction
  6. turn autosave back on

As step [2] will take up most of the time, you can get around that by turning autosave off first and then saving all of the changes at once inside the transaction with saveData, i.e. steps [3], [4], [5]; see the sketch below.
Of course, the above needs more handling in case saveData or the transaction fails, but you’ll need that anyway.
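A minimal sketch of that pattern, assuming a form method with a loaded foundset and a hypothetical column name:

databaseManager.setAutoSave(false);            // [1] turn autosave off
try {
    // [2] make all of the changes you want
    for (var i = 1; i <= foundset.getSize(); i++) {
        var _oRec = foundset.getRecord(i);
        _oRec.some_field = 'new value';        // hypothetical column
    }
    databaseManager.startTransaction();        // [3] open transaction
    if (databaseManager.saveData()) {          // [4] save all changes at once
        databaseManager.commitTransaction();   // [5] close transaction
    } else {
        databaseManager.rollbackTransaction(); // saveData failed: undo
    }
} finally {
    databaseManager.setAutoSave(true);         // [6] autosave back on
}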

If you’re using rawSQL to update: this can be done, just make sure to call plugins.rawSQL.flushAllClientsCache(serverName, tableName) afterwards, for each table you touched.
Also be aware that, although the actual insert/update statement is ‘cheap’, this flushAllClientsCache is expensive. You can’t get around it, since otherwise the cached foundsets of other clients will miss the changes you made.
(rawSQL statements are executed directly against the database; Servoy does not handle the consequences of these actions as it does when modifying foundsets. A sketch follows below.)
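A hedged sketch of such an insert through the rawSQL plugin, using the thread’s table with a placeholder server name:

var _sServer = 'example_server'; // placeholder: your database server name
var _sSql = 'INSERT INTO import_guids (assem_guid, part_guid) VALUES (?,?)';
var _aArgs = [application.getUUID().toString(), application.getUUID().toString()];

if (plugins.rawSQL.executeSQL(_sServer, 'import_guids', _sSql, _aArgs)) {
    // tell all clients their cached foundsets on this table are stale
    plugins.rawSQL.flushAllClientsCache(_sServer, 'import_guids');
} else {
    application.output('rawSQL failed: ' + plugins.rawSQL.getException(), LOGGINGLEVEL.ERROR);
}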

On the other hand, for updating a large number of records, you can have a look at the Servoy foundset updater (databaseManager.getFoundSetUpdater).
Although its functionality is limited to a couple of use cases, it might apply to yours; a short example follows.
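A minimal sketch, assuming a loaded foundset and an example column:

var _oUpdater = databaseManager.getFoundSetUpdater(foundset);
_oUpdater.setColumn('edit_date', new Date()); // example column and value
_oUpdater.performUpdate();                    // one update for all records in the foundset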

Concerning the errors:
I believe you are passing a UUID object rather than its string representation ( application.getUUID().toString() ). But while you’re using rawSQL anyway: why don’t you use the postgres UUID extension to generate the UUID?
To install:

CREATE EXTENSION IF NOT EXISTS "uuid-ossp";

To execute:

SELECT uuid_generate_v4();
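You can then generate the PK inline in the insert itself; a hedged example, assuming the (made-up) PK column import_guids_id is of type uuid:

INSERT INTO import_guids (import_guids_id, assem_guid, part_guid)
VALUES (uuid_generate_v4(), ?, ?);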

For the date, it looks like you’re passing a number (epoch milliseconds) rather than a date; see the example below.
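Two hedged ways around that; the first assumes the rawSQL plugin maps a JavaScript Date to a timestamp parameter, and the server name is a placeholder:

// Option 1: pass a real Date object instead of new Date().getTime()
plugins.rawSQL.executeSQL('example_server', 'import_guids',
    'UPDATE import_guids SET edit_date = ?', [new Date()]);

// Option 2: keep the milliseconds and cast on the postgres side
plugins.rawSQL.executeSQL('example_server', 'import_guids',
    'UPDATE import_guids SET edit_date = to_timestamp(cast(? as bigint) / 1000.0)',
    [new Date().getTime()]);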

Hope this helps

Marc,

Some excellent information I will use. I did not realize that a databaseManager.saveData() was required for a transaction; it seems redundant.

I’m updating a table external to a data import. The UUID with the error comes from the new data import. Each record automatically gets its own UUID as a PK.
The import can take a long time depending on the import file, which is 2.4 MB of text translating into table entries.

Unless the actual PK that Servoy generates … omg. Yeah, that is likely what is happening, since I’m NOT going through Servoy with the PK. Sounds reasonable actually.
Servoy does so much that one forgets to do it oneself outside the environment.

With the errors, I was unsure what date value to pass for the new record.

Onward, thanks!
–Joe.

joe26:
Some excellent information I will use. I did not realize that a databaseManager.saveData() was required for a transaction; it seems redundant.

It isn’t required; this is just an example when using a Servoy foundset.
The reason to handle it this way is that when a transaction starts, Servoy claims one database connection for that particular client.
So if you leave a transaction open for 60 seconds, no one else can use that database connection during this period.
If you tell Servoy not to automatically save changes to records, then make your changes, then start the transaction and call saveData(), and finally close the transaction and turn autosave back on, the actual transaction only stays open for a short amount of time.

joe26:
The import can take a long time depending on the import file, which is 2.4 MB of text translating into table entries.

I don’t know what method you use to read the file, but I can recommend using a Java file stream, which can read the file line by line.
This is faster than loading the whole file into memory and doing the same thing.

Besides: if this will take a while, a transaction might not be the right thing to do at that particular moment.
Think about pre-processing the data into a temporary table. Once that has succeeded, insert the content into the final tables using a transaction; a sketch follows below.
As a result, the transaction is only needed for a very short time, since copying the pre-processed data is very fast using rawSQL.
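A hedged sketch of that copy step; the staging table name is made up:

var _sServer = 'example_server'; // placeholder server name
// one fast, set-based copy from the (hypothetical) staging table
var _bOk = plugins.rawSQL.executeSQL(_sServer, 'import_guids',
    'INSERT INTO import_guids (assem_guid, part_guid, edit_date, modification_date) ' +
    'SELECT assem_guid, part_guid, edit_date, modification_date ' +
    'FROM import_guids_staging');
if (_bOk) {
    plugins.rawSQL.flushAllClientsCache(_sServer, 'import_guids');
}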

joe26:
Unless the actual PK that Servoy generates … omg. Yeah, that is likely what is happening, since I’m NOT going through Servoy with the PK.

If you’re using Servoy sequences to generate your PKs, inserting data using rawSQL will not be possible, as you cannot guarantee a unique PK.
Since you are using PostgreSQL, it’s easy to move to PG sequences and have the database generate your PKs; see the example below.
It will also save you the effort of re-sequencing after setting up a new repository or restoring a database.
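For example (a hedged sketch, run once against the database; the sequence name and an integer PK column are assumptions):

CREATE SEQUENCE import_guids_seq;
ALTER TABLE import_guids
    ALTER COLUMN import_guids_id SET DEFAULT nextval('import_guids_seq');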

Thanks, Marc.

If you’re using Servoy sequences to generate your PKs, inserting data using rawSQL will not be possible, as you cannot guarantee a unique PK.

Actually, I’m using GUIDs. But since I’m using rawSQL, I’m not even creating the primary key, which is a GUID.

I don’t know what method you use to read the file, but I can recommend using a Java file stream, which can read the file line by line.

Is the java file stream already a Servoy plugin or is it on ServoyForge?

Much appreciated!
–Joe.

Hi Joe,

see the file stream code below:

function readFile(_oFile) {
    //
    // Use BufferedReader so we don't have to read the whole file into memory
    //
    var _oFR = new Packages.java.io.FileInputStream(_oFile),
        _oIR = new Packages.java.io.InputStreamReader(_oFR, "UTF8"),
        _oBR = new Packages.java.io.BufferedReader(_oIR),
        _sLine = null,
        _nReadLine = 0;

    try {
        // readLine() returns null at end of file; compare against null,
        // because an empty line ("") is falsy and would end the loop early
        while ((_sLine = _oBR.readLine()) !== null) {
            _nReadLine++;

            // Put your processing code here
        }
    } catch (_oErr) {
        application.output("ERROR: " + _oFile.getName() + " at row " + _nReadLine, LOGGINGLEVEL.ERROR);
        application.output("ERROR: " + _oErr, LOGGINGLEVEL.ERROR);
    } finally {
        //
        // do NOT forget this close! it runs on success and on error,
        // so the file handle is never leaked
        //
        _oBR.close();
    }
}

Awesome!

Thanks, Marc!