Large data import problem

I am trying to transfer some data from a VERY large FM system into Servoy. The most obvious way of doing this is to use the Interactive SQL tool of Sybase Central. However, I have tried this on both my Mac and PC (Servoy Server running on my Mac, latest 2.1 release), and after importing 242,424 records, none of the data is actually saved into the system. I am importing a TAB-delimited file with the first line holding the field names, so I get an error at first about that line, but am given the choice of continuing, which I do; when it finishes, though, the data isn’t there. I saw a mention on the forum of having to then enter a COMMIT, which I tried, but still nothing.
I am desperate for a solution that will allow me to import a LOT of data very quickly. Interactive SQL seems to be the best tool, but not if the data isn’t saved. I have tried doing it in Servoy, but it is just too slow, and on my Mac (preferred platform), Java unexpectedly quits during the import :!:
If someone knows of a reliable tool that will let me transfer data (either directly from FileMaker, or via export files), for Mac or PC, PLEASE let me know.

I have used Sybase Central to load some pretty large FMPro files (60,000+ records), and have also encountered the problems you describe. I eventually got it to work, but it seems that Sybase is very intolerant of alpha characters in numeric fields and other things that went under the radar in FMPro. First, eliminate the field-names record if you can. Second, I found it easier and faster to code a script in FMPro that tests the values in the various fields and cleans them up if necessary. This saves time by catching the data errors before you get to the 242,400th record (a lesson I learned painfully sometime around the third or fourth try).
Third, I found ISQL very fast. It transferred the 60,000+ records in just a few minutes on a Windows 2000 server, and I did not have to use a COMMIT statement.
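For anyone who would rather do the cleanup outside FMPro, here is a minimal Python sketch of the same idea: drop the field-names line and scrub non-numeric rubbish from number columns before handing the file to iSQL. The column names and file layout are made up for illustration; adjust `NUMERIC_COLS` to your own export.

```python
import csv

# Hypothetical layout: columns that must load into numeric fields.
NUMERIC_COLS = {"order_id", "quantity", "unit_price"}

def clean_numeric(value):
    """Keep digits, minus sign and decimal point; if what remains still
    isn't a valid number, return '' so it loads as NULL instead of failing."""
    cleaned = "".join(ch for ch in value if ch.isdigit() or ch in "-.")
    try:
        float(cleaned)
        return cleaned
    except ValueError:
        return ""

def clean_export(src_path, dst_path):
    """Copy a tab-delimited FM export, dropping the header line that
    iSQL chokes on and cleaning the numeric columns."""
    with open(src_path, newline="", encoding="utf-8") as src, \
         open(dst_path, "w", newline="", encoding="utf-8") as dst:
        reader = csv.reader(src, delimiter="\t")
        writer = csv.writer(dst, delimiter="\t")
        header = next(reader)  # field-names record: read it, don't write it
        for row in reader:
            writer.writerow(
                clean_numeric(v) if h in NUMERIC_COLS else v
                for h, v in zip(header, row)
            )
```

Running this once over the export file is a lot faster than discovering the bad value at record 242,400 inside iSQL.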
Hope this helps

Imported 800,000+ records with Sybase Central on a Windows XP Centrino laptop with no problems. Tried a tab-delimited file and a DBF file. The speed was approx. 100,000 records/minute (with no index defined) or 40,000 records/minute (with 4 indexes active) !!!

I think the problem may have been to do with dates from FM not being accepted when imported into timestamp fields (as well as possible funny data in other fields). I have managed to solve the date problem by creating a date format in FM that is almost the same as a timestamp (yyyy-mm-dd 00:00:00) and exporting the layout as formatted. Now, however, when I do the import, it throws up a “PK not unique” error for the ID field, which is defined as the PK with a Servoy sequence (auto-increment) attached to it.
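If you would rather fix the dates in the export file than reformat them inside FM, a small Python helper can rewrite each date value into the timestamp form Sybase accepts. This is a sketch under the assumption that the FM export uses mm/dd/yyyy dates; pass a different `fm_format` if yours differ.

```python
from datetime import datetime

def fm_date_to_timestamp(value, fm_format="%m/%d/%Y"):
    """Convert a FileMaker date string to the 'yyyy-mm-dd 00:00:00'
    form that loads into a Sybase timestamp column.
    Empty or unparseable values become '' so they import as NULL."""
    try:
        d = datetime.strptime(value.strip(), fm_format)
    except ValueError:
        return ""
    return d.strftime("%Y-%m-%d 00:00:00")
```

Applied to the date columns during the same cleanup pass as the numeric fields, this avoids touching the FM layouts at all.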
It would seem that Servoy is not incrementing the PK when an import is done using the back-end db tools (like Interactive SQL [iSQL]).
If I go back into Servoy and change the PK to be ‘db managed’, and then in Sybase Central change the PK to ‘default value/system defined/autoincrement’, it STILL doesn’t increment on import (even after having restarted the dbsrv9 process).
PLEASE help as this is now causing me a real delay in getting this system going as I need to import millions of records into Servoy, and I can’t do that with Servoy itself.
How did your imports above work with date fields and also with the id fields incrementing?

Let me first explain how the sequences work:
- Servoy sequence: Servoy has a table with the last value, plus logic to increment it and insert the value in new rows.
- db sequence: the database has a so-called sequencer; Servoy requests the next sequence value from the db and inserts it in new rows.
- db identity: the database manages the sequence and the insert of the value; Servoy only retrieves the value after the row is created.

So if you are working with multiple db entry points (Servoy usage plus a low-level import, for example) you need a db sequence or db identity (in the db sequence case you have to retrieve the value from the sequencer somehow during the import).

Maybe in your case it is much easier to generate an id in your export file (when using Servoy sequences, do sync all sequences via the admin page afterwards!)
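The suggestion above, generating the id in the export file itself, can be a one-liner per row. A minimal sketch (file paths and the start value are assumptions; pick a start above any id already in the table, and remember to sync sequences on the Servoy admin page afterwards):

```python
def add_id_column(src_path, dst_path, start=1):
    """Prepend a sequential integer id to every line of a tab-delimited
    export, so the PK column is filled at import time."""
    with open(src_path, encoding="utf-8") as src, \
         open(dst_path, "w", encoding="utf-8") as dst:
        for i, line in enumerate(src, start=start):
            dst.write(f"{i}\t{line}")
```

The id then arrives as an ordinary column in iSQL, sidestepping the question of which side manages the sequence during the bulk load.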

OK,
I’ve worked out, with the help of this posting, what the problem is.
FM doesn’t really care (unless you make it) what is entered into a field, so in the ‘number’ fields I was importing, the FM users had entered lots of rubbish (letters, extra decimal points, etc.), and of course Sybase doesn’t like importing anything other than a number into a number field.
Basically I spent two days last weekend, very tediously, cleaning up 250,000 lines of export data using BBEdit (and GREP) and eventually managed to import the data. I also had to add an extra column with the PK data (just a serial number from FM). It’s a REAL shame that iSQL doesn’t allow a header line with field names, do some auto-matching, and then ignore that first line. It wasn’t too bad with my line-items data, as there were only about 25 fields to import, but when I was doing the orders file, which had about 60-70 fields, each time there was an import error I’d have to clean it up and then re-align the fields, which took about 10 minutes each time!
Anyway, thanks for all the help. I now need to do some cleaning up in my FM data before I do the real import of 1.25 million line items, 300,000 orders and 160,000 customers!!!
Rafi.

Cleaning data that is in FMP can be a real challenge - because FMP doesn’t enforce data types and SQL does.

Here are a couple of FMP calcs that I have found to be helpful:

TEXT FIELDS:

Case(IsEmpty(fieldName), "NULL", Substitute(Left(fieldName, dataLengthInSQL), "'", "''"))

EXAMPLE:

Case(IsEmpty(firstName), "NULL", Substitute(Left(firstName, 30), "'", "''"))

This will trim the string to the correct length and substitute any single quote (') with two single quotes ('').

You can use the above in a “Replace” script (working on a BACKUP OF COURSE), or you can create a single “cleaner” FMP file and import your “raw” data into it.
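If you are doing the cleanup outside FMPro instead, the same transform is easy to express in Python. This mirrors the calc above; the 30-character column length is just an example, match it to your SQL schema.

```python
def sql_text_literal(value, max_len=30):
    """Mirror the FMP calc: empty value -> the NULL keyword, otherwise
    trim to the SQL column length and double any embedded single quotes."""
    if not value:
        return "NULL"
    return value[:max_len].replace("'", "''")
```

Like the FMP version, it truncates first and escapes second, so the escaped string can run slightly over the column length if a quote lands near the cut.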

Hope this helps,

Bob Cusick