Importing large record sets, 2 Qs

Hi Guyz n Galz:

I’m looking at using Servoy to build a specialized Contacts solution for a PR workgroup that frequently downloads thousands of rows of contacts from a web service they subscribe to.

I understand that importing via the Servoy interface is not recommended beyond a couple thousand rows, and I understand why. The suggestion is to import on the back end.

But this solution is for a small organization with no resident SQL geek, only an outsourced consultant who handles their Exchange, file, and db servers. I'm hoping someone here can help me answer this, since I'm in an evaluation-only phase and don't have time to read through the manuals (apologies!).

Q1: What is the easiest approach one can provide (i.e., most friendly to the non-technical end user) for importing large record sets? The simplest I could think of would be to have them upload to a file server that the back-end db server can also access, have them enter the path/filename in a global, and then call a stored procedure.

Would this work?
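To make the idea concrete, here is a rough sketch of that flow in plain JavaScript (generic Node-style code, not Servoy's scripting API; the global variable, share path, and stored procedure name are all hypothetical):

```javascript
// Sketch of the proposed flow: the user drops a file on a share that the
// database server can also see, the UI stores the path in a "global",
// and the back end is asked to run an import stored procedure on it.
// All names here are hypothetical placeholders.

// Simulates the global the user would fill in from the UI.
let globalImportPath = "\\\\fileserver\\imports\\contacts.csv";

// Builds the SQL call the back end would execute; in a real solution this
// string would go through the database driver, not be printed.
function buildImportCall(importPath) {
  // Basic sanity check so the procedure is never called with an empty path.
  if (!importPath || !importPath.trim()) {
    throw new Error("No import file path set");
  }
  // sp_import_contacts is a hypothetical stored procedure name; single
  // quotes in the path are doubled to keep the SQL literal well-formed.
  return "EXEC sp_import_contacts @file = '" + importPath.replace(/'/g, "''") + "'";
}

console.log(buildImportCall(globalImportPath));
```

The user-facing part would then be a single button that validates the global and fires the call, so the end user never touches SQL directly.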

I don’t consider it very smooth or friendly compared to importing directly via the GUI, as in FMP or Access … is there an easier way? (Again, this is for the user; I don’t care how complex or easy the development end is, since I only have to do that once.)

Q2: There is a chance that the optimal architecture of the solution will call for combining columns from two different back-end server environments into some of the front-end forms (one thing Servoy really seems to make EASY, and a major reason I’m looking at it). This, of course, will complicate said import immensely, no? It would be so nice to just do it via the Servoy Client app and have Servoy send the new rows to the different back ends.

– Is it possible to add rows using a form that includes columns from multiple tables? (If this is possible, WOW.)

– Any suggestions on how to handle importing 10K-20K rows from a single source when the target columns are on different servers? Again, this needs to be as automated as possible on the front end.
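For the multi-server case, the front end would essentially have to split each imported record's columns into one payload per server. A generic sketch of that step (plain JavaScript, not Servoy's API; the column lists and sample data are hypothetical):

```javascript
// Generic sketch: split each imported record into two payloads, one per
// back-end server, with a shared key so the halves can be related again
// on the front-end form. Column lists are hypothetical.
const SERVER_A_COLS = ["contact_id", "first_name", "last_name", "email"];
const SERVER_B_COLS = ["contact_id", "outlet", "beat", "notes"];

function splitRecord(record) {
  // Keep only the columns each server owns, skipping ones the source lacks.
  const pick = (cols) =>
    Object.fromEntries(cols.filter((c) => c in record).map((c) => [c, record[c]]));
  return { serverA: pick(SERVER_A_COLS), serverB: pick(SERVER_B_COLS) };
}

// Each half would then be inserted into its own server, ideally in batches.
const row = {
  contact_id: 101,
  first_name: "Ada",
  last_name: "Lovelace",
  email: "ada@example.com",
  outlet: "Daily Gazette",
  beat: "Science",
};
console.log(splitRecord(row));
```

Looping this over 10K-20K parsed rows and committing in batches is the part that would need testing for speed; the split itself is trivial.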

TIA for any suggestions and info!

gaBbY

“Living is entirely too time-consuming. Today if you’re not confused you’re just not thinking clearly.” (Irene Peter)

Hello gaBbY,

I think 10K-20K rows is not too much. In that case, importing can easily be done via the GUI.

I have an application running that imports between 2K and 30K rows a day via a GUI, and it does some converting too…

Importing from multiple sources and to multiple sources is possible if you know how to do it :) So I would say ‘GO AHEAD’ and use Servoy for the job.

Explaining how to do all this, however, is a little harder than just saying it can be done. It is not too difficult, but more than one road leads to Rome…

I agree with IT2BE that Servoy should handle your import requirements fine. For context, I have imported just over 39,000 rows several times, and it takes less than 10 minutes on a medium-spec laptop.

The great thing about Servoy is that you can build a simple test solution very quickly and then experiment with your import requirements. Use the F1/Help system for quick reference.

Regards

Graham Greensall
Worxinfo Ltd

Thank you, gentlemen! Indeed, I will just test. (Or, at least, I will test with a single back-end db.) The data being imported will only be several columns wide (small columns), so I would imagine things will be reasonably fast (especially since the application will be used only over the LAN).

Re reading the Help – which is really good!! – that is the reason I posted here: the Help says to import using a back-end tool if one is importing more than 5,000 rows. Perhaps that’s just to play it safe, since some people might have enough text in a field to fill a novel.

Servoy is certainly a very interesting tool – whether or not I use it (or client approves it) for this job, I think I will continue to spend some time with it. Look for my next “newbie” question coming to a forum near you soon!

Thanks again,

gaBbY