Downloading Web data

Hi,

I want to download some online data into Servoy.

For example: we have a website where users enter comments, which are stored in a MySQL database.

When I trigger a method from Servoy, the above data has to be imported into Servoy (Sybase).

I think the following procedure will work, but I want to check whether I’m doing it right:

  1. Click a button in Servoy called “Download Web data”

  2. This will call a PHP script on the web server

  3. This PHP script will find all the new data somehow, export it as tab-separated values, and store the file in a web directory

  4. Servoy then downloads the exported file to the local machine using the FTP bean (I already have one)

  5. Servoy reads the file and converts it into an array; after a few checks, the data is imported into Servoy
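Step 5 above could be sketched roughly as follows. This is a minimal sketch, not Servoy-specific code: the file path and the column layout (id, user, comment) are assumptions for illustration, and in Servoy you would obtain the text via the file plugin rather than the simulated string used here.

```javascript
// Sketch of step 5: parse a downloaded tab-separated file into an array.
// In a Servoy method you might read the downloaded file first, e.g.
//   var text = plugins.file.readTXTFile('comments.tab'); // path is hypothetical
// The parsing itself is shown as a plain function:

function parseTabFile(text) {
    var rows = [];
    var lines = text.split(/\r?\n/);
    for (var i = 0; i < lines.length; i++) {
        if (lines[i].length === 0) continue; // skip blank lines
        rows.push(lines[i].split('\t'));     // one array per record
    }
    return rows;
}

// Example: two comment records (id, user, comment)
var sample = '1\tjoe\tNice site\n2\tann\tHello\n';
var records = parseTabFile(sample);
// records[0] -> ['1', 'joe', 'Nice site']
```

After this, each element of the array can be validated and written into the Sybase table.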

Is there an easier and more intelligent way of doing this, like exporting to XML format or something?

Please drop some ideas.

Thanks

Can’t you hook Servoy straight into the MySQL database from your website? If so, you could just run a method to create new records in Sybase with the array data from MySQL.

I know about this already and have implemented it in another of our solutions.

But we found several issues when the web server is remote.

Particularly in my current case, we are distributing a product which is linked to websites. The product is a standalone version created by the runtime builder.

Since this product handles a lot of heavy media content, and the user will mostly be working offline, I cannot directly link the web database in Servoy.

Please see this thread we discussed some time ago:
http://forum.servoy.com/viewtopic.php?t=3423&highlight=

So we would prefer not to use that method. Thanks a lot for your suggestion.

Any other idea would be highly appreciated.

Thanks

You can simply use the http plugin’s “getPageData(url)”. Pass it the URL of your PHP page, and the function will return the code of the page that comes back as the result of your PHP script.

You can format the page that comes back with your php page to contain the data you want to parse.

Done.

No need for FTP, etc. :D
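The http-plugin approach above could be sketched like this. The URL and column layout are made up for illustration, and the plugin call is shown as a comment (it only exists inside Servoy); here the page body is simulated so the parsing logic can stand on its own.

```javascript
// Sketch of the http-plugin approach, replacing steps 2-4 of the original plan.
// In a Servoy method you would fetch the PHP page directly:
//   var pageText = plugins.http.getPageData('http://example.com/export.php'); // URL is hypothetical
// Format your PHP output as plain delimited text, one record per line.
// For illustration, simulate the returned page body here:

var pageText = '5\tmary\tGreat product\n6\tbob\tNeeds work\n';

// Turn the page body into records ready for import.
var imported = [];
var lines = pageText.split('\n');
for (var i = 0; i < lines.length; i++) {
    if (!lines[i]) continue; // skip empty trailing line
    var fields = lines[i].split('\t');
    // In Servoy you would create a record here (e.g. foundset.newRecord()
    // and assign the columns); shown as plain objects instead.
    imported.push({ id: fields[0], user: fields[1], comment: fields[2] });
}
```

The whole round trip then happens in one method call, with no intermediate file on disk.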

Will this URL be indexed by search engines or not?

If it is indexed, it could appear in search results, and I think we do not want anyone accidentally clicking the link and reading the result data.

Thanks

Hi Hameed,

Whether or not it gets indexed is up to you! You can put a robots.txt file on your web server to tell the crawlers what to index and what not to index.

(You can Google “robots.txt” for more information).
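A minimal robots.txt sketch, assuming the PHP export script lives under a hypothetical /export/ directory:

```
# Ask well-behaved crawlers not to index the export directory.
# The /export/ path is hypothetical; use wherever your PHP script lives.
User-agent: *
Disallow: /export/
```

Keep in mind robots.txt is only advisory, not access control; a page that is linked from nowhere is unlikely to be indexed anyway, but anyone who knows the URL can still open it.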

Hope this helps.

The Web Robots Pages