How to ask a batch processor for data?

Hi all,

I’m on Servoy 5.1.x and have a batch processor/headless client running. Each smart client refreshes a live sales data display every minute – this is a bit inefficient as the number of clients grows.

So I’d like the batch processor/headless client to gather the live sales data every minute and store it in global variables or some other way.
Then the Servoy smart clients can ask the headless client for the sales data. How do I do this? It looks to me like the headless client plugin creates a new headless client rather than connecting to an existing one, or have I misunderstood?

I don’t know if it’s possible to share data via a global variable (I don’t think so, as a global is still private to the client).
But why not try using the batch processor to update, every minute, the data read by the clients? That way the batch processor does all the complex queries once and each client just makes a single simple one. It might not improve performance much if your batch processor queries are not very complex, though. It also depends on how many clients you have.

I can create a table with just one record and have the batch processor update it. Then I can read that single record periodically from the clients. This may be the simplest way of doing this.
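For example, a sketch of the two sides of that approach (Servoy method code, requires the Servoy runtime; the server, table, and column names below – mydb, sales_summary, summary_text – and buildSummaryText() are placeholders for your own):

```javascript
// Batch processor method, scheduled to run every minute.
// Assumes a single-record table sales_summary with a text column summary_text.
function updateSalesSummary() {
	var fs = databaseManager.getFoundSet('mydb', 'sales_summary');
	fs.loadAllRecords();
	var rec = (fs.getSize() > 0) ? fs.getRecord(1) : fs.getRecord(fs.newRecord());
	rec.summary_text = buildSummaryText(); // your existing aggregation logic
	databaseManager.saveData();
}

// Smart client method, run on its own schedule: one trivial query per refresh.
function refreshStatus() {
	var fs = databaseManager.getFoundSet('mydb', 'sales_summary');
	fs.loadAllRecords();
	if (fs.getSize() > 0) {
		application.setStatusText(fs.getRecord(1).summary_text);
	}
}
```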

I have to read it using a scheduled script since I’m setting the status text in the application. It would be nice to do application.setStatusText(‘%%sales_summary%%’) and have it update itself, but that does not work.

Using globals, I was thinking of using globals on the batch processor only. I would need to use something like the headless client plugin to get the values transferred to the actual clients.

Am I correct to assume that each smart client is doing the logic to compile the sales data every minute, and that’s where the inefficiency is?

Globals are local to the scope of the client that initialized them, and this is a good thing: you don’t want to deal with thread safety. You are correct that the headless client plugin can start up a new HC, but it can also connect to an existing one by specifying its UUID with .getClient().
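Roughly like this (a sketch, Servoy runtime required; the solution name and credentials are placeholders, and the client ID would come from wherever you stored it when the batch processor was started):

```javascript
// Create a brand-new headless client (returns a JSClient)...
var newClient = plugins.headlessclient.createClient('mySolution', 'user', 'pass', null);
var clientId = newClient.getClientID();

// ...or attach to one that is already running, by its ID.
var existing = plugins.headlessclient.getClient(clientId);
if (existing != null && existing.isValid()) {
	// interact with the existing headless client here
}
```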

However, a solution that may be viable for you is to have a batch processor run the logic once a minute and put the sales data into a table that the clients have foundsets attached to. Then, when the data is updated by that batch processor (HC), they get it via a data broadcast instead of querying the headless client.

swingman:
Using globals, I was thinking of using globals on the batch processor only. I would need to use something like the headless client plugin to get the values transferred to the actual clients.

Look at JSClient.getDataProviderValue()
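Something along these lines (a sketch, Servoy runtime required; batchClientId and globals.salesSummary are assumed names):

```javascript
// Read a global from the running batch processor (headless client).
// Passing null as the context name targets a global dataprovider.
var jsClient = plugins.headlessclient.getClient(batchClientId);
if (jsClient != null && jsClient.isValid()) {
	var summary = jsClient.getDataProviderValue(null, 'globals.salesSummary');
	application.setStatusText('' + summary);
}
```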

Although what you’re describing is a polling-based system, which, if you’re already worried about scale, is not the right path to go down.

ryanparrish:
Am I correct to assume that each smart client is doing the logic to compile the sales data every minute, and that’s where the inefficiency is?

Yes, I have 20 clients doing this.

ryanparrish:
However, a solution that may be viable for you is to have a batch processor run the logic once a minute and put the sales data into a table that the clients have foundsets attached to. Then, when the data is updated by that batch processor (HC), they get it via a data broadcast instead of querying the headless client.

This would work well, except that I would have to display the data on a form instead of in application.setStatusText().

This opens another possibility: when a sales transaction is created/updated/deleted, I can use table events to set a flag indicating that the sales summary is invalid, and have the batch processor check this flag more frequently but update the sales data only when needed.
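A rough sketch of that flag (Servoy runtime required; it assumes the single-record summary table gains an is_dirty column, and buildSummaryText() stands in for the aggregation logic):

```javascript
// Table event on the sales transactions table
// (onAfterInsert / onAfterUpdate / onAfterDelete): mark the summary stale.
function flagSummaryDirty(record) {
	var fs = databaseManager.getFoundSet('mydb', 'sales_summary');
	fs.loadAllRecords();
	if (fs.getSize() > 0) {
		fs.getRecord(1).is_dirty = 1;
		databaseManager.saveData();
	}
}

// Batch processor, on a short interval: recompute only when flagged.
function maybeRebuildSummary() {
	var fs = databaseManager.getFoundSet('mydb', 'sales_summary');
	fs.loadAllRecords();
	if (fs.getSize() > 0 && fs.getRecord(1).is_dirty == 1) {
		fs.getRecord(1).summary_text = buildSummaryText(); // your aggregation
		fs.getRecord(1).is_dirty = 0;
		databaseManager.saveData();
	}
}
```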

Perhaps keep a foundset on the sales data table that I mentioned in a global that gets initialized in onSolutionOpen(); that way it will receive the data broadcasts. Then, when you call application.setStatusText() at your regular interval, use that global as your source of the data, and the client will be pulling locally cached data.
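For instance (a sketch, Servoy runtime required; globals.summaryFS is an assumed MEDIA-type global, and the table/column names are placeholders):

```javascript
// In onSolutionOpen: park a foundset in a global so it stays registered
// for the data broadcasts triggered by the batch processor's saves.
globals.summaryFS = databaseManager.getFoundSet('mydb', 'sales_summary');
globals.summaryFS.loadAllRecords();

// Scheduled client method: no query here, the cached foundset is kept
// current by data broadcast.
function showCachedSummary() {
	if (globals.summaryFS != null && globals.summaryFS.getSize() > 0) {
		application.setStatusText(globals.summaryFS.getRecord(1).summary_text);
	}
}
```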

The event idea sounds good; just be mindful of how your sales transactions table is updated, as you may run into an issue where the time it takes to process the data is longer than the update interval and things start to stack up.