Handling threads and remote stream

Handling threads and remote stream

Postby ptalbot » Mon Jun 07, 2010 3:19 pm

About handling remote file streams, you (Johan) gave us a few interesting pointers.
Using a FileTransfer object whose reference is held on the server side in a ConcurrentHashMap, and pushing bytes in chunks (always using the reference key), will work of course.

Still, I'm wondering what happens when x clients call the same method at the same time...
If the method takes some time to process, wouldn't it be better to return immediately and have a thread per client (preferably from a pool) handle the long process? Otherwise, since file handling over the internet can take a while (for example if one client is pushing a big file), I guess the others will have to wait or will get a timeout (one or the other? That's what I wanted to know when I asked whether you were queuing the calls).
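
Just to illustrate the idea (a minimal sketch of the pattern I have in mind, not actual plugin code; UploadService, UploadJob and the pool size are made up):

Code: Select all
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Sketch only: the exported service method hands the slow work off to a pool
// and returns to the caller right away.
public class UploadService
{
    // hypothetical job type, just to make the sketch self-contained
    public interface UploadJob { void process(); }

    private final ExecutorService pool = Executors.newFixedThreadPool(10);

    public void submitUpload(final UploadJob job)
    {
        pool.execute(new Runnable()
        {
            public void run()
            {
                job.process(); // potentially long-running file handling
            }
        });
        // the remote call returns immediately; the pooled thread does the work
    }
}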

The thing is that I already wrote an implementation where I was managing a thread pool in the plugin's Remote service on the server, and it worked because each thread received its own RemoteInputStream and was able to return a result object. I'm handling threads on the server side because the plugin is designed to handle lots of documents sent by concurrent users. This worked well because the client and the thread were directly connected to each other (through the remote stream) once the process had started. Now that I know RemoteInputStream is not an option, since it will not use the tunnel, I'm looking for alternatives.

With your idea, the client has to repeatedly call the server to push more bytes, but I don't see how I can do that with threads...
The client will always need to call the Remote service (the one you exported), which is the unique entry point to the thread pool, but I don't want this entry point (the server plugin) to be the one doing the actual work; I would like a thread to do the processing. Somehow I have trouble merging the two ideas.

Any thoughts?
Patrick Talbot
Freelance - Open Source - Servoy Valued Professional
https://www.servoyforge.net
Velocity rules! If you don't use it, you don't know what you're missing!

Re: Handling threads and remote stream

Postby jcompagner » Mon Jun 07, 2010 4:51 pm

The thread that does the job runs in the client, not on the server...
(on the server you also have threads, but those are handled and maintained by RMI)

this is the method in the client that scripting can call:

Code: Select all
public void js_writeFileToServer(final File fileToStream, final String filenameOnServer, Function callback)
{
    final FunctionDefinition fd = new FunctionDefinition(callback);
    application.getExecutor().execute(new Runnable()
    {
        public void run()
        {
            try
            {
                IClientPluginAccess access = plugin.getClientPluginAccess();
                IFileTransfer fileTransfer = (IFileTransfer)access.getServerService(IFileTransfer.SERVICE_NAME);

                UUID uuid = fileTransfer.initWriteFileTransfer(filenameOnServer);
                byte[] bytes = new byte[8000];
                // read in the first chunk
                FileInputStream fileinputstream = new FileInputStream(fileToStream);
                int read = fileinputstream.read(bytes);
                while (read != -1)
                {
                    // push this chunk to the server, then read the next one
                    fileTransfer.write(uuid, bytes, 0, read);
                    read = fileinputstream.read(bytes);
                }
                fileinputstream.close();
                fileTransfer.close(uuid);
                // notify the callback that the transfer is done
                fd.execute(...);
            }
            catch (Exception e)
            {
                e.printStackTrace();
            }
        }
    });
}


so that's the client code.
Then on the server you have:

Code: Select all
class FileTransfer implements RemoteInterface
{
    private final ConcurrentHashMap<UUID, FileTransferObject> concurrentHashmap = new ConcurrentHashMap<UUID, FileTransferObject>();

    public UUID initWriteFileTransfer(String filenameOnServer)
    {
        FileTransferObject fileTransferObject = new FileTransferObject(filenameOnServer); // open/prepare the target file on the server
        UUID uuid = UUID.randomUUID();
        concurrentHashmap.put(uuid, fileTransferObject);
        return uuid;
    }

    public void write(UUID uuid, byte[] bytes, int start, int length)
    {
        // remove/put around the write so two calls for the same uuid can't interleave
        FileTransferObject fileTransferObject = concurrentHashmap.remove(uuid);
        if (fileTransferObject == null) throw new RuntimeException("unknown uuid or concurrent access to the uuid");
        fileTransferObject.write(bytes, start, length);
        concurrentHashmap.put(uuid, fileTransferObject);
    }

    public void close(UUID uuid)
    {
        FileTransferObject fileTransferObject = concurrentHashmap.remove(uuid);
        if (fileTransferObject == null) throw new RuntimeException("unknown uuid or concurrent access to the uuid");
        fileTransferObject.close();
    }
}
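
For completeness, the remote interface the client casts to (IFileTransfer) could look roughly like this (a sketch only; SERVICE_NAME's value, the RMI throws clauses and the exact signatures are assumptions, not actual Servoy API):

Code: Select all
import java.rmi.Remote;
import java.rmi.RemoteException;
import java.util.UUID;

// Hypothetical remote interface for chunked uploads: the client pushes byte
// chunks keyed by the UUID the server hands out in initWriteFileTransfer().
public interface IFileTransfer extends Remote
{
    // placeholder service name, the actual value is an assumption
    String SERVICE_NAME = "filetransfer";

    // registers a new transfer on the server and returns its key
    UUID initWriteFileTransfer(String filenameOnServer) throws RemoteException;

    // pushes one chunk of bytes for the transfer identified by uuid
    void write(UUID uuid, byte[] bytes, int start, int length) throws RemoteException;

    // flushes the file and removes the transfer from the server-side map
    void close(UUID uuid) throws RemoteException;
}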
Johan Compagner
Servoy

Re: Handling threads and remote stream

Postby ptalbot » Mon Jun 07, 2010 5:39 pm

jcompagner wrote:(on the server you also have threads, but those are handled and maintained by RMI)

I was not concerned about the client. That part is already running in a thread (although I didn't use application.getExecutor() for that purpose but an ExecutorService created with Executors.newSingleThreadExecutor() - I will switch to yours, good idea!)

What I'm concerned about is the server side, which can potentially have to handle lots of concurrent requests to the write() method. Using a ConcurrentHashMap only takes care of concurrency (making sure that each client is writing to the correct file, for example), but that doesn't mean that blocking calls to this method will not block other clients.

When you say that threads are handled and maintained by RMI, does that mean the JVM is handling this?
I just googled that and found parameters such as -Dsun.rmi.transport.tcp.connectionPool=true - which I suppose you use - meaning that I don't need to handle this at the plugin level.

So I suppose this is a lot simpler than what I had in mind... Thanks for taking the time to clarify!
Patrick Talbot
Freelance - Open Source - Servoy Valued Professional
https://www.servoyforge.net
Velocity rules! If you don't use it, you don't know what you're missing!

Re: Handling threads and remote stream

Postby jcompagner » Mon Jun 07, 2010 5:43 pm

The connection pool is a little bit different (it's about pooling socket connections).

But requests to remote server RMI objects come in concurrently.
That's how our own remote objects work (like the one that processes all the SQL calls from all the clients).

You only need to make sure that the remote object's fields are threadsafe, because they can be accessed by many clients at the same time.

I guess if you want to limit how many clients can do their jobs at once, then you have to build in some logic somewhere to make sure that one client waits until another finishes.
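
Something like a Semaphore in the remote object would do that, for example (just a rough sketch; the class name, the limit and where acquire/release go are assumptions):

Code: Select all
import java.util.UUID;
import java.util.concurrent.Semaphore;

// Rough sketch: cap how many clients can be writing at the same time.
public class ThrottledFileTransfer
{
    private static final int MAX_CONCURRENT_TRANSFERS = 4; // made-up limit
    private final Semaphore slots = new Semaphore(MAX_CONCURRENT_TRANSFERS, true);

    public void write(UUID uuid, byte[] bytes, int start, int length) throws InterruptedException
    {
        slots.acquire(); // this client's call blocks here until a slot is free
        try
        {
            // ... do the actual FileTransferObject.write(...) for this uuid here ...
        }
        finally
        {
            slots.release();
        }
    }
}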
Johan Compagner
Servoy

Re: Handling threads and remote stream

Postby ptalbot » Mon Jun 07, 2010 5:56 pm

jcompagner wrote:I guess if you want to limit how many clients can do their jobs at once, then you have to build in some logic somewhere to make sure that one client waits until another finishes.

That was not my intention.
Thanks again for your answers!
Patrick Talbot
Freelance - Open Source - Servoy Valued Professional
https://www.servoyforge.net
Velocity rules! If you don't use it, you don't know what you're missing!

