Concurrent http calls

Concurrent http calls

parsa28
Hi folks,

I want to make a set of http calls which have large response times (not due to network round trips, but because of processing time on the other server). Using the Http singleton I can achieve this in a serial manner. But how can I make my calls concurrent without going through the hassles of multithreading?


Re: Concurrent http calls

parsa28
After reading n8han's post on the "future" implementation of Dispatch, let me extend my question to this: I'm making these http calls on a per-request basis for a Lift-based web service. I'm totally alien to the Actor model and not that good at multithreading. But I definitely want to make the http calls as concurrent as possible without doing much harm to the stability of the web service. What's the way to go?

Re: Concurrent http calls

n8han
Administrator
Hi,

Do you have a number of Dispatch requests that can be fulfilled independently? If that is the case, then you should see a substantial benefit from further multithreading in your response cycle (web requests are typically already handled in separate threads).

There are two choices for multithreading Dispatch/HttpClient:

1. Use a separate client instance for each thread.
2. Use a thread safe instance with a connection manager.

The first option is the default (`new Http`) and basically foolproof. The advantage of the second is supposed to be its connection pool; if your server is handling a lot of requests it should be able to reuse connections more efficiently than reopening them. However, this method is necessarily more complex, and when I tried using it for an Android application I lost a lot of stability and had to revert. (As far as I could tell, failed/bad connections were poisoning the pool and being reused.)
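
In code the two choices differ mainly in how the client is constructed, roughly like this (from memory, so double-check the `thread.Safety` and `as_str` names against your Dispatch version; the URL is a placeholder):

    import dispatch._

    // Option 1: a plain, single-threaded client; make one per thread (the default)
    val perThread = new Http

    // Option 2: one shared client backed by a pooled connection manager
    // (mix-in name assumed here)
    val shared = new Http with thread.Safety

    // Either one runs a request the same way, e.g. returning the body as a String
    val body = perThread(url("http://example.com/slow") as_str)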

The "futures" interface in Dispatch is two things: a thread-safe connection manager, and also an interface to java.util.connurrent's futures. But you can do all that without that interface's help; I'm actually thinking about pulling it in a future version. There's a lot of intricate/weird code that makes it work (including the mysterious HttpPackage type), and I'm just not sure it's worth it to be able to write

    http.future(handler)

instead of

    future { http(handler) }

when there is zero difference in performance or type safety between them.
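
Spelled out, the second form is just an ordinary future wrapped around a blocking Dispatch call, e.g. with scala.actors.Futures (URLs are placeholders, and each future gets its own client here):

    import scala.actors.Futures.future
    import dispatch._

    def get(u: String) = future {
      val http = new Http        // a fresh client per future, so nothing is shared
      http(url(u) as_str)        // the blocking request runs on the future's thread
    }

    val a = get("http://example.com/a")
    val b = get("http://example.com/b")

    // apply() blocks until each result is ready; the two requests ran concurrently
    println(a() + b())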

Basically you have to decide two different things for your app: 1) multiple Http instances or a single thread-safe one, and 2) a strategy for concurrently acquiring the data you need to respond to a request. The interface supplied by Dispatch futures is just one way of doing it.

Nathan




Re: Concurrent http calls

parsa28
Hi Nathan,

n8han wrote
Do you have a number of Dispatch requests that can be fulfilled independently? If that is the case, then you should see a substantial benefit from further multithreading in your response cycle (web requests are typically already handled in separate threads).
Yes and that's why I insist on doing this.

I think I'd better start with a shared fixedThreadPool and see where it goes.
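
Something roughly like this, I guess (pool size, URLs and timeout are just placeholders, and each task makes its own Http instance per your option 1):

    import java.util.concurrent.{Callable, Executors, TimeUnit}
    import dispatch._

    // One pool shared by the whole service; the size will need tuning under load.
    val pool = Executors.newFixedThreadPool(8)

    def task(u: String) = new Callable[String] {
      def call(): String = {
        val http = new Http      // separate client per task
        http(url(u) as_str)
      }
    }

    // Fan out the slow upstream calls for one incoming web request.
    val tasks = new java.util.ArrayList[Callable[String]]
    List("http://upstream/one", "http://upstream/two").foreach(u => tasks.add(task(u)))

    // invokeAll waits for everything (or the timeout), so the request thread
    // blocks here once instead of doing the calls one after another.
    val done = pool.invokeAll(tasks, 30L, TimeUnit.SECONDS)
    for (i <- 0 until done.size) println(done.get(i).get)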

Thanks for your help.