
How does using async handlers reduce the total number of threads needed?

See original GitHub issue

For some background, I’m pretty new to concurrency in Java. I have experience with Node.js and Go, so I have some familiarity with concurrency and async work in those languages, but I’m noticing that Java requires more work from you, such as managing thread pools yourself.

In the docs, the async handlers stood out to me (https://javalin.io/documentation#faq). They look like a nice way to keep that sequential programming model: as long as I wrap my work in a Future and return it, I activate this “async” behavior. The docs also introduced a concept that was new to me, executors, so I read some Java documentation about executors to learn what they’re all about. (I’ve sketched below what I think an async handler looks like.)
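
To check my reading of the docs, here’s a minimal sketch of what I think an async handler looks like in Java. I’m assuming the Javalin 3 style where you pass a CompletableFuture to ctx.result() (I believe newer versions use ctx.future() instead), and the route name, pool size, and slowLookup method are just placeholders I made up:

    import io.javalin.Javalin;
    import java.util.concurrent.CompletableFuture;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    public class AsyncHandlerSketch {
        // Illustrative pool for doing slow work off Jetty's request threads.
        private static final ExecutorService workPool = Executors.newFixedThreadPool(4);

        public static void main(String[] args) {
            Javalin app = Javalin.create().start(7000);

            app.get("/async", ctx -> {
                // Returning a future releases the Jetty request thread until it completes
                // (Javalin 3 style; Javalin 4+ uses ctx.future(...) for the same idea).
                ctx.result(CompletableFuture.supplyAsync(AsyncHandlerSketch::slowLookup, workPool));
            });
        }

        // Stand-in for a slow operation (remote call, big query, etc.).
        private static String slowLookup() {
            try {
                Thread.sleep(1000);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
            return "done";
        }
    }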

So my understanding at this point is that in Java, you can’t just ask the runtime to process something in the background the way you can in Node.js, where calling setImmediate(<my_func>) tells the event loop to process it on the next cycle, or in Go, where calling go <my_func> tells the scheduler to spawn a new goroutine and run it there. Instead, I have to tell Java where to run it in the background. That could be new Thread(...) if I were willing to manage threads myself, or an executor if I’d like to work at a higher layer of abstraction and take advantage of a pool of threads being managed for me, where threads are reused so that it’s more efficient. (Both approaches are sketched below.)
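
For example, here is what I mean by those two approaches, as a trivial sketch (plain Java, nothing Javalin-specific; the pool size is arbitrary):

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    public class BackgroundWorkSketch {
        public static void main(String[] args) {
            // Option A: manage the thread myself.
            new Thread(() ->
                    System.out.println("raw thread: " + Thread.currentThread().getName())
            ).start();

            // Option B: hand the work to an executor, which runs it on a reusable pooled thread.
            ExecutorService pool = Executors.newFixedThreadPool(4);
            pool.submit(() ->
                    System.out.println("pooled thread: " + Thread.currentThread().getName()));
            pool.shutdown();
        }
    }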

But what I’m confused about at this point is why this improves things in the first place. The docs say that the thread pool created for Jetty defaults to 200 threads, which I understand to mean I can process up to 200 requests at a time. If requests take a long time, or many come in at once, that would cause problems because those 200 threads might eventually be exhausted and subsequent requests would have to wait for a thread to become free. My understanding is that I have two ways to solve this in Javalin:

  1. By raising the number of threads that Jetty uses, for example from 200 to 400, doubling the number of requests I can handle at once, or doubling the length of time my requests could be allowed to take to complete without subsequent requests having to wait for a thread.
  2. By using async requests. I’d set up an executor to schedule the work on. The executor would have as many threads as are needed to process all of the requests, depending on how long each request takes and how many I expect to come in at once. So maybe I would set up a thread pool with 400 threads in this executor. (I’ve sketched both options right after this list.)
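
To make those two options concrete, here is roughly what I imagine each one looking like. This is only a sketch: I’m assuming Javalin 3’s config.server(...) hook for swapping in a Jetty server with a bigger QueuedThreadPool (other versions configure this differently), and 400 is just the number from my example:

    import io.javalin.Javalin;
    import org.eclipse.jetty.server.Server;
    import org.eclipse.jetty.util.thread.QueuedThreadPool;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    public class ScalingOptionsSketch {
        public static void main(String[] args) {
            // Option 1: give Jetty a bigger request thread pool (400 is just my example number).
            Javalin app = Javalin.create(config ->
                    config.server(() -> new Server(new QueuedThreadPool(400)))
            ).start(7000);

            // Option 2: keep Jetty's default pool and move the slow work onto my own executor,
            // completing responses via returned futures as in the earlier sketch.
            ExecutorService workPool = Executors.newFixedThreadPool(400);
        }
    }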

So if my requests take a while to process, such that in my example here I could get the work done with 400 threads, wouldn’t I need to set up an executor with its own thread pool of 400 threads? Wouldn’t I end up with an equal number of threads or a greater number of threads provisioned in total, between the Jetty thread pool and my executor’s thread pool? And wouldn’t this negate the improvement from using async requests?

After reading as much as I could about Javalin, it seems that my understanding is flawed, because the example at https://github.com/tipsy/javalin-async-example demonstrates that the async approach halves the total time the requests in that example take to process (from 15s to 7.5s). And the executor set up in that repo (and in the example in the docs) is obtained by calling Executors.newSingleThreadScheduledExecutor, so the executor has just one thread to do all the work, yet it’s faster. (My rough reconstruction of that pattern is below.)
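
Here is my rough reconstruction of that pattern (not a verbatim copy of the example repo, and again assuming the Javalin 3 ctx.result(CompletableFuture) style). Writing it out, I think I see why one thread might be enough: nothing ever sleeps on the scheduler’s thread during the simulated delay, because schedule() only borrows the thread for an instant to complete each future once its delay has elapsed:

    import io.javalin.Javalin;
    import java.util.concurrent.CompletableFuture;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    public class SingleThreadAsyncSketch {
        // One scheduling thread is enough here: no thread is held during the delay itself.
        private static final ScheduledExecutorService scheduler =
                Executors.newSingleThreadScheduledExecutor();

        public static void main(String[] args) {
            Javalin app = Javalin.create().start(7000);

            app.get("/async-wait", ctx -> {
                CompletableFuture<String> future = new CompletableFuture<>();
                // Complete the future after one second without occupying a thread in the meantime.
                scheduler.schedule(() -> { future.complete("done"); }, 1, TimeUnit.SECONDS);
                ctx.result(future); // Jetty's request thread is released while the future is pending
            });
        }
    }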

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 11 (11 by maintainers)

Top GitHub Comments

1 reaction
rbygrave commented, Aug 8, 2021

“high number of threads provisioned at all times by”

Just to say, the thread pool will grow and shrink. It is more a question of the max thread pool size: what do we want to provision the max thread pool size to be (and if we increase the max, we similarly increase the max memory consumed by the app)?

I’m conservative. To me, today the cost-effective / cheap approach is to increase the max thread pool size if really necessary, use a bit more memory, and in doing so keep things simple, knowing that when Loom arrives we get that memory back.

1 reaction
mattwelke commented, Aug 8, 2021

I’ll send a PR with a doc change if I feel like there’s something to add that will help people coming from my perspective, where we aren’t used to concepts like thread pools. I need some time for my thoughts to digest first, though. I was barely able to put my thoughts together enough to create this issue. I’m glad I seem to have gotten my point across though. xD

Thanks again
