Using ThreadPool.QueueUserWorkItem in ASP.NET in a high traffic scenario

Other answers here seem to be leaving out the most important point:

Unless you are trying to parallelize a CPU-intensive operation in order to get it done faster on a low-load site, there is no point in using a worker thread at all.

That goes for both free threads, created by new Thread(...), and worker threads in the ThreadPool that respond to QueueUserWorkItem requests.

Yes, it’s true, you can starve the ThreadPool in an ASP.NET process by queuing too many work items, and that will prevent ASP.NET from processing further requests. The article is accurate in that respect: the same worker-thread pool used by QueueUserWorkItem is also used to serve requests.

But if you are actually queuing enough work items to cause this starvation, then you should be starving the thread pool! If you are running literally hundreds of CPU-intensive operations at the same time, what good would it do to have another worker thread to serve an ASP.NET request, when the machine is already overloaded? If you’re running into this situation, you need to redesign completely!

Most of the time I see or hear about multi-threaded code being inappropriately used in ASP.NET, it’s not for queuing CPU-intensive work. It’s for queuing I/O-bound work. And if you want to do I/O work, then you should be using an I/O thread (I/O Completion Port).

Specifically, you should be using the async callbacks supported by whatever library class you’re using. These methods are always clearly labeled: they come in pairs whose names start with Begin and End, as in Stream.BeginRead, Socket.BeginConnect, WebRequest.BeginGetResponse, and so on.
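Here is a minimal sketch of that Begin/End pattern using WebRequest. The URL is a placeholder, and the console harness exists only to keep the demo process alive; in a real ASP.NET application you would wire the pair into an async page or handler instead of a Main method:

```csharp
using System;
using System.Net;

class ApmSketch
{
    static void Main()
    {
        WebRequest request = WebRequest.Create("http://example.com/"); // placeholder URL

        // BeginGetResponse returns immediately; the callback runs on an
        // I/O completion port thread when the response actually arrives.
        request.BeginGetResponse(ar =>
        {
            WebRequest req = (WebRequest)ar.AsyncState;
            using (WebResponse response = req.EndGetResponse(ar))
            {
                Console.WriteLine("Got {0} bytes", response.ContentLength);
            }
        }, request);

        // The calling thread is free the entire time the request is in flight.
        Console.ReadLine(); // keep the console process alive for the demo
    }
}
```

Note that no thread of yours blocks between the Begin call and the callback; the wait is handed off to the operating system.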

These methods do use the ThreadPool, but they use IOCP threads, which do not interfere with ASP.NET requests. An I/O thread sits parked on a completion port and is woken only when the I/O system signals that an operation has finished, so it costs almost nothing while waiting. And in an ASP.NET application, you normally have one I/O thread for each worker thread, so every single request can have one async operation queued up. That’s literally hundreds of async operations without any significant performance degradation (assuming the I/O subsystem can keep up). It’s way more than you’ll ever need.
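In WebForms, the way to get this behavior inside a request is an asynchronous page. The sketch below assumes a page with Async="true" in its @Page directive and a placeholder feed URL; the point is that the worker thread goes back to the pool between the Begin and End handlers, and the completion arrives on an I/O thread:

```csharp
using System;
using System.Net;
using System.Web.UI;

public partial class NewsPage : Page
{
    private WebRequest _request;

    protected void Page_Load(object sender, EventArgs e)
    {
        _request = WebRequest.Create("http://example.com/feed"); // placeholder URL
        AddOnPreRenderCompleteAsync(BeginGetFeed, EndGetFeed);
    }

    private IAsyncResult BeginGetFeed(object sender, EventArgs e,
                                      AsyncCallback cb, object state)
    {
        // Hands the wait off to the I/O system; no worker thread is blocked.
        return _request.BeginGetResponse(cb, state);
    }

    private void EndGetFeed(IAsyncResult ar)
    {
        using (WebResponse response = _request.EndGetResponse(ar))
        {
            // ...consume the response and populate the page...
        }
    }
}
```

While the download is pending, the worker thread that started the request is free to serve other requests, which is exactly the throughput win described above.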

Just keep in mind that async delegates do not work this way – they’ll end up using a worker thread, just like ThreadPool.QueueUserWorkItem. It’s only the built-in async methods of the .NET Framework library classes that are capable of doing this. You can do it yourself, but it’s complicated and a little bit dangerous and probably beyond the scope of this discussion.
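To make the distinction concrete, here is a sketch contrasting the two (placeholder URLs, console harness for demo purposes only). The delegate’s BeginInvoke looks asynchronous, but it just queues the call to a ThreadPool worker thread, the same pool that serves ASP.NET requests, whereas the built-in Begin method hands the wait to a completion port:

```csharp
using System;
using System.Net;

class DelegateVsIocp
{
    static void Main()
    {
        // WORKER thread: an async delegate burns a pool thread
        // for the entire duration of the download.
        Func<string, string> download = url =>
            new WebClient().DownloadString(url);
        download.BeginInvoke("http://example.com/", ar =>
            Console.WriteLine(download.EndInvoke(ar).Length), null);

        // I/O thread: the wait is parked on a completion port instead,
        // and no worker thread is occupied until the response arrives.
        WebRequest request = WebRequest.Create("http://example.com/");
        request.BeginGetResponse(ar =>
        {
            using (WebResponse r = request.EndGetResponse(ar))
                Console.WriteLine(r.ContentLength);
        }, null);

        Console.ReadLine(); // keep the demo process alive
    }
}
```

Both calls return immediately, but only the second one actually frees up a worker thread while the I/O is in flight.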

The best answer to this question, in my opinion, is don’t use the ThreadPool or a background Thread instance in ASP.NET. It’s not at all like spinning up a thread in a Windows Forms application, where you do it to keep the UI responsive and don’t care about how efficient it is. In ASP.NET, your concern is throughput, and all that context switching on all those worker threads is absolutely going to kill your throughput whether you use the ThreadPool or not.

Please, if you find yourself writing threading code in ASP.NET – consider whether it could be rewritten to use pre-existing asynchronous methods, and if it can’t, then consider whether you really, truly need the code to run in a background thread at all. In the majority of cases, you will probably be adding complexity for no net benefit.
