java - Flume's HttpSource: is the Jetty server multithreaded? -


I've been looking a bit into Flume's HttpSource internals, trying to figure out how the Jetty server is used.

I've seen that a single-element list of connectors is used; this connector listens for incoming HTTP connections on the configured HTTP host and port. Then a context is created for the root path, and an HttpServlet is added to that context containing the logic to be executed when a connection is received. Finally, the Jetty server is started.

Connector[] connectors = new Connector[1];
if (sslEnabled) {
    SslSocketConnector sslSocketConnector = new HTTPSourceSocketConnector(excludedProtocols);
    ...
    connectors[0] = sslSocketConnector;
} else {
    SelectChannelConnector connector = new SelectChannelConnector();
    ...
    connectors[0] = connector;
}

connectors[0].setHost(host);
connectors[0].setPort(port);
srv.setConnectors(connectors);

try {
    org.mortbay.jetty.servlet.Context root = new org.mortbay.jetty.servlet.Context(srv, "/", org.mortbay.jetty.servlet.Context.SESSIONS);
    root.addServlet(new ServletHolder(new FlumeHTTPServlet()), "/");
    HTTPServerConstraintUtil.enforceConstraints(root);
    srv.start();
    ...

My question is, given the above implementation: does such a Jetty server create a thread for each incoming HTTP connection? Or does the unique HttpServlet serve all the requests one by one, sequentially?

Thanks for helping!

First of all, a note: org.mortbay.jetty means you are using a very old version of Jetty, either Jetty 5 or Jetty 6. Those have been EOL'd (end of life'd) way back in 2010 (and earlier).

Back in the Jetty 6 days, there was a ThreadPool that was used on demand, and depending on the connector type, it would either result in a thread per connection (known as blocking connectors) or a thread per NIO selection (in which case one connection can have many threads over the lifetime of the connection, but never more than one active per connection).
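The difference between those two connector styles can be sketched with plain `java.util.concurrent`, with no Jetty APIs involved; the pools and "events" below are hypothetical stand-ins for Jetty's internals, not its actual implementation:

```java
import java.util.Set;
import java.util.concurrent.*;

public class ConnectorModels {

    // Blocking connector model: one pool thread is tied up for the
    // entire life of each connection.
    static void blockingModel(int connections) throws InterruptedException {
        ExecutorService pool = Executors.newCachedThreadPool();
        CountDownLatch closed = new CountDownLatch(connections);
        for (int i = 0; i < connections; i++) {
            pool.submit(() -> {
                // the thread would block here on socket reads until close
                closed.countDown();
            });
        }
        closed.await();
        pool.shutdown();
    }

    // NIO selector model: I/O events from many connections are dispatched
    // to a small pool; one connection may be served by many threads over
    // its lifetime, but never more than one at a time.
    static Set<String> nioModel(int events) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(2);
        Set<String> threadsUsed = ConcurrentHashMap.newKeySet();
        CountDownLatch done = new CountDownLatch(events);
        for (int i = 0; i < events; i++) {
            pool.submit(() -> {   // one readable/writable event
                threadsUsed.add(Thread.currentThread().getName());
                done.countDown();
            });
        }
        done.await();
        pool.shutdown();
        return threadsUsed;
    }

    public static void main(String[] args) throws InterruptedException {
        blockingModel(4);
        Set<String> used = nioModel(100);
        System.out.println("100 events handled by " + used.size() + " pool thread(s)");
    }
}
```

In the selector-style sketch, a hundred I/O events are serviced by at most two threads; in the blocking sketch, the thread count grows with the number of open connections.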

Starting with Jetty 8, and Servlet async, the threading model was refactored to favor the async behavior of request processing more.

With Jetty 9, all blocking connectors were dropped in favor of supporting async processing of the request, its InputStreams, and its OutputStreams.

The current model is a ThreadPool of threads that are used, on demand, when needed by a connection (this can be for processing of the request, or the response, or reading the request body content, or writing the response body content, or active WebSocket streaming, etc.).

This model is preferred for SPDY- and HTTP/2-based support, where you have multiple requests per physical connection. But know that in those models it is quite possible to have multiple active threads per physical connection, depending on the behavior of your servlets.
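That multiplexing behavior can be demonstrated with the standard library alone. This is only a sketch: the three tasks below stand in for three streams multiplexed over one hypothetical physical connection, and the pool stands in for the server's shared ThreadPool:

```java
import java.util.concurrent.*;
import java.util.concurrent.atomic.AtomicInteger;

public class MultiplexedConnection {

    // Runs `streams` concurrent tasks (one per stream of a single
    // connection) and reports how many were active at the same time.
    static int maxConcurrentStreams(int streams) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(streams);
        CyclicBarrier allRunning = new CyclicBarrier(streams);
        AtomicInteger active = new AtomicInteger();
        AtomicInteger maxActive = new AtomicInteger();
        CountDownLatch done = new CountDownLatch(streams);
        for (int i = 0; i < streams; i++) {
            pool.submit(() -> {
                active.incrementAndGet();
                try {
                    allRunning.await();  // all streams are in flight at once
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
                maxActive.accumulateAndGet(active.get(), Math::max);
                active.decrementAndGet();
                done.countDown();
            });
        }
        done.await();
        pool.shutdown();
        return maxActive.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("active threads for one connection: "
                + maxConcurrentStreams(3));
    }
}
```

The barrier forces all three stream tasks to be running simultaneously, so a single "connection" is observably served by three active threads at once.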

Also, the web application itself can choose to spin up more threads for its own processing, such as via the servlet async processing behaviors, or to initiate outgoing requests to other services, or to process other tasks unrelated to a specific request/response context.
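A minimal sketch of that last point, using only `CompletableFuture`; the `handle` method here is a hypothetical stand-in for a servlet that offloads work to an application-owned pool, not a Flume or Jetty API:

```java
import java.util.concurrent.*;

public class AsyncOffload {

    // Application-owned pool, separate from the container's ThreadPool.
    static final ExecutorService appPool = Executors.newFixedThreadPool(4);

    // The "container" thread returns immediately; the application pool
    // finishes the work later, as with servlet async processing.
    static CompletableFuture<String> handle(String request) {
        return CompletableFuture.supplyAsync(
                () -> "processed:" + request, appPool);
    }

    public static void main(String[] args) {
        CompletableFuture<String> response = handle("GET /");
        System.out.println(response.join()); // prints "processed:GET /"
        appPool.shutdown();
    }
}
```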

