amazon web services - Impose a time limit on popen/fgets in PHP


I want to impose a time limit on reading, with fgets, from a process opened with popen in PHP.

I have the following code:

$handle = popen("tail -f -n 30 /tmp/pushlog.txt 2>&1", "r");
while (!feof($handle)) {
    $buffer = fgets($handle);
    echo "data: " . $buffer . "\n";
    @ob_flush();
    flush();
}
pclose($handle);

I tried the following, without success:

set_time_limit(60);
ignore_user_abort(false);

The process is as follows:

  1. The browser sends a request and waits for the answer in HTML5 server-sent events format.
  2. The request is received by the AWS load balancer and forwarded to an EC2 instance.
  3. The answer is the last 30 lines of the file.
  4. The browser receives them as 30 messages and the connection is kept open.
  5. If the tail command emits a new line, it is returned; otherwise fgets waits an undefined amount of time until tail produces one.
  6. After 60 seconds of network inactivity (no new lines in 60 seconds), the AWS load balancer closes the connection to the browser. The connection to the EC2 instance is not closed.
  7. The browser detects the closed connection and opens a new one; the process goes back to step 1.
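The server-sent events wire format mentioned in step 1 is simple: each message is a "data:" field terminated by a blank line, which is why the PHP loop above prints an extra "\n" after each buffer (the line read from tail already carries its own newline). A minimal sketch of one message on the wire, with an illustrative payload:

```shell
# One SSE message: a "data:" field followed by a blank line.
# The payload is illustrative, standing in for a line read from tail.
printf 'data: %s\n\n' 'new line from pushlog.txt'
```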

As the steps describe, the connection between the AWS load balancer and the EC2 instance is never closed, so after a few hours/days there are hundreds and hundreds of tail and httpd processes running and the server stops answering.

Of course this looks like an AWS load balancer bug, but I don't want to start a process to get Amazon's attention and then wait for a fix.

My temporary solution is to sudo kill the tail processes before the server becomes unstable.
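That cleanup can be sketched in shell. This is only an illustration of the workaround, not the production command: a short-lived sleep stands in for a stale tail process so the example is self-contained.

```shell
# Start a stand-in for a stale "tail -f" process, then terminate it,
# as the temporary workaround does with the real tail processes
# (on the server: sudo kill <pid>, or match by command line with
# pkill -f 'tail -f -n 30 /tmp/pushlog.txt').
sleep 300 &
stale=$!
kill "$stale"
wait "$stale" 2>/dev/null
echo "stale process exit status: $?"   # 143 = terminated by SIGTERM
```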

I think PHP doesn't stop the script because PHP is "blocked" waiting for fgets to finish. (As far as I can tell, on Linux set_time_limit counts only the script's own execution time, not time spent blocked in stream operations, so the limit never fires.)

I know the time limit of the AWS load balancer is editable, but I want to keep it at the default value; a higher limit is not going to fix the problem.

I don't know if I need to change the question to "How do I execute a process in Linux with a time limit / timeout?".
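If the question does become that one, Linux already has an answer in coreutils: the timeout command runs a child under a hard time limit and exits with status 124 when the limit is hit. A minimal sketch, using a 1-second limit for brevity; in the original code the wrapped command would be something like "timeout 65 tail -f -n 30 /tmp/pushlog.txt" (65 seconds being an assumed value, chosen to outlive the load balancer's 60-second idle window):

```shell
# timeout(1) kills its child after the limit; tail -f on /dev/null
# never produces output, so this reliably hits the 1-second limit.
timeout 1 tail -f /dev/null
echo "timeout exit status: $?"   # 124 means the time limit expired
```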

PHP 5.5.22 / Apache 2.4 / Linux kernel 3.14.35-28.38.amzn1.x86_64

Tested with PHP 5.5.20:

// Change configuration.
set_time_limit(0);
ignore_user_abort(true);

// Open pipe & set non-blocking mode.
$descriptors = array(0 => array('file', '/dev/null', 'r'),
                     1 => array('pipe', 'w'),
                     2 => array('file', '/dev/null', 'w'));
$process     = proc_open('exec tail -f -n 30 /tmp/pushlog.txt 2>&1',
                         $descriptors, $pipes, null, null) or exit;
$stream      = $pipes[1];
stream_set_blocking($stream, 0);

// Call stream_select with a 10 second timeout.
$read = array($stream);
$write = null;
$except = null;
while (!feof($stream) && !connection_aborted()
        && stream_select($read, $write, $except, 10)) {

    // Print out all the lines we can.
    while (($buffer = fgets($stream)) !== false) {
        echo 'data: ' . $buffer . "\n";
        @ob_flush();
        flush();
    }
}

// Clean up.
fclose($stream);
$status = proc_get_status($process);
if ($status !== false && $status['running'] === true)
    proc_terminate($process);
proc_close($process);
