I have an urgent question.
I'm trying to receive data via a POST operation in one PHP function and store it somewhere such that the data can be read by many concurrent visitors via another PHP function.
I have tried writing the data to a super-simple MySQL database, to a flat file, and to a cache. In every case there seems to be contention over the data: when one script is writing to the data file once per second, it appears to lock out ALL visitors trying to read it.
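For reference, the file-based variant I'm describing follows roughly this pattern (the file name and helper names below are a simplified sketch, not my actual gateway code), with the writer taking an exclusive lock and the readers a shared one:

```php
<?php
// Sketch of the writer/reader pair, assuming flock-based file storage.

// Writer side (called by gateway.php): hold the exclusive lock only
// for the duration of the write itself.
function store_sample(string $file, string $data): void {
    $fp = fopen($file, 'c');       // create if missing, don't truncate yet
    if ($fp === false) { return; }
    if (flock($fp, LOCK_EX)) {     // exclusive lock blocks readers briefly
        ftruncate($fp, 0);
        rewind($fp);
        fwrite($fp, $data);
        fflush($fp);
        flock($fp, LOCK_UN);       // release immediately; no sleeping here
    }
    fclose($fp);
}

// Reader side (the polling script): a shared lock, so concurrent
// readers don't block each other.
function read_sample(string $file): string {
    $fp = fopen($file, 'r');
    if ($fp === false) { return ''; }
    $data = '';
    if (flock($fp, LOCK_SH)) {
        $data = stream_get_contents($fp);
        flock($fp, LOCK_UN);
    }
    fclose($fp);
    return $data;
}
```

Even with locks scoped this tightly, readers are timing out while the writer loop runs, which is what I can't explain.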
More specifically, I have a loop that performs a POST operation to my gateway.php script:
public function send() {
    for ($i = 0; $i < 10; $i++) {
        $this->send_one();
        // accelerometers will eventually sample every ~250ms (ie: 4 times per second)
        usleep(1000 * 1000); // for now, sleep 1 second between sends
        echo $i . "\n";
    }
}
public function send_one() {
    $cu = new SMBLE_curl_utility();
    $cu->no_delay = TRUE; // sets a CURLOPT to disable Nagle's algorithm
    $coords = array(); // some random data
    for ($j = 0; $j < self::NUMBER_OF_POINTS; $j++) {
        $coords[$j] = implode(",", $this->pick_random_coordinate());
    }
    $data = self::NUMBER_OF_POINTS . "," . implode(",", $coords); // just a string of data

    // this just makes a simple POST to the URL which receives the data
    $cu->post("/gateway", array(
        "d" => $data
    ));
}
When that loop runs, any scripts polling the data file are timing out.
Can anyone suggest what might be causing this? Is it that my curl POSTs are holding open some database/file/cache connection that locks the data file against reads? Or does it have something to do with my server keeping the HTTP connection open for my send() function, and this somehow causes a lock to be established on the data?