Here's the problem:
I'm running a script that queries a series of MySQL databases to build a table based on 10 fields, all roughly varchar(20)... The application searches a database of 40,000 users for individuals whose card_received field is set to 'n', and emails the appropriate parties... However, the runtimes I'm seeing as the file size gets larger are growing far faster than linearly, and at a certain point they lock up the server.
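For reference, the query is shaped roughly like this (the table and column names other than card_received are simplified placeholders, not my real schema):

// Simplified placeholder query -- only card_received is a real column name.
$result = mysql_query(
    "SELECT user_id, name, email, card_received
       FROM users
      WHERE card_received = 'n'"
) or die("Query failed: " . mysql_error());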
The algorithm itself is simple...
while ($row = mysql_fetch_assoc($result)) {   // $result is the mysql_query() result from above
    print "<tr>";
    foreach ($fields as $key => $value) {
        print "<td>" . $row[$value] . "</td>";
    }
    print "</tr>";
}
print "</table>";   // close the table once, after the loop, not on every row
I'm using output buffering instead of concatenating onto the variable $email_body, and found it cut my runtime by roughly 23% for this function.
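Roughly, the buffering looks like this (simplified; build_order_table() is just a stand-in for the while/foreach loop above):

ob_start();                           // start capturing everything printed below
build_order_table($result, $fields);  // stand-in for the while/foreach loop shown above
$email_body = ob_get_clean();         // grab the buffered HTML in one step instead of concatenating

I then place that $email_body variable in the following to build the email with an attached Excel file...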
$boundary = '-----=' . md5( uniqid ( rand() ) );
$email_body .= "--" . $boundary . "\n";
$email_body .= "Content-Type: application/vnd.ms-excel; name=\"Order Details\"\n";
$email_body .= "Content-Transfer-Encoding: base64\n";
$email_body .= "Content-Disposition: attachment; filename=\"$theFile\"\n\n";
$content_encode = chunk_split(base64_encode($email_attach));
$email_body .= $content_encode . "\n";
$email_body .= "--" . $boundary . "\n";
$email_body .= "Content-Type: text/html; charset=iso-8895-15\n";
$email_body .= "Text of email here:\n\n";
$email_body .= "Blah blah blah";
$headers = "From: ****@****.com\nBcc: ****@****.com\n";
$headers .= "MIME-Version: 1.0\n";
$headers .= "Content-Type: multipart/mixed; boundary=\"$boundary\"";
if(mail($mailto, "Order Info", $email_body, $headers))
echo "<BR>Details of this order were emailed to $mailto";
The following is a table of the runtimes that I received. The first column is the number of users queried. The second column is the number of seconds to process the query and send the email... The third is the number of users that the server processed per second.
Users   Seconds     Users/sec
  295     0.68303   431.8990381
  345     1.45469   237.1639318
   20     0.12157   164.5142716
   79     0.21474   367.8867468
   60     0.1603    374.2981909
  161     0.43911   366.6507253
  493     3.07463   160.3444967
  964     6.52829   147.6650088
 1141     6.0309    189.1923262
 1872    40.54897    46.16640077
 2665   109.07258    24.43327186
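(For reference, the timings above were captured roughly like this; process_orders() is just a stand-in for the query/loop/mail() code above:)

$start = microtime(true);
$num_users = process_orders();                 // stand-in for the query, table-building, and mail() code
$elapsed = microtime(true) - $start;
printf("%d users in %.5f seconds (%.2f users/sec)", $num_users, $elapsed, $num_users / $elapsed);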
You can see that, once it processes more than around 1,000 users, the runtime grows far faster than linearly, even though there is NO algorithmic reason for it to.
Are there size limitations of the mail() function that might be affecting my performance? Does anyone have any idea how I can fine-tune the code to produce more linear results? Any help or advice would be GREATLY appreciated! Thanks.