I'm trying to speed up a script that we use to let bots crawl our site. This is the only page that the bots are allowed to access.
We have over 2 million records in our database, and we want one column from the table linv_inventory shown in an HTML table, paged in 10,000-row increments.
// compute the offset for the requested page ($rowsPerPage = 10000)
$offset = ($pageNum - 1) * $rowsPerPage;

$query = "SELECT `inventory_part_number` FROM `linv_inventory` ORDER BY `inventory_part_number` LIMIT $offset, $rowsPerPage";
$result = mysql_query($query) or die('Error, query failed');
Which of these would be faster? The timing results I have gotten so far have been inconclusive.
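For what it's worth, the timings were taken with something along these lines wrapped around each version (simplified; the variable names are just illustrative):

// simplified timing wrapper (illustrative only)
$start = microtime(true);

// ... run version 1 or version 2 here ...

$elapsed = microtime(true) - $start;
echo "<!-- page build: " . number_format($elapsed, 4) . " seconds -->\n";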
In the first version, we read the page's 10,000 records into an array and then iterate over that array with a foreach.
$element = array();
while ($row = mysql_fetch_array($result)) {
    $element[] = trim($row[0]);
}
echo "<table width=\"100%\" cellspacing=\"2\" cellpadding=\"3\">\n";
echo " <tr>\n";
$i = 0;
foreach ($element as $cell) {
    // close the current row and open a new one after every 5 cells
    if ($i > 0 && $i % 5 == 0) {
        echo " </tr>\n";
        echo " <tr>\n";
    }
    echo '<td>' . $cell . '</td>';
    $i++;
} // foreach ($element as $cell)
echo " </tr>\n";
echo "</table>\n";
In the second version, we skip the array and build each table row directly from the result set with mysql_fetch_array(), six records per row.
while ($row = mysql_fetch_array($result)) {
    echo " <tr>\n";
    echo '<td>' . $row['inventory_part_number'] . '</td>';
    // pull the next five records straight off the result set to finish this row
    for ($c = 1; $c < 6 && ($row = mysql_fetch_array($result)); $c++) {
        echo '<td>' . $row['inventory_part_number'] . '</td>';
    }
    echo " </tr>\n";
} // while ($row = mysql_fetch_array($result))
Are there any other means I could use to make this faster? This script has been causing problems because of the number of bots hitting it simultaneously, bringing my database down to a crawl.
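One idea I've been toying with, but have not tested, is replacing the OFFSET with a keyset ("seek") style WHERE clause so MySQL doesn't have to step over all the skipped rows. This is only a sketch: it assumes inventory_part_number is indexed and effectively unique, and $lastPartNumber is a made-up variable holding the last value from the previous page (which the crawl links would pass instead of a page number).

// keyset pagination sketch -- assumes inventory_part_number is indexed/unique
// $lastPartNumber is hypothetical: the final part number from the previous page
$lastPartNumber = mysql_real_escape_string($lastPartNumber);
$query = "SELECT `inventory_part_number` FROM `linv_inventory`
          WHERE `inventory_part_number` > '$lastPartNumber'
          ORDER BY `inventory_part_number`
          LIMIT $rowsPerPage";
$result = mysql_query($query) or die('Error, query failed');

Would that kind of change help here, or is the bottleneck more likely the sheer number of simultaneous connections?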