Hey guys,
Could one of you please cast your experienced eyes over this little issue for me? I run an IP-to-city database on a few of my PHP-Nuke sites, and I'm trying to develop some code that will reliably import the huge csv file (updated monthly).
The .csv has around 1.5 million rows, and for a few months I've been splitting it into 5 or 6 smaller files (roughly 250,000 rows each) because my script has time-out issues otherwise. I put the script together from a couple of similar scripts plus some of my own work. Here is what I'm using...
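For context, the splitting step itself is simple enough. Something along these lines is what I do (a rough sketch only; the function name, file names, and chunk size here are just examples, not my exact setup):

```php
<?php
// Rough sketch of the monthly split step: copy $chunk lines at a
// time from the big csv into numbered part files.
function split_csv( $source, $prefix, $chunk )
{
    $in   = fopen( $source, 'r' );
    $line = 0;
    $part = 0;
    $out  = null;
    while( ($row = fgets( $in, 4096 )) !== false ) {
        if( $line % $chunk == 0 ) {          // start the next part file
            if( $out ) fclose( $out );
            $part++;
            $out = fopen( $prefix . $part . '.csv', 'w' );
        }
        fwrite( $out, $row );
        $line++;
    }
    if( $out ) fclose( $out );
    fclose( $in );
    return array( $line, $part );            // lines read, files written
}
```

Nothing clever, but it means each part stays small enough for the importer below.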
<?php
include("mainfile.php");

// safe_mode blocks set_time_limit(), so only try it when allowed
if( !get_cfg_var('safe_mode') )
{
    @set_time_limit( 0 );
    @ignore_user_abort( 1 );
}

$db->sql_query( "DROP TABLE IF EXISTS _iplookup" );
$db->sql_query( "CREATE TABLE _iplookup (
    ip_from int(10) unsigned NOT NULL default '0',
    ip_to int(10) unsigned NOT NULL default '0',
    c2code char(2) NOT NULL default '',
    country varchar(64) NOT NULL default '',
    region varchar(64) NOT NULL default '',
    city varchar(64) NOT NULL default '',
    KEY code (ip_from, ip_to, c2code) ) ");

$file   = "city-1.csv";
$opened = fopen( $file, 'r' );
if( !$opened ) die( "Cannot open $file" );

$counter = 0;
// fgets() returns false at EOF, so test its return value instead of
// feof() -- the old loop ran one extra pass on the trailing newline
while( ($row = fgets( $opened, 4096 )) !== false ) {
    // fields arrive as "quoted","values", so after exploding on the
    // quote character the odd indexes hold the values themselves
    $data = explode( "\"", $row );
    if( count( $data ) < 12 ) continue;     // skip blank/short lines

    $ip1     = $data[1];
    $ip2     = $data[3];
    $c2code  = strtolower( $data[5] );
    $country = addslashes( ucwords( strtolower( $data[7] ) ) );
    $region  = addslashes( ucwords( strtolower( $data[9] ) ) );
    $city    = addslashes( ucwords( strtolower( $data[11] ) ) );

    $db->sql_query( "INSERT INTO _iplookup VALUES
        ( '$ip1', '$ip2', '$c2code', '$country', '$region', '$city' )" );
    $counter++;
}
fclose( $opened );
echo $counter." lines written from $file";
?>
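One idea I've been toying with to cut the per-row query overhead is grouping rows into multi-row INSERT statements before sending them to MySQL. Here's a sketch of the grouping/escaping step (the helper name is my own, not from any of the scripts I borrowed from):

```php
<?php
// Sketch: build one multi-row INSERT for a batch of parsed rows,
// instead of issuing one query per csv line. Each $r is an array of
// (ip_from, ip_to, c2code, country, region, city).
function batch_insert_sql( $rows )
{
    $values = array();
    foreach( $rows as $r ) {
        // escape every field, then wrap the row in ('..','..',..)
        $values[] = "('" . implode( "','", array_map( 'addslashes', $r ) ) . "')";
    }
    return "INSERT INTO _iplookup VALUES " . implode( ",", $values );
}
```

The main loop would then collect, say, 500 parsed rows at a time and fire one query per batch via `$db->sql_query()` rather than 500 separate ones.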
The call to include mainfile.php is there because it's a Nuke-based install, and mainfile.php only provides the $db object and its query calls (I'll happily paste it if you think it's important).
Occasionally it will get to the end of one .php file, and then I'll run the next one (as I said, there are 5 or 6; the only difference is that the subsequent scripts don't drop the table if it's present, they just append to it). One of the later files will generally time out, so I have to start from scratch again.
I'm pretty sure it would work if I just kept the files nice and small, but for a quick monthly update I'd much rather FTP the entire csv in one piece and let the script run for as long as it needs (I run it at very off-peak times, for traffic purposes).
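In fact, what I'd ideally like to end up with is closer to a single-statement import, letting MySQL parse the csv itself instead of 1.5 million fgets()/INSERT round trips. I haven't tested this on my host yet, so treat it strictly as a sketch; it assumes LOAD DATA LOCAL INFILE is enabled there and that the csv sits next to the script:

```php
<?php
// Sketch only: hand the whole csv straight to MySQL in one statement.
// Assumes LOAD DATA LOCAL INFILE is enabled on the host.
$file = getcwd() . '/city.csv';
$sql  = "LOAD DATA LOCAL INFILE '" . addslashes( $file ) . "'
    INTO TABLE _iplookup
    FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
    LINES TERMINATED BY '\\n'
    ( ip_from, ip_to, c2code, country, region, city )";
// include("mainfile.php");       // as usual, for the Nuke db layer
// $db->sql_query( $sql );        // run it through the same wrapper
echo $sql;
```

If anyone knows whether shared hosts commonly allow this, that alone would answer most of my question.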
Any help will be greatly appreciated. Let me know if you require any more info.
Many thanks, peace + out,
NiHiLiSt 🆒