One thing that jumps out is that SELECT id FROM country_list WHERE code='$country' runs on every iteration of the loop.
There are only about 280 or so countries in the world (let's say 300 for argument's sake), so this is not a huge amount of data. It would almost certainly be quicker (unless there are only a couple of entries in each of the files) to load the entire country list into an array up front and then look the ids up through the array, something like the following:
$file[0]="ftp://ftp.apnic.net/public/apnic/stats/apnic/delegated-apnic-latest";
$file[1]="ftp://ftp.apnic.net/public/stats/ripe-ncc/delegated-ripencc-latest";
$file[2]="ftp://ftp.apnic.net/public/stats/arin/delegated-arin-latest";
$file[3]="ftp://ftp.lacnic.net/pub/stats/lacnic/delegated-lacnic-latest";

//Get the country data once, keyed by country code
$query="SELECT id, code FROM country_list";
$res=mysql_query($query);
while($row=mysql_fetch_array($res,MYSQL_ASSOC)) {
    $country_list[$row['code']]=$row['id'];
}
foreach($file as $key=>$val) {
    if(!($array=@file($val))) {
        die("<strong>$val</strong> is not available");
    }
    foreach($array as $fp=>$line) {
        //Only process ipv4 records, and skip the summary lines (marked with *)
        if(strpos($line, 'ipv4') && !strpos($line,'*')) {
            $temp=explode('|',$line);
            //Convert the start address to an unsigned integer string
            $b_ip=sprintf('%u',ip2long($temp[3]));
            $e_ip=$b_ip+($temp[4]-1);
            if(array_key_exists($temp[1],$country_list)) {
                $country=$country_list[$temp[1]];
            } else {
                $country=0;
            }
            $query="INSERT INTO ip(ip_from, ip_to, code) VALUES ('$b_ip','$e_ip','$country')";
            $result=mysql_query($query) or die(mysql_error());
            //echo("$b_ip<br>");
        }
    }
}
Now, there's one more thing which may speed things up a little. DB queries are going to be the bottleneck here (unless you've got a slow network connection, but there's not a lot we can do about that), so optimization is going to come in the form of reducing the number of MySQL queries, as we have already done by removing the country query from the loop. We can also reduce the number of times the INSERT query runs, but this will be a little more complex.
We can combine multiple inserts into one query with the following syntax.
INSERT INTO table
(col1, col2)
VALUES
('var11','var12'),
('var21','var22'),
('var31','var32');
The only problem is that MySQL chokes if you try to pass it a query that's bigger than its maximum allowed packet size. I'm not sure exactly what that size is; I've hit it before but I can't for the life of me remember what it was. I'd suggest flushing the insert every 50 or so rows to be on the safe side (I'm 100% certain that will go through without a problem). If you have loads of data to load you could run some tests with bigger batches, but try to make it failsafe, because to find out how far you can go you will need to test it to breaking point.
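To make that concrete, here's a rough sketch of how the rows could be grouped into batched INSERT statements before being sent to MySQL. The function name, the $batch_size of 50 and the column list are just illustrative (the columns match the ip table used above); you'd still run each returned query with mysql_query() as in the loop.

```php
<?php
//Hypothetical helper: turn an array of (ip_from, ip_to, code) rows into
//multi-row INSERT statements, $batch_size rows per statement
function build_batched_inserts($rows, $batch_size) {
    $queries=array();
    //array_chunk splits the rows into groups of at most $batch_size
    foreach(array_chunk($rows, $batch_size) as $chunk) {
        $values=array();
        foreach($chunk as $row) {
            $values[]="('".implode("','",$row)."')";
        }
        $queries[]="INSERT INTO ip(ip_from, ip_to, code) VALUES ".implode(',',$values);
    }
    return $queries;
}
```

With 120 rows and a batch size of 50 this produces three queries (50 + 50 + 20 rows), so you hit the database 3 times instead of 120.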
Good Luck
Bubble