So basically, you're reading a file line by line and making an HTTP request for each line? Wow, talk about resource-heavy.
Here's one possible option, though I don't know whether it's a good solution or not, so I'm open to opinions:
* Split up the load - only do a certain number of subdomains per execution, and run multiple instances (though not too many!) simultaneously to work through the list. If you're on a Unix server, you could do something like this:
$words = file('subdomains.txt'); // one subdomain per line
$lines = count($words);
$limit = 15; // how many subdomains per instance

for ($i = 0; $i < $lines; $i += $limit) { // < (not <=) so we don't spawn an empty instance at the end
    // launch a background instance (note the trailing &) covering offsets $i to $i + $limit
    exec("/path/to/php -d register_argc_argv=On -f myScript.php $i " . ($i + $limit) . ' &');
}
Then, in myScript.php, you would use the first parameter passed as the starting offset into the [man]file[/man]() array, and the second parameter as the offset at which the script should stop.
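Something like this, for example (just a rough sketch - the actual HTTP call depends on whatever your script already does per subdomain):

$start = (int) $argv[1]; // first argument: starting offset
$end   = (int) $argv[2]; // second argument: offset to stop at

$words = file('subdomains.txt'); // same list the parent script reads

for ($i = $start; $i < $end && $i < count($words); $i++) {
    $subdomain = trim($words[$i]);
    // make your HTTP request for $subdomain here,
    // e.g. with file_get_contents() or cURL
}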
EDIT: By the way, this solution assumes that you: a) Are on a Unix-based server, and b) Have [man]exec[/man]ute permissions
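If you're not sure about (b), you can check before running the loop - a quick sanity test (just a sketch):

if (stripos((string) ini_get('disable_functions'), 'exec') !== false) {
    die('exec() is disabled on this server');
}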