OK, looking at your post again, I think you're saying 100,000 records, not 100k characters per record.
If that's the case, then MySQL will be fine.
Anyway, for importing data, I usually write a bash or PHP shell script and cron it up to run and email me if it fails (there's a sample crontab entry at the end of this post). It's pretty easy to write such a script. If there are a LOT of records, you don't want to read them all into memory before processing them, because you can hit PHP's memory limit and the cron job will die. If you'll be processing one line at a time, do something like this:
[code]
<?php
$filename = "myfile.txt";
// $db is assumed to be an already-open mysqli connection (from mysqli_connect())
$fp = fopen($filename, "r") or die("Could not open $filename\n");
while (!feof($fp)) {
    // Read one line at a time (up to 100,000 bytes)
    $line = fgets($fp, 100000);
    // Split the pipe-delimited line into fields -- use explode(), not split(),
    // since "|" is a regex metacharacter; rtrim() drops the trailing newline
    $parts = explode("|", rtrim($line));
    $query = "..."; // Build a query with the $parts from the line you just read
    $res = mysqli_query($db, $query);
}
fclose($fp);
?>
[/code]
This way you only hold one line in memory at a time, and since you keep overwriting the same variables, memory use stays flat instead of growing with the file. It's actually a pretty fast way to import a file, and it can handle files hundreds of megs in size without running out of memory.
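For that query-building line, here's a minimal sketch of what it might look like. The table and column names (mytable, col1, col2, col3) are made up, so swap in your own schema, and $db is again the open mysqli connection:
[code]
// Hypothetical schema -- replace mytable/col1/col2/col3 with your own.
// Escaping each field keeps stray quotes in the data from breaking the query.
$query = "INSERT INTO mytable (col1, col2, col3) VALUES ('"
       . mysqli_real_escape_string($db, $parts[0]) . "', '"
       . mysqli_real_escape_string($db, $parts[1]) . "', '"
       . mysqli_real_escape_string($db, $parts[2]) . "')";
$res = mysqli_query($db, $query);
if (!$res) {
    // Anything echoed here ends up in cron's email, which is handy
    echo "Query failed on line: $line\n";
}
[/code]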
You'll need to add a bit to that to flesh it out, and do some error detection and what not, but you get the basic idea, right?
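And for the cron side I mentioned up top, the crontab entry could look something like this. The schedule, the path to PHP, and import.php are all made up, so adjust them for your box. MAILTO (supported by the usual vixie/ISC cron) tells cron where to email any output the job produces, which is how the "email me if it fails" part works:
[code]
# Hypothetical crontab entry: run the import at 2am every night.
# cron emails anything the job prints to MAILTO, so a silent run = no mail.
MAILTO=you@example.com
0 2 * * * /usr/bin/php /path/to/import.php || echo "import.php failed"
[/code]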