Ok, first of all: before looping over the file, load every existing ID into an array. Then you can just use in_array() instead of running a SELECT per line. You can make the lookup pretty fast with a binary search; see the comments on the PHP manual page for in_array().
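A minimal sketch of that lookup, assuming you already fetched the IDs with something like `SELECT id FROM your_table`. Sort the array once, then binary-search each lookup instead of letting in_array() scan the whole array per line:

```php
<?php
// Binary search over a sorted array of IDs. Sort once before the
// file loop; each lookup is then O(log n) instead of in_array()'s O(n).
function id_exists(array $sortedIds, int $needle): bool
{
    $lo = 0;
    $hi = count($sortedIds) - 1;
    while ($lo <= $hi) {
        $mid = intdiv($lo + $hi, 2);
        if ($sortedIds[$mid] === $needle) {
            return true;
        } elseif ($sortedIds[$mid] < $needle) {
            $lo = $mid + 1;
        } else {
            $hi = $mid - 1;
        }
    }
    return false;
}

$ids = [42, 7, 19, 3];   // pretend these came from the SELECT
sort($ids);              // sort ONCE, before looping the file
var_dump(id_exists($ids, 19));  // bool(true)
var_dump(id_exists($ids, 5));   // bool(false)
```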
Then, if you don't find the ID, don't INSERT right away. Use the database's multi-row insert syntax: INSERT INTO table VALUES (1, 'a'), (2, 'b'), (3, 'c'). Just wrap each row in parentheses and separate the groups with commas. Concatenate the rows as you go and run the insert at the end of the loop.
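Building that statement is just string concatenation. A sketch with made-up table/column names (with real input, escape values via PDO::quote or use prepared statements instead of raw interpolation):

```php
<?php
// Collect one "(id, 'val')" group per missing row, then glue them
// into a single multi-row INSERT.
$rows = [[1, 'a'], [2, 'b'], [3, 'c']];

$groups = [];
foreach ($rows as $row) {
    $groups[] = "(" . $row[0] . ", '" . $row[1] . "')";
}
$sql = "INSERT INTO table_name (id, val) VALUES " . implode(", ", $groups);

echo $sql;
// INSERT INTO table_name (id, val) VALUES (1, 'a'), (2, 'b'), (3, 'c')
```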
Well, it depends on your server. Inserting 23K rows at once might be too slow. You can treat the multi-row insert statement as a buffer: flush it with a query every 1K or 2K values.
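The buffering could look like this sketch. The table name and the idea of returning the statements are illustrative; in a real script you would execute each statement with mysqli_query or PDO::exec instead of collecting it:

```php
<?php
// Flush one multi-row INSERT per $batchSize rows instead of one
// statement per row. Returns the generated SQL so the logic is easy
// to inspect; a real script would execute each statement instead.
function buffered_insert(array $rows, int $batchSize): array
{
    $statements = [];
    $buffer = [];
    foreach ($rows as [$id, $val]) {
        $buffer[] = "($id, '$val')";
        if (count($buffer) >= $batchSize) {
            $statements[] = "INSERT INTO t (id, val) VALUES " . implode(", ", $buffer);
            $buffer = [];
        }
    }
    if ($buffer !== []) {  // don't forget the last partial batch
        $statements[] = "INSERT INTO t (id, val) VALUES " . implode(", ", $buffer);
    }
    return $statements;
}

print_r(buffered_insert([[1, 'a'], [2, 'b'], [3, 'c']], 2));
```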
The UPDATE is the hard part... Start with one per loop iteration, where it's needed. If that's still too slow, hold on to the values that should be updated, and after the file loop run a DELETE followed by a multi-row INSERT.
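A sketch of that DELETE-then-INSERT alternative, again with illustrative names: collect the changed rows during the loop, then replace them all in two statements afterwards:

```php
<?php
// Rows collected during the file loop whose values changed.
$toUpdate = [[2, 'bb'], [3, 'cc']];

// One DELETE for all changed IDs...
$ids = array_map(fn($r) => $r[0], $toUpdate);
$deleteSql = "DELETE FROM t WHERE id IN (" . implode(", ", $ids) . ")";

// ...then one multi-row INSERT with the new values.
$groups = array_map(fn($r) => "({$r[0]}, '{$r[1]}')", $toUpdate);
$insertSql = "INSERT INTO t (id, val) VALUES " . implode(", ", $groups);

echo $deleteSql . "\n" . $insertSql . "\n";
// DELETE FROM t WHERE id IN (2, 3)
// INSERT INTO t (id, val) VALUES (2, 'bb'), (3, 'cc')
```

Run the two statements inside a transaction if you can, so other readers never see the rows missing between the DELETE and the INSERT.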
If your ID column is a primary key or unique key, you can try a multi-row REPLACE INTO instead.
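REPLACE INTO takes the same multi-row syntax as INSERT (MySQL deletes any row with a matching key and inserts the new one). A tiny sketch with illustrative names:

```php
<?php
// Multi-row REPLACE INTO: only makes sense if `id` is a PRIMARY KEY
// or UNIQUE key, since matching is done on the key.
$rows = [[1, 'a'], [2, 'b']];
$groups = array_map(fn($r) => "({$r[0]}, '{$r[1]}')", $rows);
$sql = "REPLACE INTO t (id, val) VALUES " . implode(", ", $groups);

echo $sql;
// REPLACE INTO t (id, val) VALUES (1, 'a'), (2, 'b')
```

Note that REPLACE deletes and re-inserts, so it fires DELETE triggers and resets columns you don't list; for partial updates, INSERT ... ON DUPLICATE KEY UPDATE may fit better.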
I optimized a co-worker's script using these techniques and went from inserting 5K lines to over 200K... but YMMV.