An alternative would (I suspect) be to import the CSV directly into MySQL (I recall it does have a command to import CSV files ... ah, "LOAD DATA INFILE").
If you write a PHP script that constructs the appropriate import statement and sends it to MySQL, then MySQL can load and import the file, returning to PHP whatever status it reports when it's finished.
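A minimal sketch of that statement-building step, written here in Python rather than PHP for brevity; the file path and table name ("products") are placeholders, and you'd need a real MySQL connection to actually execute the result:

```python
# Build a LOAD DATA statement for MySQL to execute. Using LOCAL makes
# the client send the file's contents, so the server itself doesn't
# need read access to the file.
def build_load_statement(csv_path, table):
    return (
        f"LOAD DATA LOCAL INFILE '{csv_path}' "
        f"INTO TABLE {table} "
        "FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' "
        "LINES TERMINATED BY '\\n' "
        "IGNORE 1 LINES"
    )

print(build_load_statement('/tmp/import.csv', 'products'))
```

The script would then hand that string to MySQL the same way it sends any other query.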
Then PHP can issue update commands to the database to set the "Category" fields of records that contain the appropriate keywords in their "Name" fields.
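That update step can be one UPDATE per keyword. Here's a self-contained sketch using SQLite in place of MySQL (the table, names, and keyword-to-category pairs are invented for illustration; the SQL is the same shape either way):

```python
import sqlite3

# Stand-in for the real table; in MySQL you'd run the same UPDATEs
# over the freshly imported rows.
db = sqlite3.connect(':memory:')
db.execute("CREATE TABLE products (name TEXT, category TEXT)")
db.executemany("INSERT INTO products (name) VALUES (?)",
               [('maroon sweater',), ('navy scarf',), ('steel bolt',)])

# One UPDATE per keyword: set the category wherever the name matches.
keyword_to_category = {'maroon': 'colour', 'navy': 'colour',
                       'steel': 'material'}
for keyword, category in keyword_to_category.items():
    db.execute("UPDATE products SET category = ? WHERE name LIKE ?",
               (category, f'%{keyword}%'))
```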
For added performance, the initial load and category update could be done in a temporary table (with the same structure as the real table): partly to take some of the load off the real table's indexes (the temp table probably doesn't need any, unless one would help the "Name" column search), partly to avoid the updates wasting time on the thousands of records already present, and partly to avoid locking the real table while the category updates run. Once the new records have been knocked into shape, they can be copied across with an INSERT ... SELECT statement.
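The whole temp-table flow can be sketched end to end, again with SQLite so it runs standalone (in MySQL you'd use CREATE TEMPORARY TABLE ... LIKE and LOAD DATA INFILE for steps marked below; the table names are made up):

```python
import sqlite3

db = sqlite3.connect(':memory:')
# The "real" table, already holding thousands of rows in practice.
db.execute("CREATE TABLE products (name TEXT, category TEXT)")
db.execute("INSERT INTO products VALUES ('existing item', 'misc')")

# Temporary staging table with the same structure; in MySQL:
#   CREATE TEMPORARY TABLE staging LIKE products;
db.execute("CREATE TEMP TABLE staging (name TEXT, category TEXT)")

# 1. Bulk-load the CSV rows into staging (LOAD DATA INFILE in MySQL).
db.executemany("INSERT INTO staging (name) VALUES (?)",
               [('maroon sweater',), ('navy scarf',)])

# 2. Fix up categories on the small staging table only.
db.execute("UPDATE staging SET category = 'colour' "
           "WHERE name LIKE '%maroon%' OR name LIKE '%navy%'")

# 3. Copy the finished rows across in one statement.
db.execute("INSERT INTO products SELECT * FROM staging")
```

The existing rows in the real table are never touched by the updates, and the table is only written once, at the final copy.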
The main thing is that importing CSV in bulk is almost always much faster than a series of INSERT statements (as it happens, I believe phpMyAdmin parses the CSV and issues multiple INSERTs too). It also sidesteps having PHP read the file and pass the contents to MySQL. The principal drawback is that MySQL needs read access to the file in question.
Actually, if you are loading the CSV into a temp table, and depending on how complicated the "Name" search would be, you could have your script select all of the distinct names from the temp table ("SELECT DISTINCT name FROM temptable"). The script goes through the list, pairs names with categories in whatever way it's supposed to, and then sends commands to update the temp table ("UPDATE temptable SET category='colour' WHERE name='maroon'", or whatever).
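That round trip — distinct names out, per-name updates back — might look like this (SQLite again so the sketch runs as-is; the name-to-category pairing function is the part your script would actually supply):

```python
import sqlite3

db = sqlite3.connect(':memory:')
db.execute("CREATE TEMP TABLE temptable (name TEXT, category TEXT)")
db.executemany("INSERT INTO temptable (name) VALUES (?)",
               [('maroon',), ('maroon',), ('oak',)])

def categorise(name):
    # Placeholder for whatever logic pairs a name with a category.
    return 'colour' if name == 'maroon' else 'wood'

# Pull the distinct names, decide each one's category in the script,
# then push one UPDATE per distinct name back to the database.
for (name,) in db.execute("SELECT DISTINCT name FROM temptable").fetchall():
    db.execute("UPDATE temptable SET category = ? WHERE name = ?",
               (categorise(name), name))
```

Because duplicates collapse to one distinct name, the number of UPDATEs is bounded by the number of distinct names rather than the number of rows.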