Sure, here is an example. There is really nothing broken to fix; I am just trying to optimize it. I have to run this query (in this case the code is in Perl):
# Build "k1 = ? OR k2 = ? OR ... OR k30 = ?" and bind $wid to each
# placeholder instead of interpolating it into the SQL string.
my $where = join ' OR ', map { "k$_ = ?" } 1 .. 30;
my $query = $db->prepare("SELECT * FROM urls WHERE ($where)");
$query->execute( ($wid) x 30 );
Basically it is a search engine and I want to get all urls with any keyword id (k1..k30) that matches the given word id.
k1..k30 and $wid are integers, each of which corresponds to a primary key in a wordlist table.
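In case it helps illustrate what I mean, here is a toy version of the same lookup in Python with sqlite3 (the table mirrors the schema above; the data is made up):

```python
import sqlite3

# Toy version of the urls table described above: 30 keyword-id columns
# per url row. The data here is made up for illustration.
db = sqlite3.connect(":memory:")
cols = ", ".join(f"k{i} INTEGER" for i in range(1, 31))
db.execute(f"CREATE TABLE urls (url TEXT, {cols})")

row = ["http://example.com"] + [0] * 30
row[5] = 42  # this url has keyword id 42 in column k5
db.execute(f"INSERT INTO urls VALUES ({', '.join('?' * 31)})", row)

# Same shape as the Perl query: one OR term per keyword column, with
# the word id bound as a parameter rather than interpolated.
wid = 42
where = " OR ".join(f"k{i} = ?" for i in range(1, 31))
hits = db.execute(f"SELECT url FROM urls WHERE {where}", [wid] * 30).fetchall()
print(hits)  # [('http://example.com',)]
```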
If I only had to do this once per search it would be fine, but I am maintaining a cache of all urls for a given keyword id that I constantly have to update as I spider sites.
There are tradeoffs no matter how I assign my tables. This happens to be my bottleneck, and I purposely designed it this way so the bottleneck would be in creating the cache of word rankings. I am wondering if there is any way to optimize it, as it gets too slow with 70K urls. I thought an INDEX would help, but with 30 columns this is not straightforward.

I have to update approx. 10K words a day, and with 10 concurrent processes (the max I am allowed) this takes roughly 4 hours. I would like to increase the number of words processed and the number of urls in the table dramatically; I know this goes much faster with fewer urls. My only recourse may be more computing power, but for now I would like to try to optimize this if possible.
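For comparison, this is the alternative table layout I keep weighing against the tradeoffs above: one row per (url, keyword) pair instead of 30 keyword columns, so a single index covers the lookup. A minimal sketch in Python with sqlite3 (table and column names are hypothetical, data made up):

```python
import sqlite3

# Sketch of a normalized layout: one (url_id, word_id) row per keyword,
# so one index on word_id serves the whole lookup. Names are hypothetical.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE url_keywords (url_id INTEGER, word_id INTEGER)")
db.execute("CREATE INDEX idx_word ON url_keywords (word_id)")

db.executemany(
    "INSERT INTO url_keywords VALUES (?, ?)",
    [(1, 42), (1, 7), (2, 42), (3, 9)],  # made-up sample data
)

# The 30-term OR chain collapses to one indexed equality test.
wid = 42
urls = [r[0] for r in
        db.execute("SELECT url_id FROM url_keywords WHERE word_id = ?", (wid,))]
print(sorted(urls))  # [1, 2]
```

The tradeoff, of course, is that every url now needs up to 30 inserts instead of one, which is exactly the kind of thing I meant about there being tradeoffs no matter how I assign my tables.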
Thanks again for everyone's helpful comments.
-Frank