Bear with me, as I'm still quite new to PHP.
I'm trying to figure out the best way to build a search function for a database that will eventually be large (around 300,000 items). Each item has rows in several tables covering different categories of information (for example: description, uses, location). The tables are connected by a matching ID and joined in various ways, depending on the query being built.
The flow I'm building goes from a straightforward JavaScript and HTML search page to a paginated set of results. From there, the user can narrow those results down by running a new search against them.
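To make that concrete, here is roughly the kind of paginated query I have in mind for one page of results. The table and column names (items, descriptions, locations, item_id) are just placeholders for my real schema, and I'm assuming PDO prepared statements are the right tool for the security side:

    <?php
    // Sketch of one results page: join the per-category tables on the shared id,
    // filter by the search term, and page through with LIMIT/OFFSET.
    // All table/column names here are placeholders.
    $pdo = new PDO(
        'mysql:host=localhost;dbname=mydb;charset=utf8mb4',
        'user',
        'pass',
        [
            PDO::ATTR_ERRMODE          => PDO::ERRMODE_EXCEPTION,
            PDO::ATTR_EMULATE_PREPARES => false, // real prepared statements
        ]
    );

    $term    = $_GET['q'] ?? '';
    $page    = max(1, (int)($_GET['page'] ?? 1));
    $perPage = 25;
    $offset  = ($page - 1) * $perPage;

    $sql = "SELECT i.id, i.name, d.description, l.location
            FROM items i
            JOIN descriptions d ON d.item_id = i.id
            JOIN locations    l ON l.item_id = i.id
            WHERE d.description LIKE :term
            ORDER BY i.id
            LIMIT :limit OFFSET :offset";

    $stmt = $pdo->prepare($sql);
    $stmt->bindValue(':term', '%' . $term . '%', PDO::PARAM_STR);
    $stmt->bindValue(':limit', $perPage, PDO::PARAM_INT);
    $stmt->bindValue(':offset', $offset, PDO::PARAM_INT);
    $stmt->execute();
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

Is paging with LIMIT/OFFSET like this going to hold up at 300,000 rows, or is that already the wrong approach?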
I want to keep things reasonably fast and secure, of course. I've been learning about sessions and temporary tables, but because of the database size I'm wondering if they're the wrong way to go: sessions have a time limit, and temporary tables are dropped as soon as the connection is closed. Or maybe someone knows how to hold onto a temp table across more than one connect/disconnect?
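One alternative I've been sketching, in case this is what sessions are actually good for here, is to keep only the search criteria in the session and re-run the query for each page, rather than trying to keep a temp table or the full result set alive between connections. The field names below are made up:

    <?php
    session_start();

    // Sketch: store just the search criteria in the session, not the results.
    // On a new search, save the filters; on later pages, rebuild the query from them.
    if (isset($_GET['q'])) {
        $_SESSION['search'] = [
            'term'     => $_GET['q'],
            'location' => $_GET['location'] ?? null,
        ];
    }

    $criteria = $_SESSION['search'] ?? ['term' => '', 'location' => null];

    // "Narrowing" a previous search would then just mean adding more conditions
    // before the query is run again with the new page's LIMIT/OFFSET.
    $conditions = ['d.description LIKE :term'];
    $params     = [':term' => '%' . $criteria['term'] . '%'];

    if (!empty($criteria['location'])) {
        $conditions[] = 'l.location = :location';
        $params[':location'] = $criteria['location'];
    }

    $where = implode(' AND ', $conditions);
    // $where and $params would then be dropped into the paginated query above.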
I also considered building a permanent table that stores the search variables under a unique ID, and using that to manage the searches, pagination and all the rest. Then, at some point in the process, I could add code to delete the corresponding row (or even turn it into a saveable search). But I've been told that would only slow down the process and gobble up bandwidth.
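For what it's worth, if that permanent-table idea isn't hopeless, this is the sort of thing I was picturing (all names made up), with the cleanup run on a schedule like a cron job rather than on every request:

    <?php
    // Sketch of a "saved searches" table; table and column names are made up.
    $pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'pass');

    $pdo->exec("CREATE TABLE IF NOT EXISTS saved_searches (
        id         INT AUTO_INCREMENT PRIMARY KEY,
        criteria   TEXT NOT NULL,  -- the search variables, e.g. JSON-encoded
        created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
    )");

    // Store a search and get back its unique id to put in the pagination links.
    $stmt = $pdo->prepare("INSERT INTO saved_searches (criteria) VALUES (:criteria)");
    $stmt->execute([':criteria' => json_encode(['term' => 'oak', 'location' => 'north'])]);
    $searchId = $pdo->lastInsertId();

    // Cleanup run from cron (not on every page load): drop searches older than a day.
    $pdo->exec("DELETE FROM saved_searches WHERE created_at < NOW() - INTERVAL 1 DAY");

Would a small table like that really slow things down, or is the bandwidth concern overblown?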
I'm open to any suggestions. I just want to know the right road to take before I head down it.
Thanks!