I'm working on a time accounting system for a graphic design firm. One of the modules, called "Analyze", lets the user retrieve a bunch of time and expense entries based on parameters they select.
They can choose a client (or all clients), project (or all projects), a date range, etc.
The module defaults to "everything" (all clients, all projects, unspecified date range), and under these conditions it currently tries to retrieve about 13,000 records with the first query and 6,000 with the second. This triggers a memory-limit error on my client's host (Interland). I've bumped the limit up via ini_set(), but I wonder if there's a better way to go about it.
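For reference, the bump is roughly this (the value itself is arbitrary - I've just been nudging it upward until the error goes away, which feels wrong):

    // Raise this script's memory ceiling; 32M is illustrative,
    // I've just been increasing it by trial and error.
    ini_set('memory_limit', '32M');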
Under this default condition, the script runs two queries:
"SELECT project_time.time_id, project_time.project_id, project_time.date, project_time.employee_id, project_time.activity_id, project_time.hours, project_time.rate, project_time.notes, project_time.invoice_id FROM project_time ORDER BY date";
and
"SELECT project_expenses.expense_id, project_expenses.project_id, project_expenses.date, project_expenses.employee_id, project_expenses.item_id, project_expenses.quantity, project_expenses.price, project_expenses.notes, project_expenses.invoice_id FROM project_expenses ORDER BY date";
There's no "where" condition in this case, since no filters have been specified. I do specify the table in each field ("table.field") because when I AM specifying conditions, in some cases it involves a table join.
Now, it may seem unrealistic to retrieve this many results (and it does produce a long HTML page), but my system also formats the results into a tab-delimited text file (parked temporarily in a session variable) that can be downloaded by clicking a link and then opened in Excel. So although working with this much data in a browser doesn't make much sense, there is a practical way to post-process it.
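The export step is essentially a loop like this (a simplified sketch - the variable and session-key names aren't my real ones):

    $export = '';
    foreach ($rows as $row) {
        // One tab-delimited line per record; Excel opens this directly.
        $export .= implode("\t", $row) . "\r\n";
    }
    // Parked in the session so a later request can serve it as a download.
    $_SESSION['export_data'] = $export;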
Is it expected that I'd hit this memory limit given the number of records being retrieved? Is bumping up the memory the only practical way to get around it and still get all the records? Or should I somehow "segment" my query - run multiple queries with LIMIT and assemble the results afterward? (If I'm doing that assembly before delivering anything to the browser, am I likely to run into the same problem anyway?)
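By "segment" I mean something like this sketch (hypothetical - chunk size and column list abbreviated, using the old mysql_* calls for illustration), though my suspicion is that accumulating everything into $all_rows puts me right back where I started:

    $all_rows = array();
    $chunk    = 1000;
    $offset   = 0;
    do {
        $sql = "SELECT project_time.time_id, project_time.project_id, project_time.date, "
             . "project_time.hours FROM project_time ORDER BY date LIMIT $offset, $chunk";
        $result = mysql_query($sql);
        $count  = 0;
        while ($row = mysql_fetch_assoc($result)) {
            $all_rows[] = $row;  // still holding every record in memory at once
            $count++;
        }
        mysql_free_result($result);
        $offset += $chunk;
    } while ($count == $chunk);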
Any guidance will be most appreciated!