Hold on, I've just re-thought that last part: if I always include the ID as part of my ORDER BY, I can get around it:
"WHERE ID > " . $intLastID . " ORDER BY Price DESC, ID ASC LIMIT 10"
I really should read the stuff I link to properly. :o
Nope: because you can have IDs that are higher than the last ID while the price is lower (or higher), meaning you're now filtering out results, because their IDs aren't in the same sequence as their prices. Example:
ID - Price
1 - 2.50
2 - 3.50
3 - 1.00
4 - 0.99
5 - 1.50
SELECT * FROM <table> ORDER BY Price ASC LIMIT 2 gives IDs (4, 3), so page 2 would be SELECT * FROM <table> WHERE ID > 3 ORDER BY Price ASC LIMIT 2, which gives IDs (4, 5): ID 4 is repeated, and you've now lost IDs (1, 2) entirely.
Yes, you're right, it doesn't work.
Reading through the document I linked to earlier, this approach also doesn't work with direct page navigation; this sort of thing:
< First Page < Previous Page 1 2 3 4 5 6 7 8 9 10 Next Page > Last Page >
To some extent, some of this might be academic when we're talking about tables with a few hundred, or even a few thousand records, rather than tens or hundreds of thousands.
Derokorian wrote:Nope: because you can have IDs that are higher than the last ID while the price is lower (or higher), meaning you're now filtering out results, because their IDs aren't in the same sequence as their prices.
Indeed. One solution that comes to mind is to handle those that are of the same price in one query (in which case filtering by ID alone is fine), then the rest in another query (in which case filtering by price alone is fine), and UNION the two queries.
xerxes wrote:it also doesn't work with direct page navigation either
Yes, that's why the examples I gave in post #2 were previous/next links and scrolling for a mobile app (though in retrospect, you could do it for "load on demand" as you scroll down the page on a normal browser too).
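The UNION idea above can be sketched in SQL against the sample data from earlier in the thread (table name `products` is assumed; the last row shown on page 1 was Price 1.00, ID 3):

```sql
-- First part: remaining rows at exactly the last price, filtered by ID alone.
-- Second part: rows at a strictly later price, filtered by price alone.
(SELECT ID, Price FROM products
 WHERE Price = 1.00 AND ID > 3)
UNION ALL
(SELECT ID, Price FROM products
 WHERE Price > 1.00)
ORDER BY Price ASC, ID ASC
LIMIT 2;
-- Page 2 of the sample data: IDs (5, 1) - nothing repeated, nothing lost.
```

This relies on the tie-break `ORDER BY Price ASC, ID ASC`. MySQL also accepts the same condition written as a row comparison, `WHERE (Price, ID) > (1.00, 3)`, which is shorthand for `Price > 1.00 OR (Price = 1.00 AND ID > 3)`.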
good to see other guys getting into the topic..
but i must say i've found my way to be so convenient.. so far i've updated one of my projects (an inventory system), which has lots of complicated SELECT queries, and they all seem to work well, EXCEPT for queries with GROUP BY clauses.. i haven't worked on a solution for that yet, but a simple call to SQL_CALC_FOUND_ROWS would do the trick for those rare occasions.
to repeat, my function is simple..
1) Throw any SELECT statement at it and it will generate a COUNT statement from it (so long as it doesn't have a GROUP BY clause.. there are probably other clauses that may cause issues, i'll find out when it happens lolz)
2) the function will split the statement at the first " FROM " keyword. no more regex, or fiddling with dynamic constructs, etc..
//if rowlimit is set and greater than 0, apply the limiter
if( $rowlimit > 0 ){
    if( $t_page > 0 ){ //if curpage or targetpage is set, attempt to paginate
        //make sure the statement doesn't have a GROUP BY clause
        if( !stristr($q, "GROUP BY") ){ //no GROUP BY clause, so do the paging logic
            //'/FROM.*(?=GROUP BY|ORDER BY|LIMIT|$)/sU' regex kept for later use if needed
            $q_parts = explode( " FROM ", strtoupper($q), 2 ); //split at the first " FROM " only
            $q_count = "SELECT COUNT(*) AS row_count FROM " . $q_parts[1];
            $col_r = query_select($q_count); //query_select() does a quick single-row select
            $row_count = $col_r['row_count'];
            //calculate pages
            $page_count = ceil($row_count / $rowlimit); //eg. 100 / 10 = 10 pages
            if( $t_page <= $page_count ){
                $startrow = ($t_page * $rowlimit) - $rowlimit; //because LIMIT offsets start at zero
            }else{
                $startrow = ($page_count * $rowlimit) - $rowlimit; //clamp to the last page
            }
        }else{ //a GROUP BY statement requires counting off a subquery
            //etc etc etc...
note: $col_r = query_select($q_count); <-- query_select() is just my function that does a quick single-row select for me
3) after that, the function will return 2 variables: page (the current page it's on) and pagecount (total pages)
4) the rest of the script simply generates next, prev, and the rest of the pages 1 2 3 etc..
that's it, it's so simple. the key really is just getting the COUNT thing to run. of course i have obviously ignored the GROUP BY stuff for now, but one doesn't really have a choice when it comes to that.. COUNT will not work at all hehe.
the key is all in this little snippet:
$q_parts = explode( " FROM ", strtoupper($q) , 2); //limit to 2 only
$q_count = "SELECT count(*) AS row_count FROM " . $q_parts[1];
$col_r = query_select($q_count); // query_select() does a quick single-row select
hehehe.
PS
as i'm always for simplicity, i would rather do the LIMIT start, rowlimit thing than mess with primary IDs and bother with sorting.. and the fact that each query may have a different primary ID name to target.. it's just too impractically messy..
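Tea_J's split-at-FROM trick can be sketched as a pair of standalone helpers. This is a sketch only, with hypothetical names (`build_count_query()`, `paginate()`), and it assumes the query has no GROUP BY and exactly one top-level FROM; it uses a case-insensitive split instead of upper-casing the whole query, so string literals inside the SQL are left untouched:

```php
<?php
// Sketch: build a COUNT statement from an arbitrary SELECT by splitting
// at the first " FROM ". Hypothetical helper names; no GROUP BY support.

function build_count_query(string $q): string {
    // Split at the first FROM only, so any FROM in a subquery survives.
    $parts = preg_split('/\sFROM\s/i', $q, 2);
    return "SELECT COUNT(*) AS row_count FROM " . $parts[1];
}

function paginate(int $row_count, int $rowlimit, int $t_page): array {
    $page_count = (int)ceil($row_count / $rowlimit);
    $page = min(max($t_page, 1), $page_count); // clamp to a valid page
    $startrow = ($page - 1) * $rowlimit;       // LIMIT offsets start at zero
    return [$page, $page_count, $startrow];
}
```

Used together: run `build_count_query($q)` once for the total, feed the count into `paginate()`, then append `LIMIT $startrow, $rowlimit` to the original query.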
Tea_J wrote:i have obviously ignored the GROUP BY stuff for now but one doesn't really have a choice when it comes to that.. COUNT will not work at all hehe.
You could replace the grouped columns (or expressions) with selecting distinct columns/expressions - but that might not be simple.
it's just too impractically messy..
It might be difficult getting a full solution (assuming you don't find someone else who's already done it better than you could), but you wouldn't have to do it again.
Weedpacket;11027539 wrote:You could replace the grouped columns (or expressions) with selecting distinct columns/expressions - but that might not be simple.
yeh.. i'll fight that logic another day.. lolz. for now i've set my routine to offer NEXT/PREV options only (one of laserlight's suggestions), without pagination, should the query have a GROUP BY clause lolz
It might be difficult getting a full solution (assuming you don't find someone else who's already done it better than you could), but you wouldn't have to do it again.
true, but at this point the issue has been solved with much simplicity, so why bother with other ways to do it.. unless there's a real need of significance, then i'd explore that option.
Tea_J wrote:unless there's a real need w/ significance, then i'd explore that option.
Well, there was, but you decided it was too hard
it's just too impractically messy..
It might be difficult getting a full solution (assuming you don't find someone else who's already done it better than you could), but you wouldn't have to do it again.
To add to what Weedpacket said, you might spend several hours coming up with a really good solution for a particular issue, then you can perhaps turn it into a function, or possibly just a code snippet that can be quickly modified by assigning values to a few key variables. If it's for a very common activity, like recordset paging, you might re-use that function or chunk of code many times across dozens of different web sites.
When I think back to all the ASP sites I've built, apart from the occasional site for something a little out of the ordinary, probably only 10% of each one is truly original; the rest of the site is re-used code, or essentially an adaptation of a similar previous site with a different interface and a different page layout etc., so it's well worth investing the time to get something right if you might re-use it often.
Weedpacket;11027545 wrote:Well, there was, but you decided it was too hard
lolz.. but .. but.....
To add to what Weedpacket said, you might spend several hours coming up with a really good solution for a particular issue, then you can perhaps turn it into a function, or possibly just a code snippet that can be quickly modified by assigning values to a few key variables. If it's for a very common activity, like recordset paging, you might re-use that function or chunk of code many times across dozens of different web sites.
When I think back to all the ASP sites I've built, apart from the occasional site for something a little out of the ordinary, probably only 10% of each one is truly original; the rest of the site is re-used code, or essentially an adaptation of a similar previous site with a different interface and a different page layout etc., so it's well worth investing the time to get something right if you might re-use it often.
true, hence i started this topic.. the pagination engines i've been doing were manually coded per need; this time i decided to code a universal paging solution (instead of using paging frameworks).. i started this thread hoping someone would say "it's ok to fetch all rows and calculate paging from that, as it doesn't put a dent in server performance unless you actually fetch each row (mysql_fetch_row)".. but as expected that's a bad way, hence i'm sticking with the harder, but more efficient, way of doing things = LIMIT..
PS
i wish the next MySQL update would have a function for FOUND_ROWS() without having to issue the SQL_CALC_FOUND_ROWS clause.. that would really simplify things so much more..
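For reference, the current MySQL pairing looks like this (table name `products` assumed; note it forces the server to work out the full, un-LIMITed result count):

```sql
-- Ask MySQL to count the full result set while still returning one page.
SELECT SQL_CALC_FOUND_ROWS ID, Price
FROM products
ORDER BY Price ASC
LIMIT 10 OFFSET 20;

-- Then, on the same connection, immediately afterwards:
SELECT FOUND_ROWS() AS row_count;
```

FOUND_ROWS() is per-connection and only valid straight after the SQL_CALC_FOUND_ROWS query, which is exactly the two-step dance being complained about above.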
PS
i wish the next MySQL update would have a function for FOUND_ROWS() without having to issue the SQL_CALC_FOUND_ROWS clause.. that would really simplify things so much more..
Indeed, there are a number of differences between PHP/mysqli/PDO/MySQL and ASP/ADO/Microsoft SQL Server, with which I'm more familiar, that have me wondering whether I'm doing things the right/best way.
For example, in ASP/ADO/MS SQL, if you wanted to retrieve data to use in a drop-down, or you wanted a limited number of records per page (similar to what you'd get with LIMIT), you would use GetRows(): http://www.adopenstatic.com/experiments/recordsetpaging.asp which lets you fetch a whole result set into a 2-dimensional array. This was generally considered more efficient than the more common way of returning a recordset and then looping through the results, outputting the values at each step of the loop, because there are far fewer calls to the DB and you close the connection earlier:
Looping through recordset:
Open connection to DB
SELECT statement
WHILE NOT end of recordset
    Output row result
WEND
Close connection

Using GetRows():
Open connection to DB
SELECT statement
Fetch all the results into array
Close connection
Loop through array
I've tried to mimic this with PHP/PDO/MySQL using fetchAll(): http://php.net/manual/en/pdostatement.fetchall.php and that part is probably OK, because I'm only using it to fetch tens of records, not hundreds or thousands. Where it falls down is that using LIMIT with a start position incurs a cumulative performance hit, whereas ADO's GetRows() used in conjunction with Move() doesn't. I can find no equivalent to Move(), which is a shame, because if we could drop the start position from LIMIT and use it in conjunction with Move(), we could forget all this nonsense about constructing WHERE clauses to work out where we left off, and keep things like direct page navigation.
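The fetchAll() pattern described above can be sketched with PDO. This uses an in-memory SQLite table purely so the sketch is self-contained; against MySQL only the DSN and credentials change:

```php
<?php
// Sketch of the GetRows()-style pattern with PDO: run the query, pull the
// whole (small) page of results into a 2-D array, then work from the array
// so the statement/connection can be released early.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Sample data from earlier in the thread.
$pdo->exec("CREATE TABLE products (ID INTEGER PRIMARY KEY, Price REAL)");
$pdo->exec("INSERT INTO products (ID, Price) VALUES
            (1, 2.50), (2, 3.50), (3, 1.00), (4, 0.99), (5, 1.50)");

// One round trip: fetch the page into an array.
$stmt = $pdo->prepare("SELECT ID, Price FROM products ORDER BY Price ASC LIMIT 2");
$stmt->execute();
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
$stmt = null; // done with the live result set

// Loop through the array instead of the live recordset.
foreach ($rows as $row) {
    printf("%d - %.2f\n", $row['ID'], $row['Price']);
}
// Prints:
// 4 - 0.99
// 3 - 1.00
```

As with GetRows(), this is only sensible when the result set is small enough to hold in memory; it does nothing about the cumulative cost of a large LIMIT offset.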
Unless of course anyone knows different.