I have a question about how efficiently dates sort under each of these methods of storing date information.
If you have a 14-character MySQL timestamp field (YYYYMMDDHHMMSS), an obvious benefit for sorting is that you can simply say: if one date > another date, then the 'greater' date is in the future with respect to the other, lesser, timestamp. In a frequently occurring situation such as a search form where the user enters a date range to restrict the returned records, it is easy to build a query that selects the appropriate rows based on the chosen dates (a sketch follows below). I've also implemented this scheme in Oracle, simply using a 14-character string field and assembling the timestamp strings in code.
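As a minimal sketch of what I mean (the table and column names, orders and created_at, are made up for illustration), the range restriction becomes a plain string comparison:

    -- Hypothetical schema: created_at holds a 14-character
    -- YYYYMMDDHHMMSS string, so lexical order equals chronological
    -- order and a BETWEEN on the strings does the date-range filter.
    SELECT order_id, created_at
    FROM   orders
    WHERE  created_at BETWEEN '20010101000000' AND '20011231235959'
    ORDER  BY created_at;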
The Oracle date type appears to store date information in a human-readable string format (making it easy to display, as there is no manipulation to be done), and Oracle can sort these date types as well as MySQL does, using TO_CHAR(Date_field, 'YYYYMM…'). However, it would stand to reason that for any database to sort data stored like this, there would have to be some sort of nested string matching along the way, which seems woefully inefficient. (January would come after February in a straight sort if the record is stored as 'January 1, 2001', since 'F' sorts before 'J'.)
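To make the comparison concrete (same hypothetical table as above, with created_at now an Oracle DATE column; I'm spelling out a full format mask purely as an example), the two orderings I'm weighing look like this:

    -- Sorting through a string conversion, as described above:
    SELECT order_id, created_at
    FROM   orders
    ORDER  BY TO_CHAR(created_at, 'YYYYMMDDHH24MISS');

    -- Versus sorting on the DATE column directly; if Oracle compares
    -- DATE values natively, this form should avoid string matching:
    SELECT order_id, created_at
    FROM   orders
    ORDER  BY created_at;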
I'm wondering if anyone has more information on whether it is the Oracle 'monitor' that does the work of displaying these dates in a human-readable format (but I doubt this, since they appear the same in third-party tools like TOAD), or whether sorting on the Oracle date type is just slower. I'm dying to know how this works before I implement it in a solution that would have to scale to many thousands of records and users.