Hi
I run an open source CMS and need a little help/advice. The system collects statistics into a database table; after a while this table gets quite large, and on busy sites it can get stupidly big. To reduce server load I introduced caching: the statistics were read and stored into text files, and then only rebuilt once a week.
Although this reduces server load, it doesn't reduce the table size, so I started coding more caching. This version collects all the data in the SQL table and puts the current results into text files. That data can then be combined with new data stored in the database. Once a month all the data is collated and put into archive text files. So:
1. Check the archive files exist and weren't created more than a month ago.
2. If the files don't exist, create them and store the data in them.
3. If they exist but are over a month old, fetch their contents, run SQL to get the previous month's data, and collate the two into the archive files.
4. If they exist and ARE less than a month old, read the data, collate it with the data from the DB table, and cache the result into a weekly stats file.
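To make sure I've understood the flow, here's a rough sketch of those steps (in Python rather than the CMS's own language, and with made-up names: `archive.json`, `weekly.json`, a SQLite table called `stats`, and counts keyed by browser are all my assumptions, not from the actual code):

```python
# Sketch of the archive/cache decision flow described above.
# Assumptions (not from the post): archive lives in "archive.json",
# the weekly cache in "weekly.json", and stats sit in a table called
# "stats" -- the real CMS presumably uses MySQL and its own file names.
import json
import os
import sqlite3
import time

MONTH = 30 * 24 * 3600  # roughly one month, in seconds


def fetch_counts(db):
    """Aggregate the current rows in the stats table by browser."""
    rows = db.execute("SELECT browser, COUNT(*) FROM stats GROUP BY browser")
    return dict(rows.fetchall())


def collate(old, new):
    """Merge two {browser: count} dicts by summing the counts."""
    merged = dict(old)
    for key, count in new.items():
        merged[key] = merged.get(key, 0) + count
    return merged


def update_stats(db, archive="archive.json", weekly="weekly.json"):
    counts = fetch_counts(db)
    if not os.path.exists(archive):
        # 1. No archive yet: create it from the current table contents.
        with open(archive, "w") as fh:
            json.dump(counts, fh)
    elif time.time() - os.path.getmtime(archive) > MONTH:
        # 2. Archive is over a month old: fold last month's data into it.
        with open(archive) as fh:
            archived = json.load(fh)
        with open(archive, "w") as fh:
            json.dump(collate(archived, counts), fh)
    else:
        # 3. Archive is fresh: combine it with new rows into the weekly cache.
        with open(archive) as fh:
            archived = json.load(fh)
        with open(weekly, "w") as fh:
            json.dump(collate(archived, counts), fh)
```

If that matches what the real code does, the main things I'd double-check are file locking (two requests hitting the month boundary at once) and clock/timezone handling on the file's modification time.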
The table schema is:
id
browser
ip
received
refer
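For the step that pulls the recent rows out of this table, the query could look something like the sketch below. This is only a guess at the intent: the post doesn't give column types, so I'm assuming `received` holds a Unix timestamp and the table is named `stats`.

```python
# Hypothetical "rows since the archive was last written" query.
# Assumes `received` is a Unix timestamp and the table is `stats`
# (neither is stated in the post).
import sqlite3
import time

MONTH = 30 * 24 * 3600  # roughly one month, in seconds


def rows_since(db, cutoff):
    """Fetch all stats rows received on or after `cutoff` (Unix time)."""
    cur = db.execute(
        "SELECT id, browser, ip, received, refer FROM stats WHERE received >= ?",
        (cutoff,),
    )
    return cur.fetchall()


# e.g. everything logged since the archive files were last written:
# rows = rows_since(db, time.time() - MONTH)
```

If `received` is actually a DATETIME string rather than an integer, the comparison would need a date function instead.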
The code is:
I've tested it locally on 60k+ rows and I THINK it works, but I'd like some additional thoughts. I can supply the stats SQL if anyone would like to test it for me.
Thanks
Ashley
PS. Live stats at 1024cms.com/index.php?p=stats