Hi there everyone,

I'm trying to get a script to run that checks the ages of files in a given directory and deletes those files found to be older than x days. Thanks to someone on the web, I got a great start with this script:

<?php
$expiretime = 4320; // expiry time in minutes

$tmpFolder = "/home/content/73/7682073/html/accountbackups/";
$fileTypes = "*.gz";

foreach (glob($tmpFolder . $fileTypes) as $Filename) {

    // Read file creation time
    $FileCreationTime = filectime($Filename);

    // Calculate file age in seconds
    $FileAge = time() - $FileCreationTime;

    // Is the file older than the given time span?
    if ($FileAge > ($expiretime * 60)) {

        // Now do something with the older files...

        //print "The file $Filename is older than $expiretime minutes\n";

        // Delete the file:
        unlink($Filename);
    }

}
?>

The files I'm checking and deleting are site backup files and can be as large as 10 GB. I'm getting this error:

Warning: filectime() [function.filectime]: stat failed for /home/content/73/7682073/html/accountbackups/backup-4.9.2011_03-30-06_husaberg.tar.gz in /home/content/73/7682073/html/accountbackups/delete_backups.php on line 10

I found this comment on the PHP manual page for filemtime():

If PHP's integer type is only 32 bits on your system, filemtime() will fail on files over 2GB with the warning "stat failed". All stat()-related commands will exhibit the same behavior.

As a workaround, you can call the system's stat command to get the modification time of a file:

On FreeBSD:
$mtime = exec ('stat -f %m '. escapeshellarg ($path));

On Linux:
$mtime = exec ('stat -c %Y '. escapeshellarg ($path));

Thanks to "mpb dot mail at gmail dot com" for his/her similar comment on stat().

So I think that it's failing because of the size of the files. Unfortunately, the botched running of the script deleted all of my backups, so I don't currently have any files to test with, but wanted to get a head start on this if at all possible 🙂

Would this alteration give me what I need? I should note that I'm OK with checking the modification time rather than the creation time, since the backups aren't modified on the server, so the two timestamps should be identical.

<?php
$expiretime = 4320; // expiry time in minutes

$tmpFolder = "/home/content/73/7682073/html/accountbackups/";
$fileTypes = "*.gz";

foreach (glob($tmpFolder . $fileTypes) as $Filename) {

    // Read the file's modification time
    // $FileModTime = filectime($Filename);  // old method, fails on files over 2 GB
    $FileModTime = exec('stat -c %Y ' . escapeshellarg($Filename));  // new method via the system stat command

    // Calculate file age in seconds
    $FileAge = time() - $FileModTime;

    // Is the file older than the given time span?
    if ($FileAge > ($expiretime * 60)) {

        // Now do something with the older files...

        //print "The file $Filename is older than $expiretime minutes\n";

        // Delete the file:
        unlink($Filename);
    }

}
?>
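
One more thought, though I can't verify it without real files to test on: exec() returns its result as a string, and if the stat call ever fails or returns nothing, the age works out to "now minus zero" and everything gets deleted again. So inside the foreach I was thinking of something like this instead (the (int) cast, the zero check, and the dry-run print are just my own untested guesses):

// Untested sketch: would replace the stat/age/delete lines inside the foreach above.
$FileModTime = (int) exec('stat -c %Y ' . escapeshellarg($Filename));

// If stat returned nothing usable, skip the file rather than risk deleting it.
if ($FileModTime === 0) {
    continue;
}

$FileAge = time() - $FileModTime;

if ($FileAge > ($expiretime * 60)) {
    // Dry run: print first, then switch back to unlink() once the output looks right.
    print "Would delete $Filename (about " . round($FileAge / 86400, 1) . " days old)\n";
    // unlink($Filename);
}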

Thanks in advance for any light you can shed on my problem.

thanks,
json

    There is a setting in the php.ini file for the maximum execution time of a PHP script. You may want to check that... As a bit of advice for the future: you really shouldn't be using PHP for system operations like this. Yes, it has functions for it, but it wasn't built for it. You would be much better off writing something in Python for Linux or VBScript for Windows. Just a tip...

      Thanks very much for your reply. However, since this is on a shared host with no access to php.ini or server-side solutions, this is actually a pretty good method of handling what I need to do. So if anyone can address my problem, I would be eternally grateful!

      Thanks,
      Json

        Just remembered you can extend that execution time with this:

        ini_set('max_execution_time', 300);
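
        And if ini_set() happens to be disabled on that host, set_time_limit() does the same job and might still be allowed (untested on your setup, obviously):

        set_time_limit(300); // same effect as ini_set('max_execution_time', 300)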

          Any reason why you're doing this with PHP at all when a simple one-line shell command would suffice?

          EDIT: Hm, guess I need to start paying attention again to how old my open tabs are before posting. :o

          EDIT2: And besides, "I'm on a shared host" doesn't really cut it anymore; of the two shared hosts I've been with most recently, both offered the ability to schedule cron jobs to do this type of task automatically.

            Thanks to everyone for your input. After testing, it seems I've figured out the hiccups and have got it up and running.

            Thanks again,
            json
