Is this possible?

I want to perform a task with a PHP file (updating a feed), and I want it to happen automatically once a day. I DON'T want to have to load the file in a browser by hand. The file itself could be anything (it's very small and fast), but it needs to be run every day.

The trouble is, I don't have access to my command line or to automated tasks via my web host. So I can't run a cron job, right?

Two thoughts were running the file as an infinite loop using the sleep() function (unnecessary load on the server / buggy / timeout problems / overall a bad solution?) or getting a third party to load it for me. (Is such a service available?)

Do any gurus out there know a way to cleverly sidestep this problem? I really don't want to have to pay more for access to my command line (I run a very small photo site - markraymondmason.com), and I have a feeling I should be able to do it some other way.

Thanks...

    markraymondmaso wrote:

    Do any gurus out there know a way to cleverly sidestep this problem?

    Get a better host.

      No, seriously. How much are you paying per month? It doesn't take an expensive host to get features like cron jobs... if you have a host that's not up to snuff, drop 'em!

        I'm paying $US 2.99 / mo. It's got everything else I need: plenty of storage, traffic, speed, domain names, etc. I could upgrade within the company (1and1.com) to a package that gives me command line access for about $US 8 / mo. They seem pretty good; I'm just paying for a basic package.

        All I'm doing is adding a little RSS feed, though, and it seems crazy to pay $5 more when all my other needs are met.

        I was hoping someone would give me the name of a third party that would load an existing file every day on a schedule - there must be something like that around. There are sites that process form data for free, why not ones that run scripts? Heck, it doesn't even have to be the same time every day, and it could even miss the odd day for all I care!

          What you could do is run your feed through PHP, rather than pointing directly to the .xml file.

          That way, every time the feed is called, PHP could check the file's modification time ([man]filemtime[/man]). If it hasn't been modified in a day (or whenever), do the necessary processing to update it, write the updated info to file, and then echo out the new contents to whoever requested the RSS feed at the time.
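
          Roughly something like this, as a sketch only (feed.xml and build_feed_xml() are just made-up names for your file and whatever routine updates it):

            <?php
            // A rough sketch of the idea above
            $feedFile = 'feed.xml';

            // If the file hasn't been modified in a day (or doesn't exist yet), rebuild it
            if (!file_exists($feedFile) || (time() - filemtime($feedFile)) > 86400) {
                $xml = build_feed_xml();            // whatever processing updates your feed
                file_put_contents($feedFile, $xml); // write the updated info back to the file
            }

            // ...then echo out the current contents to whoever requested the feed
            echo file_get_contents($feedFile);
            ?>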

            Yeah, that's the stuff - good thinking.

            OK, so a feed can point to a PHP file that echoes contents back instead of an .xml file - I didn't know that.

            Is the procedure the same, i.e. the feed address is just RSSfeed.php instead of RSSfeed.xml, with all the pertinent data just stored in a little text file?

              That sounds like the best solution, but in general, if you want to run a server's PHP file on a schedule, just set up a cron job or scheduled task on your local computer to run wget. On Windows you can use a Windows build of wget.
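
              For example, a crontab entry on your own machine might look something like this (the URL is just an example):

                # fetch the page once a day at 6:15 AM so the server-side PHP runs
                15 6 * * * wget -q -O /dev/null http://www.example.com/RSSfeed.php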

                Never thought of that... sounds like that might be better.

                markraymondmaso - That's pretty much all you'd need, yeah, with perhaps the addition of a [man]header[/man] call to set the Content-Type to whatever XML's content type is.
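
                Something like this at the top of the script, before any output (application/rss+xml is the type commonly used for RSS feeds; plain application/xml would also work):

                  <?php
                  // Tell the client it's getting an RSS/XML document, not HTML
                  header('Content-Type: application/rss+xml; charset=utf-8');
                  // ...then echo out the feed contents as above
                  ?>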

                  I just tried a junior version of your idea, bradgrafelman, and it works splendidly. I think I'll just test for the time since the last update, update the file if necessary, then just pass the user on through to the XML to keep the PHP as lightweight as possible.

                  If that doesn't work for some reason, I'll just echo the contents back like you said. Works like a charm.
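
                  For what it's worth, here's roughly what my "junior version" looks like - a sketch only, and build_feed_xml() is just a stand-in for my own update routine:

                    <?php
                    // RSSfeed.php - update the file if it's stale, then hand off to the .xml
                    $feedFile = 'RSSfeed.xml';

                    if (!file_exists($feedFile) || (time() - filemtime($feedFile)) > 86400) {
                        $xml = build_feed_xml();            // stand-in for the real update code
                        file_put_contents($feedFile, $xml);
                    }

                    // pass the reader on through to the static XML file
                    // (a full absolute URL is technically more correct here)
                    header('Location: RSSfeed.xml');
                    exit;
                    ?>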

                  Thanks, eh?
