This is a waste of processing cycles. Instead of reading from the database, writing back to the database, and causing extra work for the server, why not simply calculate the number of horses based on the time that the record was created?
For example, if user B has 3 horses, don't write any cron job and don't do any automatic updating until the next time the data is actually needed. If user B logs into his account 13 days later and you want to show him how many horses he has, simply read the stored value from the database (he had 3), count the 13 elapsed days, and since the count doubles once per day he now has 3 * (2 ^ 13).
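Here's a rough sketch of what that read-time calculation could look like (Python just for illustration; the names count_at_last_update and last_update are made up, and I'm assuming the herd doubles once per whole elapsed day):

```python
from datetime import datetime, timedelta, timezone

def current_horse_count(count_at_last_update: int, last_update: datetime) -> int:
    """Lazily derive the current count from what the database already stores.

    Assumes the herd doubles once per whole elapsed day; nothing rewrites the
    row in between, so the stored value only changes when the user acts.
    """
    elapsed_days = (datetime.now(timezone.utc) - last_update).days
    return count_at_last_update * 2 ** max(elapsed_days, 0)

# User B was last written to the database with 3 horses, 13 days ago:
last_update = datetime.now(timezone.utc) - timedelta(days=13)
print(current_horse_count(3, last_update))  # 3 * 2**13 = 24576
```

The row only needs to be rewritten when something other than plain growth happens; at that point you store the freshly computed count and reset the timestamp.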
While this might seem like a "who cares, it's only a hundredth of a second each day" kind of problem, I assure you it's not. Imagine if instead of horses we were talking about bacteria, or something else that needs to be updated once every five seconds, and we had 200,000 records in the database. Instead of making the computer try to keep up with updating 200,000 records every five seconds, why not simply wait until a record is needed and calculate the change at that point?
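The same idea generalizes to any interval. A hypothetical sketch (again Python, made-up names, and I'm assuming a simple "multiply by some factor every N seconds" growth rule):

```python
from datetime import datetime, timezone

def lazy_grown_value(stored_value: float, last_update: datetime,
                     growth_factor: float, interval_seconds: float) -> float:
    """Compute the up-to-date value only when a record is actually read.

    Instead of a job that rewrites 200,000 rows every five seconds, multiply
    the stored value by growth_factor once per elapsed interval at read time.
    """
    elapsed = (datetime.now(timezone.utc) - last_update).total_seconds()
    intervals = int(elapsed // interval_seconds)
    return stored_value * growth_factor ** max(intervals, 0)
```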
If we do the calculation every 5 seconds for 200,000 records, we're going to need to buy 3-4 more computers and build a Beowulf cluster, we're going to need to RAID some of the fastest disk drives in the world, and those disk drives are going to wear out from all the abuse. If we do the calculation only when it's needed, we'll have more computing power than we need with ONE really old piece of hardware and ONE very slow hard drive.