If you have memory to burn, and your databases aren't too big,
you could stick everything into one big array and serialize that to save it to disk.
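A minimal sketch of that approach (the file name db.dat and the record layout are just placeholders):

```php
<?php
// Hypothetical "database": an associative array of records keyed by id.
$db = [
    1 => ['name' => 'Alice', 'email' => 'alice@example.com'],
    2 => ['name' => 'Bob',   'email' => 'bob@example.com'],
];

// Save: serialize the whole array and write it to disk in one go.
file_put_contents('db.dat', serialize($db));

// Load: read the file back and unserialize it into memory.
$db = unserialize(file_get_contents('db.dat'));

// Change the data in memory, then save the whole thing again.
$db[3] = ['name' => 'Carol', 'email' => 'carol@example.com'];
file_put_contents('db.dat', serialize($db));
```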
Otherwise you could go for simple CSV-type files, where the data is separated by commas.
You could then read the file into memory line by line (or into an array if memory allows it) and explode the lines into arrays so you can get at the data.
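Roughly like this, assuming one record per line and a plain comma as the separator:

```php
<?php
// Read a CSV-style file line by line and split each line on commas.
$handle = fopen('db.csv', 'r');
if ($handle === false) {
    die('Could not open db.csv');
}

$records = [];
while (($line = fgets($handle)) !== false) {
    // e.g. "2,Bob,bob@example.com" => ['2', 'Bob', 'bob@example.com']
    $records[] = explode(',', rtrim($line, "\r\n"));
}
fclose($handle);
```

If your data can contain quoted fields or embedded commas, fgetcsv() is a safer choice than a plain explode().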
The big problems are:
Locking: only one script can/should access the file at a time (see the flock() sketch after this list).
Writing: you can't just start writing in the middle of a file, so you must read the entire file into memory, change the data, and overwrite the existing file with the new data. So if you have a 30 MB database, you are rewriting the full 30 MB for every change (except new records, which can be appended); a sketch of that is below as well.
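For the locking part, flock() is the usual tool; a sketch (again with a placeholder file name):

```php
<?php
// Take an exclusive lock before touching the file, so only one
// script at a time can read or write it.
$handle = fopen('db.csv', 'c+'); // read/write, create if missing
if ($handle === false) {
    die('Could not open db.csv');
}

if (flock($handle, LOCK_EX)) {   // blocks until the lock is free
    // ... read / modify / write the file here ...
    fflush($handle);             // make sure changes hit the disk
    flock($handle, LOCK_UN);     // release the lock
}
fclose($handle);
```

Note that flock() is advisory: it only helps if every script that touches the file plays by the same rules.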
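And this is roughly what an in-place change ends up looking like (the record layout is made up for the example):

```php
<?php
// Updating one record means pulling the whole file into memory
// and writing all of it back out.
$lines = file('db.csv', FILE_IGNORE_NEW_LINES);
foreach ($lines as $i => $line) {
    $fields = explode(',', $line);
    if ($fields[0] === '2') {               // the record we want to change
        $fields[2] = 'new@example.com';
        $lines[$i] = implode(',', $fields);
    }
}
file_put_contents('db.csv', implode("\n", $lines) . "\n", LOCK_EX);

// Appending a new record is cheap by comparison:
file_put_contents('db.csv', "3,Carol,carol@example.com\n", FILE_APPEND | LOCK_EX);
```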