Hello,
My name is Gustavo Silva, and I am a Portuguese web designer / PHP programmer.
I've been using PHP for a while now, and I thought I was capable of building something that was on my mind with it. When I started it seemed quite easy, but then the problems started to appear.
Here's the scope: I am building a Nokia logo generator. I want to generate all the possibilities that a 72×14 Nokia operator logo pixel matrix can have.
I'm using MySQL to store the results.
Since the possibilities are enormous (we're talking about 2^1008 of them, roughly 10^303, since each of the 1,008 pixels can be on or off), I've decided to simplify that by splitting the problem into parts.
(Before I continue, I must say that I am not Stephen Hawking or some similar genius; I'm just trying to find out whether this is possible, based on my own simple logic and abilities.)
I've split the matrix into 56 parts of 18 pixels each.
Each part has its own possibilities: 2^18 = 262,144 states that 18 pixels can assume.
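As a sanity check: 56 parts × 18 pixels = 1,008 pixels, and (2^18)^56 = 2^(18 × 56) = 2^1008, which matches the total number of possibilities for the full matrix.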
After generating them all and storing them in a MySQL table, I proceeded to the algorithmic part, which generates a random array of 56 numbers, each between 000001 and 262144.
In another table I save the relational result (the combination of 56 part numbers that identifies one complete logo).
Every time the script generates another result, it checks whether an equal one already exists; if so, it generates another one, and if not, it stores and displays the result.
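A minimal sketch of that generate/check/insert loop (the table and column names here are hypothetical; assume a table created as CREATE TABLE logos (combo CHAR(336) NOT NULL PRIMARY KEY), where combo is the 56 six-digit part numbers concatenated):

    <?php
    // Hypothetical sketch of the generate/check/insert loop.
    $db = new mysqli('localhost', 'user', 'pass', 'lgenerate');

    do {
        // build a random array of 56 part numbers between 1 and 262144
        $parts = array();
        for ($i = 0; $i < 56; $i++) {
            $parts[] = sprintf('%06d', mt_rand(1, 262144));
        }
        $combo = implode('', $parts); // 56 * 6 = 336 characters

        // let the PRIMARY KEY reject duplicates: INSERT IGNORE affects
        // 0 rows when the combination already exists
        $stmt = $db->prepare('INSERT IGNORE INTO logos (combo) VALUES (?)');
        $stmt->bind_param('s', $combo);
        $stmt->execute();
        $inserted = ($stmt->affected_rows === 1);
        $stmt->close();
    } while (!$inserted); // duplicate: generate another one

    echo "stored: $combo";

Leaning on the unique key instead of a separate SELECT also avoids the race where two clients generate the same combination between the check and the insert.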
It's working fine, and it has already generated roughly 5,000,000 of them.
I've written a side script so I can distribute tasks: a user who connects to the site logs on and starts generating 500 of them at a time through JavaScript (client-side), then sends the results to the server for processing. (This part is working.)
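The receiving side of that could look something like this (a hypothetical sketch; it assumes the client POSTs its batch as a newline-separated field named combos):

    <?php
    // Hypothetical receiver for client-generated batches.
    $db = new mysqli('localhost', 'user', 'pass', 'lgenerate');

    $stmt = $db->prepare('INSERT IGNORE INTO logos (combo) VALUES (?)');
    foreach (explode("\n", trim($_POST['combos'])) as $combo) {
        $combo = trim($combo);
        // never trust client-side input: exactly 336 digits or nothing
        if (strlen($combo) !== 336 || !ctype_digit($combo)) {
            continue;
        }
        $stmt->bind_param('s', $combo);
        $stmt->execute();
    }
    $stmt->close();
    echo 'ok';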
In a thousand years we will have discovered all the possibilities, although I am counting on hardware evolution and performance increases, as well as growing database capabilities. (I am testing MySQL here!)
My problems right now are:
Will MySQL ever, ever be able to hold a table as LARGE as this: 5,000,000,000,000,000,000 (5 × 10^18) rows? I've read on mysql.com that some users have billions of rows, but how many exactly?
Am I able to have multiple tables linked as one?
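For MyISAM tables this exists: the MERGE storage engine presents several identical tables as a single one. A sketch (the table names are made up):

    <?php
    // Hypothetical: split the rows over identical MyISAM tables and
    // query them through one MERGE table.
    $db = new mysqli('localhost', 'user', 'pass', 'lgenerate');

    $db->query('CREATE TABLE logos_1 (combo CHAR(336) NOT NULL PRIMARY KEY) ENGINE=MyISAM');
    $db->query('CREATE TABLE logos_2 LIKE logos_1');

    // one virtual table over both; inserts go into the last table
    $db->query('CREATE TABLE logos_all (combo CHAR(336) NOT NULL)
                ENGINE=MERGE UNION=(logos_1, logos_2) INSERT_METHOD=LAST');

One caveat: the PRIMARY KEY is only enforced inside each underlying table, so duplicates across logos_1 and logos_2 are not caught by the MERGE table itself.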
Another big problem is storage: 5 million rows already take 2 GB. I have 120 GB, but I do not want to run the server that low on space.
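For scale: 2 GB for 5 million rows is about 400 bytes per row, so the 5 × 10^18 rows mentioned above would take roughly 2 × 10^21 bytes, about two zettabytes, far beyond a single 120 GB disk.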
Is there any way to compress the inserted data but still be able to query the table?
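A few options might help here: MySQL's ARCHIVE storage engine stores rows zlib-compressed (but only supports INSERT and SELECT), and myisampack can compress a finished MyISAM table into a read-only one. You could also simply store each combination more compactly: each part number (1 to 262,144) fits in 3 bytes, so the whole combination fits in a BINARY(168) column instead of 336 digit characters. A hypothetical sketch:

    <?php
    // Pack 56 part numbers into 168 bytes (3 bytes each, big-endian).
    function combo_to_binary(array $parts) {
        $bin = '';
        foreach ($parts as $p) {
            $bin .= chr(($p >> 16) & 0xFF) . chr(($p >> 8) & 0xFF) . chr($p & 0xFF);
        }
        return $bin; // 168 bytes, half the size of the digit string
    }

    // ...and unpack them again for display.
    function binary_to_combo($bin) {
        $parts = array();
        foreach (str_split($bin, 3) as $chunk) {
            $parts[] = (ord($chunk[0]) << 16) | (ord($chunk[1]) << 8) | ord($chunk[2]);
        }
        return $parts;
    }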
How about displaying the results?
How would you code a "next/previous page" script that displays the results based on MySQL's LIMIT? I've made one that is still slow, because it makes an initial query to count all the results and then proceeds based on limits... (I must check it; it should be simple.)
Should I get the total number of rows, store the result in a $var, divide it by the number of results per page, and then make the necessary queries?
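That is roughly the standard approach. A minimal sketch (hypothetical names, reusing the logos table from above):

    <?php
    // Hypothetical pagination over the logos table.
    $db = new mysqli('localhost', 'user', 'pass', 'lgenerate');

    $per_page = 50;
    $page = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;

    // total row count; cache this if possible (COUNT(*) is instant on
    // MyISAM but expensive on a huge InnoDB table)
    $row = $db->query('SELECT COUNT(*) FROM logos')->fetch_row();
    $total = (int) $row[0];
    $pages = (int) ceil($total / $per_page);

    $offset = ($page - 1) * $per_page;
    $stmt = $db->prepare('SELECT combo FROM logos ORDER BY combo LIMIT ?, ?');
    $stmt->bind_param('ii', $offset, $per_page);
    $stmt->execute();
    $stmt->bind_result($combo);
    while ($stmt->fetch()) {
        echo $combo, "<br>\n";
    }
    $stmt->close();

    if ($page > 1)      echo '<a href="?page=' . ($page - 1) . '">previous</a> ';
    if ($page < $pages) echo '<a href="?page=' . ($page + 1) . '">next</a>';

Note that LIMIT with a large offset still walks past all the skipped rows, so deep pages get slower; for very deep paging it scales better to remember the last combo shown and use WHERE combo > ? ORDER BY combo LIMIT 50 instead.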
Please contact me through pseudo@zbronx.com or l-generate@zbronx.com.
You can see a reduced version of the script working on a testing server at http://l-generate.zbronx.com (in Portuguese only; it generates only one result, no mass generation, still under testing at my place 🙂), or talk to me so I can give you the script details.
Thanks in advance.
Long life and be well,
Gustavo