OK, now that we've settled that we CAN have enough rows in most databases, the real issue is designing this thing so it doesn't crawl along at a snail's pace. While you can always throw bigger hardware at it later, start out on a smaller box and try to make the app efficient enough that it doesn't need a big n-way server to work well.
Data Modelling:
The tests, their questions, and their real and fake answers can be stored almost any way you like. You could just pack the data from an array into a single field in the database, using explode/implode in PHP or something like that. If you go the relational route with joins instead, make sure to use primary keys and indexes to keep performance up.
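For instance, a minimal sketch of the packed-field approach (the tests_flat table and the pipe delimiter are just made-up placeholders):

    create table tests_flat (
        id SERIAL PRIMARY KEY,
        title text,
        payload text  -- pipe-delimited: 'question|correct answer|fake 1|fake 2'
    );
    insert into tests_flat (title, payload)
    values ('Geography quiz', 'Capital of France?|Paris|Lyon|Marseille');

On the PHP side you'd implode() the array before the insert and explode() the payload column after the select.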
For the joined approach, e.g.:
Note: SERIAL is PostgreSQL's shorthand for an auto-incrementing integer column; it doesn't make the column a primary key by itself, so tack PRIMARY KEY on too...
'create table tests (id SERIAL PRIMARY KEY, title text, question_id int);'
Indexes cost a lot to maintain, but they save a lot on reads. Assuming the tests and questions won't change often, indexes are probably a good idea.
'create index tests_ndx_b on tests (id, question_id);'
will create a B-tree index in PostgreSQL. Look up the index methods your database supports and try the ones that look interesting.
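In PostgreSQL, for example, you can name the method explicitly; hash is shown here purely as an illustration (btree is the default and usually the right choice):

'create index questions_ndx_h on questions using hash (id);'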
'create table questions (id SERIAL PRIMARY KEY, question text, answers int);'
'create index questions_ndx_b on questions (id, answers);'
Then,
'create table answers (id SERIAL PRIMARY KEY, question_id int, answer text, correct bool);'
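The question_id column is what ties each answer back to its question. With the keys above in place, pulling one test row with its question and answers is a single indexed join, something like:

    select t.title, q.question, a.answer, a.correct
    from tests t
    join questions q on q.id = t.question_id
    join answers a on a.question_id = q.id
    where t.id = 1;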
Now, make tables called scores and students to hold all their stuff, generate some fake data (online articles chop up nicely), stick it in the database, and write some queries from PHP.
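A rough sketch of those two tables plus one query PHP might run (the column choices here are guesses, adjust to taste):

    create table students (id SERIAL PRIMARY KEY, name text);
    create table scores (id SERIAL PRIMARY KEY, student_id int,
        test_id int, score int, taken_at timestamp default now());

    select s.name, avg(sc.score) as average
    from students s
    join scores sc on sc.student_id = s.id
    group by s.name;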
DO NOT create indexes on the scores table, since it will be write-mostly.
You could create and destroy indexes on scores, students, et al. for bulk data loads and report generation during off hours, though.
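For instance, a nightly job could do something like this (the index name and report step are placeholders):

    -- off-hours reporting window
    create index scores_ndx_student on scores (student_id);
    -- ... run the heavy report queries here ...
    drop index scores_ndx_student;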