Since the codes are unique, the id column isn't needed: the code column itself can be the primary key, and the clientCodes table would then use the code as the foreign key.
But then the codes table has only one column, and all the information in it is already present in the clientCodes table, so the entire codes table is redundant.
All that's needed is the clientCodes table: when a client reserves a block of codes, generate them at that point and add records pairing each one with the client to the clientCodes table.
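Something like this, for instance (the clientId column and its type are assumptions; adjust them to the real schema):
CREATE TABLE clientCodes (
    `code` char(6) not null,        -- the code itself is the key; no surrogate id needed
    clientId int unsigned not null, -- assumed to reference a clients table
    PRIMARY KEY(`code`),
    KEY(clientId));                 -- lets a client's whole block be fetched quickly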
If you want to save time allocating codes by not having to check each candidate against the already-used ones before assigning it (a check that would get slower and slower as the number of allocated codes grows), then you could keep a second codes table. Its structure is the same as what you have given, but its function is different: it holds only the not-yet-allocated codes, listed in random order and indexed by an autoincrement primary key. When a block of codes is to be allocated,
SELECT code FROM codes ORDER BY id LIMIT 15000
(or however many); for InnoDB tables, records are physically ordered by primary key, so this query will select a physically contiguous block of records without having to perform an explicit ordering step first.
Once the codes have been allocated (entered into the clientCodes table), delete them from the codes table: then they can't be accidentally allocated to another client. This also makes it easy to see whether you're running short of codes.
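Put together, the allocation might look like this sketch (the client id 42 and the block size are placeholders, and the transaction assumes InnoDB so that the insert and the delete operate on the same locked rows):
START TRANSACTION;
-- pair the first block of unallocated codes with the client
insert into clientCodes(`code`, clientId)
    select `code`, 42 from codes order by id limit 15000;
-- remove the same block from the pool so it can't be handed out twice
delete from codes order by id limit 15000;
COMMIT;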
Generating the codes. 7529536 codes (14 possible characters in each of 6 positions: 14^6 = 7529536) is not a lot to store, but generating them all and shuffling them before inserting takes a bit too much memory, so something like this might work:
CREATE TABLE codes (
id int(11) unsigned not null auto_increment,
`code` char(6) not null,
PRIMARY KEY(id));
CREATE TEMPORARY TABLE letters (c char(1));
insert into letters values ('0'),('1'),('2'),('3'),('4'),('5'),('6'),('7'),('8'),('9'),('a'),('b'),('c'),('d');
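-- the six-way cross join of 14 characters yields all 14^6 = 7529536 codes; order by rand() shuffles them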
insert into codes(`code`) select concat(t1.c, t2.c, t3.c, t4.c, t5.c, t6.c) from letters t1, letters t2, letters t3, letters t4, letters t5, letters t6 order by rand();
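As a quick sanity check afterwards, both of these counts should come back as 7529536:
select count(*), count(distinct `code`) from codes;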
The final codes table would take up on the order of a quarter gig of disk space (7529536 rows at a few dozen bytes each once row and index overhead are counted), though there are compression options that can reduce that.
An entirely different approach would be an incremental method with a step size other than 1. Then only the value currently reached (the last one assigned) needs to be stored: the next can be calculated from it. If the increment used were 4653511, for example, the first several numbers (wrapping around modulo 7529536 as needed) would be
1 => 000001
4653512 => 891c5a
1777487 => 343ab5
6430998 => bd5930
3554973 => 687789
678948 => 139604
5332459 => 9cb45d
That assumes you start from 1; you could start anywhere in the sequence. It's important to use an increment that is relatively prime to 7529536, otherwise chunks of the possible range will be missed. You also want a fairly large increment so that wrapping happens often. 4653511 happens to be the nearest prime to 7529536/φ, where φ is the golden ratio (so maximising the size of gaps between successive terms). (It didn't actually need to be prime, only relatively prime to 7529536 = 14^6 = 2^6·7^6; 4653509 is closer to 7529536/φ, but shares a factor of 7 with 7529536, so using it would hit only 1/7th of the possible codes. Using a prime saved having to make that check.)
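A sketch of that in SQL, assuming a one-row table that holds the last value assigned (the table and column names are made up; CONV() does the base-14 conversion, and LOWER() matches the lowercase codes used above):
CREATE TABLE codeState (last int unsigned not null);
insert into codeState values (1); -- the starting point of the sequence
-- to assign the next code: advance the counter, then render it in base 14
update codeState set last = (last + 4653511) % 7529536;
select lpad(lower(conv(last, 10, 14)), 6, '0') as nextCode from codeState;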
Note that this is certainly not random, and brief inspection would be enough to see what is going on.
Edit: I said it would take "a bit too much memory to generate all the codes and shuffle them first before inserting"; so I did a test:
<?php
$start = microtime(true);
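// build every 6-character base-14 code (0 .. 14^6 - 1), zero-padded to width 6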
$table = array_map(function($c)
{
return str_pad($c, 6, '0', STR_PAD_LEFT);
}, array_map(function($f)
{
return base_convert($f, 10, 14);
}, range(0,7529535)));
shuffle($table);
$end = microtime(true);
echo $end - $start, " seconds\n";
echo memory_get_peak_usage(true);
The result of the above was five minutes and a bit over a gig of memory. Note that the time involved didn't include any database operations.
Edit: Oh, I remember this, ten years ago now. Thought I'd run that last test again: 14 seconds and half a GB.