That would entirely depend on how your script operates. How does it know there are other pages it must gather data from? If there's some way to find the link to the next page, then you could wrap your entire data-gathering routine in a function and call it recursively. Example flow (using a mix of pseudo-code and actual code):
function gatherData($url) {
    // Fetch the raw HTML of the current page
    $page = file_get_contents($url);

    // find_data(), insert_data_into_db() and find_next_page() are
    // placeholders for whatever scraping/storage logic you already have
    $data = find_data($page);
    insert_data_into_db($data);

    // If the page links to a next page, recurse into it;
    // the recursion stops when find_next_page() returns nothing
    if (($nextURL = find_next_page($page))) {
        gatherData($nextURL);
    }
}
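Kicking it off is then just a single call with the first page's URL (the URL below is obviously a stand-in for your actual site):

    // Hypothetical entry point; substitute your real first-page URL
    gatherData('http://example.com/results?page=1');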
If instead you have some way of extracting the locations of all 26 pages up front, then building an array of URLs and using a simple [man]foreach[/man] loop would be easier (though possibly less efficient).
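A minimal sketch of that approach, assuming the URLs follow a predictable pattern like a page parameter (find_data() and insert_data_into_db() are the same placeholders as above):

    // Build the list of all 26 page URLs up front
    $urls = array();
    for ($i = 1; $i <= 26; $i++) {
        $urls[] = 'http://example.com/results?page=' . $i;
    }

    // Visit each page in turn and store its data
    foreach ($urls as $url) {
        $page = file_get_contents($url);
        insert_data_into_db(find_data($page));
    }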