I keep receiving this error.

Could not run the query: Duplicate entry '255' for key 1

I am trying to insert records into my database, but it seems like my code is trying to replace the data that is already there instead of adding new records. Here is the code:

// Each line read with file() is one tab-delimited record. explode()
// splits the line into an array of field values, and the foreach
// statement processes each record in turn.
foreach($listingres as $record) // $listingres is the array you got from file() 
{ $record=trim($record); // Otherwise the newline character will still be on the end of the line 
  $fields = explode("\t", $record); // Do stuff with the $fields array. 

  //  Insert all of the 125 different variables into database
  $query = "INSERT INTO residential (mls, class, type, area, asking_price,
    city, state, zip, status, sale_rent, bedrooms, fullbath, halfbath, garage, 
    gar_type, yr_built, sqft, counties, number_acres, agent_id, agent_name, 
    agent_phone, listingoffice1_id, listingoffice1_name, listingoffice1_phone, 
    listingagent2_id, listingagent2_name, listingagent2_phone, listingoffice2_id,
    listingoffice2_name, listingoffice2_phone, comp_sa, comp_ba, year_built, 
    sqft_total, sqft_howmeas, lot_size, township, listing_date, expiration_date,
    legal_1, legal_2, parcel, bath_upper, bath_main, bath_lower, rooms_lrsize,
    rooms_lrlevel, rooms_drsize, rooms_drlevel, rooms_frsize, rooms_frlevel,
    rooms_kitsize, rooms_kitlevel, beds_1size, beds_1level, beds_2size,
    beds_2level, beds_3size, beds_3level, beds_4size, beds_4level, rooms_utilsize,
    rooms_utillvl, rooms_description, rooms_othsize, rooms_othlvl,
    other_desc, other_size, other_level, other_desc2, other_size2, other_level2,
    terms, fh_flood, fh_heatbud, zoning, lock_box, subdivision, directions_1,
    directions_2, original_price, assumable, accelerate, qualify, equity_amount,
    assumption_payment_amount,hfyr_tax)
    VALUES ($fields[0],'$fields[1]','$fields[2]','$fields[3]',$fields[4],'$fields[9]',
    '$fields[10]',$fields[11],'$fields[12]',
    '$fields[13]','$fields[14]','$fields[15]','$fields[16]','$fields[17]','$fields[18]',
    '$fields[19]','$fields[20]','$fields[21]','$fields[22]','$fields[23]','$fields[24]',
    '$fields[25]',
    '$fields[26]','$fields[27]','$fields[28]','$fields[29]','$fields[30]','$fields[31]',
    '$fields[32]','$fields[33]','$fields[34]','$fields[35]','$fields[36]',$fields[37],
    '$fields[38]','$fields[39]','$fields[40]','$fields[41]','$fields[42]','$fields[43]',
    '$fields[44]','$fields[45]','$fields[46]','$fields[47]','$fields[48]','$fields[49]',
    '$fields[50]','$fields[51]','$fields[52]','$fields[53]','$fields[54]','$fields[55]',
    '$fields[56]','$fields[57]','$fields[58]','$fields[59]','$fields[60]','$fields[61]',
    '$fields[62]','$fields[63]','$fields[64]','$fields[65]','$fields[66]','$fields[67]',
    '$fields[68]','$fields[69]','$fields[70]','$fields[71]','$fields[72]','$fields[73]',
    '$fields[74]','$fields[75]','$fields[76]','$fields[77]','$fields[78]','$fields[79]',
    '$fields[80]','$fields[81]','$fields[82]','$fields[83]','$fields[84]','$fields[85]',
    '$fields[86]','$fields[87]','$fields[88]','$fields[89]','$fields[90]','$fields[91]')";

  print $query . "<br>";  //Print out the values of the query


  $result = mysql_query($query) or die('Could not run the query:  '
  .mysql_error()); // Run the query
} // end foreach over $listingres

    This probably means that you have the 'mls' field set as a unique key, which means that two records cannot have the same value. So I am guessing that there is another record with the value '255' already there. Remove the unique index from the DB and you should be ok.
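
    If you want to confirm that before changing anything, you can list the table's indexes; the table name residential is taken from the query above, the rest is stock MySQL:

    SHOW INDEX FROM residential;

    Any row in that output with Non_unique = 0 is a unique (or primary) key, and Column_name tells you which field it covers.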

      (RL)Ian wrote:

      Remove the unique index from the DB and you should be ok.

      That could be a Bad Thing(tm).

      Most db tables have primary keys for a reason. Better to figure out what steps in the data gathering process are likely to cause problems and take steps to "head 'em off at the pass"....
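
      As a rough sketch of "heading them off" (assuming $fields[0] really is the mls value and that mls is the column carrying the unique key), you could check for an existing row before running the INSERT:

      $mls = mysql_real_escape_string($fields[0]);
      $check = mysql_query("SELECT COUNT(*) FROM residential WHERE mls = '$mls'");
      list($count) = mysql_fetch_row($check);
      if ($count > 0) {
          continue; // listing already loaded -- skip it (or UPDATE it instead)
      }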

        dalecosp wrote:

        Better to figure out what steps in the data gathering process are likely to cause problems and take steps to "head 'em off at the pass"....

        Hmmm... that gets me thinking. I do not see any problems in my data gathering; I am placing my formatted data directly into the db. I think I am having problems with the insert statement itself. I just cannot understand how to force my insert statement to keep adding data rather than overwrite the existing data. Any ideas?

          It's not trying to overwrite existing data; an INSERT never overwrites anything, the duplicate-key error means the new row is being rejected. Did you read (RL)Ian's reply?

            dalecosp wrote:

            That could be a Bad Thing(tm).

            Most db tables have primary keys for a reason. Better to figure out what steps in the data gathering process are likely to cause problems and take steps to "head 'em off at the pass"....

            Primary keys and unique keys are not necessarily the same thing.

              I can't face reading through all that code, but the first idea that came to me even before opening this thread is that the primary key is an auto-increment TINYINT: it has climbed up to the 255 mark, auto-increment tries to push it further, MySQL truncates the value back down to 255 in its "I try to understand, honestly I do" way, and voila, a kick in the teeth.

              Could be wrong, clearly.
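
              If that turns out to be the case, widening the column is the usual fix. A minimal sketch, assuming the key column is the mls field on the residential table from the posted query (put AUTO_INCREMENT back on the end if the column really is one):

              ALTER TABLE residential MODIFY mls INT UNSIGNED NOT NULL;

              An unsigned INT tops out at 4294967295 rather than 255, which is plenty of headroom.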

                Alrighty, I found my problem after doing many more searches and then reading Drakla's reply. My 'mls' field was set to tinyint, and that was very much a problem. After changing that field to the int type I was good to go. I am still getting the error

                "Could not run the query: Duplicate entry '95759' for key 1"

                But this is because when I reload my webpage and the code starts all over, it tries to write a value to my db that already exists. And now I know that an insert statement will not overwrite existing matching data. So this one is "RESOLVED". Thanks guys. I will do some searching; I am sure there is a way to overwrite data. These particular records will be updated weekly, and I think it would be much easier to just replace the entire record every week.
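
                For the weekly refresh, MySQL has two relevant options: REPLACE INTO, which deletes the old row and inserts the new one, and INSERT ... ON DUPLICATE KEY UPDATE, which updates the existing row in place. A cut-down sketch of the latter, assuming 'mls' stays the unique key and showing only a few of the real columns:

                $query = "INSERT INTO residential (mls, class, type, asking_price)
                  VALUES ('$fields[0]', '$fields[1]', '$fields[2]', '$fields[4]')
                  ON DUPLICATE KEY UPDATE
                    class        = VALUES(class),
                    type         = VALUES(type),
                    asking_price = VALUES(asking_price)";

                With that in place, re-running the import updates rows that already exist instead of dying on the duplicate-key error.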
