Yes, after 30 years RDBMS technology has got pretty smart. DBs self-optimise and integrity-check. It's great.
But all that means is that you are freed from worrying about the vehicle so you can concentrate on the contents. You see, ultimately the quality and value of a database comes down to the quality of its contents.
Just because the table structure, indexes and relational integrity are fine does not mean the actual data in the records isn't crap. Users will do every possible thing to stuff your data; sometimes you will not believe that anyone could do it by mistake - surely it must have been malicious. But no, it was just a user being a luser.
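As a minimal sketch of what catching that kind of garbage looks like: the snippet below runs a few content-level sanity checks. The `orders` table, its columns, the placeholder values and the thresholds are all hypothetical, not from anything above - the point is simply that structurally valid rows can still be checked for being obviously wrong.

```python
import sqlite3

# Hypothetical database and table; substitute your own schema.
conn = sqlite3.connect("example.db")

checks = {
    # Rows that pass every constraint yet are almost certainly garbage.
    "blank_or_placeholder_names":
        "SELECT COUNT(*) FROM orders "
        "WHERE TRIM(customer_name) = '' OR customer_name IN ('asdf', 'test', 'xxx')",
    "impossible_quantities":
        "SELECT COUNT(*) FROM orders WHERE quantity <= 0 OR quantity > 1000000",
    "dates_in_the_future":
        "SELECT COUNT(*) FROM orders WHERE order_date > DATE('now')",
}

for name, sql in checks.items():
    count = conn.execute(sql).fetchone()[0]
    if count:
        print(f"WARNING: {name}: {count} suspect rows")

conn.close()
```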
The laws of physics also apply, and chaos reigns: bits get flipped, index keys become corrupted, data gets lost. You may also have subtle flaws in your design that are not immediately apparent, but that eventually mean you cannot get back all the data you put in.
With a small db these things may not matter, but with millions of records you really must implement processes that test for and detect such problems before they get out of hand.
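One way such a process might look, as a rough sketch only: a scheduled sweep that asks the engine to verify its own storage, looks for orphaned foreign keys, and reconciles row counts against a baseline. The `orders`/`customers` tables, the `customer_id` link and the baseline figure are illustrative assumptions, and the SQLite `PRAGMA integrity_check` stands in for whatever low-level check your own RDBMS offers.

```python
import sqlite3

# Hypothetical database; the schema and baseline below are assumptions.
conn = sqlite3.connect("example.db")

# 1. Let the engine verify its own pages and indexes (catches low-level corruption).
status = conn.execute("PRAGMA integrity_check").fetchone()[0]
if status != "ok":
    print(f"ALERT: integrity_check reported: {status}")

# 2. Orphaned foreign keys: child rows whose parent has vanished.
orphans = conn.execute(
    "SELECT COUNT(*) FROM orders o "
    "LEFT JOIN customers c ON c.customer_id = o.customer_id "
    "WHERE c.customer_id IS NULL"
).fetchone()[0]
if orphans:
    print(f"ALERT: {orphans} orders reference missing customers")

# 3. Reconcile today's row count against a saved baseline; a sudden drop
#    often means data was lost rather than legitimately deleted.
current = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
baseline = 1_000_000  # would normally be loaded from the previous run
if current < baseline:
    print(f"ALERT: orders shrank from {baseline} to {current}")

conn.close()
```

Run something like this nightly and the problems surface while they are still small, instead of the day you try to get the data back out.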