4 = 4 bytes, 10 = 10 bytes.
If you can find any application that requires a 10-byte integer (2^80 possible values, i.e. a range from zero to 1208925819614629174706175) then you're a better man than me 🙂
The most you'll ever need in any normal app is int4, or sometimes int8 if you have millions of rows and thousands of mutations on sequence columns per day.
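For concreteness, here's a minimal sketch assuming PostgreSQL, where int4 (integer) and int8 (bigint) are the built-in 4- and 8-byte types. pg_column_size() reports the on-disk size of a value in bytes, so you can verify the sizes without creating any tables:

```sql
-- Assuming PostgreSQL: int4 and int8 are the built-in 4- and 8-byte integers.
SELECT pg_column_size(1::int4) AS int4_bytes,  -- 4
       pg_column_size(1::int8) AS int8_bytes;  -- 8

-- Signed ranges for reference:
--   int4: -2147483648 to 2147483647 (about ±2.1 billion)
--   int8: -9223372036854775808 to 9223372036854775807
```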
So int(10) would be a waste of space if you never need more than 4 bytes.
Whether it's actually a problem is a different matter: int(10) uses 6 bytes more than int(4), so you 'waste' a whopping 6 bytes per row. If your rows are 300 bytes wide, those 6 bytes are not that big a deal.
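To put that in perspective, here's a back-of-the-envelope calculation (the 300-byte row and the 10-million-row count are illustrative assumptions, not figures from the question):

```sql
-- Illustrative numbers only: 6 wasted bytes per row, 300-byte rows, 10M rows.
SELECT 6 / 300.0 * 100          AS overhead_pct,  -- 2% of each row
       6 * 10000000 / 1048576.0 AS wasted_mb;     -- ~57 MB over 10 million rows
```

Even at ten million rows, the waste comes to under 60 MB, which is rarely worth worrying about.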