Honestly, it's because it looks terrible, from a cosmetic point of view, to have notices, warnings, etc. popping up on websites; coupled with the fact that file_get_contents was what was being pushed in all the previous posts, it's the simplest remedy for the issue.
That should be handled by logging instead of displaying errors in production.
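For what it's worth, that production setup can be sketched at runtime with ini_set; the log path here is purely illustrative:

```php
<?php
// Sketch: log errors rather than displaying them in production.
// The log path is illustrative; php.ini is the usual place for these.
ini_set('display_errors', '0');
ini_set('log_errors', '1');
ini_set('error_log', '/var/log/php_errors.log');
```

Setting these in php.ini (or the server configuration) is preferable to runtime ini_set, since errors raised during startup occur before the script ever runs.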
Also, turning down the reporting of errors (the only other way to fix using file_get_contents) does not affect the cost each error incurs, which is what causes the slowdown regardless of suppression; it's not the suppression that causes the issue, just the error itself.
Turning down the reporting of errors is not the only other way to "fix" using file_get_contents, and it is not even the correct approach: rather, one should check for the possible errors in the first place, e.g., by calling file_exists prior to calling file_get_contents.
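A minimal sketch of that check, assuming a local file (the path is hypothetical):

```php
<?php
// Guard the read with file_exists so no warning is ever raised.
// 'config.txt' is a hypothetical path for illustration.
$path = 'config.txt';
if (file_exists($path)) {
    $contents = file_get_contents($path);
} else {
    $contents = null; // decide explicitly what a missing file means
}
```

Note that file_exists only checks existence, not readability, and there is a small race window between the check and the read, so checking the return value afterwards as well costs nothing.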
If it makes you happier, they can manually ob_start before the call and then ob_end_clean to discard the message (which also allows them to log it), or override the error handler; I think these are overkill. Or they could adjust their error_reporting, which is basically plastering over the cracks of unstable code (as is using @ for suppression,
That is not overkill; that is being silly. file_get_contents is not "unstable code".
though it at least signifies awareness that file_get_contents is capable of failing to get files, so you can focus on handling the case where no content is returned)
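For completeness, the "override the error handler" option mentioned above might look like this sketch (the file name is hypothetical):

```php
<?php
// Temporarily route warnings to the log instead of the output.
set_error_handler(function ($errno, $errstr) {
    error_log("warning: $errstr"); // log it rather than display it
    return true; // true = handled; PHP's internal handler is skipped
}, E_WARNING);

$contents = file_get_contents('missing-file.txt'); // hypothetical name
restore_error_handler();

if ($contents === false) {
    // nothing was printed to the page; the failure was logged instead
}
```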
The fact that file_get_contents is capable of failing to get files should be handled by checking the return value for strict equality with false (=== false) prior to using the retrieved contents.
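A sketch of that check; the strict comparison matters because an empty file legitimately returns "", which a loose == test would conflate with failure (the file name is hypothetical):

```php
<?php
// Strict comparison distinguishes a failed read from an empty file.
$contents = file_get_contents('notes.txt'); // hypothetical name

if ($contents === false) {
    error_log('could not read notes.txt');
} else {
    // $contents may be "" or "0" and still be a successful read,
    // which is exactly why == false would be the wrong test here.
    echo strlen($contents);
}
```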
Oh, but I see that the file name is actually some URL, in which case file_exists would not be applicable. Nonetheless, the "cosmetic point of view" would not have mattered: one would have seen the errors in the log in the event that the URL were somehow incorrect or inaccessible, and the function call would still have been essentially fine had its return value been checked.
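Under that reading, a sketch of the URL case; the URL is a placeholder, and error_get_last recovers the failure detail so it still reaches the log:

```php
<?php
// Fetch over HTTP; @ keeps the warning off the page, while the strict
// check plus error_get_last() preserves the detail for the log.
$url = 'https://example.com/feed.json'; // placeholder URL
$contents = @file_get_contents($url);

if ($contents === false) {
    $err = error_get_last();
    error_log('fetch failed: ' . ($err['message'] ?? 'unknown error'));
}
```

With allow_url_fopen disabled, or for anything that needs timeouts and HTTP status handling, cURL is the more robust choice than file_get_contents.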