There are two main cases that spring to mind.
The first is that different operating systems have different conventions for marking the end of a line in a text file. Unix uses \n, classic Mac OS used \r, and Microsoft DOS/Windows uses \r\n.
Programs written for these systems apply their own convention when reading and writing text files. But woe betide the program that applies the same translation to a binary file! If a Windows program writes a binary file through a stream opened in text mode, it's liable to replace every occurrence of the byte \n (0x0A) with \r\n - garbling the file pretty much irreparably.
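To make the danger concrete, here's a minimal C sketch (the file names are just placeholders) showing the difference between fopen's text mode and binary mode; the newline translation only happens in text mode, and only on platforms like Windows that perform it:

```c
#include <stdio.h>

int main(void) {
    unsigned char data[] = { 0x01, 0x0A, 0x02 };   /* 0x0A is the byte '\n' */

    /* Text mode: on Windows each '\n' written out is translated to "\r\n",
       so the file on disk becomes 01 0D 0A 02 - four bytes instead of three. */
    FILE *text = fopen("demo.txt", "w");
    if (text) {
        fwrite(data, 1, sizeof data, text);
        fclose(text);
    }

    /* Binary mode ("wb"): the bytes are written exactly as given,
       and the file stays 01 0A 02 on every platform. */
    FILE *bin = fopen("demo.bin", "wb");
    if (bin) {
        fwrite(data, 1, sizeof data, bin);
        fclose(bin);
    }

    return 0;
}
```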
The other situation is known as "endianness". When you store a value that fits into, say, four bytes (e.g. a 32-bit integer), and it comes time to transfer that value one byte at a time, what order do you send the bytes in? Do you send the most significant byte first or the least significant byte first? Wars have been fought over this issue (the terms "big-endian" and "little-endian" themselves come from <i>Gulliver's Travels</i>).
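You can see which convention your own machine uses by looking at a multi-byte integer one byte at a time - a small C sketch, assuming a 32-bit value via <stdint.h>:

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint32_t value = 0x01020304;                        /* a 32-bit value */
    const unsigned char *bytes = (const unsigned char *)&value;

    /* A little-endian machine stores the least significant byte first
       (04 03 02 01 in memory); a big-endian machine stores 01 02 03 04. */
    printf("in memory: %02X %02X %02X %02X\n",
           bytes[0], bytes[1], bytes[2], bytes[3]);
    printf("this machine is %s-endian\n",
           bytes[0] == 0x04 ? "little" : "big");

    return 0;
}
```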
Some convention has to be established whenever a stream of bytes is passed from one computer to another, in case the two machines were built with different ideas about endianness - otherwise what gets sent as "UNIX" might arrive as "NUXI", say.
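One widely used convention is "network byte order" (big-endian): the sender converts with htonl before transmitting and the receiver converts back with ntohl, so the value survives regardless of either machine's native order. A rough POSIX C sketch (on Windows the same functions live in winsock2.h):

```c
#include <stdio.h>
#include <stdint.h>
#include <arpa/inet.h>    /* htonl() and ntohl() on POSIX systems */

int main(void) {
    uint32_t host_value = 0x554E4958;    /* the ASCII bytes 'U' 'N' 'I' 'X' */

    /* Sender: convert from host byte order to network (big-endian) order. */
    uint32_t wire_value = htonl(host_value);

    /* Receiver: convert from network order back to its own host order.
       Because both sides agree on the wire format, the bytes arrive as
       "UNIX" rather than "NUXI", whatever the hardware at either end. */
    uint32_t received = ntohl(wire_value);

    printf("round trip: 0x%08X\n", received);
    return 0;
}
```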