Are there performance issues with so many files?
There certainly might be. The most meaningful approach is to
try it several ways and measure the results, but 30,000 is not
a small number...
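For example, a quick way to compare layouts is to time how long a
batch of opens takes under each one (a rough Python sketch, assuming
you already have a list of test paths on hand):

import time

def time_opens(paths):
    # Average wall-clock time to open (and immediately close) each file
    start = time.perf_counter()
    for p in paths:
        with open(p, "rb"):
            pass
    return (time.perf_counter() - start) / len(paths)

# e.g. compare time_opens(flat_paths) against time_opens(prefixed_paths)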
Should I break them into separate subdirectories...
That is a viable approach.
If so... where do the bottlenecks occur in terms of file access?
AFAIK it's just the time to search the directories. Even a simple
scheme like using the first 2 chars of the filename as a subdirectory
name (e.g. sa/sample.gif) can give at least an order-of-magnitude
decrease in file-opening time. But if your file names are too regular,
you would want some other method (e.g. the last two chars before
the dot).
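Something like this is what I mean, as a rough Python sketch (the
function names are just for illustration):

import os
import shutil

def prefixed_path(base_dir, filename, prefix_len=2):
    # Build e.g. base_dir/sa/sample.gif from the first chars of the name,
    # so no single directory ends up holding all 30,000 files.
    prefix = filename[:prefix_len].lower()
    return os.path.join(base_dir, prefix, filename)

def store_file(src, base_dir):
    # Copy src into its prefixed subdirectory, creating it if needed.
    dest = prefixed_path(base_dir, os.path.basename(src))
    os.makedirs(os.path.dirname(dest), exist_ok=True)
    shutil.copy2(src, dest)
    return dest

# e.g. store_file("sample.gif", "images") gives "images/sa/sample.gif".
# If your names are too regular, swap the slice for the last two chars
# before the dot (os.path.splitext) or a short hash of the name.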