Hi. I remember many projects where we had to store large numbers of user files. Sometimes there were so many that our team was forced to find a solution, because when too many files are stored in a single folder, the OS begins to process that folder slowly. The usual solution was to put each file into a folder named after the first letter of the file's name. I find that quite simple and effective.
Recently our team built something new in project TBA. It works like this: all files of all units are stored under a single directory, user_files/filedump. There are approximately 400 folders at each level, and the number of levels is configurable (right now it's 3). Each time a file is stored or recalled (e.g. shown on the front-end), the destination path is calculated from the md5 hash of the string "<path>/<file>". The path is what we set for "upload_dir" in the configs, so it is actually virtual; it is only needed to produce different results for files with the same name in different units. Files are stored only at the last level, so with 3 levels a path would look something like 234/123/25/test.jpg.

I think this system is far from ideal and has many serious disadvantages; I mention it only as an example. But I really think In-Portal needs an automated system that would help deal with large numbers of files without any additional custom code.
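To make the scheme concrete, here is a minimal sketch in Python of how such a path calculation could work. It assumes the md5 digest is sliced into one chunk per level and reduced modulo the folder count; the actual implementation in project TBA may derive the folder indices differently, and the names (storage_path, FOLDERS_PER_LEVEL, LEVELS) are mine, not from the project.

```python
import hashlib
import os

FOLDERS_PER_LEVEL = 400  # roughly 400 subfolders at each level
LEVELS = 3               # configurable nesting depth

def storage_path(virtual_path: str, filename: str) -> str:
    """Map a virtual "<path>/<file>" string to a nested physical path.

    The virtual path only serves to disambiguate files with the same
    name in different units; files live only at the deepest level.
    """
    digest = hashlib.md5(f"{virtual_path}/{filename}".encode("utf-8")).hexdigest()
    parts = []
    # Consume 4 hex digits (16 bits) of the digest per level and
    # reduce them modulo the folder count to pick a subfolder.
    for level in range(LEVELS):
        chunk = digest[level * 4:(level + 1) * 4]
        parts.append(str(int(chunk, 16) % FOLDERS_PER_LEVEL))
    return os.path.join(*parts, filename)
```

Because the path is a pure function of the virtual path and filename, storing and recalling a file always compute the same location, and no mapping table is needed.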
I invite you to discuss this.