I just wanted to ask whether it is safe to design a FileTable that will eventually hold about 5–15 million files of 0.5–10 MB each. Will NTFS handle it?
I once had a problem on an old Windows Server 2008 R2 box: with more than 2.5 million files in a single folder, creating a new file in that folder took about 30 seconds, and getting the file list took about 5 minutes. Is that an NTFS problem?
Could that be a problem here, or will FILESTREAM/FileTable create subfolders itself to handle that many files? Or is disabling the 8.3 naming convention enough to make it work fine?
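For context, this is the kind of manual fan-out I would otherwise build myself, so no single folder accumulates millions of entries. It's only a minimal sketch: `ROOT` and the two-level bucket depth are placeholder choices of mine, not anything FILESTREAM/FileTable is documented to do automatically.

```python
import hashlib
import os

ROOT = r"D:\filestore"  # hypothetical storage root

def shard_path(file_name: str) -> str:
    """Derive a two-level subfolder from a hash of the file name."""
    digest = hashlib.sha1(file_name.encode("utf-8")).hexdigest()
    # 'ab/cd/<file_name>' gives 256 * 256 = 65,536 buckets;
    # 15 million files spread over them is roughly 230 per folder.
    return os.path.join(ROOT, digest[:2], digest[2:4], file_name)

def store(file_name: str, data: bytes) -> str:
    """Write a file into its hashed bucket, creating folders as needed."""
    path = shard_path(file_name)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "wb") as f:
        f.write(data)
    return path
```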
Thanks and regards
Yes. Just do not open File Explorer. THAT (the program, not the operating system) cannot handle it well. The command line, or server code that does not try to load every file into a list, works fine.
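For illustration, here is a minimal sketch of enumerating such a folder without materializing the full listing, which is what Explorer effectively tries to do. It assumes Python is available; `os.scandir` streams directory entries one at a time, so memory stays flat even with millions of files:

```python
import os

def iter_files(folder: str):
    """Yield file names one at a time instead of building the full list."""
    with os.scandir(folder) as entries:
        for entry in entries:
            if entry.is_file():
                yield entry.name

# Usage: process entries as they arrive.
for name in iter_files(r"D:\filestore"):
    pass  # handle each file here
```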