Question
Tuesday, September 11, 2012 3:17 PM
I have a problem where there are more than a million files in a single folder. Any action that requires a listing of the directory practically hangs on attempt, e.g. opening that particular folder in Windows Explorer.
I'm wondering if there is any way to split the files into folders, each containing 10,000 files?
If you're wondering how this scenario happens: the front-end clients do FTP PUTs into that folder 24/7.
All replies (8)
Tuesday, September 11, 2012 7:01 PM | 1 vote
Hello,
I worked in the past at an ISP that had a similar problem. The solution was to write a script that took the mass of data, created folders named 0-9 and a-z, and moved each file into the folder matching its first letter. For folders that were still too big we repeated the process on the second letter; inside folder A, for example, we created another 0-9/a-z structure and organized the files by the second character of the name.
Another option is to count off 10,000 files and move them into a new numbered folder (001, 002, etc.), or use some other ordering method. By counting I mean scripting something that runs daily/weekly and does this job for you. Let me know if you need some examples in PowerShell (which Windows version are we talking about here?). Here are some facts about limits on Windows: http://ask-leo.com/is_there_a_limit_to_what_a_single_folder_or_directory_can_hold.html. Even if the folder can hold more files, the OS/CPU/memory resources needed to list it can be too much.
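As a rough sketch of the first-letter approach (illustrated here in Python; the function name and the "other" bucket for non-alphanumeric names are my own choices, not a tested production script):

```python
import os
import shutil

def bucket_by_first_char(src_dir, dest_root):
    """Move each file in src_dir into a subfolder of dest_root
    named after the first character of its filename (0-9, a-z);
    names starting with anything else go into an 'other' bucket."""
    with os.scandir(src_dir) as entries:
        for entry in entries:
            if not entry.is_file():
                continue
            first = entry.name[0].lower()
            bucket = first if first.isalnum() else "other"
            target = os.path.join(dest_root, bucket)
            os.makedirs(target, exist_ok=True)
            shutil.move(entry.path, os.path.join(target, entry.name))
```

For a second-level split you would run the same idea again inside any bucket that is still too big, keying on the second character instead of the first.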
Thank you,
F. Schubert
System Administrator
MCP | Microsoft Certified Professional
MCTS 70-640 | Microsoft Certified Technology Specialist: Windows Server 2008 Active Directory, Configuration
MCTS 70-642 | Microsoft Certified Technology Specialist: Windows Server 2008 Network Infrastructure, Configuration
Tuesday, September 11, 2012 8:48 PM | 1 vote
Hi,
With the help of scripting we can accomplish this: split the huge folder into multiple folders and place the files in them equally or based on a count. I suggest you check the thread below: http://social.technet.microsoft.com/Forums/en-AU/winserverpowershell/thread/4f546d84-65dc-4640-86a2-62f841d6447f
If you still need help with the scripting, ask in the PowerShell forum: http://social.technet.microsoft.com/Forums/en-AU/winserverpowershell/threads
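As a rough illustration of the count-based split (a Python sketch; the numbered-folder naming and the function name are my own assumptions, not a tested script):

```python
import os
import shutil

def split_into_numbered_folders(src_dir, dest_root, chunk_size=10000):
    """Move the files of src_dir into numbered subfolders of
    dest_root (001, 002, ...), chunk_size files per folder.
    os.scandir streams entries instead of building the whole
    listing in memory first. Returns the number of files moved."""
    moved = 0
    with os.scandir(src_dir) as entries:
        for entry in entries:
            if not entry.is_file():
                continue
            folder = os.path.join(dest_root, "%03d" % (moved // chunk_size + 1))
            os.makedirs(folder, exist_ok=True)
            shutil.move(entry.path, os.path.join(folder, entry.name))
            moved += 1
    return moved
```

Run on a schedule (daily/weekly) this keeps the hot folder drained, which is the same idea as the scripted approaches suggested above.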
Regards, Ravikumar P
Wednesday, September 12, 2012 8:30 AM
Hi,
Thanks for replying. I had the same idea.
Let's just say I'm going to do this the conventional way from the command prompt: move every file that starts with the letter a into a_folder, which resides on another volume/disk.
C:\> move a* D:\a_folder
I'm curious how this action is processed.
Does it filter and then move?
- list all files in the folder
- filter the files that start with a
- move them
Or does it move each file immediately when found?
- search for the first file that starts with a
- move that file
- search for the next file that starts with a
- move that file
- repeat
Anyway, the system is running Windows 2008 R2.
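The two possibilities above can be sketched like this (a Python illustration of the streaming variant, one entry at a time; the function name is mine and this is not cmd's actual implementation):

```python
import fnmatch
import os
import shutil

def move_matching(src_dir, pattern, dest_dir):
    """Stream directory entries one at a time and move each match
    as soon as it is found -- no upfront listing of the whole folder.
    Returns the number of files moved."""
    os.makedirs(dest_dir, exist_ok=True)
    moved = 0
    with os.scandir(src_dir) as entries:
        for entry in entries:
            if entry.is_file() and fnmatch.fnmatch(entry.name, pattern):
                shutil.move(entry.path, os.path.join(dest_dir, entry.name))
                moved += 1
    return moved
```

The list-then-filter variant would instead read the whole directory into memory first, which is exactly the cost to avoid with a million files.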
Wednesday, September 12, 2012 8:41 AM
Hi,
Is it possible to list just the first file in a folder, regardless of filename or extension, through PowerShell?
That way, I can put some logic into organizing the chunk of files.
Thursday, September 13, 2012 7:37 AM | 1 vote
Hi,
Files in a folder are listed by a specific attribute, such as Date Modified or File Name. So listing a "first file" should be supported, as it is actually just the last modified file, last created file, etc.
Meanwhile, for any question about the script, you could post a thread in the Scripting Guys forum for further information:
http://social.technet.microsoft.com/Forums/en/ITCG/threads
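For the "first file" question specifically, a directory enumeration handle returns entries one at a time, so you can take a single entry and stop without reading the rest. A small Python sketch (os.scandir is implemented over FindFirstFile/FindNextFile on Windows; the helper name is mine):

```python
import os

def first_entry(path):
    """Return a single directory entry without enumerating the rest,
    or None if the folder is empty. os.scandir yields entries lazily,
    so only one entry is ever fetched here."""
    with os.scandir(path) as entries:
        return next(entries, None)
```

Note that "first" here means first in the order the file system hands entries back, which is not guaranteed to be alphabetical or by date.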
TechNet Subscriber Support in forum. If you have any feedback on our support, please contact [email protected].
Thursday, September 13, 2012 9:17 AM
Hi,
I'm not sure how the file system places each file, but I've got a gut feeling that the first file in a directory may not necessarily be the last modified or last created one. My deduction goes like this: when you enter dir without arguments at the command prompt, it lists the files in the directory in alphabetical order. I'm guessing it reads through all the files first, then sorts them.
That's the problem here: I do not want to read through millions of files.
I think I found the solution here and should be able to work something out from it:
http://msdn.microsoft.com/en-us/library/windows/desktop/aa365200%28v=vs.85%29.aspx
This way I can process one file at a time without first looking through the whole directory to see what's inside.
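The one-file-at-a-time idea from that page maps onto a simple loop; a Python sketch of it (os.scandir streams entries lazily, and the helper name and early-stop limit are my own illustration):

```python
import os

def process_one_at_a_time(path, handler, limit=None):
    """Walk the directory entry by entry, calling handler on each
    file as it is found, and stop early once limit files are done.
    The full directory listing is never materialized."""
    done = 0
    with os.scandir(path) as entries:
        for entry in entries:
            if not entry.is_file():
                continue
            handler(entry)
            done += 1
            if limit is not None and done >= limit:
                break
    return done
```

With a limit of 10,000 and a handler that moves each file into the current target folder, this becomes the batching scheme discussed earlier without ever listing the whole directory.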
Thursday, September 13, 2012 9:23 AM
Hi everyone,
I found this on the net, and you might want to test it out. Save the file as filter.bat and run it as filter.bat "c:\source" "d:\target". It will list all the files and copy only the first 10; you can increase that number if you want to.
@echo off
rem usage: filter.bat "c:\source" "d:\target"
rem copies the first 10 files found in the source tree into the target
mkdir %2
rem /b bare format, /s recurse, /A:-D files only (no directories)
dir %1 /b /s /A:-D >tempfilelist.txt
setlocal enabledelayedexpansion
set counter=0
rem "delims=" keeps each whole line in %%b, even when paths contain spaces
for /f "delims=" %%b in (tempfilelist.txt) do (
IF !counter! LSS 10 call :docopy "%%b" %2
set /a counter+=1
)
endlocal
del /q tempfilelist.txt
GOTO :EOF
:docopy
copy %1 %2
GOTO :EOF
CK
Thursday, September 13, 2012 9:52 AM
Hi,
I don't want to list all the files prior to any action like copying/moving. I want to avoid listing all the files; to list them all means reading through the directory and enumerating millions of entries.
I think I will work from the Windows API FindFirstFile and FindNextFile: http://msdn.microsoft.com/en-us/library/windows/desktop/aa365200%28v=vs.85%29.aspx
And thanks a lot, I will still look through the script when I have time.