I have a script whose sole purpose is to walk through roughly 15,000-20,000 files in a single directory structure, modifying 5 or 6 lines within each file. Simple. I have a small splash screen that simply counts how many files have been completed out of the overall total. The script functions fine and screams through the initial files, but over time it slows to a crawl (it becomes noticeable after about 1,000 to 1,500 files).
I have used a combination of methods to write to the files, but none of them appears to make much difference:
Local $FileOpen = FileOpen($FileInList, 1)
FileWriteLine($FileOpen, @CRLF)
FileClose($FileOpen)

or

FileWriteLine($FileHandle, $NewLineString & @CRLF)

or

Local $FileHandle = FileOpen($FileInList, 2)
_FileWriteFromArray($FileHandle, $FileContentArray, 1)
FileClose($FileHandle)

I have checked, and CPU and memory usage are fine. My current work-around is to split the original structure of 20,000 files into 5-10 separate directory structures, run the script against each, and then recombine them at the end. It works, but it isn't very portable. Any ideas?
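For reference, here is a minimal sketch of the read-modify-write loop as I understand it, with the progress counter throttled so the display is only redrawn every 100 files (the directory path, the ToolTip-based counter, and the modification placeholder are illustrative assumptions, not my actual code):

#include <File.au3>
#include <FileConstants.au3>

; "C:\Data" is a placeholder for the real root directory
Local $aFiles = _FileListToArrayRec("C:\Data", "*", $FLTAR_FILES, $FLTAR_RECUR, $FLTAR_NOSORT, $FLTAR_FULLPATH)
Local $aLines
For $i = 1 To $aFiles[0]
    _FileReadToArray($aFiles[$i], $aLines)   ; read the whole file once
    ; ... modify the 5-6 target lines in $aLines here ...
    Local $hFile = FileOpen($aFiles[$i], $FO_OVERWRITE)
    _FileWriteFromArray($hFile, $aLines, 1)  ; one write per file, then close
    FileClose($hFile)
    If Mod($i, 100) = 0 Then ToolTip($i & " / " & $aFiles[0] & " files done")
Next
ToolTip("")

The throttled counter is just to rule out the progress display itself as a cost in the loop; the rest of the structure matches the third write method above (open in overwrite mode, write the array, close).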