Recommended Posts

Posted

Below is a segment of code that I am trying to use to perform a set of actions on a large number of files. Some of the commands come from an include file I wrote, but they should be self-explanatory.

Basically, the segment builds a directory listing of all the file names in a specific location and redirects it to files.txt. The script then reads each line of files.txt into an array called $the_pdf_files[100000].

With around 7,000 files, reading the names into the array takes about 5-10 minutes. What I would like is for 50,000 or so file names to take roughly that same 5-10 minutes.

Any suggestions or pointers in the right direction?

; First do a dir /b > files.txt
; (run_command and exit_program come from my include file)
Dim $the_pdf_files[100000]

run_command("dir " & $pdf_location & "*.pdf /b >" & $pdf_location & "files.txt")
MsgBox(1, "Click ok", "Click ok when the command is finished.")

$the_file = FileOpen($pdf_location & "files.txt", 0)
If $the_file = -1 Then
    exit_program("Could not open files.txt, which contains the list of .pdf files on d:\, generated by this script.")
EndIf

Dim $count = 0
Do
    $the_pdf_files[$count] = FileReadLine($the_file, $count + 1)
    $count = $count + 1
Until @error ; meaning read lines until EOF
$count = $count - 2 ; index of the last valid entry (the failed final read still bumped $count)

FileClose($the_file)
FileDelete($pdf_location & "files.txt")
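For what it's worth, the line-number argument to FileReadLine is the likely bottleneck: when a specific line is requested, AutoIt scans the file from the top on every call, so reading n lines costs on the order of n² line reads. Calling FileReadLine without a line number advances the file pointer one line per call instead. A minimal sketch of that sequential variant, reusing $pdf_location, files.txt, and $the_pdf_files from above:

; Sequential read: no line number, so the file is never rescanned.
Dim $the_pdf_files[100000]
Dim $count = 0

$the_file = FileOpen($pdf_location & "files.txt", 0)
While 1
    $line = FileReadLine($the_file) ; reads the next line from the current position
    If @error Then ExitLoop         ; @error is set at end of file
    $the_pdf_files[$count] = $line
    $count = $count + 1
WEnd
FileClose($the_file)

This keeps the loop linear in the number of lines, so 50,000 names should scale cleanly.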

Posted

Have you looked at the User Defined Function _FileReadToArray in the help file?
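A minimal sketch of the usual call, assuming File.au3 is available and reusing the files.txt path from the post above:

#include <File.au3>

Dim $the_pdf_files
; _FileReadToArray fills the array in a single pass;
; on success, element [0] holds the number of lines read.
If Not _FileReadToArray($pdf_location & "files.txt", $the_pdf_files) Then
    MsgBox(16, "Error", "Could not read files.txt")
Else
    MsgBox(0, "Done", "Read " & $the_pdf_files[0] & " file names.")
EndIf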


I just tried it out and it works great (took about 2 seconds to read the file).

Thanks for pointing me in the right direction; I am still learning the basics of AutoIt while trying to write useful scripts.
