
Working with large files



Hi all. I was wondering what possible ways there are to work with large files in AutoIt without bogging down a PC's memory.

For example, how does regedit look at the huge amount of data in the registry without bogging down? If I export the registry to a file, the file is huge. I am guessing the export is so large because the registry's .dat hive files are compressed and use codes to indicate locations rather than the full text found in the export. Regedit may also keep a table (like a directory) that it loads and looks up, displaying only the data that fits on screen.

However, what about regular text/data/exe/etc... files?

Can a large file be opened in small sections at a time, so that you can scroll back and forth through its contents by accessing only the portion of the file you are currently viewing?
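For what it's worth, later versions of AutoIt added FileSetPos, which allows exactly this kind of windowed access: seek to an offset, then read only a fixed-size "page". A minimal sketch, with the path, offset, and page size all placeholders:

```autoit
; Sketch: read one 4 KB "page" from an arbitrary offset in a large file.
; FileSetPos is only available in later AutoIt versions.
Local Const $iPageSize = 4096
Local $hFile = FileOpen("C:\temp\big.log", 0) ; 0 = read mode
If $hFile = -1 Then
    MsgBox(16, "Error", "Could not open file")
    Exit
EndIf

FileSetPos($hFile, 1000000, 0) ; seek to byte 1,000,000 from the start
Local $sPage = FileRead($hFile, $iPageSize) ; read just this one page
MsgBox(0, "Page", $sPage)
FileClose($hFile)
```

Scrolling would then just mean recomputing the offset and re-reading the page.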

Thanks

Be open minded but not gullible. A hammer sees everything as a nail ... so don't be a tool ... be many tools.



Just posting to put this on top again to see if I could get a reply.


You can read a set number of characters from the file on each pass, then read the next chunk on the following pass. Alternatively, you can create some sort of indexing system for the large file, like the Registry does.

Edit: I don't know that reading just part of the file will actually save memory overall, but it should keep you from holding the entire file in memory while you work on the string you read back.
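The chunked approach described above might look like this (the path and chunk size are placeholders; only one chunk is ever in memory at a time):

```autoit
; Sketch: process a large file a fixed number of characters at a time.
Local Const $iChunkSize = 8192 ; placeholder chunk size
Local $hFile = FileOpen("C:\temp\big.log", 0) ; 0 = read mode
If $hFile = -1 Then
    MsgBox(16, "Error", "Could not open file")
    Exit
EndIf

Local $sChunk, $iEOF
While 1
    $sChunk = FileRead($hFile, $iChunkSize)
    $iEOF = @error ; set to -1 once the end of the file is reached
    If $sChunk <> "" Then
        ; ... work on $sChunk here ...
    EndIf
    If $iEOF Then ExitLoop
WEnd
FileClose($hFile)
```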

IMO,

JS




Thanks JS.

You can read a set number of characters from the file on each pass, then read the next chunk on the following pass.

I haven't messed with the file open utilities much in AutoIt yet.

Are you saying that I could read parts of a file into a text file and view them that way?

I guess I'll just have to try some things out in AutoIt and ask more questions later.


If you use FileOpen to get a handle to a text file and FileReadLine to read each line of text into a variable, then you save memory: the variable holds only one line at a time as you loop through the file.
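A sketch of that line-at-a-time loop (the path is a placeholder):

```autoit
; Sketch: read line by line; only the current line is held in memory.
Local $hFile = FileOpen("C:\temp\big.log", 0) ; 0 = read mode
If $hFile = -1 Then
    MsgBox(16, "Error", "Could not open file")
    Exit
EndIf

Local $sLine
While 1
    $sLine = FileReadLine($hFile)
    If @error = -1 Then ExitLoop ; end of file
    ; ... process $sLine here ...
WEnd
FileClose($hFile)
```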

Another method is to read the whole file into memory with FileRead, then use StringSplit to place each line into an array. At that point, the memory used is roughly twice the size of the file. You can then clear the variable used in the initial FileRead to release some memory and process the array as needed. This method is perhaps about twice as fast at reading the file but has the penalty of high memory usage.
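The read-then-split method might be sketched like this (the path is a placeholder; note the raw string is cleared once the array exists):

```autoit
; Sketch: read the whole file, split into lines, free the raw string.
Local $sData = FileRead("C:\temp\big.log") ; placeholder path
Local $aLines = StringSplit(StringStripCR($sData), @LF)
$sData = "" ; release the initial copy; only the array remains
For $i = 1 To $aLines[0] ; element 0 holds the line count
    ; ... process $aLines[$i] here ...
Next
```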

If you do not care about speed and want to save memory, then FileReadLine may do; otherwise, reading into an array is efficient in terms of speed. Which method suits best depends on the script at hand.

Edit:

Have a look at the _FileReadToArray standard UDF for ease.
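_FileReadToArray does the read-and-split in one call; a minimal sketch (placeholder path, classic `#include <File.au3>` signature):

```autoit
; Sketch: let the standard UDF read the file into a line array.
#include <File.au3>

Local $aLines
If Not _FileReadToArray("C:\temp\big.log", $aLines) Then
    MsgBox(16, "Error", "Read failed")
    Exit
EndIf
For $i = 1 To $aLines[0] ; element 0 holds the line count
    ; ... process $aLines[$i] here ...
Next
```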

