TomTom

Find, Sort, Filter, Mask big ASCII Files

Hi,

I have to sort, filter, and display the results in a ListView box.

The problem I will face: I will have a lot of (more than 2500) ASCII text files, each of which can have a size of more than 30 MByte.

Has anybody done a job like this with AutoIt?

If yes, can you please help me?

Thanks

Tom

Hi,

I've never done that job with such an amount of huge files, but it should work. It might just take a long, long time. :D

Give it a try.

So long,

Mega



I did a GUI that entered data into a toolbar. I had to extract the coordinates of a ton of cities from HTML at FallingRain.com into data files. I ended up compiling hash tables in separate files and used them to populate drop-down boxes, with thousands of cities in each list. It was quick: there was no lag between the event that determined which list to populate and the completion of the population.
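
A rough illustration of that approach (not the poster's actual code; the file name, the tab-delimited layout, and the use of Scripting.Dictionary as the hash table are my assumptions):

#include <GUIConstantsEx.au3>

; Build a lookup table from a pre-compiled data file and fill a drop-down box from it.
Local $oCities = ObjCreate("Scripting.Dictionary") ; plays the role of the hash table
Local $hFile = FileOpen("cities.txt", 0)           ; hypothetical "name<TAB>lat<TAB>lon" file, 0 = read mode
If $hFile = -1 Then Exit MsgBox(16, "Error", "Could not open data file")

Local $sLine, $aParts
While 1
    $sLine = FileReadLine($hFile)
    If @error Then ExitLoop                        ; end of file
    $aParts = StringSplit($sLine, @TAB)
    If $aParts[0] >= 3 Then $oCities.Item($aParts[1]) = $aParts[2] & "," & $aParts[3]
WEnd
FileClose($hFile)

GUICreate("Cities", 300, 60)
Local $idCombo = GUICtrlCreateCombo("", 10, 10, 280, 20)
Local $sItems = ""
For $sKey In $oCities.Keys                         ; thousands of entries populate quickly
    $sItems &= "|" & $sKey
Next
GUICtrlSetData($idCombo, StringTrimLeft($sItems, 1))
GUISetState(@SW_SHOW)

While GUIGetMsg() <> $GUI_EVENT_CLOSE
WEnd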


Hi,

In the function "Func DirToList($szPath)" in "Autoit3Ex.au3" [see my sig], I create a DOS output file, which can be huge; the rest of the function retrieves only parts of it for display. Filtering could be added, and the display limited to however many lines you want, with scrolling.

Not as pretty as @ptrex's solution, but simple; it uses DOS "findstr", so it's as fast as anything else, I think. Roughly along the lines of the sketch below.

(Fast sorting could be added if required, using "_GUIListView.au3" from my sig too.)
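
Not Randall's actual code, but a minimal sketch of the "let findstr do the filtering" idea (the file path, the search pattern, and the 500-line display cap are placeholders):

#include <Constants.au3>

; Run DOS findstr on a big file, capture its output and keep only the first hits.
Func _FilterBigFile($sFile, $sPattern, $iMaxLines = 500)
    Local $iPid = Run(@ComSpec & ' /c findstr /i /c:"' & $sPattern & '" "' & $sFile & '"', "", @SW_HIDE, $STDOUT_CHILD)
    Local $sOut = ""
    While 1
        $sOut &= StdoutRead($iPid)
        If @error Then ExitLoop            ; stream closed, findstr is done
    WEnd
    Local $aLines = StringSplit(StringStripCR($sOut), @LF)
    If $aLines[0] > $iMaxLines Then
        ReDim $aLines[$iMaxLines + 1]      ; cap what gets displayed
        $aLines[0] = $iMaxLines
    EndIf
    Return $aLines                         ; $aLines[0] holds the (capped) line count
EndFunc

; Usage: the returned array can then be pushed into a ListView a page at a time.
Local $aHits = _FilterBigFile("C:\logs\big.txt", "ERROR")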

best, Randall


Yes, loading only part of the data you are working with at any one time is crucial. I do a lot of raw data processing, and I've made the mistake many times of loading 100 MB files in one shot; the thing freezes up. I've frequently resorted to opening a file and then reading and evaluating each line one at a time, reusing the same variable for the current line (roughly as in the sketch below).
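
A bare-bones sketch of that line-at-a-time pattern (the file name and the search string are made up); only the current line is ever held in memory, so even a 100 MB file never has to be loaded in one piece:

Local $hFile = FileOpen("C:\data\huge.txt", 0)   ; 0 = read mode
If $hFile = -1 Then Exit MsgBox(16, "Error", "Cannot open file")

Local $sLine, $iMatches = 0
While 1
    $sLine = FileReadLine($hFile)                ; the same variable is reused for every line
    If @error Then ExitLoop                      ; @error = -1 means end of file
    If StringInStr($sLine, "ERROR") Then $iMatches += 1
WEnd
FileClose($hFile)
MsgBox(64, "Result", $iMatches & " matching lines found")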

