DarkNecromancer

AutoIt Storage Limitations

3 posts in this topic

I am currently creating a web domain crawler that will keep statistical data on the pages it crawls in each domain. All that aside, I've almost got my code ready to go live, but I started thinking about the amount of data I'm going to be saving. At the moment I'm doing file read/write by line, and it works fine for the small amount of data generated in the limited time a single client has run, but if I get 30+ clients running I can see the data requirements going up extremely fast.

I was looking at the SQLite include, but I don't know its performance characteristics. Does anyone have experience using it to manage gigabytes of data? Does anyone have other ideas for storing this much data? At the moment the process keeps track of all unique domains and the pages it comes across, and it would need to keep some information about the pages themselves (like the number of s's on them :whistle: ). I just want to avoid spending a couple of days designing a back end for my code only to find out that I'm running into data-capacity limitations. I'm assuming that SQLite would be my best choice, given that it appears to use an external code base (a DLL), but I've never used it before. Meh, let me know what you guys think.
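For reference, a minimal sketch of what the SQLite.au3 route could look like. The table and column names here are made up for illustration, and the schema is only an assumption about what a crawler might track; the `_SQLite_*` functions are from AutoIt's standard SQLite UDF include.

```autoit
#include <SQLite.au3>

; Load sqlite3.dll and open (or create) a file-backed database
_SQLite_Startup()
If @error Then Exit MsgBox(16, "SQLite", "sqlite3.dll could not be loaded")
Local $hDB = _SQLite_Open("crawler.db")

; Illustrative schema: one table for unique domains, one for pages
_SQLite_Exec($hDB, "CREATE TABLE IF NOT EXISTS domains (id INTEGER PRIMARY KEY, name TEXT UNIQUE);")
_SQLite_Exec($hDB, "CREATE TABLE IF NOT EXISTS pages (id INTEGER PRIMARY KEY, " & _
        "domain_id INTEGER, url TEXT UNIQUE, s_count INTEGER);")

; Wrapping bulk inserts in a transaction is the single biggest
; performance lever when writing many rows to SQLite
_SQLite_Exec($hDB, "BEGIN;")
_SQLite_Exec($hDB, "INSERT OR IGNORE INTO domains (name) VALUES ('example.com');")
_SQLite_Exec($hDB, "INSERT OR IGNORE INTO pages (domain_id, url, s_count) " & _
        "VALUES (1, 'http://example.com/', 42);")
_SQLite_Exec($hDB, "COMMIT;")

_SQLite_Close($hDB)
_SQLite_Shutdown()
```

SQLite itself handles databases far beyond the gigabyte range; in practice the bottleneck is usually doing each INSERT in its own implicit transaction, which is why the batch above is wrapped in an explicit BEGIN/COMMIT.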

Darknecromancer

The only limitation I know of is that IniRead/IniWrite has a limit of 32 KB, so as long as you don't use an INI file, you should be fine for large amounts of data.
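To illustrate the alternative: plain file appends have no such size cap. One caveat worth noting is that calling FileWriteLine with a filename reopens the file on every call, so for a crawler writing many lines it's better to open a handle once in append mode and reuse it (the filename below is just an example):

```autoit
; 1 = append mode; the file is created if it does not exist
Local $hLog = FileOpen("crawl-data.txt", 1)
If $hLog = -1 Then Exit MsgBox(16, "Crawler", "Could not open data file")

; Reusing the handle avoids reopening the file for every line written
FileWriteLine($hLog, "example.com|http://example.com/|42")
FileClose($hLog)
```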

