Quinch Posted October 30, 2008 Posted October 30, 2008 Just thought to check - how are arrays stored, in terms of memory usage? Will declaring a large array slow down the execution compared to declaring a small one, assuming they hold an equal number of values? I'm knocking together a script that does most of its work through several arrays - as a value is entered, the arrays are widened to accommodate the new information. However, would it slow the script down much if I simply defined a very large array from the start and said nerts to ReDim?
dbzfanatic Posted October 30, 2008 As long as the elements default to 0, I don't think it would have much of an impact on your speed. I could be wrong, but that's what common sense tells me.
picaxe Posted October 30, 2008 I've seen improvements of 8x by declaring a large array and not ReDim-ing within a loop.
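To illustrate the kind of difference picaxe describes, here is a hedged timing sketch (the variable names and the 100,000-element count are illustrative, not from the thread). It times growing an array one element at a time against filling a preallocated one; the exact ratio will depend on machine and AutoIt version.

```autoit
; Sketch: ReDim-per-element vs. preallocation (illustrative sizes/names).
Local $iCount = 100000

; Growing one slot at a time: every ReDim copies the whole array.
Local $hTimer = TimerInit()
Local $aGrow[1]
For $i = 0 To $iCount - 1
    ReDim $aGrow[$i + 1]    ; full copy on each pass
    $aGrow[$i] = $i
Next
ConsoleWrite("ReDim per element: " & TimerDiff($hTimer) & " ms" & @CRLF)

; One allocation up front, then plain assignment.
$hTimer = TimerInit()
Local $aPre[$iCount]
For $i = 0 To $iCount - 1
    $aPre[$i] = $i
Next
ConsoleWrite("Preallocated:      " & TimerDiff($hTimer) & " ms" & @CRLF)
```

The per-element version does O(n^2) total copying, which is where the large speedups from preallocating (or growing in chunks) come from.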
Quinch Posted October 30, 2008 Author Posted October 30, 2008 Alright, so there's no particular downside to declaring, say, a [10000][3][10000][1000] array right off the bat and populating it as needed? Also, does the 2^24 limit refer to the total size of the array, or just the number of values that can be inserted into it?
picaxe Posted October 30, 2008 I believe the limit is on the total number of elements, i.e. the product of all the dimensions, which must not exceed 2^24 (about 16 million).
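A quick way to check a proposed declaration against that limit is to multiply the dimensions out before declaring (the 2^24 figure is per the AutoIt limits; exact behavior at the boundary may vary by version). For the array Quinch proposed:

```autoit
; Check the total element count (product of dimensions) against 2^24.
Local $iTotal = 10000 * 3 * 10000 * 1000
Local $iLimit = 2 ^ 24
ConsoleWrite("Elements: " & $iTotal & "  Limit: " & $iLimit & @CRLF)
If $iTotal > $iLimit Then ConsoleWrite("Too big - the Dim would fail." & @CRLF)
```

That [10000][3][10000][1000] declaration works out to 300 billion elements, far past the limit, so it would have to be scaled down or restructured.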
PsaltyDS Posted October 30, 2008 The speed of accessing the array is no different between a large and a small array. It's the extra steps to ReDim the array that cause the slowdown. This can be minimized by ReDim'ing in increments as required, like adding 100 to the first index each time. An experiment to show the memory penalty for declaring unnecessarily large arrays: 1. Open Task Manager, on the Processes tab. 2. Run this script from SciTE:
    For $n = 1 To 10
        Dim $avArray[1000][16][1000]
        Sleep(3000)
        Dim $avArray[10][16][1000]
        Sleep(3000)
    Next
3. Watch the AutoIt3.exe process in the Memory column as the declared array changes size every three seconds.
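The "grow in increments of 100" idea PsaltyDS mentions can be sketched like this (function and variable names are hypothetical, not from the thread): keep a separate count of used slots, and only ReDim when the array is actually full.

```autoit
; Sketch: amortized growth in chunks of 100 (illustrative names).
; $g_aData holds the values; $g_iUsed tracks how many slots are filled.
Global $g_aData[100], $g_iUsed = 0

Func _Append($vValue)
    If $g_iUsed = UBound($g_aData) Then
        ReDim $g_aData[UBound($g_aData) + 100] ; pay the copy cost once per 100 adds
    EndIf
    $g_aData[$g_iUsed] = $vValue
    $g_iUsed += 1
EndFunc

For $i = 1 To 250
    _Append($i * 10)
Next
; Capacity grows 100 -> 200 -> 300 while 250 values are stored.
ConsoleWrite("Used: " & $g_iUsed & "  Capacity: " & UBound($g_aData) & @CRLF)
```

This keeps the ReDim count low without holding a huge block of memory for the whole run, splitting the difference between ReDim-per-element and one giant up-front declaration.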
Quinch Posted October 30, 2008 Author Posted October 30, 2008 So in a nutshell, it's going to use a heckload of memory while it executes, but on the whole there won't be any difference in execution speed? Cool, I can live with that. Thanks.