As did I, but it wasn't as fast as I'd hoped, and I think I know why. Somewhere on the forum is a post from the Dev who wrote the initial Array code, including ReDim, and he says that ReDim makes things go really slow because of the way it has to work internally (copying all the data from the old array into a new array of the correct size, etc.). His suggestion was to ReDim only every so often, and in large chunks. I couldn't figure out how to do that in my functions, so I didn't worry so much about it. Since the ReDim is the bottleneck (as you two found), it just goes to show the problem with trying to use dynamically growing arrays like this. I had another solution, but it didn't work on the project I was working on. Here that one is, for you to play with (based on the Wikipedia article on hash tables).
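To show what "ReDim every so often, in large chunks" buys you, here is a minimal sketch of the idea in Python (not AutoIt, and the names `CHUNK`, `make_buf`, and `append` are my own invention for illustration): preallocate a block of slots, track the logical count separately, and only pay for the full copy when a chunk fills up.

```python
CHUNK = 1000  # grow in chunks of this many slots (an assumed tuning value)

def make_buf():
    """Return (storage, count): a preallocated list plus a logical length."""
    return [None] * CHUNK, 0

def append(storage, count, value):
    """Append value; only reallocate when the current chunk is exhausted."""
    if count == len(storage):
        # the expensive copy now happens once per CHUNK inserts,
        # instead of once per insert as with a naive ReDim-by-one
        storage = storage + [None] * CHUNK
    storage[count] = value
    return storage, count + 1

buf, n = make_buf()
for i in range(2500):
    buf, n = append(buf, n, i)
# 2500 items stored, but only 2 reallocations (when count hit 1000 and 2000)
```

The same pattern maps onto AutoIt directly: keep the real element count in a separate variable and only ReDim when the count reaches UBound of the array.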
You will see that I had eliminated ReDim in "dictionary2.au3", but it was still slow. I think it might be because you were using an array to hold another array, and repeatedly ReDim'ing that; I don't think AutoIt likes that for speed (see the warning in the helpfile).