Large file to Array - Error Allocating Memory

I have a large CSV file that was 350 MB at the start of the year and is forever growing due to other processes; it now stands at 550 MB. I need to split it into smaller files for a separate process. I have written a script that takes this file as input and splits it into smaller files of whatever size the user enters, say 100 MB.

This works fine for 100 MB; however, if the user enters anything above 250 MB I get an "Error Allocating Memory" message. I do not want to limit their size, and I am sure I am doing something wrong as I'm only just getting into AutoIt. Can anyone help me get around this error? It occurs where I "Read chunk into array" for chunks greater than 250 MB.

The original file (546 MB) has 4.4 million lines, each with ~150 characters.

The cut-down version of my app, without the GUI and other tidbits, is here (change $SplitSize to whatever):

#include ".\_Include\FileConstants.au3"
#include ".\_Include\Array.au3"

Global $Title = "File Splitter"
SplitMe()

Func SplitMe()
    Local $theFile, $MainFileSize, $Array, $hFile, $NewFileSize, $newFile, $cut, $sChunk
    Local $SplitSize = 100 ; MB
    $theFile = "C:\LargeFile.csv"
    $MainFileSize = FileGetSize($theFile)
    $NewFileSize = 1000000 * $SplitSize ; new file size limit in bytes
    $hFile = FileOpen($theFile, $FO_READ)
    If $MainFileSize > $NewFileSize Then
        Local $a = 0 ; new file number
        While 1
            $a += 1
            $sChunk = FileRead($hFile, $NewFileSize) ; read chunk
            If @error = -1 Then ExitLoop ; end of file reached
            $Array = StringSplit(StringStripCR($sChunk), @LF) ; split chunk into an array of lines
            $cut = StringLen($Array[$Array[0]]) ; length of the last (incomplete) line
            _ArrayDelete($Array, UBound($Array) - 1) ; remove the incomplete line (last index is UBound - 1)
            $Array[0] -= 1 ; decrease the record count to match
            ; Build the new file name and remove the file if it already exists
            $newFile = StringLeft($theFile, StringInStr($theFile, ".", 0, -1) - 1) & "_" & $a & StringRight($theFile, StringLen($theFile) - (StringInStr($theFile, ".", 0, -1) - 1))
            If FileExists($newFile) Then FileDelete($newFile)
            CreateFile($newFile, $Array, $a)
            FileSetPos($hFile, -$cut, $FILE_CURRENT) ; go back to the start of the incomplete row
            $Array = 0 ; free the array
            If ($NewFileSize * $a) > $MainFileSize Then ExitLoop
        WEnd
    Else
        MsgBox(262144, $Title, "The file is already smaller than the requested split size")
    EndIf
    FileClose($hFile)
EndFunc

Func CreateFile($newFile, $Array, $i)
    Local $lines = $Array[0] ; total records
    Local $f = FileOpen($newFile, $FO_OVERWRITE) ; open the output file
    Local $j = 1 ; current line
    Do
        FileWrite($f, $Array[$j] & @CRLF) ; write line to new file
        ; Progress bar - GUICtrlSetData($Pgrs, ((($i - 1) / $NoOfFiles) + (($j / $lines) * (1 / $NoOfFiles))) * 100)
        $j += 1
    Until $j > $lines
    FileClose($f) ; close file
EndFunc

Try reading smaller chunks into your split file.

Reading 100 MB of data in one go into the memory space of your app (AutoIt) sounds like you'll run into problems.

Set yourself an internal chunk size, for example 2048 bytes,

then loop through until you hit your split size.

This way your memory use won't go through the roof.
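The idea above — a small internal chunk size, looped until the split size is reached — can be sketched like this (in Python rather than AutoIt, purely for illustration; all names and sizes here are hypothetical). The reader keeps only one chunk plus any partial line in memory at a time, and output files rotate only on a line boundary:

```python
import os

def split_file(path, split_size, chunk_size=2048):
    """Split path into pieces of roughly split_size bytes,
    never breaking a line across two output files."""
    base, ext = os.path.splitext(path)
    part = 1
    written = 0
    leftover = b""  # partial line carried over between chunks
    out = open(f"{base}_{part}{ext}", "wb")
    with open(path, "rb") as src:
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            data = leftover + chunk
            nl = data.rfind(b"\n")
            if nl == -1:
                leftover = data  # no complete line yet; keep buffering
                continue
            # Write the complete lines, carry the partial one forward
            complete, leftover = data[:nl + 1], data[nl + 1:]
            out.write(complete)
            written += len(complete)
            if written >= split_size:  # rotate on a line boundary
                out.close()
                part += 1
                written = 0
                out = open(f"{base}_{part}{ext}", "wb")
    if leftover:
        out.write(leftover)  # trailing line with no final newline
    out.close()
    return part
```

Each piece ends on a line boundary and overshoots the target by at most about one chunk. The AutoIt equivalent would be an inner loop calling FileRead($hFile, 2048) and writing each chunk out as it goes, instead of one FileRead of the whole split size.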

