
Memory leak with FileReadLine


acada

Hi there. I have a problem with repeated file write/read. I need to generate numbers from 1 to 10000 and use them elsewhere in the script, each number no more than once. So I created a file where each number is written after it is used, and a function that checks this file and returns whether the number was used before. The problem is that this script causes a memory leak: within a few seconds the RAM usage is huge. Can anybody tell me what's wrong? BTW, the problem only occurs when starting with an empty container.txt. If the file is already full of numbers (I mean all of 1 to 10000, sorted ascending), there is no memory leak.

$j = 1
while $j = 1
    $rnd = Random(1,10000,1)
    If check($rnd) = 0 Then
        $file_1 = FileOpen(@WorkingDir&"\container.txt",1)
        FileWriteLine($file_1,$rnd)
        FileClose($file_1)
    EndIf
WEnd

Func check ($number)
    $check = 0
    $file_2 = FileOpen(@WorkingDir&"\container.txt",0)
    While $j = 1
        $line = FileReadLine($file_2)
        If @error = -1 Then ExitLoop
        If $number = $line Then $check = 1
    WEnd
    FileClose($file_2)
    Return $check
EndFunc
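
As an aside, the file round-trip can be avoided entirely by remembering used numbers in memory and writing the file once at the end. A rough sketch of that idea (hypothetical code, not what the thread's script does; a fresh AutoIt array element is 0, so an unused slot tests as False):

```autoit
; Sketch: track used numbers in an array instead of re-reading
; container.txt on every check.
Local $aUsed[10001], $iFound = 0, $rnd ; slots 1..10000
While $iFound < 10000
    $rnd = Random(1, 10000, 1)
    If Not $aUsed[$rnd] Then ; 0 means this number is still unused
        $aUsed[$rnd] = 1
        $iFound += 1
        ; ... the fresh, never-used number is available in $rnd here ...
    EndIf
WEnd
```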

PsaltyDS

You can do that more efficiently:

$sNumbers = @CRLF
$iTime = TimerInit()
For $n = 1 To 10000
    $rnd = Random(1, 10000, 1)
    If Not StringInStr($sNumbers, @CRLF & String($rnd) & @CRLF) Then $sNumbers &= $rnd & @CRLF
Next
$sNumbers = "10000 numbers generated in " & Round(TimerDiff($iTime) / 1000, 3) & "sec:" & $sNumbers
$file_1 = FileOpen(@WorkingDir & "\container.txt", 2) ; 2 = overwrite
FileWrite($file_1, $sNumbers)
FileClose($file_1)

ShellExecute("notepad.exe", @WorkingDir & "\container.txt", @WorkingDir, "Open")

This opens, writes, and closes the file only once. I get about 17 sec total, 50% CPU, and the Commit Charge goes up about 8 MB while it runs, then returns to baseline.
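
The @CRLF padding in that StringInStr test is what makes the match exact: without the delimiters, searching for 12 would also hit 112 or 123. A small illustration (my own example values):

```autoit
Local $sNumbers = @CRLF & "112" & @CRLF & "123" & @CRLF
; A bare search gives a false positive: "12" occurs inside "112".
ConsoleWrite(StringInStr($sNumbers, "12") & @CRLF) ; non-zero position
; Anchoring with @CRLF on both sides only matches a whole line.
ConsoleWrite(StringInStr($sNumbers, @CRLF & "12" & @CRLF) & @CRLF) ; 0, not found
```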

:)

Edited by PsaltyDS


acada


Thank you for the kick in the right direction... I have used your idea and rewrote it a bit:

$j = 1
while $j = 1
    $rnd = Random(1,10000,1)
    If check($rnd) = 0 Then
        $file_1 = FileOpen(@WorkingDir&"\container.txt",1)
        FileWriteLine($file_1,$rnd)
        FileClose($file_1)
    EndIf
WEnd

Func check ($number)
    $check = 0
    $file_2 = FileOpen(@WorkingDir&"\container.txt",0)
    $compare = FileRead($file_2)
    ; Anchor with @CRLF so e.g. 12 does not match inside 123.
    If StringInStr(@CRLF & $compare, @CRLF & $number & @CRLF) Then $check = 1
    FileClose($file_2)
    Return $check
EndFunc

You know, I needed the continuous first loop. And you are right: there is no more leaking...
