AmbiguousJoe Posted August 22, 2012

I created the following function to back up rows from a 2D array as my script runs through the data. I write to an Excel file at the end, so this gives me a way to terminate the script early without losing all my progress. It works fine until my script starts writing to the file multiple times per second; then my text files become corrupted. What should be a 125-500 KB file of tab-delimited data turns into an 80-123 MB file with several thousand lines of nothing but blank characters/spaces.

```autoit
Func _BackupArray($nRow) ; Receives the row number that will be backed up.
    ConsoleWrite("Function: BackupArray" & @CRLF)
    Local $nCol = 1 ; column counter (starts at 1 so column 0 is not added twice)
    Local $sLineOutput = $aExcelList[$nRow][0] ; Collects the first array cell into a string
    While $nCol <= $nTotalExcelCols - 1 ; cycles through the remaining columns to collect array data
        $sLineOutput = $sLineOutput & @TAB & $aExcelList[$nRow][$nCol] ; Adds each cell to the line, TAB-delimited
        $nCol = $nCol + 1 ; Increases the column counter
    WEnd
    _FileWriteToLine($sBackupFile, $nRow + 1, $sLineOutput, 1) ; Sends the string to the backup file
EndFunc ;==>_BackupArray
```

I suspect the problem is that _FileWriteToLine is a slow or inefficient way to accomplish a backup, but perhaps someone more knowledgeable could help me out. When there are several seconds between calls to this function, I don't have any problems; it's only when the function starts getting called several times per second.
hannes08 Posted August 22, 2012

First try: use a filehandle instead of a filename for _FileWriteToLine(). From your snippet I can't see whether you're using a filename or a file handle. At least 99% of the time, using a filehandle is faster.
water Posted August 22, 2012

Looks like he is using a filename, because of the way the variable $sBackupFile is named. I wouldn't use _FileWriteToLine to write to a specific line number. Use FileOpen to open the file and retrieve a handle, then use FileWriteLine to write the line to the end of the file. To close the file, use FileClose.
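A minimal sketch of the pattern water describes: open the file once, append each completed row with FileWriteLine through the handle, and close it when the script finishes. $g_hBackup and _BackupRow are illustrative names; $sBackupFile follows the thread.

```autoit
#include <FileConstants.au3>

; Open once and keep the handle; $FO_APPEND (1) appends to the existing file.
Global $g_hBackup = FileOpen($sBackupFile, $FO_APPEND)
If $g_hBackup = -1 Then Exit ; file could not be opened

Func _BackupRow($sLineOutput)
    FileWriteLine($g_hBackup, $sLineOutput) ; appends the line plus a trailing @CRLF
EndFunc ;==>_BackupRow

; ... main loop builds each row's string and calls _BackupRow(...) ...

FileClose($g_hBackup) ; flush and release the handle at the end of the script
```

Writing through a handle avoids reopening (and re-scanning) the file on every call, which is where _FileWriteToLine loses time.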
czardas Posted August 22, 2012

The first thing that strikes me is that it is not necessary to write to the file every time. I would be inclined to write multiple lines to the TSV file after however many entries have been added. If you wanted to, you could save it as XLS later. That would be my preferred approach.
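A sketch of this batching idea, under the assumption that rows are buffered in a string and flushed every $BATCH_SIZE entries; all names here are illustrative, not from the original script.

```autoit
Global Const $BATCH_SIZE = 10
Global $g_sPending = "", $g_iPending = 0

Func _QueueRow($sLine)
    $g_sPending &= $sLine & @CRLF
    $g_iPending += 1
    If $g_iPending >= $BATCH_SIZE Then _FlushPending()
EndFunc ;==>_QueueRow

Func _FlushPending()
    If $g_sPending = "" Then Return
    Local $hFile = FileOpen($sBackupFile, 1) ; 1 = append mode
    FileWrite($hFile, $g_sPending) ; one write per batch instead of one per row
    FileClose($hFile)
    $g_sPending = ""
    $g_iPending = 0
EndFunc ;==>_FlushPending
```

Remember to call _FlushPending() once more before the script exits so a partial batch is not lost.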
AmbiguousJoe (Author) Posted August 22, 2012

hannes08: _FileWriteToLine doesn't use file handles, but I assume you are suggesting the same method as water.

water: You're correct: the $sBackupFile variable is a filename selected in an earlier part of the script. I write to specific lines so that I don't have to delete and re-create backup files every time the script runs. A single file is used and updated, which matches the format of the Excel file (so I can just copy and paste it in). That being said, that's not a good reason to use a less efficient function under these circumstances.

czardas: I write to the file every single time because some rows can take several minutes to complete (based on the actions performed on that row). Saving after every 10 entries or so might solve the problem I first described, but it could mean that I lose around 45 minutes of progress if I stopped the script during a set of slower actions.

I replaced _FileWriteToLine() with:

```autoit
$FileHandle = FileOpen($sBackupFile, 1) ; 1 = append mode
FileWriteLine($FileHandle, $sLineOutput)
FileClose($FileHandle)
```

I let the script run through about 10,000 rows and it didn't have any problems writing to the file. Thanks for your help, guys.
MilesAhead Posted August 23, 2012

Was the 500 KB file only a test case, or a typical data file size? If it's that small, I would tend to accumulate the data in memory and write the file in one go in binary mode, renaming the old file as a backup. You could use OnAutoItExitRegister() if necessary.
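A hedged sketch of this approach: rows accumulate in a string, an exit handler is registered up front, and on exit the buffer is written in one go while the previous file is kept as a .bak copy. Variable and function names are illustrative. Note that OnAutoItExitRegister fires on normal script exit (including Exit), not if the process is killed externally.

```autoit
#include <FileConstants.au3>

Global $g_sBuffer = ""
Global $g_sBackupFile = @ScriptDir & "\backup.txt"
OnAutoItExitRegister("_FlushBuffer") ; write everything out when the script ends

Func _AddRow($sLine)
    $g_sBuffer &= $sLine & @CRLF ; accumulate in memory instead of touching the file
EndFunc ;==>_AddRow

Func _FlushBuffer()
    ; Keep the previous run's file as a backup before overwriting.
    If FileExists($g_sBackupFile) Then FileMove($g_sBackupFile, $g_sBackupFile & ".bak", $FC_OVERWRITE)
    Local $hFile = FileOpen($g_sBackupFile, BitOR($FO_OVERWRITE, $FO_BINARY)) ; single write, binary mode
    FileWrite($hFile, $g_sBuffer)
    FileClose($hFile)
EndFunc ;==>_FlushBuffer
```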
hannes08 Posted August 23, 2012

I have to admit you're right: _FileWriteToLine() is not able to handle a filehandle. For gaining speed you could do as MilesAhead suggested, or use _FileReadToArray() in combination with _FileWriteFromArray(). Or use a SQLite database.
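For the _FileWriteFromArray() route, a one-call sketch that dumps the whole 2D array at once, TAB-delimited so the file mirrors the Excel layout; $aExcelList and $sBackupFile follow the thread's names.

```autoit
#include <File.au3>

; Overwrite the backup file with the current state of the 2D array.
; The fifth parameter is the column delimiter (default is "|").
_FileWriteFromArray($sBackupFile, $aExcelList, Default, Default, @TAB)
```

This trades per-row appends for periodically rewriting the whole file, which is simple and fast for files in the 125-500 KB range discussed above.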