simy8891

Compare a number of csv files and their values


Hi guys,

I'm having some timing issues: the script is taking ages :(

What I have:

  • Around 46 .csv files (the number may vary)
  • Each of those has 6100 rows

What I have to do:

  • Compare a certain column (a number) and, if it's higher than the current value, update it

 

I'm doing this with two nested For loops, but it's really slow: it takes about 65-70 seconds per 1000 records. Is there something else I can try to speed up the process?

This is the For part I have:

For $i = 1 To $ListOfCSVs[0]

    Local $tempArray = 0
    _FileReadToArray($tmpFiles & '\' & $ListOfCSVs[$i], $tempArray)

    ;MsgBox(0, $i, $tmpFiles & '\' & $ListOfCSVs[$i], $tempArray)

    $d = 0

    For $d = 1 To $CompleteArray[0][0]
        ;SplashTextOn($i, "ID:" & $d, -1, -1, -1, -1, 4, "", 24)
        $searchSAM = _ArraySearch($tempArray, $CompleteArray[$d][0], 2, 0, 0, 1)
        If $searchSAM > 0 Then

            ;If found, $searchSAM holds the position of the item just found
            $tmpItem = StringSplit($tempArray[$searchSAM], ",")

            ;We're interested in the last field of $tmpItem >> $tmpItem[$tmpItem[0]]
            If $tmpItem[$tmpItem[0]] <> '' Then

                If $tmpItem[$tmpItem[0]] > $CompleteArray[$d][2] Then

                    $CompleteArray[$d][2] = $tmpItem[$tmpItem[0]]

                EndIf
            EndIf
        EndIf
        ;SplashOff()
    Next
Next

I used SplashTextOn to help me check the speed of the loop.

Cheers

I'd do that by loading everything into an SQLite DB and working from there, but there are plenty of details that need to be specified before useful guidance can be given at this stage.
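
To make the idea concrete, here is a minimal sketch of the SQLite approach (untested, and not jchd's actual code). It reuses $ListOfCSVs and $tmpFiles from the opening post, invents the table and column names tbl_csv/name/val, and assumes the first CSV field is the name and the last field is a numeric value:

#include <File.au3>
#include <SQLite.au3>

_SQLite_Startup()            ; load sqlite3.dll
Local $hDB = _SQLite_Open()  ; in-memory database by default
_SQLite_Exec($hDB, "CREATE TABLE tbl_csv (name TEXT, val NUMERIC);")

; Import the first and last field of every CSV line
Local $aLines, $aFields
For $i = 1 To $ListOfCSVs[0]
    _FileReadToArray($tmpFiles & '\' & $ListOfCSVs[$i], $aLines)
    _SQLite_Exec($hDB, "BEGIN;") ; one transaction per file keeps the inserts fast
    For $n = 1 To $aLines[0]
        $aFields = StringSplit($aLines[$n], ",")
        If $aFields[$aFields[0]] <> '' Then
            _SQLite_Exec($hDB, "INSERT INTO tbl_csv VALUES (" & _SQLite_FastEscape($aFields[1]) & "," & $aFields[$aFields[0]] & ");")
        EndIf
    Next
    _SQLite_Exec($hDB, "COMMIT;")
Next

; A single query then returns the highest value seen for each name
Local $aResult, $iRows, $iCols
_SQLite_GetTable2d($hDB, "SELECT name, MAX(val) FROM tbl_csv GROUP BY name;", $aResult, $iRows, $iCols)

_SQLite_Close($hDB)
_SQLite_Shutdown()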




You could use Excel to do all the processing. Shouldn't be too hard with the rewritten Excel UDF that comes with the latest AutoIt beta version.
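
As a rough illustration (not water's code), reading one of the CSV files into a 2D array with the rewritten Excel UDF might look like this; it assumes the beta's Excel.au3 is available and reuses $tmpFiles and $ListOfCSVs from the opening post:

#include <Excel.au3>

Local $oExcel = _Excel_Open(False)  ; start Excel invisibly
Local $oBook  = _Excel_BookOpen($oExcel, $tmpFiles & '\' & $ListOfCSVs[1])
Local $aData  = _Excel_RangeRead($oBook)  ; whole used range as a 2D array
_Excel_BookClose($oBook, False)
_Excel_Close($oExcel)

; $aData[$r][0] would then hold the name and $aData[$r][UBound($aData, 2) - 1] the value
For $r = 0 To UBound($aData) - 1
    ConsoleWrite($aData[$r][0] & " -> " & $aData[$r][UBound($aData, 2) - 1] & @CRLF)
Next

From there the comparison can be done directly on the 2D array, without any StringSplit calls.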




Thanks guys. I'll try with Excel, but I can't use SQL.

I'll also try to store everything in one array. It's going to be huge, but I hope it'll be faster since it'll only run through each name once.
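
One way to avoid the per-record _ArraySearch() entirely (just a sketch, not part of the thread's code) is to index $CompleteArray by name in a Scripting.Dictionary and then make a single pass over every CSV line. Note this assumes the name is exactly the first CSV field, whereas the original search was a partial (substring) match:

#include <File.au3>

; Build an index: name -> row number in $CompleteArray
Local $oIndex = ObjCreate("Scripting.Dictionary")
For $d = 1 To $CompleteArray[0][0]
    If Not $oIndex.Exists($CompleteArray[$d][0]) Then $oIndex.Add($CompleteArray[$d][0], $d)
Next

; Single pass over every line of every CSV file
Local $aLines, $aFields, $iRow
For $i = 1 To $ListOfCSVs[0]
    _FileReadToArray($tmpFiles & '\' & $ListOfCSVs[$i], $aLines)
    For $n = 1 To $aLines[0]
        $aFields = StringSplit($aLines[$n], ",")
        If Not $oIndex.Exists($aFields[1]) Then ContinueLoop ; name not in the master array
        If $aFields[$aFields[0]] = '' Then ContinueLoop      ; empty value, nothing to compare
        $iRow = $oIndex.Item($aFields[1])
        If $aFields[$aFields[0]] > $CompleteArray[$iRow][2] Then
            $CompleteArray[$iRow][2] = $aFields[$aFields[0]]
        EndIf
    Next
Next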


#5 ·  Posted (edited)

Hi simy8891,

I'm not sure I fully understood the file contents, but maybe this could be faster (?). Not tested, it's just an idea.

In short, instead of scanning all the lines of the temp file one by one, just do a StringInStr on the whole file.

P.S.

Is CompleteArray bigger or smaller than the temp files?

For $i = 1 To $ListOfCSVs[0]

    Local $tempArray = 0
    ; _FileReadToArray($tmpFiles & '\' & $ListOfCSVs[$i], $tempArray) ; instead of reading into an array
    Local $hTempFile = FileOpen($tmpFiles & '\' & $ListOfCSVs[$i])
    Local $sTempFile = FileRead($hTempFile) ; read the whole file in one shot to use with StringInStr later
    FileClose($hTempFile)

    ;MsgBox(0, $i, $tmpFiles & '\' & $ListOfCSVs[$i], $tempArray)

    $d = 0

    For $d = 1 To $CompleteArray[0][0]
        ;SplashTextOn($i, "ID:" & $d, -1, -1, -1, -1, 4, "", 24)
        ; if it is found, it's because $CompleteArray[$d][0] is equal to a line in the temp file. Is that right?
        ; $searchSAM = _ArraySearch($tempArray, $CompleteArray[$d][0], 2, 0, 0, 1)
        ; If $searchSAM > 0 Then

        ; what we want to know is only whether that record (string) is within the temp file or not.
        If StringInStr($sTempFile, $CompleteArray[$d][0]) Then ; this should be faster..... (?)

            ;If found, $searchSAM holds the position of the item just found
            ; $tmpItem = StringSplit($tempArray[$searchSAM], ",")

            ; if I've understood correctly, the first field of $CompleteArray can contain the same value
            ; as a record in the temp file, so it's OK to use that record instead of the one in the temp file

            $tmpItem = StringSplit($CompleteArray[$d][0], ",")

            ;We're interested in the last field of $tmpItem >> $tmpItem[$tmpItem[0]]
            If $tmpItem[$tmpItem[0]] <> '' Then

                If $tmpItem[$tmpItem[0]] > $CompleteArray[$d][2] Then

                    $CompleteArray[$d][2] = $tmpItem[$tmpItem[0]]

                EndIf
            EndIf
        EndIf
        ;SplashOff()
    Next
Next
Edited by Chimp



#6 ·  Posted (edited)

Hi Chimp,

thanks for the suggestion. In theory, CompleteArray and TempArray are the same size; there might be a mismatch of one or two lines, so I thought I'd use _ArraySearch() directly, since in 99.9% of cases the string will be found. Sorry for not specifying this; obviously it's a bit difficult if you don't know what data is in there. However, I believe your suggestion, applied to this specific case, will make it slower since it basically runs a double check.

One thing though:

Most of the time $tmpItem[$tmpItem[0]] will be empty. Is there a way to clean up the results in advance and get a smaller CSV file? With that check, each .csv would end up with about 60% fewer items for sure! But I'm stuck on the logic of it :P
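
One possible pre-filter (just a sketch; the function name _FilterCsv and the output file name are made up) would drop rows whose last field is empty before the comparison loop ever sees them:

#include <File.au3>

Func _FilterCsv($sIn, $sOut)
    Local $aLines
    If Not _FileReadToArray($sIn, $aLines) Then Return SetError(1, 0, 0)

    Local $hOut = FileOpen($sOut, 2) ; 2 = overwrite
    For $n = 1 To $aLines[0]
        ; keep the line only if something follows the last comma
        If Not StringRegExp($aLines[$n], ",\s*$") Then FileWriteLine($hOut, $aLines[$n])
    Next
    FileClose($hOut)
    Return 1
EndFunc

_FilterCsv($tmpFiles & '\' & $ListOfCSVs[1], $tmpFiles & '\filtered.csv')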

UPDATE:

Well, I'm feeling pretty stupid... the command generating these .csv files actually includes an option to omit empty rows for that kind of object. I tested a single .csv export and it went down from 6100 rows to roughly 1200. I hope it'll speed up the process.

Thanks

Edited by simy8891


Hi Chimp,

....., applied to this specific case, will make it slower since it basically runs a double check......

 

Where is the double check?




Hey, I've updated my previous post. Anyway, regarding this case, the double check would be here:

If StringInStr($sTempFile, $CompleteArray[$d][0]) Then

Since I know that in 99.9% of cases the string already exists in the target array, it's kind of useless; again, only for this specific case!

Also, since I shrank the CSV files, I'm now looping over the temporary array instead of the giant one, so I'm only checking items that are actually present in the CSV files. After 7 minutes I'm already halfway through; last night it took 4 hours instead :P


Thanks guys. I'll try with Excel, but I can't use SQL.

I'll also try to store everything in one array. It's going to be huge, but I hope it'll be faster since it'll only run through each name once.

 

Definitely use an in-memory SQLite database. It will take only a few seconds!

The only extra thing you need is to place the SQLite3.dll file in the script directory; there is no installation of any other software.

http://www.autoitscript.com/autoit3/files/beta/autoit/archive/sqlite/

Look here to see how simple and fast it is:
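
As an illustration only (not Zedna's example), the minimal in-memory setup he describes might look like this, assuming SQLite3.dll sits next to the script:

#include <SQLite.au3>

_SQLite_Startup(@ScriptDir & "\SQLite3.dll") ; use the local DLL, nothing to install
If @error Then Exit MsgBox(16, "SQLite", "SQLite3.dll not found")

Local $hDB = _SQLite_Open() ; ":memory:" is the default, so everything stays in RAM
_SQLite_Exec($hDB, "CREATE TABLE tbl_csv (name TEXT, val NUMERIC);")
; ... the inserts and the MAX() query would follow, as sketched earlier in the thread ...
_SQLite_Close($hDB)
_SQLite_Shutdown()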


Zedna raises good points. Yet another, simpler SQLite-based way: use the sqlite3.exe command-line tool (freeware) with the .import command to load a database with all your input data, then use SQLite Expert Personal Edition (also freeware) to query/massage it the way you want. Then you have zero lines of code to develop, yet easily obtain the result you want.




Excuse me simy8891,

when the statement $searchSAM = _ArraySearch($tempArray, $CompleteArray[$d][0], 2, 0, 0, 1) is executed (see line 10 below)

and an element is found, it's because $tempArray[$searchSAM] and $CompleteArray[$d][0] are equal (I mean that the element found in $tempArray is equal to the one searched for, $CompleteArray[$d][0]).

Is that right?

If so, then using this statement, $tmpItem = StringSplit($CompleteArray[$d][0], ",") (see line 13 below),

or this one, $tmpItem = StringSplit($tempArray[$searchSAM], ","), should be equivalent.

Am I wrong?

$searchSAM = _ArraySearch($tempArray, $CompleteArray[$d][0], 2, 0, 0, 1) ; <-- this finds something only if $CompleteArray[$d][0] is inside $tempArray
If $searchSAM > 0 Then

    ;If found, $searchSAM holds the position of the item just found
    $tmpItem = StringSplit($tempArray[$searchSAM], ",") ;                  <-- so here you can use $tempArray[$searchSAM]
    ;                                                                          or $CompleteArray[$d][0], because both are the same
    ;                                                                          is that right ???



I'm sorry Chimp, I just got what you meant.

I can try it on the new script and see if there's any improvement. I should try it on the old one though; over a 4-hour run we'd at least be able to see an improvement, if there is any. With a 15-minute run I think the difference is going to be tiny.

Anyway, I finally got what I wanted from the whole script! Thanks guys :)


