
Queries on .txt file OR import .txt file in SQLite



Good evening everyone :)
I've been working on this little project for a week, and what I'm trying to do could be useful for many other users as well.
I'm trying to build a "Report Generator" which reads the data to report from a text file (.txt) formatted with this pattern:
Data1;Data2;Data3;Data4;Data5;;

Yes, there are two semicolons at the end of each line.

In detail, Data1 is a date/time stamp in this format: YYYY/MM/DD HH:MM:SS;

When the script starts, the user is prompted to choose two dates, which I'll call:

  • Report_Date_Start;
  • Report_Date_End.

So the report should cover all dates between Report_Date_Start and Report_Date_End.

And already at this point I don't know how to do the query... How can I tell the script:
SELECT * FROM (.txt) WHERE Data1 BETWEEN Report_Date_Start AND Report_Date_End; ?

I thought I could use _DateDiff(), but if the difference between the two dates is measured in months rather than days, how do I handle that?
Should I put the _DateDiff() result in a Switch...Case and then calculate every date between Report_Date_Start and Report_Date_End? But then, how would I compare the dates in the file against all the dates in that range? I'm going crazy, I know...
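
A side note on the date test: because the timestamp format is fixed-width and zero-padded (YYYY/MM/DD HH:MM:SS), a plain text comparison already sorts chronologically, so no _DateDiff() juggling is needed to decide whether a line falls inside the range. A minimal sketch of that idea, where the file name is an assumption and element [1] of the split is Data1:

#include <File.au3>

; Hypothetical boundary values; in the real script these come from the two date prompts.
Local $sReport_Date_Start = "2017/05/01 00:00:00"
Local $sReport_Date_End   = "2017/05/31 23:59:59"

Local $aLines, $aFields
_FileReadToArray("AuditReport.txt", $aLines) ; assumed file name
If @error Then Exit

For $i = 1 To $aLines[0] ; element [0] holds the line count
    $aFields = StringSplit($aLines[$i], ";") ; [0] = field count, [1] = Data1 (the timestamp)
    If StringCompare($aFields[1], $sReport_Date_Start) >= 0 _
            And StringCompare($aFields[1], $sReport_Date_End) <= 0 Then
        ConsoleWrite("In range: " & $aLines[$i] & @CRLF)
    EndIf
Next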

I've already made an "Export Tool" which exports the content of the .txt file into a .db managed with SQLite... There I could easily run a query like the one above, but this export takes 28 seconds for 1080 rows. And 1080 rows is roughly what gets appended to the .txt file every day, so in a week the file could easily reach 7000+ rows, which means the export would take about 3 minutes... and it only keeps growing.
Just to be complete, I'll post what I've done for the export, so maybe someone can tell me how to improve its efficiency:
 

#include <File.au3>
#include <MsgBoxConstants.au3>
#include <SQLite.au3>

; Read the whole audit report file into an array ($hFileDB_Report is an already opened SQLite handle).
Local $aContenutoFileAuditReport = ""
_FileReadToArray($sFileAudit_Report, $aContenutoFileAuditReport)
If Not @error Then
    Local $aContenutoFileAuditReport_Splitted = ""
    Local $sQuery = ""
    Local $hInizioConteggio = TimerInit()
    For $i = 1 To UBound($aContenutoFileAuditReport) - 1
        ; NOTE: with the default flag, StringSplit() returns the field count in element [0]
        ; and the fields themselves in [1]..[n], so check that these indexes match the file layout.
        $aContenutoFileAuditReport_Splitted = StringSplit($aContenutoFileAuditReport[$i], ";")
        $sQuery = "INSERT INTO FileDB_Report(DATESTAMP, TIMESTAMP, USER_ID, OBJECT_ID, DESCRIPTION, COMMENT) " & _
                  "VALUES(" & _
                  _SQLite_FastEscape($aContenutoFileAuditReport_Splitted[0]) & "," & _
                  _SQLite_FastEscape($aContenutoFileAuditReport_Splitted[1]) & "," & _
                  _SQLite_FastEscape($aContenutoFileAuditReport_Splitted[3]) & "," & _
                  _SQLite_FastEscape($aContenutoFileAuditReport_Splitted[4]) & "," & _
                  _SQLite_FastEscape($aContenutoFileAuditReport_Splitted[5]) & "," & _
                  _SQLite_FastEscape($aContenutoFileAuditReport_Splitted[6]) & ");"
        If _SQLite_Exec($hFileDB_Report, $sQuery) <> $SQLITE_OK Then
            ConsoleWrite("Error while executing query #" & $i & @CRLF)
        Else
            ConsoleWrite("Query #" & $i & " executed successfully" & @CRLF)
        EndIf
    Next
    ConsoleWrite("Export completed in: " & Round(TimerDiff($hInizioConteggio) / 1000, 0) & " seconds" & @CRLF)
Else
    MsgBox($MB_ICONERROR, "Error!", "Error while reading the file into the array." & @CRLF & "Error code: " & @error)
EndIf

I know that I can't do queries from a .txt file...
I've been writing this post since around 18:40, I think...
By the way, if @jchd or someone else could tell me whether I can import a formatted .txt file into SQLite and then run queries on the DB, I'd be very happy.
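
For what it's worth, once the rows are in the FileDB_Report table, that date filter becomes a single query. A minimal sketch using _SQLite_GetTable2d(), reusing the $hFileDB_Report handle and column names from the export code above, and assuming the stored dates are zero-padded text so BETWEEN compares them chronologically:

#include <Array.au3>
#include <SQLite.au3>

; Hypothetical boundary values returned by the two date prompts.
Local $sReport_Date_Start = "2017/05/01"
Local $sReport_Date_End   = "2017/05/31"

Local $aResult, $iRows, $iColumns
Local $iRet = _SQLite_GetTable2d($hFileDB_Report, _
        "SELECT * FROM FileDB_Report WHERE DATESTAMP BETWEEN " & _
        _SQLite_FastEscape($sReport_Date_Start) & " AND " & _
        _SQLite_FastEscape($sReport_Date_End) & ";", _
        $aResult, $iRows, $iColumns)
If $iRet = $SQLITE_OK Then _ArrayDisplay($aResult, "Rows in range")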
About the PDF report, I'm talking with @taietel to find out how to create a PDF.
I hope someone will help me :)
Sorry for the "long" list of questions...
Thank you for everything you've done for me :)
I have to say that this is the programming community I've loved the most! :D
By the way, I'll be back tomorrow morning ( ~ 9:15 a.m. Italian time ), so excuse me if I can't answer before then.
Hope you guys have a wonderful day/night.


Thanks again :)

Francesco


28 s for 1080 rows seems a lot. Well, it depends on what Data2 to Data5 look like, but anyway that seems a lot.

The SQLite command-line interface (sqlite3.exe, aka the CLI) allows direct import of a CSV-formatted file into a DB. Run sqlite3.exe -help to see the options. You can invoke it through a _SQLite_SQLiteExe() function call.

All you'll have to do is sanitize your input. To stay safe from possible single quotes in the data fields, you can read the whole file at once, StringReplace single quotes with doubled single quotes, StringReplace double quotes with doubled double quotes, StringRegExpReplace the result to insert double quotes around string fields, and supply that to sqlite3.exe as a CSV file.
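
A rough sketch of that route, assuming sqlite3.exe is reachable by the UDF, that the FileDB_Report table already exists in the database, and that the file names are placeholders (the StringRegExpReplace step that wraps fields in quotes is left out here, and the exact dot-commands may need tweaking for your CLI version):

#include <FileConstants.au3>
#include <SQLite.au3>

; Very rough sanitising pass over the raw report before handing it to sqlite3.exe.
Local $sRaw = FileRead("AuditReport.txt")                 ; assumed source file name
$sRaw = StringReplace($sRaw, "'", "''")                   ; single quotes -> doubled single quotes
$sRaw = StringReplace($sRaw, '"', '""')                   ; double quotes -> doubled double quotes

Local $hCsv = FileOpen("AuditReport.csv", $FO_OVERWRITE)  ; temporary CSV handed to the CLI
FileWrite($hCsv, $sRaw)
FileClose($hCsv)

; Drive sqlite3.exe through the UDF: CSV mode, ";" separator, then one bulk .import.
Local $sOutput
Local $sCommands = ".mode csv" & @CRLF & _
        '.separator ";"' & @CRLF & _
        ".import AuditReport.csv FileDB_Report" & @CRLF
_SQLite_SQLiteExe("FileDB_Report.db", $sCommands, $sOutput) ; assumed database file name
If @error Then ConsoleWrite("CLI import failed, @error = " & @error & @CRLF)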

Otherwise, wrap your bulk inserts inside a transaction, you know: "begin;" ... "end;". That alone will cut your 28 s down significantly.
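
Applied to the export loop above, that would look roughly like this (a sketch reusing $hFileDB_Report and the arrays from the earlier snippet; the field indexes here are illustrative and should be matched to the real file layout):

; Same export loop, but wrapped in one transaction so SQLite commits once instead of once per row.
_SQLite_Exec($hFileDB_Report, "BEGIN;")
For $i = 1 To UBound($aContenutoFileAuditReport) - 1
    $aContenutoFileAuditReport_Splitted = StringSplit($aContenutoFileAuditReport[$i], ";")
    $sQuery = "INSERT INTO FileDB_Report(DATESTAMP, TIMESTAMP, USER_ID, OBJECT_ID, DESCRIPTION, COMMENT) VALUES(" & _
              _SQLite_FastEscape($aContenutoFileAuditReport_Splitted[1]) & "," & _
              _SQLite_FastEscape($aContenutoFileAuditReport_Splitted[2]) & "," & _
              _SQLite_FastEscape($aContenutoFileAuditReport_Splitted[3]) & "," & _
              _SQLite_FastEscape($aContenutoFileAuditReport_Splitted[4]) & "," & _
              _SQLite_FastEscape($aContenutoFileAuditReport_Splitted[5]) & "," & _
              _SQLite_FastEscape($aContenutoFileAuditReport_Splitted[6]) & ");"
    If _SQLite_Exec($hFileDB_Report, $sQuery) <> $SQLITE_OK Then ConsoleWrite("Error on row #" & $i & @CRLF)
Next
_SQLite_Exec($hFileDB_Report, "END;")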


Good morning @jchd :)
I'll try what you suggested :)
By the way, how can I set up sqlite3.dll on my system?
I made a post where I couldn't register sqlite3.dll properly, and I ended up having to download, register and use sqlite3_x64.dll in the script.
How can I set this "environment" up properly? Thank you :)


You don't have to register the dll to use it. Just drop it in the directory of the executable, or in @SystemDir. No environment setup is needed either.
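
In practice that just means pointing _SQLite_Startup() at the dll sitting next to the script; a minimal sketch (the file name is whichever dll matches your script's bitness, e.g. sqlite3_x64.dll for x64):

#include <SQLite.au3>

; Load the SQLite dll placed next to the script/executable; no registration is needed.
_SQLite_Startup(@ScriptDir & "\sqlite3.dll") ; or "\sqlite3_x64.dll" when running as x64
If @error Then
    MsgBox(16, "SQLite", "Could not load the SQLite dll.")
    Exit
EndIf
ConsoleWrite("SQLite version: " & _SQLite_LibVersion() & @CRLF)

Local $hDB = _SQLite_Open() ; in-memory database, just to prove the dll works
_SQLite_Close($hDB)
_SQLite_Shutdown()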


