
How do I read file contents and upload as BLOB into MySQL Server?


Solved by poila


Hi to all,

I am adding a feature to an existing internal AutoIt program, based on a client's request.

The requested feature is a button that lets the user browse for a file and captures its path.

Below the file-browsing button would be a second button to upload the file to a MySQL server.

My initial approach is to read the file at the given path, convert its contents into binary data, and store that in a table column as a BLOB.
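For what it's worth, the read-then-insert flow can be sketched in Python; this uses the stdlib sqlite3 module as a stand-in for MySQL, and the table and function names are made up for illustration (a MySQL driver would take the same parameterized shape, just with %s placeholders instead of ?):

```python
import sqlite3

def store_file_as_blob(db_path, file_path):
    """Read a file in binary mode and insert its bytes as a BLOB."""
    with open(file_path, "rb") as f:   # binary mode: no text decoding
        data = f.read()
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS filestorage (name TEXT, data BLOB)")
    # Parameterized insert: the driver handles binary escaping safely
    conn.execute("INSERT INTO filestorage (name, data) VALUES (?, ?)",
                 (file_path, sqlite3.Binary(data)))
    conn.commit()
    conn.close()
    return len(data)
```

The key points are opening the file in binary mode (so nothing gets decoded or translated) and letting the driver escape the bytes rather than splicing them into the SQL string by hand.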

I am not sure whether this is the best approach to storing text/Word/PDF files in a MySQL server from an AutoIt program.

So far I have only been able to generate Excel files, and only because a BLOB column already existed in one of the tables.

Sample code:

$sTemplateName = "SearchTemplate"
$sFilePath = @ScriptDir & "\temp\" & $sTemplateName & ".xls"
$iFileExists = FileExists($sFilePath)

; If template file not found, create it.
If Not $iFileExists Then
    generateExcelFile($sTemplateName, $sFilePath)
EndIf

$iFileExists = FileExists($sFilePath)

If $iFileExists = 1 Then
    $oExcel = _ExcelBookOpen($sFilePath, 0)
    _ExcelWriteCell($oExcel, "Search conducted on " & _NowCalc() & " by " & $username, 2, 1)

    $counter = 0
    While Not $recordSet.EOF
        _ExcelWriteCell($oExcel, 1 + $counter, 6 + $counter, 1)
        _ExcelWriteCell($oExcel, getName("User", $recordSet.Fields("UserID").value), 6 + $counter, 2)
        _ExcelWriteCell($oExcel, $recordSet.Fields("InternalName").value, 6 + $counter, 3)
        _ExcelWriteCell($oExcel, getName("Product", $recordSet.Fields("ProductID").value), 6 + $counter, 4)
        _ExcelWriteCell($oExcel, $recordSet.Fields("Serial").value, 6 + $counter, 5)
        _ExcelWriteCell($oExcel, getName("Location", $recordSet.Fields("LocationID").value), 6 + $counter, 6)
        _ExcelWriteCell($oExcel, $recordSet.Fields("Remarks").value, 6 + $counter, 7)
        _ExcelWriteCell($oExcel, getFriendlyDate($recordSet.Fields("LastModified").value, 1), 6 + $counter, 8)
        _ExcelWriteCell($oExcel, getFriendlyDate($recordSet.Fields("CreationDate").value, 1), 6 + $counter, 9)

        $recordSet.MoveNext
        $counter = $counter + 1
    WEnd

    _ExcelBookSaveAs($oExcel, $exportDir & "\" & $username & "_Search_Results_" & @YEAR & @MON & @MDAY & "_" & @HOUR & @MIN & @SEC, "xls")
    _ExcelBookClose($oExcel, 1, 0)
EndIf
Edited by poila
  • Solution

No one replied to this thread, but after digging *a lot* deeper I found some gems:

Read and write binary file (embedded forum link)

Getting a file extension (embedded forum link)

Opening a file automatically with its default application (embedded forum link)

Adapting another user's existing code to suit my customised needs (embedded forum link)

Final sample:

#cs
   Function name:   uploadFileToServer
   Description:     Inserts a binary file into the BLOB column of the file-storing table
   Parameter(s):    $sFilePath - String containing the path of the file to upload
                    $iUserID - Integer containing the current user's ID
                    $iFileID - Integer containing the FileID referenced from the FileRecord table
   Return Value(s): On success, returns 1. On failure, returns 0.
#ce
Func uploadFileToServer($sFilePath, $iUserID, $iFileID)
   ; Get the file extension
   Local $sTempFileExt = getFileExtension($sFilePath)

   ; Open the file in binary mode (16 = $FO_BINARY)
   Local $hTempFile = FileOpen($sFilePath, 16)
   If $hTempFile = -1 Then Return 0 ; could not open the file

   Local $iFileSize = FileGetSize($sFilePath)
   Local $sBinaryChars = FileRead($hTempFile, $iFileSize)
   FileClose($hTempFile)

   ; Build the INSERT statement. When concatenated into a string, the binary
   ; variant is rendered as a 0x... hex literal, which MySQL accepts for BLOBs.
   Local $sSQLstatement_insertFile = "INSERT INTO filestorage (FileID,UserID,BinaryData,FileExtension) VALUES (" & $iFileID & "," & $iUserID & "," & $sBinaryChars & ",'" & $sTempFileExt & "')"
   Local $setFileBinaryData = _Query($global_SQLInstance, $sSQLstatement_insertFile)

   If $setFileBinaryData <> 0 Then
      Return 1
   Else
      Return 0
   EndIf
EndFunc   ;==>uploadFileToServer
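Concatenating the binary variant straight into the SQL only works because AutoIt renders binary data as a 0x... hex literal, which MySQL accepts as a BLOB value. For illustration, the same rendering step in Python (the function name is mine, not from the thread; a parameterized query is still the safer option where the driver supports it):

```python
def blob_to_hex_literal(data: bytes) -> str:
    """Render raw bytes as a MySQL hex literal (0x...), the same form an
    AutoIt binary variant takes when concatenated into a string."""
    return "0x" + data.hex().upper()
```

Because a hex literal contains only `0-9A-F`, it needs no quoting or escaping inside the statement, which is why the string-concatenation approach above does not corrupt arbitrary binary data.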

Pre-requisites:

- You must have the MySQL.au3 file from user cdkid (this can be found in the AutoIt Tutorials Wiki)

- getFileExtension() is a user-defined function, not native to AutoIt. Feel free to write your own UDF to get a file extension

- The table must have a LONGBLOB column

EDIT: This does not check the actual file size *before* the upload process.

If you would like to contribute a way to limit the user's file size before calling this function, please feel free to reply. Thanks.
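A pre-upload size check could look like this (Python sketch; the 16 MiB cap is an arbitrary example, and the server's max_allowed_packet must also be at least the file size for the INSERT to succeed):

```python
import os

MAX_UPLOAD_BYTES = 16 * 1024 * 1024  # example cap: 16 MiB

def file_within_limit(file_path: str, limit: int = MAX_UPLOAD_BYTES) -> bool:
    """Return True if the file exists and is small enough to upload."""
    return os.path.isfile(file_path) and os.path.getsize(file_path) <= limit
```

Calling this before uploadFileToServer() lets the GUI reject oversized files with a message box instead of failing mid-query.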


  • Similar Content

    • By AspirinJunkie
      I've noticed that for the task of "reading a file row by row into an array" FileReadToArray is consistently slower than a FileRead + StringSplit.
      The following script:
      Results in the following output for me:
      And the output with switched call sequence:
      This surprises me, since I would assume a native function has more room for optimization than building an array in AutoIt via an intermediate string.
      So I thought that maybe FileReadToArray is more economical with memory. But FileReadToArray also seems to consume more memory than the StringSplit variant. To check this I read the PeakWorkingSetSize after the call of each function.
      After FileReadToArray the size is always higher than with StringSplit. Since this is not the case if one changes the order of the two functions (see above), one can assume that FileReadToArray really does carry this overhead. The difference is mostly double the file size, which would make sense if the file is stored internally as a complete UTF-16 string.
      Can anyone explain to me why FileReadToArray performs so unexpectedly poorly - or am I missing something?
    • By argumentum
      $sQueryUpdateTime = "select intUpdateTime from tblStudies " . $where . " ORDER BY intUpdateTime DESC limit 1";
      $rs = mysqli_query($conn, $sQueryUpdateTime);
      $row = mysqli_fetch_assoc($rs);
      The above used to take 300+ ms to query. Then I made intUpdateTime an index and it takes 30 ms. Cool.
      $sQuery = "select * from tblStudies " . $where . " ORDER BY StudyDate DESC limit $offset,$rows"; // takes 30 ms on the indexed int
      $sQuery = "select * from tblStudies " . $where . " ORDER BY StudyDate DESC , PatientName ASC limit $offset,$rows"; // takes 300 ms because "PatientName" is a text field, even though I did index it
      So my observation is that "PatientName" takes a long time to sort, even though $rows = 20. Sorting text in 20 rows should be fast.
      Though I find that any second argument in the ORDER BY is just slow.
      Is there a way to write this query so that the results come back faster?
      Thanks
      PS: added ADD INDEX `StudyDate_2` (`StudyDate`, `PatientBirthDate`) USING BTREE; and searched by those two with not much speed change (StudyDate and PatientBirthDate are integers).
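A likely culprit for the slow two-column case: a mixed-direction ORDER BY (DESC, ASC) can only be served by an index whose column directions match; otherwise the server sorts the whole filtered set before applying LIMIT. A small demonstration of the principle using SQLite (chosen here only because it is easy to test; MySQL 8 supports descending index columns the same way):

```python
import sqlite3

def order_by_plan(use_index: bool) -> str:
    """Return the query plan for a two-column mixed-direction ORDER BY,
    with or without an index whose directions match the ORDER BY."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE tblStudies (StudyDate INTEGER, PatientName TEXT)")
    conn.executemany("INSERT INTO tblStudies VALUES (?, ?)",
                     [(20200101 + i, "name%d" % i) for i in range(100)])
    if use_index:
        # Index directions must match the ORDER BY directions to avoid a sort
        conn.execute("CREATE INDEX idx ON tblStudies (StudyDate DESC, PatientName ASC)")
    rows = conn.execute(
        "EXPLAIN QUERY PLAN SELECT * FROM tblStudies "
        "ORDER BY StudyDate DESC, PatientName ASC LIMIT 20").fetchall()
    conn.close()
    return " | ".join(r[-1] for r in rows)
```

Without the index the plan includes a "USE TEMP B-TREE FOR ORDER BY" step (a full sort); with a direction-matched index the rows come out of the index already ordered.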
    • By argumentum
      I was thinking but I don't have the experience, so you may have the experience.
      I was thinking of chopping a DB with 100,000 rows (20 columns) into 10,000-row chunks (10 DBs) and querying all of them concurrently to speed up a search, then merging the results. Is that a sound idea, or will I run into trouble down the road? Should a DB be kept in one piece?
      The DB is currently in MySQL. I want to do all this chopping and use SQLite.
      Thanks
      [solved]
    • By Clark
      Hello all
      A programme I wrote for the organisation for which I work has been in use for a number of years with no issue.  One of the things that it does is allow for attachments to be selected (Word docs, Excel sheets, etc) which it then writes away as a BLOB in MySQL.
      This was all working fine until recently, when suddenly, on the odd occasion, Word would advise that the file was corrupt on retrieval of the BLOB from MySQL.
      Upon further examination it appears that the issue is not with the retrieval of the data from MySQL, but with AutoIt reading the data from disk. It seems that on occasion AutoIt will only read half the file.
      I have no idea why this is, as the whole function is very simple. I have included a sanitised version of it below in the hope that someone can tell me where to look next, as this one has me beat (again). 😋
      I have put some comments in to show what I have done to try and solve the problem, but am getting no closer to the truth. @error is the strangest one - it should return -1 when it reaches end of file but is returning zero.
      Func _AttachAttachment()
          Local $file, $chars, $fsize
          If $RFC_Retrieved = False Then
              MsgBox(0, "Error", "Must have a RFC active first")
          Else
              $file = FileOpenDialog("Attach to RFC", ".", "All (*.*)", 1)
              ; Check if file opened for reading OK
              If $file = -1 Then
                  MsgBox(0, "Error", "Unable to open file.")
                  Exit
              EndIf
              ; FileClose($file)   These two lines are my attempt to open the file in binary mode. Made no difference
              ; FileOpen($file, 16)
              ; Read in the file
              $fsize = FileGetSize($file)
              ConsoleWrite("File size is " & $fsize & @CRLF) ; the size comes back as what it says in Windows properties
              ; $chars = FileRead($file, $fsize) ; I was using this, then tried the line following this. It didn't help
              $chars = FileRead($file) ; This is showing a figure $fsize/2 - 1. I don't understand this
              ConsoleWrite("Number of chars read is " & @extended & " error code is " & @error & @CRLF) ; @error is coming back with 0. I don't understand this as I thought it would be -1
              FileClose($file)
          EndIf
      EndFunc
      Clark
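One possible explanation for the size/2 - 1 figure (an assumption, not confirmed in the thread): the affected files are UTF-16 on disk, and reading them in text mode reports decoded characters rather than raw bytes: two bytes per character, minus the byte-order mark. The same arithmetic in Python:

```python
import os

def char_vs_byte_count(path, encoding):
    """Compare a file's on-disk byte count with its decoded character count."""
    size = os.path.getsize(path)              # raw bytes on disk
    with open(path, "r", encoding=encoding) as f:
        chars = len(f.read())                 # characters after decoding
    return size, chars
```

For a UTF-16 file, chars == size/2 - 1 (the "- 1" being the 2-byte BOM), which is exactly the half-the-file symptom described. Opening the returned handle in binary mode (and actually using that handle) would report the full byte count.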
       
    • By Case85
      Hi all!
       
      After a long time wasted trying to find the best way to connect to an external MySQL server, I'm ready to hire a C or C++ developer to use the MySQL connector DLL to create a stable way to connect to a MySQL server.
      I already tried the UDF "miniSQL - A simple Standalone PDO-syntax-like MySQL-library with AutoitObject", which works perfectly (this is the approach I'm looking for), but unfortunately after 1-2 hours of use it fails and exits the script.
      I plan to use the MySQL connector heavily and continuously over a long period.
      The developed connector must have a small memory footprint.
      My budget is moderate and I need this connector as quickly as possible; please contact me by PM for more details.
      Thank you all.