
Send $array() through SQL Query


marko001

Hi all,

I have an array and I need to send all of its values through an SQL update.

I now use the following:

Func _upload($index)
    If Not _EzMySql_Startup() Then
        _status(1, "Unable to Connect")
    EndIf
    If Not _EzMySql_Open("myserver", "username", "pw", "db", "port") Then
        _status(1, "Unable to Connect")
    Else
        For $i = 0 To UBound($csv_list) - 1
            If $csv_list[$i][5] > 0 Then
                $query = "INSERT INTO `My_Table` (`id`,`name`, `data`,`object`, `ind`,`open`,`total`,`h1`,`h2`) VALUES ('', '" & $index & "', '" & $csv_list[$i][0] & "', '" & $csv_list[$i][1] & "', '" & $csv_list[$i][2] & "', '" & $csv_list[$i][3] & "', '" & $csv_list[$i][4] & "', '" & $csv_list[$i][5] & "', '" & $csv_list[$i][6] & "')"
                $res = _EzMySql_Exec($query)
                If @error Then
                    MsgBox(0, "1", _EzMySql_ErrMsg())
                EndIf
            EndIf
        Next
        _status(2, "Updated")
        _EzMySql_Close()
        _EzMySql_ShutDown()
    EndIf
EndFunc   ;==>_upload

This takes time, since I have to loop through every line of the array (it can be 1 to 50,000 lines).

Is there a way to send the whole array through a single query?

Thanks,

Marco


Hmm,

I see that to use LOAD DATA INFILE I need to place the .txt file on the server where the DB is.

To read server-side files, I must have the FILE privilege.

But I believe I can't FTP there... if I FTP, I end up in my own space, not where the DB is. Is there a way to use a local file (without the For...Next cycle), or a remote file (maybe placed in a folder in the web root, like www.mydomain.com/files/filetoimport.txt)?

I tried

$query = "LOAD DATA LOCAL INFILE '" & $filetotable & "' INTO TABLE `export` FIELDS TERMINATED BY '|'"

It returns the following error: "The used command is not allowed with this MySQL version"

Looking at my DB, the MySQL version is 5.0.91-log.

But http://dev.mysql.com/doc/refman/5.0/en/load-data.html documents the same command, so it should work...

Re-Edit: maybe I see why: I don't have file access on the remote DB server, obviously... back to the first solution??
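For reference, the full approach I was attempting would look roughly like this — just a sketch, assuming the same _EzMySql UDF as in my first post; the function name, temp-file path, and pipe-delimited layout are my own choices for illustration:

; Write the array to a pipe-delimited temp file, then bulk-load it.
; Sketch only: assumes the server/client allow LOCAL INFILE.
Func _UploadViaInfile($index)
    Local $sFile = @TempDir & "\filetotable.txt"
    Local $hFile = FileOpen($sFile, 2) ; 2 = open for write, erase previous contents
    If $hFile = -1 Then Return SetError(1, 0, False)
    For $i = 0 To UBound($csv_list) - 1
        If $csv_list[$i][5] > 0 Then
            ; leading "|" leaves the id column empty, then name, then the seven array columns
            FileWriteLine($hFile, "|" & $index & "|" & $csv_list[$i][0] & "|" & _
                    $csv_list[$i][1] & "|" & $csv_list[$i][2] & "|" & $csv_list[$i][3] & "|" & _
                    $csv_list[$i][4] & "|" & $csv_list[$i][5] & "|" & $csv_list[$i][6])
        EndIf
    Next
    FileClose($hFile)
    ; MySQL expects forward slashes in file paths
    Local $query = "LOAD DATA LOCAL INFILE '" & StringReplace($sFile, "\", "/") & _
            "' INTO TABLE `export` FIELDS TERMINATED BY '|'"
    _EzMySql_Exec($query)
    If @error Then Return SetError(2, 0, False)
    Return True
EndFunc   ;==>_UploadViaInfile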

Are you getting the above error message? Are you attempting to import data from a file into MySQL using LOAD DATA LOCAL INFILE?

The reason this message crops up is due to a security setting in the MySQL configuration file (my.cnf). The following tweak to the file will fix the problem, however, MAKE SURE that none of your code is vulnerable to SQL injection attacks before you do this, as making this change could potentially allow someone to access local files on your server if you have any vulnerabilities in your queries. If you are using PHP, a good start would be making sure that all user entered data (GET, POST etc.) is run through the mysql_real_escape_string() function prior to being sent to MySQL. You have been warned!
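In AutoIt terms: the queries in this thread interpolate values directly into the SQL string, so the same warning applies. A naive escaping helper would at least handle quotes and backslashes — purely illustrative (the name _SqlEscapeNaive is made up), and not a substitute for proper escaping or prepared statements:

; Naive escaping sketch for values interpolated into SQL strings.
; Handles backslashes and single quotes, nothing more.
Func _SqlEscapeNaive($sValue)
    $sValue = StringReplace($sValue, "\", "\\") ; escape backslashes first
    $sValue = StringReplace($sValue, "'", "\'") ; then single quotes
    Return $sValue
EndFunc   ;==>_SqlEscapeNaive

; Usage: ... & "', '" & _SqlEscapeNaive($csv_list[$i][0]) & "', '" & ...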

This method assumes you have root access to the server. If not, you'll need to speak nicely to your server admin.

Find your my.cnf file. The exact location will depend on your OS/distribution. On Linux boxes it might be in /etc or /etc/mysql or something similar.

You will probably have a line in your my.cnf that says: set-variable=local-infile=0

Change it to: set-variable=local-infile=1

Restart MySQL.
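In context, the relevant my.cnf section would look something like this (the exact layout varies by distribution, and newer MySQL releases drop the set-variable= prefix in favour of a plain local-infile=1 under [mysqld]):

[mysqld]
set-variable=local-infile=1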

You should now be up and running. It goes without saying that I am not responsible for anything untoward that happens on your server as a result of following these instructions.

Edited by marko001

Geez, your version is 6 years old! You'd have to read a lot of change logs to see if anything has been changed regarding the LOAD DATA statement since then. Whether the "INSERT DELAYED" option would yield a performance increase I can't say. I see there is also an option that allows for at least a small batch of records to be added at once:

INSERT INTO tbl_name (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9);

I didn't see the maximum number of characters or rows you can append that way mentioned anywhere.

It could be an improvement over single INSERTs, but seems far from an optimum solution.

I might try upgrading to the latest version (if you have that option), and see if LOAD DATA behaves any differently.

I work with a database other than MySQL, so I'm not a lot of help.

Maybe a MySQL guru will pipe in...

Edited by Spiff59

Geez, your version is 6 years old! You'd have to read a lot of change logs to see if anything has been changed regarding the LOAD DATA statement since then. Whether the "INSERT DELAYED" option would yield a performance increase I can't say.

It's not my fault, it's the server side's fault ^^ ...

But as far as I know, version 5.0 supports this command as well; I believe the problem is just the set-variable=local-infile setting.


The hosting service doesn't allow me to use the "LOAD DATA LOCAL INFILE" command, so I can't send the whole file. I asked customer service and they denied my request to change it.

So am I still stuck with a For loop and a single INSERT query for every line? Or is there a faster way?

M.


With you not in control of the server, it sounds like your options are limited.

Have you checked whether the other method works, the one where multiple groups of data can be loaded into the VALUES clause?

It may be your only choice, but if you can INSERT two, or four, or ten rows at a time, that may be better than nothing.

You could try this modified version of _upload, varying the $batch_size parameter, to see how many additional rows (if any) you can send in a single INSERT...

#include <Array.au3> ; temp

Global $csv_list[20][7]

; create temp data
For $x = 0 To 19
    $csv_list[$x][0] = "DATA" & $x
    $csv_list[$x][1] = "OBJ" & $x
    $csv_list[$x][2] = "IND" & $x
    $csv_list[$x][3] = "OPEN" & $x
    $csv_list[$x][4] = "TOT" & $x
    $csv_list[$x][5] = $x + 1
    $csv_list[$x][6] = 0
Next
_ArrayDisplay($csv_list) ; temp

_upload("TEST", 2) ; number of rows per INSERT statement
Exit

;===================================================================================================================================
Func _upload($index, $batch_size)
    Local $x, $csv_size = UBound($csv_list) - 1
    Local $query_part1 = "INSERT INTO My_Table (`id`,`name`, `data`,`object`, `ind`,`open`,`total`,`h1`,`h2`) VALUES ", $query_part2
;   If Not _EzMySql_Startup() Then
;       _status(1, "Unable to Connect")
;   EndIf
;   If Not _EzMySql_Open("myserver", "username", "pw", "db", "port") Then
;       _status(1, "Unable to Connect")
;   Else
    For $i = 0 To $csv_size Step $batch_size
        $query_part2 = ""
        For $j = 0 To $batch_size - 1
            $x = $i + $j
            If $x > $csv_size Then ExitLoop ; avoid array error when $csv_size not evenly divisible by $batch_size
            If $csv_list[$x][5] > 0 Then
                $query_part2 &= "('','" & $index
                For $k = 0 To 6
                    $query_part2 &= "','" & $csv_list[$x][$k]
                Next
                $query_part2 &= "'),"
            EndIf
        Next
        If $query_part2 = "" Then ContinueLoop ; no qualifying rows in this batch, skip the empty INSERT
        $query_part2 = StringTrimRight($query_part2, 1) & ";" ; swap the trailing comma for a terminator
        MsgBox(1, "", $query_part1 & $query_part2) ; temp
;       $res = _EzMySql_Exec($query_part1 & $query_part2)
;       If @error Then
;           MsgBox(0, "1", _EzMySql_ErrMsg())
;       EndIf
    Next
;       _status(2,"Updated")
;       _EzMySql_Close()
;       _EzMySql_ShutDown()
;   EndIf
EndFunc   ;==>_upload
Edited by Spiff59

This is for 976 records (times in milliseconds):

Time with 5 rows x INSERT: 125307.893706976
Time with 10 rows x INSERT: 63281.3557364744
Time with 20 rows x INSERT: 32592.3026092089
Time with 50 rows x INSERT: 14136.3903484292
Time with 100 rows x INSERT: 8479.93148180226
Time with 200 rows x INSERT: 5462.08146480102
Time with 1000 rows x INSERT: 3766.54284632805
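(For reference, timings like these come from AutoIt's TimerInit/TimerDiff pair, which reports milliseconds; the harness isn't shown above, but a minimal one would be:)

Local $hTimer = TimerInit()
_upload("TEST", $batch_size)
ConsoleWrite("Time with " & $batch_size & " rows x INSERT: " & TimerDiff($hTimer) & @CRLF)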

1,000 rows per INSERT, at 3.7 seconds, is the minimum for 976 records.

So I think I can modulate $batch_size based on UBound($csv_list), up to 1,000.

Good work mate, I think this can be used as a template for mass inserts when LOAD DATA INFILE is not available from the host.

M.

Edited by marko001

To optimize the cycle:

If $csv_size >= 1000 Then ; number of records
    $batch_size = 1000
Else
    $batch_size = $csv_size
EndIf

before the For...Next loop.

(It can also be done with more than 1,000; I noticed the gain is no longer as significant with higher values.)

Edited by marko001

Wow, I'd have expected you to hit some string limit long before being able to load that many rows into one INSERT statement!

(they did all get added?)

Am glad to have been of some assistance :graduated:

Edit: I found this out there...

"The server's default max_allowed_packet value is 1MB. You can increase this if the server needs to handle larger queries "

So it looks like you have plenty of room to play.
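(If you can run ad-hoc queries against the server, the current limit can be checked with a standard statement:)

SHOW VARIABLES LIKE 'max_allowed_packet';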

PS - From your test results... it must have been miserably slow at one row per call!

Edited by Spiff59
