
Posted

Hello,

Can you suggest the fastest way of exporting large CSV files to MySQL using the UDFs created by cdkid? I have thought of two solutions:

1. Export the content of the .csv files to arrays and then use _MySQLAddRecord to add the arrays to the database; I am afraid it will take a long time to read the values from the CSV into arrays.

2. Use _MySQLExec to execute LOAD DATA INFILE directly in MySQL, and just put the headers from the CSV file into an array to use for the column names (see the sketch below).

Other suggestions would be great. I would really appreciate some examples, too.
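
For reference, here is roughly what I have in mind for option 2 (only _MySQLExec and _MySQLAddRecord are from cdkid's UDF for certain; the include name, _MySQLConnect, and the closing call are guesses, and all paths and table names are placeholders):

#include <Array.au3>
#include "mysql.au3" ; cdkid's MySQL UDF; the exact include name is assumed

; Put the CSV header row into an array and build the column list from it;
; assumes plain, unquoted header names
Local $aCols = StringSplit(FileReadLine("C:\data\big.csv", 1), ",")
Local $sCols = "(" & _ArrayToString($aCols, ", ", 1) & ")"

; Connection details are placeholders; _MySQLConnect is an assumed UDF name
Local $oConn = _MySQLConnect("user", "password", "mydb", "localhost")

; One statement hands the whole file to the server - no row loop in AutoIt.
; Forward slashes in the path avoid MySQL's backslash escaping;
; IGNORE 1 LINES skips the header row already consumed above.
_MySQLExec($oConn, "LOAD DATA LOCAL INFILE 'C:/data/big.csv' " & _
        "INTO TABLE mytable " & _
        "FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '""' " & _
        "LINES TERMINATED BY '\r\n' " & _
        "IGNORE 1 LINES " & $sCols)

_MySQLEndConnection($oConn) ; exact closing-call name assumed

LOAD DATA LOCAL INFILE needs local_infile enabled on the server, but it avoids the per-row round trips that would make option 1 slow.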

Thank you,

Ionut

Posted

I've been using this - truly awesome

props to KAFU

http://www.autoitscript.com/forum/index.php?showtopic=81346&view=findpost&p=583923

Posted


Gcue,

Let me know if I got this straight: I can use _SQLite_Exec to create SQLite tables out of my CSVs and then use _SQLite_Dump to write these tables to an SQLite DB. Is this correct?

Thanks,

Ionut

Posted

You'd have to put your CSV into an array first, then use _SQLite_Exec to create the table and again to insert the records, then dump to a file. A sketch follows:
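
A minimal sketch of that flow with the standard SQLite UDF (the table layout and file names are invented for the example, and the naive comma split assumes no quoted commas in the data):

#include <SQLite.au3>
#include <File.au3>

_SQLite_Startup()
Local $hDb = _SQLite_Open("C:\data\import.db") ; file-backed DB, so nothing left to dump

_SQLite_Exec($hDb, "CREATE TABLE IF NOT EXISTS mytable (col1 TEXT, col2 TEXT);")

; Wrap all INSERTs in one transaction - without BEGIN/COMMIT every INSERT
; is its own disk commit and tens of thousands of rows crawl
Local $aLines, $aF, $i
_FileReadToArray("C:\data\big.csv", $aLines)
_SQLite_Exec($hDb, "BEGIN;")
For $i = 2 To $aLines[0] ; row 1 is the header
    $aF = StringSplit($aLines[$i], ",")
    _SQLite_Exec($hDb, "INSERT INTO mytable VALUES (" & _
            _SQLite_FastEscape($aF[1]) & "," & _SQLite_FastEscape($aF[2]) & ");")
Next
_SQLite_Exec($hDb, "COMMIT;")

_SQLite_Close($hDb)
_SQLite_Shutdown()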

Posted

Yep, gave it a try, but it takes forever to populate an array with more than 65k rows. This solution does not really work out for me. Thanks a lot anyway!

Regards,

Ionut

Posted (edited)

If the CSV is not too big and you have a tool like DataPump in Delphi, you can try:

1. Save the CSV as a DBF in Excel

2. Use DataPump/ODBC to pump the data to MySQL

I think it might work in theory, but I have never tried it.

Or maybe DataPump will work with CSV directly.

Another way is to find a free data-pump tool on the net which does the trick.

Edited by tobject
  • 2 weeks later...
Posted (edited)

Sorry so late...

I generally IMPORT the large CSV files, up to and over 1 GB, without problems (3.5 GB is the most I've imported). I prepare these files with AutoIt and MS Access, depending on how they come in to me and what their problems are. Most of the data I get is very dirty and requires parsing, standardization, etc., before any import operation. You can control the import with AutoIt, MS Access, .bat files, whatever suits your need and configuration. AutoIt3 with StringRegExpReplace solves most problems, for me at least.
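
As a rough illustration of that kind of clean-up pass (the patterns and paths here are made-up examples, not from any particular job):

Local $sRaw = FileRead("C:\data\dirty.csv")

; Collapse runs of spaces/tabs inside fields to a single space
$sRaw = StringRegExpReplace($sRaw, "[ \t]+", " ")

; Drop control characters and other bytes that tend to break CSV parsers
$sRaw = StringRegExpReplace($sRaw, "[^\x20-\x7E\r\n]", "")

Local $hOut = FileOpen("C:\data\clean.csv", 2) ; mode 2 = write, erase previous contents
FileWrite($hOut, $sRaw)
FileClose($hOut)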

Edited by setirich


Posted

You can use Excel to open the CSV file and then transfer it to an array using something like:

; Read the whole used range into a 2D array in a single COM call
$aArray = $oExcel.ActiveSheet.UsedRange.Value
; Transpose swaps rows and columns (the range comes back transposed via COM)
$aArray2 = $oExcel.Transpose($aArray)

It is very fast.
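
In context, with a bare COM object (the file name is a placeholder; note that pre-2007 .xls sheets cap at 65,536 rows, which matters at the sizes discussed above):

Local $oExcel = ObjCreate("Excel.Application")
$oExcel.Visible = False
$oExcel.Workbooks.Open("C:\data\big.csv") ; Excel parses the CSV on open

; One COM call pulls the whole sheet into a 2D array
Local $aArray = $oExcel.ActiveSheet.UsedRange.Value
; Transpose swaps rows and columns (the range comes back transposed via COM)
Local $aArray2 = $oExcel.Transpose($aArray)

$oExcel.Quit()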

Or you can convert the CSV to XLS and then use ADODB to open it as a database. A sketch follows:
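
A rough sketch of the ADODB route, using the old Jet provider for .xls files (file and sheet names are placeholders):

Local $oConn = ObjCreate("ADODB.Connection")
$oConn.Open("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\big.xls;" & _
        "Extended Properties=""Excel 8.0;HDR=Yes;""")

; Query the sheet like a table; the trailing $ is part of the sheet name
Local $oRS = $oConn.Execute("SELECT * FROM [Sheet1$]")
While Not $oRS.EOF
    ConsoleWrite($oRS.Fields(0).Value & @CRLF)
    $oRS.MoveNext()
WEnd

$oConn.Close()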

  • 7 years later...
Posted

SQLizer does this - just upload your file and hit convert. https://sqlizer.io

You’ll receive a SQL file with INSERT statements and TABLE definitions ready to be imported into your database.

It’s free for files up to 5k rows so you can always try it and see how you get on :)

  • Moderators
Posted

@LouiseP did you miss that this post is over 8 years old? Or that the person asking the question hasn't been on the forum since September of 2010? Please don't resurrect old threads.

"Profanity is the last vestige of the feeble mind. For the man who cannot express himself forcibly through intellect must do so through shock and awe" - Spencer W. Kimball

How to get your question answered on this forum!

This topic is now closed to further replies.