argumentum Posted Saturday at 11:19 PM

I have a CSV file. It is 3.12 GB (3,357,068,152 bytes). FileReadToArray() loaded it (just to get the count) and UBound() says it is 20,073,124, which just happens to be a tiny bit more than the roughly 16 million element maximum an array can hold. How can a file that size be read, split into @CRLF chunks, filtered (If Not StringInStr($aFile[$n], ".au3") Then ContinueLoop), and the result written to another file? Anything goes: SQLite, memory, etc. Thanks.
Nine Posted yesterday at 03:52 AM

Read it line by line. First thing that comes to mind...
argumentum Posted yesterday at 04:27 AM Author (edited)

35 minutes ago, Nine said:
First thing that comes to mind...

First thing that came to my mind was to read up to position X, split that and work on it, then read the next chunk until done 🤷‍♂️ Line by line would be kind of slow, I think.

Edited yesterday at 04:28 AM by argumentum
jchd Posted 19 hours ago (edited)

Import into SQLite and apply whatever SQL magic massage your data needs. You can just use the SQLite CLI (the command-line interface) with an in-memory (default) or disk-based database. You can define row and column separators if needed, .import the file as --csv, apply SQL to filter out unwanted rows, then rewrite the resulting table to CSV or whatever format you wish. If this operation becomes routine, store all these commands in a file and run it with the CLI. Use a recent sqlite3.exe; it is now full of rich features.

Else slap a good regex in the face of the file!

Edited 19 hours ago by jchd
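As a sketch of the CLI route jchd describes (the file names, table name, and column used in the WHERE clause are assumptions for illustration, not from the thread; only the ".au3" filter comes from the first post), a command file for a recent sqlite3.exe could look like this:

```
-- filter.sql : a sketch for a recent sqlite3 CLI; huge.csv, filtered.csv,
-- table t and column col1 are hypothetical names -- adjust to your data.
.import --csv huge.csv t          -- first row of huge.csv supplies the column names
.mode csv
.headers on
.once filtered.csv                -- the next query's output goes to filtered.csv
SELECT * FROM t WHERE col1 LIKE '%.au3%';
```

Run it as `sqlite3.exe < filter.sql`; with no database file argument, the CLI works in an in-memory database by default.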
jchd Posted 19 hours ago

11 hours ago, argumentum said:
FileReadToArray() loaded ( just the count ) and UBound() say that is 20073124 and that just happens to be a tiny bit more that the max of 16 million elements an array can hold.

IIRC the 16M limit is just an order of magnitude; the actual limit depends significantly on the volume of data.
Solution Nine Posted 17 hours ago (edited)

8 hours ago, argumentum said:
Line by line would be kind of slow I think

In fact it would be faster. You do not need to create a useless intermediate array, and reading a bunch of lines at once won't make it faster either: the OS already prefetches the file into memory buffers. I remember trying an overlapped read to perform other tasks while reading the file; it ended up with very little gain but made the script way more complex. Try it and you will be gladly surprised.

Edited 16 hours ago by Nine
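A minimal AutoIt sketch of that line-by-line approach (the file names are hypothetical; the ".au3" test is the filter from the first post):

```autoit
#include <FileConstants.au3>
#include <MsgBoxConstants.au3>

; Hypothetical file names -- adjust to your paths.
Local $hIn = FileOpen("huge.csv", $FO_READ)
Local $hOut = FileOpen("filtered.csv", $FO_OVERWRITE)
If $hIn = -1 Or $hOut = -1 Then Exit MsgBox($MB_ICONERROR, "Error", "Cannot open files")

Local $sLine
While 1
    $sLine = FileReadLine($hIn)   ; reads the next line; sets @error at EOF
    If @error Then ExitLoop
    If Not StringInStr($sLine, ".au3") Then ContinueLoop
    FileWriteLine($hOut, $sLine)
WEnd

FileClose($hIn)
FileClose($hOut)
```

No intermediate array is built, so the 16M element limit never comes into play and memory use stays flat regardless of file size.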
argumentum Posted 7 hours ago Author

9 hours ago, Nine said:
Try it and you will be gladly surprised.

True. I did, and yes, it was good.