
The fastest way to find duplicate lines [solved]



I have a txt file which contains 1,000,000 lines.

I want to delete the duplicate lines.

I have tried the following code; however, it runs very slowly:

#include <Array.au3>
#include <File.au3>

Local $array
_FileReadToArray('test.txt', $array)
Local $aArrayUnique = _ArrayUnique($array)
_FileWriteFromArray('test_unique.txt', $aArrayUnique)

 

Can anyone suggest a faster way? (I want the code to produce the result in no more than 20 seconds.)

 

 


You can probably somehow speed up de-dup of that many lines, but the main point is that a flat text file isn't suitable for a routine task like that. You'd benefit hugely from converting to a database file. FYI SQLite is pretty easy to use and well supported from AutoIt.
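
A minimal sketch of that approach, for illustration only (it assumes the standard SQLite.au3 UDF that ships with AutoIt and reuses the OP's test.txt / test_unique.txt file names): insert every line into an in-memory table with a UNIQUE column using INSERT OR IGNORE, then write the surviving rows back out in their original order.

#include <SQLite.au3>

_SQLite_Startup() ; load sqlite3.dll (must be available next to the script or on the search path)
Local $hDB = _SQLite_Open() ; in-memory database
_SQLite_Exec($hDB, "CREATE TABLE lines (txt TEXT UNIQUE);")

; wrap all inserts in a single transaction, otherwise a million individual inserts are very slow
_SQLite_Exec($hDB, "BEGIN;")
Local $hFile = FileOpen("test.txt")
Local $sLine
While 1
    $sLine = FileReadLine($hFile)
    If @error Then ExitLoop
    ; duplicates are silently dropped by the UNIQUE constraint
    _SQLite_Exec($hDB, "INSERT OR IGNORE INTO lines VALUES (" & _SQLite_FastEscape($sLine) & ");")
WEnd
FileClose($hFile)
_SQLite_Exec($hDB, "COMMIT;")

; write the unique lines back out, preserving the original order (rowid)
Local $aResult, $iRows, $iCols
_SQLite_GetTable2d($hDB, "SELECT txt FROM lines ORDER BY rowid;", $aResult, $iRows, $iCols)
Local $hOut = FileOpen("test_unique.txt", 2)
For $i = 1 To $iRows
    FileWriteLine($hOut, $aResult[$i][0])
Next
FileClose($hOut)

_SQLite_Close($hDB)
_SQLite_Shutdown()

The single transaction around the insert loop is what keeps this usable; whether it beats a plain in-memory hash on a one-off job is something you would have to time on your own file.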



Whilst I completely agree with jchd, if you need a quick, one-off solution to this, you will find (quick Google) that most of the more robust text editors, e.g. Notepad++ (free) and UltraEdit (commercial license), have ready-made solutions for this problem.



; Use a Scripting.Dictionary object as a hash set: each line is written to the
; output only the first time it is seen.
Local $oBuffer = ObjCreate('Scripting.Dictionary')
Local $h_File_Source = FileOpen("source.txt")
Local $h_File_Output = FileOpen("output.txt", 2)
Local $sLine
While 1
    $sLine = FileReadLine($h_File_Source)
    If @error Then ExitLoop
    ; if already in the check buffer, skip the line
    If $oBuffer.Exists($sLine) Then ContinueLoop
    ; write line to output file
    FileWriteLine($h_File_Output, $sLine)
    ; add to duplicate check buffer
    $oBuffer.Item($sLine) = 1
WEnd
FileClose($h_File_Source)
FileClose($h_File_Output)

 


Here is a PowerShell method of removing duplicate lines from an unsorted file, from within AutoIt.

It may be faster on big files.  On small files, KaFu's example is faster.

; Remove duplicate rows from a text file using PowerShell... unsorted file, where order is important.
; Command from:  http://www.secretgeek.net/ps_duplicates

Local $hTimer = TimerInit()
Local $sFileIn = @ScriptDir & '\temp-6.txt'
Local $sFileOut = @ScriptDir & '\newlist.txt'
; Build the PowerShell command: a hash table tracks lines already seen, so only first occurrences are emitted.
; File paths are wrapped in PowerShell single quotes in case @ScriptDir contains spaces.
Local $sCmd = '$hash = @{}; ' & _
        'gc ''' & $sFileIn & ''' | % {if ($hash.$_ -eq $null) {$_}; $hash.$_ = 1} > ''' & $sFileOut & ''''

RunWait('"C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe" ' & $sCmd, "", @SW_HIDE, 2 + 4) ; Run command in PowerShell.
ConsoleWrite("Time Taken: " & Round(TimerDiff($hTimer) / 1000, 4) & " Secs" & @CRLF)
ShellExecute($sFileOut) ; See unique file.

 


41 minutes ago, fenhanxue said:

I wonder if the code ( ObjCreate('Scripting.Dictionary') ) will work on every computer?

AFAIK it's part of WSH and should be available by default from XP SP3 onwards:

https://en.wikipedia.org/wiki/Windows_Script_Host
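
If in doubt, a minimal defensive sketch (my addition, not part of the example above) is to check that the COM object was actually created before using it:

; Guard against Scripting.Dictionary being unavailable (e.g. scripting components disabled)
Local $oBuffer = ObjCreate('Scripting.Dictionary')
If Not IsObj($oBuffer) Then
    MsgBox(16, "Error", "Scripting.Dictionary is not available on this system.")
    Exit 1
EndIf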

