
2d Array Input to Unique 2d Array Output .au3?



Hi all,

I know it exists - just need help finding the right one. There are tons of topics on unique 2D arrays, but what I specifically need is a 2D array in, duplicates eliminated by a unique "column", and a 2D array out... (sorry, can't explain it better, newb). I need all my other columns in the unique-column output - so it can't be 1D. VERY MUCH APPRECIATED - I love AutoIt and I just can't say enough good things about it - sincerely, it's money. Oh yeah, and this website rocks - so do the people in the forums - you guys are the best. Thanks so much!

And I've been looking - but just haven't found what I'm looking for yet. I found one that does 2D only but not 1D, and one (built in) that takes 2D in but not 2D out... I'm just confused with all the choices - anyone have a quick link to the best "multidimensional array in to unique multidimensional array out" function, or something like that (don't have the best wording, sorry)?

Thanks!

"2d Array Input to Unique 2d Array Output .au3?

or

Multidimensional Array Input to Unique Multidimensional Array Output .au3?"

:sorcerer:


"Maybe I'm on a road that ain't been paved yet. And maybe I see a sign that ain't been made yet"
Song Title: I guess you could say
Artist: Middle Class Rut


Explain what you want in the output array a little more clearly and there might be a way of helping out. Do you want to take a 2D array in and find all the unique entries in both dimensions, or in just one dimension? Example:

Col1 | Col2
  1  |  2
  2  |  4
  3  |  4
  3  |  1

What should the ideal output be from that?




Hi BrewManNH and UEZ,

My array is quite extensive... it's 2D of course, with about 12 columns.

I want to take the SiteName as a unique value and still have every column display after I do that (without dupes, of course).

There are about 350 rows with columns, and all of it is "personal data", so I'm only sharing a snippet of that data, which I've edited here...

[0]|0|TZ |Start |Name |OUT/OFFICE |Function |Comments |BlackBerry |DeskPhone |Site |SiteName |Facility |Address

[1]|0|ET|0800|Bob|0|FDL|0|555555|5555555|Lead|Services Bldg|Team|Detroit

[2]|0|ET|0|Marc|0|FDL|0|555555557|0|Tech Building |Team|North

Thanks!

I'll look at your example right now UEZ :)
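For the "unique by one column, keep all columns" case described in this post, here is a minimal sketch. The function name and column index are illustrative only, not from any poster's script - adjust $iCol to wherever SiteName actually sits in the real array.

```autoit
#include <Array.au3>

; Keep only the first row seen for each distinct value in column $iCol.
Func _UniqueByColumn(ByRef $aData, $iCol)
    Local $aOut[UBound($aData)][UBound($aData, 2)]
    Local $sSeen = "|", $iFound = 0
    For $iRow = 0 To UBound($aData) - 1
        ; skip rows whose key value has already been seen
        ; (note: this simple lookup breaks if a key itself contains "|")
        If StringInStr($sSeen, "|" & $aData[$iRow][$iCol] & "|") Then ContinueLoop
        $sSeen &= $aData[$iRow][$iCol] & "|"
        For $i = 0 To UBound($aData, 2) - 1
            $aOut[$iFound][$i] = $aData[$iRow][$i]
        Next
        $iFound += 1
    Next
    ReDim $aOut[$iFound][UBound($aData, 2)]
    Return $aOut
EndFunc   ;==>_UniqueByColumn
```

For example, `$aResult = _UniqueByColumn($aSites, 11)` would keep one row per SiteName if SiteName were column 11.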




Firstly we need to know exactly what you mean by a unique 2D array. It can mean several different things. For example: consider that the first column is an ID reference. There may be some repeated references producing too many rows. That case is simple and straightforward. It is not the same as forcing a unique arrangement of cell data within a table. Removing all duplicated data from all cells in all rows is a bit more involved.


This will delete duplicate rows from a 2D array (nothing more, nothing less).

#include <Array.au3>

Dim $a_Items[8][5] = _
[["same", "same", "same", "same", "same"], _
["name", "1", "11", "1", "6"], _
["same", "same","same","same", "different"], _
["rpt1", "2", "10", "1", "5"], _
["other", "2", "10", "1", "6"], _
["same","same","same","same","same"], _
["rpt1", "2", "10", "1", "5"], _
["same","same","same","same","same"]]

$a_Items = _2d_Arr_Uniq($a_Items)
_ArrayDisplay($a_Items)

Func _2d_Arr_Uniq($aArray)
    If UBound($aArray, 0) <> 2 Then Return SetError(1, 0, 0)

    Local $iBoundY = UBound($aArray), $iBoundX = UBound($aArray, 2)

    ; Flatten each row to a single "|"-delimited string.
    ; Note: this breaks if the cell data itself contains "|".
    Local $aTemp[$iBoundY]
    For $y = 0 To $iBoundY - 1
        For $x = 0 To $iBoundX - 1
            $aTemp[$y] &= $aArray[$y][$x] & "|"
        Next
        $aTemp[$y] = StringTrimRight($aTemp[$y], 1) ; drop the trailing "|"
    Next

    ; _ArrayUnique returns a 1D array with the count in element [0]
    $aTemp = _ArrayUnique($aTemp)

    ; split the unique row strings back into a 2D array
    ReDim $aArray[$aTemp[0]][$iBoundX]
    For $y = 1 To $aTemp[0]
        Local $aRow = StringSplit($aTemp[$y], "|", 2)
        For $x = 0 To $iBoundX - 1
            $aArray[$y - 1][$x] = $aRow[$x]
        Next
    Next

    Return $aArray
EndFunc   ;==>_2d_Arr_Uniq

  • 3 years later...
On 13.11.2012 at 2:02 AM, czardas said:

This will delete duplicate rows from a 2D array (nothing more, nothing less).

<3 this, really :D


  • 1 month later...

Old thread, but I found it and it was useful in getting me pointed in the right direction, and I have a solution to add.

This solution takes a two-dimensional array of arbitrary size, sorts it by the first sub-element (e.g. $array[$i][0]), and then goes through each row, checking each column against every column of the already-found unique rows. As long as you don't care about the order of the items in the array, this is much faster than the solutions above.

#include <Array.au3>

; $a      - the array to remove duplicate rows from (ByRef; it will be sorted)
; $ignore - columns to ignore: "2" will ignore whether $a[$i][2] is unique or not,
;           "1|2" will ignore columns 1 AND 2, etc.
; $debug  - will enable debug messages if set to 1
Func _ArrayUnique2D(ByRef $a, $ignore = "", $debug = 0)
    If UBound($a, 0) <> 2 Then Return SetError(1, 0, 0)

    Local $colMax = UBound($a, 2) - 1, $iUniqueFound = 0
    Local $bDuplicate, $bMatch
    $ignore = "|" & $ignore & "|" ; normalise so both "2" and "1|2" work

    ; Sorting by column 0 means that, while scanning backwards through the
    ; unique rows found so far, we can stop as soon as column 0 differs.
    _ArraySort($a)

    ; start with the largest return array we might need; shrink it at the end
    Local $aRet[UBound($a)][$colMax + 1]

    For $row = 0 To UBound($a) - 1
        $bDuplicate = False

        ; compare this row against the known unique rows, newest first
        For $rowUnique = $iUniqueFound - 1 To 0 Step -1
            $bMatch = True
            For $col = 0 To $colMax
                ; if it's part of the ignore list then skip it and check the next column
                If StringInStr($ignore, "|" & $col & "|") Then ContinueLoop
                If $a[$row][$col] <> $aRet[$rowUnique][$col] Then
                    $bMatch = False
                    ExitLoop
                EndIf
            Next
            ; all compared columns matched, so this row is a duplicate
            If $bMatch Then
                If $debug Then ConsoleWrite("-     Not unique: row " & $row & " matches unique row " & $rowUnique & @CRLF)
                $bDuplicate = True
                ExitLoop
            EndIf
            ; the input is sorted, so once column 0 differs no earlier unique row can match
            If Not StringInStr($ignore, "|0|") And $a[$row][0] <> $aRet[$rowUnique][0] Then ExitLoop
        Next

        If Not $bDuplicate Then
            If $debug Then ConsoleWrite("+     Unique! Row: " & $row & @CRLF)
            For $i = 0 To $colMax
                $aRet[$iUniqueFound][$i] = $a[$row][$i]
            Next
            $iUniqueFound += 1
        EndIf
    Next

    ReDim $aRet[$iUniqueFound][$colMax + 1]
    Return SetExtended($iUniqueFound, $aRet)
EndFunc   ;==>_ArrayUnique2D

It also supports ignoring columns when checking for differences. If, for instance, you want an array to have one person from each address ($array[$i][0=name, 1=address, 2=city]), you set the second parameter to "0" and it will ignore the name when considering uniqueness. If two people have different names but the same address and city, they will be considered duplicates.
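A quick usage sketch of the address example from the paragraph above (the names and addresses are made up), ignoring column 0 as described:

```autoit
#include <Array.au3>

Local $aPeople[4][3] = [ _
        ["Alice", "12 Oak St", "Springfield"], _
        ["Bob", "12 Oak St", "Springfield"], _
        ["Carol", "9 Elm Rd", "Shelbyville"], _
        ["Dave", "9 Elm Rd", "Shelbyville"]]

; ignore column 0 (name) when deciding uniqueness, so only one
; person per address/city combination should remain
Local $aUnique = _ArrayUnique2D($aPeople, "0")
_ArrayDisplay($aUnique)
```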

If anyone can think of ways to optimize this further, let me know


