sulfurious

trying to understand this snippet

22 posts in this topic

#1 ·  Posted (edited)

Hi. I think I have a handle on how to fill an array with only one dimension.

I am now playing with a simple delimited text file, say like this:

some text for col 1;some text for col 2

I found this piece of code by gafrost. I am wondering a few things. First, even though the array is Dimmed, it is not assigned any parameters like [5]. Is this the same in multi-dimensional arrays? Can the [][] be filled in later, or does it have to be declared up front? And how can you use UBound if there is no number of elements in the array yet?

Second, would this work from the text example above?

Split out according to delimiter ; so that

$x1 = left side

$x2 = right side

Now I am a bit confused as to how to fill the array. Something like this?

$array = $x1, $x2 (it must be like $array[$][$] = $, $ I would guess)

loop through

And once I have it in this array, am I using it correctly? I am wanting to have one value be a description, such as an app name, the second value be the path to whatever it is I am describing in the first value. So $array[name][path] like that. Is it possible to pull out the array something like this:

$app = $array[1][1]

$path = $array[1][2]

I guess the idea is to keep that one line together, but split into two variables inside the array. That way I can create a GUI that fills in buttons dynamically according to the description, and have each button's value be the path.

Just a way I could see a use for it.
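Something like this sketch is what I'm picturing (the file name apps.txt and the variable names are made up, just to test the idea):

```autoit
; Sketch: read "name;path" lines from a delimited file into a 2D array
#include <File.au3>    ; for _FileReadToArray

Dim $lines
If Not _FileReadToArray("apps.txt", $lines) Then Exit

Dim $apps[$lines[0]][2]            ; rows = number of lines, 2 columns
For $i = 1 To $lines[0]
    $pair = StringSplit($lines[$i], ";")
    If $pair[0] >= 2 Then
        $apps[$i - 1][0] = $pair[1]    ; description / app name
        $apps[$i - 1][1] = $pair[2]    ; path
    EndIf
Next

; Pulling one row back out:
$app  = $apps[0][0]
$path = $apps[0][1]
```

Then $apps[0][0] would be the first description and $apps[0][1] its path.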

Sorry this is a bit confusing. AutoIt is an amazingly easy scripting language, but I am really having a time understanding the array business.

Also, can anyone explain why gafrost is using UBound so much? Looking at this piece of code it must be pretty powerful, but for some reason even the help file entry on it is a bit confusing.

Thanks for any help,

sul

$file = FileOpen("test.txt", 0)

; Check if file opened for reading OK
If $file = -1 Then
    MsgBox(0, "Error", "Unable to open file.")
    Exit
EndIf
Dim $array
; Read in lines of text until the EOF is reached
While 1
    $line = FileReadLine($file)
    If @error = -1 Then ExitLoop
    If StringInStr($line, "<TYPE>:") Then
        $found = 0
        For $x = 0 To UBound($array) - 1
            If $line == $array[$x] Then
                $found = 1
                ExitLoop
            EndIf
        Next
        If Not $found Then
            If IsArray($array) Then
                ReDim $array[UBound($array) + 1]
            Else
                Dim $array[1]
            EndIf
            $array[UBound($array) - 1] = $line
        EndIf
    EndIf
WEnd
For $x = 0 To UBound($array) - 1
    MsgBox(0, "", $array[$x])
Next
FileClose($file)





Example Data for ini

[SCHEMES]

scheme=Blues

scheme=Browns

scheme=Grays

scheme=Greens

scheme=Oranges

scheme=Pinks/Violets

scheme=Whites

scheme=Yellows

Example Read of that section of ini

$ARRAY = IniReadSection(@ScriptDir & "\colors.ini", "SCHEMES")

Using the above read will get the key/value pair

$ARRAY[0][0] will be the size of the array

$ARRAY[1][0] = "scheme"

$ARRAY[1][1] = "Blues"

etc...

A StringSplit of "some text for col 1;some text for col 2" will result in

$ARRAY[0] will be size of the array

$ARRAY[1] = "some text for col 1"

$ARRAY[2] = "some text for col 2"
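For example, a quick sketch of walking that section (using the colors.ini path from above; the message boxes are just to show the contents):

```autoit
; Read the [SCHEMES] section and walk the key/value pairs
$aSchemes = IniReadSection(@ScriptDir & "\colors.ini", "SCHEMES")
If Not @error Then
    For $i = 1 To $aSchemes[0][0]    ; [0][0] holds the number of rows
        ; $aSchemes[$i][0] is the key ("scheme"), $aSchemes[$i][1] the value
        MsgBox(0, "Scheme " & $i, $aSchemes[$i][0] & " = " & $aSchemes[$i][1])
    Next
EndIf
```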


1: I declared the variable as a non-array, so it's easy to determine whether the array has been initialized yet.

2: By using UBound, I didn't have to keep track of the size of the array myself (it returns the number of elements in the array).

Hope this helps

Gary



Well, dolt that I am, I never thought about using an ini for that! Here I have made them, as well as inf files, for unattended Windows installs, and I totally spaced it. That would work really well. Thanks for that reminder, as well as the info about how string splitting would give array results that would not be as easy to work with.

I see what you mean now about UBound. One question though about that code snippet: you use the If Not $found statement. What exactly is happening here? Is that saying $found does not exist yet, as in there is no array? $found is assigned a value and created, but before that you can use a Not on it? If so, I did not know you could do that.

Now let me pose a new problem. Without posting all of the code, which is long and not very tidy right now, here is what happened.

I made a script that reads a text file called CompNames.txt with UNC server names, such as:

server1

server2

server3

There is a second text file called Glist.txt that has a list of directories to copy upon the appropriate event. It looks like this

dir1

dir2

dir3

When the button is clicked for dir1, this happens

#include <File.au3>    ; needed for _FileReadToArray (assuming it isn't included elsewhere)

Func _get($AppName)
    GUISetState(@SW_HIDE, $winDOWNLOAD)
    Dim $Clist
    If Not _FileReadToArray("\\" & $FServer & "\MTL$\CompNames.txt", $Clist) Then
        _errlog("Error creating Clist")
    EndIf
    ProgressOn("APP DOWNLOAD", "Searching for idle servers", 0, 300, 250, 16)
    Dim $ActiveServer = ""
    $i = 0
    $ii = 0
    $ExitLoop = "no"
    For $ii = 1 To 10
        For $i = 1 To $Clist[0] - 1
            If StringTrimRight($Clist[$i], 1) <> @ComputerName Then
                $Iserver = FileExists("\\" & StringTrimRight($Clist[$i], 1) & "\" & $Fpub & "\busy.txt")
                If $Iserver = 1 Then
                    $ActiveServer = StringTrimRight($Clist[$i], 1)
                    _serverstatus($AppName, $ActiveServer)
                    If $ExitLoop = "yes" Then
                        ExitLoop 2    ; break out of both loops
                    EndIf
                EndIf
            EndIf
            Sleep(500)
        Next
        Sleep(500)
        ProgressSet($ii * 10, "Please stand by....", "Finding a source for " & $AppName)
    Next
    ProgressOff()
    If $ii >= 10 Then SetError(-69)
EndFunc

What is supposed to happen is: when the app is first started, the client computer opens the CompNames.txt file and adds its UNC name to it. Then it downloads the apps and shares the app directories. Upon completion, the client should be able to serve the directory for other clients. The whole script is designed to distribute a directory while ensuring that only one client can download from a source at a time.

The script worked fine with 3 or 4 clients using it. Once a few more started running it, the CompNames.txt file (which is only on server1) became "locked". Even logged in on that server I could not write to it.

I should state that every time I opened a file for read purposes, it was as read-only, and then I closed it. When I opened a file for append or erase, I also closed it. The only things I did not close were the _FileReadToArray calls.

I am wondering if there is a limitation of Windows (XP) where once so many clients open a file it is locked. What happened was that everyone got stuck in that loop above, not finding an available server. Yet the CompNames.txt file had the correct server names, and each server did not have a session active and was ready for downloading. If I went onto the server that housed the CompNames.txt file, I saw about 8 open sessions with CompNames.txt. Once I closed all of those sessions, three clients would find an idle server and start to download. But once their downloads completed, everyone would be stuck in that loop again, finding no idle servers. The script worked fine from the standpoint of doing what it was supposed to, with the exception of this problem.

Makes me think of 2 things.

1. all clients were running the script from the server, the issue being there

2. there is a limitation in that file CompNames.txt being open by so many clients, and it cannot even be read once that limitation occurs

I could post the whole code if needed, but like I said it is quite long. By my standards anyway, 1000+ lines. But, I am just a n00b, so maybe it should only be like 25 lines:)

Any thoughts on this?

sul


If I understand correctly, you're having multiple clients modify a single text file that contains a list of machines accessing the files.

I see two very easy solutions:

1. Just have each client create its own dummy file and have other clients use FileFindFirstFile and FileFindNextFile to see which clients are connected. That way, there is no chance of the clients stepping on each other's toes. Each client cleans up its own dummy file when it exits, or a process on the server checks each client occasionally and deletes the dummy file if the client isn't sharing.

2. Perhaps you could have each client create a dummy file by its machine name, and use a maintenance script on the server to watch for those and add them to a readonly list. For instance:

SERVER1 would create a file named SERVER1_busy.txt

SERVER2 would create a file named SERVER2_busy.txt

The server would continuously monitor for files named "*_busy.txt" and take the prefix (machine name) and add it to the main CompNames.txt, then delete the dummy file. That way only one process is modifying the CompNames.txt file, but many should be able to read it concurrently. Also, be careful of opening the file for writing at exactly the same time the clients are opening it for reading. It could mean that either the server is blocked from writing and throws an error, or one or more clients can't open the file for reading. You must account for these exceptions in your error checking.
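A rough sketch of that server-side sweep (the share path here is made up, and a real script would loop this on a timer):

```autoit
; Scan for *_busy.txt dummy files, harvest the machine-name prefix,
; append it to the master list, then delete the dummy file.
$sPath  = "\\SERVER1\MTL$\"    ; made-up location of the dummy files
$search = FileFindFirstFile($sPath & "*_busy.txt")
If $search <> -1 Then
    While 1
        $file = FileFindNextFile($search)
        If @error Then ExitLoop
        ; strip the "_busy.txt" suffix to recover the machine name
        $name = StringTrimRight($file, StringLen("_busy.txt"))
        FileWriteLine($sPath & "CompNames.txt", $name)
        FileDelete($sPath & $file)
    WEnd
    FileClose($search)    ; close the search handle
EndIf
```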

Here's an interesting question: Is it possible for the server to keep the file open for writing, never closing it, while the clients can still read it? I think the FileClose function actually "flushes" the writes to the file, so the answer would be no, but you may want to investigate this.

Also, what about using an INI file such as CompNames.INI? The section name would be the client machine name, as follows:

[SERVER1]

Active=1

[SERVER2]

Active=1

Then you could just use something like $bActive = IniRead($path & "CompNames.ini", @ComputerName, "Active", "0"). The server would do an IniWrite($path & "CompNames.ini", $ClientName, "Active", "1") for each client dummy file it finds.


Mmmm. Interesting points. You understand exactly what my problem is, I believe. Since I have recently seen the benefits of using an ini, I was toying with an idea along those lines.

And I was thinking of what you call dummy files, sort of.

I was also thinking of writing to that file only after all the dirs that I want to disperse are properly set up on each client. That way, unless by some chance all downloads are completed at exactly the same time, only one client is writing.

What I don't understand is why this lockup is happening during read-only opens. I could see it if 6 clients were trying to write at the same time. It seemed to hit its issue when they were looping through that code snippet, I am assuming when they dump the text to the array.

And if I understand what you are suggesting: a script runs on the server to monitor, at intervals, for new dummy files, and it adds the needed text to the ini?

That too might work. Had not thought about the server doing any of it.

I will play a bit and see.

Thanks a mint for the suggestions.

sul


More ideas... :dance: <-- I love this emoticon, it just needs the transforming sound from the Transformers. lol

Take your file reading code and create a test script that simply reads the file and loops infinitely. Have it report how many iterations it's on and run this on several workstations to see at what point it locks up. If it doesn't lock up, then you've eliminated file reading as the culprit. Conversely, if it does, then you should have an idea of your threshold and can adjust for this limitation.
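Something like this, using _FileReadToArray as in your _get() function (the UNC path is made up; run it on several workstations at once):

```autoit
; Stress test: repeatedly read the shared file and show the iteration count,
; so you can see at which point concurrent reads start failing.
#include <File.au3>

$i = 0
While 1
    $i = $i + 1
    Dim $lines
    If Not _FileReadToArray("\\SERVER1\MTL$\CompNames.txt", $lines) Then
        MsgBox(0, "Locked?", "Read failed on iteration " & $i)
        ExitLoop
    EndIf
    ToolTip("Iteration " & $i)    ; report progress without blocking
    Sleep(100)
WEnd
```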

Another thought: Instead of having clients "pull" the files, have the server "push" them to the clients, then create the dummy file. When the client sees its dummy file, it knows it has the necessary files and does its work (sharing them or whatever).

Hmmm... You could have the client script run and check to see if it has the files. If it doesn't, it creates a dummy file named @ComputerName & ".feedme" and waits for a file named @ComputerName & ".sharing". A process on the server scans for the "*.feedme" files, pushes the files to that client, deletes the ".feedme" file, and creates the ".sharing" file. When the client sees its ".sharing" file exists, it continues its work of sharing the files or whatever. Any clients looking for other clients (which are now servers if they're sharing the files) can search for ".sharing" files on the server (or "controller" at this point). Now no process is stepping on any other process and there's synchronicity between the clients and the server. This is still subject to the connection limit in Windows XP though.
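A rough sketch of the client side of that handshake (the controller share path and the local fileset path are made up):

```autoit
; Client side: ask to be fed, then wait until the controller says we're sharing
$base = "\\SERVER1\MTL$\"                        ; made-up controller share
If Not FileExists(@ScriptDir & "\game1") Then    ; made-up local fileset path
    ; create the ".feedme" dummy file to request the files
    FileWrite($base & @ComputerName & ".feedme", @ComputerName)
    ; poll until the controller has pushed the files and created ".sharing"
    While Not FileExists($base & @ComputerName & ".sharing")
        Sleep(5000)
    WEnd
EndIf
; At this point the files are present; carry on with sharing them
```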

:D ...or use a database instead of files... install MySQL or something to handle the tracking aspect. If you decide to go this route, I can send you some example code for DB access via AutoIt using objects.

:D ...or... I think you can make an AutoIt script into a client/server application using sockets, but I'm not positive. Search the forums for this because you could make this a client/server program, in which case you know for sure that only the server script is touching the database files.

By the way, it seems like what you're trying to implement is just a P2P really. :dance: Isn't there a free P2P solution out there that fits your needs? :( There are a thousand ways to do what you're trying to do. :D Try several to find the best solution for your situation.

Good luck

:whistle:


Check out the AutoIt TCP functions to create the client/server version:

TCPAccept
TCPCloseSocket
TCPConnect
TCPListen
TCPRecv
TCPSend
TCPShutdown
TCPStartup
TCPTimeout (Option)

TCPCloseSocket has a long code sample of many of these functions.

...or the UDP functions:

UDPBind
UDPOpen
UDPRecv
UDPSend

These should work for creating your own light-weight database server.
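For instance, a bare-bones sketch of a controller loop (the port number, request string, and reply are all made up, and a real script would need framing and error handling):

```autoit
; Minimal TCP "controller": answer a GETSERVER request with a server name
TCPStartup()
$listen = TCPListen(@IPAddress1, 33891, 10)    ; made-up port
While 1
    $sock = TCPAccept($listen)
    If $sock <> -1 Then
        $req = TCPRecv($sock, 512)
        ; in a real script, pick the least-used server from the list here
        If $req = "GETSERVER" Then TCPSend($sock, "SERVER2")
        TCPCloseSocket($sock)
    EndIf
    Sleep(10)
WEnd
TCPShutdown()
```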


Wow. Some good ideas. I am going to use that loop idea and try to determine what the file-open limit is, as I think that is the issue.

I guess I should tell you more about what the script is supposed to do, then you might have some better ideas, since you seem to produce them in quantity:)

A number of years ago my brothers and a few friends and I started having lan parties at thanksgiving and xmas. With better games coming out, esp FPS ones, more people we knew wanted to play. We maxed out at 16 in my basement, plus the 3rd kid was born and did not appreciate 16 fraggers yelling across the room at midnight. So we started renting out places. We have an avg of 20 peeps now, sometimes hitting 30 or so. Anyway, as I started this whole thing, I seem to be the one who organizes it, as well as play IT at them. One thing we found out long ago was that everyone usually has a different version of each game. Or they have saved games, or are just paranoid and don't want to mess up their installed version. So we started making an image that is patched to a certain point or whatever. We use no-cd exe's as well. Basically whatever I put on the server, everyone copies to their machine. That way we know everyone can connect with no problems.

In the past, if a person was using a nocd patch different than the one others were using, everyone would have to back up their original, put the patched one on, and hope we had the right one to connect to the server with. Not any more. Just get the master image. But with 20+ clients downloading, you end up with some pretty slow transmission speeds. So, last year I started writing batch files, with menus, that were supposed to keep only one client downloading from a server at a time. They worked, but it took a bit to code those up, and with only the text interface, confused a lot of people.

So, a guy named MHz who roams around here from time to time turned me on to AutoIt. Now I can script it, as well as put a small GUI together for it.

So here is what happens.

I have 3 servers which have the exact same share name for game 1. Only the dual proc raid server has the compiled script. That same server also has a text file with a list of the directories to get. And it houses the text file with the computer names, which is initially the 3 servers. As client #1 runs the script, it was supposed to write its UNC name to the CompNames.txt file. As the script runs, it pops up a gui with let's say 4 games to download. So client #1 clicks on button 1.

Now what happens is it reads the compnames.txt to an array. It ignores its own UNC name, but checks for the share name for that button. If it finds it, it then reads a text file on whatever computer it finds has the share and determines if it is already busy or idle. If it is idle, it writes to that text file its UNC name, the time, and its IP. Then it commences to download. When it is done, it erases that busy line and puts the text back that says the server is idle again. Client #2 was also trying to get one of the directories. While client #1 was downloading, client #2 found the share, but read the text file that said the server was busy, so it looped through each server 4 times and then returned to the gui, waiting for the button to be pushed and start all over.

This worked well with my home network, 4 computers. When we had our last lan party, the idea was: download the files, and when you have them all, run another script that completed setting the client up as a server. Since its UNC name was already in the compnames text file, computers were already looking to it, they just never found it as idle.

Sort of p2p, but not really. Anyway, 3 servers feed 3 peeps. When they are done, now 6 servers feed 6 peeps, and so on. Trying to speed things up, be organized, and most of all, stop playing IT the whole day figuring out why client #x can't connect to the server.

So what happened? With 4 or 5 clients running it, it worked awesome. When more started, everyone got stuck in the server search loop. I assume it is because the compnames.txt file was locked and they could not even read it, meaning each client was finding no server. And although each server was idle and ready, until I closed all the sessions on the main fileserver, no one started downloading. Once I closed all open sessions to that file, 3 would start downloading, but then everyone else would be back to the same no-server-found loop.

So, seeing what I am trying to do, what might your suggestions be?

sul



Assuming that the issue is occurring because the file is becoming locked, what I'd suggest is: when you're reading the file, instead of reading it directly, make a copy of the file locally and open the copy. For the writing, you can write to the original file once the download is completed. Or you could make the file like a web page and interact with it as a page instead of just as a file; that should get around any limit on the number of times it can be opened.


Hmm. Copying the file to the client would work. But here is an issue: part of the reason to have the file located on one server is that when a client name is added to it and it can become a server, the next time the server search loop begins it will build a new array that includes the newly written UNC name.

I suppose it could copy to local every time it initiates the server search loop. Unfortunately, with different peeps using different OS's or not using simple file sharing, or just because of particular policies, I don't think it would work to 'push' the file out from the server. It needs to be initiated from the client side. Granted I can make the script change things on the client, but some people wig out when I do simple things like create a directory and then share it. Can't imagine what would happen if I messed with their security policies.

I have yet to mess with anything webbish, so I don't know exactly what you are referring to as far as making it a web page. Understand the concept somewhat, just never messed with it.

I don't see an issue with writing to the original once the client has completed being set up as a server. If I make a separate script to do the writing, I could easily make a dummy file that allows only one open session for writing at a time. I am wondering about 15 clients copying that file every couple minutes (every time they start the server search loop).

Very good ideas though. lol, almost too many now to try them all.

Keep em coming, I am sure the solution is there. I wonder if switching the server OS to 2000 server would cure the issue. I do have a copy, but with xp and simple file sharing, that works oh so simple for the lan parties. No permissions to set etc.

Funny how something so simple as playing games with some friends could turn into such a learning experience. Not that I mind, I have been building databases for a few years now, and scripting in autoit has vastly improved my abilities to write front ends in vb.

sul



well, the way the copy would work is like this:

comp A is the initial server; it has a list of servers, which at the start contains only itself.

comp B copies the list, downloads from the server, then writes itself to the original file, so that comp C then sees comp A and B on the list when it copies the file... You may want to look into using something like BitTorrent, though; you could create your own little tracker and torrent and spread the files very quickly and efficiently to each computer...



I did not realize you could use a torrent app like that. I think someone else mentioned that too. Well, maybe it is time to learn about torrent and p2p stuff.

Thanks for the input.

sul


Umm, excuse my ignorance, but would a torrent really be any faster? I mean, comp A is serving the file, on my network, at around 97% of the 10/100. Maybe down around 80% utilization if I have not tweaked the client's NIC. If 3 clients can download at nearly full utilization of the NIC from the servers, how would a torrent be any faster?

Would it not be slower if while downloading, it was also uploading pieces to other computers?

I could see the distribution of one dir/file being faster in regards to more sources being available, but would it not still be limited to NIC throughput speeds? Thus downloading from one server vs. multiple torrents would still be a bandwidth issue?

Again, excuse my ignorance here, just thought about it for a minute and was wondering.

sul



in a nutshell, here is how a torrent works...

you start with 1 file (the file/dir to be distributed) on one computer, the 'seed'. All of the other computers then connect to that computer, and it begins sending pieces of the file to each of the clients. The clients are all connected to each other too in this cluster, and as each of them receives a portion of the file, it sends that portion to the others. Since each computer is sending and receiving as quickly as your network will let it, you're actually putting the file on all of the computers at roughly the same time. Now that is a very simplified explanation of how it works, but I think it's enough to illustrate the concept.




OK. So the implication in this situation is this.

3 computers have the files. Let's say all 20 computers request them through a torrent process. Each piece of each file is broadcast around (because it is one big cluster) until each of the 20 clients has a complete image. No searching for servers. No worrying about multiple downloads of large files from one server slowing throughput. Kind of a revolving download based on the end result, with every client pitching in if it has the pieces.

Sounds like, bandwidth wise, it would not be any faster. But there would be no server queues at all, which would cut down setup times. I like it. I will dig up some info and play with it.

Thanks for the explanation.

sul


#16 ·  Posted (edited)

Here's the terminology I'm going to use:

"Controller" is a single machine that maintains a list.

"Client" is a machine that is not sharing yet.

"Server" is a machine that has the files and is sharing them.

Now, if I were to design this from scratch, I think I would probably create a client/server solution where the controller would keep a multi-dimensional array of server names, IP addresses (to avoid DNS issues), and the number of times each server has been used. All connections would use TCP. The computer names are just a reference for talking to the controller.

A machine running the script would connect to the controller and ask for a timestamp and local path of the current fileset (where servers are sharing the game from). It would then check that the local path exists. If it doesn't, it knows it's not a server and will then ask the controller for the least used server in its list. If the local path does exist, then it checks for a dummy file in that path to see if its timestamp is older or newer than the one provided by the controller. If the local files are older than the controller's timestamp, it needs to update and will ask the controller for the least used server. If the controller does not have an available server it tells the client to try again. The client waits 1 minute and asks again. If the local files do match the timestamp, then it goes directly into server mode and tells the controller that it's available to handle clients.

If the machine is in client mode now (needs to download the files and has asked the controller for a server to get them from), then it contacts the server and asks for the files. The server acknowledges the request, then tells the controller that it's in use so the controller won't tell any other clients to connect to it. The controller also increments the number of connections handled by that server in order to give all servers equal priority. The client copies the files using whatever method you prefer and creates the fileshare. When it's done, it tells the controller that it's now a server and is able to handle clients, then tells the server that it's finished, which in turn tells the controller that it's available to handle clients.
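A minimal sketch of the "least used server" bookkeeping described above (Python just for illustration; the server names, IPs, and field names are my own invention, and in this sketch the controller marks a server busy as soon as it hands it out rather than waiting for the server's ACTIVE message):

```python
# Hypothetical controller-side list of servers. All names, IPs,
# and counts here are made-up illustration values.
servers = [
    {"name": "GAMESRV1", "ip": "192.168.0.10", "uses": 3, "busy": False},
    {"name": "GAMESRV2", "ip": "192.168.0.11", "uses": 1, "busy": True},
    {"name": "GAMESRV3", "ip": "192.168.0.12", "uses": 2, "busy": False},
]

def get_server(servers):
    """Return the available server with the fewest uses, or None."""
    free = [s for s in servers if not s["busy"]]
    if not free:
        return None              # client should wait a minute and retry
    best = min(free, key=lambda s: s["uses"])
    best["uses"] += 1            # keep priorities balanced across servers
    best["busy"] = True          # don't hand it to another client
    return best

picked = get_server(servers)
print(picked["name"])  # → GAMESRV3 (fewest uses among the free servers)
```

Bumping the use count at hand-out time is what gives every server roughly equal priority over a long session.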

I'd also use the OnAutoItExit function to tell the controller that it's no longer serving.

The controller would have an OnAutoItStart that would load an INI file to an array and an OnAutoItExit that would dump the array back into the INI file.

The communication between the controller and other machines would be something like:

"INIT" or "STARTUP" - client asking controller for a timestamp and path

"GET SERVER" - client asking controller for a server IP

"GET FILES" - client asking a server for files

"ACTIVE" - server telling controller it's in use

"AVAILABLE" - server telling controller it can handle clients

"OFFLINE" or "SHUTDOWN" - server telling controller it's NOT available to handle clients

Let me know if any of this is unclear. By the way, all the Torrent posts slipped in while I was writing this and trying to work... :whistle:

Edited by rmorrow


You need to be clear on where your bottleneck(s) is/are. Is it the clients, is it the switch (or, God forbid, a hub *cringe*), or is it the server resources (disks, CPU, memory, etc.)?

The only way to overcome a bandwidth issue is to use more bandwidth. Make sure everyone is connecting at 100/Full. Use a switch instead of a hub. Consider the quality of the switch. Consider having the group chip in for a gigabit switch and everyone buying their own gig NIC. Is there a wireless network involved? If so, kill it. Go with a wired solution.

If it's the game boxes, then it's their issue. Let them deal with it.

If it's the server, consider adding more resources. I would definitely put Server OS on there, and would seriously consider using a torrent solution.


Wow. Lot of info there.

Start here first. Two 96-port Cisco Catalyst 5000 switches, double trunked. On each, the first bank of 24 ports runs forced 100/full, the second bank of 24 runs forced 100/half, and the remaining two 24-port banks auto-negotiate. A router in the system handles DHCP. The 'controller' server is a dual PIII 733, 1 GB RAM, XP Pro, RAID 0 for the OS, RAID 1 for files.

The 'gameserver' server is an XP2400, 1 GB RAM, XP Pro.

'My rig' server is an XP2500, RAID 0, 2 GB RAM, XP Pro.

Gigabit wise, that would be nice, but I don't think that will happen for awhile yet. Most of the peeps spend so much on having legal games (which we try our best to enforce) that they don't really feel like shelling out more for switches and such.

The bottleneck is certainly not the switches, nor is it the peeps' machines, most of whose NICs I have tweaked for 100/full. The server is able to pump out, if it likes the NIC on the other end, about 97% of 100 Mb/s, or at a low maybe 80%. I have spent many, many hours tweaking that setup. You know, don't want any lag :whistle:

I think the issue is, like I said, either a limitation in the OS only allowing so many handles on a file at once, or AutoIt opening the text file so many times, perhaps not closing it. I say that because there were no files being downloaded at all, just a stagnation in the server-search loop. That could be either the script being open by too many clients, or the file being locked from reads, causing an endless server search. Maybe I just need some more Sleep()s in there to handle that. I think it is an OS limitation myself.

Regardless, it appears I am now headed down the road to recoding, as so many good ideas have been brought up.

I will have to digest your post a little bit more. While I seem to be somewhat gifted when it comes to absorbing computer stuff, I am not sure I have been introduced to everything you commented on. Well, maybe I'm just not sure yet how I would go about coding up an app that runs on the server to watch all of that.

Certainly seems, while more complicated, like a 'professional' approach to managing clients and file transfers.

I will cogitate on that and get back to it tomorrow.

Thanks again to everyone bringing up ideas.

sul


#19 ·  Posted (edited)

Yeah, your network sounds optimal except for running XP on the server. Throw Server on there and let it be the only file server. Just create a batch file to robocopy the files.

If you want a server, use a server OS.

Edited by rmorrow


OH! I forgot to ask why you have ports set to 100/half... why would you ever force a switch port to half duplex unless you're using a VoIP solution or something else that requires it? That would kill gameplay for anyone plugged into those ports.

