lgodfrey

InetGet/Sleep "Bug"


Since InetGet does not have a timeout parameter, you have to use background downloading and your own timing loop to impose a maximum download time, after which you assume an error occurred and take action.
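In outline, the timing loop is a poll-until-done loop with a deadline. Here is a minimal Python sketch of that pattern (the function and callback names are mine; start_download and is_active are illustrative stand-ins for InetGet($URL, $File, 1, 1) and @InetGetActive):

```python
import time

def download_with_timeout(start_download, is_active, timeout_ms, poll_ms=10):
    # Start the background download, then poll until it finishes or the
    # deadline passes. start_download/is_active stand in for
    # InetGet($URL, $File, 1, 1) and @InetGetActive.
    start_download()
    deadline = time.monotonic() + timeout_ms / 1000.0
    while is_active():
        if time.monotonic() >= deadline:
            return False  # assume an error occurred; caller takes action
        time.sleep(poll_ms / 1000.0)
    return True
```

With timeout_ms set somewhat above the worst routine download time, a False return is the "assume an error occurred" signal.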

Imagine this scenario:

The absolute minimum time that an InetGet($URL, $File, 1, 0) call takes for a particular URL is 100 msec. (The minimum and average download times can be surprisingly stable over months of use, but can be quite different for different URLs.)

It then seems clear that you should be able to Sleep(80) first and only then start checking @InetGetActive to see whether the download is complete. This works, of course, but using Sleep(80) actually increases your download time, and by a lot: as much as 70% for the average download time as well as the minimum download time (the increase depends on the URL).

I would consider this a bug if the Sleep() function is interfering with an InetGet download, but my guess is that it is a Windows quirk. I have been desperately trying to find a workaround, to no avail, so I thought I'd report the phenomenon here in the hope that it can be fixed, if possible.

For some URLs, a smaller sleep time (up to about half the minimum download time) does not appear to increase the download times, but for others (with faster download times?) even a sleep of 20% of the absolute minimum synchronous download time significantly increased the asynchronous download time.

In my scripts, I cannot afford any increase in download time, so I have to give up CPU time and start checking the download without any initial sleep. That hurts, because I have additional scripts running in parallel that could use the CPU time if it were available.

Here is the code I prepared to demonstrate the problem. It measures the download times for synchronous and asynchronous downloads, the latter both with and without an initial sleep period:

;Download timing tests**************************************************************************************************
Opt("MustDeclareVars", 1) ;0=no, 1=require pre-declare (not the default)
Dim $SavedTXTfile = "C:\test.txt", $K, $StartTime, $URL = "http://www.autoitscript.com/autoit3/", $TSleep
Dim $Loops = 20, $T0, $T1, $LoopOverHead, $TminOH = 1000, $TminS = 1000, $TminA1 = 1000, $TminA2 = 1000
Dim $DownLoadTimeSync, $DownLoadTimeASync1, $DownLoadTimeASync2

;establish loop time without downloading
$StartTime = TimerInit()
For $K = 1 To $Loops
    $T0 = TimerInit()
    $T1 = TimerDiff($T0)
    If $T1 < $TminOH Then $TminOH = $T1
Next
$TminOH = Round($TminOH, 3)
$LoopOverHead = Round(TimerDiff($StartTime) / $Loops, 3)

;measure average download time for synchronous download (first pass; download times get smaller if you repeatedly download the same site and then stabilize, so these downloads let things stabilize)
$StartTime = TimerInit()
For $K = 1 To $Loops
    $T0 = TimerInit()
    InetGet($URL, $SavedTXTfile, 1, 0) ;start and finish download (synchronous)
    $T1 = TimerDiff($T0)
    If $T1 < $TminS Then $TminS = $T1
Next
$TminS = Round($TminS, 3)
$DownLoadTimeSync = Round(TimerDiff($StartTime) / $Loops - $LoopOverHead, 3)

;measure average download time for synchronous download after download times have stabilized above
$StartTime = TimerInit()
For $K = 1 To $Loops
    $T0 = TimerInit()
    InetGet($URL, $SavedTXTfile, 1, 0) ;start and finish download (synchronous)
    $T1 = TimerDiff($T0)
    If $T1 < $TminS Then $TminS = $T1
Next
$TminS = Round($TminS, 3)
$DownLoadTimeSync = Round(TimerDiff($StartTime) / $Loops - $LoopOverHead, 3)

;measure average download time for Asynchronous download without initial sleep
$StartTime = TimerInit()
For $K = 1 To $Loops
    $T0 = TimerInit()
    InetGet($URL, $SavedTXTfile, 1, 1) ;start download in background mode (asynchronous)
    While @InetGetActive ;loop until the download is finished
        Sleep(10) ;simulate InetGet's synchronous polling of roughly every 10 msec
    WEnd
    $T1 = TimerDiff($T0)
    If $T1 < $TminA1 Then $TminA1 = $T1
Next
$TminA1 = Round($TminA1, 3)
$DownLoadTimeASync1 = Round(TimerDiff($StartTime) / $Loops - $LoopOverHead, 3)

;measure average download time for Asynchronous download with initial sleep prior to loop checking for completed download
$TSleep = $TminS - 20 ;change this line to try different sleep times and see the effect on the download times

$StartTime = TimerInit()
For $K = 1 To $Loops
    $T0 = TimerInit()
    InetGet($URL, $SavedTXTfile, 1, 1);start download in background mode (asynchronous)
    Sleep($TSleep);sleep close to the maximum amount of time before a download could be complete
    While @InetGetActive
        Sleep(10) ;simulate InetGet synchronous checking back about every 10 msec
    WEnd
    $T1 = TimerDiff($T0)
    If $T1 < $TminA2 Then $TminA2 = $T1
Next
$TminA2 = Round($TminA2, 3)
$DownLoadTimeASync2 = Round(TimerDiff($StartTime) / $Loops - $LoopOverHead, 3)

ConsoleWrite("Loop Over Head = " & $LoopOverHead & "  Loop min cycle time = " & $TminOH & @CR)
ConsoleWrite("Synchronous average DL time = " & $DownLoadTimeSync & "  Sync. min cycle time = " & $TminS & @CR)
ConsoleWrite("Asynchronous average DL time = " & $DownLoadTimeASync1 & "  Async. min cycle time = " & $TminA1 & @CR)
ConsoleWrite("Asynchronous average DL time with sleep = " & $DownLoadTimeASync2 & "  Async. min cycle time with sleep = " & $TminA2 & "  Sleep time= " & $TSleep & @CR)

Exit

Best Regards

Larry


I'm, Lovin' IT, X


#2 · Posted (edited)

I would fully expect that on a fast broadband connection the web page you're downloading will arrive along with the header. In other words, with a page that small, the file is downloaded almost before the InetGet() function returns, even in asynchronous mode. That means you're sleeping while the download is already finished. A file size that small also produces irrelevant results. There's simply not enough time to accurately simulate "real-world" conditions.

Keep in mind, too, the medium you're using. The internet is a very unpredictable transport medium. You seem to want to quantify things in terms of milliseconds, but with the internet the results are going to be skewed significantly by luck more than anything else. This is especially true of a 4,580-byte web page. The larger the file, the better the delta you can calculate.

I will grant you one thing. Your previous statement about asynchronous mode blocking for a time is true (in one of your (too) many threads on this subject). There are some bits of code that should be moved out to the worker thread. Currently AutoIt doesn't go into asynchronous mode until after the connection has been established with the remote site. If the site is down or slow to respond, the function will briefly block when it shouldn't.

Edit: Fixed spelling mistakes.

Edited by Valik


Thanks for the reply, Valik! I very much appreciate the feedback.

I'd like to respond to some of your comments:

I will grant you one thing. Your previous statement about asynchronous mode blocking for a time is true (in one of your (too) many threads on this subject). There are some bits of code that should be moved out to the worker thread. Currently AutoIt doesn't go into asynchronous mode until after the connection has been established with the remote site. If the site is down or slow to respond, the function will briefly block when it shouldn't.

Thanks for confirming this. It explains why, in asynchronous mode, a sleep prior to checking for a completed download can delay the download: the maximum initial sleep you can use is the minimum synchronous download time minus the connect time. My advice now is that if there are tens of packets to download, you can sleep for about half the minimum synchronous download time, but if you have fewer than ten or so packets to download, do not sleep at all before polling. I will have to use workarounds until it gets fixed, if it does (the WinHTTP object, for instance, unless someone has a better suggestion).
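To make the advice concrete, here is a small Python sketch of the sleep-time heuristic (the function name and parameters are my own framing; the ten-packet threshold and the half-of-minimum rule come from the paragraph above):

```python
def initial_sleep_ms(min_sync_ms, connect_ms, packets):
    # Heuristic from the discussion: never sleep longer than the minimum
    # synchronous download time minus the connect time; with tens of
    # packets, about half the minimum is safe; with fewer than ~10
    # packets, skip the initial sleep entirely.
    if packets < 10:
        return 0
    return min(min_sync_ms / 2, max(0, min_sync_ms - connect_ms))
```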

Keep in mind, too, the medium you're using. The internet is a very unpredictable transport medium. You seem to want to quantify things in terms of milliseconds, but with the internet the results are going to be skewed significantly by luck more than anything else. This is especially true of a 4,580-byte web page. The larger the file, the better the delta you can calculate.

I totally agree IF you are talking about the first download from a URL. However, repetitive downloads seem to "burn" a path to the URL, and the download times stabilize. If you do a statistical analysis of the download times, you will see that they can have a standard deviation as small as +/-10 msec! There are outliers, though, where the request and download seem to get "lost" (a few percent of the downloads), and those are what I am trying to control. If you code it right, repetitive downloads are very predictable 98% of the time.
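The statistical analysis I have in mind is nothing fancy. A Python sketch, with made-up times rather than measured ones:

```python
import statistics

# Hypothetical repeated download times in msec for one URL; the numbers
# are invented to illustrate the analysis, not measured.
times_ms = [102.1, 99.8, 101.4, 100.6, 98.9, 103.2, 100.1, 99.5]

mean = statistics.mean(times_ms)
stdev = statistics.stdev(times_ms)
print(f"mean = {mean:.1f} msec, stdev = {stdev:.1f} msec")
# -> mean = 100.7 msec, stdev = 1.4 msec
```

Save the loop's TimerDiff() values to a file from the test script above and feed them through something like this to see how tight the spread really is.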

I would fully expect that on a fast broadband connection the web page you're downloading will arrive along with the header. In other words, with a page that small, the file is downloaded almost before the InetGet() function returns, even in asynchronous mode. That means you're sleeping while the download is already finished. A file size that small also produces irrelevant results. There's simply not enough time to accurately simulate "real-world" conditions.

Did you run my example code? The results clearly show that this is the wrong picture of what is going on. For the web pages I am interested in, about 5 packets are sent back. And the code I posted is not a simulation of real-world conditions; it is exactly the real-world conditions I have to deal with.

(In one of your (too) many threads on this subject.)

The reason I have posted so much on this subject, and on the subject of Sleep(), is that absolutely no one responded to any of those postings until your comments. When no one answered after a reasonable amount of time, I would repost with a simpler request, or with related questions to get information not in the help files, to see whether my workaround ideas might work. I think I can see now why there were no responses: my postings were thought to be loony, since I was trying to do things with downloads that seemed "crazy". But they are not; run the example code and you will see. If you want, save the download times to a file and run statistics on them. The bottom line is that the current wisdom about the unpredictability of downloads is wrong if you do repetitive downloads.

Thanks again for some feedback and vital information.

Best Regards

Larry aka Roger Dangerfield :)




Asynchronous downloads are slower. I will attribute some of this time to the implementation. It's probably a combination of not doing everything in the worker thread (the bug that you found) and also plain and simple thread overhead that occurs on startup. I think that fixing the "bug" would alleviate some of this slow-down. To me, that renders your tests worthless (at this point). The so-called asynchronous mode is actually a mixed-mode function at the moment.


Asynchronous downloads are slower. I will attribute some of this time to the implementation. It's probably a combination of not doing everything in the worker thread (the bug that you found) and also plain and simple thread overhead that occurs on startup. I think that fixing the "bug" would alleviate some of this slow-down. To me, that renders your tests worthless (at this point). The so-called asynchronous mode is actually a mixed-mode function at the moment.


Yes, I agree, asynchronous downloads are slower, UNLESS the download gets "lost" and takes too long (I have no idea what the correct buzzword for this is). You can catch that event in asynchronous mode and restart the download, beating the synchronous download time. For non-problem downloads, asynchronous is at most a couple of percent slower than synchronous, so (for me) the benefit of catching the problem downloads outweighs the slightly slower routine download time.
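The restart idea can be sketched like this in Python (start, cancel, and is_active are hypothetical stand-ins for the AutoIt calls; cancel stands for whatever abort/cleanup mechanism your workaround provides):

```python
import time

def fetch_with_restart(start, cancel, is_active, limit_ms, retries=2):
    # Poll a background download; if it runs past limit_ms, cancel it
    # and start over. Returns the attempt number that succeeded, or
    # None if every attempt timed out.
    for attempt in range(retries + 1):
        start()
        deadline = time.monotonic() + limit_ms / 1000.0
        while is_active():
            if time.monotonic() >= deadline:
                cancel()
                break
            time.sleep(0.01)
        else:
            return attempt  # loop ended because the download finished
    return None
```

With limit_ms set just above the routine download time, a "lost" download costs one limit_ms plus a retry instead of several seconds.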

Re the testing: it gives insight into how best to use the mixed-mode background InetGet. In addition, if you repeatedly run the sample code, you will see that download times can be very repeatable, contrary to your thoughts on the subject. Sounds pretty useful to me!

Larry




As it is not a bug, I am moving it to Support :)


As it is not a bug, I am moving it to Support :evil:


However, the help file says:

background [optional] 0 = (default) Wait until the download is complete before continuing.

1 = return immediately and download in the background (see remarks).

and Valik said:

Currently AutoIt doesn't go into asynchronous mode until after the connection has been established with the remote site. If the site is down or slow in responding, this will cause the function to briefly block when it shouldn't.

My experience is that "briefly" is an understatement: it can be as long as several seconds, instead of the typical 30-100 msec for a high-speed connection and a good server on the other end. I can't live with even 100 msec. I think there is no question that this is a bug. :)
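For what it's worth, that blocking can be measured directly with the same TimerInit()/TimerDiff() idea used in my test script. A Python sketch of the measurement (the function name is mine):

```python
import time

def call_blocking_ms(fn):
    # Time how long a supposedly non-blocking call takes to return,
    # analogous to wrapping InetGet($URL, $File, 1, 1) in
    # TimerInit()/TimerDiff().
    t0 = time.perf_counter()
    fn()
    return (time.perf_counter() - t0) * 1000.0
```

A background call that returns in tens of milliseconds is tolerable; one that returns in seconds is the blocking described above.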

Regards

Larry




It's not a bug, it's an oversight. The code works exactly as it is written; it doesn't work exactly as it's documented. That doesn't necessarily mean it's a bug in the code. I would personally rather see the code corrected to reflect the documentation than the other way around. In either case, though, this is not a show-stopper, and it's not a bug either. It's just something we need to take a look at sometime.

