
Hi all,

I was wondering if you can help me with the InetRead() function.
My scripts use this function a lot under several conditions, and everything works fine!

But sometimes, when the server is a little buggy or simply not available, my script hangs.
It takes about 90 seconds before this function returns a timeout, and even when I adjust the "TCPTimeout" option it still hangs for about 90 seconds.

The following script is an example where the script hangs for approx. 90 seconds:

; Set timeout to 2 seconds
AutoItSetOption("TCPTimeout", 2000)

; Read website (the original URL was stripped by the forum; this one is a placeholder for an unreachable server)
Local $dData = InetRead("http://unreachable.example.com/")

; Show MsgBox before ending the script.
MsgBox(0, "Done", "InetRead returned " & BinaryLen($dData) & " bytes")


The following script is an example where the script shows the MsgBox pretty fast:


; Set timeout to 2 seconds
AutoItSetOption("TCPTimeout", 2000)

; Read website (placeholder URL for a server that responds quickly; the original was stripped by the forum)
Local $dData = InetRead("http://example.com/")

; Show MsgBox before ending the script.
MsgBox(0, "Done", "InetRead returned " & BinaryLen($dData) & " bytes")


My question now is: what am I doing wrong, and/or is there another way to prevent the script from hanging?

Thanks all :)



Hey guys,

Searching through multiple forum topics, I realized that a timeout option for this function isn't coming anytime soon.
So I created a simple alternative to the InetRead() function.

My function first extracts the domain from the URL using the _URLSplit() function from SmOke_N (thanks! :) ).
Then it simply starts an InetGet() on that domain in the background, without waiting for it, and enters a polling loop for the timeout.

If InetGetInfo() reports that the download is complete, the function passes the full URL on to the original InetRead() function and returns the data.

Feel free to use it if you need it; it is working perfectly here :)

#include <InetConstants.au3>
#include <MsgBoxConstants.au3>
#include <WinAPIFiles.au3>
#include <Array.au3>

; Read website (placeholder URL; the original demo call was stripped by the forum)
Local $dData = _InetRead("http://example.com/")

; Show MsgBox before ending the script.
MsgBox(0, "Done", "_InetRead returned " & BinaryLen($dData) & " bytes")

Func _InetRead($IR_URL, $IR_OPTIONS = 1, $IR_TIMEOUT = 2000)

    ; Declare variables
    Local $sFilePath = _WinAPI_GetTempFileName(@TempDir)
    Local $ProcessCounter = 0
    Local $ReadOutput, $GetDomain

    $GetDomain = _URLSplit($IR_URL)

    If Not IsArray($GetDomain) Then Return ""

    ; Start connection to the server in the background, forced reload.
    Local $hDownload = InetGet($GetDomain[1] & "://" & $GetDomain[2], $sFilePath, $INET_FORCERELOAD, $INET_DOWNLOADBACKGROUND)

    ; Poll every 50 ms until the download completes or the timeout elapses.
    Do
        Sleep(50)
        $ProcessCounter += 1
    Until InetGetInfo($hDownload, $INET_DOWNLOADCOMPLETE) Or $ProcessCounter > ($IR_TIMEOUT / 50)

    ; Close the handle returned by InetGet (this also aborts a still-pending download).
    InetClose($hDownload)

    ; Delete the temporary file.
    FileDelete($sFilePath)

    If $ProcessCounter > ($IR_TIMEOUT / 50) Then Return ""

    ; The server responded in time, so do the real read.
    $ReadOutput = InetRead($IR_URL, $IR_OPTIONS)
    Return $ReadOutput
EndFunc   ;==>_InetRead


Func _URLSplit($szUrl)
    Local $sSREPattern = '^(?s)(?i)(http|ftp|https|file)://(.*?/|.*$)(.*/){0,}(.*)$'
    Local $aUrlSRE = StringRegExp($szUrl, $sSREPattern, 2)
    If Not IsArray($aUrlSRE) Or UBound($aUrlSRE) - 1 <> 4 Then Return SetError(1, 0, 0)
    If StringRight($aUrlSRE[2], 1) = '/' Then
        $aUrlSRE[2] = StringTrimRight($aUrlSRE[2], 1)
        $aUrlSRE[3] = '/' & $aUrlSRE[3]
    EndIf
    ; [1] = protocol, [2] = domain, [3] = path, [4] = file
    Return $aUrlSRE
EndFunc   ;==>_URLSplit
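To illustrate what _URLSplit() returns (the URL here is just a made-up example, not from the original post), a quick check could look like this:

```autoit
; Example only: split a sample URL into its parts.
Local $aParts = _URLSplit("http://www.example.com/folder/page.html")
If IsArray($aParts) Then
    ; $aParts[1] = protocol, $aParts[2] = domain, $aParts[3] = path, $aParts[4] = file
    ConsoleWrite("Protocol: " & $aParts[1] & @CRLF)
    ConsoleWrite("Domain:   " & $aParts[2] & @CRLF)
EndIf
```

_InetRead() above only uses elements [1] and [2], since it just needs to probe the server itself before fetching the full URL.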


Just paste these functions into your script, then simply add the underscore to all your InetRead() calls. Enjoy! :)

PS: The default timeout setting is 2000 milliseconds, but you can change it to suit your needs.
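For example, a call with a 5-second timeout instead of the default could look like this (the URL is a placeholder):

```autoit
; Third parameter overrides the default 2000 ms timeout.
Local $dData = _InetRead("http://www.example.com/page.html", 1, 5000)
If $dData = "" Then
    MsgBox(0, "Timeout", "The server did not respond within 5 seconds.")
EndIf
```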

Edited by Blueman

