Muhammad_Awais_Sharif

InetGet: correct way of checking for errors while downloading a file


I am using this code to download a file, but I don't know the correct way of checking for errors during the download.

#include <InetConstants.au3> ; needed for $INET_DOWNLOADBACKGROUND

Global $DownloadStatus = InetGet($url, $fileName, Default, $INET_DOWNLOADBACKGROUND)

While 1
    $ArrayOFDownloadStatuts = InetGetInfo($DownloadStatus)
    If $ArrayOFDownloadStatuts[4] <> 0 Then ; index 4 holds the error value for the download
            MsgBox(0, "Error", "Error During Downloading")
            ExitLoop
    EndIf
    If $ArrayOFDownloadStatuts[2] Then ; index 2 is True once the download has finished
            MsgBox(0, "", "Completed")
            ExitLoop
    EndIf
    Sleep(250) ; don't hammer the CPU while polling
WEnd
InetClose($DownloadStatus) ; release the download handle

I am using that code, but I don't know why it reports an error while downloading the file.

Is this the wrong way of checking for errors, or is there a network error on my PC? :P
Thank you :D

From the help file: "fifth element (at index 4) - The error value for the download. The value itself is arbitrary. Testing that the value is non-zero is sufficient for determining if an error occurred" - so I assume that if that element has a value of zero then there is no error. You could also do something like this:

$ArrayOFDownloadStatuts = InetGetInfo($DownloadStatus)
If @error Then
    ; do something - InetGetInfo sets @error when it cannot retrieve any information (e.g. an invalid handle)
EndIf
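
If you want a fuller picture, something along these lines would work too (only a sketch - the URL, the file path and the variable names are placeholders, and the index constants come from InetConstants.au3): wait for the transfer to finish, then look at the error element, and also check @error of InetGetInfo itself:

#include <InetConstants.au3>

; background download; the URL and target path below are only placeholders
Local $hDownload = InetGet("https://www.example.com/file.zip", @TempDir & "\file.zip", _
        $INET_FORCERELOAD, $INET_DOWNLOADBACKGROUND)

While 1
    Local $aInfo = InetGetInfo($hDownload)
    If @error Then
        ; InetGetInfo itself failed (e.g. invalid handle)
        MsgBox(0, "Error", "InetGetInfo failed, @error = " & @error)
        ExitLoop
    EndIf
    If $aInfo[$INET_DOWNLOADCOMPLETE] Then ; transfer finished (successfully or not)
        If $aInfo[$INET_DOWNLOADERROR] <> 0 Then
            MsgBox(0, "Error", "Download failed, error value = " & $aInfo[$INET_DOWNLOADERROR])
        Else
            MsgBox(0, "Done", "Downloaded " & $aInfo[$INET_DOWNLOADREAD] & " bytes")
        EndIf
        ExitLoop
    EndIf
    Sleep(250) ; poll roughly four times per second
WEnd
InetClose($hDownload) ; always release the handle

Checking the complete flag before the error element follows the pattern of the InetGet help-file example, and InetClose() releases the handle once you are done with it.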


Thanks for the suggestion :D


You are welcome :D



  • Similar Content

    • By jonson1986
      Hey
      I'm trying to use the InetGet function to download multiple images from a website; some pages have three images, some have four, and some have even more...
      I wrote the code below to work with AutoIt. The problem is that when AutoIt hits a URL that isn't available, the script just stops without any error. I want to save all the available images on the website; when no more links are left, the script should move on, but instead it just stops...
      Here is the complete scenario:
      I have many webpages, each hosting a random number of images. In the code below, InetGet downloads the first three files easily, but when it reaches the 4th link, which is missing from the webpage, the script just stops. I want AutoIt to download only the images whose links are available and skip the rest automatically for that page; if the 4th image link is available on the next page, the script should download that one too.
      Furthermore, please help me download the files under their original names instead of names I pick myself; with InetGet I have to supply a name for the file I'm downloading rather than keeping the original name that is available online.
      Please help me.
      Here is my code:
      $File6 = @ScriptDir & "\images_source.txt"
      $txt6 = FileRead($File6)
      $target_source7 = _StringBetween($txt6, 'src="', '"')
      If Not @error Then
          InetGet($target_source7[0], @ScriptDir & '\Description\Image1.jpg') ; 1st image downloads because the link is available
          InetGet($target_source7[1], @ScriptDir & '\Description\Image2.jpg') ; 2nd image downloads because the link is available
          InetGet($target_source7[2], @ScriptDir & '\Description\Image3.jpg') ; 3rd image downloads because the link is available
          InetGet($target_source7[3], @ScriptDir & '\Description\Image4.jpg') ; 4th link is not available, so the script just stops here
      EndIf
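      A possible sketch of a loop-based approach (only an illustration - the source file, the Description folder and the file-name handling are assumptions): iterate over however many links _StringBetween actually returned, using UBound(), and take each file name from the last part of the URL so the original names are kept.
      #include <String.au3> ; _StringBetween

      Local $sHtml = FileRead(@ScriptDir & "\images_source.txt")
      Local $aUrls = _StringBetween($sHtml, 'src="', '"')
      If Not @error Then
          DirCreate(@ScriptDir & "\Description")
          For $i = 0 To UBound($aUrls) - 1 ; only the links that really exist
              ; keep the original name: everything after the last "/" in the URL
              Local $sName = StringTrimLeft($aUrls[$i], StringInStr($aUrls[$i], "/", 0, -1))
              InetGet($aUrls[$i], @ScriptDir & "\Description\" & $sName)
          Next
      EndIf
      If a URL carries query parameters the derived name would need extra cleaning, but the loop itself never indexes past the end of the array, so a missing 4th link can no longer stop the script.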
    • By 31290
      Hi Guys, 
      Since I'm able to get a Dell equipment warranty status thanks to my API key, I'm using a UDF to extract data from an XML file and get the end date.
      The thing is, when using InetGet, the file comes back in JSON format and the UDF no longer works, even if I download the file with an .xml extension. However, when I manually download the page with Chrome, I get a proper XML file and the UDF works fine.
      Here's my code:
      I even tried to convert the json to xml > https://www.autoitscript.com/forum/topic/185717-js-json-to-xml/
      I took a look here https://www.autoitscript.com/forum/topic/104150-json-udf-library-fully-rfc4627-compliant/ but I don't understand anything :/
       
      The XML read UDF is just perfect for my needs, but I'm stuck here...
      Thanks for any help you can provide.
      -31290-
      3MTXM12.json
      3MTXM12.xml
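      One guess (only an assumption, since the Dell endpoint isn't shown here) is that the server returns JSON or XML depending on the request's Accept header, and Chrome asks for XML while InetGet does not. A sketch that requests XML explicitly via the WinHttp COM object instead of InetGet and saves the response next to the script (the URL is a placeholder):
      Local $sUrl = "https://api.example.com/warranty/3MTXM12" ; placeholder endpoint, not the real Dell API URL
      Local $oHttp = ObjCreate("WinHttp.WinHttpRequest.5.1")
      $oHttp.Open("GET", $sUrl, False)
      $oHttp.SetRequestHeader("Accept", "application/xml") ; ask the server for XML instead of JSON
      $oHttp.Send()
      If $oHttp.Status = 200 Then
          Local $hFile = FileOpen(@ScriptDir & "\3MTXM12.xml", 2) ; 2 = open for writing, erase previous contents
          FileWrite($hFile, $oHttp.ResponseText)
          FileClose($hFile)
      EndIf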
    • By Dent
      Hi everyone,
      My script uses IE11 on Win7 to log in to a site and enters data into a couple of forms. Upon clicking a link this data is used by the site to generate a PDF report.
      With my current set-up, if I do this manually the PDF opens in a new IE tab and I can download or print it. If I right-click the link that creates the PDF and choose Save Target As, the PDF is generated and the Open/Save As dialogue at the bottom of the screen opens. All good.
      However, I would like the script to automatically download the PDF, close IE and then exit. Closing IE (_IEQuit) and exiting the script are easy enough, but I'm struggling to get the script to download the PDF.
      The link to generate the PDF contains a unique number each time the page is reached, so it's not static. The link position is consistent, though: using _IELinkGetCollection I can tell that the link to generate the PDF is always the 10th one from the end of the page, so using $iNumLinks - 10 I am able to click it.
      What I believe I need to use is InetGet; however, the problem I've been facing is that the link isn't static and I haven't worked out a way to access the link by index - is this possible?
      Here is the website HTML for the section containing the link. I don't think it's of much use, but it at least shows the format of the link (I can't post a full link as it's a password-protected area)...
      <div class="rmButton right"><a title="Generates a PDF version of the market report in a new window." href="/rmplus/generatePdf?mr_id=60991" target="_blank">print/save as pdf</a></div> The full link https://www.rightmove.co.uk/rmplus/generatePdf?mr_id=60991 just for completeness - visiting it will give a HTTP 500 unless logged in.
      And here is the code that clicks this link opening the generated PDF in a new tab...
      $oLinks = _IELinkGetCollection($oIE)
      $iNumLinks = @extended
      $PrintPDF = _IELinkClickByIndex($oIE, ($iNumLinks - 10))
      So, how do I use InetGet to visit that link? Or is there a way to Save As the newly opened tab? I've tried _IEAction($oIE, "saveas") but it doesn't seem to work in a tab containing only a PDF.
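      A sketch of one way to do it (assuming the PDF link is served through the same WinINet session that InetGet uses, so the login cookies still apply; the target path is a placeholder): instead of clicking the link, read its href from the collection by index and hand the URL to InetGet.
      #include <IE.au3>
      #include <InetConstants.au3>

      Local $oLinks = _IELinkGetCollection($oIE)
      Local $iNumLinks = @extended
      ; fetch the single link object that sits 10th from the end of the page
      Local $oPdfLink = _IELinkGetCollection($oIE, $iNumLinks - 10)
      Local $sPdfUrl = $oPdfLink.href ; IE exposes the absolute URL here
      ; synchronous download to a placeholder path
      InetGet($sPdfUrl, @ScriptDir & "\report.pdf", $INET_FORCERELOAD)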
    • By Muhammad_Awais_Sharif
      Hi
      I want to run UrlDownloadEx in the background so the GUI doesn't hang.
      It should give a response during the download of any file with this UDF.
      Please help me.
      Thank you
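      In case a plain InetGet background download is an acceptable substitute for the UrlDownloadEx UDF, here is a rough sketch (URL and file path are placeholders) that polls InetGetInfo from the GUI loop, so the window stays responsive and can show the bytes received so far:
      #include <GUIConstantsEx.au3>
      #include <InetConstants.au3>

      Local $hGui = GUICreate("Downloader", 300, 80)
      Local $idLabel = GUICtrlCreateLabel("Starting...", 10, 10, 280, 20)
      GUISetState(@SW_SHOW, $hGui)

      ; placeholder URL and target file
      Local $hDownload = InetGet("https://www.example.com/big.zip", @TempDir & "\big.zip", _
              $INET_FORCERELOAD, $INET_DOWNLOADBACKGROUND)

      While 1
          Switch GUIGetMsg() ; keeps the GUI responsive while the download runs in the background
              Case $GUI_EVENT_CLOSE
                  ExitLoop
          EndSwitch
          If InetGetInfo($hDownload, $INET_DOWNLOADCOMPLETE) Then
              GUICtrlSetData($idLabel, "Finished: " & InetGetInfo($hDownload, $INET_DOWNLOADREAD) & " bytes")
          Else
              GUICtrlSetData($idLabel, InetGetInfo($hDownload, $INET_DOWNLOADREAD) & " bytes received...")
          EndIf
      WEnd
      InetClose($hDownload)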
       
    • By rudi
      Hello.
       
      What does it mean when InetGet() returns @error = 13?
       
       
      This script worked exactly once to download a single JPG image file from a webcam (GANZ, model ZN-MD221M). Every later attempt failed.
      The URL (see script below) doesn't work in FF, IE or Chrome either. In FF it tries to download the image for maybe 5 minutes, then FF gives up, displaying this message:
      Die Grafik "http://ADMIN:1234@10.27.20.211/cgi-bin/still-image.cgi?duration=1~60" kann nicht angezeigt werden, weil sie Fehler enthält. (In English: The image <url> cannot be displayed because it contains errors.)
      Wireshark traces show simply no "answer" to the "GET" for the image file. This is my script:

       
      ; GANZ cam in the bunker
      #include <inet.au3>
      HttpSetProxy(1)
      $URL = "http://ADMIN:1234@10.27.20.211/cgi-bin/still-image.cgi?duration=1~60"
      ConsoleWrite('@@ Debug(' & @ScriptLineNumber & ') : $URL = ' & $URL & @CRLF & '>Error code: ' & @error & @CRLF) ;### Debug Console
      $file = "C:\temp\bunker\bunker.jpg"
      FileDelete($file)
      ConsoleWrite('@@ Debug(' & @ScriptLineNumber & ') : $File = ' & $File & @CRLF & '>Error code: ' & @error & @CRLF) ;### Debug Console
      $result = InetGet($URL, $File, 1 + 2)
      ConsoleWrite('@@ Debug(' & @ScriptLineNumber & ') : $result = ' & $result & @CRLF & '>Error code: ' & @error & @CRLF) ;### Debug Console
      The console output is this one:
      >"C:\Program Files (x86)\AutoIt3\SciTE\..\AutoIt3.exe" "C:\Program Files (x86)\AutoIt3\SciTE\AutoIt3Wrapper\AutoIt3Wrapper.au3" /run /prod /ErrorStdOut /in "H:\DATEN\PUBLIC\Bunker\Script\Mittags-ein-Schuss.au3" /UserParams +>17:18:39 Starting AutoIt3Wrapper v.15.920.938.0 SciTE v.3.6.0.0 Keyboard:00000407 OS:WIN_7/Service Pack 1 CPU:X64 OS:X64 Environment(Language:0407) +> SciTEDir => C:\Program Files (x86)\AutoIt3\SciTE UserDir => C:\Users\admin\AppData\Local\AutoIt v3\SciTE\AutoIt3Wrapper SCITE_USERHOME => C:\Users\admin\AppData\Local\AutoIt v3\SciTE >Running AU3Check (3.3.14.2) from:C:\Program Files (x86)\AutoIt3 input:H:\DATEN\PUBLIC\Bunker\Script\Mittags-ein-Schuss.au3 +>17:18:39 AU3Check ended.rc:0 >Running:(3.3.14.2):C:\Program Files (x86)\AutoIt3\autoit3.exe "H:\DATEN\PUBLIC\Bunker\Script\Mittags-ein-Schuss.au3" --> Press Ctrl+Alt+Break to Restart or Ctrl+Break to Stop @@ Debug(8) : $URL = http://ADMIN:1234@10.27.20.211/cgi-bin/still-image.cgi?duration=1~60 >Error code: 0 @@ Debug(15) : $File = C:\temp\bunker\bunker.jpg >Error code: 0 @@ Debug(20) : $result = 0 ####################### >Error code: 13 ####################### +>17:19:09 AutoIt3.exe ended.rc:0 +>17:19:09 AutoIt3Wrapper Finished. >Exit code: 0 Time: 30.6  
      So InetGet() returns "size of download = 0 bytes" and @error = 13 - what does this error code mean?
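      One way to dig further (a sketch, assuming the camera behaves the same in background mode; same URL and target file as in the script above): run the download with $INET_DOWNLOADBACKGROUND and read the full InetGetInfo array, whose element at index 5 carries the extended error value and sometimes gives more detail than the arbitrary error value at index 4.
      #include <InetConstants.au3>

      Local $sUrl = "http://ADMIN:1234@10.27.20.211/cgi-bin/still-image.cgi?duration=1~60"
      Local $hDownload = InetGet($sUrl, "C:\temp\bunker\bunker.jpg", _
              $INET_FORCERELOAD, $INET_DOWNLOADBACKGROUND)

      Do
          Sleep(250)
      Until InetGetInfo($hDownload, $INET_DOWNLOADCOMPLETE)

      Local $aInfo = InetGetInfo($hDownload)
      ; index 3 = success flag, index 4 = arbitrary error value, index 5 = extended error value (useful for debugging)
      ConsoleWrite("success=" & $aInfo[$INET_DOWNLOADSUCCESS] & _
              " error=" & $aInfo[$INET_DOWNLOADERROR] & _
              " extended=" & $aInfo[$INET_DOWNLOADEXTENDED] & @CRLF)
      InetClose($hDownload)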
       
       
      Regards, Rudi.