Tjalve

InetGet issues

13 posts in this topic

Hello everyone.

I'm having an issue with the InetGet function and I was hoping someone could provide some assistance.

I'm writing a script that monitors several servers and sends performance information to a central server via a web service. I then get a dashboard showing the health of all my servers.

For this, I've written a simple updater program. The idea is that the program checks online for a new version and, if one is found, downloads and installs it. In testing it all works, but in practice I've run into server issues. One was that the updater program (a separate script from the main script) was crashing when trying to unzip the downloaded zip file (which contains the new version of the script and its configuration). This happened on about 60% of the servers.

On closer inspection it seems that the InetGet function doesn't download the entire file. It only downloads about 140 KB of it, so unzipping the file crashes the script.

To keep the script from crashing, I added a check comparing the size of the file before and after the download. That works, of course, but it doesn't solve the problem. I guess it has to do with firewalls, antivirus software or something like that. The problem is that these servers are on infrastructure that is out of my control. Downloading the file manually through Internet Explorer works, though.

Any help would be appreciated.

Here is part of my code.

$dlsize = InetGetSize($dl4, 1)
FileWriteLine(@ScriptDir & "\verbose.log", @YEAR & "-" & @MON & "-" & @MDAY & " " & @HOUR & ":" & @MIN & ":" & @SEC & " - Updater - Downloading file: " & $dl4)
FileWriteLine(@ScriptDir & "\verbose.log", @YEAR & "-" & @MON & "-" & @MDAY & " " & @HOUR & ":" & @MIN & ":" & @SEC & " - Updater - File is: " & $dlsize & " bytes.")
InetGet($dl4, @TempDir & "\system.zip", 1)
Sleep(2000)
$retry = 10
While 1
    If FileExists(@TempDir & "\system.zip") Then
        If $dlsize = FileGetSize(@TempDir & "\system.zip") Then
            FileWriteLine(@ScriptDir & "\verbose.log", @YEAR & "-" & @MON & "-" & @MDAY & " " & @HOUR & ":" & @MIN & ":" & @SEC & " - Updater - Download OK")
            ExitLoop
        EndIf
    Else
        FileWrite(@TempDir & "\system.zip", InetRead($dl4, 1))
    EndIf
    Sleep(1000)
    If $retry = 0 Then
        FileWriteLine(@ScriptDir & "\verbose.log", @YEAR & "-" & @MON & "-" & @MDAY & " " & @HOUR & ":" & @MIN & ":" & @SEC & " - Updater - Download Failed")
        If $silent = 0 Then MsgBox(0, "ERROR", "Could not download file. Check antivirus and firewalls.")
        Exit
    EndIf
    $retry = $retry - 1
WEnd

Have you read the Remarks section of the help file entry for InetGet?

Yes, I have.

I believe you might be downloading a binary file in ASCII mode; try something like this instead and see if it makes any difference.

InetGet($dl4,@TempDir & "\system.zip",9) ; Binary mode

If I posted any code, assume that code was written using the latest release version unless stated otherwise. Also, if it doesn't work on XP I can't help with that because I don't have access to XP, and I'm not going to.
Give a programmer the correct code and he can do his work for a day. Teach a programmer to debug and he can do his work for a lifetime - by Chirag Gude
How to ask questions the smart way!

I hereby grant any person the right to use any code I post, that I am the original author of, on the autoitscript.com forums, unless I've specifically stated otherwise in the code or the thread post. If you do use my code all I ask, as a courtesy, is to make note of where you got it from.

Back up and restore Windows user files _Array.au3 - Modified array functions that include support for 2D arrays.  -  ColorChooser - An add-on for SciTE that pops up a color dialog so you can select and paste a color code into a script.  -  Customizable Splashscreen GUI w/Progress Bar - Create a custom "splash screen" GUI with a progress bar and custom label.  -  _FileGetProperty - Retrieve the properties of a file  -  SciTE Toolbar - A toolbar demo for use with the SciTE editor  -  GUIRegisterMsg demo - Demo script to show how to use the Windows messages to interact with controls and your GUI.  -   Latin Square password generator

I believe you might be downloading a binary file in ASCII mode; try something like this instead and see if it makes any difference.

InetGet($dl4,@TempDir & "\system.zip",9) ; Binary mode

Thanks. I'll give it a shot.

The strange thing is that it seems to work on some servers (including my own machine). I'm also downloading a small text file (with the version information in it), and that one always downloads. That might be because it's so small, perhaps. But I will try :).

I have now tested it. I also added a FileWriteLine to log how much of the file had been downloaded every second.

I get 166912 bytes every time.

Binary didn't help. And it seems that if the file is smaller than that, it works. But I cannot be 100% sure.

Any more ideas?

One thing I might suggest trying is to use FileOpen in binary and write mode before using this line:

FileWrite(@TempDir & "\system.zip",InetRead($dl4,1))

Also, use binary mode on the InetRead line as well. FileWrite may also be failing to write the complete file correctly if it's binary data and you're trying to write it without opening the file in binary mode. Just a thought I had.



One thing I might suggest trying is to use FileOpen in binary and write mode before using this line:

FileWrite(@TempDir & "\system.zip",InetRead($dl4,1))

Also, use binary mode on the InetRead line as well. FileWrite may also be failing to write the complete file correctly if it's binary data and you're trying to write it without opening the file in binary mode. Just a thought I had.

You mean like this?

$filmeck = FileOpen(@TempDir & "\test.zip",18)
FileWrite($filmeck,InetRead($dl4,9))
FileClose($filmeck)

I tried that, but I just ended up with an empty zip file of 0 bytes.

Alright. I've done some more digging. I uploaded a new file and everything worked just as it should, for a while. Now I get the same results again, but this time it's stuck at 800 KB.

It feels like there is something blocking the download. I don't know, some kind of antivirus or something? Does anyone have any ideas?

If I just enter the URL in IE on the server, I can download and save the entire file without issues.

Edited by Tjalve

Tjalve,

I have a similar issue:

Every time I utilize InetGet for a ZIP file, the saved file is 8 KB (even when downloading from my local machine). I can manually enter the URL in IE and the file downloads properly. I have no problems downloading other file types such as PDF and TXT using InetGet.

Did you find a solution to this?

A couple of questions which may help track down the error.

Why do you call Sleep() in your script, if you're downloading synchronously (i.e. blocking until the download completes)? You haven't specified 1 for the optional "background" parameter, so presumably it is blocking as it should.

Why don't you assign the return value of the InetGet() call to a variable and check that it matches the expected size (and/or actual size)?

If the return value was zero, then have you checked the @error variable after the InetGet() call?
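For reference, that check could look something like this (a minimal sketch only; $dl4 and the log path are assumed to be the same as in the code posted earlier in this thread):

```autoit
; Sketch: check the return value and @error of a synchronous InetGet call.
; $dl4 is assumed to hold the download URL, as in the original code.
Local $iExpected = InetGetSize($dl4, 1) ; size reported by the server (force reload)
Local $iWritten = InetGet($dl4, @TempDir & "\system.zip", 9) ; 1 (force reload) + 8 (binary), blocking
If @error Or $iWritten = 0 Then
    ; InetGet failed outright - log the error code before retrying
    FileWriteLine(@ScriptDir & "\verbose.log", "InetGet failed, @error = " & @error)
ElseIf $iWritten <> $iExpected Then
    ; The call returned, but the byte count does not match the reported size
    FileWriteLine(@ScriptDir & "\verbose.log", "Size mismatch: got " & $iWritten & " of " & $iExpected & " bytes")
EndIf
```

That way you can tell a failed connection (zero return, @error set) apart from a truncated transfer (non-zero return smaller than the expected size).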

InetGet works fine for me for zip files.
However, try this one.

Global $sPath = @ScriptDir & "\test.zip"
$sUrl = "http://download.thinkbroadband.com/20MB.zip"
FileDownload($sUrl, $sPath)

Func FileDownload($url, $SavePath)
    Local $xml, $Stream
    $xml = ObjCreate("Microsoft.XMLHTTP")
    $Stream = ObjCreate("Adodb.Stream")
    $xml.Open("GET", $url, 0) ; synchronous GET request
    $xml.Send
    $Stream.Type = 1 ; binary stream
    $Stream.Open
    $Stream.Write($xml.ResponseBody)
    $Stream.SaveToFile($SavePath)
    $Stream.Close
EndFunc   ;==>FileDownload
Edited by AutID

Share this post


Link to post
Share on other sites

 

InetGet works fine for me for zip files.

However, try this one.

Global $sPath = @ScriptDir & "\test.zip"
$sUrl = "http://download.thinkbroadband.com/20MB.zip"
FileDownload($sUrl, $sPath)

Func FileDownload($url, $SavePath)
    Local $xml, $Stream
    $xml = ObjCreate("Microsoft.XMLHTTP")
    $Stream = ObjCreate("Adodb.Stream")
    $xml.Open("GET", $url, 0) ; synchronous GET request
    $xml.Send
    $Stream.Type = 1 ; binary stream
    $Stream.Open
    $Stream.Write($xml.ResponseBody)
    $Stream.SaveToFile($SavePath)
    $Stream.Close
EndFunc   ;==>FileDownload

Thanks AutID,

The aforementioned file downloads with no issues. The problem I'm experiencing must be with a particular website / download protocol.


Create an account or sign in to comment

You need to be a member in order to leave a comment

Create an account

Sign up for a new account in our community. It's easy!


Register a new account

Sign in

Already have an account? Sign in here.


Sign In Now
Sign in to follow this  
Followers 0
