
File not downloading...



So I'm attempting to download a file whose name changes every day. I can parse the file name, but for some reason my script just hangs and I never get the downloaded file. If I paste my download link, from the 'ClipPut', into a browser, it works perfectly. Below is what I have so far; can anyone see what nuance I'm missing?



#include <IE.au3>

Global $SequenceNumber




Func GetFileName()
    Local $oIE = _IECreate("http://www.symantec.com/security_response/definitions/download/detail.jsp?gid=sonar", 0, 0)
    Local $sString = _IEBodyReadHTML($oIE)

    Local $iPosSeqStart = StringInStr($sString, 'http://definitions.symantec.com/defs/sonar/')
    Local $iPosSeqEnd = StringInStr($sString, '">', 0, 1, $iPosSeqStart)
    $SequenceNumber = (StringMid($sString, $iPosSeqStart + 43, $iPosSeqEnd - $iPosSeqStart - 43) & @CRLF)
    MsgBox(0, "Download Information:", "DefFile: " & $SequenceNumber & @CRLF & "DownloadPath: " & "http://definitions.symantec.com/defs/sonar/" & $SequenceNumber)
    ClipPut("http://definitions.symantec.com/defs/sonar/" & $SequenceNumber)
EndFunc   ;==>GetFileName

Func Download()
    ; Advanced example - downloading in the background
    Local $hDownload = InetGet("http://definitions.symantec.com/defs/sonar/" & $SequenceNumber, @ScriptDir & "\" & $SequenceNumber, 1, 1)
    Do
        Sleep(250)
    Until InetGetInfo($hDownload, 2) ; Check if the download is complete.
    Local $nBytes = InetGetInfo($hDownload, 0)
    InetClose($hDownload) ; Close the handle to release resources.
    MsgBox(0, "", "Bytes read: " & $nBytes)
EndFunc   ;==>Download
Edited by mdwerne


I used this function in the past. I don't know why, but if the local file already existed, it would not download and overwrite it. I found that testing for the existence of the local file first, and deleting it if found, made the function work. That may not be what's happening in your case, and the bug may have been fixed since I don't have the latest AutoIt, but it shouldn't take long to try the work-around.
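A minimal sketch of that work-around (the URL and local file name here are placeholders, not the poster's actual values):

```autoit
#include <InetConstants.au3>

; Hypothetical URL and local name, for illustration only.
Local $sUrl = "http://www.example.com/defs.exe"
Local $sLocal = @ScriptDir & "\defs.exe"

; Work-around: remove any stale local copy first, so the download
; is never skipped because the destination file already exists.
If FileExists($sLocal) Then FileDelete($sLocal)

Local $iBytes = InetGet($sUrl, $sLocal, $INET_FORCERELOAD) ; blocking download
MsgBox(0, "Done", "Bytes read: " & $iBytes)
```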


The $SequenceNumber variable contains two invisible characters at the end, Chr(13) and Chr(10), that cause the InetGet() function to fail. To remove those characters, use the StringStripWS() function like this, and it will download the file:

Local $hDownload = InetGet("http://definitions.symantec.com/defs/sonar/" & $SequenceNumber, @ScriptDir & "\" & StringStripWS($SequenceNumber,2), 1, 1)
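For reference, a tiny check of what flag 2 does; StringStripWS() with flag 2 removes trailing white space, which includes the CR and LF characters ("20160727.011" is just a made-up sequence number for the demo):

```autoit
Local $s = "20160727.011" & @CRLF
ConsoleWrite("[" & $s & "]" & @CRLF)                   ; closing bracket lands on the next line
ConsoleWrite("[" & StringStripWS($s, 2) & "]" & @CRLF) ; prints [20160727.011]
```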


small minds discuss people average minds discuss events great minds discuss ideas.... and use AutoIt....


@MilesAhead - Thanks for the reply and suggestion, but as long as I'm an admin on the box, which I am, the file overwrites the old one just fine.

@PincoPanco - DUDE!!! (assuming you're a dude ;) ) you're da bomb!!

That was it exactly!!

I mistakenly left the " & @CRLF" at the end of the $SequenceNumber variable.

This bad: $SequenceNumber = (StringMid($sString, $iPosSeqStart + 43, $iPosSeqEnd - $iPosSeqStart - 43) & @CRLF)

This good: $SequenceNumber = (StringMid($sString, $iPosSeqStart + 43, $iPosSeqEnd - $iPosSeqStart - 43))

Once I removed the extra '& @CRLF' all was well and I didn't need to strip the white space.

Thanks for taking the time to troubleshoot this for me. :thumbsup:



I had seen the @CRLF at the end of the variable, but I thought you had put it there for some other obscure reason.... so I removed it only in the function ;)





  • Similar Content

    • Tersion
      By Tersion
      Why can't I get more than two simultaneous background downloads?
      Simultaneous background downloads with InetGet() do work! It was just a limitation of the server I was downloading the files from.
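      For anyone landing here, each file just needs its own InetGet() handle; a sketch of several background downloads with made-up URLs:

```autoit
#include <InetConstants.au3>

; Hypothetical URLs, for illustration only.
Local $aUrl[3] = ["http://www.example.com/a.zip", "http://www.example.com/b.zip", "http://www.example.com/c.zip"]
Local $aHandle[3]

; Start every download in the background; InetGet() returns a handle immediately.
For $i = 0 To UBound($aUrl) - 1
    $aHandle[$i] = InetGet($aUrl[$i], @ScriptDir & "\file" & $i & ".zip", $INET_FORCERELOAD, $INET_DOWNLOADBACKGROUND)
Next

; Poll until every handle reports completion, then release the handles.
Local $iDone
Do
    Sleep(250)
    $iDone = 0
    For $i = 0 To UBound($aHandle) - 1
        If InetGetInfo($aHandle[$i], $INET_DOWNLOADCOMPLETE) Then $iDone += 1
    Next
Until $iDone = UBound($aHandle)

For $i = 0 To UBound($aHandle) - 1
    InetClose($aHandle[$i])
Next
```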
    • leb
      By leb
      Hello there,
      I am using the InetGet example from the help file on a website and getting error 13. First, I searched the help file and the forum for a list of errors to consult, to no avail. I suspect this is a 400-series, server-reply-based error, but it would be nice to get more information about it.
      Please help, thanks.
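      Not an answer to what 13 means, but a sketch of how to pull the raw error value back out of a background download via InetGetInfo() index 4 ($INET_DOWNLOADERROR); the URL is a placeholder, and the help file does not enumerate the individual numeric values:

```autoit
#include <InetConstants.au3>

; Hypothetical URL, for illustration only.
Local $hDownload = InetGet("http://www.example.com/file.bin", @TempDir & "\file.bin", $INET_FORCERELOAD, $INET_DOWNLOADBACKGROUND)
Do
    Sleep(250)
Until InetGetInfo($hDownload, $INET_DOWNLOADCOMPLETE)

; Index 3 reports success/failure; index 4 carries the raw error value.
If Not InetGetInfo($hDownload, $INET_DOWNLOADSUCCESS) Then _
    ConsoleWrite("Download failed, error value: " & InetGetInfo($hDownload, $INET_DOWNLOADERROR) & @CRLF)
InetClose($hDownload)
```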
    • Dante_t
      By Dante_t
      Hi guys, I need help. I searched the forum before posting and couldn't find anything. The code below works fine when downloading files from "http" sites, but when trying to download from "https" sites, no files are downloaded. I tried different sites and experience the same problem everywhere. Is there something I'm missing or doing wrong? Please note that I'm not a programmer and I'm new to this; I'm just using logic wherever I can to get things done. Your help will be highly appreciated.
      #include <InetConstants.au3>
      #include <MsgBoxConstants.au3>
      #include <WinAPIFiles.au3>
      ; Download a file in the background.
      ; Wait for the download to complete.

      Func Example()
          ; Save the downloaded file to the root of drive D:.
          Local $sFilePath = "d:\"
          ; Download the file in the background with the selected option of 'force a reload from the remote site.'
          Local $hDownload = InetGet("https://en.wikipedia.org/wiki/HTTPS#/media/File:Internet2.jpg", $sFilePath& "Internet2.jpg", $INET_FORCERELOAD, $INET_DOWNLOADBACKGROUND)
          ; Wait for the download to complete by monitoring when the 2nd index value of InetGetInfo returns True.
          Do
              Sleep(250)
          Until InetGetInfo($hDownload, $INET_DOWNLOADCOMPLETE)
          ; Retrieve the number of total bytes received and the filesize.
          Local $iBytesSize = InetGetInfo($hDownload, $INET_DOWNLOADREAD)
          Local $iFileSize = FileGetSize($sFilePath&"Internet2.jpg")
          ; Close the handle returned by InetGet.
          InetClose($hDownload)
          ; Display details about the total number of bytes read and the filesize.
          MsgBox($MB_SYSTEMMODAL, "", "The total download size: " & $iBytesSize & @CRLF & _
                  "The total filesize: " & $iFileSize)
          ; Delete the file.
      EndFunc   ;==>Example
    • swatsapkraz
      By swatsapkraz
      First script here. Thanks for taking the time.
      I want to download a file from my dropbox or other cloud file host and I want autoit to read the file and proceed.
      Here are the references I've gone through, it's just I'm not familiar yet with autoit so I'm looking for advice:
      How would I start out downloading a text file from dropbox and if in the file there is a 1 then it will proceed with the rest of the script if there is a 0 or if the file cannot be downloaded I want it to just end.
      Thank you for taking the time to read this and I apologize in advance if this seems very trivial for some but this is my first script and I'm hoping this is the correct place to ask this question.
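      A minimal sketch of that flow, assuming the cloud host serves a plain text file (the URL is a placeholder for the poster's Dropbox link):

```autoit
#include <InetConstants.au3>

; Hypothetical flag-file URL, for illustration only.
Local $sUrl = "https://www.example.com/flag.txt"
Local $sLocal = @TempDir & "\flag.txt"

; Blocking download; InetGet() returns the number of bytes written.
Local $iBytes = InetGet($sUrl, $sLocal, $INET_FORCERELOAD)
If @error Or $iBytes = 0 Then Exit ; could not download: just end

; Proceed only when the file contains "1" (white space stripped from both sides).
If StringStripWS(FileRead($sLocal), 3) <> "1" Then Exit

; ...rest of the script goes here...
```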
    • jonson1986
      By jonson1986
      I'm trying to use the InetGet function to download multiple images from a website; some pages have three images, some have four, some have more...
      I wrote the code below to work with AutoIt, and when AutoIt hits a URL that doesn't exist, the script just stops without any error. I want to save all the images that are available on a page, and when there are no links left, the script should move on, but instead it just stops...
      Here is the complete scenario:
      I have many webpages, each hosting a random number of images. In the code below, InetGet downloads the first three files easily, but when it reaches the fourth link, which is missing from that page, the script just stops. I want AutoIt to download only the images whose links are available and skip the rest automatically for that page; if the fourth image link is available on the next page, the script should download that one as well.
      Furthermore, please help me download the files with their original names; with InetGet I have to make up a name for each downloaded file instead of keeping the original name it has online.
      Please help me.
      Here i my code;
      $File6 = @ScriptDir & "\images_source.txt"
      $txt6 = FileRead($File6)
      $target_source7 = _StringBetween($txt6, 'src="', '"')
      If Not @error Then
          InetGet($target_source7[0], @ScriptDir & '\Description\Image1.jpg') ; 1st image downloads because the link is available
          InetGet($target_source7[1], @ScriptDir & '\Description\Image2.jpg') ; 2nd image downloads because the link is available
          InetGet($target_source7[2], @ScriptDir & '\Description\Image3.jpg') ; 3rd image downloads because the link is available
          InetGet($target_source7[3], @ScriptDir & '\Description\Image4.jpg') ; 4th image fails because the link is not available, and the script just stops
      EndIf
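      One way to make the posted code skip missing links is to loop over whatever _StringBetween() actually returned instead of hard-coding four indexes, and to take each local name from the last segment of the URL; a sketch, assuming every link ends in a plain file name:

```autoit
#include <String.au3>

Local $txt6 = FileRead(@ScriptDir & "\images_source.txt")
Local $target_source7 = _StringBetween($txt6, 'src="', '"')
If Not @error Then
    For $i = 0 To UBound($target_source7) - 1 ; only the links that actually exist
        ; Keep the original file name: the part of the URL after the last "/".
        Local $aParts = StringSplit($target_source7[$i], "/")
        InetGet($target_source7[$i], @ScriptDir & '\Description\' & $aParts[$aParts[0]])
    Next
EndIf
```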