Herb191

InetGet() gets a corrupt Excel file


I am getting a corrupt Excel file when I use InetGet().

$sFilePath = @ScriptDir & "\List.xls"
$sDownloadURL = "https://alachua.lienexpress.net/certificates/list.xls?q=%7B%7D%0A"

$hDownload = InetGet($sDownloadURL, $sFilePath)
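For reference, a minimal sketch of the same download with basic error checking, so the byte count returned by InetGet can be compared with the size of the file on disk:

Local $sFilePath = @ScriptDir & "\List.xls"
Local $sDownloadURL = "https://alachua.lienexpress.net/certificates/list.xls?q=%7B%7D%0A"

; Synchronous download: InetGet returns the number of bytes written to the target file.
Local $iBytes = InetGet($sDownloadURL, $sFilePath)
If @error Then
    ConsoleWrite("InetGet failed, @error = " & @error & @CRLF)
Else
    ConsoleWrite("Bytes downloaded: " & $iBytes & ", size on disk: " & FileGetSize($sFilePath) & @CRLF)
EndIf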

How do you know it is corrupt?




How do you know it is corrupt?

 

Corrupt in the sense that if I go to the URL manually and download it, the file opens fine in Excel. When I download it with InetGet() and open it in Excel, it says it is a bad file type.


Have you tried flag $INET_FORCERELOAD?
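Something along these lines (just a sketch, assuming the standard InetConstants.au3 include):

#include <InetConstants.au3>

Local $sFilePath = @ScriptDir & "\List.xls"
Local $sDownloadURL = "https://alachua.lienexpress.net/certificates/list.xls?q=%7B%7D%0A"

; $INET_FORCERELOAD bypasses the local cache and re-fetches the file from the server.
Local $hDownload = InetGet($sDownloadURL, $sFilePath, $INET_FORCERELOAD)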




Have you tried flag $INET_FORCERELOAD?

Sorry for the late response, I have been sick the last couple of days.

Yes, I have tried $INET_FORCERELOAD.

The exact message I get from Excel is:

The file format and extension of ‘list.xls’ don’t match. The file could be corrupted or unsafe. Unless you trust its source, don’t open it. Do you want to open it anyway?

When I open the file, it is not readable.

When I download the file manually, the extension is .xls.


When I manually download the file, Firefox thinks it downloaded a 223KB file. However, when you examine the actual file it is 992KB. It looks like the file being downloaded with InetGet is compressed. I was able to open it with 7Zip and extract the same 992KB file from within.
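One way to confirm that without 7-Zip (a sketch; the path is wherever InetGet saved the file) is to look at the first two bytes of the saved file, which are 0x1F 0x8B for gzip data:

#include <FileConstants.au3>

; Read the first two bytes of the downloaded file in binary mode.
Local $hFile = FileOpen(@ScriptDir & "\List.xls", $FO_READ + $FO_BINARY)
Local $dMagic = FileRead($hFile, 2)
FileClose($hFile)

; Gzip streams always start with the signature bytes 0x1F 0x8B.
If String($dMagic) = "0x1F8B" Then
    ConsoleWrite("File is gzip-compressed." & @CRLF)
Else
    ConsoleWrite("File does not look like gzip data." & @CRLF)
EndIf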


When I manually download the file, Firefox thinks it downloaded a 223KB file. However, when you examine the actual file it is 992KB. It looks like the file being downloaded with InetGet is compressed. I was able to open it with 7Zip and extract the same 992KB file from within.

 

Hmmm, very interesting. I was also able to read it once I unzipped it. I guess I will have to use InetGet() to get the file and then have AutoIt unzip it. Thanks Danp2.


So upon further research I found that the issue was that the file was being downloaded in gzip format. I thought I would post my code for anyone who runs into a similar problem. This code uses the ZLIB UDF found here.

#include "ZLIB.au3"

$sFilePath = @ScriptDir & "\list.xls"
$sURL = "https://alachua.lienexpress.net/certificates/list.xls?q=%7B%7D%0A"
InetGet_GZ($sURL, $sFilePath)

Func InetGet_GZ($sDownloadURL, $sFileName)
    $Data = InetRead($sDownloadURL)
    $Data = BinaryToString(_ZLIB_GZUncompress($Data), 1)
    If FileExists($sFileName) Then FileDelete($sFileName)
    FileWrite($sFileName, $Data)
EndFunc   ;==>InetGet_GZ
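In case it helps anyone else, a variant of the same helper (a sketch, using the same ZLIB.au3 UDF) can write the decompressed binary straight to disk, which avoids the BinaryToString round-trip and adds some basic error checking:

#include <FileConstants.au3>
#include "ZLIB.au3"

Func InetGet_GZ_Binary($sDownloadURL, $sFileName)
    ; Read the raw (gzip-compressed) response into memory.
    Local $dData = InetRead($sDownloadURL)
    If @error Then Return SetError(1, 0, 0) ; download failed

    ; Decompress the gzip payload back to the original .xls bytes.
    Local $dXls = _ZLIB_GZUncompress($dData)

    ; Write the binary data directly to the target file.
    Local $hFile = FileOpen($sFileName, $FO_OVERWRITE + $FO_BINARY)
    If $hFile = -1 Then Return SetError(2, 0, 0) ; could not open file for writing
    FileWrite($hFile, $dXls)
    FileClose($hFile)
    Return 1
EndFunc   ;==>InetGet_GZ_Binary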


