Thatusernameisalreadytaken Posted December 13, 2012 (edited)

Update 2: the final thing to blame was the antivirus (NOD32), which was scanning the file for too long before letting the download complete.

Update: the problem was fixed by downloading the file via the WinHTTP.au3 functions:

Func _WinHttpReadData($hRequest, $iMode = Default, $iNumberOfBytesToRead = Default, $pBuffer = Default)

With the default third parameter (= 8192) it fails to write the last chunk of the file almost every time (it succeeded only a couple of times); setting it to 64000 made the problem go away completely.

----

Hi all!

I've got a script that should download a .zip file of roughly 150 MB from a URL. The URL itself redirects to the file in a Maven repository on the local network. Downloaded with IE or FF, it works fine, but I'm trying to make the script independent of browsers and get rid of the save dialogs too.

If I use InetGet():

$inetinfo = InetGet($path_to_our_file, $path_to_save_file, 1, 0)

the file seems to download normally, but:

1) InetGetInfo() returns:
Bytes read: 162843648 (e.g.)
Size: 162843736 (e.g.)
Complete: yes
Successful: no
@error: 32
@extended: 12002

The last one is Windows's ERROR_WINHTTP_TIMEOUT.

2) Comparing the file fetched through a browser with the one from InetGet shows that the latter lacks around 100 bytes at the end of the file. InetRead() fails in exactly the same way.

There are other .zip files, both larger and smaller than this one: some always download correctly both via browsers and InetGet(), some never do. It appears that in this case the size really doesn't matter.

I googled hard but was unable to find the answer. Your help (or a workaround that doesn't involve browsers) would be appreciated.

Edited December 15, 2012 by Thatusernameisalreadytaken
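For reference, the working approach described in the update can be sketched roughly as below, assuming trancexx's WinHTTP UDF (WinHttp.au3); the host and path are placeholders, error handling is minimal, and the key point is the 64000-byte read size passed to _WinHttpReadData:

```autoit
#include "WinHttp.au3"

Local $hOpen = _WinHttpOpen()
Local $hConnect = _WinHttpConnect($hOpen, "repo.example.local") ; hypothetical host
Local $hRequest = _WinHttpOpenRequest($hConnect, "GET", "/path/to/file.zip") ; hypothetical path
_WinHttpSendRequest($hRequest)
_WinHttpReceiveResponse($hRequest)

Local $hFile = FileOpen("file.zip", 26) ; binary write, overwrite existing
Do
    ; read in 64000-byte chunks (mode 2 = binary) instead of the default 8192,
    ; which is what worked around the truncated last chunk
    Local $vData = _WinHttpReadData($hRequest, 2, 64000)
    If @error Then ExitLoop ; no more data
    FileWrite($hFile, $vData)
Until 0
FileClose($hFile)

_WinHttpCloseHandle($hRequest)
_WinHttpCloseHandle($hConnect)
_WinHttpCloseHandle($hOpen)
```

This is only a sketch under those assumptions, not the poster's exact script.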
FireFox Posted December 13, 2012

I think the problem comes from the server sending the file. Can you compare the last bytes of the original file and the downloaded one? AFAIK, InetGet handles big files well.

Br, FireFox.
Thatusernameisalreadytaken Posted December 13, 2012 (Author)

"I think the problem comes from the server sending the file."

But it works correctly with browsers; to me it looks like InetGet() somehow ignores the last packet received and stops writing the file one step earlier than it should.

"Can you compare the last bytes of the original file and the downloaded one?"

Right now 88 bytes were cut off, and they are exactly the last bytes of the correctly downloaded file. Murphy's law says these would be the most precious, and they are: they contain the ZIP's end-of-central-directory record.
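A truncation like the one described can be detected without a byte-by-byte comparison: a valid ZIP ends with an end-of-central-directory record, which starts with the signature bytes PK 0x05 0x06 and must sit within the last 65557 bytes of the file (22 bytes minimum record plus the maximum comment length). A minimal sketch of such a check (hypothetical helper, not from the thread):

```autoit
; Returns True if the file's tail contains the ZIP end-of-central-directory
; signature (PK\x05\x06), i.e. the download was probably not truncated.
Func _ZipLooksComplete($sPath)
    Local $hFile = FileOpen($sPath, 16) ; binary read
    If $hFile = -1 Then Return SetError(1, 0, False)
    Local $iSize = FileGetSize($sPath)
    Local $iTail = ($iSize < 65557) ? $iSize : 65557
    FileSetPos($hFile, $iSize - $iTail, 0) ; seek from beginning of file
    Local $bTail = FileRead($hFile, $iTail)
    FileClose($hFile)
    ; String(binary) gives "0x...."; search the hex dump for the signature
    Return StringInStr(StringTrimLeft(String($bTail), 2), "504B0506") > 0
EndFunc
```

In the poster's case the missing 88 bytes include exactly this record, so the check would flag the InetGet copy as incomplete.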
FireFox Posted December 13, 2012 (edited)

"But it works correctly with browsers; to me it looks like InetGet() somehow ignores the last packet received and stops writing the file one step earlier than it should."

Probably because the browsers' download functions are better engineered.

"Right now 88 bytes were cut off, and they are exactly the last bytes of the correctly downloaded file. Murphy's law says these would be the most precious, and they are: they contain the ZIP's end-of-central-directory record."

You should submit your problem to the bug tracker with the link causing the problem (a public one, of course). The only other option you have is to write your own download function using TCP.

Br, FireFox.

Edited December 13, 2012 by FireFox
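The "own download function using TCP" suggestion could look roughly like the sketch below, using AutoIt's built-in TCP functions. The host and path are placeholders, and this deliberately ignores redirects, chunked transfer encoding and HTTPS; note also that the saved data still begins with the HTTP response headers, which would have to be stripped before the payload is a valid ZIP:

```autoit
; Rough sketch of a browser-free HTTP/1.0 download over raw TCP.
TCPStartup()
Local $sHost = "repo.example.local" ; hypothetical host
Local $iSocket = TCPConnect(TCPNameToIP($sHost), 80)
TCPSend($iSocket, "GET /path/to/file.zip HTTP/1.0" & @CRLF & _
        "Host: " & $sHost & @CRLF & @CRLF)

Local $hFile = FileOpen("file.zip.raw", 26) ; binary write, overwrite
Do
    Local $bChunk = TCPRecv($iSocket, 65536, 1) ; flag 1 = binary mode
    If @error Then ExitLoop ; server closed the connection (end of HTTP/1.0 response)
    If BinaryLen($bChunk) Then FileWrite($hFile, $bChunk)
Until 0
FileClose($hFile)
TCPCloseSocket($iSocket)
TCPShutdown()
; file.zip.raw = headers + body; cut everything up to the first CRLFCRLF.
```

Given how much HTTP machinery this skips, the WinHTTP.au3 route the poster eventually used is likely the more practical workaround.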