I am using this code to download a file, but I don't know the correct way of checking for errors during the download.
#include <InetConstants.au3>

Global $hDownload = InetGet($url, $fileName, Default, $INET_DOWNLOADBACKGROUND)
While Not InetGetInfo($hDownload, $INET_DOWNLOADCOMPLETE)
    Sleep(250)
WEnd
If InetGetInfo($hDownload, $INET_DOWNLOADERROR) Then
    MsgBox(0, "Error", "Error During Downloading")
Else
    MsgBox(0, "", "Completed")
EndIf
InetClose($hDownload)

I am using that code, but I don't know why it reports an error while downloading the file.
Is my way of checking for errors wrong, or is there a network error on my PC?
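For comparison, the polling loop can be avoided entirely with a blocking (synchronous) call, which sets @error itself on failure. A minimal sketch, reusing the same $url and $fileName variables as above:

```autoit
#include <InetConstants.au3>

; Blocking download: InetGet returns the number of bytes written,
; or 0 with @error set if the download failed.
Local $iBytes = InetGet($url, $fileName, $INET_FORCERELOAD)
If @error Then
    MsgBox(0, "Error", "Download failed, @error = " & @error)
Else
    MsgBox(0, "", "Completed, " & $iBytes & " bytes")
EndIf
```

The background mode is only needed when the script must stay responsive while the transfer runs.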
Prior to posting this question, I experimented with the FileExists, InetGetInfo, and InetGet/InetClose functions to test downloading files into my application folder when the file did not already exist.
In my case, I was able to point to a specific web address, e.g. https://dropbox.com, and then check for and download the file.
Now my question is, instead of checking from a hardcoded URL, is it possible to:
1. Do a check against the IP address of a remote computer, e.g. my friend's or colleague's desktop computer
2. After checking that the IP address is valid and can be connected to, access a location on that computer's hard drive, e.g. that machine's C:\specifiedFolder\specifiedFile.txt
3. Download a copy of that file into a local computer
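The three steps above can be sketched with a ping check followed by a UNC copy. This assumes the remote machine exposes the folder as a network share (the share name below is hypothetical) and that your account has access rights to it:

```autoit
Local $sIP     = "192.168.1.50" ; example IP of the remote machine
Local $sRemote = "\\" & $sIP & "\specifiedFolder\specifiedFile.txt" ; hypothetical share path
Local $sLocal  = @ScriptDir & "\specifiedFile.txt"

; Step 1: check that the host is reachable (returns round-trip time in ms, 0 on failure)
If Ping($sIP, 2000) > 0 Then
    ; Steps 2 and 3: access the remote path and copy the file locally (overwrite = 1)
    If FileCopy($sRemote, $sLocal, 1) Then
        MsgBox(0, "", "Copied to " & $sLocal)
    Else
        MsgBox(0, "Error", "Host reachable, but the file copy failed")
    EndIf
Else
    MsgBox(0, "Error", "Host not reachable, @error = " & @error)
EndIf
```

Note that a plain drive path like C:\specifiedFolder is not reachable over the network by itself; the folder (or the administrative C$ share) has to be shared first.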
I did read a related thread, but it covered what I had already accomplished, so I am not sure what changes are necessary if I switch the method of file retrieval.
This is unrelated to the topic I posted earlier today.
I'm using InetGet to retrieve a file online. Sometimes this works; sometimes it doesn't, but InetGetInfo gives a non-zero error number, so at least I have an idea of what went wrong in that case.
However, most confusing of all, I sometimes get no error at all, yet the file does not appear in my target directory.
My code is pasted below:
$i = 99999
$keyword = "amazon"
$hDownload = InetGet("http://www.google.com/trends/fetchComponent?q=" & $keyword & "&geo=US&cid=TIMESERIES_GRAPH_0&export=3&date=today%203-m&cmpt=q", @DesktopDir & "\data_" & $i & ".txt", 0, 1)
$inetgettime = 0
$timedout = 0
Do
    Sleep(250)
    $inetgettime = $inetgettime + 0.25
    If $inetgettime > 10 Then
        $timedout = 1
        InetClose($hDownload)
        ConsoleWrite("timed out waiting for download " & $inetgettime & @CRLF)
        ExitLoop
    EndIf
Until InetGetInfo($hDownload, 2) ; check if the download is complete
$err = InetGetInfo($hDownload, 4) ; download error value
ConsoleWrite("internet error = " & $err & @CRLF)

When the strange behaviour occurs where the file is not saved while $err = 0, it is too fast for my 10-second timeout loop.
The script takes about 0.9 seconds to run when the behaviour I described above is shown.
Does anybody have an idea as to the cause of this?
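One way to catch the silent case is to treat "no error reported" and "file actually arrived" as separate checks after the loop. A sketch reusing $hDownload and $i from above, under the assumption that a missing or zero-byte file should count as a failure:

```autoit
Local $sTarget  = @DesktopDir & "\data_" & $i & ".txt"
Local $bSuccess = InetGetInfo($hDownload, 3) ; True if the download completed successfully
Local $iRead    = InetGetInfo($hDownload, 0) ; bytes received so far
InetClose($hDownload)

; Cross-check the handle's status against the file on disk:
; the "no error but no file" case should fail at least one of these.
If Not $bSuccess Or $iRead = 0 Or Not FileExists($sTarget) Then
    ConsoleWrite("download looked ok (err=0) but verification failed" & @CRLF)
EndIf
```

If the server answers quickly with an error page or an empty body, the transfer can "complete" without error in under a second, which would also explain the 0.9-second runtime you are seeing.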