JerryD

InetGet problem

9 posts in this topic

I have the following code, which I originally developed under the production version of AutoIt (3.1.1.0). It worked correctly, setting the $OnLine variable to zero when a server was offline:

Dim $TestURLs[6]
$TestURLs[0] = 5 ; number of URLs to test
$TestURLs[1] = 'http://pcdm.intuit.com/index.html'
$TestURLs[2] = 'https://pasacty02.intuit.com/'
$TestURLs[3] = 'https://proauth.intuit.com/'
$TestURLs[4] = 'http://pcdm.intuit.com/index.html'
$TestURLs[5] = 'http://protaxpatch05.quicken.com/index.htm'

$OnLine = 1

For $i = 1 To $TestURLs[0]
    ; Make sure the previous test file is gone before downloading
    While FileExists(@TempDir & '\ProSeries05Test.txt')
        FileDelete(@TempDir & '\ProSeries05Test.txt')
        Sleep(500)
    WEnd
    $Ret = InetGet($TestURLs[$i], @TempDir & '\ProSeries05Test.txt', 1)
    MsgBox(0, $TestURLs[$i], 'Returned: ' & $Ret, 1)
    If $Ret = 0 Then
        $OnLine = 0
    EndIf
Next

; Clean up the last downloaded file
While FileExists(@TempDir & '\ProSeries05Test.txt')
    FileDelete(@TempDir & '\ProSeries05Test.txt')
    Sleep(500)
WEnd
MsgBox(0, 'Online Status:', $OnLine)

When I tried it under the current beta version (3.1.1.131, and earlier betas), the two URLs that don't point to an HTML page fail even when the servers are available. The servers can't be pinged.

Any ideas would be appreciated since I may need to use some Beta functions in the script in the near future.

Thanks.


I don't have an answer for you on what might have changed, but is it only a coincidence that the two that aren't working are also the only two SSL (https) URLs?

:whistle:




The last one returns 0 as well (at least sometimes...)

I guess the beta implements better checks on the HTTP server's response (i.e. whether the server returns HTTP code "200 OK" or something else). The https:// links and the last link all produce an error, since directory browsing is either not allowed (the https:// ones) or the URL is not available (the last one).

If you check @error after InetGet(), you will see that the function sometimes returns an error value of 10. Unfortunately, the error codes are not documented for InetGet(). You should open a bug report for this.
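The @error check can be sketched like this (a minimal example for one URL; since the error values are undocumented in this beta, the script only reports them):

```
; Sketch: report InetGet()'s return value and @error for one URL.
$URL = 'https://proauth.intuit.com/'
$Ret = InetGet($URL, @TempDir & '\ProSeries05Test.txt', 1)
$Err = @error ; save @error immediately, before another function call resets it
If $Ret = 0 Then
    MsgBox(0, 'Download failed', $URL & @CRLF & '@error = ' & $Err)
EndIf
```

Saving @error into a variable right after the call matters, because almost any subsequent function (including MsgBox) resets it.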

Cheers

Kurt

Can't you use Ping()? Not sure if it would work.


Can't you use Ping()? Not sure if it would work.

From his first post: "The servers can't be pinged."

Cheers

Kurt



Ah... somebody had the same problem and wrote an HTTP UDF. I guess this is the solution to your problem: http://www.autoitscript.com/forum/index.ph...c=29631&hl=

EDIT: Oh, wait. It's only for HTTP, not for HTTPS.

Cheers

Kurt


Ah, I'm sorry, I didn't read that. Sorry again. :whistle:



I've re-posted this to the v3 Bug Reports forum:

http://www.autoitscript.com/forum/index.php?showtopic=29708

Please use the link above to reply.

And no, the servers are not PINGable.


I used curl for Windows in a project, because curl returns very detailed error codes: http://curl.haxx.se/download.html. Run curl.exe with RunWait() and use its exit codes and error messages to determine whether the download was successful.
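A rough sketch of that approach. It assumes curl.exe sits in the script's directory, and the URL is just an example; adjust both to taste:

```
; Sketch: download with curl.exe and test its exit code (0 = success).
$URL = 'https://proauth.intuit.com/'
$Dest = @TempDir & '\ProSeries05Test.txt'
; -s = silent, -k = skip certificate checks, -o = output file
$ExitCode = RunWait('"' & @ScriptDir & '\curl.exe" -s -k -o "' & $Dest & '" ' & $URL, @ScriptDir, @SW_HIDE)
If $ExitCode = 0 Then
    MsgBox(0, 'curl', $URL & ' is online')
Else
    ; Nonzero exit codes are documented by curl, e.g. 6 = could not
    ; resolve host, 7 = failed to connect, 28 = operation timed out.
    MsgBox(0, 'curl', $URL & ' failed, exit code ' & $ExitCode)
EndIf
```

Since curl speaks HTTPS itself, this also sidesteps whatever the beta's InetGet() is doing differently with the SSL URLs.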

Cheers

Kurt


