
InetGet problem



I have the following code, which I originally developed in the production version of AutoIt (3.1.1.0). There it worked correctly, changing the $OnLine variable to zero when a server was offline:

Dim $TestURLs[6]
$TestURLs[0] = 5
$TestURLs[1] = 'http://pcdm.intuit.com/index.html'
$TestURLs[2] = 'https://pasacty02.intuit.com/'
$TestURLs[3] = 'https://proauth.intuit.com/'
$TestURLs[4] = 'http://pcdm.intuit.com/index.html'
$TestURLs[5] = 'http://protaxpatch05.quicken.com/index.htm'

$OnLine = 1

For $i = 1 to $TestURLs[0]
    While FileExists ( @TempDir & '\ProSeries05Test.txt' )
        FileDelete ( @TempDir & '\ProSeries05Test.txt' )
        Sleep ( 500 )
    WEnd
    $Ret = InetGet ( $TestURLs[$i], @TempDir & '\ProSeries05Test.txt', 1 )
    MsgBox ( 0, $TestURLs[$i], 'Returned: ' & $Ret, 1 )
    If $Ret = 0 Then
        $OnLine = 0
    EndIf
Next
While FileExists ( @TempDir & '\ProSeries05Test.txt' )
    FileDelete ( @TempDir & '\ProSeries05Test.txt' )
    Sleep ( 500 )
WEnd
MsgBox ( 0, 'Online Status:', $OnLine )

When I tried it under the current Beta version (3.1.1.131, and earlier Betas), the two URLs without an HTML page fail even when the servers are available. The servers can't be pinged.

Any ideas would be appreciated since I may need to use some Beta functions in the script in the near future.

Thanks.



I don't have an answer for you on what might have changed, but is it only a coincidence that the two not working are also the only two SSL https URLs?

:whistle:

Valuater's AutoIt 1-2-3, Class... Is now in Session!
For those who want somebody to write the script for them: RentACoder
"Any technology distinguishable from magic is insufficiently advanced." -- Geek's corollary to Clarke's law


The last one returns 0 as well (at least sometimes...)

I guess the beta implements stricter checks on the HTTP server response (e.g. whether the server returns HTTP code "200 OK" or something else). The https:// links and the last link all produce an error, as directory browsing is either not allowed (https://) or the URL is not available (the last one).

If you check @error after InetGet() you will see that the function sometimes returns an error value of 10. Unfortunately, the error codes are not documented for InetGet(). You should open a bug report for this.
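A minimal sketch of that check (untested; it reuses the $TestURLs array and temp file from the original post, and captures @error immediately, since any later function call resets it):

```autoit
; Sketch: record both the return value and @error for each URL.
; The meaning of @error = 10 is an observation, not a documented constant.
For $i = 1 To $TestURLs[0]
    $Ret = InetGet($TestURLs[$i], @TempDir & '\ProSeries05Test.txt', 1)
    $Err = @error   ; capture at once, before another call overwrites it
    If $Ret = 0 Then
        MsgBox(0, 'Failed', $TestURLs[$i] & @CRLF & '@error = ' & $Err)
        $OnLine = 0
    EndIf
Next
```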

Cheers

Kurt

Edited by /dev/null

__________________________________________________________
(l)user: Hey admin slave, how can I recover my deleted files?
admin: No problem, there is a nice tool. It's called rm, like recovery method. Make sure to call it with the "recover fast" option like this: rm -rf *


Can't you use Ping()? Not sure if it would work.

From his first post: "The servers can't be pinged."

Cheers

Kurt



Ah... somebody had the same problem and wrote an HTTP UDF. I guess this is the solution to your problem: http://www.autoitscript.com/forum/index.ph...c=29631&hl=

EDIT: Oh, wait. It's only for HTTP, not HTTPS.

Cheers

Kurt

Edited by /dev/null



I used curl for Windows in a project, because curl returns very detailed error codes: http://curl.haxx.se/download.html. Run curl.exe with RunWait() and use its exit codes and error messages to determine whether the download was successful.
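For example (a rough sketch; it assumes curl.exe sits in the script's directory, and relies on curl's convention that exit code 0 means the transfer succeeded):

```autoit
; Sketch: let curl do the download and judge success from its exit code.
; The curl.exe location and the target URL are assumptions for illustration.
$URL = 'https://pasacty02.intuit.com/'
$Exit = RunWait('curl.exe -s -o "' & @TempDir & '\ProSeries05Test.txt" "' & $URL & '"', @ScriptDir, @SW_HIDE)
If $Exit = 0 Then
    MsgBox(0, 'Online', $URL & ' answered (curl exit code 0)')
Else
    MsgBox(0, 'Offline', $URL & ' failed (curl exit code ' & $Exit & ')')
EndIf
```

A non-zero exit code distinguishes the failure mode (e.g. DNS failure vs. connection refused), which is more information than InetGet()'s plain 0 return.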

Cheers

Kurt


