
Getting Text Off A Website



Guest Lor20

Hi there,

I have a problem: I am writing a script that goes through a bunch of sites, and I would really like to have a verified report that the actions worked.

I can't use URLDownloadToFile, however, because it would only get the pages from before the actual user intervention took place, not the ones after (after submitting a form on a PHP site). I tried it by using the clipboard:

WinActivate("handle=" & $HANDLE[$I], "")
      Sleep(10)
      Send("^a")   ; select the whole page
      Send("^c")   ; copy it to the clipboard
      Sleep(10)
      $result = StringSplit(ClipGet(), @LF)
      $resultlines = UBound($result)
      If $resultlines > 45 Then
         If $result[42] = "Error" Then
            FileWriteLine($report, "Invalid Entry")
         EndIf
      EndIf
      ; ......

But that is both unreliable and CPU hungry (the unreliability is a lot worse than the CPU neediness), so I am wondering if anyone has a better idea. Since URLDownloadToFile basically has browser control, I was wondering if there is a way to get the innerHTML from the page to search for the stuff.
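Something along these lines is what I am imagining — an untested sketch that only works with Internet Explorer and needs an AutoIt build with COM support; the "Error" string and $report are just carried over from my snippet above:

```autoit
; Walk the open IE windows via the Shell.Application COM object and
; read each page's HTML as rendered *after* the form submission.
$oShell = ObjCreate("Shell.Application")
For $oWin In $oShell.Windows   ; every open IE/Explorer window
    If StringInStr($oWin.LocationURL, "http") Then
        $sHTML = $oWin.document.body.innerHTML
        If StringInStr($sHTML, "Error") Then
            FileWriteLine($report, "Invalid Entry for " & $oWin.LocationURL)
        EndIf
    EndIf
Next
```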

Lor20


I use Firefox as my browser:

http://www.mozilla.org/products/firefox/

So after the page is loaded, a more reliable way would be to ControlSend ^u to the browser, which pops up a window whose title starts with "Source of:".

I can then:

ControlSend("Source of:", "", "MozillaWindowClass4", "^a^c") to copy it into the clipboard and parse it with AutoIt.
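Putting those steps together, roughly (untested; the window titles and the control classname may vary between Firefox versions, and "Error" / $report are just placeholders matching the earlier snippet):

```autoit
WinActivate("Mozilla Firefox")
Send("^u")                      ; open the page-source window
WinWait("Source of:", "", 10)   ; wait up to 10 seconds for it
ControlSend("Source of:", "", "MozillaWindowClass4", "^a^c")
$sSource = ClipGet()            ; the raw HTML of the loaded page
If StringInStr($sSource, "Error") Then
    FileWriteLine($report, "Invalid Entry")
EndIf
WinClose("Source of:")          ; tidy up the source window
```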

I also use the browser's default Find command in web pages a lot:

Send("^f")                 ; open the Find dialog
WinWait("Find")
Send("find this{enter}")   ; search for the text
Send("{esc 2}")            ; close the Find dialog
Send("^c")                 ; copy the highlighted match
If ClipGet() = "find this" Then DoSomething()

Just an example; I kinda wrote it from memory, so I might be off a bit, and it could be browser dependent.

Anyway, maybe this gives you more ideas to try.

If you are the one who made the website, I prefer to have custom exportable XML pages for output; it makes life far easier.

Edited by scriptkitty

AutoIt3, the MACGYVER Pocket Knife for computers.


  • 6 months later...

Hi, I have an idea but no realisation.

In IE there is a Temporary Internet Files directory, which actually holds the information I need.

The idea is to use this directory and get the information from the temporary files.

But it is interesting that the normal Windows Find locates files in this directory, while my AutoIt script does not.

Any suggestions?

Alex


Try looking in the following location:

C:\Documents and Settings\<username>\Local Settings\Temporary Internet Files\Content.IE5
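Note that the cache actually lives in hidden, randomly named subfolders under Content.IE5, which may be why a flat file search misses it. An untested sketch that lists those subfolders ($log is just a hypothetical log-file handle):

```autoit
$sBase = @UserProfileDir & "\Local Settings\Temporary Internet Files\Content.IE5"
$hSearch = FileFindFirstFile($sBase & "\*.*")
If $hSearch <> -1 Then
    While 1
        $sName = FileFindNextFile($hSearch)
        If @error Then ExitLoop
        ; each randomly named subfolder holds the cached pages
        If StringInStr(FileGetAttrib($sBase & "\" & $sName), "D") Then
            FileWriteLine($log, $sBase & "\" & $sName)
        EndIf
    WEnd
    FileClose($hSearch)
EndIf
```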


Thanks for the answer; unfortunately it doesn't work.

You can check it; it is not difficult.


You could always go File, Save As. Or ^a then ^c, and copy the text from the clipboard.


Ok, this is a good idea!

But still, the best case is when we use something we already have.

It seems that this temporary directory is a little special, or something.

Thx anyway,

Alex

