Getting Text Off A Website

7 posts in this topic

#1 ·  Posted by Guest Lor20

Hi there,

I have a problem. I am writing a script that goes through a bunch of sites, and I would really like to have a verified report that the actions worked.

However, I can't use URLDownloadToFile, because that would only get the pages from before the actual user intervention took place, not the ones after it (after submitting a form on a PHP site). I tried it by using the clipboard:

WinActivate("handle=" & $HANDLE[$I], "")
      sleep(10)
      send("^a")
      send("^c")
      sleep(10)
      $result=StringSplit ( clipget(),@LF )
      $resultlines = UBound($result)
      if $resultlines>45 Then
         if $result[42]="Error" then 
            FileWriteLine($report,"Invalid Entry")
     ......

But that is both unreliable and CPU hungry (and the unreliability is a lot worse than the CPU neediness), so I am wondering if anyone has a better idea. Since URLDownloadToFile basically has browser control, I was wondering if there is a way to get the innerHTML from the page so I can search it for the relevant text.
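Something along these lines is what I have in mind, assuming AutoIt's COM support can reach IE's DOM; a rough, untested sketch (the URL is a placeholder, and $report is the report file from my snippet above):

; Untested sketch: drive IE via COM and read the page's innerHTML.
; Requires an AutoIt build with ObjCreate()/COM support.
$oIE = ObjCreate("InternetExplorer.Application")
$oIE.Visible = 1
$oIE.Navigate("http://www.example.com/result.php") ; placeholder URL
While $oIE.Busy                                    ; wait for the page to finish loading
   Sleep(100)
WEnd
$html = $oIE.document.body.innerHTML               ; the rendered page's HTML
If StringInStr($html, "Error") Then
   FileWriteLine($report, "Invalid Entry")
EndIf
$oIE.Quit()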

Lor20


#2 ·  Posted (edited)

I use Firefox as my browser:

http://www.mozilla.org/products/firefox/

So after the page is loaded, a more reliable way would be to ControlSend ^u to the browser, which pops up a window whose title starts with "Source of:".

I can then do:

ControlSend("Source of:", "", "MozillaWindowClass4", "^a^c")

to copy the source into the clipboard and parse it with AutoIt.

I also use the default Find command in web pages a lot:

Send("^f")                        ; open the browser's Find dialog
WinWait("Find")
Send("find this{enter}")          ; search for the text
Send("{esc 2}")                   ; close the dialog
Send("^c")                        ; copy the highlighted match
If ClipGet() = "find this" Then DoSomething()

That is just an example; I kind of wrote it from memory, so I might be off a bit, and it could be browser dependent.

Anyway, maybe it gives you more ideas to try.

If you are the one who made the website, I prefer to have custom exportable XML pages for output; it makes life far easier.
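For example, if the site exposed a simple machine-readable status page (the URL and the <result> tag here are made up), verification becomes a plain text check:

; Hypothetical status page; adjust the URL and the tag to your site.
; $report is the OP's open report file.
If URLDownloadToFile("http://www.example.com/status.xml", @TempDir & "\status.xml") Then
   $xml = FileRead(@TempDir & "\status.xml", FileGetSize(@TempDir & "\status.xml"))
   If StringInStr($xml, "<result>ok</result>") Then
      FileWriteLine($report, "Entry verified")
   Else
      FileWriteLine($report, "Invalid Entry")
   EndIf
EndIf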

Edited by scriptkitty

AutoIt3, the MACGYVER Pocket Knife for computers.


#3 ·  Posted

Hi, I have an idea but no implementation yet.

In IE there is a Temporary Internet Files directory, and that is where the information I need actually lives.

The idea is to use this directory and get the information from the temporary files.

But, interestingly, the normal Windows Find locates files in this directory while an AutoIt script does not.

Any suggestions?

Alex


#4 ·  Posted (edited)

Try looking in the following location:

C:\Documents and Settings\<username>\Local Settings\Temporary Internet Files\Content.IE5
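The subfolders in there are hidden and randomly named, which is probably why a plain script search misses them. Here is a sketch that lists their contents explicitly (the path assumes the usual profile layout; adjust it for your system):

$cache = "C:\Documents and Settings\" & @UserName & _
      "\Local Settings\Temporary Internet Files\Content.IE5"
$search = FileFindFirstFile($cache & "\*.*")
If $search <> -1 Then
   While 1
      $name = FileFindNextFile($search)   ; returns each entry in turn
      If @error Then ExitLoop             ; @error is set when nothing is left
      FileWriteLine("cachelist.txt", $cache & "\" & $name)
   WEnd
   FileClose($search)
EndIf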

Edited by this-is-me

Who else would I be?


#5 ·  Posted

Try looking in the following location:

C:\Documents and Settings\<username>\Local Settings\Temporary Internet Files\Content.IE5

Thanks for the answer; unfortunately it doesn't work.

You can check it yourself; it is not difficult.


#6 ·  Posted

You could always go File > Save As. That, or ^a then ^c, and then copy the text from the clipboard.
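For instance, scripting the Save As route in IE might look like this (the dialog title is from memory and may vary by browser and version):

Opt("WinTitleMatchMode", 2)           ; match window titles by substring
WinActivate("Microsoft Internet Explorer")
Send("^s")                            ; File > Save As
WinWait("Save Web Page")              ; IE's save dialog (title may vary)
Send(@TempDir & "\page.htm{enter}")   ; save to a temp file for parsing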


 


Madness is the first step to understanding...


#7 ·  Posted

You could always go File > Save As. That, or ^a then ^c, and then copy the text from the clipboard.

OK, this is a good idea!

But still, the best case is when we use something we already have.

It seems that this temporary directory is a little special, or something.

Thanks anyway,

Alex
