Reading links from a page into a variable or text file



Hello again... Today has been a really bad day for me... every time I solve a problem, another one appears...

So this is my question:

How do I read links from a web page into a string variable? The links are not static. In fact, there are about 10 static links on the page, but I want to read the non-static ones...

Let me explain a little better.

Imagine you make an AutoIt script to open each new thread in this forum. When you run the script, it opens IE at this address, analyzes all the links, and ignores the ones that were already there.

Omg... I'm confusing myself... :S

OK, simpler than that... a script that you run which writes all the links in this forum to a text file... (I think I'll manage to do the rest.)

Thanks in advance.


"Build a man a fire, and he'll be warm for a day. Set a man on fire, and he'll be warm for the rest of his life." - Terry Pratchett.


Why not store every link, and then every time you scan, check each link against what you've already stored? If a link isn't in the store, it's a new one. This might get slow if there are a lot of strings to compare against. Get what I'm saying?
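A minimal sketch of that check in AutoIt (links.txt is just a placeholder name for the store file):

#include <File.au3>

; Returns True if $sLink is not yet stored in $sFile.
Func _IsNewLink($sLink, $sFile = "links.txt")
    Local $aStored
    If Not _FileReadToArray($sFile, $aStored) Then Return True ; no store yet, so the link is new
    For $i = 1 To $aStored[0]
        If $aStored[$i] = $sLink Then Return False ; already stored
    Next
    Return True
EndFunc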


I didn't get that far... First I only want to know how to get the info out of a web page; I'll think about that later... But I don't think it will be slow, because the idea is to download the page only once when you run it, compare the links to the ones already in a text file, and store the new ones...

But basically what I want to know is how to get the links out... Is there some command I can call with an address that will retrieve the links?



Check the help file for the _IE* functions. You would create an instance of IE with _IECreate and the URL, then either read the body HTML or get a collection of objects from the page. It's best to play around with the demo scripts to see how it works.
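For example, a minimal sketch of that approach (the forum URL is just a placeholder), printing the target of every link on the page:

#include <IE.au3>

Local $oIE = _IECreate("http://www.autoitscript.com/forum/", 0, 0) ; attach off, window hidden
Local $oLinks = _IELinkGetCollection($oIE) ; all anchor objects on the page
For $oLink In $oLinks
    ConsoleWrite($oLink.href & @CRLF) ; each link's target URL
Next
_IEQuit($oIE)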

BTW, welcome to AutoIt!

:whistle:


I think I solved it with _IELinkGetCollection. Thanks for the help.
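For anyone landing here later, a sketch of how the pieces might fit together (URL and file name are placeholders, and the StringInStr check is only a crude duplicate test):

#include <IE.au3>

Local $oIE = _IECreate("http://www.autoitscript.com/forum/", 0, 0)
Local $oLinks = _IELinkGetCollection($oIE)
Local $sStored = FileRead("links.txt") ; previously saved links, empty if the file does not exist yet
Local $hFile = FileOpen("links.txt", 1) ; 1 = append mode
For $oLink In $oLinks
    If Not StringInStr($sStored, $oLink.href) Then FileWriteLine($hFile, $oLink.href) ; store only new links
Next
FileClose($hFile)
_IEQuit($oIE)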

"Build a man a fire, and he'll be warm for a day. Set a man on fire, and he'll be warm for the rest of his life." - Terry Pratchett.

