Extracting hyperlinks from a website


I was wondering, is there a stable way to extract the hyperlinks from a website?

I'm using Firefox, and the way I do it right now is this:

-Right-click a certain location on the page

-Press ctrl + b (to show source)

-Press ctrl + a, ctrl + c

-Do some hyperlink searching voodoo

-Press ctrl + F4

But if there happens to be an image at that exact location, viewing the page's source that way won't work, and obviously it won't work in IE either.
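In AutoIt terms, what I'm doing is roughly this (only a sketch; the window class, click coordinates, and delay are assumptions and would need adjusting):

; Rough automation of the manual steps above (fragile by design)
WinActivate("[CLASS:MozillaWindowClass]") ; assumed Firefox window class
MouseClick("right", 400, 300)             ; right-click a certain location (placeholder coordinates)
Send("^b")                                ; show source
Sleep(500)                                ; give the source view time to open
Send("^a^c")                              ; select all, copy
$sSource = ClipGet()                      ; the hyperlink-searching voodoo then runs on this text
Send("^{F4}")                             ; close the source view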

Please help.

Isn't this easier (in FF):

-Press ctrl+U (show page source)

-Do some hyperlink searching voodoo

-Press ctrl+F4

Guess that only works when the page has no frames, though...

--

Clouds®

Hey, thanks, that did the trick. >:)<

I looked in Firefox's manual, but I didn't see that one listed.

You might also try...

#include <IE.au3>

$sURL = "http://www.autoitscript.com/autoit3/"
$sSearch = "SciTE"

; Open the page in a new IE instance
$oIE = _IECreate($sURL)

; Get the collection of all links on the page; @extended holds the link count
$oLinks = _IELinkGetCollection($oIE)
$iNumLinks = @extended
ConsoleWrite("Link Count: " & $iNumLinks & @CRLF)

; Write out every href that contains the search string
For $oLink In $oLinks
    If StringInStr($oLink.href, $sSearch) Then
        ConsoleWrite($oLink.href & @CRLF)
    EndIf
Next
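If you don't want to drive a browser at all, here's a minimal sketch of another route: download the raw HTML and pull the href values out with a regular expression. (Assumptions: the page is reachable over plain HTTP and a simple regex is good enough for its markup.)

; Fetch the raw HTML without any browser involved
$sHTML = BinaryToString(InetRead("http://www.autoitscript.com/autoit3/"))

; Capture everything between the quotes of each href="..." attribute
$aLinks = StringRegExp($sHTML, '(?i)href\s*=\s*["'']([^"'']+)', 3)
If Not @error Then
    For $i = 0 To UBound($aLinks) - 1
        ConsoleWrite($aLinks[$i] & @CRLF)
    Next
EndIf

Note the trade-off: the IE.au3 version reads the live DOM, so $oLink.href comes back as a resolved absolute URL, while the regex route hands you whatever the markup contains, relative paths included.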