stolie Posted September 20, 2004 At the moment I manually click and download a number of links on a page. The number, URL name and location (on the page) of these links change on a daily basis. Is there a function within AutoIt that automatically recognises that a hyperlink exists, and then clicks that link and downloads the information on that page? I have used AutoIt for two very small programs already and am very much a noob to programming. Thanks in advance for any help, cheers
CyberSlug Posted September 21, 2004 (edited)
URLDownloadToFile("http://www.pageWithLinksToDownload.com", "C:\page.html")
$source = FileRead("C:\page.html", FileGetSize("C:\page.html"))
; Scan the source for <a href=" ... ">blahblahblah</a>
For standard HTML links, you want what's between the quotes... then URLDownloadToFile each link you find... Too slow by 1 minute Edited September 21, 2004 by CyberSlug
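The approach CyberSlug describes — fetch the page source, scan it for `<a href="...">` tags, then download each target — can be sketched outside AutoIt as well. Below is a minimal Python illustration of the link-scanning step; the sample HTML, the `extract_links` helper name and the regular expression are my own assumptions for demonstration, not part of the original post, and a regex only catches simple double-quoted `href` attributes.

```python
import re
from urllib.request import urlretrieve  # would perform the per-link downloads

def extract_links(html):
    """Return the href targets of simple double-quoted <a href="..."> links.

    Hypothetical helper mirroring CyberSlug's 'scan for <a href=...>' step;
    a real HTML parser would be more robust than this regex.
    """
    return re.findall(r'<a\s+href="([^"]+)"', html, re.IGNORECASE)

# Assumed sample page source, standing in for the downloaded C:\page.html
sample = ('<p><a href="http://example.com/a.zip">file A</a> '
          '<a href="http://example.com/b.zip">file B</a></p>')

links = extract_links(sample)
print(links)
# Each found link could then be fetched, e.g.:
# for url in links:
#     urlretrieve(url, url.rsplit("/", 1)[-1])
```

The per-link download loop is left commented out since the URLs here are placeholders; in AutoIt the equivalent step would be a second URLDownloadToFile call per extracted link.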
stolie Posted September 21, 2004 Author Thanks guys, I will get into the help files and give it a go. Cheers