IE.au3 Download pages from file


InetGet or _INetGetSource might be faster for you in this case - unless you need to manipulate the pages in any way before you download them, or you want the page contents after some sort of client-side processing (typically JavaScript) has run in the pages.
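To illustrate the difference, here is a minimal sketch (untested; the URL is just a placeholder): InetGet saves the raw page straight to a file on disk, while _INetGetSource returns the page source as a string you can process in your script.

```autoit
#include <Inet.au3>

; Download a page directly to a file - no browser involved:
InetGet("http://www.example.com/", "page.html")

; Or fetch the source into a variable for further processing:
Local $sSource = _INetGetSource("http://www.example.com/")
```

Neither approach runs the page's JavaScript; if you need the rendered result, that's where IE.au3 comes in.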

Dale

Free Internet Tools: DebugBar, AutoIt IE Builder, HTTP UDF, MODIV2, IE Developer Toolbar, IEDocMon, Fiddler, HTML Validator, WGet, curl

MSDN docs: InternetExplorer Object, Document Object, Overviews and Tutorials, DHTML Objects, DHTML Events, WinHttpRequest, XmlHttpRequest, Cross-Frame Scripting, Office object model

Automate input type=file (Related)

Alternative to _IECreateEmbedded? better: _IECreatePseudoEmbedded  Better Better?

IE.au3 issues with Vista - Workarounds

SciTe Debug mode - it's magic: #AutoIt3Wrapper_run_debug_mode=Y

"Doesn't work" needs to be ripped out of the troubleshooting lexicon. It means that what you tried did not produce the results you expected. It raises the questions: 1) what did you try? 2) what did you expect? and 3) what happened instead?

Reproducer: a small (the smallest?) piece of stand-alone code that demonstrates your trouble



Check out the _FileReadToArray() function and loop through each element in the array (each line of your text file).

You should be able to write it in just a few lines of code.


I haven't tested this but you can give it a shot and it should get you started:

#include <File.au3>
#include <Inet.au3>

Local $theArray
If Not _FileReadToArray("URLs.txt", $theArray) Then Exit

; $theArray[0] holds the number of lines read, so start at 1
For $i = 1 To $theArray[0]
   $somepage = _INetGetSource($theArray[$i])
   ; Do stuff with the contents of the page here, like write it to another file.
Next

I have a text file with 8000 URLs separated by new lines. I want to write a script to download the HTML documents, but I am not sure how. I know I would have to use IE.au3, but I'm a bit lost. Any help would be appreciated :)

You can probably do this faster in iMacros for Firefox, which can read the list, open the websites and save the pages including all images:

https://addons.mozilla.org/en-US/firefox/addon/3863

http://wiki.imacros.net/Demo-Loop-Csv-2-Web

http://wiki.imacros.net/Browser_Automation#Saving_Web_Sites

