redfive19 Posted July 29, 2005 Hello all, I've been tooling around with InetGet. I was wondering if there's a way to get all files of a certain type from a URL's folder. For instance, I'm trying to auto-update the extensions in my Firefox silent-install folder, so whenever new ones come out I don't have to hunt them down and save them locally. Basically I'm trying to do something like this: InetGet("http://ftp.mozilla.org/pub/mozilla.org/extensions/down_them_all/*.xpi", "C:\downthemall.xpi") or even InetGet("http://ftp.mozilla.org/pub/mozilla.org/extensions/down_them_all/downthemall*.xpi", "C:\downthemall.xpi") The thing that's tripping me up, of course, is that InetGet expects an exact filename. Am I just out of luck? -redfive
scriptkitty Posted July 29, 2005 Servers typically don't even let you view the folder contents, but if they do, download the index page they serve and parse it to download the files. AutoIt3, the MACGYVER Pocket Knife for computers.
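The parse-the-index approach above could look something like this in AutoIt. This is only a sketch under the assumption that the server actually returns an HTML directory listing for that URL; the regex pattern and local paths are illustrative, not tested against the real mozilla.org server:

```autoit
; Assumption: $sBase returns an HTML directory index when requested directly.
$sBase = "http://ftp.mozilla.org/pub/mozilla.org/extensions/down_them_all/"
$sIndex = @TempDir & "\index.html"

; Download the index page itself, then read it in as text
InetGet($sBase, $sIndex, 1)
$sHtml = FileRead($sIndex)

; Pull out every href that ends in .xpi (flag 3 = return array of all matches)
$aFiles = StringRegExp($sHtml, '(?i)href="([^"]*\.xpi)"', 3)
If IsArray($aFiles) Then
    For $sFile In $aFiles
        ; Fetch each matching package next to C:\ (adjust the target as needed)
        InetGet($sBase & $sFile, "C:\" & $sFile, 1)
    Next
EndIf
```

If the listing is plain FTP-style text rather than HTML, the StringRegExp pattern would need to match bare filenames instead of href attributes.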
seandisanti Posted July 29, 2005 scriptkitty said: Servers typically don't even let you view the folder contents, but if they do, download the index they give, and parse it to download them all. Or get Anawave WebSnake. I try not to suggest non-AutoIt stuff here, but I think it may fit your needs: you can have it download a mirror of the site (including directory structure), then use the normal FileFindFirstFile() and FileFindNextFile() functions to delete everything except the files that match what you need.
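The mirror-then-prune step from the post above could be sketched like this. It assumes the site has already been mirrored into a local folder ($sDir is a made-up path for illustration) and simply deletes every file that isn't a .xpi:

```autoit
; Assumption: WebSnake (or similar) has already mirrored the site to $sDir.
$sDir = "C:\mirror\down_them_all\"

$hSearch = FileFindFirstFile($sDir & "*.*")
If $hSearch <> -1 Then
    While 1
        $sFile = FileFindNextFile($hSearch)
        If @error Then ExitLoop ; @error is set when no more files remain
        ; Keep only the .xpi packages, delete everything else
        If Not StringRegExp($sFile, '(?i)\.xpi$') Then FileDelete($sDir & $sFile)
    WEnd
    FileClose($hSearch)
EndIf
```

This leaves just the extension packages behind, ready to copy into the Firefox silent-install folder.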