Justmobile Posted June 25, 2004 Hi all, a question: I want to get a web page into a text file. One possible option is to open the browser with a given URL and save it as .txt. Below is code I found in another topic:

$website = "http://www.autoitscript.com"
Run(@ProgramFilesDir & "\Internet Explorer\IEXPLORE.EXE" & " " & $website)

This will open the given URL, but now I'm stuck; I don't know the further steps. Does anyone have a smart solution? Thanks a lot. Justmobile
Justmobile (Author) Posted June 25, 2004 Yep, thanks, I did, but when I save the file directly to .txt I get the HTML source of the page, not just the text of the page itself. Justmobile
sugi Posted June 25, 2004 Then you'd have to create or find an HTML-to-text converter. Try http://www.w3.org/Tools/html2things.html or http://www.fdisk.com/doslynx/lynxport.htm
emmanuel Posted June 25, 2004 You could open the page in IE and copy-paste it into Notepad; that'd give you the displayed text of it. Or just copy it and then use ClipGet() to get the copied text into a variable and work with that. "I'm not even supposed to be here today!" -Dante (Hicks)
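For what it's worth, here is a rough AutoIt sketch of that clipboard idea. The window title match, the sleep timings, and the output filename are all my own assumptions; you'd adjust them for the page you're actually loading:

```autoit
; Sketch of the copy-to-clipboard approach (window title and timings are guesses).
$website = "http://www.autoitscript.com"
Run(@ProgramFilesDir & "\Internet Explorer\IEXPLORE.EXE " & $website)
WinWaitActive("AutoIt", "", 15)   ; wait for a window whose title contains "AutoIt" (adjust!)
Sleep(2000)                        ; give the page some time to render
Send("^a")                         ; select all visible text
Send("^c")                         ; copy the selection to the clipboard
$text = ClipGet()                  ; read the clipboard into a variable
$file = FileOpen("page.txt", 2)    ; 2 = open for writing, erase previous contents
FileWrite($file, $text)
FileClose($file)
```

Note this only captures what IE displays, so you get the rendered text rather than the HTML source, which sounds like what you're after.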
tutor2000 Posted June 25, 2004 This ditty I wrote can be amended into a function that cleans HTML out of a web page: http://www.autoitscript.com/forum/index.ph...c=3159&hl=bored Lar. You could just save it from the clipboard too. I wrote a little proggy that does that. Rick
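In the same spirit as Rick's cleaner (this is my own minimal loop, not his script), a bare-bones tag stripper can be done with plain string functions: walk the HTML character by character and keep only what falls outside < and > pairs:

```autoit
; Minimal HTML tag stripper (illustrative only; won't handle comments, scripts or entities).
$html = '<html><body><p>Hello <b>world</b></p></body></html>'
$text = ""
$inTag = 0
For $i = 1 To StringLen($html)
    $c = StringMid($html, $i, 1)   ; current character
    If $c = "<" Then
        $inTag = 1                 ; entering a tag, stop copying
    ElseIf $c = ">" Then
        $inTag = 0                 ; tag closed, resume copying
    ElseIf $inTag = 0 Then
        $text = $text & $c         ; keep text that is outside tags
    EndIf
Next
MsgBox(0, "Result", $text)         ; shows "Hello world"
```

A real page would also need HTML entities like &amp; decoded and script/style blocks dropped, which is why a proper converter like the ones sugi linked is the sturdier route.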
Justmobile (Author) Posted June 25, 2004 Thanks, guys, I will try it soon. Justmobile