anystupidassname Posted August 10, 2005 (edited August 15, 2005 by anystupidassname)
layer Posted August 10, 2005 (edited)

Not sure if this will help for the second question, but here is how I find links in a webpage. It may not be the shortest or best possible way, but it works for me. (NOTE: the URL has to include the protocol, e.g. http://)

Func _INetCrawl($sURL)
    Local $line, $aSplit, $aURL, $hRead, $hList
    ; Download the page to a temp file, then scan it line by line
    InetGet($sURL, @TempDir & "\URLs.html")
    $hRead = FileOpen(@TempDir & "\URLs.html", 0) ; 0 = read mode
    $hList = FileOpen("URLs.txt", 1)              ; 1 = append mode; open once, not on every loop pass
    While 1
        $line = FileReadLine($hRead)
        If @error = -1 Then ExitLoop ; end of file
        $aSplit = StringSplit($line, '<a href="', 1)
        If $aSplit[0] = 1 Then ContinueLoop ; no link on this line
        ; Everything after <a href=" up to the closing quote is the URL
        $aURL = StringSplit($aSplit[2], '"', 1)
        FileWriteLine($hList, $aURL[1])
    WEnd
    FileClose($hRead)
    FileClose($hList)
EndFunc ;==>_INetCrawl

I tried making a web crawler, but whatever... Note this only picks up the first link on each line.

Edited August 10, 2005 by layer
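For comparison, here is a rough Python sketch of the same link-grabbing idea (not from the original post): it uses the standard library's html.parser instead of line-by-line string splitting, so it also catches several links on one line, which the split approach above misses.

```python
# Hypothetical Python equivalent of the AutoIt link extractor above.
# Uses only the standard library's html.parser.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href attribute of every <a> tag seen while parsing."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

collector = LinkCollector()
# Two links on a single line -- both are found
collector.feed('<p><a href="http://a.example/">one</a> '
               '<a href="http://b.example/">two</a></p>')
print(collector.links)
```

Feeding it the downloaded page's text instead of the sample string would give the same URLs.txt contents as the function above, minus the one-link-per-line limitation.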
layer Posted August 10, 2005

Ahhh... Sorry about that, then.