anystupidassname

FTP active vs. passive


:whistle:



Not sure if this will help with the second question, but here is how I find links in a webpage. It may not be the shortest or best possible way, but it works for me. (Note: the URL must include the scheme, e.g. http://.)

Func _INetCrawl($sURL)
   Local $line, $hRead, $hList, $aParts, $aLink, $i
   ; Download the page to a temporary file, then open it for reading
   InetGet($sURL, @TempDir & "\URLs.html")
   $hRead = FileOpen(@TempDir & "\URLs.html", 0)
   If $hRead = -1 Then Return SetError(1, 0, 0)
   $hList = FileOpen("URLs.txt", 1) ; open the output once, in append mode
   While 1
      $line = FileReadLine($hRead)
      If @error = -1 Then ExitLoop
      ; Split on the anchor prefix; element 0 holds the piece count
      $aParts = StringSplit($line, '<a href="', 1)
      If $aParts[0] = 1 Then ContinueLoop ; no link on this line
      ; Every piece after the first begins with a URL ending at the closing quote
      For $i = 2 To $aParts[0]
         $aLink = StringSplit($aParts[$i], '"', 1)
         FileWriteLine($hList, $aLink[1])
      Next
   WEnd
   FileClose($hRead)
   FileClose($hList)
EndFunc   ;==>_INetCrawl

I tried making a web crawler, but whatever... :whistle: And sorry for the bad variable names... $url and $ulr bahh, not good :dance:
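A minimal call to the function above could look like this (the target URL is just a placeholder example):

```autoit
; Crawl one page; URLs.txt in the working directory then holds one link per line
_INetCrawl("http://www.autoitscript.com/")
```

Note that the function only matches the literal lowercase pattern <a href=", so links written with different casing or extra attributes before href would be missed.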

Edited by layer


