yton Posted June 4, 2010

Greetings,

href = _FFXPath("//a[@href='www.site.com/1*']", "text")

Is there any way to use a mask in this expression? The * or something else — I just cannot find any information on this topic. I need to get all links on the page of the following form:

www.site.com/12
www.site.com/17
www.site.com/19

Please help!
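A note on the mask itself: XPath 1.0 has no glob-style wildcard inside attribute values, so '...1*' only matches that literal string. The starts-with() function expresses the intended mask. A minimal sketch, assuming the same two-argument _FFXPath signature (FF.au3) used above:

```autoit
; select every <a> element whose href begins with "www.site.com/1"
$href = _FFXPath("//a[starts-with(@href, 'www.site.com/1')]", "href")
```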
Zibit Posted June 4, 2010

You need to be more specific; it's hard to understand what you're asking.
Zibit Posted June 4, 2010

If I got this right, you want to get every href from a page's source (the source file from site.com) — am I right?
yton Posted June 4, 2010 (Author)

Not all of them, only those with the structure site.com/1*:

www.site.com/12
www.site.com/17
www.site.com/19
Zibit Posted June 4, 2010 Share Posted June 4, 2010 (edited) k i dont know if this will work but how about this: $source = _INetGetSource("http://www.test.com/12") filewriteline("source.txt", $source) $i = 0 $count = _filecountline("source.txt") do $i = $i + 1 $read = filereadline("source.txt", $i) $split = stringsplit($read, " ", 1) if $split[0] <> 0 then $n = 0 do if $split[$n] = "href" then _func($split[$n]) ;added this cause i dont know what you want to do with the href. until $split[$n] = $split[0] endif until $i = $count i think it should work, i just scratched it up on fast reply Edited June 4, 2010 by Zibit Creator Of Xtreme DevelopersPixel Pattern UDFTray GUI UDFMathssend & recive register scriptMouse Control via Webcam Link to comment Share on other sites More sharing options...
Zibit Posted June 4, 2010 Share Posted June 4, 2010 k noticed after i posted the script -.- Creator Of Xtreme DevelopersPixel Pattern UDFTray GUI UDFMathssend & recive register scriptMouse Control via Webcam Link to comment Share on other sites More sharing options...
Zibit Posted June 4, 2010 Share Posted June 4, 2010 i think i got it you should try this: func _func($split[$n]) $i = 0 $var = 0 while 1 $left = stringleft($split[$n], $i) if $left = '"' then $var = $var + 1 if $var = 2 then exitloop wend return $left endfunc Creator Of Xtreme DevelopersPixel Pattern UDFTray GUI UDFMathssend & recive register scriptMouse Control via Webcam Link to comment Share on other sites More sharing options...
Zibit Posted June 4, 2010 Share Posted June 4, 2010 k now i get it... you want it like this: * www.site.com/randomtext * www.site.com/anotherrandom about it: you cant get all the links on that site without a basis... i mean if the source doesnt contain a link like www.site.com/12 you cant get the link. it has to be written somewhere Creator Of Xtreme DevelopersPixel Pattern UDFTray GUI UDFMathssend & recive register scriptMouse Control via Webcam Link to comment Share on other sites More sharing options...
yton Posted June 4, 2010 (Author)

I need links that share a common part, site.com/1:

www.site.com/12
www.site.com/17
www.site.com/19

Each of the above links has 1 (one) as the first digit after the slash. I need the mask for an expression that will gather all such links.
Zibit Posted June 4, 2010 Share Posted June 4, 2010 this requires math... use $i = $i + 1 look at this and you might understand what im saying $i = 0 while 1 $i = $i + 1 $source = _inetgetsource("www.site.com/1" & $i) wend Creator Of Xtreme DevelopersPixel Pattern UDFTray GUI UDFMathssend & recive register scriptMouse Control via Webcam Link to comment Share on other sites More sharing options...
yton Posted June 4, 2010 (Author)

Well, I get the idea, but I need the list of URLs, not the whole page source.
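To pull just the list of URLs out of a page's source, rather than the whole page, a regular expression over the source is simpler than splitting it into lines. A sketch, assuming the links appear as href="www.site.com/1<digits>" in the markup (the pattern would need adjusting to the site's real HTML):

```autoit
#include <Inet.au3>

$source = _INetGetSource("http://www.site.com/")
; flag 3: return an array of every capture-group match in the source
$links = StringRegExp($source, 'href="(?:https?://)?(www\.site\.com/1\d*)"', 3)
If Not @error Then
    For $link In $links
        ConsoleWrite($link & @CRLF)
    Next
EndIf
```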