cageman Posted August 29, 2010

Right now I have a program that initiates a lot of HTTP requests. The problem is that I need to wait before I get the HTML source back, which takes about 2 seconds per request. So I would prefer to initiate multiple requests at the same time and just add the results to a database. I know "fake" multithreading exists for AutoIt, but would that be the best solution? The main thing I need to do is give a function or another program a hyperlink to gather data from and store some specific things.

Right now I have a solution that calls a PHP file, which then gets the correct data from the hyperlink and stores it in a database. In this case I just need to send an HTTP request to that specific PHP file (which makes the required HTTP requests itself) and don't need to wait for the info to return. The requests are all for the same site, so maybe the server just blocks my requests sometimes. The problem is that some transfers (about 1 in 200) never seem to complete. If I put in the hyperlink manually it works, though.

Right now I initiate a new HTTP request to the PHP file every second. It takes 2-3 seconds to gather the data, so maybe I should use a 2-3 second interval instead? How many HTTP requests at the same time are safe? This might be the problem, but it seems that random requests are the ones that fail.

If some things are not clear, please tell me.
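For context, the call to the PHP file can be sketched roughly like this; the endpoint address, query parameter and link list below are placeholders, not taken from the real script:

```autoit
#include <InetConstants.au3>

; Placeholder PHP endpoint that gathers the data and stores it in the database.
Local $sEndpoint = "http://www.example.com/gather.php?link="

; Placeholder hyperlinks to hand off to the PHP script.
; In practice the link should be URL-encoded before being appended.
Local $aLinks[3] = ["http://www.example.com/a", "http://www.example.com/b", "http://www.example.com/c"]
Local $aHandles[3]

For $i = 0 To UBound($aLinks) - 1
    ; Fire the request in the background; the script carries on without waiting for the reply.
    $aHandles[$i] = InetGet($sEndpoint & $aLinks[$i], @TempDir & "\reply" & $i & ".htm", _
            $INET_FORCERELOAD, $INET_DOWNLOADBACKGROUND)
    Sleep(1000) ; the 1-second interval described above
Next

; Closing a handle early aborts the request, so only close each handle once it reports complete.
For $i = 0 To UBound($aHandles) - 1
    While Not InetGetInfo($aHandles[$i], $INET_DOWNLOADCOMPLETE)
        Sleep(250)
    WEnd
    InetClose($aHandles[$i])
Next
```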
Tvern Posted August 29, 2010

I think you can do what you want by starting a background download for each link (using InetGet) and then checking each download to see whether it has completed or returned an error. There is an example here that can initiate and check multiple downloads at the same time, purely to show the concept. If you run into any problems it would help to have a link to the site and/or your current code.
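A minimal sketch of that idea, purely to show the concept; the URLs and file names are placeholders:

```autoit
#include <InetConstants.au3>

; Placeholder list of pages to fetch; replace with the real hyperlinks.
Local $aUrls[3] = ["http://www.example.com/page1", "http://www.example.com/page2", "http://www.example.com/page3"]
Local $aHandles[3]

; Start every download in the background so they all run at the same time.
For $i = 0 To UBound($aUrls) - 1
    $aHandles[$i] = InetGet($aUrls[$i], @TempDir & "\page" & $i & ".htm", _
            $INET_FORCERELOAD, $INET_DOWNLOADBACKGROUND)
Next

; Poll each handle until every download has either finished or failed.
Local $iRemaining = UBound($aHandles)
While $iRemaining > 0
    $iRemaining = UBound($aHandles)
    For $i = 0 To UBound($aHandles) - 1
        If $aHandles[$i] = 0 Then
            $iRemaining -= 1 ; already closed in an earlier pass
        ElseIf InetGetInfo($aHandles[$i], $INET_DOWNLOADCOMPLETE) Then
            If InetGetInfo($aHandles[$i], $INET_DOWNLOADSUCCESS) Then
                ConsoleWrite("Done: " & $aUrls[$i] & @CRLF) ; parse/store the saved file here
            Else
                ConsoleWrite("Error downloading: " & $aUrls[$i] & @CRLF)
            EndIf
            InetClose($aHandles[$i])
            $aHandles[$i] = 0
            $iRemaining -= 1
        EndIf
    Next
    Sleep(250)
WEnd
```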
cageman Posted August 29, 2010

Thanks, I think this will do the trick.