GrayFox

Faster than _InetGetSource?

6 posts in this topic

Hi all! I understand that _InetGetSource is faster than using functions from IE.au3, as IE doesn't have to load...

However, is there anything faster than _InetGetSource? Can you get just a single line of the source, for example? That sounds a bit silly though... :mellow:

Thanks :P

_InetGetSource is basically InetRead combined with BinaryToString. If you use InetRead directly you can get the file from the local cache if it's available.
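A small sketch of that equivalence, assuming the standard InetRead option flags (1 = force reload, 2 = use local cache); the URL here is just a placeholder:

```autoit
; _InetGetSource($sUrl) does roughly this under the hood:
Local $sUrl = "http://www.autoitscript.com"
Local $bData = InetRead($sUrl, 1)       ; 1 = force a reload from the remote site
Local $sSource = BinaryToString($bData) ; convert the binary response to a string

; Passing 2 instead lets InetRead return the copy from the local cache
; if one exists, which skips the network round trip entirely:
Local $bCached = InetRead($sUrl, 2)     ; 2 = get from local cache if possible
```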

Another option, if you are trying to read a lot of pages at the same time, is to fetch them all in the background using InetGet and read them afterwards. You may have to raise the limit on simultaneous downloads if you go this way.

I don't think lines are enumerated server-side, so in order to download line 21 you'd have to download lines 1-21 and count them yourself.

You may be able to use something like urlmon.dll to open a stream to a file and read that stream until the specified line is reached, after which you can close the stream. I've only succeeded in doing a regular download using urlmon, though.

Disclaimer: I'm not sure if the assumption about server-side enumeration, or about urlmon being able to stream a download, is correct. Just giving you something to think about.
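One hedged alternative to a true stream: if the server supports HTTP range requests, you can ask for only the first part of a page with a Range header via the WinHttp COM object. This is a sketch, not a tested recipe; the URL and byte range are placeholders, and a server that ignores ranges will simply return the whole page:

```autoit
; Request only the first 2 KB of a page using an HTTP Range header.
; A server that honours it replies with status 206 (Partial Content).
Local $oHttp = ObjCreate("WinHttp.WinHttpRequest.5.1")
$oHttp.Open("GET", "http://www.autoitscript.com/", False)
$oHttp.SetRequestHeader("Range", "bytes=0-2047")
$oHttp.Send()
If $oHttp.Status = 206 Then
    ConsoleWrite("Got partial content, " & StringLen($oHttp.ResponseText) & " chars." & @CRLF)
Else
    ConsoleWrite("Server ignored the range; status " & $oHttp.Status & @CRLF)
EndIf
```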

Thanks Tvern!

I will give InetGet a go. I am reading a lot of pages, but sequentially. It sounds like you're talking about increasing the limit on simultaneous downloads in IE? Urlmon.dll sounds a bit over my head, but I will test the other one :mellow:


#5 ·  Posted (edited)

For the connection limit, this post (https://www.autoitscript.com/forum/index.php?showtopic=113704) has some information on how to fix it.

I think I'll have a shot at this myself as I've never tried something like this.

Edit:

I've made a little test script that downloads the Google logo image 100 times with different methods, and included a way to easily change the maximum open connections to one server.

The number of open connections had no influence on the non-background InetGet, as only one connection will be active at a time. The background method seemed to be fairly fast with a maximum of 20 connections, but this will probably change depending on the speed of your connection and the size of the files you download.

My result was that downloading in the background is 5 to 10x faster than waiting for each file.

P.S. The script will create a folder called "files" in @ScriptDir.

Local $aDownloads[100]
;~ _MaxConnections(20) ;uncomment to set the max open connections to one server to 20
_BackgroundMethod($aDownloads)
_StandardMethod($aDownloads)
Exit

Func _BackgroundMethod($downloads)
    Local $complete = 0, $timer
    DirRemove(@ScriptDir & "\files\background", 1)
    DirCreate(@ScriptDir & "\files\background")
    $timer = TimerInit()
    For $i = 0 To UBound($downloads) - 1
        ;start every download in the background (last parameter 1 = background)
        $downloads[$i] = InetGet("http://www.google.com/intl/en_com/images/srpr/logo1w.png", @ScriptDir & "\files\background\file" & $i & ".png", 1, 1)
    Next
    Do
        For $i = 0 To UBound($downloads) - 1
            If $downloads[$i] And InetGetInfo($downloads[$i], 2) Then ;2 = download complete
                ;you could start performing functions on the completed file already by making a function call here.
                InetClose($downloads[$i]) ;release the handle once the download is done
                $downloads[$i] = False
                $complete += 1
            EndIf
        Next
        Sleep(10) ;don't hammer the CPU while polling
    Until $complete = UBound($downloads)
    ConsoleWrite("The background method finished in " & TimerDiff($timer) & "ms." & @CRLF)
EndFunc

Func _StandardMethod($downloads)
    Local $timer
    DirRemove(@ScriptDir & "\files\standard", 1)
    DirCreate(@ScriptDir & "\files\standard")
    $timer = TimerInit()
    For $i = 0 To UBound($downloads) - 1
        ;last parameter 0 = wait for each download to finish before starting the next
        InetGet("http://www.google.com/intl/en_com/images/srpr/logo1w.png", @ScriptDir & "\files\standard\file" & $i & ".png", 1, 0)
    Next
    ConsoleWrite("Standard method finished in " & TimerDiff($timer) & "ms." & @CRLF)
EndFunc

Func _MaxConnections($max)
    RegWrite("HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings", "MaxConnectionsPer1_0Server", "REG_DWORD", $max)
    RegWrite("HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings", "MaxConnectionsPerServer", "REG_DWORD", $max)
EndFunc
Edited by Tvern

Thanks very much Tvern, that's exactly what I was after! :mellow:
