NOTE: This document is a work in progress. Your views and contributions on the content are welcome. Spelling and grammar will be worked on when the document approaches a stable state.
This tutorial is about profiling and optimizing code once we have identified a bottleneck. There are several paths down this road: you could analytically evaluate your code or fragments of it, you could optimize parts that feel inefficient, or you can measure how long the code takes to run.
When we profile code we use a stopwatch to log how much time is spent from the moment the code starts until it finishes execution. There are two types of time we can use: the real (wall-clock) time spent and the CPU time spent executing the code. Both have their uses, but measuring CPU time normally gives better insight than measuring clock time. The exception is routines that call external resources which depend on factors other than the CPU, e.g. a printer, the network, or disk I/O.
The simplest approach
AutoIt has two handy clock functions for a regular stopwatch measurement of code execution time, namely TimerInit and TimerDiff. TimerInit is used to get a reference point, and TimerDiff returns the difference, in milliseconds, between the reference point and the call to TimerDiff. So the simplest possible sample could be:
Local $i, $j, $sum, $count, $ts, $td
For $j = 1 To 10
    $ts = TimerInit()
    Do
        $count += 1
        $sum += Random(-5, 5, 1)
    Until $sum = 0 Or $count = 100000
    ; Write/create report
    $td = TimerDiff($ts)
    ConsoleWrite(StringFormat("%#.6g ms, Iter=%.6d, avg=%.6g", $td, $count, $td / $count) & @LF)
    $count = 0
    $sum = 0 ; reset so every run starts from the same state
Next
Now although this is a reasonable way of measuring execution time, it does not give an accurate enough picture to identify bottlenecks. The reason is that it measures real time, not the time the CPU spends executing this part of the code. You can probably see this if one of the runs reaches $sum = 0 after just a few iterations, compared to one that uses the maximum number of iterations.
The more accurate approach
API notes: GetProcessTimes, GetThreadTimes

Windows versions based on the NT kernel have at least two API calls that make profiling easier for us. GetProcessTimes returns the CPU time spent by a given process; GetThreadTimes returns the CPU time spent by a given thread. As AutoIt does not support user-defined threads, we will use the GetProcessTimes call. A profiling UDF has been created by @uten in the AutoItScript forum; for now we will use that code. Save the code in an include file. I have called mine profiler.au3 and saved it in an include directory. (TODO: Link to page explaining userdefined include directories.)
Global $FILETIME = "dword;dword" ; Contains a 64-bit value representing the number of 100-nanosecond intervals since January 1, 1601 (UTC).
Global $SYSTEMTIME = "ushort;ushort;ushort;ushort;ushort;ushort;ushort;ushort"

Func FileTimeToNum(ByRef $FT)
    ; Combine the high dword (element 2) and low dword (element 1) into one 64-bit number
    Return DllStructGetData($FT, 2) * 0x100000000 + DllStructGetData($FT, 1)
EndFunc

Func OpenProcess($pid = @AutoItPID)
    Local $PROCESS_QUERY_INFORMATION = 0x0400
    Local $PROCESS_VM_READ = 0x0010
    Local $aRet = DllCall("kernel32.dll", "int", "OpenProcess", "int", _
            BitOR($PROCESS_QUERY_INFORMATION, $PROCESS_VM_READ), "int", 0, "int", $pid)
    If @error Or $aRet[0] = 0 Then Return SetError(1, 0, 0)
    Return $aRet[0]
EndFunc

Func GetProcessTimes(ByRef $hProcess, ByRef $t1, ByRef $t2, ByRef $t3, ByRef $t4)
    ;BOOL WINAPI GetProcessTimes(
    ;   HANDLE hProcess,
    ;   LPFILETIME lpCreationTime,
    ;   LPFILETIME lpExitTime,
    ;   LPFILETIME lpKernelTime,
    ;   LPFILETIME lpUserTime
    ;)
    Local $p1 = DllStructGetPtr($t1)
    Local $p2 = DllStructGetPtr($t2)
    Local $p3 = DllStructGetPtr($t3)
    Local $p4 = DllStructGetPtr($t4)
    Local $aRet = DllCall("kernel32.dll", "int", "GetProcessTimes", "int", $hProcess, _
            "ptr", $p1, "ptr", $p2, "ptr", $p3, "ptr", $p4)
    If @error Or $aRet[0] = 0 Then
        ConsoleWrite("(" & @ScriptLineNumber & ") : = Error in GetProcessTimes call" & @LF)
        Return SetError(1, 0, 0)
    EndIf
    Return $aRet[0]
EndFunc

Func ProfileInit()
    Local $process = OpenProcess(@AutoItPID)
    If @error Then
        ConsoleWrite("!OpenProcess failed, terminating" & @LF)
        Exit
    EndIf
    ; $ret[0] = count, $ret[1] = kernel time, $ret[2] = user time, $ret[3] = process handle
    Local $ret[4]
    $ret[0] = 2
    Local $t1 = DllStructCreate($FILETIME)
    Local $t2 = DllStructCreate($FILETIME)
    Local $t3 = DllStructCreate($FILETIME)
    Local $t4 = DllStructCreate($FILETIME)
    If GetProcessTimes($process, $t1, $t2, $t3, $t4) Then
        $ret[$ret[0] + 1] = $process
        $ret[1] = FileTimeToNum($t3)
        $ret[2] = FileTimeToNum($t4)
    Else
        ConsoleWrite("(" & @ScriptLineNumber & ") := @error:=" & @error & ", @extended:=" & @extended & @LF)
        SetError(@error, @extended, 0)
    EndIf
    ;ArrayDump($ret, "ProfileInit Out")
    Return $ret
EndFunc

Func ProfileDiff(ByRef $init)
    ; $ret[0] = count, $ret[1] = kernel time diff, $ret[2] = user time diff
    Local $ret[3]
    $ret[0] = 2
    ;ArrayDump($init, "ProfileDiff($init)")
    Local $t1 = DllStructCreate($FILETIME)
    Local $t2 = DllStructCreate($FILETIME)
    Local $t3 = DllStructCreate($FILETIME)
    Local $t4 = DllStructCreate($FILETIME)
    If GetProcessTimes($init[$init[0] + 1], $t1, $t2, $t3, $t4) Then
        $ret[1] = FileTimeToNum($t3) - $init[1]
        $ret[2] = FileTimeToNum($t4) - $init[2]
    Else
        ConsoleWrite("(" & @ScriptLineNumber & ") := @error:=" & @error & ", @extended:=" & @extended & @LF)
        SetError(@error, 0, 0)
    EndIf
    Return $ret
EndFunc
Now that we have some profiling routines, ProfileInit and ProfileDiff, we can change our sample to make use of those UDFs. Note that the times returned from ProfileDiff are in 100-nanosecond intervals; to get milliseconds we have to divide by 10,000.
#include <profiler.au3>
Local $i, $j, $sum, $count, $ps, $pd, $uk
For $j = 1 To 10
    $ps = ProfileInit()
    Do
        $count += 1
        $sum += Random(-5, 5, 1)
    Until $sum = 0 Or $count = 100000
    ; Write/create report
    $pd = ProfileDiff($ps)
    $uk = Round(($pd[1] + $pd[2]) / 10000, 0) ; user plus kernel time, in ms
    ConsoleWrite(StringFormat("%#.6g ms, Iter=%.6d, avg=%.6g", $uk, $count, $uk / $count) & @LF)
    $count = 0
    $sum = 0 ; reset so every run starts from the same state
Next
When do we need to optimize our code? Well, that's a good question. A rule of thumb is to make the code work first. If we then experience that the code is slow, we have to find the cause. When one or more causes are found, we consider whether it is possible to optimize the code.
Let's have a look at a real world sample. Sorting stuff is a classic.
Getting some data
Let's grab some data from the AutoItScript forum. The data itself is not relevant for the sample, so if you have a fairly big HTML file on your disk, you could use that instead.
#include <INet.au3>
$data = _INetGetSource("http://www.autoitscript.com/forum/index.php?showtopic=36941")
;$data = FileRead("<your file>")
Data to plain text. Take one
Func WordList(ByRef $data)
    ; Shave off HTML tags; this simple pattern is a sketch, not a full HTML parser
    Local $res = StringRegExpReplace($data, "<[^>]*>", " ")
    ; Return string
    Return $res
EndFunc
Data to plain text. Take two
Func Wordlist2(ByRef $data)
    ; Extract the words directly; the pattern below is a placeholder suggestion
    ; that matches runs of letters
    Local $aWords = StringRegExp($data, "[A-Za-z]+", 3)
    ; Array to string
    Local $res = ""
    For $i = 0 To UBound($aWords) - 1
        $res &= $aWords[$i] & " "
    Next
    ; Return string
    Return $res
EndFunc
Sorting the text. Take one
Using bubble sort.
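As a sketch of what "take one" could look like, here is a classic bubble sort over a zero-based array of words. The function name BubbleSort and the use of StringSplit are our assumptions, not final code; AutoIt's relational operators compare strings alphabetically, which is all we need here.

```autoit
; Split the word list into a zero-based array (flag 2 = no count element)
Local $aWords = StringSplit(WordList($data), " ", 2)

Func BubbleSort(ByRef $aList)
    ; After each outer pass, the largest remaining element has bubbled to the end
    For $i = UBound($aList) - 2 To 0 Step -1
        For $j = 0 To $i
            If $aList[$j] > $aList[$j + 1] Then
                ; Swap neighbours that are out of order
                Local $tmp = $aList[$j]
                $aList[$j] = $aList[$j + 1]
                $aList[$j + 1] = $tmp
            EndIf
        Next
    Next
EndFunc

BubbleSort($aWords)
```

Profiling this with ProfileInit/ProfileDiff around the BubbleSort call will show why we want "take two": bubble sort does on the order of N² comparisons for N words.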
Sorting the text. Take two
Using something faster.
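One option, assuming the same zero-based $aWords array as above, is the standard _ArraySort() UDF from the Array.au3 include, which sorts with a considerably faster algorithm than bubble sort:

```autoit
#include <Array.au3>

; Sort the word array in place using the standard library routine
_ArraySort($aWords)
```

Wrapping both sorts in the profiling calls from earlier in the tutorial makes the difference easy to measure on a large word list.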