New version: Uploaded on 20 Jan 10
I wrote this in response to a forum question and thought it might be more widely useful, as I have seen a few posts about average colour values before.
The function returns the average colour value (in hex RGB format) over a selected area of the display. The user can define the area and the resolution of the averaging; the resolution determines both the accuracy of the result and the speed of the operation. As an example, I got the following values from my 1680x1050 display on a 2.13 GHz machine running Vista Home Premium, with the display split roughly 50% SciTE and 50% blue desktop:
Step   Time        Av Colour
100    60 ms       8899C2
10     720 ms      92A6C7
5      2.7 secs    91A4C6
2      16.5 secs   92A6C7
1      65.7 secs   92A6C7
In testing I found that using a resolution of 10 gave an excellent average colour value at a reasonable speed - so I chose that as the default value.
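To show the idea behind the "Step" parameter, here is a minimal sketch of the sampling approach in Python (the real function is AutoIt and uses GDI; the names `average_colour` and `get_pixel` here are hypothetical, and `get_pixel` stands in for whatever per-pixel read the platform provides):

```python
def average_colour(get_pixel, left, top, width, height, step=10):
    """Average the colour of a rectangle by sampling every `step`-th pixel.

    get_pixel(x, y) must return an (r, g, b) tuple. A larger `step`
    reads fewer pixels, trading accuracy for speed, exactly as the
    timing table above shows.
    """
    r_total = g_total = b_total = count = 0
    for y in range(top, top + height, step):
        for x in range(left, left + width, step):
            r, g, b = get_pixel(x, y)
            r_total += r
            g_total += g
            b_total += b
            count += 1
    # Integer-average each channel and format as a hex RGB string
    return "{:02X}{:02X}{:02X}".format(r_total // count,
                                       g_total // count,
                                       b_total // count)

# Demo with a synthetic screen: left half red, right half blue
demo = lambda x, y: (255, 0, 0) if x < 50 else (0, 0, 255)
print(average_colour(demo, 0, 0, 100, 100))
```

The work done is roughly proportional to 1/step², which is why dropping from step 10 to step 1 in the table costs about a hundredfold more time.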
Edit: As requested by picea892 below, the function will now accept a handle to a GUI or a control and automatically use its client area as the area to average.
Edit 2: Fixed a memory leak - see below for details.
Edit 3: Fixed code for new Hex behaviour - see below for details.
The function itself, including a short example:
New version in post #10 below
As always, comments welcome - and criticism, as I am not a GDI expert by any means.