Showing results for tags 'screen reader'.
Found 2 results
Hello, my question is based on the topic about NVDA screen reader development at This Link. AutoIt GUIs are totally nice for blind users, because their controls are mostly standard. But in cooperation with screen readers these GUIs are much slower than other GUIs. Do you have any idea about the reason for this behaviour? Thanks a lot for any answer. Fenzik
I'm using AutoIt on an app that I'm running on an Android emulator; my script basically clicks a spot, sleeps, clicks another, etc. Instead of waiting the same fixed number of milliseconds every time and clicking the same predefined position, I'd like the script to detect when and where the next clickable object pops up on the emulator window, and either trigger it or pick another response depending on its coordinates. As a side note, I've been rewriting my code to use PixelChecksum(), PixelSearch(), and PixelGetColor() to this end, and it's getting the job done now, but I'd be much happier with something that automatically detects buttons like a screen reader would, if that makes sense.
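For anyone landing here with the same question: assuming the next button always appears somewhere in a known region of the emulator window and has a reasonably distinctive colour, the detect-then-click loop described above can be sketched in AutoIt like this. The colour value, search region, timeout, and polling interval below are all placeholder assumptions, not values from the original post:

```
; Sketch: poll a region of the emulator window until a pixel with the
; expected button colour appears, then click it there.
; $BUTTON_COLOR, the 0,0-800,600 region, and the timeout are placeholders.
Local Const $BUTTON_COLOR = 0x4CAF50 ; hypothetical button colour
Local Const $TIMEOUT_MS = 10000      ; give up after 10 seconds
Local Const $POLL_MS = 250           ; polling interval

Local $elapsed = 0
While $elapsed < $TIMEOUT_MS
    ; Search the region for the colour, allowing a shade variation of 10
    Local $pos = PixelSearch(0, 0, 800, 600, $BUTTON_COLOR, 10)
    If Not @error Then
        ; $pos[0] / $pos[1] hold the x/y of the first matching pixel;
        ; branch on these coordinates here to pick a different response
        MouseClick("left", $pos[0], $pos[1])
        ExitLoop
    EndIf
    Sleep($POLL_MS)
    $elapsed += $POLL_MS
WEnd
```

PixelChecksum() can complement this: take a checksum of the region once, loop until the checksum changes (which signals that something new was drawn), and only then run PixelSearch() to locate and classify the new element.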