
Posted

Having tried to find information about automating touch (touch-screen) input with AutoIt, I'm looking for insight from those with significantly more Windows API experience than I have (which is none).

I'm working on automated testing of an application (running on Windows 10) that does not use standard Windows controls (so ControlClick is not an option).  The only way to determine whether a control is present, and whether it has been activated, is to check it visually (i.e. by image comparison).  That part I have down (a sketch of the check is below), but the final system activates controls via a touch screen, while my test development environment uses a mouse.  The latter is easy; touch is what I'm having trouble finding support for.
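In case it matters, the visual check itself is nothing fancy; it boils down to comparing a region checksum against a reference captured when the control's state was known.  Something like this (the coordinates and the reference value are placeholders, not my real data):

; Reference checksum, captured earlier while the control was known to be active.
Local Const $iRefChecksum = 0x12A45F78 ; placeholder value
; Checksum of the screen region covering the control (left, top, right, bottom).
Local $iNow = PixelChecksum(100, 200, 180, 240)
If $iNow = $iRefChecksum Then
    ConsoleWrite("Control appears to be activated" & @CRLF)
EndIf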

Doing both is even more difficult, but I need to be able to control the system using both touch and mouse input.  Alternatively, I can switch between the two when moving between the development environment and the (touch-screen) target system.

I have noticed the Win API InjectTouchInput call, and a single forum thread that references it, but I don't seem to be able to get it to work.  ...though it is possible that is because I'm attempting to use it in a development environment that has no touch screen (even when running the application with a touch-only interface enabled).
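For reference, this is roughly what I have been attempting, pieced together from the MSDN pages for InitializeTouchInjection / InjectTouchInput and the POINTER_TOUCH_INFO structure.  Treat it as a sketch only: the struct layout (including the tail padding) and the constants are my reading of winuser.h, and I have not been able to verify any of it on a touch-capable machine.

; The Touch Injection API lives in user32.dll and needs Windows 8 or later.
; POINTER_TOUCH_INFO embeds POINTER_INFO, which (by my reading of winuser.h)
; carries 4 bytes of tail padding; hence the explicit "uint padding" member.
Global Const $tagPOINTER_TOUCH_INFO = _
        "uint pointerType;uint pointerId;uint frameId;uint pointerFlags;" & _
        "ptr sourceDevice;hwnd hwndTarget;" & _
        "long ptPixelLocationX;long ptPixelLocationY;" & _
        "long ptHimetricLocationX;long ptHimetricLocationY;" & _
        "long ptPixelLocationRawX;long ptPixelLocationRawY;" & _
        "long ptHimetricLocationRawX;long ptHimetricLocationRawY;" & _
        "dword dwTime;uint historyCount;int InputData;dword dwKeyStates;" & _
        "uint64 PerformanceCount;int ButtonChangeType;uint padding;" & _
        "uint touchFlags;uint touchMask;" & _
        "long rcContactLeft;long rcContactTop;long rcContactRight;long rcContactBottom;" & _
        "long rcContactRawLeft;long rcContactRawTop;long rcContactRawRight;long rcContactRawBottom;" & _
        "uint orientation;uint pressure"

Global Const $PT_TOUCH = 2
Global Const $TOUCH_FEEDBACK_DEFAULT = 1
Global Const $POINTER_FLAG_INRANGE = 0x0002, $POINTER_FLAG_INCONTACT = 0x0004
Global Const $POINTER_FLAG_DOWN = 0x00010000, $POINTER_FLAG_UP = 0x00040000
Global Const $TOUCH_MASK_CONTACTAREA = 0x0001, $TOUCH_MASK_PRESSURE = 0x0004

; One-time setup: a single contact, default visual feedback.
DllCall("user32.dll", "bool", "InitializeTouchInjection", "uint", 1, "dword", $TOUCH_FEEDBACK_DEFAULT)

Func _TouchTap($iX, $iY)
    Local $tContact = DllStructCreate($tagPOINTER_TOUCH_INFO)
    $tContact.pointerType = $PT_TOUCH
    $tContact.pointerId = 0
    $tContact.ptPixelLocationX = $iX
    $tContact.ptPixelLocationY = $iY
    $tContact.touchMask = BitOR($TOUCH_MASK_CONTACTAREA, $TOUCH_MASK_PRESSURE)
    $tContact.pressure = 32000 ; value used in the MSDN injection sample
    ; A small (4x4 pixel) contact area around the tap point.
    $tContact.rcContactLeft = $iX - 2
    $tContact.rcContactTop = $iY - 2
    $tContact.rcContactRight = $iX + 2
    $tContact.rcContactBottom = $iY + 2
    ; Finger down...
    $tContact.pointerFlags = BitOR($POINTER_FLAG_DOWN, $POINTER_FLAG_INRANGE, $POINTER_FLAG_INCONTACT)
    Local $aRet = DllCall("user32.dll", "bool", "InjectTouchInput", "uint", 1, "struct*", $tContact)
    If @error Or Not $aRet[0] Then Return SetError(1, 0, False)
    ; ...and finger up again.
    $tContact.pointerFlags = $POINTER_FLAG_UP
    $aRet = DllCall("user32.dll", "bool", "InjectTouchInput", "uint", 1, "struct*", $tContact)
    If @error Or Not $aRet[0] Then Return SetError(2, 0, False)
    Return True
EndFunc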

Has anyone created user functions that I can leverage?  Any help would be appreciated.

Thank you,

EURUS1993

  • Moderators
Posted

Moved to the appropriate forum.

Moderation Team


Posted

Thank you, Melba23.

I found some prior content from Andreik, Ozz, and rektmuch, and came up with the following (TouchSupport.au3).

As I mentioned, I have no Win API experience, so I simply experimented to determine what worked.

Some interesting points in regard to that experimentation:

1. I thought tap-and-drag would be simple: just a "down" action followed by an "update" and an "up" action.  I found that I also needed to move the touch pointer for it to work consistently, hence the MoveTouchPtr function (see the sketch after this list).

2. I need three different gestures: tap, double-tap, and tap-with-drag, making four functions in total: Tap, DoubleTap, TapDrag and MoveTouchPtr.

3. I have been testing this on a non-touch laptop and still need to test it in the final environment.  ...but I did notice that lingering "touch" points remain in exactly two places, which is where I used the TapDrag function.  If anyone has an idea why, please reply to this thread and/or suggest/provide updates to the script.
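For anyone who would rather not open the attachment, the down / update / up pattern from point 1 boils down to something like the following.  This is a simplified sketch only: it reuses the $tagPOINTER_TOUCH_INFO struct and the constants from the sketch in my first post, the step count and delay are arbitrary, and the real functions in TouchSupport.au3 differ in the details.

Global Const $POINTER_FLAG_UPDATE = 0x00020000

Func _TouchTapDrag($iX1, $iY1, $iX2, $iY2, $iSteps = 20, $iStepDelay = 10)
    Local $tContact = DllStructCreate($tagPOINTER_TOUCH_INFO)
    $tContact.pointerType = $PT_TOUCH
    $tContact.pointerId = 0
    $tContact.ptPixelLocationX = $iX1
    $tContact.ptPixelLocationY = $iY1
    ; Finger down at the start point.
    $tContact.pointerFlags = BitOR($POINTER_FLAG_DOWN, $POINTER_FLAG_INRANGE, $POINTER_FLAG_INCONTACT)
    DllCall("user32.dll", "bool", "InjectTouchInput", "uint", 1, "struct*", $tContact)
    If @error Then Return SetError(1, 0, False)
    ; Move the contact in small increments; each frame must be injected
    ; with POINTER_FLAG_UPDATE or the drag is not registered consistently.
    For $i = 1 To $iSteps
        $tContact.ptPixelLocationX = Int($iX1 + ($iX2 - $iX1) * $i / $iSteps)
        $tContact.ptPixelLocationY = Int($iY1 + ($iY2 - $iY1) * $i / $iSteps)
        $tContact.pointerFlags = BitOR($POINTER_FLAG_UPDATE, $POINTER_FLAG_INRANGE, $POINTER_FLAG_INCONTACT)
        DllCall("user32.dll", "bool", "InjectTouchInput", "uint", 1, "struct*", $tContact)
        If @error Then Return SetError(2, 0, False)
        Sleep($iStepDelay)
    Next
    ; Finger up at the end point.
    $tContact.pointerFlags = $POINTER_FLAG_UP
    DllCall("user32.dll", "bool", "InjectTouchInput", "uint", 1, "struct*", $tContact)
    If @error Then Return SetError(3, 0, False)
    Return True
EndFunc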

TouchSupport.au3

Posted (edited)

Well, I wrote an example some time ago, but I can't really test it without a device with touch support. Of the three functions you need, did you manage to get any of them working?

Edited by Andreik
Posted

Andreik,

Indeed I did.

As I mentioned, tap-and-drag was the interesting one, because I had to move the pointer for dragging to register, hence the addition of the MoveTouchPtr and TapDrag functions.  The DoubleTap function isn't strictly necessary, but I wanted easy control over the timing, which it provides; maybe the Tap function should take a "number of taps" parameter to make that even easier.
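Something like this hypothetical wrapper around the single-tap sketch from earlier in the thread is what I have in mind (the 150 ms default is a guess at a reasonable inter-tap delay, not a tested value):

Func _TouchTaps($iX, $iY, $iTaps = 1, $iDelay = 150)
    ; Repeat a single tap $iTaps times, pausing between taps.
    For $i = 1 To $iTaps
        If Not _TouchTap($iX, $iY) Then Return SetError(1, 0, False)
        If $i < $iTaps Then Sleep($iDelay)
    Next
    Return True
EndFunc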

Regardless, with these functions implemented, users can modify them to fit their needs, unless we want a specific implementation added to the formal AutoIt function set.

Note that this was tested on both a regular system (Core i9 with an RTX 3050 graphics card and no touch screen, but with a touch-enabled application) and a Core i7 system with integrated graphics and a touch screen (10-point multi-touch P-CAP).  The important part is that the application can be configured for touch input, mouse input, or both.  I set the application specifically to touch input and tested the functions on both systems.

...so, yes: The functions work.

  • 3 months later...
Posted

Andreik,

One question regarding your original code design: I'm seeing a lot of USB events when using it (and only when using it), which means handling all those events and the associated delays.  Do you have any idea why these calls would cause USB events?

Posted

I don't know how Microsoft implemented these functions, but it would be unusual for them to cause USB events. Are you sure it's related to the Touch Injection API?

Posted

Yes.  It is interesting because we can set the automation and the application, independently, to use either mouse or touch.  As we use an on-screen keyboard, we have a plethora of input (i.e. touch events).  As we use that (touch) input to unlock BitLocker-encrypted devices, we receive the USB events, and they occur only when using the AutoIt automation (i.e. when these routines perform the touches).

If we use actual touch input (i.e. touching the touch screen), it does not occur.  If we use mouse input, manually or through AutoIt, it does not occur.

...so, the only way I can see that we get these (USB) events is through the specific calls to the API.  I didn't see any obvious correlation between the calls and USB events, but I'm not a Windows API expert, by any stretch of the imagination.

Posted

You can ask on Social MSDN for an answer, since this is related to Microsoft APIs. There's a chance someone there has an idea of whether, and why, the Touch Injection API triggers USB events.
