JeffKang

Consumer level eye tracking - easy activation of virtual buttons without touchscreen - wxPython for buttons, & AutoIt for macros behind buttons?

3 posts in this topic

#1 ·  Posted (edited)

(First off, sorry in advance, as I’m not sure if this is the right place to post my inquiry).

*Consumer level eye tracking - easy activation of virtual buttons without touchscreen*

After using AutoHotkey for remapping, I soon ran out of keyboard buttons to attach macros and lines of code to, so I had to make new scripts that reused the same buttons. After more scripts, it can be easy to forget which button does what.

I have a repetitive strain injury of tendinosis (chronic tendinitis), so I was following a couple of eye tracking companies that were planning to launch cheap, mass-market products. One company has just begun preorders.

You can now optionally take your hands away from the mouse. Instead, you stare at a target on-screen button and press a keyboard key to click, instantly invoking custom virtual buttons with your macros and commands attached to them. Quick activation of on-screen shortcut tiles without a touchscreen is now feasible. You can design the buttons to look pretty much however you want, and customizable virtual buttons are far more productive than static physical keys.

e.g., I remapped F1 to launch a Google search on whatever is on the clipboard:

F1::Run http://www.google.com/search?hl=en&safe=off&q=%Clipboard%
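For comparison, the same shortcut can be expressed in Python with only the standard library. Reading the clipboard itself needs a platform-specific package (e.g. pyperclip, an assumption not shown here), so this sketch takes the text as an argument:

```python
import urllib.parse
import webbrowser

def clipboard_google_search(text, open_browser=False):
    """Build (and optionally open) the same Google search URL as the
    AutoHotkey remap above. `text` stands in for the clipboard contents,
    since clipboard access is platform-specific."""
    url = ("http://www.google.com/search?hl=en&safe=off&q="
           + urllib.parse.quote(text))
    if open_browser:
        webbrowser.open(url)  # what the AutoHotkey Run line does
    return url
```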

With another script, F1 could execute something completely different. And within that script, depending on the context, such as what program is currently running, or what window is in focus, the use of F1 could change again; it can get confusing.

It would be more intuitive to look at a virtual button that is actually labeled, "Clipboard Google Search", and then tap my activation key.

*wxPython for buttons, & AutoIt for macros behind buttons?*

Is it possible to use a GUI toolkit like wxPython to design the buttons, and then use a scripting language like AutoIt, AutoHotkey, AutoKey, or Sikuli to run the scripts and functions behind them?
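One way to sketch that split: keep the macros in a table keyed by the label the button shows, so the label itself documents the action. In a real GUI the labels would become widgets (e.g. wx.Button) and the macros could call into AutoItX; both of those are assumptions, so this sketch sticks to plain Python (the names `macro` and `activate` are illustrative):

```python
# Registry mapping a human-readable button label to the macro behind it.
macros = {}

def macro(label):
    """Decorator: register a function under a button label."""
    def register(func):
        macros[label] = func
        return func
    return register

@macro("Clipboard Google Search")
def search_clipboard():
    return "searching clipboard text"  # stand-in for the real action

@macro("Paste Email Signature")
def paste_signature():
    return "pasting signature"  # stand-in for the real action

def activate(label):
    """What a dwell or activation key-press would trigger for one button."""
    return macros[label]()
```

With this layout, adding a new labeled button is just another decorated function; the GUI only needs to call `activate(label)` when a button is clicked.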

I found a blog post called "Importing AutoIt into Python". Here is some of what is mentioned:

“It is possible to import all of AutoIt's functions from a required .dll file into Python. Then I can use Python to code, and when I need an AutoIt function I can just call the function.

First you must tell Python that you are going to be using an external file, you do this by using import.

You can now call AutoIt functions using autoit.TheFunctionName()”

There is also some information in a StackOverflow post called "Calling AutoIt Functions in Python" (http://stackoverflow.com/questions/3301561/calling-autoit-functions-in-python):

“In order to call AutoIt functions from Python using AutoIt's COM interface, there are a few prerequisites:

An installation of Python with the appropriate Python for Windows extensions (pywin32)

A full installation of AutoIt. (It is possible to only install the COM portion, but it is more involved than a full installation.)

In order to call AutoIt functions from Python, add the following code:

import win32com.client
autoit = win32com.client.Dispatch("AutoItX3.Control")

You can now call AutoIt functions with the autoit variable.

autoit.Run("NotePad.exe")
autoit.ControlClick(WINDOW, "", "[CLASSNN:TTreeView1]", "left", 1, 53, 41)”
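The quoted snippet can be wrapped so a script still loads on machines without AutoIt installed. This is only a defensive sketch around the same `win32com.client.Dispatch` call; on Windows with pywin32 and a full AutoIt install it returns the same COM object as above, and elsewhere it returns None so the rest of a script can degrade gracefully:

```python
import importlib

def get_autoit():
    """Return an AutoItX COM object, or None when pywin32/AutoIt
    are not available (e.g. on non-Windows machines)."""
    try:
        client = importlib.import_module("win32com.client")
        return client.Dispatch("AutoItX3.Control")
    except Exception:
        return None
```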

I'm not a programmer; I'm just beginning. Are wxPython and AutoIt (AutoItX?) a reasonably good direction for getting something like what I am describing?

Thanks for any info.

Edited by JeffKang

#2 ·  Posted

Hi JeffKang,

Is this still something you're working on / have an interest in?

I made a system like this for an ALS patient (he could only move his head; everything else was dead) almost 10 years back. We used some freeware, AutoIt, and an ordinary web cam to enable the man to read papers (PDF), read, write, and send mail, use Skype, play music, listen to the radio, and do other tasks.

I don't really understand how you think from your post, but if you're serious about this I can dig out some information from my archive and discuss your approach with you.

Best regards

Uten



#3 ·  Posted (edited)

Thanks a lot for responding. Sorry for the late reply; I interact on the computer slowly.

I’m going to finish up on researching some of the current eye-tracking interfaces.

I’m not looking for anything full-fledged to be built, like a replica of any complete interface that I find. I personally just want to be able to dwell and fixate on a widget in order to left-click (I have a very limited number of clicks per day, so when I read, I read on auto-scroll, and I can’t set the pace).

I’m part of several disability communities, and I know that people there are also looking for some sort of basic interface, since the eye trackers are aimed at developers and don’t come with many ready-to-go features.

I think that if I can learn the code behind a left click, I should be able to expand that, and do other actions, like right clicking, and activating keyboard buttons. If I see the code for how a left click widget is placed, I can possibly parlay that knowledge into designing and positioning other widgets.
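The left click being asked for here is usually implemented as a dwell: click when the gaze stays near one spot long enough. A minimal sketch of that logic, assuming gaze samples arrive at a fixed rate (sending the actual click, e.g. with AutoItX's MouseClick, is left out as an assumption):

```python
import math

def dwell_click(gaze_points, radius=30, dwell_samples=10):
    """Return the sample indices at which a dwell 'click' would fire.

    gaze_points: a sequence of (x, y) gaze samples at a fixed rate.
    A click fires once the gaze has stayed within `radius` pixels of the
    point where the current fixation started for `dwell_samples`
    consecutive samples, and fires only once per fixation.
    """
    clicks = []
    anchor = None  # where the current fixation started
    count = 0      # samples spent inside the current fixation
    for i, (x, y) in enumerate(gaze_points):
        if anchor is None or math.hypot(x - anchor[0], y - anchor[1]) > radius:
            anchor = (x, y)  # gaze moved away: start a new fixation
            count = 1
        else:
            count += 1
        if count == dwell_samples:
            # This is where a real script would send the click,
            # e.g. autoit.MouseClick("left", x, y) (an assumption).
            clicks.append(i)
    return clicks
```

The same skeleton extends to the other actions mentioned: swap the click for a right click or a key press, keyed off which widget the fixation lands on.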

Of the interfaces I’ve found so far, one is free, but I’m not sure if it’s open source and customizable.

Initiating a free and open source interface, even if it’s just for one action, like a left click, could really start something. I know quite a few disability communities that I would be able to promote the source code to.

However, it could most likely grow into something that’s useful to the general population too. Eye-tracking interfaces are only now being seriously examined, because an eye tracker now costs $100 instead of a few thousand dollars. That probably means eye-tracking interfaces have a vast amount of room to grow and improve.

I’ll get back to you once I finish the brief research.

TalkingEyes: Eye-tracking and ALS Google+ Community: https://plus.google.com/u/0/communities/103103072435695270907

http://www.reddit.com/r/EyeTracking

http://theeyetribe.com/forum/ (There is a computer control section, and an accessibility section)

Edited by JeffKang
