How could a quadriplegic teen control the CoSpaces app?

I have an incredibly smart, visually acute, nonverbal, profoundly deaf, quadriplegic, ventilator dependent teenage son who I would love to teach to code.

I’ve exposed him to Scratch (the littleBits Iron Man Gauntlet, an awesome STEM toy), but he has never been able to manipulate the blocks using his eye-tracking Tobii I-12+ interface. I have always had to be his hands on the iPad, and when he isn’t directly in control, he gets bored easily. Bridges need to be built to make this more natural for him.

Any suggestions on how the CoSpaces app could be made accessible to him?

Ottawa, Ontario, Canada

Hi Myles,

Can you give more detail about what your son can do with the eye-tracking interface or other means?

And also, what has he tried with CoSpaces already? That is, has he tried locating, selecting, and manipulating objects in CoSpaces?

Many thanks,
Geoff @ TechLeap


I took a quick look at “Welcome to CoSpaces Edu” and I don’t see an easy way my son could control this via eye gaze alone.
How extensive a set of keyboard shortcuts do you support?

The targets on the screen are extremely small for an eye-tracking user such as my son. Which icons can be selected by keyboard shortcuts?

To help you understand more nuances of this use case, I’ll refer you to a similar post I made for Mathematica last year:

We built an eye-tracking pane of glass that stands between an eye-tracking user and Minecraft on a PC and allows the user to operate a virtual Xbox controller using only their eyes.

Very interesting question. I have some pupils with similar difficulties whom I would like to involve.

Hi Myles, can you provide some answers to these questions?

It’s not clear to me, from the links you’ve provided, what sorts of interfaces work well for your son. It would be helpful for developers to know what is possible with eye gaze, what icon sizes work best, etc. I’m sure you’re already aware that you can increase the size of everything using built-in browser zoom accessibility features, and that this is possible with CoSpaces Edu.

Please see the CoSpaces Edu educator resources for getting-started material and the keyboard shortcuts currently available.

Many thanks,
Geoff @ TechLeap


Please see for the gold standard of universal access.

My son is able to use this website independently using natural dwell-to-select: his eyes become the mouse, and he clicks by lingering on a particular point, keeping his gaze within a minimum radius of that point for a prescribed amount of time.
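For developers unfamiliar with dwell-to-select, the logic described above can be sketched roughly as follows. This is only an illustrative outline, not the actual implementation the poster uses; the class name, the radius, and the dwell time are hypothetical, and real gaze input would also need smoothing of noisy samples.

```python
import math

class DwellSelector:
    """Illustrative dwell-to-select sketch: a 'click' fires when the gaze
    stays within radius_px of its anchor point for dwell_s seconds.
    Parameter values are hypothetical examples, not product settings."""

    def __init__(self, radius_px=40.0, dwell_s=1.0):
        self.radius_px = radius_px
        self.dwell_s = dwell_s
        self.anchor = None    # (x, y) where the current dwell started
        self.anchor_t = None  # timestamp of that first sample

    def feed(self, x, y, t):
        """Feed one gaze sample; return the (x, y) to click, or None."""
        if self.anchor is None or math.dist(self.anchor, (x, y)) > self.radius_px:
            # Gaze moved outside the radius: restart the dwell here.
            self.anchor, self.anchor_t = (x, y), t
            return None
        if t - self.anchor_t >= self.dwell_s:
            click_at = self.anchor
            self.anchor, self.anchor_t = None, None  # reset after the click
            return click_at
        return None
```

In practice a loop would feed gaze samples at the tracker's frame rate and, whenever `feed` returns a point, inject a mouse click there via an OS automation API.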

Our Minecraft adapter includes a circular x/y widget that translates gaze into an x/y joystick signal, which we use to direct the camera up/down/left/right with a fine level of detail; the result is a very natural and compelling interface. We also have a walk-forward-while-button-pressed paradigm and an auto-walk paradigm in which the avatar walks until explicitly stopped.
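The core of a circular gaze-to-joystick widget like the one described above can be sketched in a few lines. This is a guess at the general technique, not the poster's actual adapter code; the function name, radius, and deadzone values are hypothetical.

```python
import math

def gaze_to_joystick(gx, gy, cx, cy, rim_px=100.0, deadzone=0.15):
    """Map a gaze point (gx, gy) to virtual joystick axes in [-1, 1].

    The gaze offset from the widget centre (cx, cy) is scaled by the
    widget radius rim_px; a small central deadzone keeps natural gaze
    jitter from moving the camera. Values here are illustrative only.
    """
    dx, dy = (gx - cx) / rim_px, (gy - cy) / rim_px
    mag = math.hypot(dx, dy)
    if mag < deadzone:
        return 0.0, 0.0              # inside the deadzone: no input
    if mag > 1.0:
        dx, dy = dx / mag, dy / mag  # clamp at the widget's rim
    return dx, dy
```

The resulting axis pair would then be fed to a virtual controller driver, giving proportional camera control: glancing near the centre moves slowly, glancing at the rim moves at full speed.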

Additional controls need to be layered on top of this; for example, see the Xcessity IRIS Interactors.

And in the Mathematica post, I already stated the icon-size requirements.

Using techniques such as a zoom mouse to access small features adds latency and does not result in a natural control experience.


Gaze-based select is very common within Google Cardboard VR experiences. I suggest setting the browser display zoom to 250% to increase icon size. While gaze-based select isn’t supported natively in the editor, it would be an ideal browser extension. I had a look but couldn’t find any. I did find this, though:

It might work. Anyway, have a go with the editor with browser zoom enabled to see what works. If you have specific suggestions for the CoSpaces Edu developers, you can add these as a feature request (tag the post). Be specific and prioritize your requests so the devs know which are the most important and effective.

Many thanks,
Geoff @ TechLeap