Thanks!

Thank you to Gregory Taschuk for his 20 EUR donation to the EyeSkills PayPal Pool! The pool has already paid for a simple 3D printer to help prototype parts for the eyetracktive.org headset.

If we can get another 150 EUR into the pool in the next six days (bringing it to 500 EUR), I would buy this dirt-cheap (but robust) laser diode cutter, which is on special offer until the 15th of April. It is only 2.5 W, but that’s plenty for cutting card, which is what we need to work with to prototype improved designs for the eyetracktive.org core – the part that folds up to hold the eye-tracking cameras and USB hub. Having it here “in-house” would be a major motivator and speed-up.

General Progress

The documentation for EyeTracktive.org has been inching along in “after the kids go to bed” mode – but finally, the parts have come together. Check it out and see what you think!

On the EyeSkills software front, I have extended the 2D/3D switching abilities of the EyeSkills camera rig and expanded the “spoken UI” framework I’m building so that it covers both scenarios equally well. It’s getting quite exciting, and we’re approaching the point where we can start re-fashioning the individual experiences and the overall user experience.

The first vertical slice is ready. It takes the user from first launching the app, through a basic introduction, into a 3D scene, and gives them a first experience of what they perceive when each eye is presented with a different, contrasting pattern. Although it doesn’t sound like much, behind the scenes are many mechanisms (for handling data, controlling flow, managing user selection, scene switching, etc.) which will be reused throughout the app.
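For the curious, here is a minimal sketch of the kind of flow-control and scene-switching machinery I mean. It is purely illustrative (in Python, which the app is not written in, and every name below is hypothetical): each step loads a scene, issues a spoken prompt, and routes the user’s selection to the next step.

    from dataclasses import dataclass
    from typing import Callable, Dict, Optional

    @dataclass
    class Step:
        scene: str                              # which scene to switch to
        prompt: str                             # what the spoken UI should say
        route: Callable[[str], Optional[str]]   # user selection -> next step name

    def run_flow(steps: Dict[str, Step], start: str,
                 get_selection: Callable[[str], str]) -> None:
        name: Optional[str] = start
        while name is not None:
            step = steps[name]
            print(f"[scene] switching to {step.scene}")    # stand-in for scene switching
            print(f"[voice] {step.prompt}")                # stand-in for the spoken UI
            name = step.route(get_selection(step.prompt))  # stand-in for user selection

    # The vertical slice, expressed as data: launch -> introduction -> 3D scene.
    slice_steps = {
        "launch": Step("LaunchScene", "Welcome to EyeSkills.", lambda s: "intro"),
        "intro": Step("IntroScene", "Say 'ready' to continue.",
                      lambda s: "rivalry" if s == "ready" else "intro"),
        "rivalry": Step("Rivalry3DScene",
                        "Each eye now sees a different pattern. What do you perceive?",
                        lambda s: None),
    }

    run_flow(slice_steps, "launch", input)

The point of the sketch is the shape: everything interesting lives in a table of steps, which is what lets the same flow-control, scene-switching, and selection machinery be reused for each new experience.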

Next, I’m focusing on branching, on the basis of what the user perceives, into different paths for establishing eye misalignment (we need to do this differently for alternating and non-alternating types of strabismus).
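In the same illustrative spirit as above (hypothetical names, not EyeSkills terminology), that branching could be as simple as mapping the reported percept to a path:

    def choose_misalignment_path(percept: str) -> str:
        # Map what the user reports perceiving to the next measurement path.
        if percept == "images_alternate":      # suppression swaps between eyes
            return "alternating_strabismus_path"
        if percept == "one_image_only":        # one eye is consistently suppressed
            return "non_alternating_strabismus_path"
        if percept == "both_images":           # both patterns visible at once
            return "binocular_baseline_path"
        return "repeat_introduction"           # unclear answer: revisit the intro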

We have 5 (!) wonderfully talented UX/UI volunteers interested in helping, but perhaps we need somebody skilled at coordination to help facilitate that team. Anybody interested? Get in touch!

Community News

Congratulations to Julia on her doctoral thesis defence! She showed that strabismus surgery has no impact whatsoever on muscular coordination in children. With luck, we will be conducting some trials with her research group on the impact of EyeSkills – but that is still a relatively long road. It starts with the ethics commission, and Julia has very *very* kindly offered to back up my risk analysis with a literature review and to work on the ethics document.

We also have a new full-time member of the team! Meet Monique: https://www.amazon.de/gp/product/B07CYHBW9G/ref=ppx_od_dt_b_asin_title_s00?ie=UTF8&psc=1. She’s helping us test different IR emitter positions, headset fit, and camera angles. She’s impressively patient, and doesn’t groan at my jokes. Thanks for the donations that are paying her wages!

Funding/Competitions – Would you like to work on EyeSkills and get paid?

Here are two interesting opportunities:

The Open Knowledge Foundation in Germany is “providing a number of mini-grants of $5,000 to support individuals or organisations in developing an open tool for reproducible science or research built using the Frictionless Data specifications and software.” Are you a data-science person? Would you be interested in putting in some months of work funded by this grant? We need to submit an application quickly! Get in touch! https://toolfund.frictionlessdata.io/

Hackaday is running an interesting “hardware design contest focused on product development. DesignLab connects you to engineers, expert mentors, and other powerful resources to take your product from concept to DFM.” Our eyetracktive.org work is perfect for this. Do you want to help form a team around taking the prototype to production?

https://prize.supplyframe.com/

Let me know if these newsletters are too long or too short. I could talk about much more, but I figured I’d stick to the most pressing or interesting items for this one!

Ben

Would you like to beta test EyeSkills* or just follow what we are doing?