EyeSkills is already a complex piece of software, with many thousands of hours poured into its design, execution and testing. So far, most of these costs have been met privately – after all, this started out as a project to help my son. In the past six months we have received generous funding from the Prototype Fund, which regards EyeSkills as one of its “lighthouse projects, amongst the most visionary and well-executed projects we have had”.
For the next two days I’ll be in Dresden, promoting EyeSkills at Bionection.
Our presentation will be in Panel 4 (Smart Medical Devices) alongside some other very interesting speakers (https://www.bionection.com/en/program).
As you know by now, EyeSkills is hosted on BitBucket as a git repository. Git can be pretty confusing if you haven’t had a few years’ experience with it, but this might help: a great set of “Flight Rules” for dealing with common (and not-so-common) situations!
This is a quick overview of what the EyeSkills framework contains.
Here is a first attempt to explain our approach with the help of a simple video model. If you have ideas about how to refine and improve it – or could even do so yourself – please let us know!
What we won’t manage in this iteration are further extensions to the eye straightening environments.
*) We then need to start pushing the visual system to improve coordination with the eyes, so we begin a random continual displacement of the fused scene, to force the eyes to maintain fusion whilst simultaneously tracking the scene in view.
*) After this, we start altering the perceived distance of the scene (scaling)…
*) and introduce instabilities in each eye (almost imperceptible losses of signal, which increase in length and frequency) to get the mind used to dealing with instability.
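The three stages above can be sketched as a simple scheduler. This is purely illustrative – the class and parameter names are invented here, and the real EyeSkills framework is built in Unity, not Python – but it shows the shape of the idea: random scene displacement, slow scaling of perceived distance, and per-eye signal dropouts that escalate over time.

```python
import math
import random

class FusionChallengeScheduler:
    """Hypothetical sketch of the progressive fusion-training stages.
    All names and numbers are illustrative, not the actual EyeSkills
    implementation."""

    def __init__(self, seed=None):
        self.rng = random.Random(seed)
        self.dropout_length = 0.05   # seconds of signal loss; grows over time
        self.dropout_rate = 0.01     # probability of a dropout per frame

    def scene_displacement(self, max_offset=0.02):
        # Stage 1: random continual displacement of the fused scene,
        # forcing the eyes to track the scene while maintaining fusion.
        return (self.rng.uniform(-max_offset, max_offset),
                self.rng.uniform(-max_offset, max_offset))

    def scene_scale(self, t, period=10.0, depth=0.1):
        # Stage 2: slowly alter perceived distance by scaling the scene.
        return 1.0 + depth * math.sin(2 * math.pi * t / period)

    def eye_dropout(self):
        # Stage 3: almost imperceptible, independent losses of signal
        # in each eye (never both at once, so fusion can recover).
        drop_left = self.rng.random() < self.dropout_rate
        drop_right = (not drop_left) and self.rng.random() < self.dropout_rate
        return drop_left, drop_right

    def escalate(self, length_step=0.01, rate_step=0.001):
        # Make the instabilities longer and more frequent over a session.
        self.dropout_length += length_step
        self.dropout_rate = min(0.2, self.dropout_rate + rate_step)
```

In a real implementation these values would be driven per-frame by the engine’s update loop and tuned per patient.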
I look forward to implementing all of these!
This was intended as a quick test of new features, but it generated some very interesting ideas and insights. Here, we describe the order of the scenes tested and the insights gained, and finally draw conclusions. Cliff has alternating strabismus.
So, here I am in Seoul, having spent a week in Wonju promoting EyeSkills courtesy of BioSaxony e.V. and WMIT, along with a host of scientists from the Fraunhofer Institute. It’s been a very intense week, with interest from a local hospital, LG and a manufacturer of medical prototypes.
To understand most of the code in EyeSkills, you’ll need to get a grasp of Euler angles and Quaternions.
Here is a good place to get a first feel and move on to other resources!
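To make the relationship concrete, here is a minimal sketch of converting Euler angles to a quaternion, using the common intrinsic Z-Y-X (yaw, pitch, roll) convention. Note that axis order and units are convention choices that vary between engines – Unity, for instance, applies rotations in Z-X-Y order and works in degrees – so treat this as a reference for the maths rather than a drop-in for any particular engine.

```python
import math

def euler_to_quaternion(yaw, pitch, roll):
    """Convert intrinsic Z-Y-X Euler angles (radians) to a unit
    quaternion (w, x, y, z). Standard aerospace-style formula;
    other conventions will give different component orderings."""
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)

    w = cr * cp * cy + sr * sp * sy
    x = sr * cp * cy - cr * sp * sy
    y = cr * sp * cy + sr * cp * sy
    z = cr * cp * sy - sr * sp * cy
    return (w, x, y, z)

# No rotation maps to the identity quaternion (1, 0, 0, 0);
# a 90° yaw maps to (√0.5, 0, 0, √0.5).
```

The practical upshot: quaternions avoid the gimbal lock you hit when composing Euler rotations, which is why engine code (including the EyeSkills camera and eye-rotation logic) works with them internally even though Euler angles are easier to read.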
A collection of interesting videos about accommodation/vergence issues in VR headsets, and how to work around them using light fields and binocular suppression (monovision!).