Here is a description of the lecture I’ll be giving:
We mostly see with the mind, and the mind is flexible. For the four hundred million people with Lazy Eye, their brain encountered an installation error when linking both eyes as babies. As a Plan B, their brain switched one eye off. I’ll talk a bit about how the visual system works, and how our open-source virtual reality software (backed by Social Impact Lab Leipzig and the prototypefund.de) can hack through that suppression and provide a chance to “re-install” full sight with two eyes.
By providing an open set of tools for creating comparable experiments, our goal is not just to offer a tool, or even a set of tools for building more tools, but to lay the basis for one of the world’s largest open-science experiments.
Nobody claims to have predictive scientific models of how the visual system works in its entirety, which means there is still so much more to discover. In the case of Lazy Eye, some aspects of the visual system are deactivated or dormant. What we can do is comparatively explore which techniques and approaches have which effects on re-opening visual perception, and thereby drive our understanding of the system forward on both a theoretical and a practical level.
If you’d like to know more, check out www.eyeskills.org and come along to this talk 🙂
EyeSkills is already a complex piece of software, with many thousands of hours poured into its design, execution and testing. So far, most of these costs have been met privately – after all, this started out as a project to help my son. In the past six months we have received generous funding from the Prototype Fund, which regards EyeSkills as one of its “lighthouse projects, amongst the most visionary and well-executed projects we have had”.
As you know by now, EyeSkills is hosted on BitBucket as a git repository. Git can be pretty confusing if you haven’t had a few years’ experience with it, but this might help: a great set of “Flight Rules” for dealing with common (and not so common) situations!
What we won’t manage in this iteration are further extensions to the eye-straightening environments.
*) We then need to start pushing the visual system to improve coordination of the eyes, so we begin randomly and continually displacing the fused scene, forcing the eyes to maintain fusion whilst simultaneously tracking the scene in view.
*) After this, we start altering the perceived distance of the scene (scaling)…
*) and introduce instabilities in each eye (almost imperceptible losses of signal which increase in length and frequency) to get the mind used to dealing with instability.
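The progression above could be sketched roughly as follows. This is a minimal illustration in Python rather than the project’s own Unity code; all function names and parameter values here are hypothetical, chosen only to show the idea of a random scene displacement plus drop-outs that grow in length and frequency.

```python
import random

def random_displacement(rng, max_offset=0.02):
    """Small random shift of the fused scene (in normalised screen units),
    forcing both eyes to track the target in order to maintain fusion.
    max_offset is an illustrative value, not taken from EyeSkills."""
    return (rng.uniform(-max_offset, max_offset),
            rng.uniform(-max_offset, max_offset))

def instability_schedule(steps, start_len=0.05, start_rate=0.1, growth=1.1):
    """Per-eye signal drop-outs whose length (seconds) and per-frame
    probability both increase over the session, so the mind gets used
    to dealing with instability. Returns (length, rate) pairs."""
    length, rate = start_len, start_rate
    schedule = []
    for _ in range(steps):
        schedule.append((round(length, 3), round(rate, 3)))
        length *= growth                      # drop-outs get longer...
        rate = min(1.0, rate * growth)        # ...and more frequent
    return schedule
```

In a real scene loop, `random_displacement` would nudge the camera rig each frame, while `instability_schedule` would drive when and for how long one eye’s render is blanked.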
EyeSkills Feature demonstration / user test with Cliff W – 02.09.2018
This was intended as a quick test of new features, but it generated some very interesting ideas and insights. Here, we describe the order of the scenes tested, the insights gained, and finally draw conclusions. Cliff has alternating strabismus.
So, here I am in Seoul, having spent a week in Wonju promoting EyeSkills courtesy of BioSaxony e.V. and WMIT, along with a host of scientists from the Fraunhofer Institute. It’s been a very intense week, with interest from a local hospital, LG and a manufacturer of medical prototypes.