EyeSkills Newsletter – Design. Code. Refactor. Rinse. Repeat. 09.04.2019

Thanks!

Thank you to Gregory Taschuk for his 20 EUR donation to the EyeSkills PayPal Pool!  The pool has already paid for a simple 3D printer to help prototype parts for the eyetracktive.org headset.

If we can get another 150 EUR into the pool in the next six days (bringing it to 500 EUR), I would buy this dirt-cheap (but robust) laser diode cutter, which is on special offer until the 15th of April.  It is only 2.5 W, but that’s plenty for cutting card – which is what we need to work with to prototype improved designs for the eyetracktive.org core that folds up to hold the eye-tracking cameras and USB hub.  Having it “in-house” would be a major motivator and speed-up.

Continue reading “EyeSkills Newsletter – Design. Code. Refactor. Rinse. Repeat. 09.04.2019”

EyeSkills Newsletter – Prototyping Madness 20.03.2019

Hiya!

So much has happened in the last two weeks that I barely know where to begin… but I must begin with a wave of gratitude.

First of all, I would like to express my heartfelt gratitude to Holger Hahn and Andreas Freund, who have both donated to the project (https://www.paypal.com/pools/c/8byPUuuQ1D).  I’m utterly blown away.

Thanks to these donations I have built up a cheap Ender 3 Pro (cheap, but with quite astounding print quality), which has already been massively helpful in speeding up prototyping.

The emerging hardware team has also been spending a fair amount privately, buying and testing different endoscopic cameras and nano/micro USB hubs (more on that in a bit), so this support will help us cover those costs (and upcoming ones). Again, thank you.  It’s so inspiring to receive energy coming back into the project.

Secondly, I would like to express my deepest respect and thanks to (left to right in the picture below) Johann, Rene, (Iana, who sadly became ill on the first weekend), Flo, Andre, Asieh and Cong… and the main organisers Cong, Isabelle and Daniel, for their incredible efforts over the last three weekends of the Berlin Hackademy.  When I got back on Sunday from the final weekend, I didn’t get back out of bed until Tuesday – there was just nothing left in the tank. It’s been intense, but worth every Joule.  The team feels like family, and what we’ve built in such a short space of time is really something to be proud of.

The headset is, in and of itself, something that deserves its own crowdfunding.  The world needs an ultra-low-cost eye-tracking solution which isn’t just on paper, but which is actually being used and developed.  I hope we can do this from within the EyeSkills project, as eye tracking is critical to enabling us to operate safely and effectively, whilst generating the quantitative evidence we need to modernise medical approaches to Lazy Eye.

Here are some more notes on the second weekend if you’re interested 🙂

Here is also a quick look at a video we put together covering the output of the project, along with a website (which I’m still trying to complete as I find an hour here or there) where the open-source, open-hardware designs are available for you to download – https://eyetracktive.org.

Thirdly, I would like to welcome four new volunteers to https://chat.eyeskills.org.  Handling data – balancing respect for privacy and security with benefits for the whole community – is at the core of the project.  Our new volunteer Martin lives and breathes these concerns.  I’m very happy to welcome his voice to the community.

Making EyeSkills really usable (moving it away from an experimental platform which is hard to understand at first) is my main focus for the coming few months, and that requires input on the User Experience and User Interaction side of the system.  Flo (from the Hackademy) has offered to keep an eye on the process, while Guneet from India and Ant from the UK are both getting more actively involved.  I’m very grateful for their more expert input.  Rework is so time-consuming that I hope we can make fewer mistakes and reach a really good experience more quickly than would otherwise be possible.

On the privacy front – as some of you may know, I made a big effort at the start to host almost everything we use ourselves, from RocketChat and GitLab to the website, Sendy and so on.  When the CCC talk suddenly generated the resonance it did, however, I needed to respond quickly to set up some kind of volunteering form and a project-specific email address (I just wasn’t prepared at all!).  I did this quickly with a Google Form and set up a Google email address.  These are both quite secure – from everybody except Google – which raises the question: how much do you trust Google?

Hosting our own mail server for the core team (i.e. email addresses which end in eyeskills.org), for instance, is a non-trivial thing to do.  I have had offers from within the community to do this, but I worry about maintenance and all the associated potential problems with blacklisting, spam and so on.

As far as I see it, we have three choices: keep using Google, self-host, or use a secure email provider.  If we use a secure email provider, it needs to be paid for each month.  I think this is a question which I would like *you* to answer.  Please indicate what you would prefer for now:

https://app.tomvote.com/answer/41a930ba1552a7f6fda6ffa65c8dbd15

You will be asked whether you would prefer us to use a paid email service, switch to self-hosting, or keep using Google email. Remember, this is about what the core team will use to communicate with you, not about what you have to use personally.

Right now I’m busy refactoring the Framework for a better experience (thanks again to the amazing prototypefund.de for their support!), although I’m *very* sorry that I don’t have a build ready to show yet.  I ran into a few technical blockers (like this one) and overestimated how much time I’d have with the Hackademy running in parallel 🙁  Nevertheless, there will be something soon; progress is being made one step at a time.  Before you know it 🙂 we’ll need input from the eye-tracking cameras in the headset… but there is still a lot to do there.

Our amazing electronics expert Moritz (who, it turns out, has superpowers when it comes to soldering things so small you can barely see them with the naked eye):

…is taking charge of harassing Chinese camera manufacturers for more detailed camera specifications and quotes for parts, because the amazing Rene, Andre and Johann have discovered, in their deep dives into the Android USB layer, that the image formats supported by different chips are critical to whether or not we can get two cameras working simultaneously.

I also want to give a special shout-out to Rene for putting aside three weekends back-to-back, away from his family, on top of an incredibly stressful managerial day job.  He’s seriously determined.

When we get far enough to have the first working prototypes, I will call out to you for TEN alpha-testers.  We will ask you to cover the raw costs per headset (around 100 EUR, as they are based on “samples” with high shipping and unit costs), and if you would like to offer us something for our time that would also be appreciated.  HOWEVER, if you do sign up for this, it will be on condition that you take it really seriously – that you use the system every day for at least a month, and give the most detailed and considered feedback you can about the performance of the system, where it is weak, and what you think we could improve.  We want *real* testers 🙂

We still have a way to go, but I’ll circle back around to this when the time is right.

I’m sure there are other things I’ve forgotten,  but I cannot resist the urge to get coding any longer, so ciao for now… and again, thank you for being here!

Ben

Struggling with a serialization error?!?

For some reason, I can run my current Unity spike on the laptop, but not on the phone.  I finally managed to find some indication of what might be going wrong by attaching the debugger to the app on the phone (having built a development version with script debugging enabled) and restarting it.

Conclusion: assembly definitions break the traditional use of the “Editor folder” convention. This (invisibly) causes broken builds, and is a PITA when you have dependencies on third-party plugins which use editor scripts.
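In other words (a rough sketch, with hypothetical class names): under the old convention, anything inside an “Editor” folder was quietly compiled into an editor-only assembly and left out of player builds; once a folder tree gets its own assembly definition, that special treatment stops, so editor-only code needs an explicit guard to stay out of the player:

```csharp
using UnityEngine;
#if UNITY_EDITOR
using UnityEditor;   // only exists in the Editor, never in a player build
#endif

// Hypothetical component, purely for illustration.
public class EyeCalibrationMarker : MonoBehaviour
{
    public float offsetDegrees;

#if UNITY_EDITOR
    // Editor-only helper: compiled out of device builds, so placing this file
    // inside an assembly definition no longer silently drags UnityEditor
    // (and a broken build) into the player.
    [MenuItem("EyeSkills/Log Selected Marker Offset")]
    static void LogSelectedOffset()
    {
        if (Selection.activeGameObject == null) return;
        var marker = Selection.activeGameObject.GetComponent<EyeCalibrationMarker>();
        if (marker != null) Debug.Log("Offset: " + marker.offsetDegrees);
    }
#endif
}
```

The cleaner alternative, for code you control, is to give the Editor folder its own editor-only assembly definition – which is essentially what the old convention did for you implicitly.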

I get errors like:

Continue reading “Struggling with a serialization error?!?”

The second Hackademy weekend.

The energy, the passion and the determination of this team never cease to amaze me. More amazing progress.

Let’s start talking about hardware:

My focus was on continuing the work to produce a new central component for Google Cardboard (v2) capable of supporting eye tracking.

This is an intermediate design – laser cut from card, with one of the first 3D printed custom camera holders mounted and containing the internals of an endoscopic camera.

Laser cutters are a *lot* of fun!

Here’s a closer look at the holder –

A later iteration has mounting slots and a redesign of the cardboard so that (with the help of a 3D printed tool) the mounts are always installed in the right direction and in the right location.  The channel/slot running lengthways up the middle is there to allow heat from the metal backplate of the camera and PCB to escape.

Designing these parts in OpenSCAD (they are fully parameterised so we can cope with different camera parts or headset designs) was a bit of a headscratcher at times.  It took several iterations and alternative designs until I was happy.

…but I’m actually really pleased with the OpenSCAD description.  I built it so that it not only shows the flattened cut, but also applies all the correct transformations to show what it would look like folded – very helpful when seeing what works best!

At the same time, Moritz was busy working out how to remove the IR filters from the camera. He ended up making a custom tool to unscrew the camera lens assembly, and had to break out the IR filters using brute force:

He also custom shortened and re-soldered the camera cables so we could mockup the prototype, and added an IR light source to replace the LED rings on the original cameras – resulting in our first IR shot of an eye!

Of course, we only managed to get that result because of the hard work put in by Johann, Andre and Rene (left to right) on interfacing Android with those cameras. The two main sticking points remaining are how to address both cameras simultaneously, and how to integrate OpenCV with Unity.

At the same time, what use is all this when a Google Cardboard is so uncomfortable and poorly fitting, letting in light and distracting from what needs to be a concentrated experience?

Asieh’s incredibly intuitive and empathic design sense, coupled with Cong’s ability to synthesise many (often divergent) requirements, resulted in what I think is a truly inspiring solution:

This is an add-on to a GCV2 which is amazingly comfortable, fits all heads, and blocks all light!

I also liked this “Star Wars”-style prototype:

So, with hardware and ergonomics covered, we come to the other essential component which makes for a good experience… the User Experience!  Flo (sadly, no picture?!?!) has an inspirational mind in a super-relaxed soul.

After many boards full of analysis:

he drew this little cartoon – “Eh?!? Where are you buggering off to?!?” – which connected with old ideas I’d had from a storytelling perspective many years ago. Together, this is going to form the foundation for how we tell the EyeSkills story and build the experience!

It’s been fourteen days straight with a level of intensity that is barely sustainable, but it’s been worth it.  I’m looking forward to the third and final event this coming weekend, and hope that we can all get our tasks finished in time this week!

 

A quick note on Unit Testing in Unity

The Unity/VR learning curve hadn’t left me space to tackle unit testing until now. Well, it had, but my initial encounter was so bad that I decided to leave it until I had a little more time to look again.

Well, at the moment I’m building out the basic structures for handling a more fluid and complete user experience, so a reliable structure and a repeatable experience are essential – which means it’s time for Test-Driven Development.

I may extend this post as time goes by with tips and tricks as I encounter them, but first off – unit testing won’t work unless you explicitly create assembly definitions for the relevant parts of a project.  In this case:

I needed to create a “Hackademy” assembly for the spike (pre-prototype code) I’m developing, which references the “EyeSkills” assembly (so it can find the relevant EyeSkills framework classes). The “Tests” assembly then references the “Hackademy” assembly (so the tests can find my scripts to test).  It was also necessary to explicitly reference the “EyeSkills” assembly from the “Tests” assembly so I could create mocks against interfaces within the Framework.
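To make that chain concrete, here is a minimal sketch of what a test in the “Tests” assembly can then see (all type names are hypothetical – only the assembly layout mirrors the description above; the fake is hand-rolled rather than generated by a mocking library):

```csharp
using NUnit.Framework;   // Unity's test runner is built on NUnit
using EyeSkills;         // visible because "Tests" references the "EyeSkills" assembly
using Hackademy;         // visible because "Tests" references the "Hackademy" assembly

namespace Tests
{
    // Hand-rolled test double for a (hypothetical) Framework interface.
    class FakeCameraRig : ICameraRig
    {
        public bool Started { get; private set; }
        public void StartTracking() { Started = true; }
    }

    public class ExperienceFlowTests
    {
        [Test]
        public void BeginningTheFlow_StartsEyeTracking()
        {
            var rig  = new FakeCameraRig();
            var flow = new ExperienceFlow(rig);   // (hypothetical) spike class from "Hackademy"

            flow.Begin();

            Assert.IsTrue(rig.Started);
        }
    }
}
```

Because “Tests” references both assemblies, the fake can implement a Framework interface while the test drives a Hackademy class – which is exactly the chain described above.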

It’s also worth pointing out that, despite running from within the context of a unit test and within the EyeSkills namespace, any classes doing dynamic loading from within the Framework will fail to find classes which exist only in the testing assembly. You need to move them into the same assembly that will be looking for them. A bit weak, really.
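A small illustration of why (assuming, purely for the sake of the sketch, that the Framework’s dynamic loading goes through Type.GetType – the type names are again hypothetical):

```csharp
using System;
using UnityEngine;

// Imagine this running inside the "EyeSkills" (Framework) assembly.
public class DynamicLoadingSketch : MonoBehaviour
{
    void Start()
    {
        // Type.GetType(string) only searches the calling assembly and mscorlib,
        // so a type that lives only in the "Tests" assembly is never found here.
        Debug.Log(Type.GetType("Tests.FakeCameraRig"));          // null

        // An assembly-qualified name would resolve it (if "Tests" is loaded),
        // but then Framework code names the test assembly explicitly – hence
        // moving the class into the assembly doing the lookup instead.
        Debug.Log(Type.GetType("Tests.FakeCameraRig, Tests"));
    }
}
```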

Annoying. Clunky. Poorly documented.

As usual, Unity’s IDE also failed to keep track of alterations to the assembly definition files (I needed to delete and recreate the Framework folder), causing a terrible mess which was only fixed after semi-randomly deleting .meta files and several restarts of Unity.  The IDE has now reached a level of software quality where it is almost inevitably a buy-out target for Microsoft.

For all my occasionally deep dissatisfaction, however, when Unity works it works well, handles every situation imaginable, and does get the job done.  It’s not perfect, but then, perfect is the enemy of the good!

Newsletter – Happy Hacking – 04.03.2019

It’s been a few weeks since the last newsletter, but that doesn’t mean nothing has been happening.

Progress on eye tracking

First, I did some 3D printing with Moritz to retrofit an existing headset with mounts to hold our endoscopic cameras – this worked well…

https://www.eyeskills.org/another-iteration-of-our-eye-observation-open-hardware/

In parallel, Rene has managed to get the cameras working quite reliably in various versions of Android – quite a relief!

At the Hackademy it became clear to me, however, that we should and could start at an even lower and more accessible price point – by hacking the standard Google Cardboard V2 design to include a new “component” at its heart! https://www.eyeskills.org/hackademy-hacking-google-cardboard-v2/

Progress on UX

Our ultimate goal is to make EyeSkills usable and affordable for the world. It should educate and inform just as much as it promotes experience and training.  We made considerable progress identifying where to start… and, I hope, we will be able to start getting bi-weekly prototypes out to those of you who have registered for beta-testing, to get your feedback in these critical early stages.

Development

Even if it was only a single line, it is a milestone.  The first code merge has happened!  Thanks for your input Rene!

Funding

At a personal level, I’m still unsure how best to move EyeSkills towards an economically self-sustaining global existence without sacrificing its principles to short-term thinking.  My last round of contacts with traditional funders has convinced me that we will have to do things very differently indeed.

As the community continues to show an interest, and I look at the skills we have, I begin to think that we really could do this on our own.

At this point, I would like to give a heartfelt shout-out and a thank you to Bradley Lane, who donated 50 EUR towards our MoneyPool:

https://www.paypal.com/pools/c/8byPUuuQ1D

I am very grateful for your gesture of support.

Interesting meetings and new people

We have a big data specialist waiting for our go-ahead to get his hands dirty as a volunteer! He has decades of experience at the coal-face of an internationally renowned cloud company, and I think his input will be invaluable. More on this in the coming months…! 🙂

We also met with a leading neuroscientist from the Charité in Berlin – which led to some very interesting discussions on how to handle issues such as detached retinas.

… welcome to other new members of the community – I hope I’ll be introducing you soon 😉

Have a lovely day!

 

Hackademy – hacking Google Cardboard V2

Rather than implementing a new VR Headset for EyeSkills, could we push the price-point down even further, by hacking the standard Google Cardboard V2 design?  That’s one of the questions we explored last weekend at Careable’s first Hackademy!

The standard design has a capacitive button which touches the screen, acting as user input, when the user depresses an “origami” lever on the upper right side:

In the manufacturer’s schematics it becomes clear that this is a separate unit…  so after pulling it apart…

…we figured, why not replace this with a new core capable of supporting on-device eye tracking?  Here’s the first mockup.  The tabs at the bottom represent the cameras, the fuzzy tubes are cables, the yellow foam blocks are the USB connectors, and the purple block is the USB hub (all to the correct size).

This could be folded back into the existing headset to give us just what we need!

I’m now moving towards programming a parametric model of this interior component which we can prototype using the Hacker Space’s laser cutter.  Very much looking forward to going back there this weekend!!!

At the same time, other members of the group have been exploring how to modify a standard V2 cardboard to both improve the ergonomics and reduce stray light entering the viewing area (after all, people’s foreheads and noses vary more than you might realise until you start looking closely!!!).

Here we are in Potsdam’s wonderful little Maker Space…

And here are some of the wonderful team explaining what we’re doing to members of other teams at the Hackademy….

It was an inspiring weekend, but 7am starts and getting to bed at midnight, after solid work three days in a row, have taken their toll a little!

Are we Eyetracktive enough?

Super busy day! It started with the kick-off of the second Prototype Fund round:

Followed by a journey down to Potsdam for the kick-off of the Hackademy!  Go team Eyetracktive!

Another iteration of our “eye observation” open-hardware

Our goal is to get objective feedback about eye position and behaviour, which requires some sort of eye tracking.  A decent eye-tracking headset costs anywhere between $400 and $10,000… which is just too much for the majority of the world, who earn less than $10 a day. So, let’s make it affordable!

Continue reading “Another iteration of our “eye observation” open-hardware”