A quick note on Unit Testing in Unity

The Unity/VR learning curve hadn’t left me space to tackle unit testing until now. Well, it had, but my initial encounter was so bad that I decided to leave it until I had a little more time to look again.

At the moment I’m building out the basic structures for a more fluid and complete user experience, where reliability and repeatability are essential – so it’s time for Test Driven Development.
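For orientation, this is the shape of a test the Unity Test Runner picks up (it’s NUnit underneath). The class under test here is a stand-in, not an EyeSkills type:

```csharp
using NUnit.Framework;

// Stand-in for something you'd actually want to test.
public class FakeCalibration
{
    public bool IsCalibrated { get; private set; }

    public void Complete()
    {
        IsCalibrated = true;
    }
}

public class CalibrationTests
{
    [Test]
    public void Calibration_StartsIncomplete()
    {
        Assert.IsFalse(new FakeCalibration().IsCalibrated);
    }

    [Test]
    public void Complete_MarksCalibrated()
    {
        var calibration = new FakeCalibration();
        calibration.Complete();
        Assert.IsTrue(calibration.IsCalibrated);
    }
}
```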

I may extend this post with tips and tricks as I encounter them, but first off – unit testing won’t work unless you explicitly create assemblies for the relevant parts of a project. In this case:

I needed to create a “Hackademy” assembly for the spike (pre-prototype code) I’m developing, which references the “EyeSkills” assembly so it can find the relevant EyeSkills framework classes. The “Tests” assembly then references “Hackademy”, so the tests can find my scripts. It was also necessary to reference the “EyeSkills” assembly explicitly from “Tests”, so I could create Mocks against interfaces within the Framework.
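Concretely, the “Tests” assembly definition (.asmdef) ends up looking something like this sketch – in 2018/2019-era Unity the “TestAssemblies” flag is what marks it for the Test Runner (contents assumed from memory, so check against your editor):

```json
{
    "name": "Tests",
    "references": [ "Hackademy", "EyeSkills" ],
    "optionalUnityReferences": [ "TestAssemblies" ]
}
```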

It’s also worth pointing out that, despite running within the context of a unit test and within the EyeSkills namespace, any Framework classes doing dynamic loading will fail to find classes that exist only in the testing assembly – a type lookup by name only searches the assembly making the call, not yours. You need to move such classes into the same assembly which will be looking for them. A bit weak really.
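A minimal sketch of the pitfall, assuming the Framework resolves types by name with Type.GetType (the class and names here are illustrative):

```csharp
using System;

public static class DynamicLoader
{
    public static Type Resolve(string className)
    {
        // Type.GetType(string) only searches the calling assembly and
        // mscorlib, so when this runs inside the EyeSkills assembly it
        // returns null for a type that lives only in the Tests assembly.
        Type t = Type.GetType(className);

        // An assembly-qualified name does cross assemblies, at the cost
        // of hard-coding the assembly name:
        // Type t = Type.GetType("EyeSkills.Tests.MyFake, Tests");

        return t;
    }
}
```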

Annoying. Clunky. Poorly documented.

As usual, Unity’s IDE also failed to keep track of alterations to the assembly files (I needed to delete and recreate the Framework folder), causing a terrible mess which was only fixed after semi-randomly deleting .meta files and several restarts of Unity.  The IDE has now reached a level of software quality where it is almost inevitably a buy-out target for Microsoft.

For all my occasionally deep dissatisfaction, however, when Unity works it works well, handles every situation imaginable, and does get the job done.  It’s not perfect, but then, perfect is the enemy of the good!

Struggling with a serialization error?!?

For some reason, I can run my current Unity spike on the laptop, but not on the phone.  I finally managed to find some indication of what might be going wrong by attaching the debugger to the process on the phone (having built a development version with script debugging) and restarting the app.

Conclusion: Assemblies break the traditional use of the “Editor folder” convention. This (invisibly) causes broken builds, and is a PITA when you have dependencies on third-party plugins which use editor scripts.
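As far as I can tell, the cure is to give editor-only code its own assembly definition restricted to the Editor platform, so it stays out of player builds – a sketch, with an illustrative assembly name:

```json
{
    "name": "EyeSkills.Editor",
    "references": [ "EyeSkills" ],
    "includePlatforms": [ "Editor" ]
}
```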

I get errors like:

Continue reading “Struggling with a serialization error?!?”

Tales From the Trenches – Upgrading Unity 2018 to 2019. Where’d the TrackedPoseDriver go?!?

I decided it was time to move onto the 2019 version of Unity.

I tried upgrading the project automatically and was immediately met with a compile error:

“The type or namespace name ‘SpatialTracking’ does not exist in the namespace ‘UnityEngine’ (are you missing an assembly reference?)”.

This in turn meant that the TrackedPoseDriver wasn’t being found.
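For what it’s worth, my understanding is that in 2019 the SpatialTracking types (TrackedPoseDriver included) moved out of the engine core into the “XR Legacy Input Helpers” package, which would mean the project needs an entry like this in Packages/manifest.json (the version number is a guess – take whatever the Package Manager offers):

```json
{
    "dependencies": {
        "com.unity.xr.legacyinputhelpers": "2.1.4"
    }
}
```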

Continue reading “Tales From the Trenches – Upgrading Unity 2018 to 2019. Where’d the TrackedPoseDriver go?!?”

Why it is tricky to use a VR Reticle as an input mechanism

As long as we don’t know where each individual eye is looking, a reticle may appear to be in two places at once, or one view of the reticle may be suppressed.

In some cases this may be acceptable, when the user has the option to close one eye of their choice and position the reticle to make a simple selection.  Otherwise, a reticle is only a viable choice of input *after* we have achieved fusion between the eyes.

Notes on Unity Animation

This wasn’t a bad starting point: https://www.youtube.com/watch?v=vPgS6RsLIjk

It’s important to remember, when you’ve created a sprite, that you need to add a SpriteSkin. Sometimes it fails to automatically detect the bones in your sprite but, so far, that’s been simple to solve: make a few minor changes to the sprite, reapply, and then the “CreateBones” button in the SpriteSkin works.  If you have an existing animation, you can drag and drop it onto the sprite. Next step – animation transitions.

In the Animation pane you can create a new animation from the drop-down, but to create links between those animations you’ll need to make sure the Animator window is visible (Window->Animation->Animator). There you can make links between the various states (https://www.youtube.com/watch?v=HVCsg_62xYw).  How can we have those state transitions occur without scripting? It turns out that the transitions already happen – but you need to “Play” a scene containing the model.
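For the record, driving a transition from script is also only a few lines – a sketch assuming an Animator on the same object and a Trigger parameter named “Blink” wired to a transition (both names are illustrative):

```csharp
using UnityEngine;

[RequireComponent(typeof(Animator))]
public class MascotAnimationDriver : MonoBehaviour
{
    private Animator animator;

    void Awake()
    {
        animator = GetComponent<Animator>();
    }

    // Call this to fire whichever transition is guarded by the trigger.
    public void Blink()
    {
        animator.SetTrigger("Blink");
    }
}
```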

Where the ordering of limbs is incorrect, go into the SpriteEditor > SkinningEditor and set the individual bone depth by selecting the relevant bones.

The next issue will be transitioning sprite parts (rather than just animating their position).  My best guess is that we’ll end up animating enable/disable/active on alternative game objects inside the Animator (I hope).  Yep – that was quite intuitive.  Place the object you want somewhere inside the bone hierarchy of the sprite (in the scene editor) and then, in the Animation pane, add that object’s “enabled” property and animate it.

I suspect that, to enable the pupil to move freely around, I’ll have to add a mask around the “white” of the eye.
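Unity’s 2D SpriteMask looks like the relevant tool here – a sketch of the idea, with illustrative field names (this is my assumption, not something I’ve wired up yet):

```csharp
using UnityEngine;

public class PupilMaskSetup : MonoBehaviour
{
    public Sprite eyeWhiteShape;   // sprite shaped like the white of the eye
    public SpriteRenderer pupil;   // the pupil's renderer

    void Start()
    {
        // The mask defines the region in which masked sprites may render.
        SpriteMask mask = gameObject.AddComponent<SpriteMask>();
        mask.sprite = eyeWhiteShape;

        // Only draw the pupil where it overlaps the mask.
        pupil.maskInteraction = SpriteMaskInteraction.VisibleInsideMask;
    }
}
```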

This is quite exciting.  A lot of opportunities for clearer communication and more interesting and interactive scenes have just opened up 🙂

Ultimately, I’d like to create a 3D representation (mesh) of the mascot, with a toon shader to go with it – that would be the most flexible approach – but for now I’ll create the basic poses I need as .SVG, then export to sprites and animate.

It seems that one can create too many bones.  The issue I’ve run into is that slicing the sprite prevents the Unity editor from allowing me to create bones which span the different sprite parts (surprise, it’s still buggy).  However, using autogeometry to split up the sprite makes it almost impossible to control how the bones overlay each other (e.g. around the eye), and control over things like mouth expression is currently beyond me using the inbuilt approach.

I suspect the way to do this is to create a completely separate multi-sprite for the eye and another for the mouth (with multiple expressions in the multi-sprite), and then to place these inside the bone object hierarchy.

A potential problem with this approach is that alterations to the bone structures seem to invalidate the sprite skin / bone editor in the scene – requiring it to be destroyed and recreated, which will lose all my setup 🙁

So, that worked well (I think).

There are eight sprites along the top, and only the collection of body parts below is skinned.  On the left, in the scene hierarchy, you can see the other parts placed under the body bone – each game object with a “Sprite Renderer” added. Is there a better way?  The different parts of the multi-sprite are always visible in the object panel beneath the scene hierarchy.