As you can maybe tell from the title of this post, it's been one of those weeks! I have made a fair bit of progress, although nowhere near as much as I was hoping in order to be ready for the MK2 presentations next week. The running joke since 2nd year has been that technology hates me, and this week has been a good reminder of why. The plan was to get the Arduino talking to Unity this week, or, as a stopgap, to connect my phone wirelessly and use its accelerometer to get an idea of how the controls could work. I think the control element is really going to affect how the gameplay is designed, and ultimately which game prototype is developed further in Unity. It is really the last piece of the puzzle I need to solve!
Unfortunately this didn't happen, and much of the week has been taken up staring at errors like this:
On reflection, I think I know what was going wrong; however, when I followed tutorials on how to fix the issues, the advice didn't match the options available on my system, which was frustrating to say the least. The aim is to keep trying, and if that fails, to seek some help. On Monday I spoke to some friends who study game design at Abertay, and they know people who can help me get the two connected if I keep hitting a brick wall!
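One thing I can test before the Unity side cooperates is whether the data coming over the serial link even makes sense. This is a hypothetical little helper, not my actual project code: it assumes the Arduino prints comma-separated yaw, pitch and roll values one line at a time (roughly how the teapot example streams its data). Reading the real port would need a serial library, but the parsing step can be sanity-checked on its own:

```python
def parse_imu_line(line: str):
    """Parse one 'yaw,pitch,roll' text line into a tuple of floats.

    Returns None for malformed lines, so a dropped serial byte
    doesn't crash the reader mid-session.
    """
    parts = line.strip().split(",")
    if len(parts) != 3:
        return None
    try:
        return tuple(float(p) for p in parts)
    except ValueError:
        return None
```

If lines like `"12.5,-3.0,90.0"` come back as clean tuples, I'll know the fault is on the Unity side of the connection rather than the Arduino side.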
A bit of tech success!
The week wasn't a complete tech nightmare though, as I managed to connect the accelerometer and get much more reliable readings than I have before. It turns out soldering is the only way to keep it from disconnecting!
This meant I was able to get the MPU "teapot" example for the component working in Processing. I am starting to consider how I might develop the game using Processing instead. The accelerometer currently controls the game object, so if the background were animated / moving, and I am able to add collisions, there is a chance I could use this instead of needing to create the game in Unity. I don't think the end result would be as good, though. But it is another thing to try next week!
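For the "accelerometer controls the game object" part, the core maths is just turning raw sensor counts into tilt angles. This is a rough sketch rather than my working code, and it assumes the sensor is an MPU-6050 left at its default ±2g range (16384 counts per g), which is what the teapot demo uses as far as I can tell:

```python
import math

# Assumed MPU-6050 default: +/-2g range gives 16384 counts per g
LSB_PER_G = 16384.0

def tilt_angles(ax: int, ay: int, az: int):
    """Estimate pitch and roll in degrees from raw accelerometer counts.

    Only reliable while the sensor is roughly still (gravity dominating
    the reading), which is fine for tilt-style controls.
    """
    gx, gy, gz = ax / LSB_PER_G, ay / LSB_PER_G, az / LSB_PER_G
    pitch = math.degrees(math.atan2(-gx, math.hypot(gy, gz)))
    roll = math.degrees(math.atan2(gy, gz))
    return pitch, roll
```

With the sensor flat (only the z axis reading gravity), both angles come out as zero, and tilting maps smoothly onto pitch/roll, which could then drive the sprite's position whether the game ends up in Processing or Unity.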
But I won't let it defeat me!
Towards the end of the week, I wanted to take a bit of a break from the coding and make progress in other areas. I have been meaning to develop my wireframes and sketches of the app, and I thought it would be a good idea to show the other tech part of the project "working" for the MK2 presentation. I have been thinking more about how to connect the app to the suit, and about the user journey.
These are my original storyboards for the parent app experience, covering both the set-up of the suit and the after-session review. I was thinking more about the set-up, and about the QR code that, in my sketches, is attached to the inside of the box. Perhaps this could instead be heat pressed onto a piece of fabric alongside the care instructions for the product. That way it connects the suit to the app experience: the parent scans the code to sync the suit to the app on their phone, then adds their child's details to give the child a magically customised experience. (The game will know things like their name, and parents can set learning goals and enable behaviour-related settings, such as a time for the game to go to sleep so the child can relax and get into their sleep-time routine more easily.) So the plan is to create a QR code linking to the prototype, add it to a set of care instructions, put that through the heat press, and see if it prints clearly enough for a phone to recognise it.
In terms of developing the app prototype, I began to put artwork into Photoshop, Illustrator, After Effects and Proto.io.
As you can see, I have also recreated my own handwriting font, this time using a Sharpie. I wanted it to come out thicker, but because I printed the template larger, the strokes have been scaled down accordingly, so they are still thinner than I would like. In addition, I took some of the character boxes I didn't think I would need and used them to trace the game shapes (circle, star, triangle and pentagon) to give them more of a hand-drawn feel. I will experiment with this further next week, before finalising the artwork.
I decided to prototype the after-session experience first. Because there is animation involved, I have been screen-recording the prototypes as I test them to get a feel for the development. The following videos show the progression I have made from the initial sketch ideas.
For the first one I tried to create the animation purely in Proto.io, as this has always worked for me in the past. However, as you can see, the animation isn't very fluid, and it took a long time just to get what you can see above. I had never used After Effects before, but after googling a tutorial on how to animate a radial spin, I got an idea of how it could work. I then used the video function in Proto.io to control when the animation plays.
This has given a much smoother animation. However, I am very aware that my inspiration for the data visualisation is very similar to the Apple Watch's Activity rings. So I wanted to develop this further while keeping it coherent with the branding for Play.world. I decided to experiment a little more with what is possible in After Effects, and have come up with three different versions; the last screen grab shows my visual preference.
In addition to this, I wanted to use Proto.io to animate the experience of the parent sending a reward to their child for meeting one of their goals.
I am really excited about the progress I have made this week with the app and hope to fill in more content ready for the MK2 presentations. The plan is also to link the QR code that I will heat press on the care instructions to this app, so that I can demonstrate they will be linked as part of the "full experience" that will be shown in the final video.
I have also been thinking about the possibility of designing the suit with an option for the main body to be black. This gave me an idea to design a version of the app with a black background. I am thinking that if the suit you order has the white body, it will link to the white-background version of the app, and likewise if you get the black body, you would get a black-background version of the app.
Next week is the MK2 presentations, so I will need to get as much prepared for that as possible. I am not in the position I hoped to be, but this way I will also get some feedback on the app experience, which I haven't shown to anyone yet. I am hoping to get a game sprite linked into the MPU teapot example with the accelerometer, as well as finish the start-screen animations. I am going to Cherlea's tomorrow to fit Caitlyn in the latest suit and get her feedback on the new design. I will also be taking the photo for the degree show catalogue.
I think at this point it is mostly about keeping the momentum going. Despite the setback with the tech, I have still managed to achieve a lot this week, but there is a fair bit still to go! I am arranging testing weekends with my test families for the 6/7 April and the 13/14 April (which is also when I will shoot my video), so getting this tech working is a top priority.
My sublimation fabric arrived last week too, so once the presentations are out of the way I am going to take a day or two to get the suits ready for the test families. The children I am testing with range from 3 to 10, so I want to have a range of sizes so everyone can join in!