Joining the team at Studio Gobo during development of the Disney Infinity 3.0 playsets Moana and Rise Against the Empire, I was the first full-time UX Designer on staff. In my role, I introduced and evangelised a player-centric approach to designing systems, levels and interfaces. I also worked to update and improve the studio's usability testing processes, helping deliver more value and actionable results to make our games even more awesome.
September '15 - May '16
UXPin, Visio, Photoshop, InDesign, Hansoft, OBS
Game Designers (x2), Concept Artist, Programmers (x3)
Getting player feedback has been an important part of the design process since Studio Gobo first opened its doors. The studio has its own in-house usability lab and has been investing in frequent usability testing for several years. The culture of iterative design and testing is ingrained in the studio's DNA.
However, with no one dedicated to UX, there was a lack of overall ownership, meaning things could fall through the cracks. UI was driven largely by concept artists, usability testing lacked standardisation, and test outcomes were often not followed up on.
I initially observed and prioritised issues for usability tests toward the end of Rise Against the Empire. When I joined the team during Moana as a UX Designer, I took ownership of testing in the studio and provided UX support across all features, from system and level design to UI design for mini-games.
When I first took over the usability testing, I met with various team members to understand what they felt was working with the current process, and what could be improved. This exercise also helped me better understand how the team was running tests, and kept them involved in the discussion.
From these interviews, and by reviewing the existing documentation, I identified several areas that could be improved.
The first step in getting better data was creating a test plan for each usability session. I would formalise the team's goals ("Do players understand how to use the Fish Hook mechanic?", "Is the difficulty curve for puzzles in Polatu too steep?"), which would then inform which parts of the game would be tested, and how, so we could zero in on specific areas. These test plans meant we could maximise the return on investment for usability testing and capture data that was useful to the team.
I also wanted to tackle the problem of recruiting unrepresentative players. In addition to creating a simple screener survey via Google Forms to ensure players from our existing database fit our required profile, I created a dedicated page on the Studio Gobo website to recruit additional players to increase the size of our database. I also discontinued the infrequent sessions with small sample sizes, and instead created a schedule of regular tests with at least five players (or pairs of players) per round. These were aligned with key milestones to test content and deliver feedback when it was most relevant.
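In practice, screening boils down to a simple filter over survey responses. A minimal sketch of that logic, where the field names, age range and platform list are illustrative assumptions rather than the actual survey used:

```python
# Hypothetical screener filter: the fields and thresholds below are
# illustrative, not the real Google Forms survey used at the studio.
def fits_profile(response: dict) -> bool:
    """Return True if a survey respondent matches the target profile."""
    return (
        6 <= response["age"] <= 12            # target demographic
        and response["plays_weekly"]          # active players only
        and response["console"] in {"PS4", "Xbox One", "Wii U"}
    )

respondents = [
    {"age": 8,  "plays_weekly": True,  "console": "PS4"},
    {"age": 15, "plays_weekly": True,  "console": "Xbox One"},
    {"age": 7,  "plays_weekly": False, "console": "Wii U"},
]

# Only respondents who pass every criterion are invited to a session.
eligible = [r for r in respondents if fits_profile(r)]
```

However the criteria are expressed, the value is in applying them consistently to every recruit rather than inviting whoever is available.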
The studio already had a dedicated usability lab which functioned well. It was comfortably furnished, and audio and video recording were available. Even so, there were some quick wins I was able to make.
Firstly, I switched from Wirecast to OBS for recording sessions. OBS had more useful features, such as the ability to composite footage, and allowed me to stream sessions directly to the team. I also used a web plugin to record gamepad inputs, something the team had requested. The final footage combined the gameplay, gamepad inputs, and room feed.
Streaming the footage meant the team could watch in real time and as a group, which always got more traction than asking people to watch the videos afterwards. Involving the team in this way increased the visibility of sessions and meant they could react instantly when they identified an issue. We would also meet after each session to compare notes, giving everyone a chance to engage with the findings.
At the end of each round of testing I compiled the results into a presentation for the team. For each issue, I summarised the problem, described its possible causes, and provided clips of it occurring for the team to watch. Finally, I assigned each issue a severity rating based on whether it affected core systems, its persistence, and its recurrence. This high-level summary was invaluable: the team could get a digest of the issues, and observe them happening, without watching every session.
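A severity rating built on those three criteria can be sketched as a small scoring function. The weights and thresholds here are illustrative assumptions, not the exact rubric used in the studio:

```python
def severity(core_system: bool, persistent: bool, recurrences: int) -> str:
    """Rate a usability issue on the three criteria described above.
    Weights and thresholds are illustrative, not the studio's rubric."""
    score = 0
    if core_system:
        score += 3          # issues in core systems weigh heaviest
    if persistent:
        score += 2          # players could not work around it
    if recurrences >= 3:    # seen across several participants
        score += 2
    elif recurrences == 2:
        score += 1
    if score >= 5:
        return "high"
    if score >= 3:
        return "medium"
    return "low"
```

Keeping the scoring explicit like this makes ratings comparable between rounds, so a "high" in one report means the same as a "high" in the next.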
To ensure issues could be included as part of sprint work and followed up on, I started tracking them via Hansoft (and Jira for later projects). Adding issues to project management software was an important step as it gave the wider team visibility, and meant I was able to check on the status of issues between sessions. I could then make sure they were re-tested to see if a given fix had worked.
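For the later Jira-tracked projects, filing a finding amounts to building an issue payload and posting it to Jira's standard create-issue endpoint. A minimal sketch, where the instance URL, project key and label scheme are hypothetical:

```python
import json
import urllib.request

JIRA_URL = "https://example.atlassian.net"  # hypothetical instance


def usability_issue_payload(summary: str, description: str, sev: str) -> dict:
    """Build a Jira create-issue payload for a usability finding.
    The project key and label scheme are illustrative assumptions."""
    return {
        "fields": {
            "project": {"key": "MOANA"},        # hypothetical project key
            "issuetype": {"name": "Bug"},
            "summary": f"[UX] {summary}",
            "description": description,
            "labels": ["usability", f"severity-{sev}"],
        }
    }


def file_issue(payload: dict, token: str):
    """POST the payload to Jira's REST API v2 create-issue endpoint."""
    req = urllib.request.Request(
        f"{JIRA_URL}/rest/api/2/issue",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )
    return urllib.request.urlopen(req)
```

Tagging every finding with a consistent label scheme is what makes it possible to query open usability issues between sessions and confirm fixes landed before re-testing.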
As part of the report presentations, we would assign each issue to an owner who was then responsible for making sure a fix was applied, ideally before the next round of testing. This ownership ensured nothing slipped through the cracks.
During pre-production of Moana, the team wanted to gain a better understanding of the target audience's in-game behaviour and cognitive abilities in a similar game. The team identified The Legend of Zelda: The Wind Waker as a quality and design benchmark. However, while that game was aimed at players of all ages, we hypothesised that some aspects of its design would pose problems for our target audience. By running formal usability sessions on Wind Waker, the aim was to uncover usability issues and successes with the design.
Over the course of two weeks, ten pairs of players in our target demographic (young girls between the ages of 6 and 12) were invited to the studio. Each pair played the opening four hours of the game, taking turns with the controls. During this time, players learnt some of the core combat and tool mechanics, and completed two of the dungeon levels, which emphasised puzzle-solving.
In the end we uncovered a number of issues specific to our audience. These included a persistent lack of experimentation to solve puzzles from the youngest children, difficulty extracting meaning from ambiguous instructions, and general issues with fine motor skills and navigation. We also found that young players often preferred exploring the environment rather than focusing on their active mission. Free gameplay, as well as fun and unexpected interactions in the open world, were highlighted as the best parts by players.
These findings were presented to the studio and formed the basis of the design approach for Moana.
Aside from general support with UX input and usability testing, I also owned the UX design of several mini-games within Moana. The goals of the mini-games were to add variety and replayability to the game, unlocking new content as the player progressed and rewarding exploration. The UI had to conceal complex systems in order to create a rewarding experience for players, without interfering with the gameplay.
To initiate these mini-games, I opted to have the interactions in-world, rather than just popping up a UI. This made the experience feel more toylike and experimental, something which tested well with players. For example, in the Music Fale mini-game, which had three difficulties and several levels, the proposal included a button the player could jump on to change the stage, and they initiated the game by walking up to an interaction point for their desired difficulty. This proved to be very intuitive, and during usability tests players would happily spam jumping on the button to see the whole stage change in front of them.
With the team happy with the flow, I moved on to the UI design. As we had three mini-games, I wanted to share as many elements as possible across them so players didn't need to learn new patterns for each one; this consistency was vital for reducing cognitive load, especially for our youngest players.
As scoring was less critical in the moment-to-moment gameplay, I aligned it with the existing Infinity HUD, leaning into players' habit of checking that section of the screen. The score multiplier was much more important, so rather than fixing it in the same place for each mini-game, I positioned it based on where the player's attention was. This ensured they always had it in clear view without needing to move their eyes away from the action.
During my first months at Studio Gobo I was able to deliver quick wins for both the design process in general and the feedback collected during usability tests. In doing so, I demonstrated the need for UX in the studio and was hired as its first dedicated UX Designer.
Most importantly, these processes are still in place today. The whole team is involved in UX of course, but I continue to be a driving force for a more player-centric approach in the design process. Additionally, the improvements I put in place for usability testing, and the lab setup, continue to deliver on their investment, ensuring we're able to test our games with representative players and make them even better.