FPS External Play Test
My process for planning, conducting and analyzing Irradiate Games’ external play tests to inform our feature updates.
Role: Lead UX Strategist
Timeline: 2 months
Tools: UE5, Figma, Miro
-
Provide UX insights for our teams’ inquiries about our game:
How satisfying are the jump and dash mechanics?
Does the current combat ecosystem provide a varied gameplay experience?
Are the enemy types diverse and challenging?
What are players' opinions specifically on the sword weapon?
What accessibility features should be implemented?
-
The team outlined feature updates and department sprint tasks based on our UX research findings.
-
Team Constraints
The first level wasn’t completed in time for testing, so the LD Lead had to build a new test level in 3 days.
Days before the test build was due, the Tech Team noticed bugs with the enemies.
Due to disagreements on sprint task expectations, the UX Researcher responsible for test recruitment and the test location left the team.
UX Challenge
The new test build had no planned onboarding process, and the LD Lead wanted to present mechanic instructions as a wall of text at the start of the level.
Overcoming Challenges
I asked the team if we could test 2 weeks later to have time to fix bugs and design a new play test process.
I led the UX Team in designing and implementing action prompts to onboard players to mechanics (sketched below).
I recruited test participants and set up a play test environment in my home.
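In spirit, the onboarding prompts worked like this: the first time a player enters the area for a new mechanic, show its input hint once, then let them play. Below is a minimal Python sketch of that logic; the names and values are hypothetical, and the actual prompts were built in UE5.

```python
# Minimal sketch of proximity-triggered action prompts.
# Names and tuning values are hypothetical; the shipped version was built in UE5.
from dataclasses import dataclass

@dataclass
class PromptTrigger:
    mechanic: str       # e.g. "dash"
    input_hint: str     # e.g. "Press SHIFT to dash"
    position: tuple     # world-space trigger location (x, y, z)
    radius: float       # distance at which the prompt appears
    seen: bool = False  # show each onboarding prompt only once

def active_prompt(player_pos, triggers):
    """Return the hint for the first unseen trigger the player has entered."""
    for t in triggers:
        dist = sum((p - q) ** 2 for p, q in zip(player_pos, t.position)) ** 0.5
        if not t.seen and dist <= t.radius:
            t.seen = True
            return t.input_hint
    return None

# Example: the player walks into the dash tutorial zone.
triggers = [PromptTrigger("dash", "Press SHIFT to dash", (0.0, 0.0, 0.0), 5.0)]
print(active_prompt((1.0, 2.0, 0.0), triggers))  # -> "Press SHIFT to dash"
```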
UX research process
With this first round of internal play testing, I started our cyclical process of assessing which design decisions needed validating, observing how players adapted to the gameplay, and analyzing player feedback to inform feature updates.
Reviewed initial feedback Irradiate Games received from friends and colleagues.
Observed streamer playthroughs of games with similar gameplay and organized those findings.
Presented competitive analysis to team leads to convey how players interpret similar gameplay and what we may want to consider.
Defined research objectives with department leads based on what the team wanted to understand and validate.
Wrote a play test script and plan to have a consistent process to walk participants through.
Scheduled and conducted playtest sessions with the Tech and Level Design team members.
Organized and workshopped playtest insights with department leads to brainstorm potential solutions.
Prioritized updates needed for the next round of testing based on level of effort.
Research objective workshop
Before I joined Irradiate Games, the Tech Lead had their friends and colleagues play through the game. I organized their feedback and did a competitive analysis to help us create a starting point for assessing our gameplay.
Reviewed initial feedback
First, I made a sticky note for each feedback quote to organize them by players’ actions, thoughts and feelings.
Next, I ran an affinity mapping session, sorting related feedback into these categories: AI intelligence, enemies, combat, gameplay, level design, and mechanics.
For each category, I pointed out a few insights and suggested what we might want to discover further. I then walked the department leads through the Miro board for group discussion.
From this exercise, our team realized we needed:
Enemies with diverse characteristics and balanced levels of difficulty.
More observational feedback noting what players do and how they strategize.
To ask players open-ended, non-leading questions to gain richer findings on how they think and feel.
Observed streamer playthroughs
I did a competitive analysis of 5 games our gameplay is based on: Doom Eternal, Quake I, UltraKill, Shadow Warrior 3, and Wolfenstein: The New Order.
I created a filterable Notion database to organize 150 qualitative findings from streamer playthrough observations (modeled in the sketch below).
To help teammates review quickly, each finding:
Was filterable by game, related departments and research finding types such as friction, navigation and customization.
Included links to video examples of each interaction.
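For illustration, the sketch below shows roughly how a findings database like this can be modeled and filtered in code. The field names are hypothetical stand-ins for the actual Notion properties, and the sample finding is invented.

```python
# Rough model of the competitive-analysis findings database.
# Field names are hypothetical stand-ins for the Notion properties.
from dataclasses import dataclass

@dataclass
class Finding:
    game: str           # e.g. "Doom Eternal"
    departments: list   # e.g. ["Tech", "Level Design"]
    finding_type: str   # e.g. "friction", "navigation", "customization"
    note: str           # the qualitative observation itself
    video_url: str      # link to a video example of the interaction

def filter_findings(findings, game=None, department=None, finding_type=None):
    """Mimic the Notion filters: match any combination of the three tags."""
    return [
        f for f in findings
        if (game is None or f.game == game)
        and (department is None or department in f.departments)
        and (finding_type is None or f.finding_type == finding_type)
    ]

# Invented sample entry, for shape only.
findings = [
    Finding("Doom Eternal", ["Tech"], "customization",
            "Streamer remapped controls before starting the first level.",
            "https://example.com/clip-001"),
]
print(len(filter_findings(findings, finding_type="customization")))  # -> 1
```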
Presented competitive analysis insights
I formatted the tech-related findings into a presentation that gave an overview of the games we want to emulate and how players have interpreted their gameplay.
The tech design presentation covered:
Player settings preferences such as their opinions on mouse sensitivity and their control remapping habits.
How players were onboarded to mechanics and whether the intended objectives and means of completing them were understood.
What players considered satisfying gameplay and how they wanted to feel they had earned rewarding experiences.
Test study design
After reviewing what players thought of our mechanics and discussing gameplay examples from similar games, I formed the questions the team wanted our internal play testing to answer.
Collaborated on defining research objectives
First, I met with department leads to discuss what the team wanted to understand about players’ experiences with each of our mechanics.
Next, I formatted their interests into open-ended, non-leading questions and highlighted the key information we were looking for as research goals.
I then wrote a play test script for the team to review to ensure test sessions covered what was most important to our team.
Our play testing research goals:
Assess the mechanics’ feel and usability.
Compare that feel and usability to their Doom Eternal counterparts.
Observe how players strategize with this combination of mechanics.
Identify potential updates to help players experience these mechanics as our developers intended.
Scheduled & conducted play test sessions
With the test script finalized, I created play test session time slots with Calendly for Tech and Level Design teammates to schedule their sessions.
I moderated and recorded 10 play test sessions over 2 weeks. I encouraged players to share their thoughts and kept quiet to hear what they instinctively brought up, only asking our scripted questions if observation hadn’t already answered them.
Analyzing feedback
I rewatched each play test recording and noted players’ quotes and observed actions, organizing them into themes and behavioral trends. Each insight considered whether players were experiencing our gameplay as intended.
Organized and workshopped play test insights
For each mechanic, I organized player observations and feedback by related behaviors and thoughts. I wrote key takeaways for each trend group and asked the team leads to review them before our discovery workshop.
“I didn't feel like I wanted to use the shotgun hook throughout that entire sequence. Once I started getting momentum I just don't trust the mechanic as it stands at the minute to not put me in a situation I can't really control too much. Because I was in control and gaining momentum myself, this is feeling like a good pace for me. If I threw the hook in there, based on how it's behaved in past uses, I don't trust that it's not gonna get me killed.”
~ Participant quote
Some of our key findings were:
The thunder strike’s health drop amount wasn’t rewarding enough for players to rely on it for health regeneration.
The shotgun hook pulled players toward enemies at inconsistent speeds and directions, making it difficult to predict how to aim.
Players wanted more feedback, such as being knocked back to perceive recoil and camera movement to convey the effort of climbing a ledge.
Prioritized updates needed
I collaborated with department leads to brainstorm potential solutions for the players’ needs, weighing the pros and cons of each adjustment’s scalability and level of effort.
We held prioritization workshops to agree on a list of updates, which I turned into dev tasks for the Tech Team to implement before our next round of testing.
Some of our updates included:
Hide the weapon while mantling and add camera movement to simulate climbing.
Knock the player back when firing and add an attack spread to emulate a shotgun (see the sketch below).
Steady the shotgun hook’s gap-closing speed and aim, and enlarge enemy hitboxes at a distance.
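As a rough illustration of the knockback and spread update, the underlying math looks something like this. The tuning values below are made up and the function names are mine; the real implementation lives in UE5.

```python
# Sketch of the shotgun knockback + spread math.
# Tuning values are made up; the real implementation lives in UE5.
import math
import random

def fire_shotgun(aim_yaw_deg, pellet_count=8, spread_deg=6.0, knockback=350.0):
    """Return pellet directions fanned around the aim, plus a recoil impulse."""
    pellets = []
    for _ in range(pellet_count):
        yaw = math.radians(aim_yaw_deg + random.uniform(-spread_deg, spread_deg))
        pellets.append((math.cos(yaw), math.sin(yaw)))  # 2D unit direction
    # Knock the player back opposite the aim direction so recoil is felt.
    aim = math.radians(aim_yaw_deg)
    recoil_impulse = (-math.cos(aim) * knockback, -math.sin(aim) * knockback)
    return pellets, recoil_impulse

pellets, recoil = fire_shotgun(aim_yaw_deg=0.0)
print(len(pellets), recoil)  # 8 pellets; recoil pushes the player backwards
```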
Retrospective
I’m so glad the team found our play test results and workshop sessions insightful, and we’re looking forward to improving both our mechanics and our testing process.
Irradiate Games’ next steps:
The Tech Team has finished implementing the updates we’ve planned.
The Level Design Team has made a new test environment for us to start scheduling sessions with test participants.
I started preparations for upcoming external play test sessions as well as round 2 of internal testing.
Lessons for Allyson to apply in the future:
Make research finding types less specific
When a Sr. UX Researcher reviewed the categories with me, he mentioned the research finding types were a tad too specific, but that they’ll still work well for our next round of tests.
Make research feedback simpler and faster to review
I plan to lower the barrier to reviewing test findings ahead of workshopping our full Miro board of feedback.
I’ll share out a 1 page PDF with the team highlighting key takeaways and short video clip examples.