
Tips for Being a Successful UAT Tester

By Luis Machado on Jul 9, 2021 12:48:44 PM


User acceptance testing (UAT) is a critical practice to employ for a multitude of products and processes. For the purposes of this article, most of my examples are set in the context of migrating or merging instances of Atlassian products. Nonetheless, these tips apply to other avenues as well: I actually picked up these habits working as a QA tester for a video game publisher.

Context is king

When testing a product or a process, such as a migration or a merger of two instances, the most important thing you can do when you come across an issue is provide as much context as possible, so that the developer or admin responsible for correcting it understands exactly how the issue came about. The best way to achieve this is to tell them what you did (reproduction steps), what you expected to happen (expected result), and what actually happened (actual result). Together, those three pieces paint a clear picture for the team in charge of dealing with the issue.

Screenshot or it didn’t happen

Speaking of pictures, we had a saying on the QA team I worked with: “Screenshot or it didn’t happen.” If you can provide a screenshot of your issue, you increase the chance that the person responsible for resolving it will be able to address it without any back and forth. Screenshots of errors on pages, or of incorrect configurations or data, pinpoint the exact issue and leave no room for interpretation. If you’re doing user acceptance testing, a screenshot of the issue in the UAT instance alongside what the same content looks like in production is even better. Again, we’re trying to establish context for what you expected versus what you actually saw.

Often during migrations or mergers, the individuals performing the work lack the context of what the content is and what it should look like. This is why user acceptance testing is such a valuable tool: it gives users a chance to scope out the changes and see if anything looks wrong. It is therefore the tester’s job to provide as much information as possible to help resolve any issues. Here’s an example of an issue report related to a migration:

  • Summary - Write a brief summary of the issue you’ve run into; it can be a simple statement, two to three sentences at most. (Depending on the medium for reporting the issue, this can be optional: if you’re using a Jira project to track bugs, it’s important; if you’re tracking things in a table, the description is probably sufficient.)
  • Description - Provide a detailed description of what you observed, including specifics like a link to the exact page or any particular tools used. When it comes to detail, more is more.
  • Reproduction Steps - Give a detailed step by step walkthrough of how you achieved the result.
  • Expected Result - At the end of the reproduction steps explain what you expected to see.
  • Actual Result - Also describe what you actually saw; be sure to indicate how this is different from the result you expected.
  • Expected and Actual results can sometimes be obvious, or at least seem that way; just remember that what is obvious to you may not be obvious to someone with a different context.
  • Screenshots - Where possible, include screenshots of the errors or issue you witnessed, and provide a comparison if possible to paint that contextual picture.
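Tracked in a bug tracker or a simple table, a report following this structure might look like the sketch below. This is a minimal illustrative Python sketch, not a real tool: the helper function and all of the field values (the project name, file names, and so on) are hypothetical examples that mirror the fields listed above.

```python
# Sketch of a structured UAT bug report. All field values below are
# illustrative examples, not real migration data.
def format_bug_report(summary, description, repro_steps,
                      expected, actual, screenshots=()):
    """Render a UAT issue report as plain text, one field per section."""
    steps = "\n".join(f"{i}. {s}" for i, s in enumerate(repro_steps, 1))
    shots = "\n".join(f"- {s}" for s in screenshots) or "- (none attached)"
    return (
        f"Summary: {summary}\n\n"
        f"Description:\n{description}\n\n"
        f"Reproduction Steps:\n{steps}\n\n"
        f"Expected Result: {expected}\n"
        f"Actual Result: {actual}\n\n"
        f"Screenshots:\n{shots}"
    )

report = format_bug_report(
    summary="Sprint board empty after migration",
    description="The 'Platform' project board shows no issues in the "
                "UAT instance, but is fully populated in production.",
    repro_steps=["Log in to the UAT instance",
                 "Open the 'Platform' project",
                 "Select the active sprint board"],
    expected="The board shows the same columns and issues as production.",
    actual="The board loads but contains no issues.",
    screenshots=["uat-board-empty.png", "prod-board-populated.png"],
)
print(report)
```

Even a lightweight template like this forces the tester to fill in every piece of context the migration team needs, instead of relying on memory or a one-line complaint.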

The most important thing to remember when doing testing of any kind is to provide context. Never assume anything! Treat it as if the person you’re explaining the issue to has no idea what you’re talking about. And if you have any questions about UAT, or how it can help you make the most of your processes, drop us a line; we’d love to help you out!


The Case for User Acceptance Testing

By Amanda Babb on Oct 22, 2019 12:00:00 PM

New phone, who dis?

On a random Sunday, my husband surprised me by forcing me into the AT&T Store. You see, I had recently been struggling with my phone: although it was still a great phone, it just wasn't as fast as I needed it to be. My Bluetooth was sketchy at best, I had to use headphones or the speaker for phone calls, and my battery life meant I carried a backup battery everywhere. At almost four years old, I was carrying a dinosaur. I picked out my new iPhone Xs Max and started the data transfer process.

"Would you like a case?" 

Of course, I want a case. My shiny new toy needs to be protected from my own idiocy of sleeping with it, shoving it in my back pocket, throwing it on my car seat, stuffing it in my purse, and all the other things I do with my phone (read: play games while in the bathroom. You do it too. Don't judge). 

I like clear cases. After all, Apple didn't spend its resources thinking about the aesthetics of these devices for nothing. And while I want my precious..er..phone to be protected, I also want to showcase its beauty. I pulled two clear cases off the wall, debated for all of 10 seconds and made my choice: a Pelican Marine case that promises "total protection from the elements". Fancy. 

User Acceptance Testing

It took two days for me to put the case on my phone. Why? User Acceptance Testing (UAT). 

Yup. These were the instructions for installing my case:

"Attention: this is the best way to do this and, by the way, we need you to test this before installing your phone."


"Put a piece of tissue in the case and submerge it in water for 30 minutes."


In case you're wondering, my case passed the test. I could move forward with the rest of the installation steps. But for me, this triggered a thought about User Acceptance Testing. 

Whenever we work with clients, we expect them to engage in User Acceptance Testing. We've spent a lot of time together implementing a solution that fits your needs and requirements. While we provide role-based test cases, it's up to you and your testing team to make sure the tools and the solution match. Without this, the go-live for the solution will fail. According to The Standish Group's CHAOS report, user involvement is either the number one or number two factor in successful, challenged, or failed projects. Considering the United States spends approximately $250 billion on IT application development projects, wouldn't we want to engage our users early and often?

Agile as a Feedback Mechanism

The CHAOS report includes comparisons between Agile and Waterfall frameworks and methodologies. However, the critical thing to remember is that it's not HOW you implement a project, but whether or not you involve your users. Scott Ambler and Associates makes this comparison using the Cost of Change curve. While the Cost of Change curve is typically used to advocate for Agile, the message is clear regardless of framework: early and active stakeholder involvement leads to lower defect costs because of the shortened feedback cycle. Users gotta use so we can make adjustments.

This is what we forget about Agile. After all, in the Agile Manifesto, two of the four values emphasize the people side of how we develop technology: we are supposed to value individuals and interactions over processes and tools, and customer collaboration over contract negotiation. I have to ask, then: when was the last time you and your team held a stakeholder demo? Or created a focus group? Or at least bounced an idea off a mentor? This is where many organizations fail at Agile: we're afraid to engage our users early for feedback. As a result, 50% of IT projects fail. And while the failure rate is dropping, I have to challenge whether it's because we're getting better at feedback or whether we're getting better at manipulating data.

Test Plan: User, Customer, Stakeholder

Your audience is critical, as are their roles. To solicit feedback early and often, you must place yourself in the role of the user, customer, or stakeholder. We've moved away from calling them User Stories to just Stories. However, the user profiles behind these roles are critical to the creation of a good story. Who's the audience? What do they really want?

I like to think of the use case of my Pelican phone case. As a frequent traveler, I want the maximum protection for my phone for whatever I'm doing. I should be confident that it provides impact, water, dust, and "being stupid" protection. I should also expect it to be lightweight, but aesthetically pleasing. In order to fulfill my requirements and acceptance criteria, what should my involvement be when I'm finally ready to use it? Is asking me to do a single, 30-minute test too much? At the price point of both the phone and the case, should I not be eager to be part of the testing process to make sure I have the highest quality product? I think this is a great way to validate whether or not the user gets what they wanted. And while my test was successful, there were clear instructions as to what to do if the test failed.
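The phone-case example above can be read as a user story with acceptance criteria that the user personally verifies. Here is a toy sketch of that idea; the story text, criteria names, and pass/fail values are all hypothetical stand-ins, not a real test framework.

```python
# Toy example: a user story whose acceptance criteria are recorded as
# boolean checks the user verifies before accepting the product.
story = ("As a frequent traveler, I want maximum protection for my phone "
         "so that it survives whatever I'm doing.")

# Hypothetical outcomes of the user's own acceptance tests, e.g. the
# 30-minute tissue-in-the-case water submersion test.
test_results = {
    "impact_protection": True,
    "water_protection": True,   # tissue stayed dry after 30 minutes
    "dust_protection": True,
    "lightweight": True,
}

# The story is accepted only when every criterion passes.
accepted = all(test_results.values())
failed = [name for name, passed in test_results.items() if not passed]
print("Story accepted" if accepted else f"Story rejected: {failed}")
```

The point isn't the code; it's that acceptance is a checklist the user works through, with a defined path forward when a check fails, exactly like the case instructions provided.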

Defining not only what you're intending to deliver, but to whom makes us better able to deliver the right thing with the right quality. Users gotta use. 

Lessons Learned: Users gotta use

No matter how cool or innovative or disruptive, if users don't use, we've failed. Complicated technology implementations, clunky user experiences, or missing a key demographic are all failures. For as much as you expect to "fail fast," do you actually solicit the feedback needed early and often to adjust? I, for one, would love to see the failure statistics for my new phone case. My brain is putting together a Pareto chart of root causes as I type this. However, I also would like to see the number of users who didn't engage in the testing steps. After all, functional testing and user involvement are only as effective as the users who use.

What do you think you can do better to engage the various roles of your users? And remember, it's not just tools and technology; it's the processes that support them as well. Want to learn how Praecipio Consulting outlines the importance of UAT in a migration? Watch our on-demand webinar, Plan Your Jira Server to Cloud Migration.


Praecipio Consulting is an Atlassian Platinum Partner

This means that we have the most experience working with Atlassian tools and have insight into new products, features, and beta testing. Through our profound knowledge of Atlassian environments and their intricacies, we can guide your organization as you navigate these important changes.

