
aclairefication

~ using my evil powers for good

Category Archives: Context

Preparing for Winter

05 Tuesday Nov 2024

Posted by claire in Context, DevOps, Experiences, Experiments, Reliability


Tags: DevOps

I recently discovered a delightful niche celebration called Fat Bear Week. More than a million people voted for the final contest this year to crown winner Queen Grazer. Leading up to the final vote, many members of the community campaign for their favorite contenders, highlighting their different strengths and achievements. As a newcomer, I appreciate their posts with various kinds of analysis about the bears’ risks and coping strategies as they all try to solve the same problem: readiness for the months ahead.

To begin, park rangers nominate bears for the Fat Bear Week bracket. Some of this has to do with which bears frequent the parts of the park that are monitored; after all, this is a participation sport for all the fans. While some fans are engaged months ahead of time, others come to the party at the end for the last week of the season. I’m sure you’ve seen a variety of risk profiles for your applications and your contributors. I like to think of a particular company’s suite of applications as balancing resources with the needs of the whole portfolio as we iterate toward improvement.

We can take lessons from these natural preparations:

  1. Learn from Failure
  2. Prepare for the Unexpected
  3. Prioritize Health

“You take care of what you love, and when you observe it enough you fall in love… the more you connect.” – Charlie Annenberg, Explore.org re: bearcams

Learn from Failure

Year over year, having data to go on helps us to see our continuous improvement, just as it helps long-term bear watchers to see the ongoing story of bears as they survive and thrive. Not all bears survive or thrive, and an analysis of this season’s bear tragedies and conflicts is inherent to studying the brown bears of Katmai National Park and Preserve. Human behaviors like feeding the bears or driving in bear territory can lead to failures for the bears as well as heartache for the people in the park and bearcam watchers. Conflicts among bears are a part of nature and knowing about those aspects of bear life helps people to appreciate the survivors. (Yes, we know bears gonna bear, but it’s still a lot to bear when you see it live.) Despite loss, the community celebrates the successes those bears had and their legacy for the Katmai bears who continue, with a new generation of bears taking over at Brooks Falls this year.

During this emotional outpouring, my first visit to this world of bearcam devotees included hearing some community members wish to preserve the experience of knowing bears for years versus attracting new fans who don’t have that context. Long-term viewers felt that newer viewers didn’t appreciate the history behind the losses, which could make those losses seem trivial. I believe both long-term experience within the context and fresh perspective can bring value to our conversations about what we’re focused on, whether that’s software systems or bears. Hearing what has worked in the past gives me options about how to judge the present situation that intermix with my new ideas as an outsider. I see many of my skills from earlier roles as portable to reliability engineering, and that overlap points me to the alignment of our preparations with the overall goals of teams and the organization. Experts are invaluable in these conversations.

Prepare for the Unexpected

In their chats, interpretive rangers or ranger naturalists “interpret natural and cultural history to help people understand and appreciate the significance of national parks… with education and interpretation related to the world-famous bearcams.” Watchers can submit questions for experienced professionals to discuss and answer. Reminds you of some remote working knowledge shares you’ve seen, right? Done well, a docent’s remarks can inspire action and bring up different ideas for investigation. While we bearcam viewers at home may not be running experiments with the animals, we’re definitely considering different ways to judge the animals’ weight gain, incorporating the various data sources into our discussions as we cheer our favorite bears. (Terrestrial Lidar?! Yes! Yes! and Yes!) I get similar value from hearing experienced industry colleagues talk about how they have faced problems and the solutions they’ve implemented to address the challenges. Looking for similar risk factors or application characteristics helps me to form experiments when adopting different practices. Will these practices be “best” in a context-free sense or are they optimal for a particular kind of problem that we face (e.g., a peak event)?
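Since terrestrial lidar measures shape rather than weight, the math behind those weigh-in estimates is just volume times an assumed tissue density. Here’s a minimal sketch; the volume and density figures are made-up illustrations, not Katmai data:

```python
# Back-of-the-envelope bear mass from a lidar-derived volume estimate.
# Volume and density values below are illustrative assumptions, not real measurements.

def estimate_mass_kg(volume_m3: float, density_kg_m3: float = 1000.0) -> float:
    """Mass = volume * density; animal tissue is roughly water-dense (~1000 kg/m^3)."""
    return volume_m3 * density_kg_m3

# Hypothetical scans of the same bear, early summer vs. pre-hibernation:
early_summer = estimate_mass_kg(0.35)      # roughly 350 kg
pre_hibernation = estimate_mass_kg(0.45)   # roughly 450 kg
seasonal_gain = pre_hibernation - early_summer
```

A real estimate would also correct for fur, posture, and scan occlusion; the point is only that volume, not a scale reading, is the measured quantity.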

For the bears, hibernation season is the time to survive, and all their preparations are tested to the limits. For retail technologists, there are annual events that tend to coincide with other retailers in the industry, e.g., the heavily advertised holiday season. (Did I really go pick the bones of the Halloween crafts right next to aisles of Christmas trees and decor? Yes, yes, I did.) That test of our technology systems and business readiness is most visible during peak business events, when we take extra precautions and think carefully about the unexpected impact issues can have on systems. While we prepare year-round to have our systems stand up to the needs of the market, our routine incorporates seasonal variations designed into our business cycles. We prioritize the most urgent and important work that will support our success.

Prioritize Health

Prioritization. Bears are masters. With limited resources to devote to the crunch time leading up to their goal, they rely on instinctive decision-making. They focus on bulking up at the expense of almost everything else, although the mama bears have a juggling act of feeding and protecting cubs as well. As a mama bear myself, I identify with that struggle to focus while also staying aware of my surroundings and changing circumstances so I’m ready to act, although I’m not likely to be roaring in the face of a competitor like Queen Grazer would. We develop instincts about our systems over time through experience in running them and familiarity with the people involved.

May we collaborate effectively to have a happily boring (read: peak business as usual) season of surviving and thriving!

And if you love brown bears, contribute to supporting them in Katmai!

SpringOne 2019 Links

09 Wednesday Oct 2019

Posted by claire in Coaching, Context, Developer Experience, Events, Personas, Speaking, SpringOne2019, Training, User Experience


Thanks to everyone who came to our Time to Good DX presentation!

Time to Good DX

We often hear “focus on the customer,” but what do you do when your customers are your coworkers? Developers are the largest group of individual contributors on software teams. It’s about time Developer Experience (DX) got the focus it deserves! Devs are users, too! Wouldn’t it be great if your user needs were met?

DevNexus – TimeToGoodDX – Handout [Download]

I know an hour isn’t enough time to delve deeply into this area, so here are some links to help you to explore this important subset of UX!

Articles

Time to Hello World and this

Drink your own champagne

API docs as affordance and this

Communication and this

Development pain points

Characteristics of good DX

Great APIs – heuristic analysis

Developers as a special case of users

Product management in platform products and in API products

API model canvas

(Vanilla UX)

UX personas

Presentations

Great DX in 5 minutes!

Platform as Product

More platform as product

DX Segments

DX Principles

DX Trends

UX tools for Developer users

Lean Enterprise

Reports

Developer Skills [PDF]

Podcasts

Don’t Make Me Code

Greater than Code

Tooling

git-utils

assertj-swagger

Examples of DX

Jest automation framework

Netflix DX

Faster deployment

Visualizing metrics

Stripe API docs

Twilio API docs

Open source triage

Apigee DX

Salesforce DX and this

Attending to networks

06 Tuesday Aug 2019

Posted by claire in Approaches, Change Agent, Coaching, Context, Experiments, Soft Skills


Martin Grandjean [CC BY-SA 3.0 (https://creativecommons.org/licenses/by-sa/3.0)]

I have been following Esther Derby with interest for years. I find her wise counsel refreshing and I admire her ability to connect deeply with attendees at conferences and training sessions. You can imagine my excitement upon finding her new book 7 Rules for Positive, Productive Change in my mailbox!

I sat down that evening and applied my customary approach to getting the lay of the land: starting with the index and moving backward to the table of contents. I had one major problem: I kept getting caught on helpful diagrams and interesting anecdotes. Still, I managed to chart a line of inquiry that led me to deep-dive into several parts of the book: networks of relationships in organizations and how they influence the success of change. I didn’t realize how long I’d been reading up on Rule 4 until I looked up to see it was past midnight!

I love the idea of change by attraction. Change that people want to be part of is the kind of change I want to be an agent of. As I’ve previously written, I have sometimes seen an attitude of “not invented here” that corresponds to the “It won’t work here” argument that Esther’s approach debunks. Experimentation within the walls produces examples of what can work in this context.

I agree with a heuristic approach to figuring things out, so I wondered about the “rules” part of this book’s title. Happily, Esther recognizes that these rules are for “learning and problem-solving, especially when a bit of trial and error is involved… when there isn’t an obvious path.” Accordingly, I found helpful heuristics to guide my questioning when trying to understand how to help others with change.

In particular, I’ve become quite curious about the informal networks that quickly spread ideas, the people “whose opinions are trusted and respected and whom people go to for advice.” I couldn’t help geeking out about the graph theory aspects of the organizational network analysis (ONA), but strategies to “reshape the network to make it more useful both for sharing information about the teams and for sharing ideas and expertise” really got my attention. So I ordered a spare copy of this book to share with my local network.
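To make the graph-theory side of ONA concrete, here is a toy sketch with invented names and relationships: degree centrality is one crude way to surface the go-to people in an informal advice network.

```python
# Toy organizational network analysis (ONA): who are the go-to connectors?
# Names and edges are invented for illustration.
from collections import defaultdict

edges = [
    ("Ana", "Ben"), ("Ana", "Cam"), ("Ana", "Dee"),
    ("Ben", "Cam"), ("Dee", "Eli"), ("Eli", "Fay"),
]

# Build an undirected adjacency structure from the edge list.
adjacency = defaultdict(set)
for a, b in edges:
    adjacency[a].add(b)
    adjacency[b].add(a)

# Degree centrality: the fraction of other people each person connects to directly.
n = len(adjacency)
centrality = {person: len(peers) / (n - 1) for person, peers in adjacency.items()}

most_connected = max(centrality, key=centrality.get)
```

Real ONA also considers betweenness (who bridges otherwise-separate clusters), which is where reshaping the network to improve information flow comes in: here Ana is the obvious hub, while Dee and Eli quietly bridge the two halves.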

I’m already thinking up different experiments that I might try to increase information sharing and connectedness of communities, both at work and among professional contacts. Now that my initial investigation has been fruitful, I’ve switched to working my way methodically through each page of the 7 Rules for Change and it’s helping me to sort out and prioritize those potential interventions. Providing more serendipity and more informal opportunities to connect with each other matters to me – and I’m so glad to have Esther’s insights to help guide my exploration. As she says, “Heuristics point a way, and methods and models guide action.”

DevNexus 2019 links

20 Wednesday Mar 2019

Posted by claire in Coaching, Context, Developer Experience, DevNexus2019, Events, Personas, Speaking, Training, User Experience


Thanks to everyone who came to our Time to Good DX presentation!

Time to Good DX

We often hear “focus on the customer,” but what do you do when your customers are your coworkers? Developers are the largest group of individual contributors on software teams. It’s about time Developer Experience (DX) got the focus it deserves! Devs are users, too! Wouldn’t it be great if your user needs were met?

DevNexus – TimeToGoodDX – Handout [Download]
Time to Good DX from Claire Moss

I know an hour isn’t enough time to delve deeply into this area, so here are some links to help you to explore this important subset of UX!

Articles

Time to Hello World and this

Drink your own champagne

API docs as affordance and this

Communication and this

Development pain points

Characteristics of good DX

Great APIs – heuristic analysis

Developers as a special case of users

Product management in platform products and in API products

API model canvas

(Vanilla UX)

UX personas

Presentations

Great DX in 5 minutes!

Platform as Product

More platform as product

DX Segments

DX Principles

DX Trends

UX tools for Developer users

Lean Enterprise

Reports

Developer Skills [PDF]

Podcasts

Don’t Make Me Code

Greater than Code

Tooling

git-utils

assertj-swagger

Examples of DX

Jest automation framework

Netflix DX

Faster deployment

Visualizing metrics

Stripe API docs

Twilio API docs

Open source triage

Apigee DX

Salesforce DX and this

Minimum Viable Product Manager

29 Wednesday Aug 2018

Posted by claire in Agile, Agile2018, Approaches, Context, Experiments, Metrics, Protip, Retrospective, Scrum, Soft Skills, Training, User Stories


At Agile2018, I attended Richard Seroter’s Product Ownership Explained session, where I heard about bad and good product owners. Product ownership/management has many facets, including

  • advocating processes and tools
  • style of leadership
  • customer interactions
  • relationship with engineers
  • approach to continuous improvement
  • product lifecycle perspective
  • sourcing backlog items
  • decomposing work
  • running through a sprint
  • meeting involvement
  • approach to roadmap
  • outbound communication

Now I’ve been working alongside many customer proxy team members (e.g. business analyst, product owner, product manager) over the years. I’ve learned how to create testable, executable, deliverable user stories in a real-world setting. I wasn’t going into this talk blind. I just haven’t always focused on the Product role.

This time, I looked at the role with the mindset of what it would take for me to check all the boxes in the “good” list. As each slide appeared, my list of TODOs lengthened. I started to feel overwhelmed by the number of things I wanted to improve…

“How you doin’, honey?” “Do I have to answer?!?”

I walked out of that talk thinking I’m not sure I want to sign up for this epic journey. The vision of the idyllic end state was more daunting than inspiring. How could I possibly succeed at this enormous task? Would I want to sign up for that? My initial reaction was no! How could I take on all the technical debt of stretching into a new role like Product? How long would the roadmap to “good” take?

Analysis

When I evaluate things off the cuff, I often consider good-bad-indifferent. Maybe knowing what “good” and “bad” look like wasn’t helping me. I knew I didn’t want to be merely “indifferent”… maybe what I really wanted to know was this:

What does a minimum viable product manager look like?

One of my big takeaways from Problem Solving Leadership (PSL) with the late, great Jerry Weinberg was that limiting work in process (WIP), or “one thing at a time” (as close to single-piece flow as possible), improves effectiveness. If I take that approach to a PO/PM role, I’m afraid that I would completely fail. So I will reduce the practices to as few as I possibly can without completely losing the value of the role. I want only the *critical* path skills or capabilities! Everything else can be delegated or collectively owned or done without. So what can I discard?
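One way to put numbers behind that WIP intuition is Little’s Law: average cycle time equals average WIP divided by throughput. The figures below are invented for illustration:

```python
# Little's Law: average cycle time = average WIP / throughput.
# The throughput (2 finished items/day) and WIP levels are made-up figures.

def avg_cycle_time_days(avg_wip: float, throughput_per_day: float) -> float:
    return avg_wip / throughput_per_day

# Same team, same throughput, different WIP policies:
unconstrained = avg_cycle_time_days(avg_wip=10, throughput_per_day=2)  # 5.0 days per item
wip_limited = avg_cycle_time_days(avg_wip=2, throughput_per_day=2)     # 1.0 day per item
```

Capping WIP doesn’t by itself make the team finish more work, but feedback on each item arrives five times sooner, which is the “one thing at a time” payoff.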

In this thought experiment, I’m proposing finding the least possible investment in each essential aspect of the PO/PM role that would move from bad past merely indifferent to viable (but only just!). I needed to reduce my expectations! If I allow minimum viable to rest somewhere in my default scale, then it fits between indifferent and good. That means I deliberately do *not* attempt to inject all of the good practices at once. So let’s revisit the axes of expertise and the lists of behaviors that are good and bad…

What’s the least I could do?

Decomposition

Advocating processes and tools

Good: contextual & explanatory & collaborative (fitting process to team + pragmatic tool choices + only important consistency + explains process value + feedback leading to evolving)
Minimum viable: pragmatic minimalism (choose a simple process & let practices earn their way back in as value is explained + choose an available tool + allow consistency to emerge + request feedback)
Indifferent: status quo (follow existing process/ceremony w/o questioning + let others choose tools + don’t justify)
Bad: dogmatism (one practice fits all + adhere to ceremony + prescribed toolchain + standardization + trust process + don’t justify)

Style of leadership

Minimum viable: leads by example (models behaviors for others without trying to modify their behaviors) + doesn’t worry about respect + consultative decisions + experiments/loosely decides + sometimes available to the team but not constantly + flexible + defaults to already available metrics

Customer interactions

Minimum viable: meets with customers at least once + builds casual relationship with a key customer + gets second-hand reports on Production pain + occasional customer visit + default information sources

For me, this one slides a bit too far toward indifferent… I’m not sure how little I could care about customers and still get away with being acceptable at PO/PM…

Relationship with engineers

Minimum viable: physically co-locates when convenient + T-shaped when it comes to the technical domain (i.e. aware but not trying to develop that skill as an individual contributor) + attends standup + shares business/customer/user information at least at the beginning of every epic + champion for the product & trusts everyone on the team to protect their own time

Approach to continuous improvement

Minimum viable: default timebox + takes on at most 1 action item from retrospective, just like everyone else + plans on an ad hoc/as needed basis (pull system) allowing engineers to manage the flow of work to match their productivity + prioritizes necessary work to deliver value regardless of what it’s called (bug, chore, enhancement, etc)

Product lifecycle perspective

Minimum viable: tweaks customer onboarding in a small way to improve each time + cares about whole cross-functional team (agile, DevOps, etc) + asks questions about impact of changes + allows lack of value in an existing feature to bubble up over time

Sourcing backlog items

Minimum viable: occasionally talks to customers + cares about whole cross-functional team (including Support) + backlog is open to whole team to add items that can be prioritized + intake system emerges + tactical prioritization

I do have twinges about the lack of strategy here, so I guess I’m looking at this part of minimum viable Product *Owner* (i.e. the mid-range focus that Richard points out in his 10th slide).

Decomposing work

Minimum viable: progressive elaboration (i.e. I need to know details when it’s near term work and not before) + thin vertical slices and willing to leave “viable” to the next slice in order to get a tracer bullet sooner + trusts the team to monitor the duration of their work & to self-organize to remove dependencies (including modifying story slicing)

Running through a sprint

Minimum viable: doesn’t worry about timeboxes (kanban/flow/continuous/whatever) + focus on outcome of each piece of work (explores delivered value) + releases after acceptance (maybe this is just continuous delivery instead of continuous deployment, depends on business context)

Meeting involvement

Minimum viable: collaborates with team members to plan as needed (small things more often) + participates in retrospectives + ongoing self-study of PO/PM

Approach to roadmap

Minimum viable: priorities segmented by theme + roadmap includes past delivery/recent accomplishments + adjusts communication as needed/updates for new info + flexible timeline in a living document + published roadmap accessible to all stakeholders on self-serve basis

Outbound communication

Minimum viable: allows org to self-serve info + shares priorities with manager & customers + environment for continuous self-demo/trying features + transparency

What are the minimum viable versions of the tools of a product owner?

  • Backlog – list of ideas not fleshed out until it’s time to run them
  • Sprint planning – ad hoc meetings in a pull system initiated by the need for work definition to execute
  • Roadmap – technical vision of system capabilities + compelling story of the product value proposition
  • Prototyping, wireframing – whiteboard pictures + text-based descriptions
  • Team collaboration – a big TODO list that everyone can access
  • Surveying/user testing – chat program that both team & user can access
  • Analytics – NPS score informally collected from customer conversation
  • Product visioning – I think this goes in with Roadmap for me?
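For the Analytics bullet, the NPS arithmetic itself is tiny (percent promoters minus percent detractors), which is part of why informal collection can be viable. The scores below are hypothetical:

```python
# Net Promoter Score from informally collected 0-10 ratings.
# The scores are hypothetical responses jotted down from customer chats.

def nps(scores: list[int]) -> float:
    promoters = sum(1 for s in scores if s >= 9)    # 9s and 10s
    detractors = sum(1 for s in scores if s <= 6)   # 0 through 6
    return 100.0 * (promoters - detractors) / len(scores)

chat_scores = [10, 9, 8, 7, 6, 10, 3, 9]
score = nps(chat_scores)  # 4 promoters, 2 detractors out of 8 -> 25.0
```

A handful of chat-sourced scores is obviously noisy, but it fits the minimum-viable spirit: a directional signal with almost zero tooling investment.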

So I’ll agree that the PO/PM role is critical and necessary. I would like for creative problem solvers to fill the role – and to be fulfilled by the role! In order for that to be viable, for people to grow into a Product role, there needs to be education on how to begin – and it can’t be sprung-fully-formed-from-the-head-of-Zeus! Christening someone PO/PM doesn’t endow them with sudden wisdom and insight. Skill takes time to develop.

Set realistic expectations for beginners. Help teams to welcome people to grow in the role by offering both challenge and support from all the team members. As with any team need, the agile team has collective ownership to solve the problem, not relying on a single point of failure in the role specialist. Having a beginner PO/PM is an excellent time to reinforce that!

Don't worry, people. I so got this!

If I were a Product Manager, I would definitely prefer to be a full-featured representative of that specialization! However, I encourage you to revisit Richard’s presentation and do your own decomposition of the Product role. What is absolutely essential? What can you do without?

What is *your* minimum viable Product Manager?

Agile Testing Days USA links

27 Wednesday Jun 2018

Posted by claire in Agile, Agile Testing Days USA, Approaches, Coaching, Context, Experiences, Experiments, Exploratory Testing, Podcast, Publications, Soft Skills, Speaking, Training


Refactoring Test Collaboration from Claire Moss

Here are some resources we’re using in my Agile Testing Days USA workshop Refactoring Test Collaboration

Slides

Abstract

Collective ownership for testing starts with understanding testing. Rework your team dynamics to evolve past duplication and improve performance through whole team testing. Take home practical patterns for improving your team’s collaboration on testing. Because teams who own testing have more confidence in the customer value of their results.

As the Pragmatic Programmers say, “refactoring is an activity that needs to be undertaken slowly, deliberately, and carefully,” so how do we begin? In this session, we will experience the complex interactions of an agile team focused on demonstrating customer value by answering a series of questions:

  • Where do testers get their ideas?
  • How are you planning to accomplish this proposed testing, tester?
  • Why not automate all the things?
  • Who is going to do this manual testing and how does it work?
  • How do we know whether we’re testing the right things?

Build your own list of TODOs from these various practical collaboration approaches and begin deduping your team’s testing for a better first day back at the office.

Key-Learning

  • Approaches to handle objections to executing the testing work
  • Ways to mentor test helpers, including pairing
  • Investing in testing the team believes in
  • Understanding how other team members have been testing the work so far
  • Advising on opportunities to inject test thinking into all of the team activities, from story writing through to unit testing, to make the system more testable

Resources

Refactoring

Collaboration + failing at collaboration

WHOSE testing skills + Exploratory testing + Elisabeth Hendrickson’s Test Heuristics Cheat Sheet [PDF] + book Explore It!

Agile Manifesto

Walking Skeletons, Butterflies, & Islands + my blog post elaborating on the conference

Big Visible Testing + my blog post elaborating on the presentation

Testing pyramid + critique of the testing pyramid/alternatives

Extreme programming lifecycle

eBook: Katrina Clokie’s A Practical Guide to Testing in DevOps + Role mapping

Westrum model + organizational culture & safety

Linda Rising’s change patterns & books on Fearless Change

Deployment pipeline

High Performance Practices [PDF] + book Accelerate

Continuous Testing

Empathy-Driven Development + empathy practices

Many interactive aspects of my workshop were inspired by Sharon Bowman’s book Training From the Back of the Room

facilitation book Collaboration Explained

metrics book Measuring and managing performance in organizations

book Testing Extreme Programming + some follow-up thoughts

Soon to come! Claire Moss on Let’s Talk About Tests, Baby podcast

Organizing meetups

03 Friday Mar 2017

Posted by claire in Approaches, Context, Experiences, Experiments, Lean Coffee, Protip, Retrospective, Social media, Soft Skills, Software Testing Club Atlanta, Speaking, Training, Unconference, Volunteering


Announcing Ministry of Test Atlanta

Last fall we held the final Software Testing Atlanta Conference (STAC). An attendee at my Intentional Learning Workshop chatted with me afterward. I mentioned that I have been a local meetup organizer and have struggled with how much control to retain. My attendee urged me to give the meetup back to the community, and I have been pondering that ever since.

I’ve been the primary organizer of the Software Testing Club Atlanta meetup since we began as an affiliate of the UK-based Software Testing Club in October 2013. My charter has always been to serve and develop the local testing community including connecting it with the global virtual community. Not everyone agreed about including digital attendees, but I am willing to experience the friction of a virtual meeting to help people to attend who otherwise would not have a chance. Inclusion matters to me.

I also prefer small groups and experiential events/activities that Justin talks about. I have never had a goal of increasing the size of our meetup beyond what a single facilitator could manage in a workshop.

STAC was just a bigger extension of the meetup for me. I always wanted to reach more people in the local community, so putting together a conference focused on my geographic region was a great chance to bring new local voices to the fore. I never wanted it to be a big formal event, so I’m working on an ATL software testing unconference for the fall: shortSTAC. More on that to come!

This has been an awesome ride over the last 3 years, but we’re re-branding and branching out into our very own Meetup now known as Ministry of Test Atlanta!

Please join us to keep up with our events!

 

As part of our reboot, I wanted to share some thoughts on what challenges a meetup organizer confronts every month and why monthly events are so difficult to sustain!

Meetups are tough for reasons

 

1. Location, location, location!

People interested in testing are spread out across ATL and traffic suuuuuucks. Plus, I have no budget, so someone has to be willing to host for free or sponsor the venue fee $$. I don’t want to hold the meetup only in one part of the city since that alienates interested test enthusiasts. Proximity to public transit is something I’m not sure matters, but it would make the meetup more accessible to more testers.

Over the past 3 years, we’ve had completely different crowds depending on which part of the city we chose. I preferred to rotate locations to give everyone some opportunity to attend, even though that introduced uncertainty that probably negatively affected attendance… It’s impossible to make the “right” choice for everyone who *might* attend…

Anyway, I work at VersionOne now and that means I can host, so that’s one variable taken care of!

2. Scheduling

We hold meetings on weeknights assuming that people are more likely to do work-related things on workdays – and would be more reluctant to give up their weekend fun time to work-ish things. Getting all of the stars aligned to schedule these meetups monthly *and* give enough time for people to RSVP and then work out the logistics of showing up… Timing is hard.

Since we tend to meet after work, providing food and drink encourages people to attend, but that’s not free… and I have no budget.

3. Funding

Food and drink cost $$ – someone has to be willing to sponsor the foodz and drinks

Possible sources of funding:

  • donations from individual attendees
  • local sponsors (probably companies)
    • I’ll have to check on company budget to see whether I can do pizza & sodas every time but I know I can do it sometimes.
  • the Association for Software Testing
  • Software Testing Club/Ministry of Test
  • or even the Agile Alliance.

4. Content

Not everyone wants to present or run a workshop or host a round table or … yeah. People will show up but may not want to provide content. I have to find a willing volunteer to do it for free or someone to sponsor a fee $$.

We infrequently have presentations. Most of our events are workshops or round tables or some sort of interactive experience. My go-to is Lean Coffee since it lowers the barrier to getting groups together and provides value to attendees every time.

I’m definitely interested in scheduling joint events with other Atlanta meetups in the future.

5. Publicity

How do people find out about meetings? I do the social media management, but I have no budget so … mostly word of mouth otherwise? Maybe chat rooms?

  • Software Testing Club
  • Twitter
  • Facebook
  • Google Plus
  • LinkedIn

6. Audience

I assume that most of the people who want to come to a testing meetup are testers, but not all test enthusiasts are testers. We’ve had development-types show up, so I want to keep it open and inclusive.

7. Viewpoint advocated

I refuse to insist people agree with me. I won’t call it a context-driven testing meetup or an agile testing [PDF] meetup because I want to welcome people who subscribe to other philosophies of testing. That said, I also don’t want vendor talks (and yes I work for a vendor now). This group is for engaging with ideas focusing on and around testing, not for mind-clubbing or selling or exchanging business cards. Active participation is expected and encouraged.

8. Volunteers

Organizing: While I have always had a core group of enthusiastic participants, I’ve never had a formal organizing committee. Being a one-woman-show most of the time is pretty exhausting, y’all. The meetup consumed lots of my free time. I made my professional hobby the primary thing I did for fun outside of the office for years. Um… not a sustainable model. I do not recommend it. At the same time, working with others means compromise, so consider carefully the tradeoffs and find allies who believe in your mission.

Presenting: Members of my core group have all helped out with content for the meetup – for which I am eternally grateful! I’ve also encouraged other local aspiring presenters to practice on us. Occasionally, someone I know from the wider testing community is in town and joins us to share their wit and wisdom. I resisted presenting at my own event for a long long time… until I needed content LOL

Perception and Certainty

27 Friday Feb 2015

Posted by claire in Approaches, Context, Design, Experiences, Experiments, Soft Skills, Testing Humor, Training


A funny thing happened today at work. I found out that some of my colleagues literally see things differently. Many of us found ourselves surprised by what others perceived to be true about something as simple as an image. We were swept up in #dressgate: a raging internet controversy about a photo of a dress and its colors.

I’m on Team Blue and Black. However, I wanted to see how the other half lives. I tried various ways to see white and gold: viewing the image on different devices, changing screen brightness, angling the screen, walking around in different ambient light. The various experiments all produced the same results. Trusting my perceptions, I could not give any credence to the perspective that the dress was a different pair of colors, despite seeing many online posts to that effect.

I mentioned this to my team at work, only to discover that there were others who had no idea anyone disagreed with them. As a member of Team White and Gold, my team’s designer was surprised to hear there was a Team Blue and Black – as surprised as I was. 🙂 I couldn’t help wondering whether she was expecting a covert camera to emerge as part of some elaborate prank.

Fortunately, working with designers means having deeper organizational knowledge about colors. By the time lunch rolled around, one colleague had created an online tool for experimenting with the image so we could see for ourselves how image manipulation changes perception. Another designer mentioned that he had sampled the original image to identify its colors, then created swatches of the colors perceived by each side and overlaid them on the image to show both positions contrasted with each other, explaining the impact of shadows and subtle colorblindness.
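The swatch comparison the designer described can be sketched in code. Here's a minimal Python sketch comparing two candidate dress colors by their WCAG relative luminance (a standard measure of how light a color appears); the RGB triples are hypothetical stand-ins for sampled swatches, not actual pixel values from the photo:

```python
# Compare two sampled swatch colors by relative luminance (WCAG 2.x formula).
# The RGB triples below are hypothetical stand-ins, not real samples
# from the dress photo.

def relative_luminance(rgb):
    """Relative luminance of an sRGB color with 0-255 channels (0.0 = black, 1.0 = white)."""
    def linearize(c):
        c /= 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

team_blue = (93, 110, 160)   # a bluish-gray swatch (hypothetical)
team_gold = (190, 160, 100)  # a light gold swatch (hypothetical)

# A gold fabric in shadow and a blue fabric in bright light can land in
# overlapping luminance ranges, which is part of why the same pixels
# read so differently depending on assumed lighting.
print(relative_luminance(team_blue), relative_luminance(team_gold))
```

Overlaying flat swatches like these next to the photo, as the designer did, strips away the lighting context that drives each viewer's interpretation.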

Designers FTW!


Then, he suggested another avenue of investigation: flash blindness. In flash blindness, a bright light bleaches (oversaturates) the retinal pigment, causing a sudden loss of vision that doesn't return to normal immediately but instead wears off gradually. So my team devised an experiment to expose our designer's eyes to a bright white light source: a blank page on a screen. When she quickly switched from the bright white background to the original dress image, she was able to see the blue and black coloration. However, after a few moments, when she glanced at the dress image again, her retinas had recovered and she saw the original white and gold colors. This was consistent with reports from other online posters who mentioned scrolling down the page and then being able to see different colors. This transient state seemed to be a source of great consternation and some panic.

While this was a fun way to spend our lunch hour, it was also a great opportunity to practice some of the problem-solving skills I learned at last year’s Problem Solving Leadership workshop:

  • Experimenting to gather information – Although I was not able to see the white and gold version of the dress without manipulating the image, I learned new ways that didn’t work.
  • Perceptions, What’s true for you – I felt quite certain about the stability of my own perceptions after examining the image from various angles.
  • Watch how other people are behaving – While I thought it was quite surprising that many others had such completely different perceptions, I did not assume they were wrong just because I couldn’t observe the same things.
  • Be cautious about not noticing – I gave others the benefit of the doubt knowing that I can bias myself to ignore information sometimes.
  • How to take in info – I looked for a variety of sources of information about the disparate points of view to obtain a balanced set of data.
  • Resisting information – I paid attention to reports of heated arguments between people from the different viewpoints, noticing the emotion involved in what seemed like a purely factual question.
  • Motives (test interpretation, seek intent) – I asked two observers from Team White and Gold questions since they could see what I could not.
  • Reading minds – I tried not to assume that anyone was punking me or simply being ornery but instead was open to the possibility of being wrong.
  • Style vs intent (make more congruent) – Rather than trying to convince anyone of my point of view, I listened to their experiences and observed their learning process.
  • Social structures – It was interesting to see that even within the design group there were opposing assessments of the information. I also saw how team members collaborated rather than confronted each other when trying to understand where each was coming from.
  • How do you get people to recognize what you saw? – I waited for an opportunity for them to experience it directly and shared the information that I had so the other team members could judge for themselves, now that they had more data to work with.
  • Show you care by speaking up – I could have ignored people who didn’t agree with me, dismissing their viewpoint as simply wrong. However, engaging in dialogue was a great team-building experience and helped to establish more common understanding.
  • Reactions – By giving myself a charter of observing others’ behavior, thought processes, and evidence, I was better able to empathize with what was a shocking experience from their point of view.
  • Eyes open! Use your senses – I took suggestions from the designers about resources for assessing color perception and did not assume that I could gather unbiased information. In the end, I know more about myself than I did when this silly discussion started.
  • Learn from others – I certainly know more about color, perception, troubleshooting, experimentation, and these particular colleagues than I did before I posted the question “What color is this dress?” so I call today a win. 🙂
  • Aaaaand I couldn’t help trolling just a little bit by “wearing the colors” today…

Blue-Black or White-Gold?



March 2014 Software Testing Club Atlanta meetup

27 Thursday Feb 2014

Posted by claire in Approaches, Context, Experiences, Software Testing Club Atlanta, Speaking

≈ Leave a Comment

RSVP for the March 2014 meetup of Software Testing Club Atlanta, which features our own Eric Jacobson’s “Maybe We Don’t Have to Test It” from STAREast 2013:

Testers are taught they are responsible for all testing. Some even say “It’s not tested until I run the product myself.” Eric Jacobson believes this old school way of thinking can hurt a tester’s reputation and — even worse — may threaten the team’s success. Learning to recognize opportunities where you may not have to test can eliminate bottlenecks and make you everyone’s favorite tester. Eric shares eight patterns from his personal experiences where not testing was the best approach. Examples include patches for critical production problems that can’t get worse, features that are too technical for the tester, cosmetic bug fixes with substantial test setup, and more. Challenge your natural testing assumptions. Become more comfortable with approaches that don’t require testing. Eliminate waste in your testing process by asking, “Does this need to be tested? By me?” Take back ideas to manage not testing including using lightweight documentation for justification. You may find that not testing may actually be a means to better testing.

As quality assurance manager for Turner Broadcasting System’s Audience & Multi-Platform Technologies (AMPT) group, Eric Jacobson manages the test team responsible for Turner’s sales and strategic planning data warehouse and its broadcast traffic system. Eric was previously a test lead at Turner Broadcasting, responsible for testing the traffic system that schedules all commercials and programming on Turner’s ten domestic cable networks, including CNN, TNT, TBS, and Cartoon Network. Prior to joining TBS, he was a tester at Lucent Technologies. Eric joined the tester blogosphere in 2007 and has been blogging about testing on testthisblog.com every week since.

When Meetup.com is back up, I’ll link to the page so you can RSVP.

For now, plan to join us on the evening of Thursday March 20th.

Location TBD (Let me know if you want to host us!)

Testing For Humans? Try Empathy

11 Friday Oct 2013

Posted by claire in Context, Experiences, Experiments, Personas, Skype Test Chat, Soft Skills, Test Retreat, User Experience, User Stories

≈ 1 Comment

sympathy-empathy
Two months ago, Matt Heusser organized Test Retreat and I attended, along with 27 other testers open to new ideas and wanting to change the world. Sound a little ambitious? Let’s find out!

My first blog post in this series is about Michael Larsen‘s Testing For Humans, which he was unable to live blog due to presenting. 🙂 However, Michael did post Alessandra Moreira‘s notes in his live blog of the event.

Fortunately, I was able to live-tweet the talk that he described as:

Testing for Humans: It’s Sticky, Messy and Very Satisfying

Abstract: Software development is a beautiful thing. We often create amazing ideas and features that would be wonderful… if only those pesky humans that end up using, abusing, and misunderstanding our brilliant code weren’t part of the equation. Unfortunately, we’re all in the business of developing software for people (yes, even when it’s machine to machine communication, it serves human beings somewhere). What are some ways that we can approach testing software outside of the theoretical ideals, and actually come to grips with the fact that real human beings will be using these products? How can we really represent them, test for and on behalf of them, and actually bring in a feature that will not just make them happy, but represent the way they really and actually work, think and act?

Expected Deliverables: An excellent debate, some solid strategies we can all take home, and some “really good” practices that will be helpful in a variety of contexts.

My take-aways were:

    • People are imperfect so ideal users aren’t enough for testing.
    • By definition, a composite of many people (e.g. user persona) is a model.
    • Too many user descriptions based on small differences are overwhelming and not practical for testing.

On Wednesday night of this week, I joined Christin Wiedemann‘s regularly scheduled Skype test chat with some other lovely wonderful tester folks and we focused on empathy in testing. We wrestled our way to some working definitions of empathy and sympathy, which was much better than shallow agreement though it took a bit of time to establish. We agreed that testers need to observe, focus on, and understand users in order to serve them better. We find that empathy for our users and passion for our testing work go hand-in-hand since we care about helping people by producing good software.

Then we struggled with whether empathy is an innate trait of a person testing or whether empathy is a learnable skill that testers can develop through deliberate practice. (Go watch the video behind that link and come back to me.) We concluded that knowing what others are thinking and feeling, getting inside their skins, in the context of using the software is essential to good testing, though this might require a bit of perseverance. This can go a long way toward avoiding thinking we have enough information just because it’s all we know right now.

As I mentioned in the chat, I’ve found that user experience (UX) design is an amazing ally for testers. One tool that helped me to develop more empathy for my users is user personas. (Later, I found that forming user personas of my product teammates helped me to develop empathy for them as well.)

I immediately took to (end) user personas as a natural progression from user stories. After all, user stories are focused on value to and outcomes for a particular group of users. Describing those users more specifically in a user persona dovetailed nicely. Rather than some sterile requirements that never name the user, identifying a role – or, even better, a rich symbol such as a named primary persona – focuses the product team’s efforts on serving someone by helping us to understand the purpose of the work we do.

We also discussed interviewing users, visits to users, and experiential exercises as techniques to help us call upon empathy when we are testing. In my work history, I’ve been fortunate to hear about my UX team’s great work in a collaborative design workshop, to contribute to designing ad hoc personas for my product, to participate in a UX-led contextual inquiry, and to log actual usability sessions led by my product team’s UX designer. (Yes, my fast fingers came in handy. Yuk yuk.) My innovation days team developed a usability logging product that evolved from an existing solution I tested/used in those usability sessions, so I was a natural fit to test it. I’m curious about empathy maps but haven’t tried them myself yet.

It’s fair to say I’m a UX-infected tester. More than fair. I identify with the curiosity I see in the UX profession and I admire the courage to kill their darlings (carefully crafted designs) when evidence shows it is time to move on. After all, we’re not building this product to marvel at our own cleverness but instead to serve humans.

Image credit

← Older posts

♣ Subscribe

  • Entries (RSS)
  • Comments (RSS)

♣ Archives

  • November 2024
  • October 2019
  • September 2019
  • August 2019
  • March 2019
  • February 2019
  • November 2018
  • August 2018
  • June 2018
  • May 2018
  • March 2017
  • August 2016
  • May 2016
  • March 2015
  • February 2015
  • February 2014
  • January 2014
  • November 2013
  • October 2013
  • September 2013
  • August 2013
  • July 2013
  • May 2013
  • April 2013
  • February 2013
  • December 2012
  • November 2012
  • October 2012
  • August 2012
  • July 2012
  • April 2012
  • March 2012
  • February 2012
  • January 2012
  • November 2011
  • October 2011
  • September 2011
  • August 2011
  • July 2011
  • June 2011
  • May 2011

♣ Categories

  • #testchat
  • Acceptance Criteria
  • Agile
  • Agile Testing Days USA
  • Agile2013
  • Agile2018
  • AgileConnection
  • Approaches
  • Automation
  • Better Software
  • CAST 2011
  • CAST 2012
  • CAST 2013
  • CAST2016
  • Certification
  • Change Agent
  • Coaching
  • Context
  • DeliverAgile2018
  • Design
  • Developer Experience
  • DevNexus2019
  • DevOps
    • Reliability
  • Events
  • Experiences
  • Experiments
  • Exploratory Testing
  • Hackathon
  • ISST
  • ISTQB
  • Lean Coffee
  • Metrics
  • Mob Programming
  • Personas
  • Podcast
  • Protip
  • Publications
  • Retrospective
  • Scrum
  • Skype Test Chat
  • Social media
  • Soft Skills
  • Software Testing Club Atlanta
  • Speaking
  • SpringOne2019
  • STAREast 2011
  • STAREast 2012
  • STARWest 2011
  • STARWest 2013
  • Tea-time With Testers
  • Techwell
  • Test Retreat
  • TestCoachCamp 2012
  • Tester Merit Badges
  • Testing Circus
  • Testing Games
  • Testing Humor
  • Training
  • TWiST
  • Uncategorized
  • Unconference
  • User Experience
  • User Stories
  • Visualization
  • Volunteering
  • Weekend Testing

♣ Meta

  • Log in

Proudly powered by WordPress Theme: Chateau by Ignacio Ricci.