Agile2015 has ended


Testing & Quality
Monday, August 3
 

14:00

The Product Owner's Guide to Writing Acceptance Tests (Paul Carvalho)

Abstract:
Aristotle once said "Well begun is half done." User Stories need 3 C's to get the team going, not two. Don't skip the Confirmation/Acceptance Criteria!
There is help for Product Owners (or BAs) who want to write good acceptance criteria. ATDD and BDD suggest writing these acceptance tests in Gherkin notation. Given-When-Then may seem odd or intimidating at first glance, but it gets easier with understanding and practice. Writing good acceptance criteria for User Stories requires looking at requirements in a slightly different way… and maybe picking a tester's brain too.
Join Agile Coach and Testing consultant Paul Carvalho as he shares insights, models and tips to help you embrace the art of writing good tests and get your teams started on the right track. Learn new ideas and techniques to get the most from your Product Backlog Refinement activities, to benefit your teams and customers.
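As a sketch of the Given-When-Then notation described above (the story and steps here are invented for illustration, not taken from the session):

```gherkin
Feature: Cash withdrawal
  As an account holder, I want to withdraw cash
  so that I can get money when the bank is closed.

  Scenario: Successful withdrawal within balance
    Given my account balance is $100
    When I withdraw $40
    Then $40 should be dispensed
    And my account balance should be $60
```

Each Given establishes context, each When describes the action, and each Then states an observable, checkable outcome — which is what turns a criterion into a test.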
Learning Outcomes:
  • Help Product Owners write clear acceptance criteria for user stories
  • Overcome the fear of Given-When-Then format for acceptance criteria
  • Learn what Testers may not realize about how they can help their teams learn to test better

Speakers
Paul Carvalho

Agile Coach, Trainer, Quality Driven Inc.
Paul is a Testing expert, Agile coach, interactive teacher, Rubyist, comic relief, and efficiency enthusiast with over 20 years of experience in various domains. A Quality consultant by trade, Paul helps companies deliver world-class value. Beware: his eyes sparkle when he talks about…


Monday August 3, 2015 14:00 - 15:15
Chesapeake 7/8/9

15:45

Does the tester role survive in a test-infected team? (Juan Gabardini)

Abstract:
In a test-infected team, everybody takes part in testing and quality. So the question is: which testing perspectives and skills should the team develop? Should some members remain 'testers'? Are those perspectives and skills the same as in the traditional tester role?
Testers were often second-class citizens in traditional teams. When evolving towards agility, some feel they should become programmers or become obsolete because, since everybody tests, the tester role is no longer needed. In other teams, someone keeps the ‘tester’ tag, but it is unclear what they should do.
In this session I will present some ideas on the tester role in both traditional and test-infected teams, and open a discussion to answer the above questions, not as a black-or-white choice but as a context-aware response.
We will also discuss how to bring the testing mindset and skills to the whole team.
Learning Outcomes:
  • Skills and mindset in the traditional tester role
  • Skills and mindset of the tester role in a test-infected team
  • Heuristics, books, and exercises to develop testing skills and mindset in the whole team

Speakers

Monday August 3, 2015 15:45 - 17:00
Potomac 5/6
 
Tuesday, August 4
 

09:00

Automated Testing of Mobile Apps (Karl Krukow)

Abstract:
Quality on mobile is a challenge! Developing mobile apps requires dealing with multiple platforms, OS versions, form-factors and resolutions, varying hardware capabilities and network conditions. At the same time, users have extremely high expectations for the mobile experience and will promptly punish with bad App Store reviews when disappointed. User expectations are set by fast-moving consumer apps such as Facebook, YouTube and Google Maps which deliver fast, responsive, quality apps with frequent release cycles.
Do you want to get started with automated testing (and perhaps BDD) while delivering on the technical challenges posed by mobile? This session is for you! The talk aims to inspire and empower attendees to start mobile test automation today -- the time is right and the tools have matured.
We set the stage by discussing the challenges of mobile quality, and argue that automation is central to scaling QA and moving towards continuous delivery. Then we show you a way forward by providing an introduction to the most popular open-source mobile test automation tools, suitable for use with the most popular devices. We have a love for BDD, and in our demos we show how to create executable specifications for mobile apps which act as cross-platform automated acceptance tests.
If time permits, we will very briefly show how one might tackle the device fragmentation problem using Xamarin Test Cloud, a cloud-based service that provides managed access to more than a thousand mobile devices for the purpose of mobile quality assurance.
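The executable specifications mentioned above are typically written in Gherkin and backed by step definitions on each platform. A hypothetical cross-platform acceptance test might look like the following (the app, screens, and step wording are invented for illustration; they are not from the session's demos):

```gherkin
Feature: Login
  Scenario: Valid credentials take the user to the home screen
    Given the app has launched
    When I enter "alice@example.com" into the email field
    And I enter a valid password
    And I tap the "Sign in" button
    Then I should see the home screen
```

The same feature file can run against both the iOS and Android builds, with only the platform-specific step implementations differing.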
Learning Outcomes:
  Participants will leave with:
  • An understanding of the importance of quality in mobile
  • A strong belief that professional practices can be applied in mobile too: the tools have matured, and automated testing, BDD, continuous integration and (almost) continuous deployment are now viable and necessary
  • A way forward: how to get started today with cross-platform automated testing for mobile
  • Advice on technical best practices and pitfalls from an experienced practitioner
  • A fun experience in their backpacks ;)

Speakers
Karl Krukow

Lead, Xamarin Test Cloud, Xamarin
Karl Krukow is the Technical Lead on Xamarin Test Cloud, an innovative cloud-based service that provides managed access to more than a thousand mobile devices for the purpose of mobile quality assurance. Before joining Xamarin, Karl co-founded LessPainful, a mobile test automation…


Tuesday August 4, 2015 09:00 - 10:15
National Harbor 6/7

10:45

Explore with Intent - Exploratory Testing Self-Management (Maaret Pyhajarvi)

Abstract:
As an active learner, you will get better every day you spend on testing. Exploratory testing treats test design, test execution and learning as parallel, mutually supportive activities, to find things we don’t know we don’t know. Doing things in parallel can be difficult, and testing needs to adjust to the tester’s personal skill level and style. Your skill to self-manage your work and your learning - making learning and reflection a habit - is what differentiates skilled exploratory testing from randomly putting testing activities together.
This session teaches you how to explore with intent that fits your personal style and skill, and how to be courteous towards your team members with your information needs. For self-management skills of exploratory testing, we use a notebook thinking tool that focuses on four types of ideas in parallel to keep track of our exploration: Mission (sandboxing my services), Next charter (goal for a timebox), Details (notes I can act on now or postpone a little) and Other charters (identifying more work).
In addition to sharing stories and notes I’ve created on a notebook while I test, we will practice together the most difficult thing to do in parallel: focus on detail and the big picture of testing.
Learning Outcomes:
  • Learn to test with intent that fits your personal style and skill, using a simple self-management tool
  • Learn how two hours of testing can be completely different in content, and how you control that content
  • Learn to keep track of what you are about to do, even when the plan is supposed to change as you learn, so that you know when you are done
  • Learn to handle interruptions to your testing to improve its flow: report or ask now or later, and collect ideas of what to test later while you are testing

Speakers

Maaret Pyhäjärvi

Testing Specialist, Granlund Oy
Software specialist with soft spots for hands-on testing, helping teams grow and building successful products and businesses. Agile, Lean, Lean startup are mindsets I work with. I work towards happiness and wellbeing of people who create software - not just programmers.


Tuesday August 4, 2015 10:45 - 12:00
National Harbor 12

15:45

Example Mapping (Matt Wynne)

Abstract:
In this session I'll teach you a simple, practical technique that you can use to break down any user story.
BDD and ATDD enthusiasts already know how useful it is to have the three amigos - tester, product owner and developer - meet to discuss a new user story before they start development. What many teams don't have is a clear structure for these conversations. Sometimes they can take a long time, or drain the group's energy by going round in circles.
Over many years of teaching hundreds of people about BDD, I've developed a simple practical technique that will allow you to break down a story in about 25 minutes. All you need is a pack of coloured index cards, some pens, and a curious attitude.
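To illustrate the rules-versus-examples distinction the session covers: a rule is a general constraint on the story, and examples are concrete cases that pin it down. A hypothetical story broken down this way can be written in Gherkin, whose `Rule:` keyword maps naturally onto example-mapping cards (the story and numbers here are invented):

```gherkin
Feature: Free delivery
  Rule: Orders over $50 qualify for free delivery

    Example: Order just over the threshold
      Given my cart totals $50.01
      When I check out
      Then delivery should be free

    Example: Order under the threshold
      Given my cart totals $49.99
      When I check out
      Then I should be charged for delivery
```

In an example-mapping session these would simply be index cards: one yellow card for the story, a blue card per rule, and green cards for the examples underneath each rule.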

Learning Outcomes:
  • the purpose of a three amigos session
  • a practical technique for visualising what you know, and don't know about a user story
  • the difference between rules and examples



Speakers
avatar for Matt Wynne

Matt Wynne

Co-founder, director, Chief Mountaineering Officer, Cucumber Limited
Matt is one of the world's leading BDD practitioners. A programmer, coach, trainer and popular international speaker, he was as invited to join the Cucumber core team in 2009. Together with Aslak Hellesøy, the creator of Cucumber, he's co-author ofThe Cucumber Book, Behaviour-Driven... Read More →


Tuesday August 4, 2015 15:45 - 17:00
National Harbor 3
 
Wednesday, August 5
 

10:45

Agile Testing in the Enterprise (Janet Gregory)

Abstract:
When agile development first gained popularity, agile meant collocated teams, including testers, programmers, analysts, and customers, who were expected to perform many functions. As agile methods have spread and expanded, large organizations and those with globally-distributed teams are facing challenges with testing in their agile deployment. For example, dependencies between teams mean that a single team cannot necessarily have complete control over testing a feature. Economies of scale for testing are something that many organizations have not considered: think about what testing belongs at the team level, and what testing may go beyond it.
Having worked with many such teams, Janet Gregory has observed ways that testing in agile teams can still help deliver a high-quality software product. Whether your agile team is part of an enterprise solution, part of a distributed team scattered across time zones with individuals working remotely from home, or part of an offshore outsourced project, you’ll take away methods and tools to help develop open communication and deal with cultural differences, both within an organization and across continents, specifically related to testing activities.
Learning Outcomes:
  • Concrete ideas for tackling some of the testing issues that large teams and organizations face
  • Suggestions for solving the specific testing issues your team faces

Speakers
Janet Gregory

Agile Coach, DragonFire Inc.
An agile testing coach and practitioner, Janet Gregory (@janetgregoryca) is the co-author of Agile Testing: A Practical Guide for Testers and Agile Teams, More Agile Testing: Learning Journeys for the Whole Team, and a contributor to 97 Things Every Programmer Should Know. Janet specializes…


Wednesday August 5, 2015 10:45 - 12:00
Potomac D

14:00

A Poet's Guide to Automated Testing (George Dinwiddie)

Abstract:
When first starting out with automated acceptance tests, people are often happy just to get them to run correctly. Soon, however, they start finding they have to rewrite their old scenarios when new features are added. Or they disable some scenarios "for now" so they can continue to make progress. Newcomers need explanations to understand the tests. So do the business analysts. It even takes you a while to figure out some of the older tests. Then, one day, the VP stops by, asking about them...
The crucial aspect of test automation is creating clear and expressive descriptions of the system being built. It’s easy to write tests that a computer can understand. But can you write tests that people, even non-technical people, can understand? Will it be obvious whether or not the test is correct? This is not a matter of dumbing things down.
Highlight the concepts. Express just the right details. There is a synergy between the expressiveness of tests and their maintainability. Achieving clarity in natural language is essential for their long-term viability. Come get some hints on expressing your tests clearly and succinctly.
Learning Outcomes:
  • Notice the effect of word choice
  • Select words for clarity and descriptiveness
  • Describe the assumed context

Speakers
George Dinwiddie

Grand Poobah and Jack of All Trades, iDIA Computing, LLC
The promoter of the “Three Amigos” name for collaborative exploration of business requirements, George has worked with others to further the practical application of Behavior Driven Development (BDD). He helps organizations refine their business requirements to produce long-term…


Wednesday August 5, 2015 14:00 - 15:15
National Harbor 6/7

15:45

Use Tables to Drive out Ambiguity/Redundancy, Discover Scenarios, and Solve World Hunger (Ken Pugh)

Abstract:
Ambiguous or missing requirements cause waste, slipped schedules, and mistrust within an organization. Implementing a set of misunderstood requirements produces developer and customer frustration. Creating acceptance tests prior to implementation helps create a common understanding between business and development.
Acceptance tests start with communication between the members of the triad: business, developer, and tester. In this session, we specifically examine how to use tables as an effective means of communication. Employing tables as an analysis matrix helps a team discover missing scenarios. Redundant tests increase the testing load, so we show how applying an analogue of Karnaugh mapping to tables can help reduce redundant scenarios. We demonstrate that examining tables from various aspects, such as column headers, can reduce ambiguity and help form a domain-specific language (DSL). A consistent DSL decreases frustration in discussing future requirements.
We briefly show how to turn the tables into tests in Fit and Gherkin syntax.
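A requirements table like those discussed above maps directly onto a Gherkin Scenario Outline, with each row of the Examples table driving one test run (the discount rules here are invented for illustration, not taken from the session):

```gherkin
Feature: Order discounts
  Scenario Outline: Discount depends on membership level
    Given a <membership> member with an order total of <total>
    When the discount is calculated
    Then the discount should be <discount>

    Examples:
      | membership | total | discount |
      | none       | $100  | 0%       |
      | silver     | $100  | 5%       |
      | gold       | $100  | 10%      |
```

Reading down a column (say, `discount`) is one way to spot ambiguity or a missing scenario: if two rows with the same inputs give different outputs, or an input combination has no row, the table makes the gap visible.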
Learning Outcomes:
  • How to elicit details of a requirement using tabular format
  • How to use tables to search for missing scenarios in acceptance tests
  • How to discover ambiguity and redundancy in acceptance tests
  • A way to logically connect tables to classes and modules
  • How to break complicated requirements represented by tables into smaller ones

Speakers
Ken Pugh

Fellow Consultant, Net Objectives
Ken Pugh (ken.pugh@netobjectives.com, @kpugh, facebook/kpughconsutl) is a fellow consultant with Net Objectives (www.netobjectives.com). He helps companies transform into lean-agility through training and coaching. His particular interests are in communication (particularly effectively…


Wednesday August 5, 2015 15:45 - 17:00
National Harbor 13
 
Thursday, August 6
 

09:00

Hands-on manual UI testing workshop (Emma Armstrong, Lisa Crispin)

Abstract:
Most of us are faced with User Interfaces to test, but how many of us are taught to actually test them? Have you ever forgotten to test something about a UI, or had to cover environments where you are less familiar with system variations that may affect the application?

This workshop will look at both the theory and practice of testing User Interfaces. Using physical examples we will look at how the environment changes the tests you need to consider.
Working together through these exercises, you will strengthen your own testing skills library that you can draw from in the future.
Learning Outcomes:
  • How to test a User Interface while it’s still in design
  • User Interface considerations to be aware of
  • Oracles and Heuristics to consider for testing User Interfaces
  • Environmental Variations that affect User Interfaces

Speakers
Emma Armstrong

Mrs, Towers Watson
Emma Armstrong is a test engineer and has been baking quality into software since 2000. In that time she has gotten her hands dirty with both manual and automated testing and had the opportunity to dig into everything from compilers to web applications. She has worked with most methodologies…

Lisa Crispin

Testing Advocate at mabl (USA)
Lisa Crispin is the co-author, with Janet Gregory, of More Agile Testing: Learning Journeys for the Whole Team (2014), Agile Testing: A Practical Guide for Testers and Agile Teams (2009), the LiveLessons “Agile Testing Essentials” video course, and “Agile Testing for the Whole…


Thursday August 6, 2015 09:00 - 10:15
Potomac D

10:45

Visual Testing: It’s Not What You Look At, It’s What You See (Mike Lyles)

Abstract:
How many times have you driven all the way home, only to realize you don’t remember anything from the drive? Your mind was in a different place, and you were driving on autopilot. Or maybe you walk out to your garage and get in your car every day, and are so used to the surroundings that you don’t notice that something has been taken or moved to a new location. When our eyes are so familiar with the things we see every day, our brains are tricked into believing that nothing has changed.
In the popular TV show “Brain Games”, we find many exercises where you, the audience, are asked to pay attention and focus on what is happening. That simple focused attention gets the majority of people into trouble, because focusing on a specific area or activity prevents the audience from seeing things that are going on around them. This “inattentional blindness” causes key details to be missed. Your brain is the most complex tool that you will ever have in your possession. However, with a highly complex tool comes the need to ensure that it is used appropriately and to its full potential.
In the testing profession, such focused concentration, leading to “inattentional blindness”, can be detrimental to the success of the product being delivered. As testers, we must find a way to constantly challenge our visual images and prevent our brain from accepting that there are no changes which could impact the quality of the product. It is critical to be aware of the entire surroundings of the testing activity and to be able to recognize and call out changes that may be easily overlooked without attention to detail.
In this session, Mike Lyles will challenge the audience to literally “think outside the box”. The audience will be given specific exercises showing how the human mind sometimes overlooks details when they seem visually insignificant or unrelated. We will examine how testers can become better prepared for such oversights and discuss strategies that can be used immediately in your organizations. The key to eliminating the risk of oversights and missed problems is learning how to identify the areas where you may have originally ignored a focused effort.

Learning Outcomes:
  • An understanding that, no matter how good we believe we are as testers, we may be so familiar with a product that our eyes do not notice changes that sneak in
  • Tips for recognizing patterns and potential gaps that many visual testing activities may miss
  • Techniques for becoming a better visual tester



Speakers
Mike Lyles

Quality Engineering Program Manager, Belk
Mike Lyles is a Quality Engineering Program Manager with over 22 years in IT: development, PMO, and Software Testing. His experience spans functional testing, test environments, software configuration management, test data management, performance testing, test automation, service…


Thursday August 6, 2015 10:45 - 12:00
National Harbor 13

15:45

Performance Testing in Agile Contexts (Eric Proegler)

Abstract:
The discipline of performance testing has had difficulty keeping up with Agile software development and deployment processes. Many people still see performance testing as a single experiment, run against a completely assembled, code-frozen, production-resourced system, with the “accuracy” of simulation and environment considered critical to the value of the data the test provides. This clashes directly with Agile principles of embracing continuous change, frequent delivery, and regular feedback.
Performance and scalability can become significant concerns once users get on the system, and can trigger expensive refactoring. Critical design decisions could come much more cheaply and sooner with timely performance feedback earlier in the project. How do we provide actionable and timely information about performance and reliability when the software is not (or never) complete, when the system is not yet assembled, or when the software will be deployed in more than one environment? I will deconstruct “realism” in performance simulation, talk about performance testing more cheaply in order to test more often, and suggest strategies and techniques.
Learning Outcomes:
  • What are the risks we are testing for?
  • What is “Realistic”? Why do we need it?
  • Agile Performance Testing Techniques
  • Testing with Available Hardware
  • Performance Testing Incomplete Systems
  • Strategies for Reporting Performance Feedback

Speakers
Eric Proegler

Director, Test Engineering, Medidata Solutions
Eric Proegler is a Director of Test Engineering for Medidata Solutions in San Francisco, California. Eric is the Vice President and Treasurer for the Association for Software Testing. He is also the lead organizer for WOPR, the Workshop on Performance and Reliability. He’s presented…


Thursday August 6, 2015 15:45 - 17:00
National Harbor 8
 
Friday, August 7
 

09:00

“Follow-your-nose” testing – questioning rules and overturning convention (Christin Wiedemann)

Abstract:
Is testing really keeping up with the advances of software development? Are our testing approaches evolving as quickly as the new technologies, or are we being left behind, using the same methods and techniques as we did a decade ago?
Testing needs to get more innovative, find new ways to test more efficiently and effectively, and to better adapt to each unique context. The first step is to realize that testing is not about finding answers, but about asking questions. Nobel laureate Dr. Michael Smith advocated “follow-your-nose research” in his field, biotechnology; he was willing to pursue new ideas even if it meant that he had to learn new methods or technologies. Similarly testers should do “follow-your nose testing”, exploring new approaches and questioning old habits.
This workshop suggests an approach for test planning that encourages innovation and overcomes barriers to quality. Through a cogent discussion of ideas around brainstorming, collaboration and creativity, you are provided with new insights that can help you revolutionize the test industry! Working in smaller groups we explore different examples of test challenges we have experienced ourselves, covering topics ranging from tools and environments to methodologies and teams. Using our new tools for encouraging innovation through collaboration, we try to come up with revolutionary suggestions for how to address these challenges. Focusing on asking the right questions, we might also come up with a few answers.
Learning Outcomes:
  • Testing needs to continuously re-invent itself to keep up with the advances of software development, which means we need to create a collaborative environment that encourages creativity and innovation.
  Takeaways:
  • A presentation of ideas about why testing needs to get more innovative
  • Why classic brainstorming doesn’t work, and how to build creative, innovative teams
  • Tools to re-invent testing practices
  • An understanding of what “follow-your-nose testing” is, and how to apply it on any software development project

Speakers
Christin Wiedemann

Exec. VP, PQA Testing
After finishing her Ph.D. in Physics at Stockholm University in 2007, Christin Wiedemann started working in IT as a software developer, but soon discovered that she found software testing to be more interesting and challenging. Changing careers, she started working as a tester, and…


Friday August 7, 2015 09:00 - 10:15
Chesapeake 7/8/9