Agile2015

Testing & Quality
Tuesday, August 4
 

09:00

Automated Testing of Mobile Apps (Karl Krukow)

Abstract:
Quality on mobile is a challenge! Developing mobile apps requires dealing with multiple platforms, OS versions, form factors and resolutions, varying hardware capabilities, and network conditions. At the same time, users have extremely high expectations for the mobile experience and will promptly punish you with bad App Store reviews when disappointed. User expectations are set by fast-moving consumer apps such as Facebook, YouTube, and Google Maps, which deliver fast, responsive, quality apps on frequent release cycles.
Do you want to get started with automated testing (and perhaps BDD) while meeting the technical challenges posed by mobile? This session is for you! The talk aims to inspire and empower attendees to start mobile test automation today -- the time is right and the tools have matured.
We set the stage by discussing the challenges of mobile quality and argue that automation is central to scaling QA and moving towards continuous delivery. Then we show you a way forward by introducing the most popular open-source mobile test automation tools suitable for use with the most popular devices. We have a love for BDD, and in our demos we show how to create executable specifications for mobile apps that act as cross-platform automated acceptance tests.
If time permits, we will very briefly show how one might tackle the device fragmentation problem using Xamarin Test Cloud, a cloud-based service that provides managed access to more than a thousand mobile devices for the purpose of mobile quality assurance.
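For readers who want a concrete picture of what such an executable specification can look like, here is a minimal, hypothetical sketch in the Cucumber style that the BDD tools in this space share (Calabash, the speaker's tool, uses Gherkin with Ruby; this sketch uses Python's behave library instead). The feature text, screen names, and the driver object are illustrative assumptions, not material from the session.

    # login.feature -- the executable specification, written in Gherkin.
    # The same scenario can drive both the iOS and the Android build.
    #
    # Feature: Login
    #   Scenario: Valid credentials reach the home screen
    #     Given the app is on the login screen
    #     When I log in as "demo@example.com" with password "secret"
    #     Then I see the home screen

    # steps/login_steps.py -- step definitions binding the Gherkin text to
    # a platform driver. behave's decorator API is real; the `driver`
    # object and its methods are assumed for illustration.
    from behave import given, when, then

    @given("the app is on the login screen")
    def on_login_screen(context):
        context.driver.wait_for_screen("login")

    @when('I log in as "{email}" with password "{password}"')
    def log_in(context, email, password):
        context.driver.enter_text("email_field", email)
        context.driver.enter_text("password_field", password)
        context.driver.tap("login_button")

    @then("I see the home screen")
    def see_home_screen(context):
        assert context.driver.wait_for_screen("home")

Because the Gherkin text names behaviors rather than UI mechanics, the same specification can run against each platform by swapping the driver underneath.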
Learning Outcomes:
  Participants will leave with:
  • An understanding of the importance of quality in mobile.
  • A strong belief that professional practices can be applied in mobile too: the tools have matured, and automated testing, BDD, continuous integration, and (almost) continuous deployment are now viable and necessary.
  • A way forward: how to get started today with cross-platform automated testing for mobile.
  • Advice on technical best practices and pitfalls from an experienced practitioner.
  • A fun experience in their backpacks ;)

Speakers

Karl Krukow

Lead, Xamarin Test Cloud, Xamarin
Karl Krukow is the Technical Lead on Xamarin Test Cloud, an innovative cloud-based service that provides managed access to more than a thousand mobile devices for the purpose of mobile quality assurance. Before joining Xamarin, Karl co-founded LessPainful, a mobile test automation…


Tuesday August 4, 2015 09:00 - 10:15
National Harbor 6/7

10:45

Explore with Intent - Exploratory Testing Self-Management (Maaret Pyhäjärvi)

Abstract:
As an active learner, you will get better every day you spend on testing. Exploratory testing treats test design, test execution, and learning as parallel, mutually supportive activities, to find things we don't know we don't know. Doing things in parallel can be difficult, and testing needs to adjust to the tester's personal skill level and style. Your ability to self-manage your work and your learning - making learning and reflection a habit - is what differentiates skilled exploratory testing from randomly putting testing activities together.
This session teaches you how to explore with an intent that fits your personal style and skill, and how to be considerate of your team members in handling your information needs. For the self-management skills of exploratory testing, we use a notebook thinking tool that tracks four types of ideas in parallel to keep our exploration on course: Mission (sandboxing my services), Next charter (goal for a timebox), Details (notes I can act on now or postpone a little), and Other charters (identifying more work).
In addition to sharing stories and notes I've created in a notebook while testing, we will practice together the most difficult thing to do in parallel: focusing on detail and on the big picture of testing at the same time.
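As a rough sketch of the four-track structure described above - my own rendering in Python, not the speaker's tool - the notebook can be thought of as four small collections maintained in parallel. All names and the example session below are illustrative assumptions.

    # A minimal model of the notebook's four parallel idea tracks.
    from dataclasses import dataclass, field

    @dataclass
    class ExplorationNotebook:
        mission: str                 # the overall frame my testing serves
        next_charter: str = ""       # goal for the current timebox
        details: list = field(default_factory=list)         # act on now or postpone a little
        other_charters: list = field(default_factory=list)  # more work, identified but parked

        def note(self, detail):
            """Capture a detail without abandoning the current charter."""
            self.details.append(detail)

        def spin_off(self, charter):
            """Park a new testing idea instead of chasing it immediately."""
            self.other_charters.append(charter)

    # Example session: stay on-charter while capturing an interruption.
    nb = ExplorationNotebook(mission="Assess the import feature for release")
    nb.next_charter = "Explore CSV import with malformed files for 60 minutes"
    nb.note("Importer silently drops rows containing quotes - report now or later?")
    nb.spin_off("Explore import performance with 100,000-row files")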
Learning Outcomes:
  • Learn to test with an intent that fits your personal style and skill, using a simple self-management tool
  • Learn how two hours of testing can be completely different in content, and how you control that content
  • Learn to keep track of what you are about to do when the plan is expected to change as you learn, so that you know when you are done
  • Learn to handle interruptions to your testing to improve its flow: decide whether to report or ask now or later, and collect ideas of what to test later while you are testing

Speakers

Maaret Pyhäjärvi

Testing Specialist, Granlund Oy
A software specialist with soft spots for hands-on testing, helping teams grow, and building successful products and businesses. Agile, Lean, and Lean Startup are mindsets I work with. I work towards the happiness and wellbeing of the people who create software - not just programmers.


Tuesday August 4, 2015 10:45 - 12:00
National Harbor 12
 
Wednesday, August 5
 

10:45

Agile Testing in the Enterprise (Janet Gregory)

Abstract:
When agile development first gained popularity, agile meant collocated teams, including testers, programmers, analysts, and customers, who were expected to perform many functions. As agile methods have spread and expanded, large organizations and those with globally distributed teams are facing challenges with testing in their agile deployments. One example: dependencies between teams mean that a single team cannot necessarily have complete control over testing a feature. Economies of scale for testing are something many organizations have not considered; think about what testing belongs at the team level, and what testing may go beyond it.
Having worked with many such teams, Janet Gregory has observed ways that testing in agile teams can still help deliver a high-quality software product. Whether your agile team is part of an enterprise solution, is part of a distributed team scattered across time zones with individuals working remotely from home, or is part of an offshore outsourced project, you'll take away methods and tools to help develop open communication and to deal with cultural differences, both within an organization and across continents, specifically as they relate to testing activities.
Learning Outcomes:
  • Concrete ideas for tackling some of the testing issues that large teams and organizations face
  • Suggestions on how to problem-solve the specific testing issues your team faces

Speakers

Janet Gregory

Agile Coach, DragonFire Inc.
An agile testing coach and practitioner, Janet Gregory (@janetgregoryca) is the co-author of Agile Testing: A Practical Guide for Testers and Agile Teams and More Agile Testing: Learning Journeys for the Whole Team, and a contributor to 97 Things Every Programmer Should Know. Janet specializes…


Wednesday August 5, 2015 10:45 - 12:00
Potomac D

14:00

A Poet's Guide to Automated Testing (George Dinwiddie)

Abstract:
When first starting out with automated acceptance tests, people are often happy just to get them to run correctly. Soon, however, they find they have to rewrite their old scenarios when new features are added. Or they disable some scenarios "for now" so they can continue to make progress. Newcomers need explanations to understand the tests. So do the business analysts. It even takes you a while to figure out some of the older tests. Then, one day, the VP stops by, asking about them...
The crucial aspect of test automation is creating clear and expressive descriptions of the system being built. It’s easy to write tests that a computer can understand. But can you write tests that people, even non-technical people, can understand? Will it be obvious whether or not the test is correct? This is not a matter of dumbing things down.
Highlight the concepts. Express just the right details. There is a synergy between the expressiveness of tests and their maintainability: achieving clarity in natural language is essential for the tests' long-term viability. Come get some hints on expressing your tests clearly and succinctly.
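To make the word-choice point concrete, here is a hypothetical before-and-after in Python - my example, not the speaker's. The domain, the builder helper, and every name in it are assumptions; the point is how the second test highlights the one concept under test.

    from dataclasses import dataclass

    @dataclass
    class Order:
        total: float

    def order_with_line_item(quantity, unit_price):
        """Intention-revealing builder: every incidental detail is defaulted away."""
        return Order(total=round(quantity * unit_price, 2))

    # Before (shown as a comment because place_order is not defined here):
    # incidental details - names, addresses, card numbers - bury the rule.
    #
    #   def test_1():
    #       o = place_order("Alice Smith", "alice@example.com", "123 Main St",
    #                       [("SKU-449", 3, 19.99)], "VISA", "4111111111111111")
    #       assert o.total == 59.97

    # After: only the details that drive the outcome are visible, and the
    # test name states the rule being checked.
    def test_order_total_is_quantity_times_unit_price():
        order = order_with_line_item(quantity=3, unit_price=19.99)
        assert order.total == 59.97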
Learning Outcomes:
  • Notice the effect of word choice
  • Select words for clarity and descriptiveness
  • Describe the assumed context

Speakers

George Dinwiddie

Grand Poobah and Jack of All Trades, iDIA Computing, LLC
The promoter of the “Three Amigos” name for collaborative exploration of business requirements, George has worked with others to further the practical application of Behavior Driven Development (BDD). He helps organizations refine their business requirements to produce long-term…


Wednesday August 5, 2015 14:00 - 15:15
National Harbor 6/7

15:45

Use Tables to Drive Out Ambiguity/Redundancy, Discover Scenarios, and Solve World Hunger (Ken Pugh)

Abstract:
Ambiguous or missing requirements cause waste, slipped schedules, and mistrust within an organization. Implementing a set of misunderstood requirements produces developer and customer frustration. Creating acceptance tests prior to implementation helps create a common understanding between business and development.
Acceptance tests start with communication between the members of the triad - business, developer, and tester. In this session, we specifically examine how to use tables as an effective means of communication. Employing tables as an analysis matrix helps a team discover missing scenarios. Redundant tests increase the testing load, so we show how applying an analog of Karnaugh mapping to tables can help reduce redundant scenarios. We demonstrate that examining tables from various angles, such as their column headers, can reduce ambiguity and help form a domain-specific language (DSL). A consistent DSL decreases frustration in discussing future requirements.
We briefly show how to turn the tables into tests in Fit and Gherkin syntax.
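As a hypothetical sketch of that table-to-test step - rendered here in Python with pytest rather than the Fit and Gherkin shown in the session - each row of the analysis table becomes one test case, and rows whose outcome does not depend on a column become candidates for collapsing, which is the Karnaugh-mapping analogy at work. The discount rule and all values are assumptions for illustration.

    import pytest

    def discount(member, order_total):
        """Toy rule under test: members get 10% off orders of 100 or more."""
        return 0.10 if member and order_total >= 100 else 0.0

    # member | order_total | expected   -- one test case per table row
    @pytest.mark.parametrize("member, order_total, expected", [
        (True,  100, 0.10),
        (True,   99, 0.00),
        (False, 100, 0.00),
        (False,  99, 0.00),  # When member is False the total never matters,
                             # so these last two rows could collapse into a
                             # single "don't care" row - a redundancy the
                             # Karnaugh-style analysis would surface.
    ])
    def test_discount_table(member, order_total, expected):
        assert discount(member, order_total) == expected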
Learning Outcomes:
  • How to elicit details of a requirement using tabular format
  • How to use tables to search for missing scenarios in acceptance tests
  • How to discover ambiguity and redundancy in acceptance tests
  • A way to logically connect tables to classes and modules
  • How to break complicated requirements represented by tables into smaller ones

Speakers

Ken Pugh

Fellow Consultant, Net Objectives
Ken Pugh (ken.pugh@netobjectives.com, @kpugh, facebook/kpughconsutl) is a fellow consultant with Net Objectives (www.netobjectives.com). He helps companies transform into lean-agility through training and coaching. His particular interests are in communication (particularly effectively…


Wednesday August 5, 2015 15:45 - 17:00
National Harbor 13
 
Thursday, August 6
 

15:45

Performance Testing in Agile Contexts (Eric Proegler)

Abstract:
The discipline of performance testing has had difficulty keeping up with Agile software development and deployment processes. Many people still see performance testing as a single experiment, run against a completely assembled, code-frozen, production-resourced system, with the “accuracy” of simulation and environment considered critical to the value of the data the test provides. This clashes directly with Agile principles of embracing continuous change, frequent delivery, and regular feedback.
Performance and scalability can become significant concerns once users get on the system, and addressing them can trigger expensive refactoring. Critical design decisions could be made much more cheaply, and sooner, with timely performance feedback earlier in the project. How do we provide actionable and timely information about performance and reliability when the software is not yet (or never will be) complete, when the system is not yet assembled, or when the software will be deployed in more than one environment? I will deconstruct “realism” in performance simulation, talk about performance testing more cheaply in order to test more often, and suggest strategies and techniques.
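One inexpensive form such earlier feedback can take - a sketch of the general idea, not the speaker's specific technique - is a small timing check that runs with every build and fails on large regressions against a stated budget, deliberately trading simulation realism for frequency. The function, budget, and iteration count below are assumptions.

    import time

    def handle_request():
        """Stand-in for the code path whose latency we want to watch."""
        sum(i * i for i in range(10_000))

    def test_handle_request_stays_under_budget():
        runs = 200
        start = time.perf_counter()
        for _ in range(runs):
            handle_request()
        mean_seconds = (time.perf_counter() - start) / runs
        # A generous budget catches order-of-magnitude regressions while
        # tolerating the noise of shared CI hardware.
        assert mean_seconds < 0.01, f"mean latency {mean_seconds:.4f}s over budget"

Run on every commit, a check like this gives a trend signal long before a full, production-like load test is possible.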
Learning Outcomes:
  • What are the risks we are testing for?
  • What is “Realistic”? Why do we need it?
  • Agile Performance Testing Techniques
  • Testing with Available Hardware
  • Performance Testing Incomplete Systems
  • Strategies for Reporting Performance Feedback

Speakers

Eric Proegler

Director, Test Engineering, Medidata Solutions
Eric Proegler is a Director of Test Engineering for Medidata Solutions in San Francisco, California. Eric is the Vice President and Treasurer for the Association for Software Testing. He is also the lead organizer for WOPR, the Workshop on Performance and Reliability. He’s presented…


Thursday August 6, 2015 15:45 - 17:00
National Harbor 8