TestBash Philadelphia 2016


On November 10th-11th 2016, TestBash is back, this time in Philadelphia!

Let’s get ready to….. #TestBash in Philadelphia!!!

Testers of the world will (re)unite on November 10th-11th 2016 in Philadelphia (US) for the most awesome, friendliest and jam-packed software testing conference ever!

Our TestBash Philadelphia conference is a two-day event all about software testing.

There will be plenty to soak up and learn. Speakers and their talks are listed below.

Date: Thursday 10th November 2016, 9.00am (registration opens at 8am) – Friday 11th November 2016, 5.30pm.

Location: FringeArts Theatre

Registration

Book early to save money and avoid sad faces!

Register now

Looking for a hotel to book? See our hotel recommendations page.

There will be informal activities/meetups on the evenings of Wednesday, Thursday and Friday – please bear this in mind when booking your travel.

Sponsors

Medidata · TGS

We need sponsors to help make this TestBash awesome.  Find out how you can support TestBash.

The Speakers. The Very Awesome Speakers.

Gaining Consciousness – Fiona Charles

How do we draw a line in our decision making process between trained intuition and careless assumption?

Sometimes, an expert medical doctor makes a dazzlingly accurate diagnosis in a complex case that baffles other physicians, yet cannot explain how she knows what ails the patient. In other cases, a doctor with similar or greater expertise might miss the diagnostic mark entirely.

It’s natural and human for a skilled practitioner to make some decisions purely on instinct. We have trained our instincts through increasing experience and craft. But if we fool ourselves into believing that we can operate on instinct alone, we can lapse into a state like unconsciousness, as if we were walking in our sleep.

To avoid the rigid mindset of sleepwalkers and the potentially terrible mistakes that could result, testers have to question everything. We especially have to question our own assumptions.

Let’s explore some ways we can do that.

Fiona Charles teaches organizations to manage their software testing risk, and teaches IT practitioners project skills “beyond process”: hands-on practical skills essential to thrive and excel on any kind of software project. An expert test consultant and manager, she has been immersed in the action through 30+ years of challenging projects across the business spectrum on both sides of the Atlantic. Throughout her career, Fiona has advocated, designed, implemented and taught pragmatic and humane practices to deliver software worth having. Fiona’s articles and blog posts appear frequently, and she conducts experiential workshops at international conferences and in-house for clients.

She is the co-founder (with Anne-Marie Charrett) of Speak Easy, a volunteer organization whose goal is to increase gender diversity and help new speakers find their voices at tech conferences. Contact Fiona via her website www.quality-intelligence.com and follow her on Twitter @FionaCCharles.

Testing without Testers (and other dumb ideas that sometimes work) – Alan Page

You’ve heard the rumors, and you’ve seen it happen. An organization or development team decides they don’t need testers, and you have big questions and massive concerns. Is quality not important anymore? Are they irresponsible or idiotic? Are their hats on too tight? Do testers still have jobs?

Alan Page is a career tester who has not only gone through the “no-tester” transition, he’s taking it head-on and embracing it. Alan will share experiences, stories, strategies, and tactics (and failures) on how he’s taken everything he’s learned in over twenty years of software testing and used those skills to have an impact on software engineering teams at Microsoft. Whether you’re going through this transition yourself, think it may be coming, or just want to tell someone what an absurd idea this is, this is the talk for you.

Alan Page has been building and testing software for nearly 25 years. He was the lead author on the book How We Test Software at Microsoft, contributed chapters for Beautiful Testing and Experiences of Test Automation: Case Studies of Software Test Automation, and recently published a collection of essays on test automation in The A Word. He also writes about a variety of software engineering subjects on his blog at http://angryweasel.com/blog.

Alan joined Microsoft as a member of the Windows 95 team, and since then has worked on a variety of Windows releases, early versions of Internet Explorer, Office Lync and Xbox One. Alan also served for two years as Microsoft’s Director of Test Excellence. Currently, he’s working on a brand new collaboration application for Microsoft.

A Mob Testing Experience – Maaret Pyhäjärvi

Skilled exploratory testing includes a lot of tacit knowledge, often acquired over a long period of time, learning in layers. The testers themselves have a hard time explaining what they are doing, and why, to cover the application and identify risks and problems. What does skilled testing look like? How do you learn to test like an exploratory tester, with the intent of understanding coverage while finding useful information? Mob testing – a group testing activity utilizing one computer – gives voice to the tacit knowledge in the group of individuals on a shared task. It makes a great mechanism for building habits and transferring skills, rather than merely passing on knowledge.

With this demo mob testing session, you get a glimpse into the heads of testers while they test, because “for an idea to go from your head to the computer, it must go through someone else’s hands”. This is a specific communication style called Strong-style pairing, and it connects the group of brilliant minds in the mob and in the audience through a shared experience. The talk shows you how exploratory testing is learning about an application, one empirical experiment at a time, and gives you ideas on how you could learn from your peers in a mob format.

We close the session with a group discussion retrospective. You were part of the experience as observers: what did we learn watching the patterns of this testing? Did the core group in the mob miss something the observers could pay attention to?

Maaret Pyhäjärvi is a software professional with a testing emphasis. Her day job is working with a small software product development team as a hands-on testing specialist, doing continuous delivery with limited test automation. On the side, she teaches exploratory testing and makes a point of adding new, relevant feedback for test-automation-heavy projects through skilled exploratory testing. In addition to being a tester and a teacher, she is a serial volunteer for different non-profits driving forward the state of software development. She blogs regularly at http://visible-quality.blogspot.fi and is the author of Mob Programming Guidebook.

Episode VII: A Tester Training Program Awakens – Megan Studzenski & Cheri Kure

A long time ago, in a galaxy far, far away, a lone Team Leader at Hyland Software recognized a need to train all new testers coming onto Hyland’s QA team. This is her story: How she worked to form a team focused on education, how she obtained buy-in from managers who had never focused on training, and how she created her program content. Come learn of her trials, tribulations, and triumphs in training: Over the past four years, the program has grown from a single-day program to seven days, plus additional classes. The education team is now three trainers strong. Learn how she mentored and empowered her team members to expand their testing knowledge and shape the program for themselves. We’ll talk candidly about bumps in the road, introducing new ideas to leadership, teaching employees with little-to-no testing experience, and the impact this training program has had on our department. We can’t promise we’ll have lightsabers (though you never know), but we can promise you’ll walk away with more knowledge about what it takes to train testers on a department-wide scale.

Megan Studzenski is a Departmental Trainer in Hyland Software’s Quality Assurance department, which is an official way of saying she teaches people to be better testers. She is responsible for teaching introductory testing skills to new employees, as well as devising testing workshops and experiences for more skilled testers. Megan spent two years as a technical writer before transitioning to testing, and from there her love of public speaking led her into her current training role. She broke into conference speaking at CAST 2015 with a successful no-slides talk about how she does her job, and her pet project is advancing exploratory testing in a work environment driven primarily by bug reports. She occasionally tweets about testing, baseball, cats, and Star Wars at @TinyTesterTalks.

Cheri Kure has been involved in software testing for nearly 30 years. She began her testing career at Fiserv Inc. testing IBM mainframe applications for financial institutions. Cheri joined Hyland Software Inc. in 2004 as a tester. She has led and taught a testing team. In her first eight years at Hyland Software she saw the Quality Assurance department triple in size, and decided she wanted to invest her time in teaching new employees how to test software. Today Cheri leads the QA Education team, which is responsible for onboarding and training new testers.

What the Hell Kind of Testing is That? – Nancy Kelln

Many organizations are not ready to accept the differences between exploratory testing and more traditional testing methods. For testers who take an exploratory approach to testing, it can be challenging to gain acceptance and buy-in from leadership. Often the people you are trying to sell to are left asking “What The Hell Kind of Testing Is That?”, and not in a good way. As an exploratory tester, Nancy Kelln has implemented exploratory testing concepts at various organizations over the past six years. Her experience spans implementing these concepts as a tester, a test lead, and also as a manager. She also has experience in selling exploratory testing to testing teams, management, leadership and senior leadership across numerous IT organizations. During these implementations she has experienced many successful and failed attempts. Through stories from the trenches we will examine the lessons learned at each of the organizations and share with attendees what worked and what didn’t, as well as how to recover when things go awry. If you are working with exploratory testing, or have taken the Rapid Software Testing course and are wondering how to implement it, this session will give you some valuable insight into how to proceed.

Test Manager at FGL Sports Ltd. with 16 years of diverse IT experience, Nancy enjoys working with teams that are implementing or enhancing their testing practices and provides adaptive testing approaches to both exploratory and traditional testing teams. She has coached test teams in various environments and facilitated numerous local and international workshops and presentations. From small-scale to multi-million-dollar projects, Nancy has played many roles within testing including Project Test Manager, Test Manager, Test Lead and Tester. A co-founder of POST, Nancy is an active member of the Calgary Software Quality Discussion Group, Association for Software Testing, and the Software Test Professionals organization. Nancy and her family live in Airdrie, Alberta, Canada. Connect with Nancy on Twitter @nkelln.

The road to enlightenment: How we learned to stop worrying and start treating infrastructure like any other feature – Abby Bangser and James Spargo

Ever think to yourself “I’d just love to be a fly on the wall of someone else having these problems”? Well here is your chance!

As a developer, James was tasked with building out both the production environment and the path our application code would take to get there. Abby was a Quality Analyst on the team and worked closely with the application team to identify risks early on, build quality into the application from the beginning, and drive team exploratory testing. This is where our interesting story starts.

We will take you on our quest to bring business needs to the forefront of infrastructure and deployment work. We wanted to treat infrastructure code the same way as application code: stories that provide business value, acceptance criteria following SMART principles, and testability at both an automated and an exploratory level. This quest wasn’t always an easy one. We’re also going to talk about the highs, lows, heated debates, disagreements and celebrations in the hope of saving you from some of the same pains that we experienced on our quest for enlightened infrastructure work.

Through a combination of dramatic re-enactments, interspersed with presentations of the details, we will explain the situations we found ourselves in, the way we handled those situations (the good, the bad, and the ugly) and the learnings we took from them. You will leave this session a believer that by taking a more analytical view of infrastructure work, both the development team and the business will have the confidence to quickly and reliably deploy their application.

Abby Bangser has been an excited member of the Ministry of Testing family for 3 years now. After attending in 2014, she took the stage for the first time in 2015 as part of the 99-second talks in Brighton, volunteered at TestBash NY in the fall, and then co-hosted a workshop in Brighton in 2016. Outside of TestBash, Abby has had the opportunity to speak on the DevOps track at Agile20xx and Agile Testing Days in 2015, as well as at European Testing Conference and Nordic Testing Days in 2016.

At ThoughtWorks Abby has the opportunity to work in a variety of domains, countries, and team dynamics. While the technical challenges of each domain and tech stack have been interesting, she has realized that team practices and team ownership have a much deeper impact on the end deliverable.

James Spargo plays a multi-functional role in developing software for ThoughtWorks’ clients. As a developer he works on both functional and cross-functional technology and also helps identify road maps to help businesses perform better. James has been an active community member around the DFW area and more globally on Open Source Software projects. As a part of his community leadership James has presented at local meetups about the work he has done in the OSS space.

Succeeding as an Introvert – Elizabeth Zagroba

You’re an introvert. You do your best work when you can think a problem through alone in a quiet space. You express yourself better in writing or when you have a heads up before a meeting.

But your company is cool! So your office resembles a sweatshop: large rows of desks squished into a concrete room with minimal sound deadening. And your company culture encourages teamwork! So anyone can call you or stop by your desk with immediate requests of varying levels of emergency. You’re always being put on the spot, only later to think of who would be best qualified to answer the question, what a better solution might be, or where an inefficiency could be eliminated.

In my talk, I’ll frame my learnings from Quiet by Susan Cain and Introvert Power by Laurie Helgoe with personal experiences about how to function effectively in offices unfriendly to introverts. I’ll explore how American culture rewards those who speak the most over those who have something to say.

Elizabeth Zagroba is a context-driven software tester at Huge in Brooklyn. She’s tested innovative user interfaces for iOS and Android apps, responsive websites, content management systems, and streaming and on-demand audio. Before Huge, Elizabeth worked on the digital team at a public radio station in Manhattan.

Giving Something Back – Testing In The Pub Live! – Dan Ashby and Stephen Janaway

We started the Testing In The Pub podcast (www.testinginthepub.co.uk) almost by accident in 2014. Dan Ashby and I would chat to each other over tea and talk about testing and whatever else was on our minds. One day we thought “why not record that? Someone may be interested”. We didn’t know that, 2 years, 26 episodes and countless thousands of downloads later, people would be listening and finding the shows useful for their testing.

Testing In The Pub is also our way of giving something back to the software testing community. The community is extremely important to us; we’ve learnt so much from it, and we encourage all testers to use it as much as they can. But that community won’t go on forever unless more people contribute to it. Without contributions, a community dies.

In this presentation we would like to explain how and why we started Testing In The Pub. We’ll explain why starting a podcast or any other sort of community activity is extremely personally rewarding, as well as being an important part of what keeps the testing community as vibrant and useful as it is. We hope to encourage others to do the same.

We’d then like to do something different. The second half of the presentation will be a live recording of Testing In The Pub which can then be edited and distributed either during, or shortly after, the presentation. The subject matter for the podcast will be aligned with the conference themes and we would engage the audience and make them part of the show. In doing this we hope that everyone can have some fun, but we also hope to show them that starting and taking part in community activities is easy.

Stephen is a mobile and e-commerce Coach, Strategist and Manager. Over the last 15 years he’s worked for companies such as Nokia, Ericsson, Motorola and the YOOX NET-A-PORTER GROUP, as well as advising a number of mobile application companies on testing and delivery strategies. He has written and presented many times about testing, frequently with a focus on mobile devices and mobile applications. Stephen loves talking to others about software testing, test techniques and the mobile device and application world in general. You can contact him via his website (www.stephenjanaway.co.uk) or on Twitter (@stephenjanaway).

Dan Ashby has been a Software Tester for almost a decade now, testing a wide range of products from PC drivers, to printer software/hardware/firmware, to web apps and websites of all different sizes. He has a passion for Exploratory Testing with a focus on testing web-based applications and web sites. Find him on Twitter at @danashby04.

How to Get Automation Included in Your Definition of Done – Angie Jones

While most teams appreciate the benefits of automation, it is commonly viewed as too time-consuming to be considered part of an agile sprint, resulting in automation being done in isolation and typically months after the story has been closed. There are several problems with this approach. The automation team members are not as familiar with the requirements as team members who were engaged within the sprint, which could result in them missing key aspects while testing. Automated regression testing of the features doesn’t take place until far too long after the feature has been delivered, which means more manual regression and/or a period where new features are being introduced but no regression is taking place.

Join Angie Jones as she discusses agile-friendly approaches to test automation which will allow teams to close their sprints with automation in place. These automation techniques allow scrum teams to work smarter, not harder, and find bugs more quickly with a narrower scope around the root cause, essentially leading to quicker resolution times. Angie will also walk through an example Story and demonstrate how to apply these techniques to ensure automation is achievable within the sprint.

Angie Jones is a Consulting Automation Engineer at LexisNexis. Angie advises several scrum teams on QA Automation strategies and best practices and has developed automation frameworks for countless software products. Angie is known for her innovative and out-of-the-box thinking style which has resulted in more than 20 patented inventions in the US and China. Angie is also an Adjunct Instructor of Computer Programming at Durham Technical Community College. Angie is a strong advocate for diversity in Technology and volunteers with organizations that champion this cause, such as TechGirlz and Black Girls Code. She can be found on Twitter at @techgirl1908.

Embracing Change: The challenges of changing ‘traditional’ mindsets – Christina Ohanian

Some background context:

In the course of my work, I’m often asked how to help companies drive greater understanding and awareness of agility – especially for those who are used to following more traditional waterfall methodologies. After years of following a particular approach, legacy ways of working become entrenched, and changing them can feel impossible, especially in companies that have siloed departments. Everywhere we look, new user behaviours and new technology are overturning industries at a blinding speed, and for most companies with more traditional modes of working, keeping up – let alone getting ahead – is a business-critical concern, especially when you bring quality into the picture.

The problem at hand:

So what holds companies back from making the shift toward agile thinking? ‘Radical’ change is seen as a big ask in terms of culture, time and resource – not to mention risk. This can produce a lot of resistance internally to adopting an agile approach, so how can we set about convincing more traditional mindsets to move towards agile practices? In truth, there’s no single or simple answer to this question, but as a former QA manager and current Agile Coach here’s my take on it.

My thoughts and experiences:

Change is a scary thing. The reality, however, is that change is inevitable and learning to embrace change is both healthy and an urgent necessity. Without innovation, development and change, companies would ultimately stagnate and would never survive, let alone grow. For me, success comes down to four key things that are vital to help drive change in mindsets:

Value: Proof of concept – Why you need to break the rules and show working practices.
Trust: Collaborating and building relationships – The importance in getting everyone involved.
Time: Change doesn’t happen overnight – Think it’s 1 month’s work? You’re kidding yourself.
Flexibility: Learning to inspect & adapt – There’s no one way!

As an avid agile evangelist, Christina is passionate about helping build and support self-organising teams and individuals from the ground up. She loves learning about people, their passions and what motivates them. As a member of the Agile community of practice, she speaks and runs workshops at conferences. She is also a Graphics Illustrator and enjoys bringing this into the workspace to run meetings and help teams collaborate.

You can normally find her in the Spitfire building, London, working at The App Business; if not, she is probably running a workshop somewhere or speaking at a conference; and failing that, she is sitting in a coffee shop, having a latte and sketching!

Want to find out more? Find Christina on Twitter @ctohanian or on her website www.agileandsketch.com

The Deep End: Applying Newly Acquired Testing Skills on the Job – Israel Alvarez

Learning to be a software tester is hard work, and beginning your career in a startup adds an extra layer of difficulty. Applying skills you have recently acquired while continuously studying your craft and learning about your business poses questions that have to be addressed. What information is important? How can you give feedback the quickest? When should you address test automation? And most importantly: how do you communicate testing concerns with PMs, devs, and the CEO? Through this talk I will present an experience report on my journey as a new software tester, covering what I encountered and learned in how I approach testing, delivery, automation, and communication.

Israel Alvarez is the first and only QA Engineer at Thuzio, a platform that connects marketers & advertisers with influencers. Before working at Thuzio, he interned at fintech company Liquidnet, where he was tasked with reviving test automation efforts. Prior to Liquidnet he worked as a software tester at Enharmonic, an agency that helps startups with architecture problems. Having majored in Philosophy, and being passionate about branches such as epistemology, metaphysics, and normative ethics, he discovered that this academic knowledge can be extremely useful (if not necessary) to software testing. This realization is largely due to the Rapid Software Testing and Rapid Testing Intensive courses he’s taken with James Bach, as well as the Software Testing curriculum developed by Keith Klain at Per Scholas. Today Israel enjoys reading as much philosophy as he can, training Brazilian Jiu-Jitsu, and working on Python projects.

Test Like a Cat (Not a Dog) – Lanette Creamer

Since the days before history was recorded, dogs have been man’s best friend. Sharing our company along with our leftover scraps from the campfire, they are our companions. The loyalty, happy nature, cooperation and unconditional love a dog can provide has inspired us to keep them near for protection and enjoyment. Dogs obey the pack rules based on an established hierarchy, providing stability for the greater good. With their positivity and teamwork ability, dogs demonstrate traits that are admirable in a friend, colleague or employee. While dogs are awesome, I believe that as an industry, testers have dog traits well ingrained in our culture. It is time to move beyond them and, while keeping the good traits we can learn from dogs, incorporate more tricks from cats.

Cats have a different history, having been revered as gods in ancient Egypt. They have yet to forget it. Cats will be fine with or without our approval or intervention, giving them a distinct survival advantage. Cats are charming, with an alluring purr that contributes to healing as well as soothing the stress of their human companions. Not only will they gladly eat what is provided to them, but recent research (2015, Dr. Gary Weitzman, author of How to Speak Cat) has shown they meow while their humans are around in order to meet their goals. Regardless of the situation they find themselves in, a cat is actively seeking any advantage, or conserving its energy and awaiting more favorable conditions. Cats are observant and see well in situations where others may be at a loss, using a different method than dogs. Testers have learned to sniff out problem areas and search for nests of bugs. We hunt down bugs with determination. However, cat-like traits such as patience, and the ability to see in the dark when information is lacking, are a huge advantage to add. Cats come equipped with whiskers (an excellent adaptable tool of self-awareness) that help them determine in advance which routes are possible, and which to rule out. Like cats, modern testers may share separate territory, be a part of a larger testing group, or be a lone feral tester in the wilds of an Agile project. Testers may have abundant requirements and the ability to question developers, or they may have absolutely nothing, but you can be assured that if they test like a cat they will survive either way.

Saying software testing is dead is dead. The dog days of testing are over. Don’t work like a dog. Test like a cat!

Lanette likes testing software, glittery lip gloss, and her rescued shelter cat. Throughout her career, Lanette has evangelized the advancement of real-time human thought balanced with tool-assisted checking in whole-team software quality. After working for a decade at Adobe, Lanette provided consulting for 2 years, training, testing, and leading teams on IT projects at a major coffee company and a medical data team, and then started a testing discipline at Silicon Publishing, Inc. before finding her way to The Omni Group. She occasionally presents at conferences and blabbers (sometimes about testing) on Twitter @lanettecream. Lanette is now a Software Test Pilot working at The Omni Group exclusively on iOS and Mac software.

Agile Tester Interactions: The Story Of Story Kickoff – Ash Coleman

In this talk I will be encouraging testers to recognize the importance of their own perspective and personal accounts. Their knowledge base is valuable. It has significance and deserves to be heard. So how does one engage with their team to be heard? In short: the Agile User Story kickoff.

With each user story there is a beginning, a middle and an end. Establishing a user story by first understanding not only what is being asked for, but also the relevance and focus described by other team members, will fortify its outcome. Leading a collaborative effort of the team can prove that everyone has a different perspective to represent, dependent on their role, allowing all to be heard, be it in business, development or test.

In this talk we will be discussing the importance of story kickoff in an Agile setting, both individually and within a team, focussing on how it can be incorporated into the workflow and the positives and negatives associated with this collaborative effort.

Ash Coleman is a Quality Assurance Analyst at Huge with over 4 years of digital experience. Since joining Huge she has been at the focal point of transitioning toward Behavior Driven Development methodologies. Her past experience as a professional chef has helped her establish a determination to understand and comply with user satisfaction as well as build her career in using technology as a means to satisfy user demands. Her continual desire to mediate between business and digital fronts has been well served by the culmination of her experiences.

Registration

Book early to avoid sad faces!

Register now

Microsponsors

We need sponsors to help make this TestBash awesome.  Find out how you can support TestBash.

Chatham Financial · Sahi Pro · PerfBytes · TISQA · TestLodge · PractiTest · BugReplay · Vornex