Friday, August 12, 2011

Had a Blast at CAST 2011

CAST 2011 is over. I am still digesting the impressions and events of the last three days. I feel intoxicated from all the information and energy I consumed, and at the same time inspired and giddy. This was my first CAST. As a n00b I had the advantage of experiencing and feeling things more acutely than someone who has already attended an event or two like this before.
This conference was like a big party, a beehive, an alliance of wise people who shared the same values and beliefs in their craft, yet would not hesitate to speak their mind or 'duel' each other over an idea they felt strongly about. It felt like a celebration of a new spiritual movement of renegade rebel testers passionate about their craft and open to new ideas, self-development and learning.

As much as I tried to pace myself, I could not help feeling overwhelmed by how much content was covered, how many phenomenal ideas and concepts were touched upon, thoughts and quotes shared, and unique humans met in such a short time. It was so nice to see the Brothers Bach, and to finally meet in person my Miagi-Do school sensei and mentor Matt Heusser and Miagi-Do testing karateka (practitioners) such as Ajay Balamurugadas and others. Having taken the BBST Foundations and Bug Advocacy courses, I grew very fond of Doug Hoffman, and meeting him in person was another special event.

It has been less than a year since I learned about and subsequently joined the Context-Driven Testing (CDT) community, but I think intuitively I have always tried to test from this perspective. The last day's workshop by James Bach on Context-Driven Testing Leadership was dear to my heart. After meeting James Bach last September and surviving his grueling flowchart exercise puzzle, I started learning about CDT. I remember that when I first read the manifesto I tried to figure out what exactly was implied by 'context', since context might mean different things to different people.

To me, 'context' covers a wide range of matters such as project information, environment, conditions, schedules and constraints, requirements, and stakeholders' priorities. I was really thrilled when James tasked the class with reviewing the manifesto principles and providing feedback, suggestions and changes, as the principles had remained intact for the last few years, which in his words 'made him worried'. This exercise was an opportunity for me to hear what other participants had to say on the subject and to help clarify or confirm my own deliberations.

I really enjoyed collaborating with my tablemates, Markus Gärtner, Phil McNeely and Alex Bantz, soft-spoken gentlemen full of original ideas and interesting work experiences and stories. We felt stuck at times during our discussion but kept throwing ideas across the table, and eventually came up with a list of suggestions and posted them to the flip chart. Markus Gärtner wrote about it in depth on his blog. I thought it was a great idea to add another principle to the manifesto on the importance of self-education and self-development, since constantly striving to improve our skills and knowledge is considered a core value.

Markus Gärtner held a session on self-education on the first day of the conference and suggested different ways this can be achieved: reading and writing blogs, articles and books; learning programming languages; attending various events; and taking classes. He mentioned the AST BBST Foundations and Bug Advocacy courses, which I have personally taken. I found them extremely helpful in learning how to think outside the box and to deal with uncertainty in everyday work situations.

Ajay Balamurugadas shared his story about Weekend Testing, where passionate testers get together to practice their skills and help each other grow by providing feedback and sharing ideas. I think Ajay can be considered a testing hero who started this amazing movement, which has become contagious and spread throughout the world, with operating chapters in Asia, Europe and the US.

I liked Ajay's answer when he was asked what's required to participate. His answer was simple: having passion, knowing English and being able to use Skype. I was willing to give it a try, and lucky for me Michael Larsen led a session on Day 2 where online attendees collaborated with a few of us at CAST. It was a bit challenging to participate in the Skype threads, so we ended up discussing things amongst ourselves in the room and then transcribing them online. Our test project was Ebay.com. Each of us worked on a task, and upon completion we reported our results to the group and whether we felt our mission was accomplished. I searched for the most expensive items in the marketplace, including a search for 'diamonds', and ended up finding a domain name listed for $21 million. I felt that my mini-mission was accomplished. I made a mental note to give a weekend testing session another try in the near future, as I found this activity mind-stimulating and fun.

Another highlight of the conference was the Testing Competition. I was particularly excited about it, as I wanted to see what it feels like to work on a team of testers, especially with my 'comrades' from the Miagi-Do school, whom I met for the first time at this conference. For the last 3-4 years I have worked as a lone tester on projects, and even though I am an introvert, a part of me craved teaming up with others to share ideas and exchange feedback, something I have not been able to do in recent years. This was a perfect opportunity to do that. When I first asked Matt Heusser whether Miagi-Do would participate, he was not sure whether he should be on the team: Matt was one of the organizers of the conference, and his participation would disqualify us from the cash prize in case of victory. None of us seemed to care about this minor detail, as we were thrilled to work with each other for the first time.

We hit a couple of snags at the beginning of the competition. The conference center's network connection was poor and we were unable to download the program for testing. Ajay and I happened to have USB drives, so we copied the program from folks who had managed to download it. Another challenge was the lack of Windows computers, so we decided to pair-test and share the available machines. The competition was scheduled from 6 to 10 pm. In that time the teams were expected to review the available documentation, install and familiarize themselves with the application, find and log bugs, and produce a final test summary report.

We set up testing sessions timed at 20 minutes each and threw ourselves into bug hunting. Ajay and I hit it off right away and got so absorbed in bug hunting and troubleshooting our findings that at times we would forget to tune into the team discussions at the end of each timed session. It was crazy intense and draining. The application was buggy; I was actually surprised that it was picked for this contest.

At the beginning of the contest we looked up Elisabeth Hendrickson's cheat sheet, but it turned out that we did not need to go into such depth. It took little effort to cause access violation errors, but it was a challenge to log bugs and upload the logs and screenshots due to network latency issues. Those were the craziest four hours of speed testing in my professional experience. I felt the adrenaline rush throughout the whole competition and was disappointed that I had to leave earlier than the rest of the team to join my family at the hotel.

We learned the next day that the team had earned high scores except for one item. We bluntly noted in our test summary report that the application was not ready for testing and supported our statement with what we felt were valid arguments. This so-called 'Black Flag' gesture offended the developer and ultimately resulted in a major loss of points.

Having tested over 120 web and mobile apps through uTest in the last 8 months, I was a little bemused by the developer's reaction. I have no regrets that we threw the Black Flag. I expected us to be challenged by the complexity of the application and to spend time figuring out paths to make it fail. Instead, the failures happened consistently with nearly every little poke and prod. I enjoyed working with my Miagi-Do teammates and wished the clock would stop during the competition so I could soak in the experience a little while longer. But time has no pause button, and all the good things seemed to end much faster than the not-so-good ones. I think I was still dreaming about the competition in my sleep…

The next day Matt told Ajay, Michael and me that based on our performance at the CAST 2011 Tester Competition all three of us were promoted to Black Belt in the Miagi-Do School of Software Testing. That was pretty cool even though I value the experience of bonding and camaraderie much more than ranks and awards. Just knowing that I got a chance to work with the best of the best in our field is inspiring and rewarding.

I am sad that CAST 2011 is over. It was unbelievably enjoyable and deeply inspiring, even though I was not able to attend all the workshops, happy hours and dinners, or meet and talk to ALL of the members of this amazing community. I feel humbled by this experience and look forward to attending CAST 2012 in Silicon Valley. Anyone who considers themselves an avid and passionate tester and is serious about our craft should attend this conference at least once. I have no doubt in my mind that your perception of testing will change forever.

3 comments:

  1. The developer and I both felt that the "black flag" argument was insulting and groundless. If you had supported it with data and good warrants, as the winning team did, you would have won the competition yourself. Instead, it seemed you were saying that your personal dislike of the product justified not bothering to test it.

  2. We tested the app extensively and thoroughly. This was not an emotional or preferential decision.
    Yes, we may have been harsh, but we felt that honesty trumps massaging the developer's feelings. =) Based on its consistent failures, this app was not ready for this level of testing, let alone for the market. The concept and idea of the product were very good, but we concluded that it was unreliable.

  3. Great summary Lena.

    I will be defending my (team supported) decision to hypothetically black flag this project in a blog post once Matt Heusser returns from vacation and I have access to the full test report as submitted.

    I'd like to explicitly state that we were never in the running for the prize and wish to take nothing away from the winning team. Even though we have no access to their report, I don't doubt they performed well, as the caliber of testers at CAST was top-notch.

    This "Black Flag" is clearly hypothetical since our team did test this product and spend a great deal of effort on collaborating with the developer, and filing ~45 bugs. Very few of them complex in nature of requiring extensive application of testing skills to uncover. I have many more points to cover apart from just a sensitive developer (and possibly contest organizer ;-) ) But I can't discuss a case that's currently in litigation. ;-)

    I look forward to this continued discussion, but given that I must still catch up with work, I won't be ready to engage on my blog for at least another week.

    Again, my sincere congratulations to the winning team. We enjoyed participating in this challenge and I look forward to making my (our) case soon.

    Elena doesn't have access to the specific wording of the Black Flag, so I thought I'd paste it here since there is discussion.



    This product should be "Black Flagged" due to the following level and severity of bugs. Like a race car leaking oil and parts on the track that pose a danger to other drivers, this project should be sent to the pits.

    Intermittent but common memory access violation crashes
    Usability issues, such as creating a new file opening a "Save As" dialog.
    The edit menu has 0 options.
    The File menu has empty spaces in it that lead nowhere.

    As professional testing craftsmen, we feel this level of testing is not worthy of our effort. Any user would immediately find these bugs. Spending time filing comprehensive bug reports for such fundamental and severe defects is a waste of time and effort.

    I would suggest these bullets be submitted to the developer so that he can take some time to fix them and bring the product to a level of customer value that warrants applying the skills of our team.



    All the best!
    Adam Yuret
