Thursday, March 26, 2015

The Ins and Outs of Entrance and Exit Criteria

Hi All,

Here are the slides from today's webinar - "The Ins and Outs of Entrance and Exit Criteria".

You can find the webinar recording at: http://www.astqb.org/certified-tester-resources/istqb-certified-tester-resource-area/

Also, I am working to develop this topic into a 3-hour e-learning course in micro-learning format. I may also conduct a live virtual session or two. If you are interested, just contact me through my website at http://www.riceconsulting.com.

Thanks!


Friday, March 20, 2015

Live Virtual Training in User Acceptance Testing, Testing Mobile Devices, and ISTQB Foundation Level Agile Extension Certification


I am offering three courses in live virtual format. I will be the instructor on each of these. You can ask me questions along the way in each course and we will have some interactive exercises. Here is the line-up:

April 1 - 2 (Wed - Thursday) - User Acceptance Testing 1:00 p.m. - 4:00 p.m. EDT.  This is a great course to learn a structured way to approach user acceptance testing that validates systems from a real-world perspective. Register at https://www.mysoftwaretesting.com/Structured_User_Acceptance_Testing_Live_Virtual_p/uat-lv.htm

April 7 - 8 (Tuesday - Wednesday) - Testing Mobile Devices 1:00 p.m. - 4:00 p.m. EDT. Learn how to define a strategy and approach for mobile testing that fits the unique needs of your customers and technology. We will cover a wide variety of tests you can perform and you are welcome to try some of the tests on your own mobile devices during the class. Register at https://www.mysoftwaretesting.com/Testing_Mobile_Applications_Live_Virtual_Course_p/mobilelv.htm

April 14 - 16 (Tuesday - Thursday) - ISTQB Foundation Level Agile Extension Course 1:00 p.m. - 4:30 p.m. EDT. This is a follow-on to the ISTQB Foundation Level Certification. The focus is on understanding agile development and testing principles, as well as agile testing methods and tools. The course follows the ISTQB Foundation Level Extension Agile Tester Syllabus 2014. Register at https://www.mysoftwaretesting.com/ISTQB_Agile_Extension_Course_Public_Course_p/agilelv.htm

I really hope you can join me in one of these courses. If you want to have your team trained, just let me know and I'll create a custom quote for you.

Thursday, March 05, 2015

ISTQB Advanced Test Manager e-Learning Course Now Available

I am excited to announce the availability of the ISTQB Advanced Test Manager certification course in e-Learning format.

This course contains the equivalent of 4.5 days of live training. However, with e-learning, you can take the training in small pieces.

You get:
  • Narrated slide shows in m4v format
  • Course notes in a PDF viewer
  • Exercises to help you apply the knowledge learned
  • Practice exams
  • Lifetime access
  • Access to me, Randy Rice, to ask any questions.
  • One electronic exam voucher 
Prerequisites:
You must hold the CTFL certification and have at least 3 years of verifiable full-time experience in software or systems testing, development, quality assurance, engineering or a related field.

Click here for more details about the ISTQB Advanced Test Manager certification course.
Click here for a course demo.
Click here to register for the course. 

Feel free to contact me with any questions!

Thursday, February 26, 2015

Big Sale on e-Learning Courses in Software Testing!

For a limited time, I am offering discounted pricing on ALL my e-learning courses, including ISTQB Foundation Level and Advanced Level certification courses!

Just go to http://www.mysoftwaretesting.com to see the savings on each course.

Remember, as always, you get:
  • LIFETIME access to the training
  • In-depth course notes
  • Exams included on the certification courses
  • Direct access to me, Randy Rice, to ask any questions during the course
Hurry! This sale ends at Midnight CST on Friday, March 20th, 2015.
Questions? Contact me by e-mail or call 405-691-8075.

Friday, February 06, 2015

How Important is Credibility to Software Testers?

I have incorporated the importance of credibility in all my software testing training courses. Why? Because if people don't trust you as a person, they won't trust the information you provide. As testers, the end result of everything we do is information. That information must be timely, correct, and objective.

When teaching the importance of credibility, I always say, "Just look at current events to see how people have lost or gained credibility. You will find examples everywhere."

The issue of credibility came to mind in recent days as I have been watching the response unfold to Brian Williams' admission that he incorrectly reported an incident that occurred while he was embedded with troops in Iraq. Some might say Williams' account was mere embellishment of the event. Others, however, take a much stronger view and call it an outright lie. Williams has acknowledged the inaccuracy and issued an apology, which is always a good start to rebuilding credibility.

In the case of Williams, the stakes are high since he is a well-respected news presenter on a major television network. This is also a good example of how credibility can rub off on an entire team. Williams' story may well cause many to stop watching the entire network for news reporting. People may ask, "How many other reporters embellish or hide the truth?"

Now, other reports from Williams, such as those from Hurricane Katrina, have been called into question with reliable information from those who were there. Williams reported seeing a dead body floating face-down from his hotel in the French Quarter. The New Orleans Advocate states, "But the French Quarter, the original high ground of New Orleans, was not impacted by the floodwaters that overwhelmed the vast majority of the city."

I have a feeling this story will take on a life of its own. Now, there are accounts from the pilot of the helicopter saying they did take fire, just not RPG fire. That may eventually help clarify things, but for now it adds confusion to the issue, which doesn't help rebuild credibility.

As a software tester, you may be tempted to overstate how many problems you have seen in a system or application, just to get people to pay attention to the fact there are problems. Don't succumb to that temptation. Instead, just show a really good example of a serious problem. If there are other similar issues, say so. As a tester, you can only report what you have seen. You can't predict what other problems may exist, even though you might be right.

In telling your testing story, the accurate, complete truth is all you need.

If you would like to read my three-part article series on how to build, destroy, and rebuild credibility, here are the links:

Building Your Credibility as a Tester

How to Destroy Your Credibility as a Tester

How to Rebuild Your Credibility as a Tester

I hope all this gives us something to remember as we report what we see in testing!

I would love to hear your comments, questions and experiences.

And...I would like to wish my son, Ryan Rice, a happy 34th birthday today!

Have a great weekend,

Randy

Friday, January 02, 2015

What Mythbusters Can Teach Software Testers

I hope you have all had a great Christmas and New Year. Normally at this time of year I write a posting about goal setting, reflecting on the past year for lessons learned or something similar, which are still important things to do.

I thought for this year we'd go with "Now for something completely different."

Between all the bowl games and Christmas movies on TV, the Science Channel has been showing a Mythbusters marathon for the last couple of weeks. That, my friends, is a lot of episodes - more than I could ever take in. But I have enjoyed watching a few of them.

I often refer to the Mythbusters show when explaining session-based testing in some of my software testing classes and workshops. Mythbusters is a perfect example of how to set up a test, perform it, then evaluate the results.

However, in watching several episodes back-to-back, I noticed some clear trends that I had not previously related to what we do as software testers.

Planning

You don't see the written plans, but believe me, they are there. Of course, you have the episode script, but there is much more going on behind the scenes. The crew has to get all the resources in place for the show. Some of the resources get expensive. In one episode I counted eight cars destroyed. Then you have explosives, the permissions to use the explosives, chemicals, the permission to use dangerous chemicals, special cameras and other devices. The list could go on, but the point is that those items, just like the items you need for testing, do not just magically appear. It takes planning.

Teamwork

It is impressive how well the Mythbusters collaborate and get along. They may disagree on some things, but each one is committed to letting the test tell the story. They ask for input and feedback, and it's refreshing not to see the drama that plays out in some of the "reality" shows.

Expert Consultation

When it comes to things beyond their scope of knowledge, the Mythbusters bring in a subject matter expert to evaluate the test results.

Measurement

The Mythbusters are fanatics when it comes to measuring what they do. Not only do they measure things, but they are careful to take good baseline and control measurements BEFORE they perform the test. If the Mythbusters can measure how much pressure it takes to get candy away from a baby, we can measure what we do in testing.
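
To make the baseline idea concrete, here is a minimal sketch in Python. The measure_throughput() probe and all the numbers are hypothetical stand-ins; the point is simply that the control measurement is captured BEFORE the change under test.

```python
import random
import statistics

def measure_throughput() -> float:
    """Hypothetical probe; replace this with a real measurement.
    Here it just simulates a requests-per-second reading."""
    return random.gauss(100.0, 5.0)

def sample_mean(samples: int = 5) -> float:
    """Average several readings to smooth out measurement noise."""
    return statistics.mean(measure_throughput() for _ in range(samples))

baseline = sample_mean()   # control measurement, taken BEFORE the test
# ... apply the change or run the test scenario here ...
result = sample_mean()     # the same measurement, taken after

print(f"baseline {baseline:.1f} req/s vs. result {result:.1f} req/s")
```

Without the baseline line, the result line has nothing to be compared against - which is exactly the trap the Mythbusters avoid.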

Estimation

The Mythbusters allocate certain amounts of time to their tests. That allows them to estimate the time needed for the entire test. The time for each of the tasks varies due to the complexity of the test. Setting the scope of the test is very important. The test is focused on ONE thing, not several. That allows them to stay right on point and on time.

It appeared to me that the total test time broke down into:
  • Planning - They discuss the urban legend to be busted or confirmed and how they plan to conduct the tests.
  • Set-up - This often takes up the largest percentage of time, depending on the specific test. In some cases, they go to very elaborate means to recreate, as closely as possible, the conditions of the original (supposed) event. In one show, the crew built a jail cell out of cinder blocks, poured concrete in the inner space and reinforced it with rebar. That alone would have taken multiple days.
  • Performance - The tests themselves can go quickly, but the trick is that many of the tests are repeated, either to confirm the findings or to get a large enough sample size. In fact, one impressive observation is that they will try a test several different ways. They might start by using just a model, then move to the real thing, such as a car or motorcycle.
  • Evaluation - This often goes quickly because the test has been so clearly defined, measured and documented.
  • Tear-down - This is the part we don't see, but in the case of the mock jail cell, I could see it taking a while.
I think a starting point for testers is to measure what percentage of total effort each of your testing tasks requires, at varying levels of test complexity.
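
If you want a concrete starting point, here is a minimal sketch in Python. The phase names mirror the breakdown above, and the hour values are made up for illustration; substitute whatever task log your team keeps.

```python
from collections import defaultdict

# Illustrative effort log for one test cycle: (phase, hours) pairs.
effort_log = [
    ("planning", 4.0),
    ("set-up", 16.0),
    ("performance", 6.0),
    ("evaluation", 2.0),
    ("tear-down", 3.0),
]

def effort_percentages(log):
    """Return each phase's share of total effort as a percentage."""
    totals = defaultdict(float)
    for phase, hours in log:
        totals[phase] += hours
    grand_total = sum(totals.values())
    return {phase: 100.0 * hours / grand_total for phase, hours in totals.items()}

for phase, pct in effort_percentages(effort_log).items():
    print(f"{phase}: {pct:.1f}% of total effort")
```

Track those percentages over a few test cycles at different complexity levels and you have the beginnings of an estimation model.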

Expected Results

The Mythbusters define what they expect to see either confirmed or busted. In fact, they know in advance what constitutes "confirmed" or "busted" and what the measures must be to support the findings. It's at this point the viewer gets the clear impression that these guys practice science. Personally, I also practice testing as a science, not a guessing game or a diversion. However, the popularity of the show comes from the fact that they have made science (and testing urban legends) fun and entertaining. I wonder if we could do that in testing?
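
Here is a minimal sketch of that idea in Python. The response-time threshold and the sample values are purely illustrative; the key point is that the verdict criteria are written down before any measurements are taken.

```python
# The pass/fail threshold is agreed on up front, not after seeing results.
THRESHOLD_MS = 200.0

def verdict(measurements_ms: list[float]) -> str:
    """Return 'confirmed' if every measured response time meets the
    pre-defined threshold, otherwise 'busted'."""
    return "confirmed" if all(m <= THRESHOLD_MS for m in measurements_ms) else "busted"

# Repeat the test, Mythbusters-style, to get a reasonable sample.
samples = [150.2, 180.7, 143.9, 210.4]
print(verdict(samples))  # prints "busted" - one sample exceeded 200 ms
```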

No Boredom

This is the challenge for testers. We get to blow stuff up also, but not in a physical sense. Let's face it, a system crash is not as exciting as a car crash (Unless the system you are testing operates a car, but we won't go there...). So, how do we make software testing exciting and perhaps even entertaining?

By the way, if you haven't seen my video of testing a mobile phone radar app during a tornado, that was exciting! But, as the Mythbusters say, "Don't try this at home."

The best answer I can give for this is to start thinking about testing software like the Mythbusters think about their tests. Create a challenge and build some excitement about confirming or busting ways that the software should work. You may not get to see a literal explosion, but you may make some people's heads almost explode in frustration when they see that really bad defect (I mean a showstopper, dead-in-the-water kind of defect) the day before the release is supposed to go out.

Some companies hold "bug bashes," but in my experience, most of the problems reported are either superficial or not actually problems. The issues are often misunderstandings of how the application should work. In one case, we held a bug bash at a company with customer service reps performing the tests. Over 200 issues were logged over a 4-hour period, and only 3 of them were confirmed as actual defects!

Contests are fun, but be careful. Don't make the contest about who finds the most bugs. You will get a lot of bug reports, and some of the better testing may be the tests that find few, but important, defects.

Perhaps a contest for the "Bug of the Week" or "Most Humorous Bug" (hint: error messages are great places to look for humor) might be interesting. I would love to hear your ideas on this. I would also love to hear about anything you or your company does to make testing exciting.

How's that for your New Year's challenge?

Now, go bust some of your own software myths!

Friday, November 14, 2014

Thoughts on the Consulting Profession

Sometimes I come across something that makes me realize I am the "anti" version of what I am seeing or hearing.

Recently, I saw a Facebook ad for a person's consulting course that promised high income quickly with no effort on the part of the "consultant" to actually do the work. "Everything is outsourced," he goes on to say. In his videos he shows all of his expensive collections, which include both a Ferrari and a Porsche. I'm thinking "Really?"

I'm not faulting his success or his income, but I do have a problem with the promotion of the concept that one can truly call themselves a consultant or an expert in something without actually doing the work involved. His high income is based on the markup of other people's subcontracting rates because they are the ones with the actual talent. Apparently, they just don't think they are worth what they are being billed for in the marketplace.

It does sound enticing and all, but I have learned over the years that my clients want to work with me, not someone I just contract with. I would like to have the "Four Hour Workweek", but that's just not the world I live in.

There's nothing wrong with subcontracting, either. I sometimes team with other highly qualified and experienced consultants to help me on engagements where the scope is large. But I'm still heavily involved in the project.

I think of people like Gerry Weinberg or Alan Weiss, who are master consultants and get their hands dirty helping solve their clients' problems. I mentioned in our webinar yesterday that I was fortunate to have read Weinberg's "Secrets of Consulting" way back in 1990, when I was first starting out on my own in software testing consulting. That book is rich in practical wisdom, as are Weiss' books. (Weiss also promotes the high income potential of consulting, but it is based on the value he personally brings to his clients.)

Without tooting my own horn too loudly, I just want to state for the record that I am a software quality and testing practitioner in my consulting and training practice. That establishes credibility with my clients and students. I do not take on consulting work only to farm it out to subcontractors. I don't consider that true consulting.

True consulting is strategic and high-value. My goal is to do the work, then equip my clients to carry on - not to be around forever, as is the practice of some consulting firms. However, I'm always available to support my clients personally when they need ongoing help.

Yes, I still write test plans, work with test tools, lead teams and do other detailed work so I can stay sharp technically. However, that is only one dimension of the consulting game - being able to consult and advise others because you have done it yourself before (and it wasn't all done 20 years ago).

Scott Adams, the creator of the Dilbert comic strip, had a heyday poking fun at consultants. His humor had a lot of truth in it, as did the movie "Office Space."

My point?

When choosing a consultant, look for 1) experience and knowledge in your specific area of problems (or opportunities), 2) the work ethic to actually spend time on your specific concerns, and 3) integrity and trust. All three need to be in place or you will be under-served.

Rant over and thanks for reading! I would love to hear your comments.

Randy


ISTQB Advanced Technical Test Analyst e-Learning Course

I am excited to announce the availability of my newest e-Learning course - The ISTQB Advanced Technical Test Analyst certification course.
Register Now

To take a free demo, just click on the demo button.

This e-Learning course follows on from the ISTQB Foundation Level Course and leads to the ISTQB Advanced Technical Test Analyst Certification. The course focuses specifically on technical test analyst issues such as producing test documentation in relation to technical testing; choosing and applying appropriate specification-based, structure-based, defect-based and experience-based test design techniques; and specifying test cases to evaluate software characteristics. Candidates will be given exercises, practice exams and learning aids for the ISTQB Advanced Technical Test Analyst qualification.

Webinar Recording and Slides from "Establishing the Software Quality Organization"

Hi all,

Thanks to everyone who attended yesterday's webinar on "Establishing the Software Quality Organization" with Tom Staab and me.

Here is the recording of the session: http://youtu.be/pczgcHGvV5Q

Here are the slides in PDF format: http://www.softwaretestingtrainingonline.com/cc/public_pdf/Establishing%20The%20SQO%20webinar%2011132014-1.pdf

I hope you can attend some of our future sessions!

Thanks,

Randy