24 September, 2012

Interview with Santhosh Tuppad on Usability


Do you enjoy great interiors and ambience at the hotels or restaurants you visit? One could compare the food to functionality, and the interiors and ambience to usability, which adds value to the dining experience.

Here is an interview with Santhosh Tuppad, who emphasizes that usability is a crucial part of testing any software designed and developed for people, whether around the world or in a specific location. Usability is a deep field of study, and testers who say, "Oh, usability is just checking whether the application is user friendly" are mostly faking it. The hard truth is that it isn't easy and never has been. I got Santhosh to answer a few questions. Take a look.


Why do you call usability a crucial ingredient of software?


It is because you want to:
a. Win more customers
b. Give the best to your customers
c. Win over your competitors
d. Entice your competitors' customers to become your customers
e. Give a better user experience to your customers
f. Do more business and increase your revenues
g. Make people spread the word about your software
h. Ultimately, do GREAT BUSINESS!



What if my management is not serious about it, or developers just reject the usability bugs I report?


This is a classic problem that most testers face. Here are some tips that might make them take it seriously. (Even if you change just one person's thinking, you have made an impact.)

  • Build a good rapport with your manager and help him/her understand usability and its importance.
  • Conduct a meeting with your team members and speak to them about usability and how to report usability bugs (bug advocacy).
  • Even if developers reject your bugs, there should be a genuine reason, not an arbitrary one. Keep reporting even if they reject them. If all the testers in your team report a large number of usability bugs, I bet the developers will stop rejecting them, or higher management will look into why so many bugs are being rejected. At the very least, you will not be blamed when a customer reports the same usability bug that you wanted to report but held back, fearing the developer would reject it.


What books do you read / how do you practice usability?


  • Psychology reference books
  • http://useit.com/ and http://boxesandarrows.com/ are good ones to refer to. (Be careful: you need not agree with everything these people say. You need your own thinking; that is how you make things better, rather than blindly following someone else's ideas. You will see how I have opposed some of Jakob Nielsen's ideas at http://useit.com/; however, I still refer to it for some cool studies.)
  • I design websites (well, buggy ones)
  • I interact with UI / UX designers
  • I attend conferences on UI / UX
  • I keep myself updated on new technologies that could add value to software when implemented or enhanced
  • I correlate usability with things in day-to-day life and do usability exercises with the things around me.


What important skills do you think you possess for getting better at usability testing?


Analyzing "what could be good" for end-users. Most of the time, I have succeeded when I analyzed things over a period. However, there is a bad side to it: you can start getting biased by your own ideas of usability rather than seeing it from the customers' point of view. I have seen testers say, "You've got to put yourself in the customer's shoes" and then test for usability. Well, it's easy to say, but very challenging to do. Your customers might number in the millions or billions of end-users, with different brains, different thinking abilities and different ways of using software. Now you see: "Wow, this is really challenging." If you still do not feel it, then either you do not want to accept it or you do not understand it. Continue to do your functional testing; most organizations are happy with it.

I have cultivated a practice of arguing with my own thoughts about usability. At times my colleagues have simply agreed with whatever I said, but I go back and think along those lines, and finally arrive at a better approach than what I originally proposed. That is the power of thinking. Do not stop just because people stopped arguing with you. Argue with your own thoughts, or tell yourself there is something better you can do rather than sticking to your old ideas.



Do you conduct workshops or talks, or would you be interested in guiding aspiring testers?


I am not ready to conduct workshops on usability yet. In the future, I will be.

With respect to talks, I am interested in exploring opportunities in formats like half-day seminars or two-hour talks on usability.

With respect to mentoring or guiding someone, I feel I am running out of bandwidth, as many people are approaching me for security testing. However, I can support you over e-mail, but do not expect quick replies. I might take up to a week, though you could also receive a response within a few hours or minutes.



About Santhosh Tuppad

Santhosh Tuppad is the Co-founder & Senior Tester of Moolya Software Testing Private Limited (www.moolya.com). He recently won uTest's Top Tester of the Year 2010 award, apart from winning several testing competitions on uTest and Zappers. Santhosh specializes in the exploratory testing approach, and his core interests are security, usability and accessibility, among other quality criteria. Santhosh loves writing and has a blog at http://tuppad.com/blog/. He has also authored several articles and crash courses. He attends conferences and confers with the testers he meets. Santhosh is known for his testing skills, and if you are passionate about testing, feel free to contact him at: Twitter: santhoshst | Skype: santhosh.s.tuppad | Santhosh.Tuppad@gmail.com

20 September, 2012

Why isn't testing good? - Inspection vs. Prevention

Once upon a time in Testerland, there lived an enthusiastic tester called Ether (Enthusiastic Tester). Ether was a well-known tester at the time. He had great rapport with Techland, Devland, Businessland and Salesland, to name a few. His knowledge and talent were the talk of the land for every release that ran into trouble just before D-day. His reputation was such that a new phase called 'Ether' was introduced for every release. This meant Ether had to bless every feature before his organization, Org, released it to the market.

Ether was happy with the way his life was going. He had been getting a pay hike every six months and a promotion every two years. Not to mention the heap of awards showered on him on a quarterly basis.

And then came the storm
Suddenly, Org wanted several "Ethers" in Testerland. They wanted to create more Ethers: train them and do whatever it took to replicate many such Ethers. After all, one Ether could not succeed in blessing all releases 24/7. He was human, and he needed to avoid burning out. Org thought the same. Thankfully.

Org lined up all its concerns in testing with Test-eagle, the highest authority in Testing. Test-eagle had set up the first team in Org which had now grown to about 10,000 testers in Testerland alone. He was a key guy responsible for all the innovative stuff that had happened in Testerland so far. It was only obvious he had to get involved in this mission.

Test-eagle met up with the top executives in Testerland and heard them out. He decided that a few decisions had to be made. His questions were pretty simple:
  1. Why are more testers not transforming like Ethers?
  2. What is stopping Org from building more Ethers?
He thought for a week's time and presented a report to his leadership team. His proposal was okayed in a day's time. No questions asked. After all, Test-eagle knew exactly what he was doing.

Test-eagle lined up all the managers in Testerland, Devland, Techland and others and passed a mandate on the following:
  1. A new role, 'Lead tester', was created, reporting to Quackle, who headed QCOE - the Quality Center of Excellence
  2. For every two testers, there would be a lead tester
  3. Lead testers would supervise testers' work on a day-to-day basis
  4. Lead testers would come up with metrics based on testers' work
  5. Lead testers would present their findings to the leadership once every quarter
Lead testers were the new Quality Inspectors
Test-eagle was to review the results after 1 month. Interestingly, Test-eagle didn't have an inkling of what was happening with testers in Testerland although he passed the "Inspection Bill" and expected it to rock as always.

One month later...
Test-eagle was welcomed with a pile of results and findings from lead testers who were proud of their findings. Some key observations were listed as below:
  1. Ether had left Org for greener pastures
  2. "Ethers in the making" had left Org for greener pastures
  3. Testers close to Ether in talent and knowledge had also left... for greener pastures, of course!
  4. Testers who had been doing reasonably well stopped doing so. Motivation had dropped
  5. Testers who used to do what they were told no longer did even that; productivity was close to zero
  6. Testers who did nothing meaningful continued to do the same
  7. Customers started complaining that even basic requirements were not functioning properly
Quality lay breathless at Org's Doorstep

What happened?
  1. Lead testers brutally pointed out how testers could be better at what they did.
  2. Lead testers wanted the testers to follow 50% scripted approach and 50% exploratory approach in addition to 20% of time/effort contribution to Automation
  3. Lead testers wanted to review every bug before it was reported in the bug tracking system
  4. Lead testers questioned the severity of bugs
  5. Lead testers rejected several valid bugs because they didn't think that those were bugs
  6. Lead testers questioned testers if they missed any bugs
  7. Lead testers questioned testers if they thought testers tested very little on a particular day
Lead testers forced the testers to TRY HARDER and DO BETTER, which the testers were already doing. Something more had to be done. No one knew what that "something more" was.

What went wrong?
Test-eagle ignored the following, which was only a subset of the problems that existed:
  1. Test environments were never stable, and the release management team hardly took onus
  2. Test environments, when available, needed a lot of troubleshooting from testers before they worked
  3. Checking code changes into environments took days at a stretch
  4. Code drops to testing teams never happened on time; there was always a two-week delay
  5. If code drops did happen on time, only a very small code mass came for testing. For example, if there were ten features, only one would be available for testing
  6. Code drops arriving near release deadlines contained a massive number of features
  7. Availability of test data was always a problem. The data team always forced testers to compromise testing with workarounds, and at times provided invalid or nonsensical data; a laptop, for example, would cost only 2 Baht
  8. Infrastructure in testing environments was never provided on time. If the production environment needed 3 load balancers, testers would be advised to test without the load balancers and certify that the code worked fine and didn't break anything new. Whether the code worked with 3 load balancers in production was a question not to be asked
  9. And a lot more...

What's wrong with the inspection approach? 
Inspection chokes! How would you feel if your mom or dad just kept staring at you, without saying a thing, while you did your household chores? The same thing happened to Testerland's testers.


There was a need for a CHANGE IN THE SYSTEM in its totality. What Test-eagle had done was transform just the "Testing System", converting people in other systems into supervisors who oversaw testing work and mocked it. Test-eagle placed the complete responsibility for quality on the testing team and let everyone else go scot-free.
Test-eagle failed to understand that Quality is everyone's job, but management's responsibility. 
Test-eagle was highly demotivated. What on earth had derailed the plan into which he had poured his blood and sweat? He had to calm down. He had to come down to the testers' level. He had to fix some or all of the testers' problems listed above to be able to improve the quality of testing. He had to fix the problems built into the larger system before blaming the 'Testing system'. And he did. As a first step, he figured he didn't need lead testers. Not that he fired them; he kept them, but stripped them of their Inspector roles. He made them part of the system.

Inspection Vs. Prevention mind-set
Test-eagle moved away from an Inspection mind-set to a Prevention mind-set. He reasoned that if he fixed more of the problems in the non-testing teams, the testing teams could do better. Testers could do better. Ethers could come back to Org without a second thought. He looked at some of the key problems Org had been facing for a long time and identified the top five to begin with. He summoned the respective stream leaders and discussed the problems in detail. A brainstorming session followed. A few action items were noted. And the group was off to execute.

Image: Inspection vs. Prevention (Courtesy: www.docstoc.com)
The first improvement was on the test environment challenge. The release manager was made answerable for every minute of test environment downtime. He took on the onus of fixing the problems right away, of course after being reprimanded by Test-eagle. (Sometimes, position power is the only power that works.) After a while, testers were happy. They no longer suffered long hours of environment downtime. They could test a lot more in the time scheduled for testing, and they didn't have to waste time on mindless things.

Test-eagle took each stream to task and fixed the problems one after the other. Slowly, the testing team was empowered. Empowered to TEST BETTER!

*This post is inspired by John Guaspari's book 'I Know It When I See It'.

Regards,
Parimala Hariprasad