Why would you move out of your job? Why would anyone move out of his/her current job? I bet it's not just MONEY all the time. Each one of us wants to be rich, but that does not mean we would spend 8-10 hours a day at work just for the money without enjoying what we do, without growing in the areas we like and are good at, without learning new stuff for self-improvement and so on. All said and done, we still look out for our dream job each time, and after we do land one such dream job, we often end up thinking this is not THE JOB FOR ME.
Coming to the point: Pradeep Soundararajan is writing a book on ‘Interviews and Jobs'. Would you be interested? Why would you not be interested? If you still plan to attend interviews at each of your dream companies and figure out what works in each company and how to get a job there, then nobody is gonna stop you. But do you have the time that it demands? If you don't, then how about this? Interviews and Jobs. On the website, you will get to see and read cool stuff that might land you your dream job or at least give you a big nudge towards finding one. Check out the website and provide your feedback. Do read the Teaser of the book. If you want to know more, check HERE.
It’s Better to Try and Fail Than to Have Never Tried At All.
Happy Reading,
Parimala Shankaraiah
http://curioustester.blogspot.com
29 September, 2009
27 September, 2009
Bangalore Weekend Testing 9 (BWT – 9)
Date and Time
26th September 2009, 3.00 PM
Mission
To find functional bugs in the application
Product Overview
Splashup, formerly Fauxto, is a powerful editing tool and photo manager. It's easy to use, works in real-time and allows you to edit many images at once. Splashup runs in all browsers, integrates seamlessly with top photo-sharing sites, and even has its own file format so you can save your work in progress.
Testers
Ajay Balamurugadas, Parimala Shankaraiah, Gunjan Sethi, Dhanasekar Subramaniam, Poulami Ghosh, Suja C S, Karan Indra, Rajesh Iyer, Amit Kulkarni.
Reports
My Report
Ajay’s Report
Approach
Testing multimedia and imaging applications has been a jittery ride for me for a long time, simply because I have not tested them or even brainstormed test ideas for such applications. The product for BWT-9 was one such application – Splashup. One look at the application and I was like ‘Oops! I am stuck’. I had no clue about what the product was, how I was going to understand how it works, or what to figure out and how. Any online help? Just nothing. I am surprised at how many applications are released to the market without basic help information, let alone contextual help! For Splashup, there was a brief overview of the product and that's it. I opened the application and observed the tools pane. Wow – this looks similar to MS Paint. I am not sure yet whether mapping one application to another does good or harm in terms of modeling the application under test (AUT). At least in this case, it helped me move forward.
Dear Facilitator
I was the facilitator for this session. Facilitating is a different ballgame altogether: inviting people to Google chat, reconnecting disconnected people, providing information to latecomers (if any), handling clarifications, questions and doubts, spotting misinterpretations and interpreting correctly, and so on. Wow. It was an awesome experience. I have always thought of myself as very poor management material. I have always dreamt of being a tester in an Individual Contributor (IC) role and being able to work like that forever in an ideal world. Sadly, this is not an ideal world. That pushes me to pursue a lot more skills to be learnt and practiced. Someday, these skills are going to help me be a better tester in some way. Respect every experience in your life. You never know when you will need those skills.
To learn from whatever happens, no matter how horrible that experience may be, is a kind of revenge on bad fortune that is always available to us – James Bach
Happy Learning,
Parimala Shankaraiah
http://curioustester.blogspot.com
26 September, 2009
Bangalore Weekend Testing 8 (BWT – 8)
Date and Time
20th September 2009, 3.00 PM, BWT testers meet up on Google Chat.
Mission
Choose one of the quality criteria out of the following –
Installability / Usability / Performance / Reliability / Compatibility / Testability.
Product Overview
Areca-Backup is file backup software that supports incremental, image and delta backups on local drives or FTP servers.
Testers
Ajay Balamurugadas, Vasupratha, Bhargavi, Sudhakar, Gunjan, Parimala Shankaraiah.
Reports
My Report
Ajay's Report
Approach
The beauty of BWT 8 lay in the flexibility to choose the mission by selecting any one of the quality criteria listed: Installability / Usability / Performance / Reliability / Compatibility / Testability. Previously, I had been fascinated by the results of testing the Nokia 1650 Insert Options by breaking down the feature set and choosing one small chunk to test. I wanted to better my previous testing performance. I chose Installability as the quality criterion – something I had not tested earlier in a session-based format. I set out to test and loved the entire experience.
One thing I noticed in this session was that a few testers chose the better-known, easier quality criteria. In short, they chose to stay in their comfort zones. Note that if any product is given to test with a vague mission like 'find bugs in the product', we as humans have a natural tendency to choose the areas we are comfortable with. For example, I love functional and usability testing and would jump at any opportunity to do just that. Over a period of time, I realized that other areas like performance, claims, reliability, compatibility and so on are important as well, depending on what the stakeholder demands. Hence, as a tester, it is good to have at least basic skills, if not great skills, in different test techniques. These can be improved over time by practicing more and more.
The best way to learn is to push yourself out of your comfort zone. Test what you are not sure about. Test what you do not know about. Test what you have not heard about. Ask questions. Find answers. Question your answers. And ask more questions. The final answer could be well within your reach. You be your first critic.
Advantages of testing smaller chunks of the product
1. Brainstorming one feature/task results in umpteen test ideas
2. Improved focus on a single feature/task
3. Adequate testing can be done in a 90–120 minute session
4. Finding a lot more valuable and hidden bugs (hungry for bugs? – Naa!)
5. Highly buggy features can be retested in future sessions, possibly in a shorter session of, say, 60 minutes
If you have tried testing by breaking testing tasks into smaller chunks, feel free to add to the above list.
Get Uncomfortable Today,
Parimala Shankaraiah
http://curioustester.blogspot.com
25 September, 2009
Which School do you belong to?
No, I am not talking about the elementary school you studied at. I am talking about the school of software testing you belong to. I was inspired to write this after James mentioned in one of his recent posts that he was not sure if I think of myself as context-driven. I do think of myself as context-driven, and I am proud to be a part of the Context-Driven School of Testing.
The Seven Basic Principles of the Context-Driven School as listed on the Context Driven Testing website:
1. The value of any practice depends on its context.
2. There are good practices in context, but there are no best practices.
3. People, working together, are the most important part of any project's context.
4. Projects unfold over time in ways that are often not predictable.
5. The product is a solution. If the problem isn't solved, the product doesn't work.
6. Good software testing is a challenging intellectual process.
7. Only through judgment and skill, exercised cooperatively throughout the entire project, are we able to do the right things at the right times to effectively test our products.
Do you see the Freedom and Responsibility that are available in this school of testing? Is this what you have been yearning for? Figure it out yourself. For more information on Context-Driven Testing, please visit HERE. A MUST READ for all context-driven testers.
Happy Reading,
Parimala Shankaraiah
http://curioustester.blogspot.com
21 September, 2009
Pair Testing - II
The last time I took up Pair Testing was while testing Vischeck with Ajay Balamurugadas. We had teamed up on Gmail chat, tested the product in a given timeframe and shared our learnings. If you visit the comments of this post, you will realize how I goofed up my learning by completely misunderstanding the product that I tested (I have intentionally left these comments in place to remind me not to repeat this mistake).
I got another opportunity to learn with a tester friend, Bhargavi M, who teamed up with me for pair testing. The best part was that we got together in one place instead of over Gmail chat. We mutually agreed to test the ‘Play’ option in Windows Media Player. While Bhargavi (in this case, the tester) took over the keyboard to test the Windows Media Player > Play option, I (in this case, the reviewer) chose to analyze and review the tests, in addition to brainstorming additional ideas which Bhargavi might miss (potential misses, not intentional misses). At the end of the session, we had tested all the ideas that both of us had come up with. Please find the report HERE.
Learnings from Pair Testing
1. It is a good learning to let the tester take complete control to test while the reviewer observes/takes down notes/reviews the testing process. This is a very good way for the reviewer to learn how different the thought process of the tester is. This also points to areas where the reviewer needs to improve to test better.
2. It is a good learning for the tester to understand which areas/test ideas were missed but caught by the reviewer during the note-taking process. It helps the tester observe and learn where he/she failed to identify the test ideas that the reviewer captured.
3. It is a challenge to think in diverse ways and execute all the test ideas alone. Pair Testing is advantageous in that two pairs of eyes are testing the application AT THE SAME TIME instead of just one. This not only helps in finding issues, but also enhances the investigative nature of the tester and the reviewer, as they keep building on each other's test ideas one by one. If the tester misses something, there is a high probability that the reviewer will catch it, and vice versa.
4. Pair testing gives more insight into the tester's scenario testing process. Each tester's thinking and understanding about scenario testing is different, and hence each executes it differently.
5. Pair testing yields good benefits when the tester's and the reviewer's skills complement each other. If both end up being similar personalities with the same set of skills, knowledge, intuition and so on, they might miss critical bugs that a more diverse pair would have caught.
6. Pair testing is different from Pairwise testing.
NOTE: I have called one person as the tester and the other as the reviewer because the reviewer's job is mainly to note down observations and add new test ideas. However, the reviewer can contribute to the testing at any time.
Thanks to Bhargavi M for the wonderful Pair Testing Experience.
Addendum on 6th Oct 2009
A good document on Exploratory Testing in Pairs.
Happy Pairing,
Parimala Shankaraiah
http://curioustester.blogspot.com
17 September, 2009
Indian Testing Renaissance
I wanted to share some good news with you – my blog readers (both silent readers and commenters) who are taking time to read my blog and even go a step ahead by commenting about my work.
Firstly, James Bach talked about me and my work in one of his recent posts, titled New Voice: Parimala Shankaraiah. My blog now sits on James Bach's blogroll. I am thrilled! Appreciation from James is very special because I have admired James Bach, Cem Kaner, Michael Bolton, Pradeep Soundararajan and many more great testers as my role models. With this appreciation comes an added responsibility on my shoulders to do more for the betterment of the tester in me, and for the testing community as well. Thank you, James!
Secondly, I won an Honorable Mention prize in the recently announced uTest Q3 2009 Bug Battle results.
These recognitions are special to me for 2 reasons:
1. Appreciation at the global level (not just for me, but for Indian testers in general) – a few more people might get to know not just about Parimala Shankaraiah and her work, but about Indian testers in general and the challenges they are facing. I am a little thirsty for more people to know us right now, because I want more and more people to check out and critique our work, in addition to all the self-criticism that we do for our own good. I have seen this help a lot. Just recently, it took me about 2 hours to reply to one of my blog readers' comments. In those 2 hours, I learned so many new things and came up with so many questions. It was a great lesson indeed.
2. The email sent by the uTest team sharing the results of the Q3 2009 Bug Battle read ‘Congratulations! You are a winner in our Honorable Mention category for the latest Bug Battle! Although this was not a planned prize category, we want to recognize the quality of your bug reports and feedback'. Wow! I am not sure if they sent the same email to everyone who got a similar prize! No offense meant!
My heart bleeds to see and hear bad things about Indian testers (including me), who are loathed for doing shabby testing, for blindly following test scripts for fear of getting fired, for not innovating on how much better testing can be, and more. I do understand and respect the difficulty of bringing about change in Indian testing organizations. Right now, while I am typing this, if I were to talk to my management about exploratory testing and ask them to let go of those dirty and obsolete test scripts, I know how strongly I would be opposed and how difficult it would be to do this all by myself. One way to overcome this is to build more and more context-driven testing groups, in whatever small way each of us can, and to spread the network (never mind if it is slow and quiet, as long as there is perseverance and no giving up!). This network will one day show how much better testing can get. I get goosebumps when I visualize that day. Dear testers, spread the word in your own small way!
Happy Networking,
Parimala Shankaraiah,
http://curioustester.blogspot.com
15 September, 2009
Session Based Testing – Nokia 1650
I have not explicitly tested mobile devices, though I have found issues on different cell phones that I have used in the past. This gave me an idea to test the Nokia 1650, the phone I currently own. I chose to test the 'Insert Options' feature in the Create Message functionality. I quickly glanced through the Insert Options feature at a high level and broke it down into the different types of insert options available (smileys, words, numbers, symbols and templates), how these options differ when the dictionary is on and the language chosen is English or Hindi, how the behavior differs if the dictionary is turned off, and what the effects are of changing the text format to different cases like uppercase, lowercase and title case. The testing checklist was ready. Please find it HERE.
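As a minimal sketch of how such a breakdown can be turned into a checklist (this is my own illustration, not part of the original session; the option names and dictionary states are simply the ones mentioned above), the combinations can be enumerated programmatically:

```python
from itertools import product

# Feature chunks observed in Nokia 1650's Create Message > Insert Options
insert_options = ["Smiley", "Word", "Number", "Symbol", "Template"]

# Dictionary states and text cases mentioned in the breakdown above
dictionary_states = ["Dictionary on (English)", "Dictionary on (Hindi)", "Dictionary off"]
text_cases = ["UPPERCASE", "lowercase", "Title Case"]

# Each combination becomes one checklist item for the session
checklist = [
    f"Insert {option} with {state}, text format {case}"
    for option, state, case in product(insert_options, dictionary_states, text_cases)
]

for idx, item in enumerate(checklist, start=1):
    print(f"{idx}. {item}")
```

Listing the combinations like this makes it obvious how even one small feature explodes into a sizeable checklist, which is exactly why picking a small chunk per session helps.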
As the session unfolded, I tested the dictionary on (English), dictionary on (Hindi) and dictionary off sub-features. I referred to the checklist from time to time to ensure that I was focusing on the charter and not deviating from it at any point. For example, when I was testing the Insert Options with dictionary on (Hindi), I found that the Create Message > Options feature has 4 additional options: Text: A B C, Text: a b c, Numbers: 1 2 3 and Insert Symbol. I was surprised because these were already available in the Insert Options. A redundant feature? Maybe. This was a guaranteed deviation from my current charter, so I noted it down as an opportunity and continued testing. Please find the Session Based Test Report HERE.
My Learnings:
1. It is a good idea to break the charter of the session into the smallest possible task (chunk) so as to focus on it and test it adequately (note that I am not using the word completely).
2. I can get rid of a separate testing checklist document by adding the checklist items to the Task Breakdown section of the Session Based Test report.
3. Any activity related to this testing session should be done within the duration that was originally planned.
4. Opportunities can play spoilsport by pulling the tester away from the charter; if the tester gives in to them, the session turns into a boondoggle.
5. Opportunities in the current session (outside of the current charter) can become the charters for future sessions.
6. The Test Notes section should have information about observations and inferences made while exploring, understanding and testing the product. Any bug should go into the Bugs section, and any queries should go into the Issues section.
7. Add a Risk field to capture the risk associated with each bug (the risk of not getting the bug fixed) and a Customer Impact field to capture any possible impact on the customer. In general, it would be nice to add Risk, Customer Impact and Oracle/Heuristic fields to indicate why and how the bug was found (OK, I am tweaking the original template suggested by Jonathan Bach; see the sketch after this list).
8. Issues in the Issues section should be followed up without fail with developers and product management teams for further clarity and follow-up testing in a separate session.
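To make the structure concrete, here is a minimal sketch of a session sheet with the extra fields suggested in point 7. The field names and layout are my own illustration of a tweaked template, not Jonathan Bach's original:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Bug:
    summary: str
    risk: str              # risk of not getting the bug fixed
    customer_impact: str   # possible impact on the customer
    oracle_heuristic: str  # why/how this was recognised as a bug

@dataclass
class SessionReport:
    charter: str
    task_breakdown: List[str]                               # doubles as the testing checklist
    test_notes: List[str] = field(default_factory=list)     # observations and inferences
    bugs: List[Bug] = field(default_factory=list)
    issues: List[str] = field(default_factory=list)         # queries/blockers to follow up
    opportunities: List[str] = field(default_factory=list)  # candidate charters for future sessions

# Example usage, with details drawn from the Nokia 1650 session above
report = SessionReport(
    charter="Test Insert Options in Create Message on the Nokia 1650",
    task_breakdown=["Insert Smiley with Dictionary on (English)"],
)
report.opportunities.append(
    "Explore the extra options under Create Message > Options seen with the Hindi dictionary"
)
```

Keeping the checklist inside the Task Breakdown and the deviations inside Opportunities is what lets one session feed the next.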
Ever since I read Jonathan Bach's article on Session Based Test Management, I have been thinking about how it can be adopted in a company that takes the conventional route to testing. Can you dare tell your manager that you will no longer write test cases? Can you tell your manager that he/she cannot generate any reports automatically anymore? Can you tell them that you test some areas here and there (explore), but nothing in particular? Can you convince your manager about your work just by showing the bugs you have filed? What if you did not find any bugs? According to conventional testers, the main problems with exploratory testing are that it is not measurable, auditable or manageable. If you read Jonathan Bach's article above, you will come to know that these 'so-called' problems are not problems at all. I will work on answering the above questions in a follow-up post in the near future. In the meantime, if you want to answer them, please feel free to add your answers in the comments section.
Happy Testing,
Parimala Shankaraiah,
http://curioustester.blogspot.com
14 September, 2009
Book Review: The Secret
Of late, I have been busy reading a book and a few testing-related articles (I am amazed by the amount of information that comes absolutely FREE on the internet for people to read, learn from, think about and practice!). Reading is something I love to do whenever I get some free time. It is also one of my ways of relaxing. I recently read the book ‘The Secret’ by Rhonda Byrne. You must have heard of the 90/10 principle: “10% of life is made up of what happens to you. 90% of life is decided by how you react”. ‘The Secret’ tells you that you decide what happens to you 100%! My favourite phrases from the book and the corresponding testing analogies are listed below:
1. Thoughts are magnetic and thoughts have a frequency. As you think thoughts, they are sent out into the universe, and they magnificently attract all like things that are on the same frequency. Everything sent out returns to the source
Testing Analogy: Bugs are magnetic and bugs have a frequency. Every bug returns to its source – the product, and you, the tester of the product.
2. The law of attraction says like attracts like, so when you think a thought, you are also attracting like thoughts to you
Testing Analogy: Good testers attract good bugs. When you think more bugs, you attract more bugs. Note that a tester does not improve the quality of the product. A tester provides vital information about the product based on which quality decisions can be made.
3. Nothing can come into your experience unless you summon it through persistent thoughts.
Testing Analogy: Summon every bug in the product through persistence!
4. You can start with nothing, and out of nothing and out of no way, a way will be made.
Testing Analogy: You can start with testing a product, and out of nothing (a buggy product) and out of no way (coverage), bugs will be made.
5. Expectation is a powerful attractive force. Expect the things you want, and don’t expect the things you don’t want.
Testing Analogy: Expect to understand and learn the product, expect to understand the users, expect the product to be of good quality before it is shipped. Do not expect that you cannot find any more bugs in the product ever. Never say never.
6. Treat yourself with love and respect, and you will attract people who show you love and respect.
Testing Analogy: Do you love the tester in you? If you do, then others will love and respect the tester in you too.
7. Do not listen to society’s messages about diseases and aging. Negative messages do not serve you.
Testing Analogy: Do not listen to scripts, do not listen to the traditional education, testing methodologies and training systems, do not give in to aging of the time and the mind. Just listen to the Voice within You – that you can better yourself with every testing effort that you take up.
8. What you resist persists
Testing Analogy: If you resist bad quality, it persists. If you expect the product to be of good quality, it persists. What do you want to persist?
9. Let go of difficulties from your past, cultural codes, and social beliefs. You are the only one who can create the life you deserve.
Testing Analogy: Let go of conventional testing. Break the chains, take the Exploratory Path!
Happy Reading,
Parimala Shankaraiah
http://curioustester.blogspot.com
10 September, 2009
Dead Links Test using Xenu’s Link Sleuth
Dead Link
A dead link, also called a broken link or dangling link, is a link on the World Wide Web that points to a web page or a server that is permanently unavailable. Though nobody has explicitly complained about any dead links on my blog, I did not want to assume that all is well. I went ahead and performed a dead link test out of curiosity :-)
Dead Link Test
Dead link tests identify the dead or broken links on a website, and they are one of the important tests to run while testing websites. It is very tedious to go through each and every link manually to check whether it is alive or dead, so several tools are available, free or paid, to help testers with dead link tests.
Dead links can be internal or external. Internal links are hyperlinks that refer to elements within the same website or domain. External links are hyperlinks that refer to elements outside of that website or domain. For example, on a blog, all the hyperlinks that reference something on the same blog are considered internal links, and all the hyperlinks that reference something on other sites are considered external links.
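As a rough, minimal sketch of what a dead link test does (this is my own illustration, not how Xenu's Link Sleuth works internally, and the starting URL is just a placeholder), a checker fetches the hyperlinks on a page, classifies each one as internal or external, and reports the ones that fail to resolve:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

class LinkCollector(HTMLParser):
    """Collects href values from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def check_links(page_url):
    """Fetch a page, classify its links as internal/external, and flag dead ones."""
    html = urlopen(Request(page_url)).read().decode("utf-8", "replace")
    collector = LinkCollector()
    collector.feed(html)

    base_host = urlparse(page_url).netloc
    for href in collector.links:
        link = urljoin(page_url, href)  # resolve relative links against the page URL
        kind = "internal" if urlparse(link).netloc == base_host else "external"
        try:
            urlopen(Request(link, method="HEAD"), timeout=10)
            status = "alive"
        except (HTTPError, URLError):
            status = "DEAD"
        print(f"{status:5}  {kind:8}  {link}")

# Placeholder URL: replace with the site you actually want to test
check_links("http://curioustester.blogspot.com")
```

A real tool also follows links recursively, handles images, frames, scripts and so on, and retries transient failures; the sketch only shows the core idea of the test.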
Xenu's Link Sleuth
Xenu's Link Sleuth checks websites for broken links. Link verification is done on "normal" links, images, frames, plug-ins, backgrounds, local image maps, style sheets, scripts and Java applets. It displays a continuously updated list of URLs which you can sort by different criteria, and a report can be produced at any time. The software is freeware and can be downloaded from HERE.
Dead Links Statistics on Curious Tester
1. Internal dead links: None
2. External dead links: 4
PS: A token of thanks to Alan Jorgensen for introducing me to this tool.
Addendum on 11th Sep 09:
When it comes to tracking test coverage in product testing, testing checklists (in MS Word or Excel, for instance) come in very handy. One really good feature I observed in the reports generated by Link Sleuth is the data presented in the 'Site Map of valid HTML pages with a title' section. This can act as a good base for creating testing checklists while testing the website as a whole, which is very useful for testers who start off with a high-level checklist before starting product testing.
Happy Exploring,
Parimala Shankaraiah
http://curioustester.blogspot.com