Demonstrations, Experiments, and Software Testing

Did you know there’s a difference between an experiment and a demonstration? And have you ever considered why this difference is critical to a team that includes coders, testers, and managers?

In the software development life cycle, both experiments and demonstrations are referred to by the same name: “tests”.

While demonstrations show us something we already knew, experiments are designed to help us learn things we need or want to know.

The difference is critical because the goal of testing must not be simply to show that the product can work.

We test to learn about the product so that we can understand it well and address problems before it’s too late.

According to Michael Bolton, Lead Consultant, DevelopSense, “The more similar a test is to a previous instance of it, the less likely it is to find a bug. That’s why it’s essential to include plenty of variation in your testing.”
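To make that idea concrete, here is a minimal sketch (not from Bolton’s material) of adding variation to an automated check through property-based testing. It assumes a Python project with pytest and the hypothesis library available; `normalize_username` is a hypothetical function used only for illustration.

```python
# Minimal sketch: adding variation to an automated check with property-based
# testing. Assumes pytest and the hypothesis library are installed.
# `normalize_username` is a hypothetical function used only for illustration.
from hypothesis import given, strategies as st


def normalize_username(raw: str) -> str:
    # Stand-in for the system under test: trim whitespace and lowercase.
    return raw.strip().lower()


@given(st.text())
def test_normalize_is_idempotent(raw):
    # Rather than repeating one fixed example, hypothesis generates many
    # varied inputs, so each run exercises the code a little differently.
    once = normalize_username(raw)
    assert normalize_username(once) == once
```

The point is not the specific property; it is that each run varies the input instead of replaying the same demonstration.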

In one of his articles titled Alternatives to Manual Testing, Michael Bolton explains how experiential and exploratory testing are not the same.

“Of course, there’s overlap between those two kinds of encounters. A key difference is that the tester, upon encountering a problem, will investigate and report it. A user is much less likely to do so. (I noticed this phenomenon while trying to enter a link from LinkedIn’s Articles editor; the “apply” button isn’t visible and hides off the right-hand side of the popup. I found this while interacting with LinkedIn experientially. I’d like to hope that I would have found that problem when testing intentionally, in an exploratory way, too.)”

While the gamut of testing is vast, a context-driven approach to automation in testing certainly brings more value.

A Context-Driven Approach to Automation in Testing

Test automation can certainly do much more than simply simulate a user pressing buttons.
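As one possible illustration (a minimal sketch, not drawn from the webinar), a small script can use automation to generate thousands of varied inputs, run them through two versions of a routine, and surface any differences for a human tester to investigate. `legacy_parse` and `new_parse` below are hypothetical placeholders.

```python
# Minimal sketch: automation as a testing tool rather than a button-presser.
# It generates varied inputs, compares two hypothetical implementations, and
# reports differences for a human tester to investigate.
import random
import string


def legacy_parse(value: str) -> list:
    # Hypothetical existing behaviour: split on commas, keep whitespace.
    return value.split(",")


def new_parse(value: str) -> list:
    # Hypothetical rewrite: also trims whitespace around each field.
    return [part.strip() for part in value.split(",")]


def random_record(rng: random.Random) -> str:
    # Build a record of 1-5 fields containing letters and spaces.
    fields = [
        "".join(rng.choices(string.ascii_letters + " ", k=rng.randint(0, 8)))
        for _ in range(rng.randint(1, 5))
    ]
    return ",".join(fields)


def main() -> None:
    rng = random.Random(42)  # fixed seed so a run can be reproduced
    differences = []
    for _ in range(10_000):
        record = random_record(rng)
        if legacy_parse(record) != new_parse(record):
            differences.append(record)
    # The tool does the tedious comparison; the tester studies the results.
    print(f"{len(differences)} of 10000 generated records parsed differently")
    for sample in differences[:5]:
        print(repr(sample))


if __name__ == "__main__":
    main()
```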

Context-driven testers choose their testing objectives, techniques, and deliverables by looking first at the details of the specific situation, including the needs of the stakeholders who commissioned the testing.

In a paper authored by James Bach, creator of the Rapid Software Testing methodology, and Michael Bolton, the two write: “There are many wonderful ways tools can be used to help software testing. Yet, all across industry, tools are poorly applied, which adds terrible waste, confusion, and pain to what is already a hard problem. Why is this so? What can be done? We think the basic problem is a shallow, narrow, and ritualistic approach to tool use. This is encouraged by the pandemic, rarely examined, and absolutely false belief that testing is a mechanical, repetitive process. Good testing, like programming, is instead a challenging intellectual process. Tool use in testing must therefore be mediated by people who understand the complexities of tools and of tests. This is as true for testing as for development, or indeed as it is for any skilled occupation from carpentry to medicine.”

The spirit of context-driven testing is the project-appropriate application of skill and judgment. Context-driven testing places this approach within a humanistic, social, and ethical framework.

Ultimately, context-driven testing is about doing the best we can with what we get. Rather than trying to apply “best practices,” we accept that very different practices will work best under different circumstances.

Seven Basic Principles of Context-Driven Testing

As laid down by Cem Kaner, J.D., Ph.D., Michael Bolton, and James Bach, the seven basic principles of context-driven testing are:

  1. Good software testing is a challenging intellectual process.
  2. People, working together, are the most important part of any project’s context.
  3. Projects unfold over time in ways that are often not predictable.
  4. The value of any practice depends on its context.
  5. The product is a solution. If the problem isn’t solved, the product doesn’t work.
  6. There are good practices in context, but there are no best practices.
  7. Only through judgment and skill, exercised cooperatively throughout the entire project, are we able to do the right things at the right times to effectively test our products.

Illustrations of these principles in action include:

  • Testing is done on behalf of stakeholders in the service of developing, qualifying, debugging, investigating, or selling a product. Entirely different testing strategies could be appropriate for these different objectives.
  • Testing groups exist to provide testing-related services. They do not run the development project; they serve the project.
  • Metrics that are not valid are dangerous.
  • The essential value of any test case lies in its ability to provide information (i.e., to reduce uncertainty).
  • All oracles are fallible. Even if the product appears to pass your test, it might well have failed in ways that you (or the automated test program) were not monitoring (a short sketch of such an imperfect oracle follows this list).
  • Automated testing is not automated manual testing. It’s nonsensical to talk about automated tests as if they were automated human testing.
  • It is entirely appropriate for different test groups to have different missions. A core practice in the service of one mission might be irrelevant or counter-productive in the service of another.
  • Different types of defects will be revealed by different types of tests; tests should become more challenging or should emphasize different aspects of the program as the program becomes more stable.
  • Test artifacts are worthwhile to the degree that they satisfy their stakeholders’ relevant requirements.
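As noted in the oracle bullet above, here is a minimal, purely illustrative sketch of an imperfect oracle in Python (not taken from the cited material): the check below passes even though the hypothetical `deduplicate` function quietly loses the ordering a caller may depend on.

```python
# Minimal sketch of an imperfect oracle. `deduplicate` is a hypothetical
# function used only for illustration.
def deduplicate(items):
    # Illustrative flaw: converting to a set discards the original order,
    # which callers may rely on.
    return list(set(items))


def test_deduplicate_removes_duplicates():
    result = deduplicate(["b", "a", "b"])
    # The oracle only checks membership, so the test "passes" even though
    # ordering, which we were not monitoring, may well be broken.
    assert sorted(result) == ["a", "b"]
```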

From Michael Bolton’s point of view, testing must be a social (and socially challenging), cognitive, risk-focused, critical (in several senses), analytical, investigative, skilled, technical, exploratory, experiential, experimental, scientific, revelatory, honorable craft, not “manual” or “automated”. He urges that this misleading distinction take a long vacation on a deserted island.

Cigniti invites you to join an interesting webinar where Michael Bolton, Lead Consultant, DevelopSense, will be joined by Kalyan Rao Konda, President, Cigniti, to discuss the difference between demonstrations and experiments in software testing.

Michael Bolton is a consulting software tester and testing teacher who helps people solve testing problems that they didn’t realize they could solve. He is the co-author (with James Bach) of Rapid Software Testing (RST), a strategy and mindset for testing software expertly and credibly under uncertain conditions and extreme time pressure. He has taught RST to testers in 35 countries. Michael has been testing, developing, managing, and writing about software since 1988.

Kalyan is a strong proponent of IP-led testing services and has a patent pending in the area of intelligent test scenario generation to accelerate software test life cycles. He is also a sought-after consultant and a recognized speaker at industry events and conferences of international repute. Known for his expertise in setting up large-scale global delivery teams, delivering testing advisory services, and his people leadership skills, he has built leading global delivery teams and operations, achieved target gross margins, and managed the investments for the service delivery organization.

In this presentation, Michael Bolton explains the difference, and how scientists (and, yes, philosophers of science) came to distinguish between demonstrations and experiments.

Kalyan will share insights on how Cigniti is assisting leading global enterprises, including the Fortune 500, in their Software Testing & Quality Engineering initiatives and in accelerating their digital transformation.

Register for the webinar and save your spot to listen to some interesting insights on June 4th, 2021.

As a worldwide leader in independent quality engineering services, Cigniti is a strong advocate of quality assurance and of implementing it right from the initial stages of the software lifecycle. We encourage customer feedback and believe in incorporating that feedback into our broader quality assurance approach. We take great measures to ensure that we are fully equipped with state-of-the-art services and have partnered with other experts who specialize in providing testing services. Talk to us.

Author

  • Cigniti is a Global Leader in Independent Quality Engineering & Software Testing Services with offices in the US, UK, India, Australia, and Canada.
