Static analysis and ROI

Author: vivekjog | Date: 2014-02-22 02:54

I regularly communicate with potential users who are worried about
errors in C++ programs. Their concern usually takes the following form:
they try the PVS-Studio tool and write to us that it finds too few
errors during their trial runs. And although we can tell they find the
tool interesting, their reaction remains quite skeptical.

A discussion then begins in which I try to convince them that
PVS-Studio is a good and useful product that could be very profitable
for their company. In response they criticize my explanations and make
caustic remarks about the analyzer's work and the false positives it
produces. Usual marketing work, in other words.

While corresponding with one of these users I wrote a detailed
answer, and my opponent suggested that I turn it into an article. This
is what I am doing here. I look forward to your comments on this
estimate of the profit static code analysis tools may bring. Although I
wrote the article with PVS-Studio in mind, the calculations seem
relevant regardless of which static analysis tool is under discussion.

The text cited below is an answer to the following fragment of a letter:

...

About 40 (forty) more real defects have been found; in most cases, it is bad copy-paste.

The question is: what will we gain from integrating an expensive
program into the development process of a software product in whose
code it detects so few defects? Yes, I understand that we will find a
fresh error more quickly, but there are not that many fresh errors.

...

So, let's have a look at static analysis tools from the viewpoint of ROI.

Let's take an average programmer who spends most of his working time
developing C++ software. It is easy for me to imagine such a person,
since I have been programming a lot myself for a long time. Suppose he
runs a static code analyzer every night. Also suppose that the
analyzer, used in this mode and at a medium programming pace, finds two
defects per week in the code the programmer has written.

This is not abstract reasoning; I say this relying on my own
experience. I am writing code at only about half my usual pace now, yet
almost every week the nightly analysis catches a mistake in my own
code. Usually it is some trifle that would reveal itself when writing a
test or running regression tests, but sometimes I find really serious
things. Here is a defect PVS-Studio found in my own code quite
recently:

bool staticSpecification = IsStaticSpecification(sspec);
bool virtualSpecification = IsVirtualSpecification(sspec);
bool externSpecification = IsVirtualSpecification(sspec);  // copy-paste defect: should call IsExternSpecification(sspec)

The fact that I write articles about the harm of copy-paste in no
way prevents me from making such mistakes myself. I am human too: I
copy code fragments, and I make mistakes. An error like the one above
is very difficult to catch. In practice it would cause the analyzer to
generate a false positive on certain code constructs in certain cases.
I would hardly have managed to create a manual test for such a rare
situation (to be exact, I did fail to catch it, since I had already
committed this code to SVN). What is insidious about this error is that
if a user complained about it, I would have to spend at least a couple
of hours searching for it and would also have to ask the user to send
me the *.i file. But let's not get distracted.

If the programmer writes code more regularly than I do, 2 real
warnings from the analyzer per week is a natural quantity. Over a year
of roughly 11 working months of 4 weeks each, the analyzer will produce
2 * 4 * 11 = 88 genuine warnings. We could neglect the time needed to
fix these defects and suppress false positives, but let's take it into
account anyway.

Suppose the programmer spends 20 minutes a week fixing 2 real errors
and suppressing 2 false positives. That makes 20 * 4 * 11 = 880 minutes
a year spent handling the analyzer, i.e. about 15 hours. Does that seem
like a large waste of time? It is very little compared with what we
will calculate next.

Now let's consider the cost of eliminating the same defects if the analyzer does not detect them during the nightly runs.

The programmer will find 60% of these errors himself a bit later,
while writing unit tests, during individual preliminary testing, or
while debugging other code fragments. Let's say that finding and fixing
such an error takes about 15 minutes in this case, since the person is
dealing with recently written code and knows it well. For example, the
programmer might notice text in some dialog that should not be there
and discover that the day before he accidentally wrote x.empty()
instead of x.clear():

url_.empty();  // defect: empty() only queries the string; clear() was intended
if (status_text)
  url_ = status_text;
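
The intended code, presumably, was only one method call away (a sketch of my guess at the intent, not the actual fix):

url_.clear();  // actually empties the string before the conditional assignment
if (status_text)
  url_ = status_text;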

And do not tell me that fixing such an error takes only 1-2 minutes.
The correction itself takes mere seconds, but you still have to find
the offending fragment, compile the fixed code, verify that the
correction is right, and probably commit it to SVN. So let's say 15
minutes.

I would like to note right away that errors of this kind are usually
fixed by programmers mechanically and are not counted as errors at all,
because they are never recorded anywhere.

35% of the errors will be found at the testing stage. These errors
have a longer life cycle. First a tester locates and reproduces an
issue. Then he writes up a description of the error and files it in the
bug tracker. The programmer finds and fixes the error and asks the
tester to re-check the fragment and close the report. The total time
spent by the tester and programmer together is about 2 hours. Here is
an example of such an error: incorrect handling of OPENFILENAME. The
programmer might be lucky and never see garbage in the dialog, while
the tester will, though not every time (a Heisenbug):

OPENFILENAME info = {0};
...
info.lpstrFilter = L"*.txt";  // defect: the filter must be name/pattern pairs ending with an extra null
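
For reference, lpstrFilter in the Win32 API expects pairs of null-terminated strings (a display name followed by a pattern), with the whole list terminated by one extra null character. A minimal sketch of a correct initialization (the display name here is my own choice):

OPENFILENAME info = {0};
info.lStructSize = sizeof(info);
// Each filter is a name/pattern pair of null-terminated strings; the
// string literal's implicit terminating null supplies the final extra '\0'.
info.lpstrFilter = L"Text files (*.txt)\0*.txt\0";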

That leaves 5% of errors unnoticed: the ones programmers and QA engineers cannot find, but a static code analyzer can.

If you take your current project and check it with PVS-Studio or
some other static analyzer, you will see exactly this unnoticed 5% of
errors. That 5% is those very 40 errors the potential user mentioned
while trying PVS-Studio.

The remaining 95% of errors you had already fixed yourself earlier,
while writing tests, through unit testing, manual testing, and other
methods.

So we have 5% of errors we cannot find, hidden in the product we are
releasing. 4% of them may never manifest at all, and we can ignore
them. The remaining 1% may reveal themselves unexpectedly on the user's
side and cause a lot of trouble. For instance, a client wants to write
a plugin for your system, and the program crashes because of this code:

bool ChromeFrameNPAPI::Invoke(...)
{
  ChromeFrameNPAPI* plugin_instance =
      ChromeFrameInstanceFromNPObject(header);
  // Defect: with '&&', a null plugin_instance makes the second operand
  // dereference a null pointer.
  if (!plugin_instance && (plugin_instance->automation_client_.get()))
    return false;

You never do that and always check external interfaces? Good for you. But Google Chromium failed here, so never make such promises.
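
Judging by the null check, the condition was probably meant to be inverted; here is a sketch of the likely intent (my guess, not the actual Chromium fix):

// Bail out if the instance is missing or has no automation client.
if (!plugin_instance || !plugin_instance->automation_client_.get())
  return false;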

If you value your client, you will have to spend many hours finding
the defect and corresponding with the client. After that you will have
to ship a special fix for him or release the next version ahead of
schedule. You can easily spend 40 hours of various people's working
time (not to mention their nerves) on such an error.

What? Who said it's not true? You have never wasted a whole week on
one insidious bug? Then you have never been a true programmer. 🙂

Let's calculate how much time we could save during a year:

88 * 0.60 * 15 = 792 minutes

88 * 0.35 * 120 = 3696 minutes

88 * 0.01 * 40 * 60 = 2112 minutes

Altogether the programmer spends (792 + 3696 + 2112) / 60 = 110 hours a year fixing just this subset of his own errors.

A team of 5 people will spend about 550 hours, or roughly 70 working
days, a year on their own mistakes. Taking into account paid days off,
vacations, and sick leave, that is about 4 months of work for some
abstract person.

Whether it is profitable to use a static analyzer depends on the salaries of your employees.

Since we are speaking about some abstract person (not only
programmers are involved), let's take a salary of $6,000 per month.
Taking into account payroll taxes, rent, computer purchase and
depreciation, bonuses, Internet, juice, and so on, we can easily at
least double this number.

So the cost of fixing these errors (not all of them, but most)
without static analysis is $12,000 * 4 = $48,000.

If we instead find the same errors quickly with static code
analysis, the cost of fixing them is 5 * (15 / 8) * $12,000 / 20 ≈
$5,600: five people, each spending the 15 hours a year (about 15/8
working days) calculated above on handling the analyzer, at $12,000 per
month spread over roughly 20 working days.

Let's add the price of a PVS-Studio license for a team of 5 to this figure.

The final cost of fixing errors with a static analyzer is then (3,500 EUR) * 1.4 + $5,600 = $10,500, assuming a EUR-to-USD rate of about 1.4.

Altogether, the net annual PROFIT from using the PVS-Studio static analyzer in a team of 5 programmers is:

$48,000 - $10,500 = $37,500.

The cost of fixing these errors has decreased more than fourfold.
It's up to you to think and decide whether you need this or not...
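
For readers who want to audit the arithmetic, here is a small self-contained C++ sketch that reproduces the whole calculation. Every constant is one of the article's assumptions (warnings per week, minutes per fix, salary, license price, an assumed EUR-to-USD rate of 1.4), not measured data:

#include <cstdio>

int main()
{
    // Assumptions from the article, not measured data.
    const double warningsPerYear = 2.0 * 4 * 11;  // 2 real warnings/week, 11 working months
    const double minSelfFound  = warningsPerYear * 0.60 * 15;       // caught soon by the author, 15 min each
    const double minTesterLoop = warningsPerYear * 0.35 * 120;      // caught by QA, ~2 hours each
    const double minFieldBug   = warningsPerYear * 0.01 * 40 * 60;  // escapes to a user, ~40 hours each
    const double hoursPerYear  = (minSelfFound + minTesterLoop + minFieldBug) / 60.0;  // 110

    const int    teamSize    = 5;
    const double monthlyCost = 12000.0;          // $6,000 salary, doubled for overhead
    const double costWithout = monthlyCost * 4;  // ~4 months of an abstract person's work
    const double costAnalyzerTime = teamSize * (15.0 / 8.0) * monthlyCost / 20.0;  // 15 h/person/year
    const double licenseUsd  = 3500.0 * 1.4;     // EUR license at an assumed 1.4 rate
    const double costWith    = licenseUsd + costAnalyzerTime;

    std::printf("hours lost per programmer per year: %.0f\n", hoursPerYear);
    std::printf("cost without analyzer: $%.0f\n", costWithout);            // 48000
    std::printf("cost with analyzer:    $%.0f\n", costWith);               // 10525, i.e. ~$10,500
    std::printf("annual saving:         $%.0f\n", costWithout - costWith); // 37475, i.e. ~$37,500
}

Tweak the constants to match your own team: the conclusion is most sensitive to the salary figure and to the share of bugs that escape to users.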

Finally, I would like to note that I worked from rather conservative
figures in these estimates. I actually believe the investment pays off
much better. I just wanted to show that you gain a profit even under
the most conservative assumptions. And please do not reproach me over
any particular figure: the article demonstrates a qualitative approach
to estimating the profit, not a quantitative one.
