While a number of software defects are the result of simple human errors that escape either scrutiny or test cases, the vast majority of defects found in ACIS come out of geometric cases that were not, and likely could not be, practically anticipated.
In one of my former lives as QA manager, one of the problems that continually grated on my nerves was the mysterious nature of our nightly regression test failures. Our test suite was incredibly fragile: when analyzing failures during our convergence period to determine whether they were truly indicative of a potential customer problem or just whiny tests, I was constantly forced to make decisions based on vague error information emitted from black-box tests of unknown origin. A lot of manual analysis was required for every release.
"Everything in moderation"… I’ve always had trouble with this saying. Because let’s face it, engineers are notoriously bad at moderation. We like to do what no one believed possible, to defy conventional thinking, to create a never-before-seen engineering masterpiece. The Hoover Dam, putting a man on the Moon… these kinds of things are what get engineers all worked up!
For my first blog post, I’ve decided to cheat. About a year ago, I wrote an article, ACIS as an Ecosystem, for our company newsletter. In that article, I presented the idea that “the functions within a large commercial software package form an ecosystem, in the technical sense of a collection of evolving actors which interact among themselves.” At the time, I wanted to go into more detail about the evolutionary theory behind this statement (because it’s both rigorous and really cool), but was limited by space.
Though I cannot remember the exact source, I have heard it often enough to be sure it’s not a figment of my imagination: the assertion that software productivity has not increased significantly in the past two decades, that the number of lines of tested and debugged code written per developer per day is roughly the same as it was twenty years ago.