Ideas fall short when they rest on fallacies, and software engineering is full of ideas. Let's break some of these fallacies down to make ourselves better engineers.

Brain

A ghost in the flesh

A human brain is a complex machine that evolved over millions of years. It works in the most peculiar ways and allows us to excel at perception, dexterity, and mind work alike. Some of its functions are a bit hacky: the mind takes a lot of shortcuts and leans on stereotypes to spend less energy on a task. This helps most of the time, but occasionally these shortcuts are badly wrong, leading you to poor decisions and an incorrect map of reality. The lens of your perception may be flawed, the machinery that grinds up the information you collect may malfunction, and the map you build may be highly incorrect.

Warts of cellular computation

Such errors have a name. That name is 'fallacy'. A fallacy is reasoning that is logically incorrect; it undermines the validity of an argument and marks it as unsound.

Like other knowledge workers, software engineers must do a lot of thinking, analysis, and mapping of reality to get the job done. During these processes, our mind sometimes takes a shorter route to the destination, leading to wrong decisions or poor planning. To avoid that, it is better to know your flaws.

How we fail

Mental processes are complex and multi-contextual, and so are the fallacies one may fall into along the way. Still, there are a few I consider particularly interesting for a software engineer. Here they are:

  • Nirvana fallacy (perfect-solution fallacy) – solutions to problems are rejected because they are not perfect. This is a common problem among experienced developers. They know how things should be and what the best option for the current issue would be, even though it may be time-consuming and hard to grasp from an outsider's perspective. Holistic solutions are beautiful and elegant, but real-world deadlines and constraints dictate otherwise. It is better to steer perfectionist tendencies toward architectural uniformity, startup time, environment, log output, and other ongoing metrics than toward one particular piece of code. Even though there are people building cathedrals, most of us work on the Ford assembly line, where proper processes, consistency, and a clean workplace win the day.

  • Appeal to authority (argument from authority, argumentum ad verecundiam) – an assertion is deemed true because of the position of authority of the person asserting it. When explaining practices or opinions on software development, project management, operations, etc., people tend to use somebody else's quote, blog post, conference talk, or other claim as the foundation for their own decision. Even though this is not always fallacious, it is usually better to add contextual arguments that apply to the specific solution or project rather than appeal to authority.

  • Historian’s fallacy – the assumption that decision-makers of the past viewed events from the same perspective and had the same information as those subsequently analyzing the decision. This fallacy frequently arises when a programmer works with legacy code. It is important to remember that the decisions of the past were not made by people who wanted to do harm. Most likely, the poor coding standards, entangled code structure, or confusing library choices had some reasonable background: a poorly structured development process, ever-changing requirements, or unrealistic deadlines. That said, there are some genuinely bad engineers out there too.

  • Misleading vividness – describing an occurrence in vivid detail, even if it is exceptional, to convince someone that it is a problem; this also leans on the appeal-to-emotion fallacy. Some engineers use this fallacy, parading all the technicalities of their job to cover incompetence and justify failing to roll out a feature on time. It is better to judge by the actual context and the likely outcome, not solely by a fellow engineer's explanation.

  • Survivorship bias – a small number of successes of a given process are actively promoted while a large number of failures are completely ignored. This one is especially important for people with a business mindset who want to found a startup or enterprise. They tend to study the success stories of prominent companies, which are the exception rather than the rule. Thousands of startups fail each year, and only a few survive to succeed; the sketch after this list shows how looking only at the survivors distorts the picture.
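To make survivorship bias concrete, here is a minimal Python sketch. All the numbers in it (cohort size, survival probabilities, the 'quality' score) are invented purely for illustration:

    import random

    random.seed(42)

    # Simulate a cohort of startups. Each gets a random "strategy quality"
    # score in [0, 1]; success is mostly luck, only slightly tilted by quality.
    N = 10_000
    startups = [random.random() for _ in range(N)]

    # A startup survives with a small base probability, nudged up by quality.
    survivors = [q for q in startups if random.random() < 0.02 + 0.05 * q]

    print(f"startups: {N}, survivors: {len(survivors)}")
    print(f"average quality, whole cohort: {sum(startups) / len(startups):.2f}")
    print(f"average quality, survivors:    {sum(survivors) / len(survivors):.2f}")

    # Survivors look noticeably 'better' than the cohort as a whole, yet even
    # the best strategy here succeeded at most 7% of the time. Studying only
    # the success stories hides the thousands of similar startups that failed.

With these made-up odds, the survivors' average quality comes out visibly higher than the cohort's even though luck dominated every individual outcome, which is exactly the distortion you get when learning only from the companies that made it.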

Conclusion

We, engineers, tend to recheck and test the artifacts of our work. Our systems have to work properly, and bugs have to be eliminated on sight for us to be proud of our work. However, there is more to it. Our systems have defects, and so do we. We need to hunt for bugs in our minds, not only in our systems. If the system that creates systems is not working properly, no wonder its product will be broken too. Remember: errors are not a sin; ignoring them is.