On Better Hiring (or Do Coding Interviews Work? Part 2)

Whiteboarding

Last week I wrote about coding interviews and questioned whether they are the best method to predict future job success. There were strong opinions on both sides of the argument. Someone suggested that I was taking a quote from Laszlo Bock, senior vice president of people operations at Google, out of context, so I would like to begin by giving him the first word on what Google has discovered about interviews. Interestingly, he cites cognitive biases, specifically confirmation bias (as I did in my post), as a reason traditional interviews are not good indicators of job performance.

In April 2015 he wrote an excellent article for Wired Magazine and I encourage everyone to take a look so that they can form their own conclusions. I will quote at length in order to make sure that Mr. Bock’s ideas are not taken out of context. In the article he stated:

In 1998, Frank Schmidt and John Hunter published a meta-analysis of 85 years of research on how well assessments predict performance. They looked at 19 different assessment techniques and found that typical, unstructured job interviews were pretty bad at predicting how someone would perform once hired.

Unstructured interviews have an r2 of 0.14, meaning that they can explain only 14 percent of an employee’s performance. This is somewhat ahead of reference checks (explaining 7 percent of performance), ahead of the number of years of work experience (3 percent).

The best predictor of how someone will perform in a job is a work sample test (29 percent). This entails giving candidates a sample piece of work, similar to that which they would do in the job, and assessing their performance at it. Even this can’t predict performance perfectly, since actual performance also depends on other skills, such as how well you collaborate with others, adapt to uncertainty, and learn.

I believe that the work sample test he refers to is analogous to the coding interview. If not, I will err on the side of caution, since equating the two would mean granting that coding interviews are the most effective technique. Even so, the very best technique is only about twice as effective as an unstructured interview in predicting job performance. I hope some statisticians can weigh in on the topic, but an r2 value of 0.29 does not seem to be very indicative of future job success. I vaguely remember from college that in the physical sciences we look for much higher r2 values before we accept a hypothesis as proven.
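To make the numbers above a bit more concrete, here is a small sketch (my own illustration, not from Bock's article) that converts the reported r2 values back into correlation coefficients and shows how much of the variance in job performance each predictor leaves unexplained:

```python
import math

# r^2 values (share of job-performance variance explained), as reported
# in Bock's summary of the Schmidt & Hunter 1998 meta-analysis.
predictors = {
    "unstructured interview": 0.14,
    "reference checks": 0.07,
    "years of experience": 0.03,
    "work sample test": 0.29,
}

for name, r2 in predictors.items():
    r = math.sqrt(r2)        # the correlation coefficient implied by r^2
    unexplained = 1 - r2     # variance in performance left unexplained
    print(f"{name}: r = {r:.2f}, unexplained variance = {unexplained:.0%}")
```

Even the best predictor, the work sample test, implies a correlation of only about 0.54 and leaves roughly 71 percent of the variance in job performance unexplained, which is why an r2 of 0.29 struck me as underwhelming.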

Bock goes on to claim that using a combination of interview techniques in a “structured interview” is even more predictive (though he does not present any r2 values for the combination of techniques). And, believe it or not, I agree with him. He has presented some real scientific information that a structured interview process is better than an unstructured one, but I still wonder if we have committed a type 3 error (solving the wrong problem precisely).

Bock and I both agree that “the goal of our interview process is to predict how candidates will perform once they join the team,” but he also admits that doing things as he recommends is “hard to develop” and “a lot of work.” Even Bock states:

Full disclosure: I’m the Senior Vice President of People Operations at Google, and some of these (unstructured) interview questions have been and I’m sure continue to be used at the company. Sorry about that.

And this is where I would begin to question whether this process is one that everyone should follow. Of course, for highly desirable and highly capitalized companies like Google, Facebook, etc., the benefits likely outweigh the costs, but for the majority of companies that interview software developers, this may be a luxury they cannot afford. This overall hiring style is time consuming. There is no doubt that developers are willing to jump through hoops to work at Google or Facebook, but such is not the case for most companies, where time is not a friend. If a majority of companies use this process, they may find that the cost of candidates lost along the way outweighs the marginal benefit of hiring better. This is the reality that many managers and companies face.

This raises the question of what those who are not Google or Facebook, those without unlimited resources and legions of potential candidates, should do to improve their hiring of software development professionals. The first thing is to be aware of the true costs and benefits associated with pursuing one path over another. Eschew convention and ask the hard questions like, “Given limited time and financial resources, is this the best use of either (or both)?” Bock appears to agree that more effective interviews carry a high cost, but for most companies, is the higher cost justified? Is there a better way to allocate time and money? I believe there are alternatives that should be considered and evaluated.

One thing I have suggested in the past is the concept of apprenticeship and creating a pipeline of talent from within a company. This is an essential aspect of good management and would reduce the need for interviewing new candidates. My experience is that in software development in particular we often look for perfect fits and do not do a good job of creating a pipeline of candidates. This puts us “behind the eight ball” and increases the risk of hiring poorly by putting high pressure on making the right hire.

If we are truly looking for the best predictor of how well someone will perform on the job, then we could certainly achieve a much higher r2 value by actually putting someone on the job for a limited period (or at least giving them an experience that is as much like the job as possible). In response to my original blog, Duncan Campbell had perhaps the most insightful comment when he wrote:

The best way to find out if someone is good for a job is for them to do the job… which is why our “coding interview” is a real problem done in the candidate’s own time on their own PC followed by a code review.

I agree with Duncan, but it may be cost effective to go even further. Instead of spending many hours on interviews, make the quick decision to bring the candidate into the company on a trial basis, perhaps two weeks with pay. We all seem to have less critical projects that would be a good proving ground for potential long-term employees. If the candidate doesn’t work out after two weeks, then part company. It may prove more cost effective for managers to actually do the work of managing existing people than to spend a huge chunk of time in a laborious interview process.

Another possibility is the concept of contract-to-hire or using staffing agencies to do the heavy lifting. Full disclosure: I work for a company that has staffing as one of its offerings. Nevertheless, having another company take the time to screen candidates, and having the employment-related risk owned by a third party, does have its place in this discussion.

My original article, titled “Do Coding Interviews Work?”, was purposely open-ended. I have tried my best to present information to help people answer that particular question. The title was not “Are There Better Ways to Interview Software Developers?” for a reason. I am truly questioning any interviewing technique because of the high cost and the low correlation – even when interviews are conducted perfectly (or at least as perfectly as science has instructed us). I do not think that we will get rid of interviewing altogether, but it is important to know that for the great majority of companies there may be alternatives that, given all the costs and benefits, are more effective ways of answering the larger question of how best to hire. I gave a few suggestions. I look forward to others.

Brainstorming – Effective Technique or Sacred Cow?


I have spent a great deal of time studying and reading about human cognitive biases and their effect on business, especially the business of software development. This past weekend I finished the groundbreaking book by Stuart Sutherland, appropriately titled “Irrationality: The Enemy Within”.

Since I have studied the topic extensively before, some of the material was either familiar from other sources or had lost its shock value, as I have become thoroughly convinced of humankind’s built-in propensity not only for irrational behavior but also for failing to recognize that these biases are a problem. In fact, my experience is that a large segment of our population is not only ignorant of biases but seems to revel in a willful ignorance of scientific evidence. Certainly there appears to be a great deal of cognitive bias (mostly confirmation bias) in the debate on climate change.

My previous understanding of human cognitive bias notwithstanding, and although the book was published in 1992, the information is still relevant, interesting and cogent. There are a number of things worthy of note, but since there is such a wealth of information in the book, I decided to choose a single example to write about here and encourage those interested in more to get a copy of the original material.

The one thing that caught my attention and has stuck in my mind is the example of using a technique called “brainstorming” to improve creativity and productivity. For those who have been living on another planet, brainstorming is the process of getting as many ideas out as possible without judging or filtering them. It has been used for decades since its introduction by Alex Osborn in the book Applied Imagination. Osborn claimed that, in his experience, using brainstorming in advertising agencies resulted in 44 percent more worthwhile ideas than individuals thinking up ideas without the benefit of group discussion.

Ever since then, brainstorming has been widely used to improve the creativity and productivity of groups. However, here’s the kicker: since as long ago as 1958, Osborn’s claims have been subject to numerous studies which almost universally cast doubt upon the effectiveness of brainstorming. Keith Sawyer, a psychologist at Washington University in St. Louis, states: “Decades of research have consistently shown that brainstorming groups think of far fewer ideas than the same number of people who work alone and later pool their ideas.” In other words, brainstorming doesn’t work quite as well as we think it does (or should).

With a vast body of scientific evidence questioning the effectiveness of brainstorming, the real question is why its use persists. The question is at the heart of much of my agile practice, in that the prime issue is not whether a practice is merely effective, but whether it is optimal. It is obvious to me that several cognitive biases are in play in keeping brainstorming around.

There is something of the availability cascade at work in brainstorming, “a self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse (or ‘repeat something long enough and it will become true’)” (Wikipedia). Furthermore, a whole host of cognitive biases around groupthink, herd behavior and the bandwagon effect certainly influence the popularity of brainstorming. Since brainstorming “seems” to make sense, it is also subject to the belief bias, which occurs when the believability of a conclusion leads us to misjudge the true effectiveness of the process. Frankly, I suppose I could find literally dozens of cognitive biases that allow brainstorming to proliferate as the “go to” technique for group creativity and productivity.

Given that brainstorming may very well not be optimal, what are the alternatives that have actually been scientifically proven to be more effective? In a 2012 article for Psychology Today, Ray Williams proposes a few modifications to the brainstorming approach:

  • Have groups collaborate frequently by having them in close physical proximity to each other;
  • Pay attention to creating physical spaces that enable good collaboration, which facilitates people frequently “running into each other” while at work;
  • Revise the “no criticism” script of brainstorming to encourage debate about ideas;
  • Use appreciative inquiry techniques, where group participants build on ideas suggested by each individual in the group.

Most interesting to me about these suggestions is how closely they align with the things that Agile (and I) speak to, namely close attention to co-location of people within an Agile team to increase good collaboration, creating an environment where feedback is embraced rather than treated as “failure,” and using iterative feedback to improve ideas (and software) incrementally.

There are a great number of cognitive biases inherent in human beings. The first step is to be aware that these irrationalities exist. We must also acknowledge that we, as individuals, are subject to these irrationalities. Furthermore, we need to create an environment of safety that gives us the freedom and encouragement to continually explore and seek the underlying scientific truths, the “why” of what we do – the freedom to gore the sacred cows.

Agile – It’s All About Making Better Decisions


I’ve been spending a lot of time recently doing research, reading and presenting on human cognitive biases. For the uninitiated, cognitive biases are defined as

“…a systematic pattern of deviation from norm or rationality in judgment, whereby inferences about other people and situations may be drawn in an illogical fashion. Individuals create their own ‘subjective social reality’ from their perception of the input.” (Wikipedia Definition)

In other words, cognitive biases exist when there is a gap between our perception of reality and objective reality. For example, there is the “confirmation bias” which is our human tendency to seek out or interpret information that confirms one’s existing opinions.

While the term “cognitive bias” is relatively new (it was coined in 1972 by Amos Tversky and Daniel Kahneman), researchers have already uncovered well over a hundred cognitive biases. Some are relatively tame, like the “Google effect” (or digital amnesia), the tendency to forget information that can be easily looked up, while others can lead to more disastrous consequences, like the sunk cost fallacy, in which people justify increased investment in a decision based on prior investment instead of looking only at future efficacy. The sunk cost effect, along with the overconfidence effect and the recency effect, played a role in the May 1996 mountain climbing tragedy, made famous in the movie Everest, that resulted in the deaths of five experienced climbers.

A great number of cognitive biases have been found through the work of behavioral economics researchers like Dan Ariely, who wrote the wonderful books Predictably Irrational and The Upside of Irrationality. Underlying all of classic economics is the concept of homo economicus, the economic man who behaves in rational ways to maximize individual returns and acts in his own self-interest. Unfortunately, this is not the case: humans often act irrationally (and predictably so) because of their inherent cognitive biases. Humans all have a bias toward loss aversion, choosing to avoid a loss over a larger corresponding potential gain, and thus act as the “homo irrationalis” discovered by behavioral economics rather than the “homo economicus” predicted by classic economics.

It is our cognitive biases that cause us to make irrational decisions. Since behavioral economists found many of these cognitive biases, it was not a great leap to see how cognitive biases would be a paramount concern for the economics of software development. In my coaching practice, a great deal of my time and effort goes into helping organizations make better decisions about software development. Many times the optimal decisions are counterintuitive to people’s inherent biases, so my job (and my passion) is helping companies see the world of software development differently so that, when it comes down to making a decision, they have all the knowledge necessary to make the optimal economic decision.

One of the most prevalent biases in software development is to see the world in a mechanistic, Tayloristic manner. Taylor’s viewpoint was fine for the old world of physical work, but it does not hold up in the complex knowledge work being done by software development professionals today. Unfortunately, most of the people making software development decisions are predominantly influenced by this old, less optimal way of viewing the world and, as a result, make sub-optimal decisions. For example, in the mechanistic worldview, adding more people to an effort results in a corresponding increase in output. If there is an existing team of seven people and we add seven more, then we would (if we hold this mechanistic bias) expect the work to go approximately twice as fast. However, like the behavioral economists who found the real world counterintuitive to homo economicus, actual studies have found that the increased communication overhead of knowledge work nearly outpaces any incremental increase in individual productivity (see Top Performing Projects Use Small Teams). I have always said that if you want to double the productivity of a fourteen-person team, all that is necessary is to create two teams of seven.
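The intuition behind splitting the team can be sketched with the classic pairwise-communication formula, n(n−1)/2 channels for a team of n people (my own illustration of the overhead argument, not a figure from the study cited above):

```python
def channels(n: int) -> int:
    """Pairwise communication channels in a team of n people (n choose 2)."""
    return n * (n - 1) // 2

# One fourteen-person team vs. two independent teams of seven:
print(channels(14))               # 91 channels in the single large team
print(channels(7) + channels(7))  # 42 channels across two separate teams
```

Splitting the fourteen into two teams of seven cuts the number of communication paths from 91 to 42, which is one simple way to see why smaller teams lose less time to coordination.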

The mechanistic bias can also be seen in many of the ways that the Agile philosophy is implemented. For example, the scrum framework is often taught as a series of ceremonies and actions with little or no understanding of why those actions are successful. “Scrum Masters” are “certified” with only two days of training and a simple test. The training deals with ideal situations, but when the scrum master actually has to implement scrum, he or she is woefully unprepared. In the real world, compromises and decisions must be made. Without understanding the underlying “why” of agile and the basic nature of software development, the decisions and compromises that are made are not optimal. In my experience, this is why project managers are tougher to train than people with no project management experience. When faced with ambiguous information and the need to make optimal decisions, project managers tend to fall back on existing mechanistic knowledge, and the decisions made range from mildly irritating to completely disastrous. As I have often pointed out, to say that one was successful with waterfall reeks of confirmation bias, because it leaves open the question of whether one would have been more successful using another methodology or framework like Lean or Scrum.

In addition to the mechanistic bias, software development suffers from another bias, the project-centric bias, which is the tendency to see all work in terms of projects. Unfortunately, the project-centric bias is so ingrained in companies that radical changes are needed in the way we view software development across all areas, including accounting. Viewing work as a project when we are actually working on software products results in a whole raft of poor software economic decisions, like concentrating on features more than quality and security. Remember that no one washes a rental car.

As I think back on my coaching work in agile, the blogs I have written, the many discussions I have had and the presentations I have made, I think that all of these boil down to one very simple thing – my work is all about helping people understand the true nature of the software development business process and thereby helping them to make better decisions. Understanding our cognitive biases, therefore, is extremely important for my clients and myself because, in the end, Agile is all about making better decisions.

On Death and Dying and Agile Transformation


I was recently involved with a large scale Agile transformation and noticed what I thought was an interesting correlation, jotted down a note to blog about it and then promptly did nothing for a very long time. Usually these blinding flashes of light quickly lose their luster and find themselves relegated to the bottom of the blog backlog, never seeing the light of day, but this particular one reignited my attention as I sat down to write my newest blog.

My earth-shattering insight was that any organizational transformation, which obviously includes an Agile transformation, involves the very same stages that were first identified by Swiss psychiatrist Elisabeth Kübler-Ross in her 1969 book, On Death and Dying. For those who were sleeping through Psych 101, Kübler-Ross proposed a series of stages experienced by people grieving the death of a close friend or relative. The stages need not be experienced linearly or in any particular order, but she held that everyone goes through the five stages she recognized through her work with terminally ill patients.

The five stages are: Denial, Anger, Bargaining, Depression, and Acceptance. Though the model was originally created to explain the stages of grief following the death of a loved one, it was later expanded to encompass the grief stages associated with any major loss like the loss of a job or income or divorce / end of a relationship. It is my opinion that these stages can also be applied to the loss of a treasured idea. In fact, I think these stages are better explained by the death (loss) of a cherished idea since love, attachment, etc. are all associated with mental constructs (ideas). Our world is merely the sum of our mental perception so the loss of a loved one, loss of a job, or loss of a relationship are nothing more than the loss of an idea.

Once we understand that the grief stages are in response to the loss of an idea, it is not a great leap to apply this to any company transformation. It is well known that there are some who will readily embrace change, but there are a great number that see any change as a threat. What is the nature of the threat? I think that either consciously, or more often subconsciously, the threat is to an idea that one has grown to “love” and that there is a very real fear that if this idea were replaced that its death would cause grief. In my experience with a great number of agile transformations I do tend to see the five stages that Kübler-Ross outlined.

There is certainly a large share of denial when I have tried to help companies become more agile. There is never a shortage of people who will defend the status quo and insist that the current way of creating software (nearly always waterfall) is already successful and that there is no need to bring in agile. The minds of the people in denial are closed to any external threat to their enshrined beliefs.

I also see a great deal of anger during transformations. People have loved their ideas for so long that they are like a member of the family. How dare you agile folks try to kill off my favorite processes? I will do everything in my power to try to stop you, railing at the purveyors of such dangerous ideas.

I see my share of bargaining too. If we cannot outright defeat the new ways, we can at least try to keep as many of the old ways intact. Maybe we don’t have to kill off all the waterfall phases. Maybe we can keep the phases but just do them in shorter time frames. Maybe we can just do this “agile” thing for development and leave the rest of the sacred cows unslaughtered. I don’t have to give up my old way of thinking or deal with the death of my ideas – is there not room for both?

As new ideas begin to take hold, I have also seen my share of depression. People have viewed the world in one way for so long that once their ideas are shown to be outdated or not optimal, they begin to look forlorn and some even begin to despair. With waterfall gone, how am I to complete a software development project?

And finally, if the company has the intestinal fortitude to stick it out through the first four stages, you will get to acceptance. Now, with your new idea having taken hold, it probably won’t be long before you have to go through it all again with another new idea. I think the more we realize that changing our ideas means the death of old ones, and that the death of old ideas brings some recognizable stages, the more quickly we will be able to move through those stages and adopt new ideas.

Postscript: Interestingly enough, when I had just finished the above blog, I did some research to see if I had written on this subject before, because it seemed eerily familiar. I could find no blog that I had published on the subject, though I may have written about it before and not published. What my research did find was that I am not alone in linking the Kübler-Ross model and Agile; I found other references to this on Mindstorm and Agile Helpline. While I do not recall reading these blogs prior to writing mine, and it is possible that we all came to the same insight independently, I reference them here just in case I did read them at some point and forgot. Regardless, the fact that others have written about the very same topic leads me to believe in the concept’s applicability.

Post Postscript: This blog represents my 100th blog post under the agile-doctor.com website. While this is mostly symbolic and my 100th blog post will not guarantee me syndication (like a sitcom), it is still a moment to celebrate. Many thanks to everyone who has given support over the years! Stay agile my friends!

 

The VW Scandal as a Cautionary Tale – Cultivating Fear Always Ends Badly


For those who might have been taking a long vacation from reality, Volkswagen recently became embroiled in a scandal regarding some of its cars with diesel engines. It seems that when these cars were tested for emissions, a “defeat device” (a software program) would detect that the cars were being tested and change their performance accordingly. This led to claims that their diesel cars were better for the environment than their competitors’. In all, it appears that over 500,000 of these “clean diesel” cars are currently on the road, mostly in the United States.

Shortly after the scandal broke, the finger pointing began. Under pressure from a United States House of Representatives Oversight and Investigations panel, Michael Horn, Volkswagen’s United States head, stated, “This was a couple of software engineers who put this in for whatever reason.” While I find it very disingenuous and slimy to throw your software developers “under the bus” (albeit one with flowers and the smell of patchouli oil), the question that remains, assuming Horn was not involved, is why these “rogue” developers would decide to create something like a “defeat device” in the first place.

When I first heard the “rogue” developer explanation, I realized there are two possible reasons – the developers did not care about their work, or the developers acted out of fear for their jobs (or both). In either case, the root cause is a culture that breeds employee disengagement and fear. This means that while Mr. Horn might be accurate that “rogue” developers are responsible for the act of creating the malicious software, the responsibility for corporate culture rests squarely with leadership. Since Horn was the top leader, he bears the brunt of the responsibility for a culture that would allow (or coerce) developers to make such a disastrous decision, one that could cost VW billions.

That is not to say that the developers who made the changes should be held blameless, but it is not unusual for corporate culture to treat developers more like minions than the professionals that they are, shielding them from making the decisions we would expect respected professionals to make. Fear for their very jobs, while puzzling in an environment where good developers can pick and choose, was most likely the final calculation that allowed the developers to write this malicious code. And not only developers – where was Quality Assurance during this process? Again, it is poor culture that allows such misdeeds to flourish.

My educated guess received some support in a recent column in Road & Track by Bob Lutz, a former General Motors executive. Lutz placed the blame for the recent scandal directly on the shoulders of ex-chairman Ferdinand Piech. Lutz stated that Piech’s tenure was distinguished by a style of leadership that was “a reign of terror and a culture where performance was driven by fear and intimidation.” Lutz described one exchange with Piech regarding the body fits of the new VW Golf, in which Piech bragged about his “recipe” for better body fits:

“I called all the body engineers, stamping people, manufacturing, and executives into my conference room. And I said, ‘I am tired of all these lousy body fits. You have six weeks to achieve world-class body fits. I have all your names. If we do not have good body fits in six weeks, I will replace all of you. Thank you for your time today.’”

While this leadership style might work from time to time, it is motivational junk food. It creates a toxic culture where the long-term effects will one day negatively manifest themselves, in this case, with “defeat devices.”

Studies have shown that only about 30 percent of US workers are engaged in their work. That fact, along with dozens of years of experience, has led me to believe that a huge percentage of companies use fear as their primary means of motivation. My advice to leaders is to pay attention to the VW scandal and heed its warning. You will most certainly reap what you sow, karma will catch up with you, and cultivating fear will always end badly.

You Say You Want to Change the World?


Do you really want to sell sugar water, or do you want to come with me and change the world?

—Steve Jobs, recruiting John Sculley to become Apple CEO, 1983

We’re here to put a dent in the universe. Otherwise why else even be here?

—Steve Jobs

Changing the world. That’s some pretty heady stuff. I’d like to think we all want to change the world, to put our own dent in the universe. I know that when I talk about my work with SIS and 10XP Solutions, I frequently mention that my goal is to change the world of staffing and the world of software development. Both are laudable goals, but they raise the question: how does one go about changing the world?

The answer is deceptively simple but, like so many things in life, hiding in plain sight. The problem is that most of us fail to approach the problem correctly. We believe that if we are to change the world, we have to perform some action on the world itself – we need to create something that didn’t exist, we need to convince someone of something new – but this is the wrong place to begin. If we wish to change the world, we must first change the way we view the world.

The world only exists as we perceive it. This perception exists regardless of whether there is (or isn’t) an objective reality. Of course, we should passionately and unwaveringly pursue objective reality, but we should always acknowledge that our view of reality might not be correct and that the way we behave, as a result of our perception, might not be optimal. Therefore, when contemplating the qualities that will allow us to be leaders who change the world, we must never forget humility or curiosity. Our world will never change until we allow the freedom for our perception of the world to change.

When we look at those things that we say have changed the world, what, in fact, has changed? Is the world really radically different or is it our perception that has changed? I think if we give it any thought at all we would easily conclude that the world really doesn’t change much, but when new things, whether products or ideas, come our way and these allow us (or force us) to see the world differently, then the world itself has changed.

The first step, then, when we want to change the world is to change our own world by changing the way we view it. Once we have changed the world in that manner, the next step is to figure out how to lead and convince others that our worldview is the correct one. This is where the advice from my last blog, “You Can Be Right or You Can Be Successful,” comes into play. We need to help people see the new world, not because we are right, but because we have made the fundamental mental shift ourselves and found that the new worldview is more successful.

Steve Jobs made monumental changes to the world; he certainly made his dent in the universe. It is my opinion the reason he was successful was because before he changed the world, he changed himself and his view of the world. If we wish to follow in his footsteps and the many others who have changed the world, we need to have the curiosity to keep seeking, the humility to acknowledge that we don’t have all the answers and, once we have seen the world change by the new views we hold, the patience and compassion to lead others to see the world as we now see it.