A Large Collection of Miniskills

I really like the way Daniel Kahneman describes expertise in his book Thinking Fast and Slow. His description is especially meaningful today, in a world where so many of us work in offices and perform knowledge work. Expertise is important, but it is a nebulous concept in knowledge work compared to craftsmanship. Nevertheless, a good sense of what expertise actually is can be helpful when thinking about personal growth and success.


Kahneman writes, “The acquisition of expertise in complex tasks such as high-level chess, professional basketball, or firefighting is intricate and slow because expertise in a domain is not a single skill but rather a large collection of miniskills.” Thinking about expertise as a large collection of miniskills makes it more understandable and meaningful, even in the context of knowledge work. For sports, many crafts, and even physical labor, the miniskills behind expertise are so obvious that we barely notice them. For knowledge work, they are invisible for a different reason: they rarely look like skills at all, so we never think to notice them.


The image that comes to mind when I think of expertise as a collection of miniskills is iron forging or glasswork. One clearly needs many different skills, ranging from noticing subtle changes in the material as heat is applied to physically shaping the material once it reaches the right temperature. One also needs imaginative skill to see the shape and design one wants, and to connect the right twists, bends, and physical manipulations to match that mental image. Forging a knife or making a glass marble requires many skills in related but distinct spheres to produce one final product. It is obvious that a craftsman needs a lot of miniskills to be successful, but unless we enroll in a beginners’ class, we don’t necessarily think about all the miniskills that go into the craftsmanship.


In the knowledge work economy, our final work products are also an accumulation of miniskills, even though it can feel as though we just do one thing with no real “skill” involved. Our work requires communication skills, writing skills (a particular variation of communication skills), scheduling and coordinating skills, and oftentimes the ability to create visually stimulating and engaging materials. Whether we are building a slideshow, coordinating an important meeting, or drafting standard operating procedures, we are not simply doing one thing, but engaging an entire set of miniskills. True expertise in knowledge work is still derived from a set of miniskills, but the skills themselves don’t seem like real skills, and they are easily ignored or overlooked. Focusing on the miniskills needed for knowledge work expertise can help us understand where we can improve, what our image of success really entails, and how to approach important projects. It is the mastery and connection of various miniskills that enables us to be experts in what we do, even in our ubiquitous office environments.

Overconfidence

How much should you trust your intuitions? The answer depends on your level of expertise in the area where you have them. If you cook with a certain pan on the same stove every day, then you are probably safe trusting your intuition about where the temperature should be set, how long the dish will take, and where the hottest spots on the pan will be. If you are generally unfamiliar with cars, then you probably shouldn’t trust your intuition about whether a certain used car is the right one to purchase. In other words, you should trust your instincts in things you are deeply familiar with and in areas where you are an expert. In areas where you are not an expert and have only a handful of experiences, you should consider yourself overconfident if you think you have strong intuitions about the situation.


Daniel Kahneman demonstrates this with a math problem in his book Thinking Fast and Slow. Most of us don’t solve written math problems in our heads on a daily basis, so we shouldn’t trust the first intuitive answer that comes to mind when we see one. This is the case with the problem Kahneman uses: a bat and a ball together cost $1.10, and the bat costs one dollar more than the ball, so how much does the ball cost? The problem is deliberately designed to have an easy, intuitive answer (ten cents) that is incorrect (the ball costs five cents). It helps us see how our overconfidence can feel justified and still lead us astray.


Kahneman writes, “an observation that will be a recurrent theme of this book: many people are overconfident, prone to place too much faith in their intuitions. They apparently find cognitive effort at least mildly unpleasant and avoid it as much as possible.” Intuitions are easy. They come to mind quickly, and following them doesn’t take much conscious effort or thought. The problem, however, is that our intuitions can be wildly wrong. Sometimes they help us reach an answer quickly, and if we are experts they can even be life-saving, but in many cases our intuitions are problematic. If we never think through our intuitions, we won’t realize how often we act on them, or how our overconfidence can lead to poor outcomes.


This doesn’t mean we have to pull out a notepad and calculator every time we make a decision. Instead, it means we should pause momentarily to ask ourselves whether our immediate intuition is justified. If we are driving down a freeway we take every day and our intuition says to change lanes, we can pause for a beat and remember that we drive this route daily and know that one lane generally slows down more than the other. If we have an intuition about a complex public policy instead, we can take a minute to consider whether we truly know anything about that policy area and whether we should be more critical of our intuition; jumping to conclusions in public policy based solely on intuition can be dangerous. It doesn’t take much effort or time to ask whether our intuition can be trusted or whether we are overconfident, but doing so can have a big impact on how we relate to the world and on whether we trust the voice in our own head or the voice of experts.

Training Our Instincts

In his book Becoming Who We Need To Be, author Colin Wright explains how training in certain areas changes us. “Training our instincts is like feeding our subconscious. It grants us more informed, helpful knee-jerk reactions, rather than blind and potentially damaging impulses.” For example, Wright describes how experienced auto mechanics can diagnose a problem in one area of an engine from a signal in a different area, and how spending six months learning to cook gave him a new understanding and appreciation for the raw ingredients that can be combined into a meal. Things we once didn’t know about or understand at all can become sources of clues and subtle insights as we build experience and knowledge.


Recently, Tyler Cowen interviewed Ezekiel Emanuel for his podcast, Conversations with Tyler, and I was struck by Emanuel’s effort to learn and engage with something new each year. He has recently learned how to make his own jam and chocolate, and in the interview he talked about the insights and unexpected benefits he has gained from trying something completely new. He doesn’t stick with everything he learns and tries, but by applying himself in many different areas, he picks up new perspectives, meets new people, and gains an appreciation for things that were once foreign to him.


The lessons from Wright and Emanuel are worth building into our own lives. When we have only a vague idea of how the world works, we move through it making unwarranted assumptions. We act in ways that seem intuitively obvious to us, but our way of moving through the world may be as foolish as asking the French why they didn’t have an air tanker drop water on Notre-Dame. Ignorance can be quite costly, both in our own lives and in the negative externalities we push onto the rest of the world, and as more relationships, families, and businesses come to count on us, those costs fall on society as well. Becoming aware of areas where we have no expertise and no training helps us identify where our knee-jerk reactions won’t help anyone. Awareness of our ignorance can help us choose what to focus on, what to learn about, and what would make us better members of our society.


On the other side of the coin, as we become more expert in a given area, we will be better able to sense what is happening around us and make decisions that we can’t explain but that work properly. That kind of expertise is something to strive toward, but all the while we should recognize where it falls short and how bad assumptions could harm us and others.

Crisis

In his book A Hole at the Bottom of the Sea, author Joel Achenbach explores the 2010 BP oil spill disaster in the Gulf of Mexico. He examines the decisions made leading up to the night the well broke open, and how a solution to the worst oil spill in history was reached. What he discovers in writing the book is the importance of keeping the world’s experts engaged and connected when moments of disaster or crisis emerge. In regard to crisis management he writes, “A good rule in a crisis is, at the point of attack, keep the professionals in charge. This is the battle cry of competency. Don’t let a crisis put you off your game. Don’t rush, don’t panic, don’t deviate from best practices.”


In the section I pulled this quote from, Achenbach explains how BP’s engineers and the scientists brought in by the government approached the broken well. The deepwater drilling experts continued their work in a practical and pragmatic way, with extra brainpower and assistance piling on to find, research, and understand novel solutions to the problem. From the outside, the world seemed to be going mad in a desperate frenzy to see the well shut off, but inside the response, the top people were kept in charge and given the resources necessary to reach a solution.


This approach to a crisis reminds me of Stoic philosophy, which calls for tranquility and clear thought in challenging times. Reactionary behavior and frenzied emotions pull us in many directions and encourage hasty decisions based on half-formed thoughts. During a crisis, and during our most challenging moments, a clear and consistent thought process may seem maddeningly slow and tedious, but it will serve us better in the long run by keeping us from making rash decisions with unknown consequences.

Saving the Country

In Joel Achenbach’s book, A Hole at the Bottom of the Sea: The Race to Kill the BP Oil Gusher, we are presented with a concerning reality about the designed, engineered, and increasingly complex world we live in. Our systems today are so interconnected and include so many moving parts that it can be nearly impossible for any single individual to fully understand how everything functions together. When one or more parts of a system fail, the results can be catastrophic and unpredictable, challenging even those who built the system. Achenbach, however, does not look at our world with fear, because it is not just our systems that are increasingly interconnected, but also our smart people. Toward the beginning of his book he writes, “You never know when someone’s fantastically esoteric expertise may be called upon to help save the country.”


As our problems have become more complex, we have developed higher education and research opportunities that let individuals specialize in increasingly narrow fields. A common refrain on college campuses is that as one advances through multiple degrees, one knows more and more about less and less. The focus shifts from a broad knowledge base to an increasingly narrow, specific, and complete understanding of a single subject. What this means is that we have many experts, each of whom understands the problems and science of their own field in truly profound ways.


When disasters arise and systems fail, which Achenbach believes may happen with increasing frequency in the future, we can’t simply rely on the local, on-the-ground experts, designers, and engineers who built the failing system. Those who may be able to help save it could be spread across the world, and their fields may seem too distinct and far apart to be useful, but Achenbach believes that individuals can combine their expertise in novel ways to solve the most complex problems that arise. As our research grows, so do our social networks and our opportunities to combine knowledge in new ways. No single piece of research may seem critical for the planet, but each scientific view we can combine broadens our perspective on a problem and increases the creativity that can be brought to its solution. In his book, Achenbach shows how scientists from different fields pooled their knowledge and perspectives to find a solution to a problem that threatened the entire Gulf of Mexico.