According to Aubrey Daniels, common sense is “the unreflective opinions of ordinary people. Unreflective opinions are based on unanalyzed experience.” He goes on to opine that “contrary to popular belief there isn’t too little common sense in business, there’s too much.” I believe that the meaning of common sense has changed from the original dictionary definition. Now common sense means having sound judgment in practical matters.
In my mind, sound judgment means gathering the facts, analyzing the situation, and deciding based on data—not intuition. So, I do agree with Daniels on the rest of his hypothesis—which is that businesses need to be driven by the scientific method and must insist on data, not “common sense.” As leaders, we should not be flying by the seat of our pants; instead, we should insist on decisions being made based on analytics.
In addition to making operational decisions data-driven, organization leaders should insist on science-based education for those who would become leaders or managers. In my experience, there is little science-based training on how to manage or lead. Instead, we often promote those within the organization who have mastered their jobs, or we hire people from outside who have “industry experience.” It seems evident that leadership and management must be taught, not left to common sense, at least not common sense in the old definition.
There may be innate attributes that help a person become an excellent leader. However, many science-based studies of human nature can and should inform leaders how to inspire, engage, and develop employees. I’m not sure why we assume that technical competencies require training while management skills do not, yet that assumption appears to underlie most management development in the larger organizations I am familiar with.
As I mentioned above, I believe sound judgment requires gathering and analyzing as much data as possible. Properly studying data often means employing experts who understand statistics, probability, and cause and effect. For example, I frequently get into conversations about advertising effectiveness, especially when it comes to retail products during the holidays. Marketing executives point to the data and say, “See, we advertised extensively in the months leading up to the holidays, and sales are up.” I have seldom heard an executive say, “See, we stopped advertising this year, and sure enough, retail sales were down year-over-year.” Studies that retailers have done show that their products’ sales are unchanged regardless of ad dollars. People purchase gifts for friends and family based on branding, not advertising. Many companies could put those ad dollars right in the profit column instead of the expense column!
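Separating cause from effect here means running a controlled comparison rather than reading a single trend line. A minimal sketch of the idea, a randomized geographic holdout test, is below; every region name and sales figure is made up purely for illustration, and a real study would need far more data and a proper statistical test.

```python
import random
import statistics

# Hypothetical holdout experiment for ad effectiveness: regions are
# randomly split into a group that keeps advertising and a group that
# pauses it, and holiday sales are then compared across the two groups.
# All numbers below are simulated for illustration only.

random.seed(42)

regions = [f"region_{i}" for i in range(20)]
random.shuffle(regions)
ads_on, ads_off = regions[:10], regions[10:]

# Simulated holiday sales (in $1000s). Here both groups draw from the
# same distribution, i.e., we simulate a world where ads have no effect.
sales = {r: random.gauss(500, 40) for r in regions}

mean_on = statistics.mean(sales[r] for r in ads_on)
mean_off = statistics.mean(sales[r] for r in ads_off)
lift = mean_on - mean_off

print(f"mean sales with ads:    {mean_on:.1f}")
print(f"mean sales without ads: {mean_off:.1f}")
print(f"estimated lift:         {lift:.1f}")
```

Because the split is randomized, any persistent difference between the groups can be attributed to the advertising itself rather than to seasonality or branding, which is exactly what a year-over-year comparison cannot do.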
Clearly, there is little appetite for experiments that would uncover cause and effect for long-term benefit. Instead, fear of short-term losses keeps leaders from gathering and carefully analyzing the data. There is, at last, more research and analysis on the return on investment of advertising dollars. Our whole economy would benefit if we redirected ad dollars to research, development, and customer service. Perhaps this is a worthwhile resolution for the new year: complete, by the end of the year, scientifically based research into advertising effectiveness, conducted by disinterested third parties, not the Chief Marketing Officer.