My wife does a fair amount of baking. I noticed that when she is baking a cake or bread, she will sometimes stick a sharp knife (like a paring knife) or a toothpick into the cake when the timer indicates that the cake is supposed to be “done.” Sometimes, after doing this, she takes the cake out to cool. Sometimes, she puts the cake back in the oven, sets the timer for a few more minutes, and goes about other chores. When the timer goes off a second time, the procedure is repeated.
One day, curiosity got the best of me and I asked what she was doing. What was she looking for? Her response was that she was checking to see if the cake was really done: she was looking to see if there was too much moisture or uncooked batter on the toothpick. If there was, the cake wasn't fully baked, so back into the oven it went. If the toothpick came out clean, it was fully baked and things were fine. The proof is in the tasting, and so far, she's been right every time.
I wish there were a toothpick test for data. We're swimming in the stuff. Looking at business results, I get the distinct feeling that many leaders are running with half-baked ideas because they are overwhelmed by too much data.
According to IFL Science (I can't tell you what that stands for, because this is a family blog), 90% of all the data in the world today was created in the last two years. DOMO has an infographic that claims we are generating about 2.5 quintillion (2.5 x 10^18) bytes of data every day. On average, the US alone spits out 2,657,700 gigabytes of Internet data every minute. What that means, generally, is that we cannot possibly hope to analyze all the available data before we make a decision. We will always be dealing with half-baked strategies when it comes to data!
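Numbers that large are hard to feel. A back-of-the-envelope sketch of what the daily figure means per person, assuming a world population of roughly 8 billion (my assumption, not a figure from the sources above):

```python
# Back-of-the-envelope scale check on the 2.5 quintillion bytes/day claim.
DAILY_BYTES = 2.5e18          # DOMO's estimate of global data created per day
WORLD_POPULATION = 8e9        # assumed round figure, for illustration only

per_person_bytes = DAILY_BYTES / WORLD_POPULATION
per_person_mb = per_person_bytes / 1e6  # decimal megabytes

print(f"~{per_person_mb:.0f} MB of new data per person, per day")
```

Even split evenly across everyone on the planet, that rate works out to hundreds of megabytes of new data per person every single day.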
AI to the Rescue?
That is, of course, until we get help from massive computational machines and Artificial Intelligence (AI). Theoretically at least, we can design algorithms to scan through all that data and give us a distilled set of options from which to choose. Or, better yet, just tell us what the right decision or course of action is.
And perhaps that will be true for mundane things like the introduction of a new product or service. Yet we will be faced with the same kind of overpowering amounts of data around human interactions. For example, we can already put together a complex picture of a suspected criminal's activity based on digital footprints. Add to that DNA data, digital images from cameras of all types, and so on, and we will easily be swamped. But who or what decides guilt or innocence? Will a jury of my peers have to include an AI agent as well?
I'm not sure what's worse — making a decision based on a lack of data (as we have been doing) or having too much data and leaving some of it unprocessed. Either way, it seems to me, I will have to be satisfied with half-baked ideas, at least with respect to fully understanding all the details in the data.
I think we can gain an advantage in our organizations when we develop an algorithm for a viable stopping point. More than ever, we can be trapped by analysis paralysis. When have we analyzed enough? How do we know the data cake is cooked through, still plenty moist, and out of the oven in time to feed the guests? I need a data cake toothpick.
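One candidate "toothpick" is a sequential stopping rule: keep consuming data only while it is still changing your answer. A minimal sketch, assuming the analysis is a simple running average over a data stream — the tolerance, window size, and simulated Gaussian data are all my own illustrative choices, not anything prescribed above:

```python
import random

def toothpick_test(stream, tolerance=0.01, window=5):
    """Stop consuming data once the running mean stabilizes:
    if the last `window` estimates all agree within `tolerance`,
    declare the analysis 'done' and stop pulling data."""
    total, count = 0.0, 0
    recent = []  # the last few running-mean estimates
    for x in stream:
        total += x
        count += 1
        recent.append(total / count)
        recent = recent[-window:]
        if len(recent) == window and max(recent) - min(recent) < tolerance:
            return total / count, count  # estimate, samples actually used
    return total / count, count  # stream ran dry before stabilizing

# Simulated "ocean" of 100,000 noisy measurements around a true value of 10.
random.seed(42)
estimate, used = toothpick_test(random.gauss(10, 1) for _ in range(100_000))
print(f"estimate {estimate:.2f} after {used} of 100,000 samples")
```

The point of the sketch is the shape of the rule, not the particular numbers: the test typically stops after a few hundred samples instead of chewing through all 100,000, because the extra data was no longer moving the answer. Picking the tolerance is the judgment call — that is the moisture level on the toothpick.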