If you get off the highway and take an alternate route when traffic slows to one lane, you are making a prediction. Likewise, if you decide to invite someone to dinner, that too is a prediction. The scientific method? Predictive in nature. Every time you make a decision, you are making a prediction of an outcome, and choosing one over another based on this prediction.
Prediction skills become second nature because of this daily application. These predictions may not be based on data or evidence, but involve some subjective guess about a preferred outcome. In the case of choosing a traffic route or a dinner date, it’s clear that not much data is involved. The decision involves subjective interpretations, intuitive hunches, and guesses about potential outcomes.
Will data analysis really enhance prediction accuracy? There are no guarantees without some understanding of data, of variation, and of process performance.
The perennial answer is, of course, “It depends.”
For decades, Shewhart control charts have provided users with information about process stability. On all Shewhart charts, data are plotted over time, in order from oldest to most recent. But the traditional control chart, an old standby, is not the only way to garner information from process data. For monitoring processes with small drifts or shifts, for example, the exponentially weighted moving average (EWMA) chart may offer an improvement over traditional Shewhart charts.
But again, that depends. Certain processes—for example, in the chemical industry—benefit from understanding small shifts or drifts in a process. For other industries, Shewhart control charts do the job quite effectively.
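As a rough illustration (not from the column), the EWMA statistic weights recent observations more heavily than older ones, which is why small sustained shifts show up sooner than on a Shewhart individuals chart. A minimal sketch in Python, assuming a known process target and standard deviation (both hypothetical inputs here):

```python
import math

def ewma_chart(data, target, sigma, lam=0.2, L=3):
    """Compute EWMA statistics with control limits.

    The statistic is z_i = lam * x_i + (1 - lam) * z_{i-1},
    starting from the target. The limits widen toward an
    asymptote as more points accumulate.
    """
    z = target
    points = []
    for i, x in enumerate(data, start=1):
        z = lam * x + (1 - lam) * z
        # Variance of z_i: sigma^2 * lam/(2-lam) * (1 - (1-lam)^(2i))
        half_width = L * sigma * math.sqrt(
            lam / (2 - lam) * (1 - (1 - lam) ** (2 * i))
        )
        points.append((z, target - half_width, target + half_width))
    return points  # list of (ewma, lower limit, upper limit)
```

With a small but persistent drift in the data, the accumulating statistic z crosses these limits while the individual observations may still sit comfortably inside conventional three-sigma Shewhart limits.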
How the cataclysmic shift in technology size and speed affects our work practices: Spreadsheets on a smartphone and other considerations.
Meeting compliance requirements and standards is sufficient to achieve the objectives of injury and accident prevention and to assure the health and safety of all employees—right?
Among the “Ten top business trends that will drive success in 2016,” reported in an end-of-2015 Forbes article by author and consultant Ian Altman, was the point that “Top performing companies will focus on connecting customers.”
Citing examples that include Uber, Airbnb, Kickstarter, and others, Altman notes that these companies may own no real estate and have no funds to invest, and yet they are among the most successful firms of 2016. He attributes their success, in part, to the fact that in the case of Uber, for example, “they excel at connecting riders with drivers.” Altman predicts that “The most valuable companies will connect buyer to seller, or consumer to content.”
Does this signal a return to customer service as a priority?
When Frederick Winslow Taylor advanced the principles of “scientific management” in 1909, he was hailed as a master of efficient production. In the context of the new century’s focus on science, his principles were met with the approval of manufacturers, who saw opportunities to improve productivity and enhance profitability.
The principles that Taylor advanced were based on the beliefs that there is one “right” way to do each job, that workers are motivated by money, and that close monitoring of processes assures that the most efficient methods could be applied. In one of his experiments, he studied the precise movements that were involved in bricklaying, timing each movement and outlining in specific step-by-step moves the most efficient way to lay bricks.
Taylor’s approach may indeed have improved productivity and streamlined processes in manufacturing. What it gained in efficiency, however, it lost in pride of workmanship, individual responsibility, and the motive for innovation. Unfortunately, its effects linger even a century later in the attitudes of managers toward their workers and in workers’ perceptions of their jobs. Its effects may even be seen in education, where rigorous testing assesses only mastery of highly specific content, with little emphasis on individual motivation or creativity.
Bluffing in poker, if used wisely, can increase your chances of winning. Bluff too much, though, and your opponents are on to you. Bluffing in quality isn’t so different. Whimsical changes decreed from the top—new quality teams, control charts displayed on large screens, new buzzwords for everyone to learn—might present a confident quality front to customers, employees, and auditors at first. These tactics might even, in some cases, result in improvements and cost savings.
At some point, though, your customers, employees, and other stakeholders will catch on. Unless significant changes are made in the way the organization operates, the gains will eventually slow to a crawl or stall completely. Why? Because the decrees at best temporarily change behavior, but the organization that created the quality issues remains fundamentally the same. When the “heat and light” used to alter the behavior is removed, the organization reverts to its former ways.
As World Quality Month celebrations give way to holiday festivities and November’s focus fades into the past, facing a new year in the darkness of December may offer an opportunity to attend to issues of developing and managing technology, and to contemplate the future of a company or organization.
Last month’s issue of Harvard Business Review, with a cover story related to “What really keeps CEOs awake at night,” addressed the timing of innovative technologies in an article authored by Ron Adner and Rahul Kapoor (https://hbr.org/2016/11/right-tech-wrong-time). We all know of technological innovations that have been released too late and missed the revolution (the article cites Blockbuster’s failure to address the shift from rentals to streaming, for example), as well as those that have been ready too soon, falling into a market that does not perceive their value.
To avoid the “right tech, wrong time” scenario, Adner and Kapoor suggest looking more closely at the ecosystems that support technologies. Understanding the competition between the new and the old ecosystems can help ensure more accurate predictions about the timing of transitions and make decisions about allocating resources more effective.
Setting aside time to celebrate quality offers an opportunity not only to reflect on our own quality improvement efforts, but also to recall other years and other celebrations, and to consider the history of the designation itself.
National Quality Month (October) started in 1988 in the U.S. and Canada, while Japan has been celebrating Quality Month (November) since 1960. World Quality Month was instituted in 2010, acknowledging the global impact that quality improvement has had on organizations, and recognizing that quality in products and services is important for organizations throughout the world.
The role of W. Edwards Deming and others is not to be forgotten as we reflect on the meaning of this month and recall its history.
Statistics has gotten a bad rap. People love to quote Mark Twain (“There are lies, damned lies, and statistics,” alternatively attributed to Benjamin Disraeli), Vin Scully (“Statistics are used much like a drunk uses a lamppost: for support, not illumination”), or Stephen Leacock (“In ancient times they had no statistics so they had to fall back on lies”).
For statisticians, these jokes have become quite tedious. Avoiding small talk at cocktail parties where quips are likely to come up or lying about one’s profession (“I’m a kind of mathematician” sometimes works) are not really satisfying alternatives to the lines that people have saved to shower on the innocent professional. What’s a statistician to do?