Information is Power, But Only if it is Used.
A Newsletter from Custom Decision Support Inc. & Lieb Associates Vol. 7, Summer 2002
Obtaining and using information in an uncertain world are the themes of this month's newsletter. Management always wants to know the "optimum" price and how to obtain a proper representation of the data. The assumptions for identifying optimum results from survey data and the use of weights with survey data are covered.
"Obtaining A Prevalent Marketplace Awareness"
In our last Information Edge Newsletter (Vol. 6), we discussed the prospect of the "death spiral," where a lack of an adequate business growth model drives a firm into poorer and poorer economic conditions. While poor long-term management is a key cause of this situation, the lack of in-depth marketplace awareness is an equally culpable source. The military has long been aware of the power of information. The U.S. military has established the principle of "Dominant Battlefield Awareness," whereby through control of knowledge (both by gaining information and by denying it to the enemy) victory is assured, or at least obtainable. A similar principle should hold in the commercial world. Through knowledge, competitive advantage can be sustained. Because we usually cannot prevent our existing and potential competitors from knowing the market, it becomes even more important that we are not blind-sided.
However, while the concept is easy to state and may be self-evident, it is hard to implement. It is far too easy to assume that firms "understand the marketplace." The sales force is always expected to "know" their customers. Management reads the trade journals religiously and clearly understands all the trends influencing the industry. Businesses always have excellent relations with the distributors, who surely inform us of all competitive actions. And to assure our understanding, we augment our marketplace knowledge with routine customer satisfaction and market tracking studies.
Unfortunately, this sounds better than it is. Sales forces often hear what they want to hear. Trade journals publish what they are told. Relationships with customers and distributors are collaborative but also competitive; they often tell you what is in their interest. And finally, most routine customer satisfaction and tracking studies return only the information that is being sought. Most of the important information is not in plain view. It needs to be sought out. We need to uncover previously undiscovered opportunities and threats. Competitive advantages come from doing things in new ways and exploiting new opportunities.
Marketing research studies and competitive intelligence programs need to be planned and executed in a methodical, integrated approach. However, this costs money. Over the past two decades, we have seen in the United States an obsession with operational cost reduction. This has resulted in many cases in a drastic, if not draconian, reduction of commercial information gathering and analysis. We believe that continuing this "penny-wise and pound-foolish" approach results in a fixation with the present, an inability for businesses to grow, and the absence of any non-trivial awareness of the future of the marketplace.
The Strategic Edge
Optimum Answers in an Uncertain World
What do we mean by an "optimum" price? Marketing and survey research provide modeling tools to explore market behavior. From these models, we can obtain "optimum" answers if we can define what we wish to optimize. This is not simple in our highly uncertain world.
If competitive conditions are known and stable, we can determine the prices and conditions that should maximize earnings. For a single product, this earnings-maximizing optimum price can be obtained simply by plotting earnings over a price range. The procedures and interpretation of this type of optimization are discussed in the Fall 2000 Information Edge Newsletter (Vol. 5, No. 2).
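As a rough illustration of plotting earnings over a price range, the sketch below grid-searches a simple earnings function. The linear demand model and all parameter values are invented for illustration; they are not from the newsletter.

```python
# Sketch: single-product price optimization by evaluating earnings
# over a price range. Demand model and parameters are illustrative
# assumptions: volume falls linearly as price rises.

def earnings(price, unit_cost=4.0, base_volume=1000.0, slope=50.0):
    """Earnings = unit margin x volume, with volume linear in price."""
    volume = max(base_volume - slope * price, 0.0)
    return (price - unit_cost) * volume

# Evaluate earnings across a grid of candidate prices and pick the best.
prices = [4.0 + 0.25 * i for i in range(65)]  # $4.00 to $20.00
best_price = max(prices, key=earnings)
# With these assumed parameters the curve peaks at $12.00.
```

In practice the same plot-and-pick exercise is easily done in a spreadsheet; the point is simply that with known, stable conditions the optimum is read directly off the earnings curve.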
Similarly, optimum solutions can also be obtained for several products. This approach provides estimates of acceptable prices and conditions for the product line which would maximize total earnings. But here again we assume that we know the competitive conditions. However, the world is seldom that simple.
Competitive conditions are rarely known for certain. The issue is how to include the uncertainty in the process of estimating the "optimum" choice. The trick is to identify new types of goals that capture uncertainty. We have found two effective optimization goals: (1) maximizing the likelihood of exceeding targeted earnings, and (2) identifying a range of satisfactory results that would hold across a range of competitive conditions.
The first approach is based on imposing distributions of possible conditions and computing the results. The likelihood of exceeding a target earnings figure is computed from a number of simulations using those distributions. This is referred to as a Monte Carlo analysis and uses the same competitive model that we had used for the simple deterministic approach. This is fairly straightforward using Microsoft Excel. Probabilistic business analysis of this type using Excel is discussed in the Fall 1999 Information Edge Newsletter (Vol. 4, No. 2).
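A minimal sketch of such a Monte Carlo analysis follows (in Python rather than Excel, for brevity). The distributions, the earnings model, and every parameter here are illustrative assumptions, not taken from any actual study.

```python
import random

# Sketch: Monte Carlo estimate of the likelihood of exceeding a target
# earnings figure. Competitive conditions are drawn from assumed
# distributions; the simple share/earnings model is also an assumption.

def simulate_earnings(rng):
    """One draw of uncertain competitive conditions -> earnings."""
    competitor_price = rng.gauss(10.0, 1.5)   # rival price: assumed normal
    market_size = rng.uniform(800, 1200)      # total demand: assumed uniform
    our_price, unit_cost = 9.0, 4.0
    # Share rises when the competitor prices above us, clipped to [0, 1].
    share = min(max(0.5 + 0.1 * (competitor_price - our_price), 0.0), 1.0)
    return (our_price - unit_cost) * market_size * share

rng = random.Random(42)       # fixed seed for reproducibility
target = 2500.0
trials = 10_000
hits = sum(simulate_earnings(rng) > target for _ in range(trials))
prob_exceed_target = hits / trials
```

The same structure carries over to a spreadsheet: one row per trial, random draws for the uncertain inputs, the deterministic earnings model applied to each row, and a final count of rows exceeding the target.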
The major problem with using this method is that we need to assign probabilities to events that we normally have little insight into. Usually, the best we can do is provide a range of competitive values. An alternative approach is to estimate the impact of two extremely different scenarios and capture the solutions that would be satisfactory to both. This is essentially a "minimum regret" solution since we would have a satisfactory solution for most conditions between these extremes. The solution exists in the region overlapping the two conditions. These are shown on the following chart.
The point where these curves cross represents the highest common percent of maximum returns. In this case, it is 96% of both maxima. This is a "minimum regret" solution and is by its nature sub-optimum. While this point represents an "optimum minimum regret" solution, we typically seek a range of acceptable values. This is shown as a band of values set at some sub-optimum level below the optimum point. In this case, it is shown at an 85% range. This allows acceptable return as well as the flexibility required to handle other conditions. It is not unusual for the expected optimum solution to lie near the "minimum regret" solution, which provides some assurance that the expected optimum value will not produce terrible "unforeseen" results.
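The crossing-point logic can be sketched as follows. The two scenario earnings curves below are invented quadratics, used only to show the maximin ("minimum regret") calculation: express each scenario's earnings as a percent of that scenario's own maximum, take the worse of the two at each price, and pick the price where that worst case is highest.

```python
# Sketch: locating the "minimum regret" price where two scenario curves
# cross. Each curve gives earnings as a percent of that scenario's own
# maximum. The two quadratic earnings models are illustrative assumptions.

def pct_of_max(earn, prices):
    """Earnings at each price, as a fraction of the curve's own peak."""
    vals = [earn(p) for p in prices]
    peak = max(vals)
    return [v / peak for v in vals]

prices = [8.0 + 0.05 * i for i in range(121)]  # $8.00 to $14.00
scen_a = pct_of_max(lambda p: (p - 4) * (1000 - 50 * p), prices)
scen_b = pct_of_max(lambda p: (p - 4) * (1400 - 90 * p), prices)

# Maximin: at each price take the worse scenario, then pick the best price.
worst = [min(a, b) for a, b in zip(scen_a, scen_b)]
i_best = max(range(len(prices)), key=worst.__getitem__)
minimum_regret_price = prices[i_best]
common_pct = worst[i_best]   # highest common percent of both maxima
```

A band of acceptable prices is then read off wherever the worst-case curve stays above a chosen threshold (e.g. 85% of maximum), rather than committing to the single crossing point.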
Weighting Survey Results
The Techno-Tip column consists of suggestions and comments for data analysis. It is intended to help analysts and managers directly involved in the analysis of business data.
"The managers want to see the data weighed to represent their markets!" It is often desirable to present survey results in terms of market importance rather than in terms of the averages by respondents. For example, it is useful to weight market data by customersí purchases, or by the population groups in the market.
Typically, the respondents are selected in order to measure the characteristics of groups of people. It is, therefore, necessary in many cases to select samples that represent groups rather than the population as a whole. This is referred to as obtaining a "stratified" sample. It is designed to capture results for each segment with sufficient precision for decision making. Unfortunately, we also need to look at the results as a whole or in terms of other potential groups. In order to get a representative view of the data, we often resort to weighting the data.
At first glance, this would appear to be a straightforward exercise. The data doesn't change; it is just weighted in terms of its importance. However, that weighting exercise influences the effective size of the database and the precision of the average results. For example, consider a sample of 1000 respondents with only one individual being important and all others having weights of zero. In this extreme case, the actual sample size, for statistical purposes, is only one, not 1000. On the other hand, if we equally weighted the data, the effective number would once again be 1000. The effective sample size can be computed as:
h = (W1 + W2 + ... + WN)^2 / (W1^2 + W2^2 + ... + WN^2)

where h is the effective sample size, N is the actual sample size, and Wi is the weight for respondent i. Another way of looking at this is that the greater the variation in weights, the lower the effective sample size.
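This computation, commonly known as Kish's effective sample size, can be sketched in a few lines; the example weights below are illustrative.

```python
# Sketch: effective sample size under weighting (Kish's formula),
# h = (sum of weights)^2 / (sum of squared weights).

def effective_sample_size(weights):
    s = sum(weights)
    s2 = sum(w * w for w in weights)
    return s * s / s2

# Equal weights: the effective size equals the actual size.
equal = effective_sample_size([1.0] * 1000)         # 1000.0

# One dominant respondent: the effective size collapses toward 1.
skewed = effective_sample_size([1.0] + [0.0001] * 999)
```

Note that the formula is scale-invariant: multiplying every weight by the same constant leaves h unchanged, so only the relative variation in weights matters.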
Below are the results for the effective sample size for a number of commonly used market importance ratios. An 80/20 rule, for example, means that 80% of the sales come from 20% of the customers. This is a fairly common rule for general businesses. However, industrial suppliers often see 90/10 and even 95/5 rules, where as much as 95% of the earnings come from only 5% of the customers.
For an 80/20 rule, the effective sample size is slightly over 60% of the original sample. For a 90/10 rule, it is less than 40%. Because of this effect, we have tended to resist using weights, particularly when multivariate analysis will be used.
The Next Killer Applications
Once again, it is time to consider a new computer. This happens to me every three to four years. This corresponds to new machines with four to five times the speed and memory capacity, as well as a new family of gadgets and stuff. This new generation has brought us burnable DVDs, network connections, flat screen monitors and a new generation of operating systems. And finally, the price can't be beat. They sell for half the cost of the original IBM PC. The only problem is that the old system works perfectly well for most of our previous tasks. It still does word processing like a pro. The Excel spreadsheet works just fine on it. Why then invest in the new technology? Only if it offers new capabilities of great value that the older system does not.
But what are these new "Killer Applications?" While I tend to be cynical about new offerings and the hype associated with them, there are some new applications that may change the way we do things. In particular, voice recognition (dictation) and video editing were not feasible without the additional capabilities. As I had mentioned in a previous newsletter, voice recognition on older systems, with speeds less than 600 Megahertz, did not function effectively. Furthermore, the packages for dictation were difficult to work with and were prone to unacceptable error. The new generation of software is far better and appears to work satisfactorily on machines operating over 1 Gigahertz. Acceptability of error is, of course, a personal call; but I have found IBM's "Via Voice" a satisfactory tool at present.
While voice recognition may be considered only a convenience for those of us without major disabilities, video, on the other hand, may be a revolution on its way. Modern video equipment and editing software are affordable and can produce fairly high-quality, if not professional-looking, results. This is a relatively new phenomenon. With the advent of high-capacity storage (writable CDs and, more recently, DVDs), distributing video is likewise affordable, and the media players are ubiquitous. The only issue is the distribution of skills for their generation and the creativity of identifying cost-saving and competitive-advantage-gaining uses for this new capability.
Just as I recovered from a badly broken leg, I have now broken my foot. However, this one is minor compared to the last break and I am clearly on the mend. As many of the readers of this newsletter will appreciate, the recession has reduced the billable work for this consultant. That is not to say that the work has declined, only the cash flow. New prospective projects keep on coming in but approvals appear to take forever.
The recession has also liberated time to focus on some long-term projects: (1) expand the Business Research Methods and a set of Information Systems notes, (2) edit and publish a set of analyses based on the PIMS database, (3) revise the web site, and (4) explore the use of video for training in business research. We have expanded the Research Methods notes with the publication of a chapter on business modeling. This is the eighth chapter of the notes and is available on the web site. This chapter includes a tutorial on using Microsoft Excel for developing business decision support systems. The Management Information Systems notes had been published as part of our teaching activities at Drexel and Villanova Universities. These notes are now available on the web site. Three new chapters are being prepared on: (1) Knowledge Management, (2) Business Success Factors and Functional Imperatives, and (3) Selecting Project Development Methodologies.
We are editing a set of extensive notes and analyses by Jack Frey. Jack had prepared over 135 articles examining the nature of business performance using the PIMS database. The PIMS database has been compiled over the last three decades and consists of detailed characteristics of over 3000 businesses. These articles provide statistical views of the drivers of business performance. After editing, we expect to publish them on our web site.
We have started to revise the web site for easier use and to provide more business information. This is a continuous process. And finally, we are exploring the use of video to help educate business teams and managers on the use of analytical tools. This is going to be a long, involved process. What I know about video production is very little, and what I don't know is enormous. I'll keep you all posted on my progress.
Gene Lieb (Editor and President)
E-Mail at firstname.lastname@example.org.