## Statistical Management

## Question 1

Quantitative analysis is a tool used to understand, explain and predict the behavior of different phenomena using measurement, research, statistics and mathematical modeling. It provides a scientific approach to managerial decision making because it involves manipulating and processing raw data into meaningful information using a structured procedure. Quantitative analysis involves seven steps: problem definition, developing the model, acquisition of input data, developing a solution, testing the developed solution, analyzing the results and, lastly, implementing the results.

Problem definition is the most difficult yet most important step in quantitative analysis. The problem needs to be stated clearly and concisely so that it gives direction and meaning. The problem statement should identify the real causes, not just the symptoms, and the objectives need to be specific and measurable. The second step is developing the model. Models contain variables and parameters. Variables can be controllable or uncontrollable: controllable variables are the decision variables, which can be influenced, while uncontrollable variables cannot. Parameters are the known quantities that form part of the problem. The model developed needs to be solvable and realistic, and must use mathematical representations that are understandable. Examples of models include scale models and schematic models.

The third step is the acquisition of input data. Data can be obtained from various sources, for example company documents, company reports, on-site measurements, statistical sampling and interviews. The data must be accurate and relevant to the problem being studied; otherwise, an irrelevant solution will be obtained. This is emphasized by the garbage in, garbage out (GIGO) rule. The next step is developing a solution. This involves manipulating the variables of the developed model until a practical solution is obtained. There are several techniques for finding a solution; common ones include solving equations, simulation, trial and error, complete enumeration and the use of algorithms.

After a solution has been found, it is tested to evaluate its accuracy before it is analyzed and implemented. When testing the solution, newly collected data is used. The solution is then analyzed to determine its implications. Implementing a new solution will inevitably lead to changes in the organization, and the impact of these changes is what is analyzed at this stage. The final step is implementation of the solution. Implementation is difficult because people tend to resist change. During implementation, appropriate modifications can be made.
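To make the last of these solution techniques concrete, here is a minimal sketch of complete enumeration applied to a small product-mix problem. The profit function and constraints are illustrative assumptions, not taken from the text:

```python
# Complete enumeration: try every feasible combination of the decision
# variables and keep the best one. All problem data here are hypothetical:
# maximise profit = 3x + 5y subject to 2x + 4y <= 40 and x + 3y <= 24,
# with x and y non-negative integers.

best_value, best_plan = None, None
for x in range(0, 21):          # 2x <= 40  implies  x <= 20
    for y in range(0, 11):      # 4y <= 40  implies  y <= 10
        if 2 * x + 4 * y <= 40 and x + 3 * y <= 24:
            profit = 3 * x + 5 * y
            if best_value is None or profit > best_value:
                best_value, best_plan = profit, (x, y)

print(best_plan, best_value)
```

Complete enumeration is only practical for small problems, but it guarantees the best feasible combination is found, which is why it appears among the standard solution techniques.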

In conclusion, quantitative analysis provides a scientific approach to managerial decision making because it involves manipulating and processing raw data into meaningful information using a structured procedure. The procedure is the same as that used in any scientific inquiry.

## Question 2

Quantitative analysis involves a structured procedure for developing solutions to managerial problems. It involves seven steps: problem definition, developing the model, acquisition of input data, developing a solution, testing the developed solution, analyzing the results and, lastly, implementing the results. Each of the seven stages has potential pitfalls that are likely to derail the process if they are not addressed. This section discusses those pitfalls.

There are two major potential pitfalls when defining the problem. The first is conflicting viewpoints. Different people with different personalities and professional backgrounds are involved in defining the problem, and the problem under consideration often cuts across departments with different objectives. For example, the main objective of the sales department is to maximize sales, while the production department's main objective is to minimize costs. It is therefore only natural that they hold conflicting viewpoints. The second pitfall is unrealistic assumptions. In defining the problem, some assumptions are made because not all variables can be defined. If the assumptions are unrealistic, the developed solution will be impractical.

The main pitfalls in developing the model are fitting the model and understanding the model. The model must be comprehensive enough to address the problem; otherwise, the problem will not be solved. It must also be simple enough to be understood by all the stakeholders who will implement it; otherwise, it will not be implemented properly. There is therefore a need to balance a well-fitting model against a model simple enough to be understood.

In the acquisition of input data, the major pitfalls are the availability and the validity of data. Data must be available in order to appreciate and analyze the given problem, and it must be relevant and accurate if a solution is to be found; otherwise, an irrelevant solution will be obtained. This is emphasized by the garbage in, garbage out (GIGO) rule. The major pitfall in developing a solution is selecting the best technique. The solution obtained depends on the technique used, and different techniques can produce different solutions for the same problem. It is therefore important to select the technique that best fits the situation.
The major pitfall in testing the solution is simulating the developed solution in the real world. Several assumptions are normally made when developing the solution, and some of them are ideal and do not hold in the real world, which makes the solution difficult to test. Lastly, implementation of the results is the most difficult part. People tend to resist any kind of change, because it requires learning new things and, in extreme cases, may mean that some employees are laid off.

In conclusion, quantitative analysis involves seven steps: problem definition, developing the model, acquisition of input data, developing a solution, testing the developed solution, analyzing the results and, lastly, implementing the results. Each of the seven stages has potential pitfalls that are likely to derail the process if they are not addressed.

## Question 3

Independent and dependent events are studied in probability theory. The study of probability is concerned with combinations of two or more events, and the relationship between these events determines the rules and methods used to compute their probabilities. In probability, an event is either independent of or dependent on other events.

In probability theory, events are independent if the occurrence of one event does not affect the probabilities of subsequent events. An example of independent events is flipping a coin and rolling a die: the probability of obtaining any number on the face of the die is not influenced by the probability of getting a head or a tail. On the other hand, two events are dependent if the occurrence of one event influences the probability of subsequent events. An example is the probability of drawing two aces from a pack of cards. A complete pack has 52 cards with four aces. If the first card drawn is an ace and is put aside, the probability of drawing an ace on the second draw changes, because the pack now has only 51 cards, of which only three are aces.
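The two examples above can be checked with exact arithmetic. The sketch below, using Python's `fractions` module, assumes a fair coin, a fair die and a standard 52-card pack:

```python
from fractions import Fraction

# Independent events: the coin flip and the die roll do not influence
# each other, so the joint probability is the product of the two.
p_head = Fraction(1, 2)
p_six = Fraction(1, 6)
p_head_and_six = p_head * p_six          # 1/12

# Dependent events: drawing two aces without replacement. After the first
# ace is set aside, only 3 aces remain among 51 cards.
p_first_ace = Fraction(4, 52)
p_second_ace_given_first = Fraction(3, 51)
p_two_aces = p_first_ace * p_second_ace_given_first   # 1/221

print(p_head_and_six, p_two_aces)
```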

It is important to differentiate between independent and dependent events in probability theory because the rules for computing probabilities differ between the two groups. The two common rules applied in probability are the addition rule and the multiplication rule. For illustration, we will use two random events denoted A and B.

The addition rule is used to determine the probability that at least one of two events will occur, that is, that either event A or event B (or both) will occur. In probability theory this is the union of A and B, written mathematically as A∪B. In general, the probability that either event occurs is computed by adding the probabilities of the two events and subtracting the probability that the two events occur jointly, because the two events may overlap. If the two events are mutually exclusive, meaning they cannot occur together, the joint probability is zero and the rule reduces to simply adding the two probabilities. Note that the relevant distinction here is between overlapping and mutually exclusive events, not between independent and dependent events. Using the two events A and B, this can be written mathematically as follows:

For any two events A and B:

P(A∪B) = P(A) + P(B) – P(A∩B)

If the two events A and B are mutually exclusive, then P(A∩B) = 0 and the rule reduces to:

P(A∪B) = P(A) + P(B)
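Both forms of the addition rule can be verified by enumerating a small sample space. The sketch below uses a single roll of a fair die, with illustrative events chosen so that one pair overlaps and the other pair is mutually exclusive:

```python
from fractions import Fraction

# Sample space: one roll of a fair six-sided die.
space = {1, 2, 3, 4, 5, 6}

def p(event):
    """Probability of an event, as favourable outcomes over total outcomes."""
    return Fraction(len(event & space), len(space))

A = {2, 4, 6}   # roll an even number
B = {4, 5, 6}   # roll more than 3 (overlaps with A)
C = {1}         # roll a one (mutually exclusive with A)

# General addition rule: subtract the overlap so it is not counted twice.
assert p(A | B) == p(A) + p(B) - p(A & B)

# Mutually exclusive events: the joint probability is zero,
# so the rule reduces to simple addition.
assert p(A & C) == 0
assert p(A | C) == p(A) + p(C)

print(p(A | B), p(A | C))
```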

The multiplication rule is used to determine the probability that two events will occur together, that is, that both event A and event B will occur. In probability theory this is the intersection of A and B, written mathematically as A∩B. For independent events, the probability that both events occur is computed by multiplying the probabilities of the two events. For dependent events, the probability that both events occur is computed by multiplying the probability of the first event by the conditional probability of the second event given that the first has occurred. Using the two events A and B, this can be written mathematically as follows:

If the two events A and B are independent, then:

P(A∩B) = P(A) × P(B)

If the two events A and B are dependent, then:

P(A∩B) = P(A) × P(B|A)

where P(B|A) is the probability of B given that A has occurred.
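The dependent case can be verified by brute-force enumeration. The sketch below uses a hypothetical urn of three red and two blue marbles, with two marbles drawn without replacement:

```python
from fractions import Fraction
from itertools import permutations

# Hypothetical urn: 3 red and 2 blue marbles, two drawn without replacement.
marbles = ["R1", "R2", "R3", "B1", "B2"]
draws = list(permutations(marbles, 2))          # all 20 ordered pairs

# Count the ordered pairs in which both draws are red.
both_red = [d for d in draws if d[0].startswith("R") and d[1].startswith("R")]
p_enumerated = Fraction(len(both_red), len(draws))

# Multiplication rule for dependent events: P(A and B) = P(A) * P(B | A).
# P(first red) = 3/5; given a red is removed, P(second red) = 2/4.
p_formula = Fraction(3, 5) * Fraction(2, 4)

assert p_enumerated == p_formula
print(p_enumerated)
```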

## Question 4

Program Evaluation and Review Technique (PERT) and the Critical Path Method (CPM) are commonly used tools in project scheduling. PERT and CPM were developed in the 1950s by two independent groups, but the two techniques have since been combined into the PERT/CPM method used in project scheduling. Both techniques were an improvement on the Gantt chart. PERT and CPM help project managers identify the sequence of time-critical activities that must be monitored closely if the project is to be completed on time. Both PERT and CPM are founded on the network diagram, which refers to the system of interrelationships between tasks and jobs for the purposes of planning and controlling projects. A network diagram highlights the interdependency of the different activities required to complete a project and enhances communication.

PERT was designed by the U.S. Navy, the consulting firm Booz Allen Hamilton and the Lockheed Corporation during the Polaris missile project. Time was the critical factor, so the technique utilized statistical tools to determine the probability of completing the project within a given duration. CPM, on the other hand, was developed by the Sperry Rand Corporation and the DuPont Company. CPM was created for industrial projects, where the critical factor was cost minimization. In addition to identifying the critical path activities, the designers of CPM also developed a mechanism for analyzing the time-cost trade-off, which they referred to as crashing.

Both tools use a network diagram. The network diagrams in both PERT and CPM use arrows to indicate direction, and rectangles or circles, referred to as nodes, to indicate an activity or task. Letters are normally used to represent activities, and in both cases the flow of logic in the network diagram is from left to right. However, the network diagrams differ: PERT uses the activity-on-arrow (AOA) convention, while CPM uses the activity-on-node (AON) convention. Although the two conventions can be used interchangeably, some project management software requires the logic of a particular convention.

When using PERT to control and schedule a project, we start by listing all activities in the plan. For each activity, the following details need to be listed: the earliest start time (EST), the amount of time it will take, and whether the task is parallel or sequential. For a sequential task, the stage it depends on must also be indicated. The activities are then plotted on a network diagram using arrows and circles. An arrow between two event circles shows the activity needed to complete the task. The network should have only one start point and one finish point, and each activity must have at least one preceding event and one succeeding event. However, no two activities may share both the same preceding and the same succeeding event. The completed critical path diagram shows all the project's activities. The estimated time of an activity is computed as the sum of the shortest time, four times the most likely time and the longest time, divided by six.
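The expected-time calculation described above can be sketched as follows; the activity names and the three time estimates are hypothetical:

```python
# PERT expected time: t_e = (optimistic + 4 * most_likely + pessimistic) / 6.
def expected_time(optimistic, most_likely, pessimistic):
    return (optimistic + 4 * most_likely + pessimistic) / 6

# Hypothetical activities: (optimistic, most likely, pessimistic) in weeks.
activities = {
    "A": (2, 4, 6),
    "B": (3, 5, 13),
    "C": (1, 2, 3),
}

for name, (o, m, p) in activities.items():
    print(name, expected_time(o, m, p))
```

The weighting of four on the most likely estimate reflects the beta-distribution assumption underlying PERT, which pulls the expected time towards the most likely value while still accounting for the two extremes.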

The procedure for drawing the network diagram in CPM is the same. However, in addition to identifying the critical path, CPM involves cost scheduling. This means calculating the cost of different project durations, with the aim of identifying the duration with the lowest associated cost, by adding equipment, labor hours and other resources to shorten activities. Cost scheduling involves comparing the normal cost and the crash cost. The normal cost is the cost incurred when the normal time for each activity is used; normal time is set at the point at which resources are used most efficiently. The crash cost, on the other hand, is the cost incurred when the crash time for each activity is used; crash time is the minimum possible time an activity can take. Crash costs are normally higher because crashing involves paying overtime premiums, extra wages and extra facility costs.
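The comparison of normal and crash figures is usually summarized as a cost slope, the extra cost incurred per unit of time saved. A minimal sketch, using hypothetical times and costs:

```python
# Cost slope of crashing an activity:
# (crash cost - normal cost) / (normal time - crash time),
# i.e. the extra cost per time unit saved. All figures are hypothetical.
def cost_slope(normal_time, normal_cost, crash_time, crash_cost):
    return (crash_cost - normal_cost) / (normal_time - crash_time)

# activity: (normal time, normal cost, crash time, crash cost)
activities = {
    "A": (8, 1000, 6, 1400),   # saves 2 weeks for an extra 400
    "B": (5, 600, 4, 900),     # saves 1 week for an extra 300
}

slopes = {name: cost_slope(*data) for name, data in activities.items()}
cheapest = min(slopes, key=slopes.get)   # crash the cheapest activity first
print(slopes, cheapest)
```

In practice only activities on the critical path are candidates for crashing, and the one with the lowest cost slope is crashed first until the desired duration or the crash limit is reached.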
