QUAL20016 Quality and Statistical Control - 1155_57775
Case Study 6.1
First Name: Allen
Last Name: Barnett
Student ID: 991 329 867
Similar calculations omitted for clarity.

Introduction
Quality…

Case Study 6.1
The subject of my case study is Hop Scotch Drive In, a restaurant that offers its customers a choice of dining inside or in-car service. To serve patrons in their cars, the company employs carhops to wait on them and to bring out food when it is ready. The company had previously set a wait-time standard for in-car service of 2 minutes, plus 0, minus 1 minute, for a customer to be seen by their carhop. Concerned that it is not meeting this standard, Hop Scotch records the time it currently takes to reach its drive-in customers …
Also, since Cpk is less than 0, the process is centered outside of the specification limits, and since Cp ≠ Cpk it can be stated that this process is not centered within the specifications. With this new information it has come to the company's attention that it is not meeting its promise of service within 2 minutes, so a new company directive is formed: workers are to try harder to get to the cars faster. Again, data is collected on the time it takes to reach each car.

Calculations part 3
Again the X-bar, sigma, Cp, Z(LSL) and Z(USL) are calculated and are as follows. Since Z(LSL) is still the lower value, it will again be used to calculate Cpk. Looking at these new numbers, there is one noticeable change: Cp is now greater than 1, meaning the natural spread of the process is no wider than the range set out by the restaurant. However, the new Cpk is still negative, so the process remains centered outside of the specifications. It is much closer, but still much too far from being a capable process. It can also be noted that, because of this improvement, the average time for a patron to be seen has improved by 0.279 minutes. Looking at the new process data, the company decides it needs to do something much more drastic if it wishes to meet its promise, so each carhop is given roller blades to further improve wait times …
Again the X-bar, sigma, Cp, Z(LSL) and Z(USL) are calculated and are as follows. Now that Z(LSL) is the lower value, it will be used to calculate Cpk. Examining the new results, Cp is still greater than 1, meaning the natural spread of the process is narrower than the range set out by the restaurant. However, the new Cpk is still negative, so the process remains centered outside of the specifications. It is much closer, but still much too far from being a capable process. It can also be noted that a further improvement of 0.971 minutes has been made to the time it takes to serve a patron since the previous change, for a total improvement of 1.589 minutes overall. Since Cp is greater than 2.5, this process is capable of operating within a range two and a half times narrower than the one specified; however, since the Cpk value is still less than 1, this process cannot meet the standard set out by the restaurant.
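For reference, the formulas used in each of these capability checks can be sketched in a few lines of Python. The specification limits below come from the 2-minute, plus 0, minus 1 standard stated in the introduction; the sample mean and standard deviation are illustrative placeholders, since the recorded wait times are not reproduced here.

    # Capability sketch for the wait-time process.
    # LSL and USL come from the stated standard (2 minutes, plus 0, minus 1 minute);
    # xbar and sigma are illustrative placeholders, not the case data.
    LSL, USL = 1.0, 2.0
    xbar, sigma = 2.7, 0.4              # hypothetical sample mean and standard deviation (minutes)

    z_usl = (USL - xbar) / sigma        # sigmas from the mean up to the upper spec
    z_lsl = (xbar - LSL) / sigma        # sigmas from the mean down to the lower spec
    cp = (USL - LSL) / (6 * sigma)      # potential capability: spread only
    cpk = min(z_usl, z_lsl) / 3         # actual capability: spread and centering

    print(f"Z(USL)={z_usl:.2f}, Z(LSL)={z_lsl:.2f}, Cp={cp:.2f}, Cpk={cpk:.2f}")

A negative Cpk simply means the sample mean lies outside the specification limits, and Cp equals Cpk only when the process is centered exactly midway between them, which is why Cp ≠ Cpk signals an off-center process above.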
The final table calculates the upper and lower control limits for the entire sample. Since I wake up every day at 6:30 am and must leave the house by 7:40 am, I have 70 minutes in total to complete all of my tasks in the morning. According to my calculations on the final table, I need to leave the house anywhere between 7:34 am and 7:39 am. Leaving within that window means that I have caught my bus and that my process has been a success for the day.
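The control-limit arithmetic behind that final table can be sketched as follows; the departure times are hypothetical placeholders (the actual table is not reproduced here), and a simple mean plus or minus three standard deviations rule for individual observations is assumed.

    import statistics

    # Hypothetical departure times, recorded as minutes after the 6:30 am wake-up
    # (e.g. 66 means leaving at 7:36 am). Placeholders, not the table's data.
    departures = [64, 66, 67, 65, 68, 66, 67, 65, 66, 69]

    center = statistics.mean(departures)
    spread = statistics.stdev(departures)

    ucl = center + 3 * spread   # upper control limit
    lcl = center - 3 * spread   # lower control limit
    print(f"center = {center:.1f} min, LCL = {lcl:.1f} min, UCL = {ucl:.1f} min after 6:30 am")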
If the process variable exceeds a predetermined value, what can you do to bring it back to normal?
• Provide at least two examples or problem situations in which statistics was used or could be used.
Further, machines continued to break down because of non-functioning motors, exposing workers to considerable safety risks. As a result, fewer machines were available for use, which increased the backlog and in turn further delayed meeting customer demand for the products.
14) Refer to the table. What is the average time a customer spends waiting in line and being served?
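Since the referenced table is not reproduced here, the sketch below only illustrates the arithmetic the question asks for, using made-up values: the average time a customer spends in the system is the average wait in line plus the average service time.

    # Hypothetical values standing in for the missing table
    avg_wait_in_line = 3.2    # minutes spent waiting in the queue (Wq)
    avg_service_time = 1.5    # minutes spent being served

    # Average total time a customer spends waiting in line and being served
    avg_time_in_system = avg_wait_in_line + avg_service_time
    print(f"Average time in system: {avg_time_in_system:.1f} minutes")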
46th & Hiawatha project is out for re-bid on 12/22/16. Based on the newly issued preliminary drawings we will need a revised taper quote. Note, the attached drawings are preliminary drawings so if you have any questions with regard to how the system should be laid out please contact Troy or
After running a process flow [see Exhibit 2], it becomes apparent that a main bottleneck exists at the
The neighborhood that I have picked for the fieldwork project is Uptown. It is on the North Side of Chicago, about 6 miles from the Loop, according to the Encyclopedia of Chicago. Based on the map, Uptown's boundaries are Foster Avenue (north), Lake Michigan (east), Montrose and Irving Park (south), and Ravenswood and Clark (west). The cross streets for the south side of Uptown run from Ravenswood to Clark, then from Clark St. to Lake Michigan; on the west side, from Foster to Montrose, then from Montrose to Irving Park. North of Uptown is Edgewater, to the west is Lincoln Square, and to the south is Lake View (City of Chicago). As of 2010, Uptown's total population was 56,362 (2010 United States Census).
From the calculation (see Appendix I), we get the 3-sigma control limits for the process: UCL = 0.091 and LCL = 0.014. These control limits indicate that if the error proportion falls within the range [0.014, 0.091], the process is in control; if not, the process is out of control.
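The limits quoted above follow the standard 3-sigma p-chart formula, p̄ ± 3·sqrt(p̄(1 − p̄)/n). Since Appendix I is not included, the average error proportion and sample size below are back-of-the-envelope assumptions chosen only so that the result lands near the quoted limits.

    import math

    p_bar = 0.0525   # assumed average error proportion (not from Appendix I)
    n = 300          # assumed sample size per inspection (not from Appendix I)

    sigma_p = math.sqrt(p_bar * (1 - p_bar) / n)
    ucl = p_bar + 3 * sigma_p
    lcl = max(0.0, p_bar - 3 * sigma_p)   # a proportion limit cannot fall below zero
    print(f"UCL = {ucl:.3f}, LCL = {lcl:.3f}")   # roughly 0.091 and 0.014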
We observe a high standard deviation for most of the steps (especially the Underwriting step), comparable to the mean value itself. For example, for RERUNs the Underwriting step has a mean of 18.7 min while the standard deviation is 19.8 min. Hence we use the 95% SCT for determining the bottleneck step, and on that basis we consider the Underwriting stage the bottleneck for the whole system.
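As an illustration, and assuming the 95% SCT is computed as mean + 1.645 × standard deviation (a normal-approximation convention), the comparison can be sketched as follows. Only the Underwriting figures come from the text; the other step names and statistics are placeholders.

    # step -> (mean minutes, standard deviation minutes); only Underwriting is from the text
    steps = {
        "Distribution": (10.0, 6.0),
        "Underwriting": (18.7, 19.8),
        "Rating": (12.0, 8.0),
        "Policy Writing": (9.0, 5.0),
    }

    Z95 = 1.645  # one-sided 95th percentile of the standard normal

    sct95 = {name: mean + Z95 * sd for name, (mean, sd) in steps.items()}
    bottleneck = max(sct95, key=sct95.get)
    print(sct95)
    print("Bottleneck by 95% SCT:", bottleneck)   # Underwriting, at roughly 51 minutes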
To get the best possible outcome in this observation study, I decided to observe and take notes on days that I was off. I went in for about an hour and observed how the employees interacted with guests and with each other. I visited the restaurant on busy days as well as slow ones to see whether the pace of the restaurant also affected the employees. Instead
The data that I have analysed is the number of employees within the back office who have left since the 1st of January 2011 and the reasons why they have left. During the last 3 years the back office has seen a number of changes, and the data below show the reasons why these changes have been made. I have also looked at how long each employee had been with the organisation.
See Excel Model 1.1.
In March 2003: PC = $800, QC = 138 hours.
In March 2003, management feels:
PC ↑ by $200 → QC ↓ 30%
PC ↓ by $200 → QC ↑ 30%
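Treating the ±30% responses as simple multipliers on the March 2003 baseline (an assumption about how Excel Model 1.1 applies them), the two scenarios work out as follows.

    pc_base = 800    # March 2003 PC ($)
    qc_base = 138    # March 2003 QC (hours)

    scenarios = {
        "PC up $200":   (pc_base + 200, qc_base * 0.70),   # QC falls 30%
        "PC down $200": (pc_base - 200, qc_base * 1.30),   # QC rises 30%
    }

    for name, (pc, qc) in scenarios.items():
        print(f"{name}: PC = ${pc}, QC = {qc:.1f} hours")
    # PC up $200:   PC = $1000, QC = 96.6 hours
    # PC down $200: PC = $600,  QC = 179.4 hours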
Evaluating the first data set from the table, for n = 20, A2 = 0.18. Using the hint provided, the estimated standard deviation is 0.234. The process capability obtained is 1.03, which is below 1.33, meaning that the process is not capable.
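The arithmetic can be reproduced roughly as below. Because neither the table nor the hint is shown here, the average range and the specification limits are back-solved assumptions chosen only to be consistent with the quoted estimated standard deviation of 0.234 and capability of 1.03.

    import math

    n = 20
    A2 = 0.18                       # X-bar chart factor for subgroup size 20
    d2 = 3 / (A2 * math.sqrt(n))    # ~3.73, the corresponding range-to-sigma factor

    r_bar = 0.872                   # assumed average range, chosen so sigma_hat ~ 0.234
    sigma_hat = r_bar / d2

    LSL, USL = 9.28, 10.73          # assumed specification limits, for illustration only
    cp = (USL - LSL) / (6 * sigma_hat)
    print(f"sigma_hat = {sigma_hat:.3f}, Cp = {cp:.2f}")   # below 1.33, so not capable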
The p-value is above the chosen significance level, showing that there is no statistically significant difference between the service times offered by the servers.
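One common way to run such a comparison is a one-way ANOVA across servers; whether that was the test actually used here is an assumption, and the service-time samples below are hypothetical.

    from scipy import stats

    # Hypothetical service times (minutes) for three servers
    server_a = [4.1, 3.8, 4.4, 4.0, 3.9]
    server_b = [4.3, 4.0, 3.7, 4.2, 4.1]
    server_c = [3.9, 4.2, 4.0, 3.8, 4.3]

    f_stat, p_value = stats.f_oneway(server_a, server_b, server_c)
    print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
    # A p-value above the significance level (e.g. 0.05) means we fail to reject
    # the hypothesis that the servers' mean service times are equal.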