Lab A: Scientific Method and the Memory Interference Test (MIT)
24W-LIFESCI-23L-LEC-1: Introduction to Laboratory and Scientific Methodology

Table of Contents

1. Objectives
2. Introduction
3. Emergence of Scientific Methodology
4. Statistics
5. Descriptive Statistics
6. Inferential Statistics
7. The Memory Interference Test (MIT)
8. Appendix: An Example of Comparing Two Groups Using Resampling

1. Objectives

- To introduce and use the scientific method.
- To introduce and practice using simple statistics.
- To learn how to write scientific reports.

2. Introduction

Science is the practice of gaining knowledge of nature. To do so, a series of methods has been designed to gather, analyze, and interpret information about nature. These methods have not always been the same through time; even today, different practices are found in different disciplines and among different scientists. Although it may be difficult to get everyone who practices science to agree on one single method by which scientific knowledge is obtained, there are still a few common characteristics of their methods that practitioners generally agree on. In this lab, you are going to learn a few techniques used by many scientists to learn about nature.
3. Emergence of Scientific Methodology

Modern scientific methodology was established in the seventeenth century in Western Europe. About four hundred years ago, a new experimental method of investigating the natural world emerged. The major players in this revolutionary change in thinking and practice included Francis Bacon (1561-1626) and Rene Descartes (1596-1650). Since then, much of the scientific methodology has been modified. Today there are two important emphases in practicing science: (1) the hypothetico-deductive approach and (2) the falsificationist procedure.

The hypothetico-deductive approach (Figures A.1 & A.2): The hypothetico-deductive approach is a series of steps that, as long as none of the steps is flawed, leads to a robust conclusion about a particular problem. It begins with observations of events or patterns, followed by suggestions for the general causes and nature of the observed events and patterns. Without further testing, however, inaccuracies would render those suggestions unreliable. Consequently, after the initial observations of and reasoning about the general nature of the observed phenomena, a scientific method demands that a hypothetico-deductive approach be employed. This approach, proposed by Karl Popper (1902-1994), an influential philosopher of science, requires that a specific hypothesis (H1), i.e., a prediction of an effect or a difference, be constructed to explain a particular aspect of the observed phenomenon. This hypothesis must then be tested, either by carrying out appropriate experiments or by making specific observations. Only after the results of these experiments have been measured and tested statistically can we determine whether the hypothesis (prediction) is or is not supported by the data and, therefore, deduce something about the phenomenon.
Figure A.1. A scientific method that incorporates the hypothetico-deductive approach and the falsificationist procedure.

If the hypothesis is supported, something positive is now known about the phenomenon, and other aspects can be examined by constructing and testing other hypotheses. If the hypothesis is rejected, something else is known about the phenomenon, albeit something negative. At the same time, other hypotheses should also be constructed and tested. As you can see, via the hypothetico-deductive approach it is possible to go on learning about things indefinitely. Consequently, there is always the possibility that a new hypothesis and test will show a previous piece of "knowledge" to be false. This self-correcting mechanism is an important aspect of the scientific method.

The falsificationist procedure: The falsificationist procedure is a simple way of increasing the power of conclusions deduced using the hypothetico-deductive approach. It involves taking the prediction (hypothesis) of an effect (H1 above) and creating a null hypothesis. For the purpose of this course, we will state that a null hypothesis (H0) predicts no effect or no difference between two or more tested samples. The reason for doing this is that hypotheses can be disproved much more easily than they can be proved.

When we formulate statistically testable hypotheses, they need to meet certain criteria. A good hypothesis is one that is both specific and testable.
Specific:
- What groups are being compared?
- What measure is being used to compare them?

Testable:
- Will you be able to reject or retain your null hypothesis after conducting the experiment?

In the lab section this week, you will participate in an activity in which you look at a series of statistical hypotheses and evaluate them.

Figure A.2. Why is scientific writing so critical?

4. Statistics

As stated previously, it is almost never feasible to make all of the possible measurements that might prove a hypothesis. In addition, in natural populations there is often considerable variation (consider the human species). Instead, we take measurements from some individuals in a population (a sample) and use those measurements to draw conclusions about the larger population using statistical methods.

Statistics are often divided into two types: descriptive and inferential. Descriptive statistics (e.g., mean and median) describe the pattern (i.e., distribution) in observed groups of measurements (i.e., samples). Inferential statistics, in contrast, can be used to draw conclusions about the whole population(s) based on the smaller sample datasets, including testing hypotheses. For example, in this lab, we'll be comparing two groups and testing the null hypothesis that there is no difference between them. Brief descriptions are provided below to help you understand these statistics. However, for LS23L, you are not required to memorize the formulas.
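Since the appendix walks through comparing two groups using resampling, here is a minimal sketch of that idea in Python. The scores are made-up numbers, and the details (10,000 reshuffles, absolute difference of sample means) are illustrative assumptions, not the manual's exact procedure.

```python
import random

random.seed(0)  # make the reshuffles reproducible

# Hypothetical scores for two groups (made-up data, for illustration only).
group_a = [12, 15, 14, 10, 13, 16, 11, 14]
group_b = [9, 11, 10, 12, 8, 10, 13, 9]

def mean(xs):
    return sum(xs) / len(xs)

# Observed difference between the two sample means.
observed_diff = abs(mean(group_a) - mean(group_b))

# Under H0 (no difference), the group labels are arbitrary, so we can
# pool the data, reshuffle it, and split it into two new "groups".
pooled = group_a + group_b
n_a = len(group_a)
n_resamples = 10_000
count_extreme = 0

for _ in range(n_resamples):
    random.shuffle(pooled)
    diff = abs(mean(pooled[:n_a]) - mean(pooled[n_a:]))
    if diff >= observed_diff:
        count_extreme += 1

# The fraction of reshuffles at least as extreme as the observed
# difference estimates how surprising the data would be under H0.
p_value = count_extreme / n_resamples
print(f"observed difference: {observed_diff:.2f}, p ~ {p_value:.4f}")
```

If very few reshuffled datasets produce a difference as large as the observed one, the null hypothesis of no difference is rejected; otherwise it is retained.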
Definitions

Several definitions will help you understand how statistics are calculated, how they relate to your measurements, and what they really mean.

Population: the entire collection of measurements about which the researcher intends to draw conclusions (e.g., the adult weight of the human population in South America, or the height of eucalyptus trees in Los Angeles County).

Sample: the set of measurements (X₁, X₂, X₃, …, Xᵢ) actually made (e.g., sampling the daily dietary calories of one thousand individuals from each South American capital, or sampling the height of fifty eucalyptus trees in each LA neighborhood).

5. Descriptive Statistics

There are a few terms commonly used in statistics to describe a set of measurements and show its characteristics. These terms, called parameters, can show the central tendency of the measurements or describe their variability. However, because it is usually impossible to obtain all the measurements of one particular variable, the true parameter is usually not available. As a result, an estimate of a parameter is produced to serve as a description of these measurements. An estimate of a parameter is called a statistic. The following explains three statistics that measure central tendency and two statistics that describe the variability of a set of measurements. We are going to incorporate these statistics into the lab report.
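To make the parameter/statistic distinction concrete, here is a minimal sketch with a made-up "population" of tree heights, showing how a sample mean (a statistic) estimates the population mean (a parameter):

```python
import random

random.seed(1)

# Hypothetical population: heights (cm-scale, made-up) of 10,000 trees.
population = [random.gauss(20.0, 4.0) for _ in range(10_000)]

# The true parameter: the mean of the entire population.
parameter = sum(population) / len(population)

# A statistic: the mean of a 50-tree sample, used to estimate the parameter.
sample = random.sample(population, 50)
statistic = sum(sample) / len(sample)

print(f"population mean (parameter): {parameter:.2f}")
print(f"sample mean (statistic):     {statistic:.2f}")
```

In real research only the sample is available, which is why statistics stand in for the unobtainable parameters.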
Mean

One of the statistics that measures the central tendency of a variable is the mean, more commonly known as the "arithmetic average." The mean of a sample (X̄) is calculated as the sum of all measurements in the sample divided by the sample size (n). However, the mean is only a good estimate of the central tendency of a set of data if the data's distribution is bell-shaped (symmetric and single-humped, with thin tails).

Mean = X̄ = (X₁ + X₂ + X₃ + … + Xᵢ) / n = ΣXᵢ / n

When is it OK to use the mean? Rule of Thumb

Let's say that a distribution is "bell-shaped" if it is:
- symmetric
- single-humped
- thin-tailed

Our rule of thumb is that when a distribution is bell-shaped, it is OK to use the mean to describe the data.

Median

The median is a measure of central tendency that works even if the data do not meet these requirements, so it is often a better measure than the mean. The median is the measurement located at the middle of the ordered set of data; in other words, there are just as many observations larger than the median as there are smaller. If the sample size is odd, the median is the middle measurement of the ordered series. If the sample size is even, the median is the average of the two middle measurements. For example:

Series A: 1.5, 3.7, 3.9, 4.5, 6.3, 7.1, 8.0, 8.8, 9.4
Series B: 1.5, 3.7, 3.9, 4.5, 6.3, 7.1, 8.0, 8.8, 9.4, 10.5

The median for Series A is 6.3, and the median for Series B is (6.3 + 7.1) ÷ 2 = 6.7.
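As a quick check of these hand calculations, here is a minimal sketch using the two series above; Python's standard statistics module is one convenient way to compute both measures.

```python
from statistics import mean, median

series_a = [1.5, 3.7, 3.9, 4.5, 6.3, 7.1, 8.0, 8.8, 9.4]
series_b = [1.5, 3.7, 3.9, 4.5, 6.3, 7.1, 8.0, 8.8, 9.4, 10.5]

# Mean: sum of all measurements divided by the sample size n.
print(mean(series_a))    # (1.5 + 3.7 + ... + 9.4) / 9

# Median: the middle value of the ordered data (odd n),
# or the average of the two middle values (even n).
print(median(series_a))  # 6.3  (9 values, so the 5th is the middle)
print(median(series_b))  # 6.7  ((6.3 + 7.1) / 2, since n = 10 is even)
```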