To determine the average height of trees in a national forest, 10 trees are randomly selected. Their heights in feet are 85.4, 87.6, 87.1, 85.6, 85.9, 85.3, 84.9, 83.7, 84.0, and 85.2. To ensure the accuracy of the data, another set of 10 trees is randomly selected. Their heights are 87.1, 88.0, 84.2, 85.3, 86.5, 84.2, 83.2, 84.1, 85.2, and 87.3. What is the difference between the mean heights of the trees of these two random samples?


To find the mean height of the first sample, add the ten heights and divide by 10:
(85.4 + 87.6 + 87.1 + 85.6 + 85.9 + 85.3 + 84.9 + 83.7 + 84.0 + 85.2) / 10 = 854.7 / 10 = 85.47 feet

To find the mean height of the second sample:
(87.1 + 88.0 + 84.2 + 85.3 + 86.5 + 84.2 + 83.2 + 84.1 + 85.2 + 87.3) / 10 = 855.1 / 10 = 85.51 feet

The difference between the two sample means is:
85.51 - 85.47 = 0.04 feet

So, the difference between the mean heights of the trees in the two random samples is 0.04 feet.
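As a quick check, the whole calculation can be reproduced with a few lines of Python (the variable names here are just for illustration):

```python
# Heights (in feet) of the two random samples of 10 trees each.
sample_1 = [85.4, 87.6, 87.1, 85.6, 85.9, 85.3, 84.9, 83.7, 84.0, 85.2]
sample_2 = [87.1, 88.0, 84.2, 85.3, 86.5, 84.2, 83.2, 84.1, 85.2, 87.3]

# Mean = sum of the heights divided by the number of trees.
mean_1 = sum(sample_1) / len(sample_1)   # 85.47
mean_2 = sum(sample_2) / len(sample_2)   # 85.51

# Difference between the two sample means, rounded to avoid
# floating-point noise.
diff = round(mean_2 - mean_1, 2)         # 0.04
print(mean_1, mean_2, diff)
```

Running this confirms the means of 85.47 and 85.51 feet and a difference of 0.04 feet.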