To determine the average height of trees in a national forest, 10 trees are randomly selected. Their heights in feet are 85.4, 87.6, 87.1, 85.6, 85.9, 85.3, 84.9, 83.7, 84.0, and 85.2. To ensure the accuracy of the data, another set of 10 trees is randomly selected. Their heights are 87.1, 88.0, 84.2, 85.3, 86.5, 84.2, 83.2, 84.1, 85.2, and 87.3. What is the difference between the mean heights of the trees of these two random samples?


The mean height of the first set of 10 trees is (85.4 + 87.6 + 87.1 + 85.6 + 85.9 + 85.3 + 84.9 + 83.7 + 84.0 + 85.2)/10 = 854.7/10 = 85.47 feet.

The mean height of the second set of 10 trees is (87.1 + 88.0 + 84.2 + 85.3 + 86.5 + 84.2 + 83.2 + 84.1 + 85.2 + 87.3)/10 = 855.1/10 = 85.51 feet.

The difference between these two means is:

85.51 - 85.47 = 0.04 feet

Therefore, the difference between the mean heights of the two random samples is 0.04 feet.
Checking the arithmetic: 85.4 + 87.6 + 87.1 + 85.6 + 85.9 + 85.3 + 84.9 + 83.7 + 84.0 + 85.2 = 854.7; divide by 10, the number of trees, to get 85.47. The second mean works out to 85.51. Subtracting gives 0.04.
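If you want to double-check, here is a minimal Python sketch that recomputes both sample means and their difference (the variable names sample_1 and sample_2 are my own, not from the problem):

```python
# Heights (in feet) of the two random samples of 10 trees each.
sample_1 = [85.4, 87.6, 87.1, 85.6, 85.9, 85.3, 84.9, 83.7, 84.0, 85.2]
sample_2 = [87.1, 88.0, 84.2, 85.3, 86.5, 84.2, 83.2, 84.1, 85.2, 87.3]

# Mean of each sample: sum of the heights divided by the number of trees.
mean_1 = sum(sample_1) / len(sample_1)
mean_2 = sum(sample_2) / len(sample_2)

# Difference between the two sample means, rounded to avoid floating-point noise.
difference = round(mean_2 - mean_1, 2)

print(round(mean_1, 2), round(mean_2, 2), difference)  # 85.47 85.51 0.04
```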