To find the distance between two points using the Pythagorean Theorem, you can use the formula:
\[ d = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2} \]
where \( (x_1, y_1) \) and \( (x_2, y_2) \) are the coordinates of the two points.
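For instance, applying the formula to the sample points \( (1, 2) \) and \( (4, 6) \) (chosen here purely for illustration) gives:
\[ d = \sqrt{(4 - 1)^2 + (6 - 2)^2} = \sqrt{9 + 16} = \sqrt{25} = 5 \]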
Let’s analyze the given options:
- First Response: \( 5^2 + 7^2 = \sqrt{136} \)
  - Incorrect. The squared terms are simply added and never placed under a square root, and \( 5^2 + 7^2 = 74 \), so the equation does not hold.
- Second Response: \( \sqrt{(6^2 + 10^2)} = \sqrt{136} \)
  - This matches the distance formula for coordinate differences of 6 and 10 (for example, the points (0, 0) and (6, 10)). Since \( 6^2 + 10^2 = 36 + 100 = 136 \), the result \( \sqrt{136} \) is correct.
- Third Response: \( \sqrt{3^2} - 5 = \sqrt{20} \)
  - Incorrect. Subtracting outside the radical does not follow the distance formula; the squared differences must be added under a single square root.
- Fourth Response: \( \sqrt{6^2} + 3 = \sqrt{25} \)
  - Incorrect. Adding a term outside the radical is not a valid application of the theorem.
The correct choice is the second response:
\[ \sqrt{(6^2 + 10^2)} = \sqrt{136} \]
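If a quick numeric check is helpful, here is a minimal Python sketch that applies the distance formula; the specific points (0, 0) and (6, 10) are an assumption, since the second response only fixes the coordinate differences 6 and 10.

```python
import math

# Minimal sketch checking the second response, assuming the points (0, 0) and
# (6, 10); any two points whose coordinates differ by 6 and 10 give the same result.
x1, y1 = 0, 0
x2, y2 = 6, 10

# Sum of squared differences: the quantity under the radical.
under_radical = (x2 - x1) ** 2 + (y2 - y1) ** 2
print(under_radical)   # 136

# Distance = sqrt(136)
d = math.sqrt(under_radical)
print(d)               # 11.661903789690601

# math.dist (Python 3.8+) applies the same formula directly.
print(math.isclose(d, math.dist((x1, y1), (x2, y2))))  # True
```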