Ok, it's hard to visualize the diagram, but I think I've got it. (I have no idea why they would have the corner of the flag touch the ground.)
My diagram is as follows
Flagpole #1: top A, bottom B, AB = 4
Flagpole #2: top C, bottom D, CD = 3
Let BD be the distance along the ground between the bottoms of the poles.
let P be the point on BD where the third vertex of the flag touches the ground, labelling
AP = AC = PC = x, so the side of the flag is x
let angle BPA = Ø; then, since angle APC = 60, angle CPD = 180 - 60 - Ø = 120 - Ø
in right triangle ABP, sinØ = 4/x **
in right triangle CPD, sin(120-Ø) = 3/x
sin120cosØ - cos120sinØ = 3/x
(√3/2)cosØ - (-1/2)sinØ = 3/x
(√3/2)cosØ + (1/2)sinØ = 3/x ***
divide *** by **
[(√3/2)cosØ + (1/2)sinØ]/sinØ = (3/x) / (4/x) = 3/4
multiply both sides by 4sinØ:
2√3cosØ + 2sinØ = 3sinØ
2√3cosØ = sinØ
sinØ/cosØ = 2√3
tanØ = 2√3
Ø = appr 73.89788.. (I stored it in my calculator for accuracy)
then sinØ = 4/x
x = 4/sinØ = appr 4.16 units
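For an exact form of the answer (my addition, not part of the original post; θ stands for Ø), the same relations give:

$$\tan\theta = 2\sqrt{3}\ \Rightarrow\ \sin\theta=\frac{2\sqrt{3}}{\sqrt{1+(2\sqrt{3})^{2}}}=\frac{2\sqrt{3}}{\sqrt{13}},\qquad x=\frac{4}{\sin\theta}=\frac{4\sqrt{13}}{2\sqrt{3}}=\frac{2\sqrt{39}}{3}\approx 4.1633$$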
check my arithmetic.
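Since the post ends with "check my arithmetic", here is a minimal numeric check in Python (my addition, not part of the original answer; it assumes only the relations sinØ = 4/x and sin(120 - Ø) = 3/x derived above):

import math

theta = math.atan(2 * math.sqrt(3))      # from tanØ = 2√3
x = 4 / math.sin(theta)                  # from sinØ = 4/x

print(math.degrees(theta))               # ≈ 73.8979, matches the Ø found above
print(x)                                 # ≈ 4.1633, the side of the flag
# cross-check the shorter pole: x·sin(120° - Ø) should come out to 3
print(x * math.sin(math.radians(120) - theta))   # ≈ 3.0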
The original question: A flag in the form of an equilateral triangle is connected to the tops of 2 vertical poles. One of the poles has a length of 4 and the other pole has a length of 3. You also know that the third vertex touches the ground perfectly. Calculate the length of a side of the flag.