Question
The stellar magnitude scale compares the brightness of stars using the equation m_2 - m_1 = log(b_1/b_2), where m_1 and m_2 are the apparent magnitudes of the two stars being compared (how bright they appear in the sky) and b_1 and b_2 are their brightnesses (how much light they actually emit). This relationship does not factor in how far from Earth the stars are.
a) The sun appears about 1.3 x 10^10 times as bright in our sky as does Sirius. What is the apparent magnitude of the sun?
Answers
drwls
The stellar magnitude scale has not been defined correctly in this question. Actually, it is
m2 - m1 = 2.5 log10(b1/b2)
so that for each decrease in magnitude by 5, the brightness increases by a factor of 100 (since 2.5 log10(100) = 5).
I hesitate to answer your question because a wrong formula is being used. The actual apparent magnitude of the sun is about -26.7. Your question is also incomplete because you would need to know the magnitude of Sirius, which is about -1.5 as I recall.
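For what it's worth, here is a minimal sketch of the arithmetic in Python, assuming the corrected formula above and taking the magnitude of Sirius to be about -1.5 as quoted:

    import math

    # Corrected relation: m2 - m1 = 2.5 * log10(b1/b2)
    # Let star 1 be the sun and star 2 be Sirius.
    ratio = 1.3e10        # b_sun / b_sirius, from the question
    m_sirius = -1.5       # apparent magnitude of Sirius (quoted above)

    # Rearranging m_sirius - m_sun = 2.5 * log10(b_sun / b_sirius):
    m_sun = m_sirius - 2.5 * math.log10(ratio)
    print(round(m_sun, 1))  # -26.8, consistent with the quoted -26.7

The small difference from -26.7 comes from rounding: the magnitude of Sirius is closer to -1.46, and the brightness ratio in the question is itself approximate.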