Asked by Christine
How bright a star appears depends both on how much light the star actually emits and on how far away it is. The stellar magnitude scale can be adjusted to account for distance as follows:
m_2 - m_1 = 2.5 log (b_1/b_2)
Here, M denotes a star's absolute magnitude, that is, how bright the star appears from a standard distance of 10 parsecs (about 32.6 light-years). The absolute magnitude of Sirius is 1.4 and the absolute magnitude of Betelgeuse is -8.1.
a) Which of these two stars is brighter, in absolute terms, and by how much?
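As a sketch of how the relation is applied (assuming the standard magnitude scale, in which a difference of one magnitude corresponds to a brightness factor of 10^(1/2.5), and a lower magnitude means a brighter star), the brightness ratio can be computed directly from the two absolute magnitudes given in the problem:

```python
import math

def brightness_ratio(m1, m2):
    # Standard magnitude relation: m2 - m1 = 2.5 * log10(b1 / b2),
    # so b1 / b2 = 10 ** ((m2 - m1) / 2.5)
    return 10 ** ((m2 - m1) / 2.5)

# Absolute magnitudes from the problem statement
sirius = 1.4
betelgeuse = -8.1

# Betelgeuse has the lower (more negative) magnitude, so it is brighter;
# this gives b_betelgeuse / b_sirius
ratio = brightness_ratio(betelgeuse, sirius)
print(round(ratio))  # about 6310
```

The magnitude difference is 1.4 - (-8.1) = 9.5, so the ratio is 10^(9.5/2.5) = 10^3.8.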
Answers
Answered by
Reiny
This is a bit outside my expertise, but this page seems to fit your problem quite nicely:
http://csep10.phys.utk.edu/astr162/lect/stars/magnitudes.html