Asked by Sam
Evaluate the integral: 16 csc(x) dx from pi/2 to pi (and determine whether it is convergent or divergent).
I know how to find the indefinite integral of csc(x) dx, but I get stuck evaluating the improper integral at the following step.
I know I need to replace the upper bound with a variable (say, A) and take the limit as A approaches pi from the left.
Antidifferentiating, I get -ln|csc(x)+cot(x)|, to be evaluated from pi/2 to A.
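Written out, the setup is (keeping the constant 16 from the problem, which does not affect convergence):

\int_{\pi/2}^{\pi} 16\csc(x)\,dx
  = \lim_{A \to \pi^-} \int_{\pi/2}^{A} 16\csc(x)\,dx
  = \lim_{A \to \pi^-} \Big[ -16\ln\lvert \csc(x) + \cot(x) \rvert \Big]_{\pi/2}^{A}
  = \lim_{A \to \pi^-} \left( -16\ln\lvert \csc(A) + \cot(A) \rvert \right),

since csc(pi/2) + cot(pi/2) = 1 + 0 = 1 and ln(1) = 0, so the lower endpoint contributes nothing.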
As A approaches pi from the left, csc(A) + cot(A) has the indeterminate form infinity minus infinity, so I need to use L'Hopital's Rule.
Putting the terms over a common denominator, I get -ln|(1 + cos(A))/sin(A)|.
As A approaches pi, the fraction inside the logarithm goes to 0, so I get -ln(0), which is still not defined.
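For reference, here is the L'Hopital step written out (differentiating the numerator and denominator of the fraction inside the logarithm):

\lim_{A \to \pi^-} \frac{1 + \cos(A)}{\sin(A)}
  = \lim_{A \to \pi^-} \frac{-\sin(A)}{\cos(A)}
  = \frac{-\sin(\pi)}{\cos(\pi)}
  = \frac{0}{-1}
  = 0.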
Does this mean that the integral diverges?
Answers
Answered by
Count Iblis
Yes, it is divergent. However, using the indefinite integral to establish this is not recommended. This is because in most cases, the indefinite integral cannot be expressed in terms of elementary functions.
You first need to identify at which of the limits of integration the integral might diverge, and then prove whether it actually is divergent or convergent.
In this case, you can see that near the upper limit of pi, the integrand 1/sin(x) behaves like 1/(pi-x). More precisely, on an interval around pi (excluding pi itself), there exists a constant C such that
|1/sin(x) - 1/(pi-x)| < C|pi-x|
You can obtain this from the Taylor expansion of sin(x) around x = pi with the error term. It implies that if you subtract 1/(pi-x) from 1/sin(x), the resulting integral converges. Since the integral of 1/(pi-x) diverges, it follows that the integral of 1/sin(x) must also diverge.
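To sketch the details (C denotes the constant from the inequality above): the Taylor expansion of sin(x) about x = pi with remainder gives, for x near pi,

\frac{1}{\sin(x)} - \frac{1}{\pi - x}
  = \frac{(\pi - x) - \sin(x)}{(\pi - x)\,\sin(x)},
\qquad (\pi - x) - \sin(x) = \frac{(\pi - x)^3}{6} + \cdots,

and since the denominator behaves like (pi - x)^2 near pi, the quotient is bounded by C|pi - x| there. On the other hand,

\int_{\pi/2}^{A} \frac{dx}{\pi - x}
  = \ln\!\left(\frac{\pi/2}{\pi - A}\right) \to \infty
  \quad \text{as } A \to \pi^-,

so adding the bounded (hence integrable) difference back cannot cancel this divergence, and the integral of 1/sin(x), and therefore of 16 csc(x), diverges as well.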
Answered by
Sam
Thanks a lot for your help, Count Iblis! :)