Asked by A.
Expand f(x) = ln((1+x)/(1-x)) in a Taylor series about x = 0. You must express your answer using summation notation.
---This is what I tried to do---
So I was thinking of taking the derivative of ln((1+x)/(1-x)) to get 2/(x^2-1),
and then using the identity 1/(1-x) = sum x^k.
So then it would be sum (-2)(x^(2k)).
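A quick check of that derivative before using it, since the sign drives the whole series (in LaTeX):

\frac{d}{dx}\,\ln\frac{1+x}{1-x} = \frac{1}{1+x} + \frac{1}{1-x} = \frac{2}{1-x^{2}} = -\frac{2}{x^{2}-1},

so the geometric-series substitution picks up an overall factor of +2, not -2.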
Answers
Answered by Steve
you have the right idea.
First, df/dx = 2/(1-x^2), not 2/(x^2-1), but I'd do it like this:
f(x) = ln(1+x) - ln(1-x)
ln(1+x) = 0 + x - x^2/2 + x^3/3 - x^4/4 + x^5/5 - ...
ln(1-x) = 0 - x - x^2/2 - x^3/3 - x^4/4 - x^5/5 - ...
subtract: the even powers cancel and the odd powers double, leaving
0 + 2x + 2x^3/3 + 2x^5/5 + ...
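The question asks for summation notation; matching the pattern above (equivalently, integrating the geometric series for f'(x) term by term, using f(0) = ln 1 = 0), in LaTeX:

f'(x) = \frac{2}{1-x^{2}} = 2\sum_{k=0}^{\infty} x^{2k}
\quad\Longrightarrow\quad
f(x) = 2\sum_{k=0}^{\infty} \frac{x^{2k+1}}{2k+1}, \qquad |x| < 1.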
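Not part of the original thread, but a minimal Python sketch to sanity-check the series numerically; the name atanh_series is just a label here (since f(x) = 2 atanh x):

import math

def atanh_series(x, terms=200):
    # Partial sum of 2 * sum_{k=0}^{terms-1} x^(2k+1)/(2k+1),
    # the Taylor series of ln((1+x)/(1-x)) about x = 0; converges for |x| < 1.
    return sum(2 * x ** (2 * k + 1) / (2 * k + 1) for k in range(terms))

for x in (0.1, 0.5, 0.9):
    exact = math.log((1 + x) / (1 - x))
    print(f"x = {x}: series = {atanh_series(x):.12f}, exact = {exact:.12f}")

Convergence slows as |x| approaches 1, hence the generous default of 200 terms.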