Asked by Matt
Prove that the number 10^(3n+1), where n is a positive integer, cannot be represented as the sum of two cubes of positive integers.
thanx
Answers
Answered by
MathMate
We will examine the sum of cubes of two positive integers, A and B. First note that 10^(3n+1) = 2^(3n+1) 5^(3n+1), so exactly 3n+1 factors of 2 divide it, and 3n+1 is never a multiple of 3.
Without loss of generality, assume B contains at least as many factors of 2 as A, and write
A = 2^m X and
B = 2^(m+k) Y
where
X and Y are not divisible by 2,
m is a non-negative integer and
k is a non-negative integer.
(The exponent m here is not the same as the n in 10^(3n+1).)
Then
A^3 + B^3
= (A + B)(A^2 - AB + B^2)
= 2^m (X + 2^k Y) 2^(2m) (X^2 - 2^k XY + 2^(2k) Y^2)
= 2^(3m) (X + 2^k Y)(X^2 - 2^k XY + 2^(2k) Y^2)
Case k ≥ 1 (A and B contain different powers of 2): both X + 2^k Y and X^2 - 2^k XY + 2^(2k) Y^2 are odd, because X and X^2 are odd while every other term is even. So A^3 + B^3 contains exactly 3m factors of 2, a multiple of 3, whereas 10^(3n+1) contains exactly 3n+1 of them. Since 3m = 3n+1 is impossible, 10^(3n+1) = A^3 + B^3 cannot hold in this case.
Case k = 0 (A and B contain the same power of 2): now X and Y are both odd, X + Y is even, and counting factors of 2 is no longer enough, so we look at the odd factor instead. X^2 - XY + Y^2 is odd (odd - odd + odd) and divides 10^(3n+1), so it must be a power of 5, say 5^t. If 5 divides X^2 - XY + Y^2, then 4(X^2 - XY + Y^2) = (2X - Y)^2 + 3Y^2 gives (2X - Y)^2 ≡ 2Y^2 (mod 5); since 2 is not a square mod 5, this forces 5 to divide Y, and then X as well. Dividing both X and Y by 5 turns 5^t into 5^(t-2), and repeating this step we eventually reach x^2 - xy + y^2 = 5 (which has no positive integer solutions) or x^2 - xy + y^2 = 1 (which forces x = y = 1). Hence X = Y = 5^u for some u ≥ 0, so A = B and
A^3 + B^3 = 2 A^3 = 2^(3m+1) 5^(3u).
Comparing powers of 5 with 10^(3n+1) = 2^(3n+1) 5^(3n+1) would require 3u = 3n+1, which is impossible.
In both cases we reach a contradiction, so 10^(3n+1) cannot be represented as the sum of two cubes of positive integers.
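If it helps to see this concretely, here is a small Python sketch (a numerical illustration only, not part of the proof) that checks two things for small values: that the number of factors of 2 in A^3 + B^3 is a multiple of 3 whenever A and B contain different powers of 2, and that 10^(3n+1) really is not a sum of two positive cubes for n = 1 to 4.

def factors_of_two(m):
    # Count how many times 2 divides m.
    v = 0
    while m % 2 == 0:
        m //= 2
        v += 1
    return v

# Check 1: when A and B contain different powers of 2, the number of
# factors of 2 in A^3 + B^3 equals 3*min(...), always a multiple of 3.
for A in range(1, 60):
    for B in range(1, 60):
        if factors_of_two(A) != factors_of_two(B):
            expected = 3 * min(factors_of_two(A), factors_of_two(B))
            assert factors_of_two(A**3 + B**3) == expected

# Check 2: brute-force check that 10^(3n+1) is not a sum of two positive cubes.
def is_sum_of_two_positive_cubes(m):
    cubes = {a**3 for a in range(1, round(m ** (1/3)) + 2) if a**3 < m}
    return any(m - c in cubes for c in cubes)

for n in range(1, 5):
    assert not is_sum_of_two_positive_cubes(10**(3*n + 1))

print("all checks passed")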
Answered by
Jon Zhan
Nice answer, but please try to use (mod) arithmetic. That way is easier.
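For example, one such congruence argument (a sketch only; mod 7 is one modulus that works, and may or may not be the one meant here) goes like this:

\[
a^3 \equiv 0,\ \pm 1 \pmod 7
\quad\Longrightarrow\quad
a^3 + b^3 \equiv 0,\ \pm 1,\ \pm 2 \pmod 7,
\]
\[
10^3 \equiv 27 \equiv -1 \pmod 7
\quad\Longrightarrow\quad
10^{3n+1} = (10^3)^n \cdot 10 \equiv (-1)^n \cdot 3 \equiv \pm 3 \pmod 7,
\]
and since \(\pm 3\) is not congruent to any of \(0, \pm 1, \pm 2\) mod 7, no sum of two cubes can equal \(10^{3n+1}\).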
Answered by
Sean
Hey, your answer doesn't really explain anything! Try to make it clearer.
SY