Question

How do you find the optimal condition given u(x1, x2) = sqrt(x1) + sqrt(x2)

Income is 12, p1 is 2, and p2 is 1

Answers

GPT 3.5
To find the optimal condition given the utility function u(x1, x2) = sqrt(x1) + sqrt(x2), we need to maximize this utility function subject to certain constraints. In this case, the constraints are income (I), the price of good 1 (p1), and the price of good 2 (p2).

Given that the income (I) is 12 and the prices are p1 = 2 and p2 = 1, the budget constraint can be written as:

p1 * x1 + p2 * x2 = I

Substituting the given values, we get:

2x1 + x2 = 12

To maximize the utility function, we can use the method of Lagrange multipliers. Define the Lagrangian function (L):

L(x1, x2, λ) = sqrt(x1) + sqrt(x2) - λ(2x1 + x2 - 12)

To find the optimal condition, we need to solve the following equations simultaneously:

∂L/∂x1 = 0
∂L/∂x2 = 0
∂L/∂λ = 0

Differentiating L with respect to x1, x2, and λ, we get:

∂L/∂x1 = (1/2) * (1/sqrt(x1)) - 2λ
∂L/∂x2 = (1/2) * (1/sqrt(x2)) - λ
∂L/∂λ = -2x1 - x2 + 12
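
These first-order conditions can be solved symbolically. A minimal sketch using sympy (the library choice is mine, not part of the original answer):

```python
# Solve the Lagrangian first-order conditions for u = sqrt(x1) + sqrt(x2)
# subject to 2*x1 + x2 = 12. Illustrative sketch using sympy.
import sympy as sp

x1, x2, lam = sp.symbols("x1 x2 lam", positive=True)

# Lagrangian: utility minus lambda times the budget violation
L = sp.sqrt(x1) + sp.sqrt(x2) - lam * (2 * x1 + x2 - 12)

# First-order conditions: partial derivatives set to zero
focs = [sp.diff(L, v) for v in (x1, x2, lam)]
sol = sp.solve(focs, (x1, x2, lam), dict=True)[0]
print(sol)  # x1 = 2, x2 = 8
```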

Setting the first two derivatives equal to zero gives 1/(2 * sqrt(x1)) = 2λ and 1/(2 * sqrt(x2)) = λ. Dividing the first equation by the second eliminates λ and yields sqrt(x2)/sqrt(x1) = 2, so x2 = 4x1. This is the optimal condition: the marginal rate of substitution MU1/MU2 = sqrt(x2)/sqrt(x1) equals the price ratio p1/p2 = 2.

Substituting x2 = 4x1 into the budget constraint 2x1 + x2 = 12 gives 6x1 = 12, hence x1 = 2 and x2 = 8. This bundle satisfies both the optimality condition (sqrt(8)/sqrt(2) = 2 = p1/p2) and the budget constraint (2·2 + 8 = 12).
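
As a cross-check, the same constrained maximization can be done numerically. An illustrative sketch using scipy.optimize.minimize (a library choice assumed here, not part of the original answer):

```python
# Numerically maximize u = sqrt(x1) + sqrt(x2) subject to 2*x1 + x2 = 12.
# Illustrative cross-check using scipy's SLSQP solver.
import numpy as np
from scipy.optimize import minimize

def neg_utility(x):
    # minimize the negative utility <=> maximize utility
    return -(np.sqrt(x[0]) + np.sqrt(x[1]))

# Equality constraint: budget must hold exactly
budget = {"type": "eq", "fun": lambda x: 2 * x[0] + x[1] - 12}

res = minimize(neg_utility, x0=[1.0, 1.0],
               constraints=[budget],
               bounds=[(1e-9, None), (1e-9, None)])
print(res.x)  # approximately [2, 8]
```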
