Briefly prove that:

\[ P(Y_i = 1 \mid X_i) = P_i = \frac{1}{1 + e^{-Z_i}} = \frac{e^{Z_i}}{1 + e^{Z_i}} \]


To prove the given expression for \( P(Y_i = 1 | X_i) \), we start by recognizing that this expression is based on a logistic regression model.

The probability that \( Y_i = 1 \) given \( X_i \) is represented as:

\[ P(Y_i = 1 | X_i) = P_i \]

In logistic regression, we model \( P_i \) using the logistic function:

\[ P_i = \frac{1}{1 + e^{-Z_i}} \]

where \( Z_i \) is a linear combination of the predictors (or features) associated with \( X_i \):

\[ Z_i = \beta_0 + \beta_1 X_{i1} + \beta_2 X_{i2} + \ldots + \beta_k X_{ik} \]
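As a quick numerical illustration, the linear combination \( Z_i \) and the resulting probability \( P_i \) can be computed directly (the coefficient values and feature vector below are hypothetical, chosen only for the example):

```python
import numpy as np

# Hypothetical coefficients (beta_0, beta_1, beta_2) and one observation X_i,
# with a leading 1 in x_i so the dot product includes the intercept beta_0.
beta = np.array([0.5, -1.2, 0.8])
x_i = np.array([1.0, 2.0, 0.5])

z_i = beta @ x_i                   # Z_i = beta_0 + beta_1*X_i1 + beta_2*X_i2
p_i = 1.0 / (1.0 + np.exp(-z_i))   # logistic function: P_i = 1 / (1 + e^{-Z_i})
print(z_i, p_i)
```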

Now, we can manipulate the equation further to express it in its alternative form:

Starting from the logistic function:

\[ P_i = \frac{1}{1 + e^{-Z_i}} \]

We can also express this as:

\[ P_i = \frac{e^{Z_i}}{e^{Z_i} + 1} \]

To see this, multiply the numerator and the denominator of \( P_i \) by \( e^{Z_i} \):

\[ P_i = \frac{1 \cdot e^{Z_i}}{(1 + e^{-Z_i}) \cdot e^{Z_i}} = \frac{e^{Z_i}}{e^{Z_i} + 1} \]

Thus, we have shown:

\[ P(Y_i = 1 | X_i) = P_i = \frac{1}{1 + e^{-Z_i}} = \frac{e^{Z_i}}{1 + e^{Z_i}} \]

This completes the proof. The expressions confirm the relationship between the logistic function and the probability \( P(Y_i = 1 | X_i) \).
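The algebraic identity above can also be checked numerically: evaluating both forms of the logistic function over a grid of \( Z_i \) values (a sketch, using an arbitrary range of inputs) shows they agree to machine precision.

```python
import numpy as np

# Evaluate both forms of the logistic function over a range of z values.
z = np.linspace(-10.0, 10.0, 1001)
form_a = 1.0 / (1.0 + np.exp(-z))          # P_i = 1 / (1 + e^{-Z_i})
form_b = np.exp(z) / (1.0 + np.exp(z))     # P_i = e^{Z_i} / (1 + e^{Z_i})

# The two expressions are algebraically identical, so they match numerically.
print(np.allclose(form_a, form_b))
```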