To prove the identity \( P(Y_i = 0 | X_i) = 1 - P_i = \frac{1}{1 + e^{Z_i}} \), we begin with the logistic function, the standard model for binary classification.
- Logistic Function Definition: The probability that the response variable \( Y_i \) equals 1 given \( X_i \) is typically modeled with the logistic function:
\[ P(Y_i = 1 | X_i) = P_i = \frac{e^{Z_i}}{1 + e^{Z_i}} \]
where \( Z_i \) is a linear combination of the predictors \( X_i \) (e.g., \( Z_i = \beta_0 + \beta_1 X_{i1} + \beta_2 X_{i2} + \dots + \beta_k X_{ik} \)).
- Complement of Probability: The event \( Y_i = 0 \) is the complement of the event \( Y_i = 1 \). Hence,
\[ P(Y_i = 0 | X_i) = 1 - P(Y_i = 1 | X_i) \]
- Substituting the Logistic Function: \[ P(Y_i = 0 | X_i) = 1 - P_i \]
Substituting the expression for \( P_i \):
\[ P(Y_i = 0 | X_i) = 1 - \frac{e^{Z_i}}{1 + e^{Z_i}} \]
- Finding a Common Denominator: To simplify, we express \( 1 \) as \( \frac{1 + e^{Z_i}}{1 + e^{Z_i}} \):
\[ P(Y_i = 0 | X_i) = \frac{1 + e^{Z_i} - e^{Z_i}}{1 + e^{Z_i}} = \frac{1}{1 + e^{Z_i}} \]
Thus, we have proved that:
\[ P(Y_i = 0 | X_i) = 1 - P_i = \frac{1}{1 + e^{Z_i}} \]
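As a sanity check, the identity can also be verified numerically. The sketch below (plain Python, standard library only; the function names and sample \( Z_i \) values are illustrative) compares \( 1 - P_i \) against the derived form \( \frac{1}{1 + e^{Z_i}} \):

```python
import math

def p_one(z):
    """P(Y_i = 1 | X_i): the logistic function e^z / (1 + e^z)."""
    return math.exp(z) / (1.0 + math.exp(z))

def p_zero(z):
    """P(Y_i = 0 | X_i): the derived form 1 / (1 + e^z)."""
    return 1.0 / (1.0 + math.exp(z))

# The complement 1 - P_i should equal 1 / (1 + e^z) for any z.
for z in [-4.0, -1.0, 0.0, 0.5, 3.0]:
    assert abs((1.0 - p_one(z)) - p_zero(z)) < 1e-12

# The two probabilities sum to 1, as any valid complement must.
assert all(abs(p_one(z) + p_zero(z) - 1.0) < 1e-12 for z in [-2.0, 0.0, 2.0])
```

Note that at \( Z_i = 0 \) both probabilities equal \( \frac{1}{2} \), matching the symmetry of the logistic function.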