First, shift everything by -a so that the plane passes through the origin. The shifted plane then passes through the points:
b' = b - a
and
c' = c - a
and through the origin.
Then we define an orthonormal basis for the linear space spanned by the plane. We take the first basis vector to be in the direction of b', so we normalize b' to obtain:
e1 = b'/|b'|
From c' we subtract the component in the direction of e1:
f2 = c' - (c' dot e1) e1
Then f2 is orthogonal to e1:
f2 dot e1 = c' dot e1 - (c' dot e1)(e1 dot e1).
Now e1 dot e1 = 1, because e1 is normalized, so we see that
f2 dot e1 = 0.
Then we have to normalize f2 to get the second basis vector:
e2 = f2/|f2|
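A minimal numpy sketch of this basis construction (the points a, b, c below are made-up example values, and they must not be collinear):

```python
import numpy as np

# Hypothetical example points; a, b, c define the plane.
a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 2.0, 0.0])
c = np.array([0.0, 0.0, 3.0])

b_prime = b - a                          # b' = b - a
c_prime = c - a                          # c' = c - a

e1 = b_prime / np.linalg.norm(b_prime)   # e1 = b'/|b'|
f2 = c_prime - np.dot(c_prime, e1) * e1  # remove the e1 component of c'
e2 = f2 / np.linalg.norm(f2)             # e2 = f2/|f2|

print(np.dot(e1, e2))                          # ~0: e1 and e2 are orthogonal
print(np.linalg.norm(e1), np.linalg.norm(e2))  # both 1: normalized
```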
You can now simply project the point x' = x - a onto the plane to obtain p':
p' = (x' dot e1) e1 + (x' dot e2) e2
Then shift back by a to obtain p:
p = p'+ a
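Putting the whole procedure together, here is a sketch of it as a single function (the function name closest_point_on_plane and the example values are my own, not from the post):

```python
import numpy as np

def closest_point_on_plane(a, b, c, x):
    """Foot of the perpendicular from x onto the plane through a, b, c."""
    b_prime = b - a
    c_prime = c - a
    e1 = b_prime / np.linalg.norm(b_prime)   # first basis vector
    f2 = c_prime - np.dot(c_prime, e1) * e1  # strip the e1 component
    e2 = f2 / np.linalg.norm(f2)             # second basis vector
    x_prime = x - a                          # shift x the same way
    p_prime = np.dot(x_prime, e1) * e1 + np.dot(x_prime, e2) * e2
    return p_prime + a                       # shift back

# Example: plane x + y + z = 1, with x off the plane.
a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])
c = np.array([0.0, 0.0, 1.0])
x = np.array([1.0, 1.0, 1.0])
p = closest_point_on_plane(a, b, c, x)
print(p)                      # [1/3, 1/3, 1/3]
print(np.dot(x - p, b - a))   # ~0: x - p is perpendicular to the plane
```

The returned p is the point of the plane closest to x, and x - p is normal to the plane.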
In 3-space, a plane abc is spanned by three points a, b, c. The point x does not lie on this plane.
I'm trying to find the formulas used to find the point p on abc with minimal distance to x.
I know the vector from x to p must be perpendicular to the plane.
So I began by taking a vector w from the point to the plane:
x = (x0,x1,x2)
p = (p0,p1,p2) <-- unknown
w = [p0-x0, p1-x1, p2-x2]
v = [a, b, c] = [a0, b0, c0]
                [a1, b1, c1]
                [a2, b2, c2]
Then I project w onto v:
D = |(v dot w)|/|v|
But my problem arises here: I do not know the distance, and I do not know the coordinates of p. Am I supposed to be using the gradient formula instead?
Any help is greatly appreciated! Thank you
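For comparison with the basis construction in the answer above: the distance formula attempted in the question takes a workable shape once the projection is done onto the plane's normal n = (b - a) cross (c - a), using the vector x - a from a known point of the plane to x. Then D = |(x - a) dot n| / |n|, and the closest point is p = x - ((x - a) dot n / (n dot n)) n. A minimal sketch of that route (the function name and example values are mine):

```python
import numpy as np

def foot_via_normal(a, b, c, x):
    """Distance from x to the plane through a, b, c, and the nearest plane point."""
    n = np.cross(b - a, c - a)            # normal to the plane
    t = np.dot(x - a, n) / np.dot(n, n)   # signed offset of x along n
    p = x - t * n                         # drop x back onto the plane
    distance = abs(np.dot(x - a, n)) / np.linalg.norm(n)
    return distance, p

a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])
c = np.array([0.0, 0.0, 1.0])
x = np.array([1.0, 1.0, 1.0])
print(foot_via_normal(a, b, c, x))   # distance 2/sqrt(3), p = [1/3, 1/3, 1/3]
```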