f(x,y) = 3x^2 + 3yx^2 + y^3 -15y.
Step 1. Take the gradient:
Grad f = (6 x + 6 x y) i + (3 x^2 + 3 y^2 - 15) j
This is always defined, so the question is whether the gradient is 0:
In[1]:= Solve[{6 x + 6 x y == 0, 3 x^2 + 3 y^2 - 15 == 0}, {x, y}]

Out[1]= {{x -> -2, y -> -1}, {x -> 2, y -> -1},
         {y -> -Sqrt[5], x -> 0}, {y -> Sqrt[5], x -> 0}}

To classify them, we evaluate the discriminant:

    f_xx f_yy - (f_xy)^2 == (6 + 6 y)(6 y) - (6 x)(6 x)

For the first critical point, (-2, -1), this is (0)(-6) - (-12)(-12) = -144 < 0, so the first critical point is a saddle. For the second critical point, (2, -1), the discriminant is the same, so it is also a saddle.

For the third critical point, (0, -Sqrt[5]), the discriminant is (6 - 6 Sqrt[5])(-6 Sqrt[5]) - 0 = 180 - 36 Sqrt[5] > 0, so it is an extremum. Since f_xx = 6 - 6 Sqrt[5] < 0, this is a local maximum. It cannot be a global maximum, because for very large y (positive or negative), the most important term in f is y^3, which can be made either very positive or very negative.

For the final critical point, (0, Sqrt[5]), the discriminant is likewise positive, but f_xx > 0, so it is a (local) minimum.
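As a quick check (this is my own sketch in the same Mathematica style, not part of the original session; the names obj, disc, and crit are mine), one can recompute the discriminant and f_xx at each critical point returned by Solve:

    obj  = 3 x^2 + 3 x^2 y + y^3 - 15 y;                  (* the function f above *)
    disc = D[obj, x, x] D[obj, y, y] - D[obj, x, y]^2;    (* f_xx f_yy - (f_xy)^2 *)
    crit = Solve[{D[obj, x] == 0, D[obj, y] == 0}, {x, y}];
    Simplify[{x, y, disc, D[obj, x, x]} /. crit]
    (* expected: disc = -144 at (-2, -1) and (2, -1): saddles;
       disc = 180 - 36 Sqrt[5] > 0 with f_xx < 0 at (0, -Sqrt[5]): local max;
       disc = 180 + 36 Sqrt[5] > 0 with f_xx > 0 at (0, Sqrt[5]): local min *)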
The objective function is the cost:
f(R,H) = 2 * pi * R^2 * .01 + (2 * pi * R) * H * .005
The constraint is the volume:
g(R,H) = pi * R^2 * H == 20
To find the best dimensions, we need Lagrange's formula: Grad f == lambda Grad g, i.e.,

    R component: .04 * pi * R + .01 * pi * H == lambda * 2 * pi * R * H
    H component: .01 * pi * R == lambda * pi * R^2

To solve, let's multiply these equations by 100/pi, to get:

In[2]:= Solve[{4 R + H == 200 lambda R H, R == 100 lambda R^2,
          Pi R^2 H == 20}, {R, H, lambda}]

Out[2]= {{lambda -> (25 Pi)^(1/3)/500, H -> 4 (5/Pi)^(1/3), R -> (5/Pi)^(1/3)}}

The cost of the can is:

In[3]:= .02 * Pi * R^2 + .01 * Pi * R * H /. %

Out[3]= 0.06 (25 Pi)^(1/3)
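As an independent check (my own sketch, not from the exam; the names cost and rBest are hypothetical), one can eliminate H with the volume constraint and minimize the cost in R alone:

    (* substitute H = 20/(Pi r^2) from Pi R^2 H == 20, then set d(cost)/dr to zero *)
    cost[r_] := 1/100 (2 Pi r^2) + 1/200 (2 Pi r) (20/(Pi r^2))
    rBest = r /. First[Solve[cost'[r] == 0, r, Reals]];
    Simplify[{rBest, 20/(Pi rBest^2), cost[rBest]}]
    (* should reproduce R = (5/Pi)^(1/3), H = 4 (5/Pi)^(1/3),
       and cost 3/50 (25 Pi)^(1/3) = 0.06 (25 Pi)^(1/3), about 0.26 *)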
3. (5 points - Dang if this doesn't look a lot like homework problem 21!) Find the minimum value of x^3 + y^3 + z^3 for (x,y,z) on the intersection of the planes x + y + z = 2 and x - y + z = 3.

The place(s) where the minimum occurs is (are) where Lagrange's condition is satisfied.
With two constraints, we have:

    Grad(x^3 + y^3 + z^3) = lambda Grad(x + y + z) + mu Grad(x - y + z)

The three components of this read:

    3 x^2 == lambda + mu
    3 y^2 == lambda - mu
    3 z^2 == lambda + mu

There are 5 unknowns, so we need two more equations, which are the constraint equations.

Comparing the equations given above, we can see that x^2 == z^2, so z = x or -x.

Suppose first that z == x. From the constraint equations, we then learn that y == 2 - 2 x and -y == 3 - 2 x. Adding these together gives 0 == 5 - 4 x, so one constrained critical point is

    (x,y,z) == (5/4, -1/2, 5/4)

The other possibility is that z == -x. The difficulty with this possibility is that it converts the constraint equations into y == 2 and -y == 3, which is impossible. Thus there is only one critical point.

A good student will wonder here if the critical point is actually a minimum. It is, as can be seen by substituting the constraints into the objective: they force y = -1/2 and z = 5/2 - x, so the objective becomes a quadratic in x whose graph is a parabola opening upward, hence going to +infinity. (We would probably give almost total credit without a careful proof of this point.)

The value of the minimum is 121/32.
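As a check (again my own sketch, not part of the posted solution; sol is a hypothetical name), Solve handles the two-constraint Lagrange system directly:

    sol = Solve[{3 x^2 == lambda + mu, 3 y^2 == lambda - mu, 3 z^2 == lambda + mu,
                 x + y + z == 2, x - y + z == 3}, {x, y, z, lambda, mu}];
    {sol, x^3 + y^3 + z^3 /. sol}
    (* the only solution has x -> 5/4, y -> -1/2, z -> 5/4, and the value is 121/32 *)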
4. (5 points) NOTE: The point of this problem is to see if you understand Newton's method. The exact solution of the system is (x,y) = (3,2) or (2,3). Now that we all know the exact answer, you will get 0 points for writing it down!

Suppose
f(x,y) := x + y - 5 = 0 and
g(x,y) := (x - y)^2 - 1 = 0.
Take as your initial guess (x_0,y_0) = (1,-1) and use Newton's method to produce a better guess (x_1,y_1).
The explicit* formula for the improved guess (x_{n+1},y_{n+1}) given (x_n,y_n) is:
x_{n+1} = _________________________________________________
y_{n+1} = _________________________________________________
With (x_0,y_0) = (1,-1),
x_1 = _________________________________________________
y_1 = _________________________________________________
This problem is a "plug and chug," and the answer can be looked up in the textbook.
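For what it's worth, here is a sketch of the step in the same Mathematica style (my own check, not part of the posted solution; sys, jac, and newtonStep are names I made up):

    sys[{x_, y_}] := {x + y - 5, (x - y)^2 - 1}           (* the system f = g = 0 *)
    jac[{x_, y_}] := {{1, 1}, {2 (x - y), -2 (x - y)}}    (* its Jacobian *)
    newtonStep[p_] := p - Inverse[jac[p]] . sys[p]
    newtonStep[{1, -1}]
    (* {25/8, 15/8}, i.e. (x_1, y_1) = (3.125, 1.875), which already satisfies x + y = 5 *)

Note that a guess with x_0 = y_0, such as (1,1), would make the Jacobian singular (x - y = 0 wipes out its second row), which is why the iteration starts from (1,-1).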