🔬 Tutorial problems \(\theta\)#
Note
These problems are designed to help you practice the concepts covered in the lectures. Not all problems may be covered in the tutorial; those left out are for additional practice on your own.
\(\theta\).1#
Consider the maximization problem

\[
\max_{c_1, c_2} \; \sqrt{c_1} + \beta \sqrt{c_2}
\]
subject to \(c_1 \geq 0\), \(c_2 \geq 0\) and \(p_1 c_1 + p_2 c_2 \leq m\). Here \(p_1, p_2\) and \(m\) are nonnegative constants, and \(\beta \in (0, 1)\).
Show that this problem has a solution if and only if \(p_1\) and \(p_2\) are both strictly positive.
Solve the problem by substitution and by using the tangency (relative slope) condition. Discuss which solution approach is easier.
To answer the first part of the question, review the facts on existence of optima.
First, note that \(U(c_1, c_2) = \sqrt{c_1} + \beta \sqrt{c_2}\) is continuous as a composition of continuous functions. Then, observe that the admissible set

\[
B = \{(c_1, c_2) \colon c_1 \geq 0, \; c_2 \geq 0, \; p_1 c_1 + p_2 c_2 \leq m\}
\]

is closed, since all of the inequalities defining it are weak.
Hence, by the Weierstrass extreme value theorem, a maximizer will exist whenever \(B\) is bounded.
If \(p_1\) and \(p_2\) are strictly positive then \(B\) is bounded. This is intuitive, but we can also show it formally by observing that \((c_1, c_2) \in B\) implies \(c_i \leq m / p_i\) for \(i = 1, 2\). Hence

\[
B \subset [0, m/p_1] \times [0, m/p_2],
\]

which is a bounded set.
We also need to show that if one price is zero then no maximizer exists. Suppose to the contrary that \(p_1 = 0\). Intuitively, no maximizer exists because we can always consume more of good one, thereby increasing our utility.
To formalize this we can suppose that a maximizer exists and derive a contradiction. To this end, suppose that \(c^* = (c_1^*, c_2^*)\) is a maximizer of \(U\) over \(B\). Since \(p_1 = 0\), the fact that \((c_1^*, c_2^*) \in B\) implies \(c^{**} = (c_1^* + 1, c_2^*) \in B\). Since \(U\) is strictly increasing in its first argument, we also have \(U(c^{**}) > U(c^*)\). This contradicts the statement that \(c^*\) is a maximizer of \(U\) over \(B\).
Now, to solve the problem explicitly, review the argument that the inequality \(p_1 c_1 + p_2 c_2 \leq m\) can be replaced by an equality with no effect on the set of maximizers. Indeed, if \(p_1 c_1 + p_2 c_2 < m\) then we can increase \(c_1\) or \(c_2\) until the equality is satisfied, and because the criterion function is strictly increasing in each argument, the interior point is not a maximizer.
Inverting \(p_1 c_1 + p_2 c_2 = m\) with respect to \(c_2\) gives \(c_2 = (m - p_1 c_1) / p_2\).
First, solve by substitution. Using \(c_2 = (m - p_1 c_1) / p_2\), the problem becomes a maximization of \(\sqrt{c_1} + \beta \sqrt{(m - p_1 c_1)/p_2}\) in one variable. The first order condition is

\[
\frac{1}{2\sqrt{c_1}} - \frac{\beta p_1}{2 p_2} \left( \frac{m - p_1 c_1}{p_2} \right)^{-1/2} = 0.
\]
It is convenient to check the second order condition right away for the one-dimensional problem. A simple derivation (left as an exercise) shows that the second derivative is negative independent of \(c_1\), and so the function \(\sqrt{c_1} + \beta \sqrt{\tfrac{m}{p_2}-\tfrac{p_1}{p_2}c_1}\) is strictly concave. Therefore, any point that satisfies the first order condition is the unique maximizer.
It only remains to find the maximizing \(c_2\) from the constraint.
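For completeness, here is the algebra that takes the first order condition to closed forms (the second equation follows by rearranging and squaring both sides, which is valid since both sides are nonnegative):

```latex
\begin{align*}
\frac{1}{2\sqrt{c_1}} &= \frac{\beta p_1}{2 p_2}\left(\frac{m - p_1 c_1}{p_2}\right)^{-1/2}
&& \text{first order condition} \\
p_2\,(m - p_1 c_1) &= \beta^2 p_1^2\, c_1
&& \text{rearrange and square} \\
c_1^\star &= \frac{m p_2}{p_1\,(p_2 + \beta^2 p_1)},
\qquad
c_2^\star = \frac{m - p_1 c_1^\star}{p_2} = \frac{\beta^2 p_1 m}{p_2\,(p_2 + \beta^2 p_1)}.
\end{align*}
```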
To solve the problem using the tangency condition, recall that the maximizer is characterized by

\[
\frac{\partial f / \partial x_1}{\partial f / \partial x_2} = \frac{\partial g / \partial x_1}{\partial g / \partial x_2},
\]

where \(f(x_1,x_2)\) is the criterion and \(g(x_1,x_2)=0\) is the constraint, together with the constraint equality itself.
In our case \(x_1=c_1\), \(x_2=c_2\), and we have

\[
\frac{\partial f}{\partial c_1} = \frac{1}{2\sqrt{c_1}}, \qquad
\frac{\partial f}{\partial c_2} = \frac{\beta}{2\sqrt{c_2}}, \qquad
\frac{\partial g}{\partial c_1} = p_1, \qquad
\frac{\partial g}{\partial c_2} = p_2.
\]

Hence, the optimum is the solution of

\[
\frac{\sqrt{c_2}}{\beta \sqrt{c_1}} = \frac{p_1}{p_2}, \qquad p_1 c_1 + p_2 c_2 = m.
\]
Clearly, this system leads to the same equation as before after the substitution \(c_2 = (m - p_1 c_1) / p_2\) is made.
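A quick numerical sanity check of the closed form \(c_1^\star = m p_2 / (p_1 (p_2 + \beta^2 p_1))\) implied by the first order condition. The parameter values below are illustrative assumptions, not part of the problem:

```python
from math import sqrt

# illustrative parameters (assumptions): prices and income strictly positive, beta in (0, 1)
beta, p1, p2, m = 0.9, 1.0, 2.0, 10.0

# closed-form maximizer implied by the first order condition
c1_star = m * p2 / (p1 * (p2 + beta**2 * p1))
c2_star = (m - p1 * c1_star) / p2

# brute-force check: maximize U along the budget line on a fine grid
def U(c1):
    return sqrt(c1) + beta * sqrt((m - p1 * c1) / p2)

grid = [i * (m / p1) / 100_000 for i in range(100_001)]
c1_num = max(grid, key=U)
```

Since the objective is strictly concave along the budget line, the grid maximizer lies within one grid step of \(c_1^\star\).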
\(\theta\).2#
Solve the following constrained maximization problem using the Lagrange method, including the second order conditions:

\[
\max_{x, y} \; \frac{x^3}{3} - 3y^2 + 2x
\quad \text{subject to} \quad
4x = y^3.
\]
Follow the standard algorithm of the Lagrange method.
Let \(f(x,y)= x^3/3 - 3y^2 + 2x\) and \(g(x,y)=4x-y^3\) for all \(x,y \in \mathbb{R}\). The constraint set is \(\{(x,y)\colon g(x,y)=0\}\). The Lagrangian function is

\[
\mathcal{L}(x, y, \lambda) = \frac{x^3}{3} - 3y^2 + 2x - \lambda (4x - y^3).
\]
The first order partial derivatives are

\[
\frac{\partial \mathcal{L}}{\partial x} = x^2 + 2 - 4\lambda, \qquad
\frac{\partial \mathcal{L}}{\partial y} = -6y + 3\lambda y^2, \qquad
\frac{\partial \mathcal{L}}{\partial \lambda} = y^3 - 4x.
\]
The bordered Hessian matrix is

\[
H\mathcal{L}(x, y, \lambda) =
\begin{pmatrix}
0 & -4 & 3y^2 \\
-4 & 2x & 0 \\
3y^2 & 0 & -6 + 6\lambda y
\end{pmatrix}.
\]
Observe that \(\mathrm{rank}(Dg) = \mathrm{rank}((4, -3y^2)) = 1\) for all \(x,y \in \mathbb{R}\), since the first component of \(Dg\) is the constant \(4\). That is, the constraint qualification holds at all points on the constraint.
The first order conditions are

\[
x^2 + 2 - 4\lambda = 0, \qquad -6y + 3\lambda y^2 = 0, \qquad 4x = y^3.
\]
Observe that if \(\lambda=0\), then the FOCs imply \(x^2+2-4\lambda = x^2+2 = 0\), which is a contradiction since \(x^2+2 \ge 2\). It must be that \(\lambda \ne 0\). Next, from the equation \(-6y+3\lambda y^2 = 3y(\lambda y -2)=0\), either \(y=0\) or \(y = 2/\lambda\).
Case 1: Suppose that \(y=0\). Then, from the constraint \(4x=y^3\) we get \(x=0\). Also, the FOC yields \(x^2+2 - 4\lambda = 0+2-4\lambda=0\), so that \(\lambda=1/2\). Hence, the critical point is \((x^*, y^*, \lambda^*)=(0,0,1/2)\). The corresponding bordered Hessian matrix is

\[
H\mathcal{L}(0, 0, 1/2) =
\begin{pmatrix}
0 & -4 & 0 \\
-4 & 0 & 0 \\
0 & 0 & -6
\end{pmatrix}.
\]
Since there are two variables (\(N=2\)) and one constraint (\(K=1\)), it suffices to check the last \(N-K=1\) leading principal minor, i.e., the determinant of the full bordered Hessian matrix. Expanding along the last row,

\[
\det(H\mathcal{L}) = -6 \cdot \det \begin{pmatrix} 0 & -4 \\ -4 & 0 \end{pmatrix} = -6 \cdot (-16) = 96 > 0.
\]

The determinant has the same sign as \((-1)^N=(-1)^2=1\). Therefore, the Hessian is negative definite on the constraint set, and \((x^*, y^*, \lambda^*)=(0,0,1/2)\) is a local maximizer on the constraint set.
Case 2: Suppose that \(y=2/\lambda\). Then, substituting \(\lambda = 2/y\) and \(x = y^3/4\) into the first FOC yields

\[
\left(\frac{y^3}{4}\right)^{2} + 2 - \frac{8}{y} = 0
\quad\Longleftrightarrow\quad
y^7 + 32 y - 128 = 0.
\]
Let \(h(y)=y^7 + 32 y - 128\). Since \(h'(y)=7 y^6+32 >0\), the function \(h\) is strictly increasing and has at most one root. Any root is strictly positive, since \(y \le 0\) implies \(h(y) \le -128 < 0\). Numerically, \(\bar{y} \approx 1.8325\) solves \(h(\bar{y})=0\). The critical point is \((x^*, y^*,\lambda^*)=(\bar{y}^{3}/4, \bar{y}, 2/\bar{y})\). The corresponding bordered Hessian is

\[
H\mathcal{L}(\bar{y}^3/4, \bar{y}, 2/\bar{y}) =
\begin{pmatrix}
0 & -4 & 3\bar{y}^2 \\
-4 & \bar{y}^3/2 & 0 \\
3\bar{y}^2 & 0 & 6
\end{pmatrix}.
\]
The determinant is

\[
\det(H\mathcal{L}) = -16 \cdot 6 - 9\bar{y}^4 \cdot \frac{\bar{y}^3}{2} = -96 - \frac{9}{2}\bar{y}^7 < 0,
\]

where the last inequality holds since \(\bar{y}\) is positive. The determinant has the same sign as \((-1)^K= -1\), so the Hessian is positive definite on the constraint set. Hence, \((x^*, y^*,\lambda^*)=(\bar{y}^{3}/4, \bar{y}, 2/\bar{y})\) is a local minimizer on the constraint set.
Finally, since along the constraint \(f(x,y) = x^3/3-3y^2+2x = x^3/3-3(4x)^{2/3}+2x \rightarrow \infty\) as \(x \rightarrow \infty\), there is no global maximizer. The local maximizer on the constraint is \((x,y)=(0,0)\).
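The root \(\bar{y}\) of \(h(y) = y^7 + 32y - 128\) can be pinned down numerically, e.g. by a simple bisection (a minimal sketch):

```python
def h(y):
    return y**7 + 32 * y - 128

# h is strictly increasing with h(0) = -128 < 0 and h(2) = 128 + 64 - 128 = 64 > 0,
# so the unique root lies in (0, 2); bisect until the bracket is tiny
lo, hi = 0.0, 2.0
for _ in range(60):
    mid = (lo + hi) / 2
    if h(mid) < 0:
        lo = mid
    else:
        hi = mid

root = (lo + hi) / 2
```

This confirms the approximation \(\bar{y} \approx 1.8325\) used above.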
\(\theta\).3#
Find the maxima and minima of the function

\[
f(x, y) = x y
\]

subject to

\[
x^2 + y^2 = 2 a^2,
\]

where \(a>0\).
Check both first and second order conditions.
Follow the standard algorithm of the Lagrange method.
We first write down the Lagrangian:

\[
\mathcal{L}(\lambda, x, y) = x y - \lambda (x^2 + y^2 - 2a^2).
\]

We have:

\[
\nabla \mathcal{L}(\lambda, x, y) =
\begin{pmatrix}
-(x^2 + y^2 - 2a^2) \\
y - 2\lambda x \\
x - 2\lambda y
\end{pmatrix}.
\]
F.O.C.s are \(\nabla \mathcal{L}(\lambda, x, y) = 0\).
If \(\lambda = 0\), then the FOCs give \(x=y=0\), which contradicts \(x^2 + y^2 - 2a^2 = 0\) because \(a>0\). Thus, \(\lambda \neq 0\).
If \(x=0\), then \(y = 2 \lambda x = 0\), which again contradicts the constraint. Thus, \(x \neq 0\).
Similarly, we can show \(y \neq 0\).
Thus, we have \(y = 2 \lambda x\), \(x = 2 \lambda y\) and \(x,y,\lambda \ne 0\). Combining the two equations we have \(y = 4 \lambda^2 y\), hence \(\lambda^2 = 1/4\). Consider the two cases one by one: for \(\lambda = 1/2\) we have \(x=y\), and for \(\lambda = -1/2\) we have \(x=-y\). Taking the constraint into account, each case yields two critical points, with \(x=\pm a\) and \(y= \pm a\).
Altogether we get four critical points:
\(x=a, y=a, \lambda = \frac{1}{2}\),
\(x=a, y=-a, \lambda = -\frac{1}{2}\),
\(x=-a, y=a, \lambda = -\frac{1}{2}\),
\(x=-a, y=-a, \lambda = \frac{1}{2}\).
The bordered Hessian:

\[
H\mathcal{L}(\lambda, x, y) =
\begin{pmatrix}
0 & -2x & -2y \\
-2x & -2\lambda & 1 \\
-2y & 1 & -2\lambda
\end{pmatrix}.
\]
Since there are two variables and one constraint (\(N=2\), \(K=1\), \(N-K=1\)), we only need to check the sign of the determinant of the bordered Hessian (the last leading principal minor):

\[
\det(H\mathcal{L}) = 8xy + 8\lambda(x^2 + y^2).
\]

Notice that:

\[
\det(H\mathcal{L}) = 16a^2 > 0 \;\text{ at } (a, a, \tfrac{1}{2}) \text{ and } (-a, -a, \tfrac{1}{2}),
\qquad
\det(H\mathcal{L}) = -16a^2 < 0 \;\text{ at } (a, -a, -\tfrac{1}{2}) \text{ and } (-a, a, -\tfrac{1}{2}).
\]
Since \((-1)^N=1\) and \((-1)^K=-1\), the critical points \((a, a)\) and \((-a, -a)\) satisfy the pattern for negative definiteness, and the critical points \((-a, a)\) and \((a, -a)\) satisfy the pattern for positive definiteness. Thus, the critical points \((a, a)\) and \((-a, -a)\) are local maximizers, and the critical points \((a, -a)\) and \((-a, a)\) are local minimizers.
In this question, the constraint set is closed and bounded, so a global maximizer/minimizer exists by the Weierstrass theorem and must be among the local maximizers/minimizers. From \(f(a, a) = f(-a, -a) = a^2\) and \(f(-a, a) = f(a, -a) = -a^2\), we conclude that the maximum is \(a^2\) and the minimum is \(-a^2\).
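These values are easy to confirm numerically by parameterizing the circle constraint (taking \(a = 1\) as an illustrative value):

```python
from math import sqrt, sin, cos, pi

a = 1.0  # illustrative value (assumption)

# every point on x**2 + y**2 = 2 * a**2 is (sqrt(2)*a*cos(t), sqrt(2)*a*sin(t))
vals = []
for i in range(100_001):
    t = 2 * pi * i / 100_000
    x, y = sqrt(2) * a * cos(t), sqrt(2) * a * sin(t)
    vals.append(x * y)
```

Indeed \(xy = 2a^2 \cos t \sin t = a^2 \sin 2t\), so the extrema \(\pm a^2\) are attained.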
\(\theta\).4#
Find the maxima and minima of the function

\[
f(x, y) = \frac{1}{x} + \frac{1}{y}
\]

subject to

\[
\frac{1}{x^2} + \frac{1}{y^2} = \frac{1}{a^2},
\]

where \(a>0\).
Follow the standard algorithm of the Lagrange method.
We first write down the Lagrangian:

\[
\mathcal{L}(\lambda, x, y) = \frac{1}{x} + \frac{1}{y} - \lambda \left( \frac{1}{x^2} + \frac{1}{y^2} - \frac{1}{a^2} \right).
\]

The F.O.C.s are:

\[
\frac{1}{x^2} + \frac{1}{y^2} - \frac{1}{a^2} = 0, \qquad
-\frac{1}{x^2} + \frac{2\lambda}{x^3} = 0, \qquad
-\frac{1}{y^2} + \frac{2\lambda}{y^3} = 0.
\]
Solving the F.O.C.s (the second and third equations give \(x = 2\lambda\) and \(y = 2\lambda\), so \(x=y\)), we get two solutions:
\(x=y=\sqrt{2}a, \lambda = \frac{\sqrt{2}}{2}a.\)
\(x=y=-\sqrt{2}a, \lambda = -\frac{\sqrt{2}}{2}a.\)
The bordered Hessian:

\[
H\mathcal{L}(\lambda, x, y) =
\begin{pmatrix}
0 & \frac{2}{x^3} & \frac{2}{y^3} \\
\frac{2}{x^3} & \frac{2}{x^3} - \frac{6\lambda}{x^4} & 0 \\
\frac{2}{y^3} & 0 & \frac{2}{y^3} - \frac{6\lambda}{y^4}
\end{pmatrix}.
\]
Since there are two variables and one constraint (\(N=2\), \(K=1\), \(N-K=1\)), we only need to check the sign of the determinant of the bordered Hessian (the last leading principal minor). Since at both critical points \(x=y=2\lambda\), it suffices to evaluate

\[
\det(H\mathcal{L}(x/2, x, x)) = \frac{8}{x^9}.
\]
Notice that:

\[
\frac{8}{x^9} > 0 \;\text{ for } x = \sqrt{2}a, \qquad \frac{8}{x^9} < 0 \;\text{ for } x = -\sqrt{2}a.
\]
Thus, the critical point \((\sqrt{2}a, \sqrt{2}a)\) is a local maximizer, and the critical point \((-\sqrt{2}a, -\sqrt{2}a)\) is a local minimizer.
In this question, the constraint set is unbounded, so the Weierstrass theorem does not apply directly. However, along the constraint \(|x| \to \infty\) forces \(y \to \pm a\), so that \(f \to \pm 1/a\), and these limiting values lie strictly between \(-\sqrt{2}/a\) and \(\sqrt{2}/a\). Hence the local extrema are global: the maximum is \(f(\sqrt{2}a, \sqrt{2}a) = \frac{\sqrt{2}}{a}\) and the minimum is \(f(-\sqrt{2}a, -\sqrt{2}a) = -\frac{\sqrt{2}}{a}\).
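A numerical check of these extrema, assuming the objective \(f(x,y) = 1/x + 1/y\) and constraint \(1/x^2 + 1/y^2 = 1/a^2\) used in this problem, with \(a=1\) as an illustrative value. Substituting \(u = 1/x\), \(v = 1/y\) turns the constraint into a circle:

```python
from math import sqrt, sin, cos, pi

a = 1.0  # illustrative value (assumption)

# with u = 1/x and v = 1/y the constraint reads u**2 + v**2 = 1/a**2,
# a circle of radius 1/a, and the objective becomes f = u + v
fvals = []
for i in range(100_001):
    t = 2 * pi * i / 100_000
    u, v = cos(t) / a, sin(t) / a
    fvals.append(u + v)

fmax, fmin = max(fvals), min(fvals)
```

The maximum \(\sqrt{2}/a\) and minimum \(-\sqrt{2}/a\) are attained at \(u = v = \pm 1/(\sqrt{2}a)\), i.e. at \(x = y = \pm\sqrt{2}a\).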
\(\theta\).5#
Solve the following maximization problem
Follow the standard algorithm of the Lagrange method, realizing that the resource constraint has to be satisfied with equality while the nonnegativity constraints do not bind.
First, notice that if \(x=0\), then \(f(0, y, z) = 0 < f(1, 1, 1) = 1\) and \((1,1,1)\) is feasible; thus \(x>0\) at any maximizer. Similarly, we can show \(y > 0\) and \(z > 0\).
Second, notice that if \(x + 2y + z < 10\), we can set \(x^{\prime} = 10 - 2y - z\), which implies \(x^{\prime} > x\) and \(f(x^{\prime}, y, z) > f(x, y, z)\), so such a point cannot be a maximizer.
Thus, the constraint \(x + 2y + z = 10\) binds at the optimum.
The Lagrangian is:
The gradient of the Lagrangian w.r.t. \((\lambda, x, y, z)\) is:
Solving \(\nabla \mathcal{L}(\lambda, x, y, z) = 0\), we get \(x = z = \frac{20}{11}\), \(y = \frac{15}{11}\).
The bordered Hessian:
Since there are three variables and one constraint (\(N=3\), \(K=1\), \(N-K=2\)), we need to check the last two leading principal minors of \(H\mathcal{L}\): the determinant of the full bordered Hessian, and the determinant of \(H\mathcal{L}\) with its last row and last column removed. Denote the matrix formed by the first three rows and first three columns of \(H\mathcal{L}\) by \(M(\lambda, x, y, z)\).
We can simplify \(H\mathcal{L}\) by using \(x = z = \frac{4}{3}y\) at the critical point:
The determinants are:
and
since \(z = 20/11\) at the critical point.
Thus, the sign of the last leading principal minor is negative, which is the same as \((-1)^N\), and the sign of the second-to-last leading principal minor is positive, the same as \((-1)^{N-1}\) (recall there are three variables, and we need to confirm that the Hessian of \(\mathcal{L}\) with respect to \((x, y, z)\) is negative definite on the constraint set). Thus, the Hessian is negative definite on the linear constraint, and therefore the critical point is a local maximizer by the sufficient second order condition.
Since the constraint set is closed and bounded, a maximizer exists by the Weierstrass theorem, and it must coincide with the unique local maximizer. The maximum is \(f(20/11, 15/11, 20/11) = (\frac{20}{11})^{4/3} (\frac{15}{11})^{1/2}\).