Homework Assignment 1#
Deadline: March 2, 2025. Your answer to the assignment must be written by hand and uploaded as a PDF to “Assignments” on Learn. Read the rules here: Homework Assignments.
Remember, all answers must be justified and well-reasoned.
Exercise 1: Tangent Vector to a Level Set is Orthogonal to the Gradient#
Consider a differentiable function

\[
g: \mathbb{R}^3 \to \mathbb{R},
\]

and let

\[
\pmb{r}: \, ]a,b[ \, \to \mathbb{R}^3
\]

be a differentiable vector function whose image is a subset of the level set

\[
\{ \pmb{x} \in \mathbb{R}^3 \mid g(\pmb{x}) = c \}
\]

for a constant \(c\). In other words, we have:

\[
g(\pmb{r}(t)) = c \quad \text{for all } t \in \, ]a,b[.
\]

We consider \(\pmb{r}\) as a parametrization of a curve.
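As a concrete, hypothetical instance of this setup (not taken from the assignment): for \(g(x,y,z) = x^2 + y^2 + z^2\) and \(c = 1\), the level set is the unit sphere, and the curve

\[
\pmb{r}(t) = (\cos t, \sin t, 0), \quad t \in \, ]0, 2\pi[,
\]

satisfies \(g(\pmb{r}(t)) = \cos^2 t + \sin^2 t = 1\), so its image lies in the level set.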
Question a#
Explain why

\[
\frac{\mathrm{d}}{\mathrm{d}t}\, g(\pmb{r}(t)) = 0 \quad \text{for all } t \in \, ]a,b[.
\]
Question b#
Show using the chain rule that

\[
\nabla g(\pmb{r}(t)) \cdot \pmb{r}'(t) = 0 \quad \text{for all } t \in \, ]a,b[.
\]
Question c#
Explain how the result from the previous question implies that the gradient \(\nabla g\) is perpendicular (orthogonal) to the level surface

\[
\{ \pmb{x} \in \mathbb{R}^3 \mid g(\pmb{x}) = c \}
\]

at the point \(\pmb{r}(t)\) for every \(t \in \, ]a,b[\).
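The statement in Question c can also be sanity-checked numerically. The sketch below reuses the hypothetical sphere example from above (not the assignment's \(g\)) and evaluates the dot product \(\nabla g(\pmb{r}(t)) \cdot \pmb{r}'(t)\) at a few parameter values:

```python
import numpy as np

# Hypothetical example: g(x, y, z) = x^2 + y^2 + z^2 with level set g = 1
# (the unit sphere), and r(t) = (cos t, sin t, 0) running on that sphere.

def grad_g(p):
    # Analytic gradient of g: nabla g(x, y, z) = (2x, 2y, 2z).
    return 2.0 * p

def r(t):
    return np.array([np.cos(t), np.sin(t), 0.0])

def r_prime(t):
    # Derivative of r with respect to t.
    return np.array([-np.sin(t), np.cos(t), 0.0])

# The dot product should vanish for every t, as Question b asserts.
for t in np.linspace(0.5, 6.0, 5):
    print(t, np.dot(grad_g(r(t)), r_prime(t)))  # prints 0.0 each time
```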
Exercise 2: The Chain Rule on a Function without a Functional Expression#
We consider two differentiable vector functions

\[
\pmb{f}: \mathbb{R}^4 \to \mathbb{R}^3 \quad \text{and} \quad \pmb{g}: \mathbb{R}^3 \to \mathbb{R}^2.
\]
We are informed that \(\pmb{f}\) has the functional expression:
Furthermore, we are informed that the Jacobian matrix \(\pmb{J}_{\pmb{g}}(\pmb{y}) \in \mathbb{R}^{2 \times 3}\) of \(\pmb{g}\) is given by:
We wish to determine the Jacobian matrix of the composite function \(\pmb{g}\circ \pmb{f}\) at the point \((1,-1,\tfrac{\pi}{2},0)\).
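For Question c below it is worth recalling the chain rule for differentiable vector functions, which expresses the Jacobian of a composition as a matrix product of Jacobians:

\[
\pmb{J}_{\pmb{g} \circ \pmb{f}}(\pmb{x}) = \pmb{J}_{\pmb{g}}(\pmb{f}(\pmb{x})) \, \pmb{J}_{\pmb{f}}(\pmb{x}).
\]

Note that the product of a \(2 \times 3\) matrix and a \(3 \times 4\) matrix is a \(2 \times 4\) matrix, which fixes the shape of the requested Jacobian.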
Question a#
Determine \(\pmb{f}(1,-1,\tfrac{\pi}{2},0)\).
Question b#
Determine the Jacobian matrix \(\pmb{J}_{\pmb{f}}(1,-1,\tfrac{\pi}{2},0)\) of the function \(\pmb{f}\) at the point \((1,-1,\tfrac{\pi}{2},0)\).
Question c#
Argue that the composite function \(\pmb{g} \circ \pmb{f}\) is differentiable on \(\mathbb{R}^4\). Use the chain rule to determine the Jacobian matrix of the composite function \(\pmb{g} \circ \pmb{f}\) at the point \((1,-1,\tfrac{\pi}{2},0)\).
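A minimal numerical sketch of this computation, with made-up Jacobian entries since the assignment's actual matrices are not reproduced here; it only illustrates how the two Jacobians compose and what shape results:

```python
import numpy as np

# Hypothetical placeholder Jacobians with the shapes from this exercise:
# J_g evaluated at f(x) is 2x3, and J_f evaluated at x is 3x4.
J_g_at_fx = np.arange(6.0).reshape(2, 3)   # made-up entries
J_f_at_x = np.arange(12.0).reshape(3, 4)   # made-up entries

# Chain rule: J_{g o f}(x) = J_g(f(x)) @ J_f(x).
J_comp = J_g_at_fx @ J_f_at_x
print(J_comp.shape)  # (2, 4)
```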
Exercise 3: Differentiability of a Piecewise Polynomial and of Softplus#
Consider the functions \(f:\mathbb{R} \to \mathbb{R}\) and \(g:\mathbb{R} \to \mathbb{R}\), where \(f\) is a piecewise polynomial and \(g\) is given by the functional expression

\[
g(x) = \ln\!\left(1 + e^{x}\right).
\]

The function \(g\) is the so-called Softplus function, which is an approximation of the ReLU function \(x \mapsto \max(0,x)\). It can, for example, be used as an activation function in neural networks to achieve differentiability, thereby facilitating the optimization process.
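To make the approximation claim concrete, here is a small numerical comparison (an illustration only; `np.logaddexp(0, x)` is simply a numerically stable way of evaluating \(\ln(1 + e^{x})\)):

```python
import numpy as np

def softplus(x):
    # g(x) = ln(1 + e^x), evaluated stably as log(e^0 + e^x).
    return np.logaddexp(0.0, x)

def relu(x):
    return np.maximum(0.0, x)

xs = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])
print(softplus(xs))  # approx [0.0067, 0.3133, 0.6931, 1.3133, 5.0067]
print(relu(xs))      # [0., 0., 0., 1., 5.]
```

The two functions nearly agree away from \(0\), while Softplus smooths out the kink at \(0\).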
Question a#
Explain why \(f\) is differentiable on \(\mathbb{R} \setminus \{0\}\). Provide a functional expression for the derivative \(f'\).
Question b#
Explain why \(f\) is continuous but not differentiable at the point \(0\).
Question c#
Argue that \(g\) is differentiable on the entire real line and find a functional expression for \(g'\).
Question d#
In machine learning, ReLU and Softplus are typically applied to vectors. These vector functions are defined by applying the given scalar functions coordinate-wise. For example, the function \(\pmb{g}: \mathbb{R}^n \to \mathbb{R}^n\) is given by \(\pmb{g}(\pmb{x}) = [g(x_1), g(x_2), \dots, g(x_n)]\). Compute the Jacobian matrix \(\pmb{J}_{\pmb{g}}(\pmb{x})\) of the Softplus vector function.
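A hand-computed Jacobian for this question can be sanity-checked against a finite-difference approximation. In the sketch below, `jacobian_fd` is a small helper written for this illustration (not a library routine), and Softplus is applied coordinate-wise through NumPy broadcasting:

```python
import numpy as np

def softplus(x):
    # Coordinate-wise Softplus: g(x) = ln(1 + e^x) applied to each entry.
    return np.logaddexp(0.0, x)

def jacobian_fd(F, x, h=1e-6):
    # Forward-difference approximation of the Jacobian of F at x.
    n = x.size
    J = np.empty((n, n))
    Fx = F(x)
    for j in range(n):
        e = np.zeros(n)
        e[j] = h
        J[:, j] = (F(x + e) - Fx) / h
    return J

x = np.array([-1.0, 0.0, 2.0])
print(np.round(jacobian_fd(softplus, x), 4))
# The off-diagonal entries are exactly zero: output coordinate i of a
# coordinate-wise function depends only on input coordinate i.
```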
Exercise 4: A New Inner Product#
Consider \(\mathbb{R}^n\) with the usual inner product \(\langle \cdot, \cdot \rangle\). Let
\(U \in \mathbb{C}^{n \times n}\) be a unitary matrix (meaning \(U^*U=I\)), and let
\(\Lambda = \operatorname{diag}(\lambda_1, \ldots, \lambda_n) \in \mathbb{R}^{n \times n}\) be a diagonal matrix with strictly positive diagonal elements, meaning \(\lambda_i>0\) for \(i=1,\ldots,n\).
Define
Furthermore define
where \(D\) is a diagonal matrix with the elements
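The following small check illustrates the unitarity condition from the setup with a hypothetical \(2 \times 2\) matrix (not the assignment's \(U\)); \(U^*\) denotes the conjugate transpose:

```python
import numpy as np

# Hypothetical unitary matrix, for illustration purposes only.
U = np.array([[1.0, 1.0j],
              [1.0j, 1.0]]) / np.sqrt(2.0)

# Unitarity: U* U = I, with U* the conjugate transpose of U.
print(np.allclose(U.conj().T @ U, np.eye(2)))  # True
```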
Question a#
Show that
Question b#
Determine the inverse matrix \(A^{-1}\).
Question c#
Define for the column vectors \(\pmb{x}, \pmb{y} \in \mathbb{R}^n\) the inner product
Show that
Question d#
Show that \(\langle \pmb{x}, \pmb{y} \rangle_B\) is indeed an inner product on \(\mathbb{R}^n\). You may use, without proof, that the usual inner product \(\langle \pmb{x}, \pmb{y} \rangle\) is an inner product on \(\mathbb{R}^n\).
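For reference, verifying that a map is an inner product on \(\mathbb{R}^n\) amounts to checking symmetry, linearity in the first argument, and positive definiteness; in the notation of this exercise:

\[
\langle \pmb{x}, \pmb{y} \rangle_B = \langle \pmb{y}, \pmb{x} \rangle_B,
\qquad
\langle \alpha \pmb{x} + \pmb{z}, \pmb{y} \rangle_B = \alpha \langle \pmb{x}, \pmb{y} \rangle_B + \langle \pmb{z}, \pmb{y} \rangle_B,
\qquad
\langle \pmb{x}, \pmb{x} \rangle_B > 0 \;\, \text{for } \pmb{x} \neq \pmb{0},
\]

for all \(\pmb{x}, \pmb{y}, \pmb{z} \in \mathbb{R}^n\) and all \(\alpha \in \mathbb{R}\).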
Question e#
Let \(B = \operatorname{diag}(2,4)\). Provide two vectors \(\pmb{x}, \pmb{y} \in \mathbb{R}^2\) that are orthogonal to each other with respect to \(\langle \cdot, \cdot \rangle_B\). Neither vector may be the zero vector in \(\mathbb{R}^2\).