Common Time

The \(4/4\) time signature, also denoted \(\mathcal{C}\).

Cost Function
Loss Function

A mapping from an outcome to a real number that signifies the loss or cost of that outcome. The outcome may be an event, like dropping a hot cup of tea, in which case the cost is some numerical value representing how bad that is. More commonly, the outcome is a vector representing, for example, the probability mass that a function (such as a neural network) assigned to an input.
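As a minimal sketch, two common loss functions (the function names and example values here are illustrative, not from the text): squared error for real-valued outcomes, and negative log-likelihood for the probability-mass case.

```python
import math

def squared_error(prediction, target):
    # Real-valued outcome: cost grows quadratically with the miss.
    return (prediction - target) ** 2

def negative_log_likelihood(probs, true_class):
    # probs: the probability mass a model assigned to each class.
    # Low probability on the true class means a high cost.
    return -math.log(probs[true_class])

print(squared_error(2.5, 3.0))                      # 0.25
print(negative_log_likelihood([0.1, 0.7, 0.2], 1))  # ≈ 0.357
```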

Enharmonic Equivalence

Two notes, intervals, scales or chords are enharmonic equivalents if they have different names but consist of exactly the same pitches. Imagine a building: the ceiling of the first floor and the floor of the second floor are the same thing, but the name differs depending on your point of view.

Hessian Matrix

A square matrix of the second-order partial derivatives of a function. The diagonal holds the partial derivatives in a single direction; the other entries are taken up by the mixed partial derivatives. Example in 2D:

\[\mathbf{H}f(x,y) = \begin{bmatrix} \frac{\partial^2 f}{\partial x^2} & \frac{\partial^2 f}{\partial x\,\partial y}\\ \frac{\partial^2 f}{\partial y\,\partial x} & \frac{\partial^2 f}{\partial y^2} \end{bmatrix}\]
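The 2D case can be approximated numerically with central finite differences; a small sketch (the function \(f(x,y) = x^2 y\) and step size are illustrative):

```python
def hessian_2d(f, x, y, h=1e-4):
    # Central finite differences for all second-order partials of f(x, y).
    fxx = (f(x + h, y) - 2 * f(x, y) + f(x - h, y)) / h**2
    fyy = (f(x, y + h) - 2 * f(x, y) + f(x, y - h)) / h**2
    fxy = (f(x + h, y + h) - f(x + h, y - h)
           - f(x - h, y + h) + f(x - h, y - h)) / (4 * h**2)
    return [[fxx, fxy], [fxy, fyy]]

f = lambda x, y: x**2 * y          # analytic Hessian: [[2y, 2x], [2x, 0]]
H = hessian_2d(f, 1.0, 2.0)
print(H)                           # close to [[4, 2], [2, 0]]
```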
Iverson Bracket

Notation that converts logical propositions inside the brackets to a \(1\) if true and \(0\) if false. One application is to mathematically include or exclude elements of vectors or sets in a summation or product:

\[\mathbf{x} = \begin{bmatrix}1&3&7&9\end{bmatrix}\\ \sum_{i=1}^n \left[x_i > 5 \right] = 2\]
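In Python the Iverson bracket is essentially built in, since `True` and `False` behave as \(1\) and \(0\); the example above translates directly:

```python
x = [1, 3, 7, 9]
# int(condition) is the Iverson bracket: 1 if true, 0 if false.
count_over_5 = sum(int(xi > 5) for xi in x)
print(count_over_5)  # 2
```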
Jacobian Matrix

Matrix of first-order partial derivatives of a vector-valued function. If \(f\) is a function that maps some vector \(\mathbf{x}\) in \(\mathbb{R}^n\) to \(\mathbf{f(x)}\) in \(\mathbb{R}^m\), its Jacobian is:

\[\mathbf{J} f = \begin{bmatrix} \frac{\partial \mathbf{f}}{\partial x_1} & \dots & \frac{\partial \mathbf{f}}{\partial x_n} \end{bmatrix} = \begin{bmatrix} \frac{\partial f_1}{\partial x_1} & \dots & \frac{\partial f_1}{\partial x_n}\\ \vdots & \ddots & \vdots\\ \frac{\partial f_m}{\partial x_1} & \dots & \frac{\partial f_m}{\partial x_n}\\ \end{bmatrix}\]
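Like the Hessian, the Jacobian can be approximated with finite differences; a sketch with an illustrative function \(f(x_1, x_2) = (x_1^2,\, x_1 x_2)\):

```python
def jacobian(f, x, h=1e-6):
    # Central differences: column j holds the partials with respect to x_j.
    n, m = len(x), len(f(x))
    J = [[0.0] * n for _ in range(m)]
    for j in range(n):
        xp = list(x); xp[j] += h
        xm = list(x); xm[j] -= h
        fp, fm = f(xp), f(xm)
        for i in range(m):
            J[i][j] = (fp[i] - fm[i]) / (2 * h)
    return J

f = lambda x: [x[0] ** 2, x[0] * x[1]]   # analytic J: [[2*x1, 0], [x2, x1]]
J = jacobian(f, [1.0, 2.0])
print(J)                                 # close to [[2, 0], [2, 1]]
```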
Observed Variable
Unobserved Variable

A factor that is part of a statistical relationship, such as a correlation or causation, and that is (respectively, is not) recorded in the data at hand.

Operator Overloading

Having an operator do different things depending on the types of its arguments. For example, we are familiar with + adding numbers, but the operator is often extended to support adding images, dates and other datatypes.
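Python shows both sides of this: + is already overloaded for built-in types, and user-defined classes can join in by implementing `__add__` (the `Vec2` class below is illustrative):

```python
from datetime import date, timedelta

# Built-in overloading: + means different things for different types.
print(1 + 2)                                   # integer addition: 3
print("ab" + "cd")                             # string concatenation: "abcd"
print(date(2024, 1, 1) + timedelta(days=31))   # date arithmetic: 2024-02-01

# User-defined types overload + by defining __add__:
class Vec2:
    def __init__(self, x, y):
        self.x, self.y = x, y
    def __add__(self, other):
        return Vec2(self.x + other.x, self.y + other.y)

v = Vec2(1, 2) + Vec2(3, 4)
print(v.x, v.y)  # 4 6
</imports>```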


RANSAC

Random Sample Consensus. An iterative method for fitting a model to data that contains outliers:

  1. Draw \(s\) samples from the data.

  2. Fit the model to these samples.

  3. Check how many points from the full dataset fall within an acceptable range \(d\) around the model; these are the inliers.

  4. Do this for \(N\) iterations.

  5. Choose the model with the most inliers and refit it to all inliers.
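The steps above can be sketched as a simple 2D line fit, \(y = ax + b\) with \(s = 2\); the data, parameter values, and least-squares refit below are illustrative:

```python
import random

def ransac_line(points, s=2, d=0.5, N=100, seed=0):
    rng = random.Random(seed)
    best_inliers = []
    for _ in range(N):                       # steps 1-4: N random trials
        (x1, y1), (x2, y2) = rng.sample(points, s)
        if x1 == x2:
            continue                          # vertical pair: skip
        a = (y2 - y1) / (x2 - x1)             # step 2: fit line to the pair
        b = y1 - a * x1
        # step 3: points within distance d of the model are inliers
        inliers = [(x, y) for x, y in points if abs(y - (a * x + b)) < d]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    # step 5: refit by least squares on the inliers of the best model
    n = len(best_inliers)
    sx = sum(x for x, _ in best_inliers);      sy = sum(y for _, y in best_inliers)
    sxx = sum(x * x for x, _ in best_inliers); sxy = sum(x * y for x, y in best_inliers)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Points on y = 2x + 1, plus two gross outliers.
pts = [(x, 2 * x + 1) for x in range(10)] + [(3, 40.0), (7, -30.0)]
a, b = ransac_line(pts)
print(round(a, 3), round(b, 3))  # 2.0 1.0 — the outliers are ignored
```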

Signed Distance

The distance of a point to some surface, with the sign signifying on which side of the surface the point is located.
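A minimal example for a circle, a surface whose signed distance has a simple closed form (the convention that inside is negative is the common one, but conventions vary):

```python
import math

def signed_distance_circle(p, c, r):
    # Distance from point p to a circle of radius r centred at c.
    # Negative inside the circle, zero on it, positive outside.
    return math.hypot(p[0] - c[0], p[1] - c[1]) - r

print(signed_distance_circle((0, 0), (0, 0), 2))  # -2 (centre is inside)
print(signed_distance_circle((3, 4), (0, 0), 2))  #  3 (outside: 5 - 2)
```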


TOML

Tom’s Obvious, Minimal Language. A human-readable configuration file format. Example:

title = "TOML Example"

name = "Stefan Wijnja"
website = ""

server = ""
ports = [ 8001, 8001, 8002 ]

Trace

The sum of the components on the main diagonal of a square matrix. The trace has the cyclic property that for three matrices \(A, B, C\): \(\mathrm{Tr}(ABC) = \mathrm{Tr}(BCA)\).
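The cyclic property can be checked numerically; a small sketch with plain nested lists and illustrative \(2 \times 2\) matrices:

```python
def trace(M):
    # Sum of the main-diagonal entries of a square matrix.
    return sum(M[i][i] for i in range(len(M)))

def matmul(A, B):
    # zip(*B) iterates over the columns of B.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
C = [[2, 0], [0, 2]]

# Cyclic property: Tr(ABC) == Tr(BCA).
print(trace(matmul(matmul(A, B), C)))  # 10
print(trace(matmul(matmul(B, C), A)))  # 10
```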

Unit Vector

A vector with a norm of \(1\), i.e.: \(\sqrt{x \cdot x} = 1\).
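Any nonzero vector can be turned into a unit vector by dividing by its norm; a minimal sketch (the example vector is illustrative):

```python
import math

def normalize(v):
    # Divide each component by the Euclidean norm to get a unit vector.
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

u = normalize([3.0, 4.0])              # norm of [3, 4] is 5
print(u)                               # [0.6, 0.8]
print(math.sqrt(sum(x * x for x in u)))  # 1.0, as required
```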