Week 2A: Powers of the adjacency matrix

A social media website (like LinkedIn) may suggest new friends or connections to connect with. It is possible to use matrix algebra to analyze the "proximity" of people in a complex network of connections. Consider the example network below of six people, where lines (edges) are drawn between persons (nodes) to indicate the existing connections.

An example network between six entities

In a first step, we can characterize this network by its adjacency matrix $A$, a $6\times 6$ matrix where each element $A_{i,j}$ is 1 if there is a connection between persons $i$ and $j$, and 0 if there is not. As such, the adjacency matrix is a symmetric matrix. For the graph above, the adjacency matrix is,

A = \begin{pmatrix} 0 & 1 & 0 & 0 & 1 & 0\\ 1 & 0 & 1 & 0 & 1 & 0\\ 0 & 1 & 0 & 1 & 0 & 0\\ 0 & 0 & 1 & 0 & 1 & 1\\ 1 & 1 & 0 & 1 & 0 & 0\\ 0 & 0 & 0 & 1 & 0 & 0\end{pmatrix}.
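As a quick check, we can enter this matrix in Octave (a minimal sketch, with the entries typed in by hand from the graph above) and verify that it is indeed symmetric:

A = [0, 1, 0, 0, 1, 0;
     1, 0, 1, 0, 1, 0;
     0, 1, 0, 1, 0, 0;
     0, 0, 1, 0, 1, 1;
     1, 1, 0, 1, 0, 0;
     0, 0, 0, 1, 0, 0];   % adjacency matrix of the network above

isequal (A, A')           % prints: ans = 1, the matrix equals its transpose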

Put differently, $A^1$ gives the number of paths of length 1 (one edge) between two persons. Let's say we broaden our scope and look at the number of paths between two persons that have a length of two edges. This means counting the number of persons with which both have a direct connection. Say we want to find out how many paths of length 2 there are between nodes 2 and 4; we can inspect the second and fourth rows of the matrix,

\begin{pmatrix} &0 & 1 & 0 & 0 & 1 & 0\\ \rightarrow &1 & 0 & \underline{1} & 0 & \underline{1} & 0\\ &0 & 1 & 0 & 1 & 0 & 0\\ \rightarrow &0 & 0 & \underline{1} & 0 & \underline{1} & 1\\ &1 & 1 & 0 & 1 & 0 & 0\\ &0 & 0 & 0 & 1 & 0 & 0\end{pmatrix}.

The 3rd and 5th columns have 1s in both rows, indicating a shared one-edge connection, and thus there are two possible paths from node 2 to node 4 over two edges (via node 3 or via node 5). Given the symmetry of the matrix, we could also have compared the 2nd row to the 4th column instead of the 4th row.

\begin{pmatrix} & & & & \downarrow & &\\ &0 & 1 & 0 & 0 & 1 & 0\\ \rightarrow &1 & 0 & \underline{1} & 0 & \underline{1} & 0\\ &0 & 1 & 0 & \underline{1} & 0 & 0\\ &0 & 0 & 1 & 0 & 1 & 1\\ &1 & 1 & 0 & \underline{1} & 0 & 0\\ &0 & 0 & 0 & 1 & 0 & 0\end{pmatrix}.

You can now see that finding the number of 2-edge paths is nothing but a row-column multiplication! As such, we can multiply $A$ with itself to get $A^2$, whose entries list the number of two-edge connections between two nodes.
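Continuing the Octave sketch from above (reusing the hand-typed matrix A), we can let the computer do this multiplication and confirm the count we just found by inspection:

A2 = A*A;    % entry (i,j) counts the 2-edge paths between nodes i and j
A2(2, 4)     % prints: ans = 2, the paths via node 3 and via node 5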

We can repeat the same argument for the 3-edge connections encoded in $A^3$. For the given graph it is,

A^3 = \begin{pmatrix} 2&4& 2& 2& 4& 1\\ 4& 2& 5& 1& 6& 2\\ 2& 5& 0& 5& 1& 0\\ 2& 1& 5& 0& 6& 3\\ 4& 6& 1& 6& 2& 0\\ 1& 2& 0& 3& 0& 0\end{pmatrix}.

We see that this matrix has non-zero diagonal entries. These correspond to paths from a node back to itself via three edges, i.e. triangles. Such triangular paths always come in multiples of 6: each triangle can be traversed starting from any of its three nodes, in either a left or a right orientation. As such, we can sum the entries on the diagonal and divide by six to get the number of triangles in the network. Using the trace ($\text{Tr}(\dots)$) we can write for the number of triangles in a network ($N$),

N = \frac{\text{Tr}(A^3)}{6}.
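For the matrix above the diagonal of $A^3$ reads $2, 2, 0, 0, 2, 0$, so $\text{Tr}(A^3) = 6$ and $N = 1$: the single triangle between nodes 1, 2 and 5. In Octave (again reusing A from the sketch above):

N = trace (A^3) / 6    % prints: N = 1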

Thank you, matrix algebra, for this neat fact!

Week 2A: Polynomial-line fitting revisited

Consider three data points $\{x, y\}$ with specific $x$ coordinates but with variable $y$ values, given by $\{-1, y_{-1}\}$, $\{0, y_0\}$ and $\{1, y_1\}$. We can again try to fit a parabolic function,

y(x) = ax^2 + bx + c.

Evaluating $y(x)$ at the three $x$ coordinates gives the matrix equation,

A\begin{pmatrix}a\\b\\c\end{pmatrix} = \begin{pmatrix}1&-1&1\\ 0 & 0 &1\\ 1&1&1\end{pmatrix} \begin{pmatrix}a\\b\\c\end{pmatrix} = \begin{pmatrix}y_{-1}\\y_0\\y_1\end{pmatrix}.

or in short-hand notation,

A\mathbf{a} = \mathbf{y}.

We can rewrite the equation by left-multiplying with the matrix inverse,

\mathbf{a} = A^{-1}\mathbf{y}.

Computing the inverse and substituting gives,

\begin{pmatrix}a\\b\\c\end{pmatrix} = \begin{pmatrix} \frac{1}{2}&-1&\frac{1}{2}\\ -\frac{1}{2}&0&\frac{1}{2}\\ 0&1&0 \end{pmatrix}\begin{pmatrix}y_{-1}\\y_0\\y_1\end{pmatrix}.
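We can let Octave reproduce this inverse (a one-line check, using the same matrix as in the movie script below):

inv ([1, -1, 1; 0, 0, 1; 1, 1, 1])   % prints the matrix above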

Instead of doing the Gaussian elimination procedure for every column vector $\mathbf{y}$, we can now find the polynomial coefficients $\mathbf{a}$ using a simple matrix multiplication. We can apply this in Octave to generate the frames of the following movie.

A = [1, -1, 1; 0, 0, 1; 1, 1, 1];   % fit matrix for x = -1, 0, 1
Ai = inv(A);                        % invert once, reuse for every frame

for i = 1:200
  t = 2*pi*i/200;                            % animation parameter
  y = [sin(t - 1); cos(t + 1); sin(2*t)];    % moving y values of the data points
  a = Ai*y;                                  % polynomial coefficients
  clf
  fplot (@(x) polyval(a, x), [-1.1, 1.1], 'r', 'linewidth', 2);
  hold on
  scatter ([-1; 0; 1], y, 'filled')
  axis ([-1.1, 1.1, -2, 2])                  % fixed axes between frames
  drawnow
end

Finite difference

If we can assume that the data is part of some continuous signal $f(x)$, it turns out to be quite reasonable to guess that locally this function can be approximated with a polynomial. Say we wish to find the second derivative at $x = 0$; we can differentiate our polynomial twice,

\frac{\mathrm{d}^2f}{\mathrm{d}x^2}\bigg\rvert_{x=0}\approx\frac{\mathrm{d}^2y}{\mathrm{d}x^2}\bigg\rvert_{x=0} = 2a.

By reading off the first row of the matrix equation, $a = \frac{1}{2}y_{-1} - y_0 + \frac{1}{2}y_1$, we can thus conclude that:

\frac{\mathrm{d}^2f}{\mathrm{d}x^2}\bigg\rvert_{x=0}\approx y_{-1}-2y_0+y_1.
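As a quick numerical check (a small sketch; note that our formula assumes the unit spacing of the points $x = -1, 0, 1$, and for a general spacing $h$ the standard three-point stencil divides the same sum by $h^2$), we can try it on $f(x) = e^x$, for which $f''(0) = 1$:

% Three-point stencil for f''(0) applied to f(x) = exp(x), where f''(0) = 1.
% For points spaced h apart the sum is divided by h^2 (h = 1 matches the text).
for h = [1, 0.1, 0.01]
  d2 = (exp(-h) - 2*exp(0) + exp(h)) / h^2;
  printf ("h = %4.2f: approx f''(0) = %.6f\n", h, d2)
end

The estimate improves rapidly as the points move closer together: the loop prints roughly 1.086, 1.0008 and 1.00000083.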

The process of estimating derivatives from function values at given points is the basis of the finite difference method, which is used for all kinds of numerical models in science and engineering.

Week 2B: The vector product (cross product)

Consider two vectors expressed in a Cartesian basis, $\vec{a} = a_x\vec{e}_x + a_y\vec{e}_y + a_z\vec{e}_z$ and $\vec{b} = b_x\vec{e}_x + b_y\vec{e}_y + b_z\vec{e}_z$. We can write their vector product $\vec{c} = \vec{a} \times \vec{b}$ as the determinant of a carefully constructed matrix,

\vec{c} = \vec{a} \times \vec{b} = \begin{vmatrix} \vec{e}_x&\vec{e}_y&\vec{e}_z\\ a_x&a_y&a_z\\ b_x&b_y&b_z\end{vmatrix} = (a_yb_z - a_zb_y)\vec{e}_x + (a_zb_x - a_xb_z)\vec{e}_y + (a_xb_y - a_yb_x)\vec{e}_z.

Thank you, determinant, for the convenient mnemonic!
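As a final check (a minimal sketch using Octave's built-in cross function), the component formula reproduces, for instance, $\vec{e}_x \times \vec{e}_y = \vec{e}_z$:

a = [1, 0, 0];   % e_x
b = [0, 1, 0];   % e_y
cross (a, b)     % prints: ans = 0 0 1, i.e. e_z
% writing the determinant formula out by hand gives the same components:
[a(2)*b(3) - a(3)*b(2), a(3)*b(1) - a(1)*b(3), a(1)*b(2) - a(2)*b(1)]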

Continue to the examples for the third week…
