In this essay we will talk about graphs and a way to represent them via matrices; we will then study some elementary properties of these matrices. We start our discussion off with some fundamental definitions from graph theory.
Graph
A graph is an ordered pair of sets $G = (V, E)$ where
$$E \subseteq \{\{u, v\} : u, v \in V \text{ and } u \neq v\}.$$
We call $E$ the set of edges and $V$ the set of vertices. Also note that edges don't have an order to them, as we know that $\{u, v\} = \{v, u\}$. Sometimes we want them to have a direction, and in that case we have this definition:
Directed Graph
A directed graph is an ordered pair of sets $G = (V, E)$ where
$$E \subseteq \{(u, v) : u, v \in V \text{ and } u \neq v\}.$$
Throughout this entire discussion we will assume our graphs to be finite, unless otherwise stated.
When working with graphs it's useful to talk about the way vertices are related to each other. The first relation we'll look at is when two vertices have an edge between them:
Neighboring Vertices
Given a graph $G = (V, E)$ and $u, v \in V$, we say that $u, v$ are neighbors if $\{u, v\} \in E$.
Set of Neighbors of a Vertex
Given a graph $G = (V, E)$ and $v \in V$, we denote the set of neighbors of $v$ as
$$N(v) = \{u \in V : \{u, v\} \in E\}.$$
Degree of a Vertex
Given a graph $G = (V, E)$ and $v \in V$, we denote the degree of $v$ as
$$\deg(v) = |N(v)|.$$
With these definitions in place we can define matrices that encode a graph entirely:
Adjacency Matrix
Given a graph $G = (V, E)$ with $V = \{v_1, \ldots, v_n\}$, we denote the adjacency matrix of the graph by $A(G)$, where
$$A(G)_{i,j} = \begin{cases} 1 & \text{if } \{v_i, v_j\} \in E \\ 0 & \text{otherwise.} \end{cases}$$
Laplacian Matrix
Given a graph $G = (V, E)$ with $V = \{v_1, \ldots, v_n\}$, we denote the Laplacian matrix of the graph by $L(G)$, where
$$L(G)_{i,j} = \begin{cases} \deg(v_i) & \text{if } i = j \\ -1 & \text{if } \{v_i, v_j\} \in E \\ 0 & \text{otherwise.} \end{cases}$$
Note that both the adjacency matrix and the Laplacian iterate through all possible pairs of vertices; that is to say, the shape of both of these matrices is $n \times n$, where $n = |V|$.
Note that if we focus on a row we are looking at elements of the form $L(G)_{i,j}$ where $i$ is fixed; therefore for the diagonal element of that row we store the degree of $v_i$, and for each other element in that row we count each neighbor with a $-1$, and therefore the sum of each row is zero.
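The row-sum observation can be checked numerically. Here is a small sketch (assuming numpy is available; the 4-cycle used as an example graph is my own choice, with 0-based vertex indices):

```python
import numpy as np

# edges of a 4-cycle, written as unordered pairs of 0-based vertex indices
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n = 4

A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1      # edges are unordered, so A is symmetric

D = np.diag(A.sum(axis=1))     # diagonal matrix of vertex degrees
L = D - A                      # Laplacian: degree on the diagonal, -1 per neighbor

print(L.sum(axis=1))           # every row sums to 0
```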
A Vertex Incident to an Edge
Given a graph $G = (V, E)$ and some $v \in V$, we say it is incident to an edge $e \in E$ if $v \in e$.
Walk
A walk on a graph is an alternating sequence of vertices and edges, beginning and ending with a vertex, such that each edge is incident with the vertex directly before it and the vertex directly after it.
Note that when we have a walk between two vertices $u$ and $v$ we call it a $u$-$v$ walk; moreover a walk from $u$ to $v$ is different from a walk from $v$ to $u$. Now we can connect a few of the above ideas together.
The Entries of the Power of an Adjacency Matrix Represent Walks
Given a graph $G$ where $V = \{v_1, \ldots, v_n\}$, with adjacency matrix $A$, then for any $k \geq 1$ the entry $(A^k)_{i,j}$ of the matrix $A^k$ equals the number of walks of length $k$ from $v_i$ to $v_j$.
We go by induction. For $k = 1$ the $k$-th power is just the regular adjacency matrix; additionally, each entry $A_{i,j}$ represents whether there is an edge between the two vertices, which is the same as the number of walks from $v_i$ to $v_j$ of length $1$.

Now suppose that the statement holds true for some $k \geq 1$; then let's prove that it holds true for $k + 1$. Since it holds true for $k$, we know that $A^k$ has the desired property, but note that $A^{k+1} = A^k A$.

If we denote the elements of $A^{k+1}$ by $b_{i,j}$, those of $A^k$ by $c_{i,j}$ and those of $A$ by $a_{i,j}$, then by the definition of matrix multiplication
$$b_{i,j} = \sum_{l=1}^{n} c_{i,l}\, a_{l,j};$$
by our induction hypothesis we know that $c_{i,l}$ is the number of walks of length $k$ from $v_i$ to $v_l$, and by the regular definition of $A$ we know that $a_{l,j}$ represents whether or not there is an edge connecting $v_l$ and $v_j$, that is, the value is either $0$ or $1$.

If $a_{l,j} = 1$ then we can take each possible walk of length $k$ from $v_i$ to $v_l$ and then extend the walk to $v_j$ via the edge $\{v_l, v_j\}$; note that all walks of length $k + 1$ from $v_i$ to $v_j$ are of this form for some $l$, therefore $b_{i,j}$ represents the total number of walks of length $k + 1$ from $v_i$ to $v_j$, as needed. Therefore the statement holds true by induction.
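The walk-counting claim can be verified by brute force on a small example. A sketch assuming numpy is available (the graph and the choice $k = 3$ are my own):

```python
import numpy as np
from itertools import product

# a triangle with a pendant vertex, 0-based indices
edges = [(0, 1), (1, 2), (2, 0), (2, 3)]
n = 4
A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = A[j, i] = 1

def count_walks(i, j, k):
    # enumerate every choice of k - 1 intermediate vertices and keep the
    # sequences where consecutive vertices are joined by an edge
    total = 0
    for mids in product(range(n), repeat=k - 1):
        seq = (i, *mids, j)
        if all(A[a, b] for a, b in zip(seq, seq[1:])):
            total += 1
    return total

Ak = np.linalg.matrix_power(A, 3)
assert all(Ak[i, j] == count_walks(i, j, 3) for i in range(n) for j in range(n))
```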
Laplacian of an Edge
Suppose $G = (V, E)$ and $e = \{v_i, v_j\} \in E$; then we define the Laplacian for the edge $e$ as $L_e$, where
$$(L_e)_{a,b} = \begin{cases} 1 & \text{if } a = b \text{ and } v_a \in e \\ -1 & \text{if } \{a, b\} = \{i, j\} \\ 0 & \text{otherwise.} \end{cases}$$
The above matrix is mostly zeros; along the diagonal, an element $(L_e)_{a,a}$ is a $1$ if $v_a$ is part of the edge $e$, and there will be exactly two off-diagonal entries, $(L_e)_{i,j}$ and $(L_e)_{j,i}$, where the value is $-1$, based on the definition.
The Laplacian Is the Sum of Edge Laplacians
TODO: Add the proof here.
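Even with the proof still to be written, the claim $L(G) = \sum_{e \in E} L_e$ is easy to sanity-check numerically. A sketch assuming numpy is available (the example graph is my own choice):

```python
import numpy as np

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n = 4

def edge_laplacian(i, j, n):
    # the Laplacian of a single edge {i, j}: two 1's on the diagonal
    # at the endpoints, and -1 in the two off-diagonal positions
    Le = np.zeros((n, n))
    Le[i, i] = Le[j, j] = 1
    Le[i, j] = Le[j, i] = -1
    return Le

A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1
L = np.diag(A.sum(axis=1)) - A

# summing the edge Laplacians over all edges recovers the full Laplacian
assert np.array_equal(sum(edge_laplacian(i, j, n) for i, j in edges), L)
```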
Quadratic Form of the Edge Laplacian Is a Squared Difference
For any $x \in \mathbb{R}^n$ and edge $e = \{v_i, v_j\}$ we have
$$x^T L_e x = (x_i - x_j)^2.$$
TODO: Add the proof here.
Positive Semi-definite
Given an $n \times n$ matrix $M$, it is positive semi-definite if for any $x \in \mathbb{R}^n$ we have $x^T M x \geq 0$.
The Laplacian Is Positive-semidefinite
As per title.
TODO: Add the proof here.
Every Eigenvalue of the Laplacian Is Non-negative
As per title.
Suppose that $\lambda$ is an eigenvalue, and $v$ an eigenvector for $\lambda$ (non-zero). We already know that $v^T L v \geq 0$, but also note
$$v^T L v = v^T (\lambda v) = \lambda\, v^T v = \lambda \|v\|^2.$$
Since $v \neq 0$, we have $\|v\|^2 > 0$; therefore we must have that $\lambda \geq 0$, as needed.
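This non-negativity is easy to observe numerically. A sketch assuming numpy is available, on a random graph of my own construction:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
# random symmetric adjacency matrix with zero diagonal
A = (rng.random((n, n)) < 0.4).astype(float)
A = np.triu(A, 1)
A = A + A.T
L = np.diag(A.sum(axis=1)) - A

eigenvalues = np.linalg.eigvalsh(L)   # eigvalsh is for symmetric matrices
print(eigenvalues.min())              # >= 0 up to floating-point error
```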
By the real spectral theorem we now deduce that $\mathbb{R}^n$ has an orthonormal basis consisting of eigenvectors of $L$; in that case denote them as $v_1, \ldots, v_n$, with corresponding eigenvalues $\lambda_1 \leq \lambda_2 \leq \cdots \leq \lambda_n$.
Path
A path is a non-empty graph $P = (V, E)$ of the form $V = \{v_0, v_1, \ldots, v_k\}$ and $E = \{\{v_0, v_1\}, \{v_1, v_2\}, \ldots, \{v_{k-1}, v_k\}\}$, where the vertices $v_i$ are all distinct.
In other words, a path is just a walk such that you don't go over the same vertex twice.
Connected
Given a non-empty graph $G$, we say that it is connected if given any two vertices in $G$ there is a path between them.
We'll now start our exploration of the eigenvalues of the Laplacian and what they tell us about the original graph.
0 Is an Eigenvalue for the Laplacian
Let $G$ be a graph; then $0$ is an eigenvalue of $L(G)$.
Consider $\mathbf{1} = (1, 1, \ldots, 1)^T$; then note that the column matrix $L\mathbf{1}$ has $i$-th entry $\sum_{j=1}^{n} L_{i,j}$; equivalently this is the sum of a row of the Laplacian, and we've already mentioned that this value is $0$, showing that $L\mathbf{1} = 0 = 0 \cdot \mathbf{1}$, and therefore $0$ is an eigenvalue of $L$.
Here we have the first connection with the connectivity of a graph. Note that this is quite an interesting property to have: since computers operate quickly on matrices, studying a graph through the eigenvalues of its Laplacian can provide a speedup.
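The kernel computation above can be replayed numerically; a sketch assuming numpy is available, with an example graph of my own:

```python
import numpy as np

edges = [(0, 1), (1, 2), (2, 0), (2, 3)]
n = 4
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1
L = np.diag(A.sum(axis=1)) - A

ones = np.ones(n)
print(L @ ones)   # the zero vector, so L·1 = 0·1 and 0 is an eigenvalue
```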
There Is Only One Zero Eigen Value for the Laplacian of a Connected Graph
Let $G$ be a connected graph; then $0$ is an eigenvalue of $L(G)$, and all other eigenvalues of $L(G)$ are positive.
Let $v$ be a non-zero eigenvector for the eigenvalue $0$; then note that $v^T L v = 0$ and also that $v^T L v = \sum_{\{i, j\} \in E} (v_i - v_j)^2$; in other words, for each $\{i, j\} \in E$ we have $v_i = v_j$.

Now since $G$ is connected, we know that given any $u, w \in V$ there is a path connecting $u$ to $w$; this means that for each edge in this path the entries of $v$ at the two incident vertices are equal by what was just discussed, and therefore by transitivity of equality we see that $v_u = v_w$; therefore we conclude that $v \in \operatorname{span}(\mathbf{1})$, so the eigenspace associated with the eigenvalue of $0$ is $\operatorname{span}(\mathbf{1})$.

From there we deduce that the multiplicity of the eigenvalue $0$ is equal to $1$; thus we must have that $\lambda_2 > 0$, for if it were equal to zero, then since as before the $v_1, \ldots, v_n$ form an orthonormal basis of eigenvectors, $v_1$ and $v_2$ would be two linearly independent eigenvectors for $0$, so the multiplicity would be at least $2$, a contradiction. Therefore $\lambda_2 > 0$, that is, every eigenvalue other than $\lambda_1 = 0$ is positive, as needed.
Subgraph
Suppose $G = (V, E)$ is a graph; then a subgraph of $G$ is a graph $H = (V', E')$ such that $V' \subseteq V$ and $E' \subseteq E$.
In other words a subgraph is a graph that's already a part of the original graph.
Connected Component
A connected component of a graph $G$ is a subgraph $H = (V', E')$ such that any two vertices of $H$ are connected in $H$, but also that for any $u \in V'$ and $v \in V \setminus V'$ we have that $u, v$ are not connected in $G$.
A connected component is a part of the graph such that it itself is connected, but for any vertex outside of it there is no way to get there; so a graph with 3 connected components would look like a graph with three parts to it, but no way to get from one part to another.
The Geometric Multiplicity of the Eigenvalue Zero of the Laplacian Is the Number of Connected Components
Let $G$ be a graph; then the geometric multiplicity of $0$ for $L(G)$ is equal to the number of connected components in $G$.
Suppose that $H_m$ for $1 \leq m \leq k$, for some $k \geq 1$, are the connected components of a graph $G$; then we construct vectors of the form $w_m$ holding information about which vertices from the original graph exist in the component, that is:
$$(w_m)_i = \begin{cases} 1 & \text{if } v_i \in H_m \\ 0 & \text{otherwise.} \end{cases}$$

Now given a non-zero eigenvector $v$ of $0$, we know that $v_i = v_j$ whenever $v_i, v_j$ are in the same connected component, from the last proposition; therefore we deduce that the eigenspace for $0$ is $\operatorname{span}(w_1, \ldots, w_k)$. By construction, the collection $w_1, \ldots, w_k$ is linearly independent, so the dimension of the eigenspace is $k$; that is to say, the geometric multiplicity of the eigenvalue $0$ for $L(G)$ is the number of connected components.
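This correspondence can be checked numerically. A sketch assuming numpy is available, on a graph I built from three disjoint pieces (a triangle, a single edge, and an isolated vertex):

```python
import numpy as np

edges = [(0, 1), (1, 2), (2, 0),   # component 1: a triangle
         (3, 4)]                   # component 2: a single edge
n = 6                              # vertex 5 is isolated: component 3

A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1
L = np.diag(A.sum(axis=1)) - A

eigenvalues = np.linalg.eigvalsh(L)
zero_multiplicity = int(np.sum(np.abs(eigenvalues) < 1e-9))
print(zero_multiplicity)   # 3, one per connected component
```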
Now we explore the eigenvalues and eigenvectors of some known graphs.
Complete
We say that a graph $G = (V, E)$ is complete whenever $|V| = n$ and $E = \{\{u, v\} : u, v \in V \text{ and } u \neq v\}$. In that case we refer to $G$ as $K_n$.
The Eigenvalues of the Laplacian of the Complete Graph
The Laplacian of $K_n$ has $0$ as an eigenvalue with geometric multiplicity $1$, and $n$ as an eigenvalue with geometric multiplicity $n - 1$.
We already know that since there is one connected component, we should have that the geometric multiplicity of $0$ is $1$.
Now since each vertex is connected to the $n - 1$ other vertices, we know that the Laplacian of $K_n$ is such that
$$L(K_n) = nI - J,$$
where $J$ is the all-ones matrix; therefore $L(K_n) - nI = -J$ is a matrix of all $-1$'s, which has rank one and is not invertible, thus $n$ is an eigenvalue of the Laplacian of $K_n$. Finally, by the rank-nullity theorem we conclude that $\dim \ker(L(K_n) - nI) = n - 1$, so the eigenvalue $n$ has multiplicity $n - 1$.
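A quick numerical confirmation, assuming numpy is available (with $n = 5$ as my example):

```python
import numpy as np

n = 5
A = np.ones((n, n)) - np.eye(n)   # complete graph: every pair is adjacent
L = np.diag(A.sum(axis=1)) - A    # equals n*I - J, with J the all-ones matrix

eigenvalues = np.linalg.eigvalsh(L)   # sorted in ascending order
print(eigenvalues)                    # smallest is 0, the other n - 1 equal n
```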
Cycle Graph
The cycle graph on $n$ vertices, denoted by $C_n$, is a graph where $V = \{v_1, \ldots, v_n\}$ and $E = \{\{v_i, v_{i+1}\} : 1 \leq i < n\} \cup \{\{v_n, v_1\}\}$.
Laplacian of the Cycle Graph
TODO: Add the proof here.
An Eigenvector of the Laplacian of the Cycle Graph Yields More Eigenvectors by Its Cyclic Permutations
Suppose that $v$ is an eigenvector of $L(C_n)$ with eigenvalue $\lambda$; then for any $w$ which is a cyclic permutation of $v$, $w$ is also an eigenvector of $L(C_n)$ for $\lambda$.
If $L v = \lambda v$ then we have a series of $n$ equations; let us label these equations:

(Equation 1): $2v_1 - v_2 - v_n = \lambda v_1$; also set $v_0 = v_n$ and $v_{n+1} = v_1$ for convenience.

(Equation $n$): $2v_n - v_{n-1} - v_1 = \lambda v_n$.

For each $1 < i < n$ we have (Equation $i$): $2v_i - v_{i-1} - v_{i+1} = \lambda v_i$.
Now suppose that $w$ is a cyclic permutation of $v$, i.e., there is a shift $s$ such that $w_i = v_{i+s}$, with indices taken mod $n$.

We want to prove that $w$ is an eigenvector for $\lambda$; thus we must verify that $L w = \lambda w$, but that generates another $n$ equations similar to those above. Note that the $i$-th equation generated by this situation, of the form
$$2w_i - w_{i-1} - w_{i+1} = \lambda w_i,$$
is equivalent to this equation:
$$2v_{i+s} - v_{i+s-1} - v_{i+s+1} = \lambda v_{i+s},$$
which holds true due to our original equations from $L v = \lambda v$; thus, since all $n$ equations hold true, $w$ is an eigenvector.
Laplacian of the Cycle Graph Has Trigonometric Eigenvalues
The Laplacian of $C_n$ has eigenvalues given by $\lambda_k = 2 - 2\cos\left(\frac{2\pi k}{n}\right)$ for $0 \leq k \leq \lfloor n/2 \rfloor$, and associated eigenvectors of the form:
$$x_k = \left(\cos\frac{2\pi k j}{n}\right)_{j=0}^{n-1}, \qquad y_k = \left(\sin\frac{2\pi k j}{n}\right)_{j=0}^{n-1}.$$
Suppose that $\lambda$ is an eigenvalue of $L(C_n)$ and $v$ an eigenvector for that eigenvalue; then recall from the previous proof we have:

(Equation 1): $2v_1 - v_2 - v_n = \lambda v_1$; also set $v_0 = v_n$ and $v_{n+1} = v_1$ for convenience.

(Equation $n$): $2v_n - v_{n-1} - v_1 = \lambda v_n$.

For each $1 < i < n$ we have (Equation $i$): $2v_i - v_{i-1} - v_{i+1} = \lambda v_i$.
We know that if $w$ is a cyclic permutation of $v$, then $w$ is also an eigenvector for $\lambda$. This means that the $n$ cyclic shifts of $v$ are all eigenvectors of $\lambda$. However, the maximum dimension of the eigenspace of a non-zero $\lambda$ is $n - 1$, because we already knew that $0$ was an eigenvalue of multiplicity $1$. So there must be some constant $\omega$ such that:
$$v_{j+1} = \omega v_j \quad \text{for each } j \text{ (indices mod } n).$$

If $\omega$ happened to be one, then we would be saying that $v_1 = v_2 = \cdots = v_n$, i.e. $v$ is a multiple of $\mathbf{1}$; the relation means that we are able to multiply all the numbers in this vector by a constant, and it will "loop" them around, so we will assume that $v_j = \omega^j$ for some constant $\omega$, and see if we get any progress. That is, if we plug a potential solution of this form into the equations we have:
$$2\omega^j - \omega^{j-1} - \omega^{j+1} = \lambda \omega^j.$$

Dividing through by $\omega^j$ we obtain $2 - \omega^{-1} - \omega = \lambda$; moreover, applying the relation $v_{j+1} = \omega v_j$ all the way around the cycle gives $v_1 = \omega^n v_1$, so that $\omega^n = 1$, which would imply that $\omega = e^{2\pi i k / n}$ for some integer $k$ (where the imaginary number $i = \sqrt{-1}$). We see that $v^{(k)} = (\omega^0, \omega^1, \ldots, \omega^{n-1})$ with $\omega = e^{2\pi i k / n}$ is a solution to the above equations; in other words, each $k$ with $0 \leq k \leq n - 1$ (as assumed earlier) will produce a valid eigenvector.
For each of the eigenvectors, we can explicitly determine their eigenvalue; based on (Equation 1) we note that the associated eigenvalues are given by $\lambda_k = 2 - \omega - \omega^{-1} = 2 - 2\cos\frac{2\pi k}{n}$. We note that these values are real, but the eigenvectors may consist of complex coefficients. Writing our vector using Euler's formula, it is of the form:
$$v^{(k)}_j = \cos\frac{2\pi k j}{n} + i \sin\frac{2\pi k j}{n}.$$

Since $L v^{(k)} = \lambda_k v^{(k)}$ and both $L$ and $\lambda_k$ are real, this means that both the real part and the imaginary part of $v^{(k)}$ satisfy the same eigenvector equation, because $L$ consists of real entries. Thus we can conclude that $x_k = \operatorname{Re}(v^{(k)})$ and $y_k = \operatorname{Im}(v^{(k)})$ are both eigenvectors for $\lambda_k$. Then, for $0 \leq k \leq \lfloor n/2 \rfloor$,
$$L x_k = \lambda_k x_k \quad \text{and} \quad L y_k = \lambda_k y_k.$$
So the eigenvalues are $\lambda_k = 2 - 2\cos\frac{2\pi k}{n}$ with $0 \leq k \leq \lfloor n/2 \rfloor$.
The list above is exhaustive since we have formed $n$ independent eigenvectors corresponding to these eigenvalues:

When $n$ is odd, we have one eigenvector $x_0 = \mathbf{1}$ for $\lambda_0 = 0$, and two eigenvectors $x_k$ and $y_k$ for all $1 \leq k \leq \frac{n-1}{2}$.

When $n$ is even, we have one eigenvector $x_0 = \mathbf{1}$ for $\lambda_0 = 0$, two eigenvectors $x_k$ and $y_k$ for all $1 \leq k \leq \frac{n}{2} - 1$, and one eigenvector $x_{n/2} = (1, -1, 1, \ldots, -1)$ for $k = \frac{n}{2}$ ($\lambda_{n/2} = 4$).
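The trigonometric formula can be compared against a direct numerical computation. A sketch assuming numpy is available (with $n = 7$ as my example):

```python
import numpy as np

n = 7
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1   # cycle edges, wrapping around

L = np.diag(A.sum(axis=1)) - A
numeric = np.sort(np.linalg.eigvalsh(L))
# the formula over k = 0, ..., n - 1 lists each eigenvalue with the
# correct multiplicity, since k and n - k give the same value
formula = np.sort(2 - 2 * np.cos(2 * np.pi * np.arange(n) / n))
assert np.allclose(numeric, formula)
```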
Essay 2
Continuing on with what I learned in my first essay, I decided to look at these eigenvalues in more detail, making a detour through the proof of Courant-Fischer, which helps us obtain some bounds on these eigenvalues. Knowing this is important because the smallest non-zero eigenvalue of the Laplacian of a graph tells us how connected the graph is. We start with some preliminaries:
Rayleigh Quotient
Given a non-zero vector $x$ and a compatible matrix $M$, we define the Rayleigh quotient as
$$R_M(x) = \frac{x^T M x}{x^T x}.$$
The Rayleigh Quotient of an Eigenvector Is Its Eigenvalue
As per title.
If $M v = \lambda v$ with $v \neq 0$, then
$$R_M(v) = \frac{v^T M v}{v^T v} = \frac{\lambda\, v^T v}{v^T v} = \lambda.$$
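A quick numerical check of this fact, assuming numpy is available (the matrix is my own small example):

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 2.0]])             # symmetric, eigenvalues 1 and 3

eigenvalues, eigenvectors = np.linalg.eigh(M)   # ascending order
v = eigenvectors[:, 1]                 # eigenvector for the eigenvalue 3
rayleigh = (v @ M @ v) / (v @ v)
print(rayleigh)                        # 3.0, up to floating-point error
```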
Quadratic Form as a Summation of Eigenvalues
Let $M$ be a symmetric matrix with eigenvalues $\lambda_1, \ldots, \lambda_n$ and a corresponding orthonormal basis of eigenvectors $v_1, \ldots, v_n$. Let $x$ be a vector whose expansion in the eigenbasis is
$$x = \sum_{i=1}^n c_i v_i.$$
Then,
$$x^T M x = \sum_{i=1}^n \lambda_i c_i^2.$$
Note the following algebra, where $y = \sum_i c_i v_i$ and $z = \sum_j c_j v_j$ are temporary vector-valued variables (both equal to $x$) to help ease you into the next line:
$$x^T M x = y^T M z = \sum_{i,j} c_i c_j\, v_i^T M v_j.$$
But we know that
$$v_i^T M v_j = \lambda_j\, v_i^T v_j = \begin{cases} \lambda_i & \text{if } i = j \\ 0 & \text{otherwise,} \end{cases}$$
therefore
$$x^T M x = \sum_{i=1}^n \lambda_i c_i^2.$$
The Courant-Fischer Theorem tells us that the vectors that maximize the Rayleigh quotient are exactly the eigenvectors of the largest eigenvalue of the given matrix.
Courant Fischer
Let $M$ be an $n \times n$ symmetric matrix with eigenvalues $\lambda_1 \leq \lambda_2 \leq \cdots \leq \lambda_n$. Then,
$$\lambda_k = \min_{\dim(S) = k}\; \max_{\substack{x \in S \\ x \neq 0}} \frac{x^T M x}{x^T x} = \max_{\dim(T) = n - k + 1}\; \min_{\substack{x \in T \\ x \neq 0}} \frac{x^T M x}{x^T x},$$
where the maximization and minimization are over subspaces $S$ and $T$ of $\mathbb{R}^n$.
Let $v_1, \ldots, v_n$ be an orthonormal set of eigenvectors of $M$ corresponding to $\lambda_1 \leq \cdots \leq \lambda_n$. We will only prove that
$$\lambda_k = \max_{\dim(T) = n - k + 1}\; \min_{\substack{x \in T \\ x \neq 0}} \frac{x^T M x}{x^T x},$$
only because the other equality is symmetrically similar.
In order to prove that $\lambda_k$ is achievable, let $T$ be the span of $v_k, v_{k+1}, \ldots, v_n$. We can expand every $x \in T$ as
$$x = \sum_{i=k}^n c_i v_i.$$
Now since $\lambda_i \geq \lambda_k$ for $i \geq k$, we have that
$$\frac{x^T M x}{x^T x} = \frac{\sum_{i=k}^n \lambda_i c_i^2}{\sum_{i=k}^n c_i^2} \geq \lambda_k,$$
and thus a lower bound on the minimum value, that is:
$$\min_{\substack{x \in T \\ x \neq 0}} \frac{x^T M x}{x^T x} \geq \lambda_k.$$
To show that this is in fact the maximum, we will prove that for all subspaces $T$ of dimension $n - k + 1$,
$$\min_{\substack{x \in T \\ x \neq 0}} \frac{x^T M x}{x^T x} \leq \lambda_k.$$
Let $W$ be the span of $v_1, \ldots, v_k$. As $W$ has dimension $k$, every $T$ of dimension $n - k + 1$ has an intersection with $W$ of dimension at least 1, because $\dim(T) + \dim(W) = n + 1 > n$; therefore:
$$T \cap W \neq \{0\}.$$
Now recall any $x$ in $W$ may be expressed as
$$x = \sum_{i=1}^k c_i v_i,$$
and so for $x$ in $T \cap W$, since $\lambda_i \leq \lambda_k$ for $i \leq k$, we get:
$$\frac{x^T M x}{x^T x} = \frac{\sum_{i=1}^k \lambda_i c_i^2}{\sum_{i=1}^k c_i^2} \leq \lambda_k,$$
which shows that
$$\min_{\substack{x \in T \\ x \neq 0}} \frac{x^T M x}{x^T x} \leq \lambda_k,$$
so we conclude:
$$\lambda_k = \max_{\dim(T) = n - k + 1}\; \min_{\substack{x \in T \\ x \neq 0}} \frac{x^T M x}{x^T x}.$$
For our purposes we will use a more specific version of the theorem:
Courant Fischer Specific
Let $M$ be an $n \times n$ symmetric matrix and let $1 \leq k \leq n$. Let $\lambda_1 \leq \cdots \leq \lambda_n$ be the eigenvalues of $M$ and $v_1, \ldots, v_n$ be the corresponding orthonormal eigenvectors. Then,
$$\lambda_k = \min_{\substack{x \neq 0 \\ x \perp v_1, \ldots, v_{k-1}}} \frac{x^T M x}{x^T x} = \max_{\substack{x \neq 0 \\ x \perp v_{k+1}, \ldots, v_n}} \frac{x^T M x}{x^T x}.$$
TODO: Add the proof here.
With this new version of the theorem we can get a lower bound on the largest eigenvalue of the Laplacian of a graph. For the next theorem we will denote the $k$-th eigenvalue of the Laplacian of a graph $G$ by $\lambda_k(G)$; note that in the context of having $n$ eigenvalues, since they are ordered in a non-decreasing way, $\lambda_n$ is the largest eigenvalue.
The Largest Eigenvalue of the Laplacian of a Graph Is at Least as Large as the Degree of Any Vertex
Let $G = (V, E)$ be a graph with $|V| = n$ and $u \in V$. If $u$ has degree $d$, then
$$\lambda_n(G) \geq d.$$
By the Courant-Fischer Theorem,
$$\lambda_n = \max_{x \neq 0} \frac{x^T L x}{x^T x}.$$
By the combination of this corollary and this proposition we have
$$x^T L x = \sum_{\{i, j\} \in E} (x_i - x_j)^2.$$
Note that $e_u^T L e_u = \deg(u)$, because for each edge $\{i, j\}$ such that $u$ is incident to that edge, i.e. $u \in \{i, j\}$, we know that one of $(e_u)_i, (e_u)_j$ is $1$ and the other is $0$, and thus $((e_u)_i - (e_u)_j)^2 = 1$, while every other edge contributes $0$; on the other hand, the number of edges such that $u$ is incident to them is simply the degree of $u$, thus said equation holds true.

Now using the information gathered in the last paragraph, recall the standard basis $e_1, \ldots, e_n$ and consider $e_u$ for our vertex $u$ of degree $d$; we have:
$$\lambda_n = \max_{x \neq 0} \frac{x^T L x}{x^T x} \geq \frac{e_u^T L e_u}{e_u^T e_u} = \frac{d}{1} = d.$$

So $\lambda_n(G) \geq d$.
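This degree lower bound is easy to test numerically. A sketch assuming numpy is available, on a random graph of my own construction:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10
A = (rng.random((n, n)) < 0.3).astype(float)
A = np.triu(A, 1)
A = A + A.T                        # random symmetric adjacency matrix
degrees = A.sum(axis=1)
L = np.diag(degrees) - A

largest = np.linalg.eigvalsh(L)[-1]   # eigvalsh sorts ascending
print(largest >= degrees.max())       # True
```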
Now we focus on the second smallest eigenvalue, whose importance we learned of in the first essay.
The Second Smallest Eigenvalue of the Laplacian of the Path Graph on N Vertices Is in the Big O of One Over N Squared
Let $P_n$ be the path graph on $n$ vertices; then
$$\lambda_2(P_n) = O\!\left(\frac{1}{n^2}\right).$$
We start by considering the vector $x$ such that $x_i = (n + 1) - 2i$; the reason why is so that it becomes perpendicular with $\mathbf{1}$, the vector of all ones:
$$\sum_{i=1}^n x_i = n(n + 1) - 2\sum_{i=1}^n i = n(n + 1) - 2 \cdot \frac{n(n + 1)}{2} = 0.$$
As seen earlier on, $\mathbf{1}$ spans the eigenspace for the eigenvalue $0$ (the path graph is connected). Then by Courant-Fischer we get an upper bound:
$$\lambda_2(P_n) = \min_{\substack{y \neq 0 \\ y \perp \mathbf{1}}} \frac{y^T L y}{y^T y} \leq \frac{x^T L x}{x^T x}.$$
We have also shown that:
$$x^T L x = \sum_{i=1}^{n-1} (x_i - x_{i+1})^2.$$
By the nature of $x$ we have that $x_i - x_{i+1} = 2$ for each $i$, so therefore:
$$x^T L x = \sum_{i=1}^{n-1} 2^2 = 4(n - 1).$$
The denominator is $x^T x = \sum_{i=1}^n (n + 1 - 2i)^2 = \frac{n(n^2 - 1)}{3}$, so we can conclude that
$$\lambda_2(P_n) \leq \frac{4(n - 1)}{n(n^2 - 1)/3} = \frac{12}{n(n + 1)} = O\!\left(\frac{1}{n^2}\right).$$
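The bound $\lambda_2(P_n) \leq \frac{12}{n(n+1)}$ can be checked numerically for a few sizes. A sketch assuming numpy is available:

```python
import numpy as np

for n in [4, 16, 64]:
    A = np.zeros((n, n))
    for i in range(n - 1):
        A[i, i + 1] = A[i + 1, i] = 1     # path edges v_i -- v_{i+1}
    L = np.diag(A.sum(axis=1)) - A
    lam2 = np.linalg.eigvalsh(L)[1]       # second smallest eigenvalue
    assert lam2 <= 12 / (n * (n + 1)) + 1e-9
    print(n, lam2)                        # shrinks on the order of 1/n^2
```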
This concludes my exploration of spectral graph theory for the moment.