
Squared distance matrices of trees with matrix weights

Pages 168-176 | Received 23 Jun 2023, Accepted 08 Jul 2023, Published online: 24 Jul 2023

Abstract

Let T be a tree on n vertices whose edge weights are positive definite matrices of order s. The squared distance matrix of T, denoted by $\Delta$, is the $ns\times ns$ block matrix with $\Delta_{ij}=d(i,j)^2$, where d(i, j) is the sum of the weights of the edges on the unique (i, j)-path. In this article, we obtain a formula for the determinant of $\Delta$ and find $\Delta^{-1}$ under some conditions.


1 Introduction

Let T be a tree with vertex set $V(T)=\{1,\ldots,n\}$ and edge set $E(T)=\{e_1,\ldots,e_{n-1}\}$. If two vertices i and j are adjacent, we write $i\sim j$. Let us assign an orientation to each edge of T. The usual distance between the vertices $i,j\in V(T)$, denoted by $d_u(i,j)$, is the length of the shortest path between them in T. Two edges $e_i=(p,q)$ and $e_j=(r,s)$ of T are similarly oriented, denoted by $e_i\parallel e_j$, if $d_u(p,r)=d_u(q,s)$; otherwise they are oppositely oriented, denoted by $e_i\nparallel e_j$. The edge orientation matrix $H=(h_{ij})$ of T is the $(n-1)\times(n-1)$ matrix whose rows and columns are indexed by the edges of T and whose entries are defined [Citation8] as
$$h_{ij}=\begin{cases}1 & \text{if } e_i\parallel e_j,\ i\neq j;\\ -1 & \text{if } e_i\nparallel e_j,\ i\neq j;\\ 1 & \text{if } i=j.\end{cases}$$

The incidence matrix Q of T is the $n\times(n-1)$ matrix whose rows are indexed by V(T) and whose columns are indexed by E(T). The entry of Q in row i and column $e_j$ is 1 if $e_j$ originates at i, $-1$ if $e_j$ terminates at i, and 0 if $e_j$ and i are not incident. We assume that the same orientation is used in defining the edge orientation matrix H and the incidence matrix Q.

The distance matrix of T, denoted by D(T), is the $n\times n$ matrix whose rows and columns are indexed by the vertices of T and whose entries are defined as follows: $D(T)=(d_{ij})$, where $d_{ij}=d_u(i,j)$. In [Citation8], the authors introduced the notion of the squared distance matrix $\Delta$, which is defined to be the Hadamard product $D\circ D$; that is, the (i, j)-th element of $\Delta$ is $d_{ij}^2$. For an unweighted tree T, the determinant of $\Delta$ is obtained in [Citation8], while the inverse and the inertia of $\Delta$ are considered in [Citation7]. In [Citation5], the author considered an extension of these results to a weighted tree in which each edge is assigned a positive scalar weight and found the determinant and inverse of $\Delta$. Recently, in [Citation9], the authors determined the inertia and energy of the squared distance matrix of a complete multipartite graph. They also characterized, among all complete t-partite graphs on n vertices, the graphs for which the spectral radius of the squared distance matrix and the squared distance energy are maximum and minimum, respectively.

In this article, we consider a weighted tree T on n vertices in which each edge weight is a positive definite matrix of order s. For $i,j\in V(T)$, the distance d(i, j) between i and j is the sum of the weight matrices of the edges in the unique (i, j)-path of T. Thus, the distance matrix $D=(d_{ij})$ of T is the block matrix of order $ns\times ns$ whose (i, j)-th block is $d_{ij}=d(i,j)$ if $i\neq j$, and is the $s\times s$ zero matrix if $i=j$. The squared distance matrix $\Delta$ of T is the $ns\times ns$ block matrix whose (i, j)-th block is equal to $d(i,j)^2$ if $i\neq j$, and is the $s\times s$ zero matrix if $i=j$. The Laplacian matrix $L=(l_{ij})$ of T is the $ns\times ns$ block matrix defined as follows: for $i,j\in V(T)$ with $i\neq j$, the (i, j)-th block is $l_{ij}=-(W(i,j))^{-1}$ if $i\sim j$, where W(i, j) is the matrix weight of the edge joining the vertices i and j, and is the zero matrix otherwise. For $i\in V(T)$, the (i, i)-th block of L is $\sum_{j\sim i}(W(i,j))^{-1}$.

In the context of the classical distance, matrix weights have been studied in [Citation2] and [Citation3]. The Laplacian matrix with matrix weights has been studied in [Citation2, Citation10] and [Citation11]. The resistance distance matrix and the product distance matrix with matrix weights have been considered in [Citation1] and [Citation6], respectively. In this article, we consider the squared distance matrix $\Delta$ of a tree T with matrix weights and find formulae for the determinant and inverse of $\Delta$, which generalize the results of [Citation5, Citation7, Citation8].

This article is organized as follows. In Section 2, we fix the necessary notation and state some preliminary results, which will be used in the subsequent sections. In Section 3, we establish relations between the squared distance matrix and the incidence, Laplacian, and distance matrices. In Sections 4 and 5, we obtain formulae for the determinant and the inverse of $\Delta$, respectively.

2 Notations and preliminary results

In this section, we define some useful notations and state some known results which will be needed to prove our main results.

The $n\times 1$ column vector with all ones and the identity matrix of order n are denoted by $\mathbf{1}_n$ and $I_n$, respectively. Let J denote the matrix of appropriate size with all entries equal to 1. The transpose of a matrix A is denoted by $A^{\top}$. Let A be an $n\times n$ matrix partitioned as
$$A=\begin{bmatrix}A_{11} & A_{12}\\ A_{21} & A_{22}\end{bmatrix},$$
where $A_{11}$ and $A_{22}$ are square matrices. If $A_{11}$ is nonsingular, then the Schur complement of $A_{11}$ in A is defined as $A_{22}-A_{21}A_{11}^{-1}A_{12}$. The following is the well-known Schur complement formula: $\det A=(\det A_{11})\det(A_{22}-A_{21}A_{11}^{-1}A_{12})$. The Kronecker product of two matrices $A=(a_{ij})_{m\times n}$ and $B=(b_{ij})_{p\times q}$, denoted by $A\otimes B$, is defined to be the $mp\times nq$ block matrix $[a_{ij}B]$. It is known that for matrices A, B, C and D, $(A\otimes B)(C\otimes D)=AC\otimes BD$, whenever the products AC and BD are defined. Also $(A\otimes B)^{-1}=A^{-1}\otimes B^{-1}$, if A and B are nonsingular. Moreover, if A and B are $n\times n$ and $p\times p$ matrices, respectively, then $\det(A\otimes B)=(\det A)^{p}(\det B)^{n}$. For more details about the Kronecker product, we refer to [Citation12].
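For concreteness, the following NumPy sketch (ours, not part of the original text) checks the three Kronecker product identities above on randomly generated matrices; the sizes n = 3 and p = 2 are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 3, 2
    A = rng.standard_normal((n, n))
    B = rng.standard_normal((p, p))
    C = rng.standard_normal((n, n))
    D = rng.standard_normal((p, p))

    # (A x B)(C x D) = AC x BD
    assert np.allclose(np.kron(A, B) @ np.kron(C, D), np.kron(A @ C, B @ D))

    # (A x B)^{-1} = A^{-1} x B^{-1} for nonsingular A and B
    assert np.allclose(np.linalg.inv(np.kron(A, B)),
                       np.kron(np.linalg.inv(A), np.linalg.inv(B)))

    # det(A x B) = (det A)^p (det B)^n
    assert np.isclose(np.linalg.det(np.kron(A, B)),
                      np.linalg.det(A) ** p * np.linalg.det(B) ** n)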

Let H be the edge orientation matrix and Q be the incidence matrix of the underlying unweighted tree with an orientation assigned to each edge. The edge orientation matrix of a weighted tree whose edge weights are positive definite matrices of order s is defined by replacing 1 and $-1$ by $I_s$ and $-I_s$, respectively. The incidence matrix of a weighted tree is defined in a similar way. That is, for the matrix weighted tree T, the edge orientation matrix and the incidence matrix are defined as $H\otimes I_s$ and $Q\otimes I_s$, respectively. For a block matrix A, let $A_{ij}$ denote the (i, j)-th block of A.

Now we introduce some more notations. Let T be a tree with vertex set $V(T)=\{1,\ldots,n\}$ and edge set $E(T)=\{e_1,\ldots,e_{n-1}\}$. Let $W_i$ be the edge weight matrix associated with the edge $e_i$ of T, $i=1,2,\ldots,n-1$. Let $\delta_i$ be the degree of the vertex i and set $\tau_i=2-\delta_i$ for $i=1,2,\ldots,n$. Let $\tau$ be the $n\times 1$ vector with components $\tau_1,\ldots,\tau_n$ and let $\tilde{\tau}$ be the diagonal matrix with diagonal entries $\tau_1,\tau_2,\ldots,\tau_n$. Let $\hat{\delta}_i$ be the matrix weighted degree of i, defined as $\hat{\delta}_i=\sum_{j:\,j\sim i}W(i,j)$, $i=1,\ldots,n$.

Let $\hat{\delta}$ be the $ns\times s$ block matrix with blocks $\hat{\delta}_1,\ldots,\hat{\delta}_n$. Let F be the block diagonal matrix with diagonal blocks $W_1,W_2,\ldots,W_{n-1}$. It can be verified that $L=(Q\otimes I_s)F^{-1}(Q\otimes I_s)^{\top}$.
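The factorization $L=(Q\otimes I_s)F^{-1}(Q\otimes I_s)^{\top}$ is easy to check numerically. The sketch below (ours) does so for a small star on 4 vertices with centre vertex 2 and 2×2 positive definite edge weights, an arbitrary choice that matches the tree $T_1$ of Example 4.2; it assembles L once from its block definition and once from the incidence matrix.

    import numpy as np

    s = 2
    W = [np.diag([1.0, 1.0]), np.diag([2.0, 1.0]), np.diag([1.0, 2.0])]   # W1, W2, W3
    Winv = [np.linalg.inv(Wk) for Wk in W]
    Z = np.zeros((s, s))

    # Incidence matrix Q of the underlying tree (vertices 1..4, centre 2,
    # edges e1={1,2}, e2={2,3}, e3={2,4}, all oriented away from vertex 2).
    Q = np.array([[-1,  0,  0],
                  [ 1,  1,  1],
                  [ 0, -1,  0],
                  [ 0,  0, -1]], dtype=float)

    # F is block diagonal with the edge weights, so F^{-1} is block diagonal
    # with their inverses.
    Finv = np.zeros((3 * s, 3 * s))
    for k in range(3):
        Finv[k * s:(k + 1) * s, k * s:(k + 1) * s] = Winv[k]

    QI = np.kron(Q, np.eye(s))
    L_factored = QI @ Finv @ QI.T

    # Laplacian assembled directly from its block definition.
    L_direct = np.block([
        [ Winv[0], -Winv[0],                     Z,        Z       ],
        [-Winv[0],  Winv[0] + Winv[1] + Winv[2], -Winv[1], -Winv[2]],
        [ Z,       -Winv[1],                     Winv[1],  Z       ],
        [ Z,       -Winv[2],                     Z,        Winv[2] ]])

    assert np.allclose(L_factored, L_direct)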

A tree T is said to be a directed tree if each edge of T is assigned a direction. If the tree T has no vertex of degree 2, then let $\hat{\tau}$ denote the diagonal matrix with diagonal entries $1/\tau_1,1/\tau_2,\ldots,1/\tau_n$. In the following theorem, we state a basic result about the edge orientation matrix H of an unweighted tree T, which is a combination of Theorem 9 of [Citation8] and Theorem 11 of [Citation7].

Theorem 2.1.

[Citation7, Citation8] Let T be a directed tree on n vertices and let H and Q be the edge orientation matrix and the incidence matrix of T, respectively. Then $\det H=2^{n-2}\prod_{i=1}^{n}\tau_i$. Furthermore, if T has no vertex of degree 2, then H is nonsingular and $H^{-1}=\frac{1}{2}Q^{\top}\hat{\tau}Q$.
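As an illustration (ours, not from the cited sources), the sketch below verifies both statements of Theorem 2.1 for the unweighted star on 4 vertices with every edge oriented away from the centre; for this tree the edges are pairwise oppositely oriented, so H = 2I − J.

    import numpy as np

    # Unweighted star on 4 vertices; vertex 0 is the centre.
    n = 4
    tau = np.array([2 - 3, 2 - 1, 2 - 1, 2 - 1], dtype=float)   # tau_i = 2 - deg(i)

    # Incidence matrix Q (rows: vertices, columns: edges, all edges oriented
    # away from the centre) and edge orientation matrix H = 2I - J.
    Q = np.array([[ 1,  1,  1],
                  [-1,  0,  0],
                  [ 0, -1,  0],
                  [ 0,  0, -1]], dtype=float)
    H = 2 * np.eye(n - 1) - np.ones((n - 1, n - 1))

    assert np.isclose(np.linalg.det(H), 2 ** (n - 2) * np.prod(tau))
    assert np.allclose(np.linalg.inv(H), 0.5 * Q.T @ np.diag(1 / tau) @ Q)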

Next, we state a known result related to the distance matrix of a tree with matrix weights.

Theorem 2.2.

[Citation2, Theorem 3.4] Let T be a tree on n vertices, each of whose edges is assigned a positive definite matrix of order s. Let L and D be the Laplacian matrix and the distance matrix of T, respectively. If D is invertible, then the following assertions hold:

  1. $LD=\tau\mathbf{1}_n^{\top}\otimes I_s-2I_n\otimes I_s$.

  2. $DL=\mathbf{1}_n\tau^{\top}\otimes I_s-2I_n\otimes I_s$.
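Both identities can be checked numerically. The sketch below (ours) uses a star on 4 vertices with centre vertex 2 and 2×2 positive definite edge weights $W_1,W_2,W_3$, the same data as the tree $T_1$ of Example 4.2.

    import numpy as np

    s = 2
    W1, W2, W3 = np.diag([1., 1.]), np.diag([2., 1.]), np.diag([1., 2.])
    Z = np.zeros((s, s))

    # Distance matrix D of the star with centre vertex 2: d(i, j) is the sum
    # of the edge weights on the (i, j)-path.
    D = np.block([[Z,       W1,      W1 + W2, W1 + W3],
                  [W1,      Z,       W2,      W3     ],
                  [W1 + W2, W2,      Z,       W2 + W3],
                  [W1 + W3, W3,      W2 + W3, Z      ]])

    # Laplacian L from its block definition.
    I1, I2, I3 = map(np.linalg.inv, (W1, W2, W3))
    L = np.block([[ I1, -I1,            Z,   Z ],
                  [-I1,  I1 + I2 + I3, -I2, -I3],
                  [ Z,  -I2,            I2,  Z ],
                  [ Z,  -I3,            Z,   I3]])

    tau = np.array([1., -1., 1., 1.])        # tau_i = 2 - deg(i)
    ones = np.ones(4)
    Ins = np.eye(4 * s)

    assert np.allclose(L @ D, np.kron(np.outer(tau, ones), np.eye(s)) - 2 * Ins)
    assert np.allclose(D @ L, np.kron(np.outer(ones, tau), np.eye(s)) - 2 * Ins)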

3 Properties of the squared distance matrices of trees

In this section, we find relations of the squared distance matrix with other matrices associated with a tree, such as the distance matrix, the Laplacian matrix, and the incidence matrix. We will use these results to obtain the formulae for the determinants and inverses of the squared distance matrices of directed trees.

Lemma 3.1.

Let T be a tree with vertex set $\{1,2,\ldots,n\}$ in which each edge is assigned a positive definite matrix weight of order s. Let D and $\Delta$ be the distance matrix and the squared distance matrix of T, respectively. Then $\Delta(\tau\otimes I_s)=D\hat{\delta}$.

Proof.

Let $i\in\{1,2,\ldots,n\}$ be fixed. For $j\neq i$, let p(j) be the predecessor of j on the (i, j)-path of the underlying tree. Let $e_j$ be the edge between the vertices p(j) and j, let $W_j$ be the weight of the edge $e_j$, and set $X_j=\hat{\delta}_j-W_j$. Therefore,
$$2\sum_{j=1}^{n}d(i,j)^2=\sum_{j=1}^{n}d(i,j)^2+\sum_{j\neq i}\big(d(i,p(j))+W_j\big)^2=\sum_{j=1}^{n}d(i,j)^2+\sum_{j\neq i}d(i,p(j))^2+2\sum_{j\neq i}d(i,p(j))W_j+\sum_{j\neq i}W_j^2.$$

Since the vertex j is the predecessor of $\delta_j-1$ vertices in the paths from i, we have $\sum_{j\neq i}d(i,p(j))^2=\sum_{j=1}^{n}(\delta_j-1)d(i,j)^2$.

Thus,
$$2\sum_{j=1}^{n}d(i,j)^2=\sum_{j=1}^{n}d(i,j)^2+\sum_{j=1}^{n}(\delta_j-1)d(i,j)^2+2\sum_{j\neq i}d(i,p(j))W_j+\sum_{j\neq i}W_j^2=\sum_{j=1}^{n}\delta_j d(i,j)^2+2\sum_{j\neq i}d(i,p(j))W_j+\sum_{j\neq i}W_j^2.$$

Therefore, the i-th block of $\Delta(\tau\otimes I_s)$ is
$$\big(\Delta(\tau\otimes I_s)\big)_{i}=\sum_{j=1}^{n}(2-\delta_j)d(i,j)^2=2\sum_{j\neq i}d(i,p(j))W_j+\sum_{j\neq i}W_j^2.$$

Now, let us compute the i-th block of $D\hat{\delta}$:
$$\big(D\hat{\delta}\big)_{i}=\sum_{j=1}^{n}d(i,j)\hat{\delta}_j=\sum_{j\neq i}\big(d(i,p(j))+W_j\big)(W_j+X_j)=\sum_{j\neq i}d(i,p(j))W_j+\sum_{j\neq i}W_j^2+\sum_{j\neq i}\big(d(i,p(j))+W_j\big)X_j.$$

Note that $X_j$ is the sum of the weights of all edges incident to j, except $e_j$. Hence,
$$\big(d(i,p(j))+W_j\big)X_j=d(i,j)X_j=\sum_{l\sim j,\ l\neq p(j)}d(i,p(l))W_l.$$

Therefore,
$$\sum_{j\neq i}\big(d(i,p(j))+W_j\big)X_j=\sum_{j\neq i}\ \sum_{l\sim j,\ l\neq p(j)}d(i,p(l))W_l=\sum_{j\neq i}d(i,p(j))W_j.$$

Thus,
$$\big(D\hat{\delta}\big)_{i}=\sum_{j=1}^{n}d(i,j)\hat{\delta}_j=2\sum_{j\neq i}d(i,p(j))W_j+\sum_{j\neq i}W_j^2=\big(\Delta(\tau\otimes I_s)\big)_{i}.$$

This completes the proof. □
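The identity of Lemma 3.1 is straightforward to check numerically; the sketch below (ours) does so for the star on 4 vertices with centre vertex 2 and the 2×2 edge weights of Example 4.2.

    import numpy as np

    s = 2
    W1, W2, W3 = np.diag([1., 1.]), np.diag([2., 1.]), np.diag([1., 2.])
    Z = np.zeros((s, s))
    sq = lambda M: M @ M     # squared block distance d(i, j)^2

    D = np.block([[Z,       W1,      W1 + W2, W1 + W3],
                  [W1,      Z,       W2,      W3     ],
                  [W1 + W2, W2,      Z,       W2 + W3],
                  [W1 + W3, W3,      W2 + W3, Z      ]])
    Delta = np.block([[Z,             sq(W1),  sq(W1 + W2), sq(W1 + W3)],
                      [sq(W1),        Z,       sq(W2),      sq(W3)     ],
                      [sq(W1 + W2),   sq(W2),  Z,           sq(W2 + W3)],
                      [sq(W1 + W3),   sq(W3),  sq(W2 + W3), Z          ]])

    tau = np.array([1., -1., 1., 1.])                    # tau_i = 2 - deg(i)
    delta_hat = np.vstack([W1, W1 + W2 + W3, W2, W3])    # matrix weighted degrees, ns x s

    assert np.allclose(Delta @ np.kron(tau.reshape(-1, 1), np.eye(s)),
                       D @ delta_hat)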

Lemma 3.2.

Let T be a directed tree with vertex set $\{1,\ldots,n\}$ and edge set $\{e_1,\ldots,e_{n-1}\}$, in which each edge $e_i$ is assigned a positive definite matrix weight $W_i$ of order s, $1\le i\le n-1$. Let H and Q be the edge orientation matrix and the incidence matrix of T, respectively. If F is the block diagonal matrix with diagonal blocks $W_1,W_2,\ldots,W_{n-1}$, then $(Q\otimes I_s)^{\top}\Delta(Q\otimes I_s)=-2F(H\otimes I_s)F$.

Proof.

For $i,j\in\{1,2,\ldots,n-1\}$, let $e_i$ and $e_j$ be two edges of T such that $e_i$ is directed from p to q and $e_j$ is directed from r to s. Let $W_i$ and $W_j$ be the weights of the edges $e_i$ and $e_j$, respectively.

Case 1:

Let the p–s path contain the vertices q and r. If $d(q,r)=Y$, then it is easy to check that
$$\big((Q\otimes I_s)^{\top}\Delta(Q\otimes I_s)\big)_{ij}=\begin{cases}(W_i+Y)^2+(W_j+Y)^2-(W_i+W_j+Y)^2-Y^2, & \text{if } e_i\parallel e_j,\\ -(W_i+Y)^2-(W_j+Y)^2+(W_i+W_j+Y)^2+Y^2, & \text{if } e_i\nparallel e_j\end{cases}=\begin{cases}-2W_iW_j, & \text{if } e_i\parallel e_j,\\ 2W_iW_j, & \text{if } e_i\nparallel e_j.\end{cases}$$

Case 2:

Let the r–q path contain the vertices s and p. If $d(s,p)=Y_1$, then
$$\big((Q\otimes I_s)^{\top}\Delta(Q\otimes I_s)\big)_{ij}=\begin{cases}(W_i+Y_1)^2+(W_j+Y_1)^2-(W_i+W_j+Y_1)^2-Y_1^2, & \text{if } e_i\parallel e_j,\\ -(W_i+Y_1)^2-(W_j+Y_1)^2+(W_i+W_j+Y_1)^2+Y_1^2, & \text{if } e_i\nparallel e_j\end{cases}=\begin{cases}-2W_iW_j, & \text{if } e_i\parallel e_j,\\ 2W_iW_j, & \text{if } e_i\nparallel e_j.\end{cases}$$

Note that
$$\big(F(H\otimes I_s)F\big)_{ij}=\begin{cases}W_iW_j & \text{if } e_i\parallel e_j,\\ -W_iW_j & \text{if } e_i\nparallel e_j.\end{cases}$$

Thus, $\big((Q\otimes I_s)^{\top}\Delta(Q\otimes I_s)\big)_{ij}=-2\big(F(H\otimes I_s)F\big)_{ij}$. □
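The following sketch (ours) checks Lemma 3.2 on a small instance: the matrix weighted star on 4 vertices with centre vertex 2, all edges oriented away from the centre, so that H = 2I − J.

    import numpy as np

    s = 2
    W = [np.diag([1., 1.]), np.diag([2., 1.]), np.diag([1., 2.])]
    Z = np.zeros((s, s))
    sq = lambda M: M @ M

    # Squared distance matrix of the star with centre vertex 2.
    Delta = np.block([[Z,               sq(W[0]),  sq(W[0] + W[1]), sq(W[0] + W[2])],
                      [sq(W[0]),        Z,         sq(W[1]),        sq(W[2])       ],
                      [sq(W[0] + W[1]), sq(W[1]),  Z,               sq(W[1] + W[2])],
                      [sq(W[0] + W[2]), sq(W[2]),  sq(W[1] + W[2]), Z              ]])

    # Incidence and edge orientation matrices (edges oriented away from the
    # centre, so they are pairwise oppositely oriented and H = 2I - J).
    Q = np.array([[-1,  0,  0],
                  [ 1,  1,  1],
                  [ 0, -1,  0],
                  [ 0,  0, -1]], dtype=float)
    H = 2 * np.eye(3) - np.ones((3, 3))

    F = np.zeros((3 * s, 3 * s))
    for k in range(3):
        F[k * s:(k + 1) * s, k * s:(k + 1) * s] = W[k]

    QI = np.kron(Q, np.eye(s))
    assert np.allclose(QI.T @ Delta @ QI, -2 * F @ np.kron(H, np.eye(s)) @ F)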

Lemma 3.3.

Let T be a tree with vertex set $\{1,\ldots,n\}$ and edge set $\{e_1,\ldots,e_{n-1}\}$, in which each edge $e_i$ is assigned a positive definite matrix weight $W_i$ of order s, $1\le i\le n-1$. Let L, D and $\Delta$ be the Laplacian matrix, the distance matrix and the squared distance matrix of T, respectively. Then $\Delta L=2D(\tilde{\tau}\otimes I_s)-\mathbf{1}_n\otimes\hat{\delta}^{\top}$.

Proof.

Let $i,j\in V(T)$ and let the degree of the vertex j be t. Suppose j is adjacent to the vertices $v_1,v_2,\ldots,v_t$, and let $e_1,e_2,\ldots,e_t$ be the corresponding edges with edge weights $W_1,W_2,\ldots,W_t$, respectively.

Case 1.

For $i=j$, we have
$$(\Delta L)_{ii}=\sum_{k=1}^{n}d(i,k)^2 l_{ki}=\sum_{k\neq i}d(i,k)^2 l_{ki}=-W_1^2(W_1)^{-1}-\cdots-W_t^2(W_t)^{-1}=-(W_1+W_2+\cdots+W_t)=-\hat{\delta}_i=\big(2D(\tilde{\tau}\otimes I_s)-\mathbf{1}_n\otimes\hat{\delta}^{\top}\big)_{ii}.$$

Case 2.

Let $i\neq j$. Without loss of generality, assume that the (i, j)-path passes through the vertex $v_1$ (it is possible that $i=v_1$). If $d(i,j)=Z$, then $d(i,v_1)=Z-W_1$, $d(i,v_2)=Z+W_2$, $d(i,v_3)=Z+W_3,\ldots,d(i,v_t)=Z+W_t$. Therefore,
$$\begin{aligned}(\Delta L)_{ij}&=\sum_{k=1}^{n}d(i,k)^2 l_{kj}=\sum_{k\neq j}d(i,k)^2 l_{kj}+d(i,j)^2 l_{jj}\\&=-d(i,v_1)^2(W_1)^{-1}-d(i,v_2)^2(W_2)^{-1}-\cdots-d(i,v_t)^2(W_t)^{-1}+d(i,j)^2 l_{jj}\\&=-(Z-W_1)^2(W_1)^{-1}-(Z+W_2)^2(W_2)^{-1}-(Z+W_3)^2(W_3)^{-1}-\cdots-(Z+W_t)^2(W_t)^{-1}+Z^2\big((W_1)^{-1}+(W_2)^{-1}+\cdots+(W_t)^{-1}\big)\\&=-(W_1^2-2ZW_1)(W_1)^{-1}-(W_2^2+2ZW_2)(W_2)^{-1}-(W_3^2+2ZW_3)(W_3)^{-1}-\cdots-(W_t^2+2ZW_t)(W_t)^{-1}\\&=-(W_1+W_2+\cdots+W_t)+2Z-2(t-1)Z\\&=2(2-t)Z-(W_1+W_2+\cdots+W_t)=2\tau_j Z-\hat{\delta}_j=\big(2D(\tilde{\tau}\otimes I_s)-\mathbf{1}_n\otimes\hat{\delta}^{\top}\big)_{ij}.\end{aligned}$$

This completes the proof. □
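As with the previous lemmas, Lemma 3.3 can be verified numerically on a small example. The sketch below (ours) again uses the star on 4 vertices with centre vertex 2 and the 2×2 weights of Example 4.2.

    import numpy as np

    s = 2
    W1, W2, W3 = np.diag([1., 1.]), np.diag([2., 1.]), np.diag([1., 2.])
    Z = np.zeros((s, s))
    sq = lambda M: M @ M

    D = np.block([[Z,       W1,      W1 + W2, W1 + W3],
                  [W1,      Z,       W2,      W3     ],
                  [W1 + W2, W2,      Z,       W2 + W3],
                  [W1 + W3, W3,      W2 + W3, Z      ]])
    Delta = np.block([[Z,             sq(W1),  sq(W1 + W2), sq(W1 + W3)],
                      [sq(W1),        Z,       sq(W2),      sq(W3)     ],
                      [sq(W1 + W2),   sq(W2),  Z,           sq(W2 + W3)],
                      [sq(W1 + W3),   sq(W3),  sq(W2 + W3), Z          ]])
    I1, I2, I3 = map(np.linalg.inv, (W1, W2, W3))
    L = np.block([[ I1, -I1,            Z,   Z ],
                  [-I1,  I1 + I2 + I3, -I2, -I3],
                  [ Z,  -I2,            I2,  Z ],
                  [ Z,  -I3,            Z,   I3]])

    tau = np.array([1., -1., 1., 1.])
    delta_hat = np.vstack([W1, W1 + W2 + W3, W2, W3])    # ns x s

    rhs = 2 * D @ np.kron(np.diag(tau), np.eye(s)) \
          - np.kron(np.ones((4, 1)), delta_hat.T)        # 2 D (tau~ x I) - 1_n x delta_hat^T
    assert np.allclose(Delta @ L, rhs)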

4 Determinant of the squared distance matrix

In this section, we obtain a formula for the determinant of the squared distance matrix of a tree with positive definite matrix weights. First, we consider trees with no vertex of degree 2.

Theorem 4.1.

Let T be a tree on n vertices and let $W_i$ be the weight of the edge $e_i$, where the $W_i$'s are positive definite matrices of order s, $i=1,2,\ldots,n-1$. If T has no vertex of degree 2, then
$$\det(\Delta)=(-1)^{(n-1)s}\,2^{(2n-5)s}\prod_{i=1}^{n}\tau_i^{s}\prod_{i=1}^{n-1}\det(W_i^2)\,\det\Big(\sum_{i=1}^{n}\frac{\hat{\delta}_i^2}{\tau_i}\Big).$$

Proof.

Let us assign an orientation to each edge of T, and let H be the edge orientation matrix and Q be the incidence matrix of the underlying unweighted tree.

Let $\Delta_i$ denote the i-th column block of the block matrix $\Delta$. Let $t_i$ be the $n\times 1$ column vector with 1 at the i-th position and 0 elsewhere, $i=1,2,\ldots,n$. Then
$$\begin{bmatrix}Q^{\top}\otimes I_s\\ t_1^{\top}\otimes I_s\end{bmatrix}\Delta\begin{bmatrix}Q\otimes I_s & t_1\otimes I_s\end{bmatrix}=\begin{bmatrix}(Q\otimes I_s)^{\top}\Delta(Q\otimes I_s) & (Q\otimes I_s)^{\top}\Delta_1\\ \Delta_1^{\top}(Q\otimes I_s) & 0\end{bmatrix}.\qquad(1)$$

We know that Q is a totally unimodular matrix and each square submatrix of Q of order n−1 is nonsingular [Citation4, Lemmas 2.6 and 2.7]. Therefore, $\det\begin{bmatrix}Q\otimes I_s & t_1\otimes I_s\end{bmatrix}=\det\big([Q\ \ t_1]\otimes I_s\big)=\pm 1$. Now, taking determinants on both sides of equation (1), we have
$$\det(\Delta)=\det\begin{bmatrix}(Q\otimes I_s)^{\top}\Delta(Q\otimes I_s) & (Q\otimes I_s)^{\top}\Delta_1\\ \Delta_1^{\top}(Q\otimes I_s) & 0\end{bmatrix}.$$

Using Lemma 3.2, we have
$$\det(\Delta)=\det\begin{bmatrix}-2F(H\otimes I_s)F & (Q\otimes I_s)^{\top}\Delta_1\\ \Delta_1^{\top}(Q\otimes I_s) & 0\end{bmatrix}.$$
By Theorem 2.1, we have $\det H=2^{n-2}\prod_{i=1}^{n}\tau_i$ and hence $\det(H\otimes I_s)=(\det H)^{s}=2^{(n-2)s}\prod_{i=1}^{n}\tau_i^{s}$. Thus $-2F(H\otimes I_s)F$ is nonsingular, and by the Schur complement formula, we have
$$\begin{aligned}\det(\Delta)&=\det\begin{bmatrix}-2F(H\otimes I_s)F & (Q\otimes I_s)^{\top}\Delta_1\\ \Delta_1^{\top}(Q\otimes I_s) & 0\end{bmatrix}\\&=\det\big(-2F(H\otimes I_s)F\big)\det\Big(-\Delta_1^{\top}(Q\otimes I_s)\big(-2F(H\otimes I_s)F\big)^{-1}(Q\otimes I_s)^{\top}\Delta_1\Big)\\&=(-1)^{(n-1)s}2^{(n-2)s}\prod_{i=1}^{n-1}\det(W_i^2)\,\det(H\otimes I_s)\,\det\Big(\Delta_1^{\top}(Q\otimes I_s)F^{-1}(H\otimes I_s)^{-1}F^{-1}(Q\otimes I_s)^{\top}\Delta_1\Big).\end{aligned}$$

Now, from Theorem 2.1, it follows that $(H\otimes I_s)^{-1}=H^{-1}\otimes I_s=\frac{1}{2}Q^{\top}\hat{\tau}Q\otimes I_s=\frac{1}{2}\big(Q^{\top}\hat{\tau}Q\otimes I_s\big)$. Therefore,
$$\det(\Delta)=(-1)^{(n-1)s}\,2^{(2n-5)s}\prod_{i=1}^{n}\tau_i^{s}\prod_{i=1}^{n-1}\det(W_i^2)\,\det\Big(\Delta_1^{\top}(Q\otimes I_s)F^{-1}\big(Q^{\top}\hat{\tau}Q\otimes I_s\big)F^{-1}(Q\otimes I_s)^{\top}\Delta_1\Big).\qquad(2)$$

Now, by Lemmas 3.3 and 3.1, we have
$$\begin{aligned}\Delta_1^{\top}(Q\otimes I_s)F^{-1}\big(Q^{\top}\hat{\tau}Q\otimes I_s\big)F^{-1}(Q\otimes I_s)^{\top}\Delta_1&=\Delta_1^{\top}(Q\otimes I_s)F^{-1}(Q\otimes I_s)^{\top}(\hat{\tau}\otimes I_s)(Q\otimes I_s)F^{-1}(Q\otimes I_s)^{\top}\Delta_1\\&=\big(\Delta_1^{\top}(Q\otimes I_s)F^{-1}(Q\otimes I_s)^{\top}\big)(\hat{\tau}\otimes I_s)\big((Q\otimes I_s)F^{-1}(Q\otimes I_s)^{\top}\Delta_1\big)\\&=(\Delta_1^{\top}L)(\hat{\tau}\otimes I_s)(L\Delta_1)\\&=\sum_{i}\big(2\tau_i d_{1i}-\hat{\delta}_i\big)^2\frac{1}{\tau_i}=\sum_{i}\big(4\tau_i^2 d_{1i}^2+\hat{\delta}_i^2-4\tau_i d_{1i}\hat{\delta}_i\big)\frac{1}{\tau_i}\\&=\sum_{i}4\tau_i d_{1i}^2+\sum_{i}\frac{\hat{\delta}_i^2}{\tau_i}-\sum_{i}4 d_{1i}\hat{\delta}_i=\sum_{i}\frac{\hat{\delta}_i^2}{\tau_i}.\end{aligned}$$

Substituting the value of $\Delta_1^{\top}(Q\otimes I_s)F^{-1}\big(Q^{\top}\hat{\tau}Q\otimes I_s\big)F^{-1}(Q\otimes I_s)^{\top}\Delta_1$ in (2), we get the required result. □

Next, let us illustrate the above theorem by an example.

Example 4.2.

Consider the tree $T_1$ in Fig. 1, where the edge weights are
$$W_1=\begin{bmatrix}1 & 0\\ 0 & 1\end{bmatrix},\quad W_2=\begin{bmatrix}2 & 0\\ 0 & 1\end{bmatrix},\quad W_3=\begin{bmatrix}1 & 0\\ 0 & 2\end{bmatrix}.$$

Fig. 1 Tree T1 on 4 vertices.

Then
$$\Delta=\begin{bmatrix}0 & W_1^2 & (W_1+W_2)^2 & (W_1+W_3)^2\\ W_1^2 & 0 & W_2^2 & W_3^2\\ (W_1+W_2)^2 & W_2^2 & 0 & (W_2+W_3)^2\\ (W_1+W_3)^2 & W_3^2 & (W_2+W_3)^2 & 0\end{bmatrix}=\begin{bmatrix}0 & 0 & 1 & 0 & 9 & 0 & 4 & 0\\ 0 & 0 & 0 & 1 & 0 & 4 & 0 & 9\\ 1 & 0 & 0 & 0 & 4 & 0 & 1 & 0\\ 0 & 1 & 0 & 0 & 0 & 1 & 0 & 4\\ 9 & 0 & 4 & 0 & 0 & 0 & 9 & 0\\ 0 & 4 & 0 & 1 & 0 & 0 & 0 & 9\\ 4 & 0 & 1 & 0 & 9 & 0 & 0 & 0\\ 0 & 9 & 0 & 4 & 0 & 9 & 0 & 0\end{bmatrix}$$
and
$$\sum_{i=1}^{4}\frac{\hat{\delta}_i^2}{\tau_i}=W_1^2+W_2^2+W_3^2-(W_1+W_2+W_3)^2=\begin{bmatrix}-10 & 0\\ 0 & -10\end{bmatrix}.$$

One can verify that
$$\det(\Delta)=102400=(-1)^{6}\,2^{6}\prod_{i=1}^{4}\tau_i^{2}\prod_{i=1}^{3}\det(W_i^2)\,\det\Big(\sum_{i=1}^{4}\frac{\hat{\delta}_i^2}{\tau_i}\Big).$$
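The value can also be confirmed with a short NumPy computation (ours, reproducing the data of the example): the determinant of the 8×8 matrix and the right-hand side of Theorem 4.1 both evaluate to 102400.

    import numpy as np

    n, s = 4, 2
    W1, W2, W3 = np.diag([1., 1.]), np.diag([2., 1.]), np.diag([1., 2.])
    Z = np.zeros((s, s))
    sq = lambda M: M @ M

    Delta = np.block([[Z,             sq(W1),  sq(W1 + W2), sq(W1 + W3)],
                      [sq(W1),        Z,       sq(W2),      sq(W3)     ],
                      [sq(W1 + W2),   sq(W2),  Z,           sq(W2 + W3)],
                      [sq(W1 + W3),   sq(W3),  sq(W2 + W3), Z          ]])

    tau = np.array([1., -1., 1., 1.])                         # tau_i = 2 - deg(i)
    beta = sq(W1) + sq(W2) + sq(W3) - sq(W1 + W2 + W3)        # sum of delta_hat_i^2 / tau_i

    formula = ((-1) ** ((n - 1) * s) * 2 ** ((2 * n - 5) * s)
               * np.prod(tau) ** s
               * np.linalg.det(sq(W1)) * np.linalg.det(sq(W2)) * np.linalg.det(sq(W3))
               * np.linalg.det(beta))

    assert np.isclose(np.linalg.det(Delta), 102400) and np.isclose(formula, 102400)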

Next, we obtain a formula for the determinant of the squared distance matrix of a tree T that has exactly one vertex of degree 2.

Theorem 4.3.

Let T be a tree on n vertices with edge set $E(T)=\{e_1,e_2,\ldots,e_{n-1}\}$. Let the positive definite matrices $W_1,W_2,\ldots,W_{n-1}$ of order s be the weights of the edges $e_1,e_2,\ldots,e_{n-1}$, respectively. Let v be the vertex of degree 2 and let u and w be its neighbors in T. If $e_i=(u,v)$ and $e_j=(v,w)$, then
$$\det(\Delta)=(-1)^{(n-1)s}\,2^{(2n-5)s}\,\det(W_i+W_j)^2\prod_{k=1}^{n-1}\det(W_k^2)\prod_{k\neq v}\tau_k^{s}.$$

Proof.

Let us assign an orientation to each edge of T. Without loss of generality, assume that the edge $e_i$ is directed from u to v and the edge $e_j$ is directed from v to w.

Let $\Delta_i$ denote the i-th column block of the block matrix $\Delta$. Let $t_i$ be the $n\times 1$ column vector with 1 at the i-th position and 0 elsewhere, $i=1,2,\ldots,n$. Therefore, by using Lemma 3.2, we have
$$\begin{bmatrix}Q^{\top}\otimes I_s\\ t_v^{\top}\otimes I_s\end{bmatrix}\Delta\begin{bmatrix}Q\otimes I_s & t_v\otimes I_s\end{bmatrix}=\begin{bmatrix}(Q\otimes I_s)^{\top}\Delta(Q\otimes I_s) & (Q\otimes I_s)^{\top}\Delta_v\\ \Delta_v^{\top}(Q\otimes I_s) & 0\end{bmatrix}=\begin{bmatrix}-2F(H\otimes I_s)F & (Q\otimes I_s)^{\top}\Delta_v\\ \Delta_v^{\top}(Q\otimes I_s) & 0\end{bmatrix}.$$

Pre-multiplying and post-multiplying the above equation by $\begin{bmatrix}F^{-1} & 0\\ 0 & I_s\end{bmatrix}$, we get
$$\begin{bmatrix}F^{-1} & 0\\ 0 & I_s\end{bmatrix}\begin{bmatrix}Q^{\top}\otimes I_s\\ t_v^{\top}\otimes I_s\end{bmatrix}\Delta\begin{bmatrix}Q\otimes I_s & t_v\otimes I_s\end{bmatrix}\begin{bmatrix}F^{-1} & 0\\ 0 & I_s\end{bmatrix}=\begin{bmatrix}-2(H\otimes I_s) & F^{-1}(Q\otimes I_s)^{\top}\Delta_v\\ \Delta_v^{\top}(Q\otimes I_s)F^{-1} & 0\end{bmatrix},$$

which implies that
$$\big(\det(F^{-1})\big)^2\det(\Delta)=\det\begin{bmatrix}-2(H\otimes I_s) & F^{-1}(Q\otimes I_s)^{\top}\Delta_v\\ \Delta_v^{\top}(Q\otimes I_s)F^{-1} & 0\end{bmatrix}.$$

Let H(j|j) denote the $(n-2)s\times(n-2)s$ submatrix obtained by deleting all the blocks in the j-th block row and the j-th block column of $H\otimes I_s$. Let G(j|j) denote the $(n-2)s\times s$ submatrix obtained by deleting the j-th block of $F^{-1}(Q\otimes I_s)^{\top}\Delta_v$. Let $R_i$ and $C_i$ denote the i-th block row and the i-th block column of the matrix $\begin{bmatrix}-2(H\otimes I_s) & F^{-1}(Q\otimes I_s)^{\top}\Delta_v\\ \Delta_v^{\top}(Q\otimes I_s)F^{-1} & 0\end{bmatrix}$, respectively. Note that the blocks in the i-th and j-th columns of $H\otimes I_s$ are identical. Now, perform the operations $R_j-R_i$ and $C_j-C_i$ on this matrix, and then interchange $R_j$ and $R_{n-1}$, and $C_j$ and $C_{n-1}$. Since $\big(\Delta_v^{\top}(Q\otimes I_s)F^{-1}\big)_j-\big(\Delta_v^{\top}(Q\otimes I_s)F^{-1}\big)_i=-W_j-W_i$, it follows that
$$\det\begin{bmatrix}-2(H\otimes I_s) & F^{-1}(Q\otimes I_s)^{\top}\Delta_v\\ \Delta_v^{\top}(Q\otimes I_s)F^{-1} & 0\end{bmatrix}=\det\begin{bmatrix}-2H(j|j) & 0 & G(j|j)\\ 0 & 0 & -W_j-W_i\\ G(j|j)^{\top} & -W_j-W_i & 0\end{bmatrix}.$$

Since H(j|j) is the edge orientation matrix (tensored with $I_s$) of the tree obtained by deleting the vertex v and replacing the edges $e_i$ and $e_j$ by a single edge directed from u to w, by Theorem 2.1 we have $\det\big(H(j|j)\big)=2^{(n-3)s}\prod_{k\neq v}\tau_k^{s}$, which is nonzero. Therefore, by applying the Schur complement formula, we have
$$\begin{aligned}\det\begin{bmatrix}-2H(j|j) & 0 & G(j|j)\\ 0 & 0 & -W_j-W_i\\ G(j|j)^{\top} & -W_j-W_i & 0\end{bmatrix}&=\det\big(-2H(j|j)\big)\det\left(\begin{bmatrix}0 & -W_j-W_i\\ -W_j-W_i & 0\end{bmatrix}-\begin{bmatrix}0 & 0\\ 0 & G(j|j)^{\top}\big(-2H(j|j)\big)^{-1}G(j|j)\end{bmatrix}\right)\\&=(-2)^{(n-2)s}\det\big(H(j|j)\big)\det\begin{bmatrix}0 & -W_j-W_i\\ -W_j-W_i & -G(j|j)^{\top}\big(-2H(j|j)\big)^{-1}G(j|j)\end{bmatrix}.\end{aligned}$$

Again, by the proof of Theorem 4.1, we have $G(j|j)^{\top}\big(-2H(j|j)\big)^{-1}G(j|j)=-\frac{1}{4}\sum_{i\neq v}\frac{\hat{\delta}_i^2}{\tau_i}$.

Therefore,
$$\begin{aligned}\det\begin{bmatrix}-2H(j|j) & 0 & G(j|j)\\ 0 & 0 & -W_j-W_i\\ G(j|j)^{\top} & -W_j-W_i & 0\end{bmatrix}&=(-2)^{(n-2)s}\det\big(H(j|j)\big)\det\begin{bmatrix}0 & -W_j-W_i\\ -W_j-W_i & \frac{1}{4}\sum_{i\neq v}\frac{\hat{\delta}_i^2}{\tau_i}\end{bmatrix}\\&=(-2)^{(n-2)s}\det\big(H(j|j)\big)\det\begin{bmatrix}0 & W_j+W_i\\ W_j+W_i & \frac{1}{4}\sum_{i\neq v}\frac{\hat{\delta}_i^2}{\tau_i}\end{bmatrix}.\end{aligned}$$

Since $\det\Big(\frac{1}{4}\sum_{i\neq v}\frac{\hat{\delta}_i^2}{\tau_i}\Big)\neq 0$, by the Schur complement formula we have
$$\begin{aligned}\det\begin{bmatrix}0 & W_j+W_i\\ W_j+W_i & \frac{1}{4}\sum_{i\neq v}\frac{\hat{\delta}_i^2}{\tau_i}\end{bmatrix}&=\det\Big(\frac{1}{4}\sum_{i\neq v}\frac{\hat{\delta}_i^2}{\tau_i}\Big)\det\left(0-(W_j+W_i)\Big(\frac{1}{4}\sum_{i\neq v}\frac{\hat{\delta}_i^2}{\tau_i}\Big)^{-1}(W_j+W_i)\right)\\&=(-1)^{s}\det\Big(\frac{1}{4}\sum_{i\neq v}\frac{\hat{\delta}_i^2}{\tau_i}\Big)\det\Big(\frac{1}{4}\sum_{i\neq v}\frac{\hat{\delta}_i^2}{\tau_i}\Big)^{-1}\det(W_j+W_i)^2\\&=(-1)^{s}\det(W_i+W_j)^2.\end{aligned}$$

Thus,
$$\det(\Delta)=(\det F)^2(-1)^{s}(-2)^{(n-2)s}\,2^{(n-3)s}\prod_{k\neq v}\tau_k^{s}\,\det(W_i+W_j)^2=(-1)^{(n-1)s}\,2^{(2n-5)s}\,\det(W_i+W_j)^2\prod_{k=1}^{n-1}\det(W_k^2)\prod_{k\neq v}\tau_k^{s}.$$

This completes the proof. □

Now, we illustrate the above theorem by the following example.

Example 4.4.

Consider the tree $T_2$ in Fig. 2, where the edge weights are
$$W_1=\begin{bmatrix}1 & 0\\ 0 & 1\end{bmatrix},\quad W_2=\begin{bmatrix}2 & 0\\ 0 & 1\end{bmatrix},\quad W_3=\begin{bmatrix}1 & 0\\ 0 & 2\end{bmatrix},\quad W_4=\begin{bmatrix}2 & 0\\ 0 & 2\end{bmatrix}.$$

Fig. 2 Tree T2 on 5 vertices.

Then
$$\Delta=\begin{bmatrix}0 & W_1^2 & (W_1+W_2)^2 & (W_1+W_2+W_3)^2 & (W_1+W_2+W_4)^2\\ W_1^2 & 0 & W_2^2 & (W_2+W_3)^2 & (W_2+W_4)^2\\ (W_1+W_2)^2 & W_2^2 & 0 & W_3^2 & W_4^2\\ (W_1+W_2+W_3)^2 & (W_2+W_3)^2 & W_3^2 & 0 & (W_3+W_4)^2\\ (W_1+W_2+W_4)^2 & (W_2+W_4)^2 & W_4^2 & (W_3+W_4)^2 & 0\end{bmatrix}=\begin{bmatrix}0 & 0 & 1 & 0 & 9 & 0 & 16 & 0 & 25 & 0\\ 0 & 0 & 0 & 1 & 0 & 4 & 0 & 16 & 0 & 16\\ 1 & 0 & 0 & 0 & 4 & 0 & 9 & 0 & 16 & 0\\ 0 & 1 & 0 & 0 & 0 & 1 & 0 & 9 & 0 & 9\\ 9 & 0 & 4 & 0 & 0 & 0 & 1 & 0 & 4 & 0\\ 0 & 4 & 0 & 1 & 0 & 0 & 0 & 4 & 0 & 4\\ 16 & 0 & 9 & 0 & 1 & 0 & 0 & 0 & 9 & 0\\ 0 & 16 & 0 & 9 & 0 & 4 & 0 & 0 & 0 & 16\\ 25 & 0 & 16 & 0 & 4 & 0 & 9 & 0 & 0 & 0\\ 0 & 16 & 0 & 9 & 0 & 4 & 0 & 16 & 0 & 0\end{bmatrix}.$$

One can verify that
$$\det(\Delta)=9437184=(-1)^{8}\,2^{10}\,\det(W_1+W_2)^2\prod_{i=1}^{4}\det(W_i^2)\prod_{k\neq 2}\tau_k^{2}.$$
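As before, a short NumPy computation (ours, reproducing the data of the example) confirms that the determinant of the 10×10 matrix and the right-hand side of Theorem 4.3 both evaluate to 9437184.

    import numpy as np

    n, s = 5, 2
    W1, W2, W3, W4 = (np.diag([1., 1.]), np.diag([2., 1.]),
                      np.diag([1., 2.]), np.diag([2., 2.]))
    Z = np.zeros((s, s))
    sq = lambda M: M @ M

    # Vertices 1..5 with edges {1,2}, {2,3}, {3,4}, {3,5}; vertex 2 has degree 2.
    Delta = np.block([
        [Z,                sq(W1),      sq(W1 + W2), sq(W1 + W2 + W3), sq(W1 + W2 + W4)],
        [sq(W1),           Z,           sq(W2),      sq(W2 + W3),      sq(W2 + W4)     ],
        [sq(W1 + W2),      sq(W2),      Z,           sq(W3),           sq(W4)          ],
        [sq(W1 + W2 + W3), sq(W2 + W3), sq(W3),      Z,                sq(W3 + W4)     ],
        [sq(W1 + W2 + W4), sq(W2 + W4), sq(W4),      sq(W3 + W4),      Z               ]])

    tau = np.array([1., 0., -1., 1., 1.])      # tau_i = 2 - deg(i)
    v = 2                                      # the degree-2 vertex
    prod_tau = np.prod(np.delete(tau, v - 1))  # product over k != v

    formula = ((-1) ** ((n - 1) * s) * 2 ** ((2 * n - 5) * s)
               * np.linalg.det(W1 + W2) ** 2
               * np.prod([np.linalg.det(sq(W)) for W in (W1, W2, W3, W4)])
               * prod_tau ** s)

    assert np.isclose(np.linalg.det(Delta), 9437184) and np.isclose(formula, 9437184)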

Corollary 4.5.

Let T be a tree on n vertices in which each edge $e_i$ is assigned a positive definite matrix weight $W_i$ of order s, $i=1,2,\ldots,n-1$. If T has at least two vertices of degree 2, then $\det(\Delta)=0$.

Proof.

Let T be a tree with at least two vertices of degree 2, say $v_1$ and $v_2$. Then $\tau_{v_1}=2-\delta_{v_1}=2-2=0$ and $\tau_{v_2}=2-\delta_{v_2}=2-2=0$. Let u and w be the neighbours of $v_1$, and let $e_i=(u,v_1)$ and $e_j=(v_1,w)$. Therefore, by Theorem 4.3, we have
$$\det(\Delta)=(-1)^{(n-1)s}\,2^{(2n-5)s}\,\det(W_i+W_j)^2\prod_{k=1}^{n-1}\det(W_k^2)\prod_{k\neq v_1}\tau_k^{s}.$$

Since $\tau_{v_2}=0$, it follows that $\det(\Delta)=0$. This completes the proof. □

5 Inverse of the squared distance matrix

In this section, we consider trees with no vertex of degree 2 and obtain an explicit formula for the inverse of the squared distance matrix. First, let us prove the following lemma, which will be used to find $\Delta^{-1}$.

Lemma 5.1.

Let T be a tree of order n with no vertex of degree 2, in which each edge is assigned a positive definite matrix weight of order s. If $\beta=\hat{\delta}^{\top}(\hat{\tau}\otimes I_s)\hat{\delta}$ and $\eta=2\tau\otimes I_s-L(\hat{\tau}\otimes I_s)\hat{\delta}$, then $\Delta\eta=\mathbf{1}_n\otimes\beta$.

Proof.

By Lemma 3.3, we have $\Delta L=2D(\tilde{\tau}\otimes I_s)-\mathbf{1}_n\otimes\hat{\delta}^{\top}$. Hence,
$$\Delta L(\hat{\tau}\otimes I_s)\hat{\delta}=2D\hat{\delta}-(\mathbf{1}_n\otimes\hat{\delta}^{\top})(\hat{\tau}\otimes I_s)\hat{\delta}=2D\hat{\delta}-\mathbf{1}_n\otimes\sum_{i=1}^{n}\frac{\hat{\delta}_i^2}{\tau_i}.$$

Since $\beta=\hat{\delta}^{\top}(\hat{\tau}\otimes I_s)\hat{\delta}=\sum_{i=1}^{n}\frac{\hat{\delta}_i^2}{\tau_i}$, it follows that $\Delta L(\hat{\tau}\otimes I_s)\hat{\delta}=2D\hat{\delta}-\mathbf{1}_n\otimes\beta$. By Lemma 3.1, we have $\Delta(\tau\otimes I_s)=D\hat{\delta}$ and hence $\Delta L(\hat{\tau}\otimes I_s)\hat{\delta}=2\Delta(\tau\otimes I_s)-\mathbf{1}_n\otimes\beta$. Therefore, $\Delta\eta=2\Delta(\tau\otimes I_s)-\Delta L(\hat{\tau}\otimes I_s)\hat{\delta}=\mathbf{1}_n\otimes\beta$. This completes the proof. □

If the tree T has no vertex of degree 2 and $\det(\beta)\neq 0$, then $\Delta$ is nonsingular, that is, $\Delta^{-1}$ exists. In the next theorem, we determine the formula for $\Delta^{-1}$.

Theorem 5.2.

Let T be a tree of order n with no vertex of degree 2, in which each edge is assigned a positive definite matrix weight of order s. Let $\beta=\hat{\delta}^{\top}(\hat{\tau}\otimes I_s)\hat{\delta}$ and $\eta=2\tau\otimes I_s-L(\hat{\tau}\otimes I_s)\hat{\delta}$. If $\det(\beta)\neq 0$, then
$$\Delta^{-1}=-\frac{1}{4}L(\hat{\tau}\otimes I_s)L+\frac{1}{4}\eta\beta^{-1}\eta^{\top}.$$

Proof.

Let $X=-\frac{1}{4}L(\hat{\tau}\otimes I_s)L+\frac{1}{4}\eta\beta^{-1}\eta^{\top}$. Then
$$\Delta X=-\frac{1}{4}\Delta L(\hat{\tau}\otimes I_s)L+\frac{1}{4}\Delta\eta\beta^{-1}\eta^{\top}.\qquad(3)$$

By Lemma 3.3, we have $\Delta L=2D(\tilde{\tau}\otimes I_s)-\mathbf{1}_n\otimes\hat{\delta}^{\top}$. Therefore, $\Delta L(\hat{\tau}\otimes I_s)L=2DL-(\mathbf{1}_n\otimes\hat{\delta}^{\top})(\hat{\tau}\otimes I_s)L$.

By Theorem 2.2, $DL=\mathbf{1}_n\tau^{\top}\otimes I_s-2I_n\otimes I_s$ and hence
$$\Delta L(\hat{\tau}\otimes I_s)L=2\big(\mathbf{1}_n\tau^{\top}\otimes I_s-2I_n\otimes I_s\big)-(\mathbf{1}_n\otimes\hat{\delta}^{\top})(\hat{\tau}\otimes I_s)L.\qquad(4)$$

By Lemma 5.1, we have $\Delta\eta=\mathbf{1}_n\otimes\beta=(\mathbf{1}_n\otimes I_s)\beta$. Therefore, from equations (3) and (4), we have
$$\begin{aligned}\Delta X&=-\frac{1}{2}\big(\mathbf{1}_n\tau^{\top}\otimes I_s-2I_n\otimes I_s\big)+\frac{1}{4}(\mathbf{1}_n\otimes\hat{\delta}^{\top})(\hat{\tau}\otimes I_s)L+\frac{1}{4}(\mathbf{1}_n\otimes I_s)\eta^{\top}\\&=-\frac{1}{2}\mathbf{1}_n\tau^{\top}\otimes I_s+I_n\otimes I_s+\frac{1}{4}(\mathbf{1}_n\otimes\hat{\delta}^{\top})(\hat{\tau}\otimes I_s)L+\frac{1}{4}(\mathbf{1}_n\otimes I_s)\big(2\tau\otimes I_s-L(\hat{\tau}\otimes I_s)\hat{\delta}\big)^{\top}\\&=-\frac{1}{2}\mathbf{1}_n\tau^{\top}\otimes I_s+I_n\otimes I_s+\frac{1}{4}(\mathbf{1}_n\otimes\hat{\delta}^{\top})(\hat{\tau}\otimes I_s)L+\frac{1}{4}(\mathbf{1}_n\otimes I_s)\big(2\tau^{\top}\otimes I_s-\hat{\delta}^{\top}(\hat{\tau}\otimes I_s)L\big)\\&=I_n\otimes I_s=I_{ns}.\end{aligned}$$

This completes the proof. □

Now, let us illustrate the above formula for $\Delta^{-1}$ by an example.

Example 5.3.

Consider the tree $T_1$ in Fig. 1, where the edge weights are
$$W_1=\begin{bmatrix}1 & 0\\ 0 & 1\end{bmatrix},\quad W_2=\begin{bmatrix}2 & 0\\ 0 & 1\end{bmatrix},\quad W_3=\begin{bmatrix}1 & 0\\ 0 & 2\end{bmatrix}.$$

Then
$$\Delta=\begin{bmatrix}0 & W_1^2 & (W_1+W_2)^2 & (W_1+W_3)^2\\ W_1^2 & 0 & W_2^2 & W_3^2\\ (W_1+W_2)^2 & W_2^2 & 0 & (W_2+W_3)^2\\ (W_1+W_3)^2 & W_3^2 & (W_2+W_3)^2 & 0\end{bmatrix}=\begin{bmatrix}0 & 0 & 1 & 0 & 9 & 0 & 4 & 0\\ 0 & 0 & 0 & 1 & 0 & 4 & 0 & 9\\ 1 & 0 & 0 & 0 & 4 & 0 & 1 & 0\\ 0 & 1 & 0 & 0 & 0 & 1 & 0 & 4\\ 9 & 0 & 4 & 0 & 0 & 0 & 9 & 0\\ 0 & 4 & 0 & 1 & 0 & 0 & 0 & 9\\ 4 & 0 & 1 & 0 & 9 & 0 & 0 & 0\\ 0 & 9 & 0 & 4 & 0 & 9 & 0 & 0\end{bmatrix},$$
$$L=\begin{bmatrix}W_1^{-1} & -W_1^{-1} & 0 & 0\\ -W_1^{-1} & W_1^{-1}+W_2^{-1}+W_3^{-1} & -W_2^{-1} & -W_3^{-1}\\ 0 & -W_2^{-1} & W_2^{-1} & 0\\ 0 & -W_3^{-1} & 0 & W_3^{-1}\end{bmatrix}=\begin{bmatrix}1 & 0 & -1 & 0 & 0 & 0 & 0 & 0\\ 0 & 1 & 0 & -1 & 0 & 0 & 0 & 0\\ -1 & 0 & 2.5 & 0 & -0.5 & 0 & -1 & 0\\ 0 & -1 & 0 & 2.5 & 0 & -1 & 0 & -0.5\\ 0 & 0 & -0.5 & 0 & 0.5 & 0 & 0 & 0\\ 0 & 0 & 0 & -1 & 0 & 1 & 0 & 0\\ 0 & 0 & -1 & 0 & 0 & 0 & 1 & 0\\ 0 & 0 & 0 & -0.5 & 0 & 0 & 0 & 0.5\end{bmatrix},$$
$$\beta=\sum_{i=1}^{4}\frac{\hat{\delta}_i^2}{\tau_i}=W_1^2+W_2^2+W_3^2-(W_1+W_2+W_3)^2=\begin{bmatrix}-10 & 0\\ 0 & -10\end{bmatrix},\quad\text{and}\quad\eta=\begin{bmatrix}-3 & 0\\ 0 & -3\\ 11 & 0\\ 0 & 11\\ -1 & 0\\ 0 & -3\\ -3 & 0\\ 0 & -1\end{bmatrix}.$$

Therefore,
$$L(\hat{\tau}\otimes I_s)L=\begin{bmatrix}0 & 0 & 1.5 & 0 & -0.5 & 0 & -1 & 0\\ 0 & 0 & 0 & 1.5 & 0 & -1 & 0 & -0.5\\ 1.5 & 0 & -4 & 0 & 1 & 0 & 1.5 & 0\\ 0 & 1.5 & 0 & -4 & 0 & 1.5 & 0 & 1\\ -0.5 & 0 & 1 & 0 & 0 & 0 & -0.5 & 0\\ 0 & -1 & 0 & 1.5 & 0 & 0 & 0 & -0.5\\ -1 & 0 & 1.5 & 0 & -0.5 & 0 & 0 & 0\\ 0 & -0.5 & 0 & 1 & 0 & -0.5 & 0 & 0\end{bmatrix}$$
and
$$\eta\beta^{-1}\eta^{\top}=\begin{bmatrix}-0.9 & 0 & 3.3 & 0 & -0.3 & 0 & -0.9 & 0\\ 0 & -0.9 & 0 & 3.3 & 0 & -0.9 & 0 & -0.3\\ 3.3 & 0 & -12.1 & 0 & 1.1 & 0 & 3.3 & 0\\ 0 & 3.3 & 0 & -12.1 & 0 & 3.3 & 0 & 1.1\\ -0.3 & 0 & 1.1 & 0 & -0.1 & 0 & -0.3 & 0\\ 0 & -0.9 & 0 & 3.3 & 0 & -0.9 & 0 & -0.3\\ -0.9 & 0 & 3.3 & 0 & -0.3 & 0 & -0.9 & 0\\ 0 & -0.3 & 0 & 1.1 & 0 & -0.3 & 0 & -0.1\end{bmatrix}.$$

One can verify that $\Delta^{-1}=-\frac{1}{4}L(\hat{\tau}\otimes I_s)L+\frac{1}{4}\eta\beta^{-1}\eta^{\top}$.
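The formula of Theorem 5.2 can be confirmed numerically for this example; the sketch below (ours) rebuilds $\Delta$, L, $\beta$ and $\eta$ for $T_1$ and compares $-\frac{1}{4}L(\hat{\tau}\otimes I_s)L+\frac{1}{4}\eta\beta^{-1}\eta^{\top}$ with $\Delta^{-1}$.

    import numpy as np

    s = 2
    W1, W2, W3 = np.diag([1., 1.]), np.diag([2., 1.]), np.diag([1., 2.])
    Z = np.zeros((s, s))
    sq = lambda M: M @ M

    Delta = np.block([[Z,             sq(W1),  sq(W1 + W2), sq(W1 + W3)],
                      [sq(W1),        Z,       sq(W2),      sq(W3)     ],
                      [sq(W1 + W2),   sq(W2),  Z,           sq(W2 + W3)],
                      [sq(W1 + W3),   sq(W3),  sq(W2 + W3), Z          ]])
    I1, I2, I3 = map(np.linalg.inv, (W1, W2, W3))
    L = np.block([[ I1, -I1,            Z,   Z ],
                  [-I1,  I1 + I2 + I3, -I2, -I3],
                  [ Z,  -I2,            I2,  Z ],
                  [ Z,  -I3,            Z,   I3]])

    tau = np.array([1., -1., 1., 1.])
    tau_hat = np.kron(np.diag(1 / tau), np.eye(s))        # hat(tau) x I_s
    delta_hat = np.vstack([W1, W1 + W2 + W3, W2, W3])     # ns x s

    beta = delta_hat.T @ tau_hat @ delta_hat              # equals diag(-10, -10)
    eta = 2 * np.kron(tau.reshape(-1, 1), np.eye(s)) - L @ tau_hat @ delta_hat

    X = -0.25 * L @ tau_hat @ L + 0.25 * eta @ np.linalg.inv(beta) @ eta.T
    assert np.allclose(X, np.linalg.inv(Delta))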

Acknowledgments

We are indebted to the referee for their valuable comments and suggestions.

References

  • Atik, F., Bapat, R. B., Kannan, M. R. (2019). Resistance matrices of graphs with matrix weights. Linear Algebra Appl. 571: 41–57.
  • Atik, F., Kannan, M. R., Bapat, R. B. (2021). On distance and Laplacian matrices of trees with matrix weights. Linear and Multilinear Algebra. 69(14): 2607–2619.
  • Bapat, R. B. (2006). Determinant of the distance matrix of a tree with matrix weights. Linear Algebra Appl. 416(1): 2–7.
  • Bapat, R. B. (2010). Graphs and Matrices, Vol. 27. London: Springer.
  • Bapat, R. B. (2019). Squared distance matrix of a weighted tree. Electron. J. Graph Theory Appl. (EJGTA). 7(2): 301–313.
  • Bapat, R. B., Sivasubramanian, S. (2015). Product distance matrix of a tree with matrix weights. Linear Algebra Appl. 468: 145–153.
  • Bapat, R. B., Sivasubramanian, S. (2016). Squared distance matrix of a tree: inverse and inertia. Linear Algebra Appl. 491: 328–342.
  • Bapat, R. B., Sivasubramanian, S. (2013). Product distance matrix of a graph and squared distance matrix of a tree. Appl. Anal. Discrete Math. 7(2): 285–301.
  • Das, J., Mohanty, S. (2020). On squared distance matrix of complete multipartite graphs. arXiv preprint arXiv:2012.04341.
  • Ganesh, S., Mohanty, S. (2022). Trees with matrix weights: Laplacian matrix and characteristic-like vertices. Linear Algebra Appl. 646: 195–237.
  • Hansen, J. (2021). Expansion in matrix-weighted graphs. Linear Algebra Appl. 630: 252–273.
  • Horn, R. A., Johnson, C. R. (1994). Topics in Matrix analysis. Cambridge: Cambridge University Press, Corrected reprint of the 1991 original.