Products of matrices are similar if one of them is invertible

inverse
similarity

Let \(A\) and \(B\) be two \(n \times n\) matrices such that \(B\) is invertible. Then \(AB\) and \(BA\) are similar.


The proof is just one line: \[ BA = B(AB)B^{-1} \] so \(B\) itself conjugates \(AB\) into \(BA\). The result also holds if \(A\) is invertible instead of \(B\): the analogous identity \(AB = A(BA)A^{-1}\) does the job. Of course, the result holds when both \(A\) and \(B\) are invertible.
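
Below is a minimal NumPy sketch that checks the identity numerically. The \(4 \times 4\) size and the random matrices are illustrative assumptions; a generic random \(B\) is taken to be invertible.

```python
# Numerical sanity check of BA = B (AB) B^{-1} with random matrices.
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))       # A need not be invertible
B = rng.standard_normal((n, n))       # a generic random B is invertible

lhs = B @ A
rhs = B @ (A @ B) @ np.linalg.inv(B)  # conjugate AB by B
print(np.allclose(lhs, rhs))          # True: B conjugates AB into BA

# Similar matrices share a characteristic polynomial.
print(np.allclose(np.poly(A @ B), np.poly(B @ A)))  # True
```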


Linear Transformations Perspective

We can view this result from the perspective of linear transformations. Let \(A\) be the matrix representation of a linear transformation \(T: \mathbb{R}^{n} \rightarrow \mathbb{R}^{n}\) with the standard basis \(\beta\) on both the domain and the co-domain: \[ A = [T]_{\beta}^{\beta} \] Throughout, \([T]_{\gamma}^{\delta}\) denotes the matrix of \(T\) with basis \(\gamma\) on the domain and basis \(\delta\) on the co-domain.

Let \(\beta_{1}\) be the basis formed by the columns of \(B\), and let \(\beta_{2}\) be the basis formed by the columns of \(B^{-1}\). Then we have: \[ \begin{aligned} B &= [I]_{\beta_1}^{\beta}\\\\ &= [I]_{\beta}^{\beta_2} \end{aligned}\quad \quad \quad \quad \begin{aligned} B^{-1} &= [I]_{\beta_2}^{\beta}\\\\ &= [I]_{\beta}^{\beta_1} \end{aligned} \]

What does \(AB\) mean? \[ AB = [T]_{\beta}^{\beta} [I]_{\beta_1}^{\beta} = [T]_{\beta_1}^{\beta} \] \(AB\) is the matrix representation of \(T\) with basis \(\beta_1\) for the domain and \(\beta\) for the co-domain, where \(\beta_1\) consists of the columns of \(B\). Post-multiplying \(A\) by an invertible matrix just changes the basis of the domain for the corresponding linear transformation. It leaves the underlying transformation unchanged.
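
As a quick check of this interpretation, the sketch below builds \([T]_{\beta_1}^{\beta}\) column by column, applying \(T\) to each column of \(B\) and recording the result in standard coordinates, and compares it to \(AB\). The size and the random matrices are illustrative assumptions.

```python
# Check that AB = [T]_{beta_1}^{beta}, built one column at a time.
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))   # A = [T]_beta^beta, i.e. T(x) = A x
B = rng.standard_normal((n, n))   # the columns of B form the basis beta_1

# Column j: apply T to the j-th basis vector of beta_1, keep standard coordinates.
cols = [A @ B[:, j] for j in range(n)]
T_beta1_to_beta = np.column_stack(cols)

print(np.allclose(T_beta1_to_beta, A @ B))   # True
```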

What does \(BA\) mean? \[ BA = [I]_{\beta}^{\beta_2} [T]_{\beta}^{\beta} = [T]_{\beta}^{\beta_2} \] \(BA\) is the matrix representation of \(T\) with basis \(\beta\) for the domain and \(\beta_2\) for the co-domain, where \(\beta_2\) consists of the columns of \(B^{-1}\). Pre-multiplying \(A\) by an invertible matrix just changes the basis of the co-domain for the corresponding linear transformation. It leaves the underlying transformation unchanged.
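
The same kind of sketch checks that \(BA = [T]_{\beta}^{\beta_2}\): apply \(T\) to each standard basis vector and express the result in the basis \(\beta_2\) given by the columns of \(B^{-1}\). Again, the size and the random matrices are only illustrative.

```python
# Check that BA = [T]_{beta}^{beta_2}, built one column at a time.
import numpy as np

rng = np.random.default_rng(2)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
B_inv = np.linalg.inv(B)          # the columns of B^{-1} form the basis beta_2

# Coordinates c of a vector v with respect to beta_2 solve B^{-1} c = v.
cols = [np.linalg.solve(B_inv, A[:, j]) for j in range(n)]   # A[:, j] = T(e_j)
T_beta_to_beta2 = np.column_stack(cols)

print(np.allclose(T_beta_to_beta2, B @ A))   # True
```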