#### Abstract

The endomorphism ring of the group of all sequences of integers, the Baer-Specker group, is isomorphic to the ring of row finite infinite matrices over the integers. The product bases of that group are represented by the multiplicative group of invertible elements in that matrix ring. All products in the Baer-Specker group are characterized, and a lemma of László Fuchs regarding such products is revisited.

#### 1. Introduction

Abelian group notation and terminology are standard [1, 2]. $\mathbb{N}$ denotes the nonnegative integers and $\mathbb{N}^+$ the positive integers; $\mathbb{Z}$ denotes the ring of all integers. Let $P = \mathbb{Z}^{\mathbb{N}}$, the group of all integer sequences, known as the Baer-Specker group [3, 4]. For purposes of matrix-vector multiplication, an element $x$ of $P$ is viewed as a column vector, the $n$th entry of which is denoted $x_n$.

#### 2. Infinite Integral Matrices and Their Operations

To prove the results claimed, some properties of infinite integral matrices are needed. These properties are stated without proof, as they are known or easily proved.

The group of all infinite integral matrices is denoted by $M$, with addition being defined in the usual way. Matrix rows and columns are indexed by $\mathbb{N}$. Column operations (multiplying a column by $-1$, interchanging two columns, adding an integral multiple of one column to another) are carried out in the usual way.

$A \in M$ is said to be row finite if, for each $m \in \mathbb{N}$, $A_{mn} = 0$ for all but finitely many $n$; that is, if for each $m$, there exists $N(m) \in \mathbb{N}$ such that $A_{mn} = 0$ unless $n \le N(m)$. A column finite matrix is defined analogously. The additive group of all row finite matrices in $M$ is denoted by $RF$. $A \in M$ is said to be lower triangular if $A_{mn} = 0$ unless $n \le m$. The additive group of all lower triangular matrices in $M$ is denoted by $LT$. Obviously $LT \subseteq RF$.

Recall that multiplication of two infinite matrices may not be well defined and, even when defined, may not be associative. For examples of pathologies of infinite matrices, see Section 5. However, for $A \in RF$ and $B \in M$, the product of $A$ and $B$, denoted $AB$, is defined as

$$(AB)_{mn} = \sum_{k \in \mathbb{N}} A_{mk}B_{kn}, \quad m, n \in \mathbb{N}.$$

Because $A$ is row finite, the sum reduces to $\sum_{k=0}^{N(m)} A_{mk}B_{kn}$, where $A_{mk} = 0$ for $k > N(m)$. Under multiplication thus defined, $RF$ and $LT$ are rings with identity $I$ satisfying $I_{mn} = \delta_{mn}$, the Kronecker delta.
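The finite-sum multiplication just defined can be illustrated on truncations. The following sketch is not from the paper; the sparse dict-of-entries representation is an assumption made purely for illustration.

```python
# Illustrative sketch only: a row finite integral matrix is modeled as a
# sparse dict {(m, n): nonzero entry}; each row has finite support, so every
# entry of the product AB is the finite sum  sum_k A[m,k] * B[k,n].

def mat_mul(A, B):
    """Return AB for sparse matrices A, B; each entry is a finite sum."""
    C = {}
    for (m, k), a in A.items():
        for (k2, n), b in B.items():
            if k2 == k:
                C[(m, n)] = C.get((m, n), 0) + a * b
    return {mn: v for mn, v in C.items() if v != 0}

def identity(size):
    """Truncation of the identity matrix I, with I_{mn} = delta_{mn}."""
    return {(n, n): 1 for n in range(size)}
```

For instance, `mat_mul(A, identity(k))` returns `A` for any truncated `A`, mirroring $AI = A = IA$.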

$U \in RF$ is said to be invertible if there exists $V \in RF$ satisfying $UV = VU = I$. It is clear that such a $V$ is a unique two-sided inverse, so that $U^{-1} = V$ is well defined [5, pages 21–25]. The set of invertible matrices in $RF$ is denoted $RF^*$ and forms a group under matrix multiplication.

Proposition 2.1. *If $B$ is obtained from $A \in RF$ through a column operation, then $B = AE$, where $E$ is obtained by performing the same operation on $I$. $E$ is both row and column finite and invertible.*

The following corollary is immediate.

Corollary 2.2. *If $B$ is obtained from $A \in RF$ through a finite sequence of column operations, then $B = AE$, where $E$ is obtained by performing the same sequence of operations on $I$. $E$ is both row and column finite and invertible.*
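As a hedged finite-dimensional sketch of Proposition 2.1 (list-of-lists truncations; the helper names are illustrative, not the paper's):

```python
# Sketch: a column operation on A equals right multiplication by the
# elementary matrix E obtained by applying the same operation to I.

def add_col_multiple(A, src, dst, c):
    """Add c times column src to column dst (returns a modified copy)."""
    B = [row[:] for row in A]
    for row in B:
        row[dst] += c * row[src]
    return B

def matmul(A, B):
    n = len(A)
    return [[sum(A[m][k] * B[k][j] for k in range(n)) for j in range(n)]
            for m in range(n)]

I2 = [[1, 0], [0, 1]]
A = [[1, 2], [3, 4]]
E = add_col_multiple(I2, 0, 1, 5)   # same operation performed on I
```

Here `add_col_multiple(A, 0, 1, 5)` equals `matmul(A, E)`, mirroring $B = AE$, and the inverse operation on $I$ yields $E^{-1}$.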

Let $\{v_n : n \in \mathbb{N}\}$ be a subset of $P$. If each expression $\sum_{n \in \mathbb{N}} c_nv_n$, $c_n \in \mathbb{Z}$, represents an element of $P$, then the set of all such elements forms a subgroup of $P$, denoted $\prod_n\langle v_n\rangle$ and called a product in $P$ [2, page 164]. Such a subset forms a product if and only if the matrix with columns $v_n$ is row finite. If the representations are unique, then the $v_n$ are independent and $\{v_n\}$ is said to be a basis of the product. One of the objectives of this paper is to characterize products and product bases in terms of endomorphisms of $P$.

For $A \in RF$ and $x \in P$, the matrix-vector product of $A$ and $x$, denoted simply $Ax$, is defined as $(Ax)_m = \sum_{n \in \mathbb{N}} A_{mn}x_n$, with $Ax \in P$. As with matrix multiplication, because $A$ is row finite, the sum reduces to $\sum_{n=0}^{N(m)} A_{mn}x_n$, where $A_{mn} = 0$ for $n > N(m)$. Each such $A$ naturally induces an endomorphism of $P$ via matrix-vector multiplication. Indeed, a product in $P$, $\prod_n\langle v_n\rangle$, is simply the image of such an endomorphism, where the matrix is constructed with columns $v_n$.
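A minimal sketch (truncated vectors; nothing here is from the paper's text) of how an element $\sum_n c_nv_n$ of a product is evaluated coordinatewise as a matrix-vector product:

```python
# Sketch: the element sum_n c_n v_n of a product is the matrix-vector
# product A c, where column n of A is v_n; row finiteness of A makes each
# coordinate a finite sum.

def product_element(columns, c):
    """Return sum_n c[n] * columns[n] for truncated column vectors."""
    rows = max(len(v) for v in columns)
    y = [0] * rows
    for n, v in enumerate(columns):
        for m, entry in enumerate(v):
            y[m] += c[n] * entry
    return y
```

With the identity columns this recovers `c` itself, mirroring the fact that the standard basis vectors form a product basis.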

All endomorphisms of $P$ are determined by their values on $S$, the free subgroup of $P$ consisting of sequences which are $0$ after a while [2, Lemma 94.1]. The standard group basis of $S$ is $\{e_n : n \in \mathbb{N}\}$, where $(e_n)_m = \delta_{mn}$, and the $e_n$ also form the standard product basis of $P$. In particular, all endomorphisms of $P$ are determined by their values on this standard basis. As a result, with each element $\varphi$ in the endomorphism ring of $P$, $\operatorname{End}(P)$, is associated a unique matrix $A_\varphi$, given by taking $\varphi(e_n)$ as column $n$ of $A_\varphi$. The immediate goal is to prove that each such $A_\varphi$ is row finite, so that every endomorphism of $P$ is effectively multiplication by a row finite matrix.

#### 3. Endomorphisms and Product Bases

The main theorem follows.

Theorem 3.1. *Every endomorphism of $P$ is induced by the action of a row finite, infinite, integral matrix.*

*Proof. *Let $\varphi \in \operatorname{End}(P)$, and let $A = A_\varphi$ be the matrix associated with $\varphi$, with column $n$ equal to $\varphi(e_n)$, $n \in \mathbb{N}$. Suppose that $A$ is not row finite; that is, suppose there are $m \in \mathbb{N}$ and distinct $n_j \in \mathbb{N}$ such that $A_{mn_j} \ne 0$ for all $j \in \mathbb{N}$. Let $\pi_m$ denote the canonical projection of elements of $P$ onto their $m$th entries in $\mathbb{Z}$. Then $\pi_m\varphi$ would be a homomorphism of $P$ into $\mathbb{Z}$, which is nonzero at all $e_{n_j}$, $j \in \mathbb{N}$, an impossibility [4, Satz III]. Hence it is not possible that an infinite number of the $A_{mn}$ are nonzero; that is, $A$ must be row finite.

To see that $A$ acts on $P$ to produce $\varphi$, it suffices to check agreement on the standard basis of $S$. For $n \in \mathbb{N}$, $Ae_n$ is column $n$ of $A$, which is $\varphi(e_n)$.

Corollary 3.2. *(i) The endomorphism ring of $P$, $\operatorname{End}(P)$, is isomorphic to the ring $RF$ of row finite, infinite, integral matrices. (ii) The automorphism group of $P$, $\operatorname{Aut}(P)$, is isomorphic to the multiplicative group $RF^*$ of invertible matrices in $RF$.*

*Proof. *For $\varphi \in \operatorname{End}(P)$, the map $\varphi \mapsto A_\varphi$ is a ring isomorphism of $\operatorname{End}(P)$ with $RF$, which maps $\operatorname{Aut}(P)$ isomorphically onto $RF^*$. The row finiteness of the elements of $RF$ ensures that all sums are finite and that matrix multiplication is associative.

Corollary 3.3. *There is a one-to-one correspondence between the matrices of $RF^*$ and the product bases of $P$.*

*Proof. *Let $\{v_n : n \in \mathbb{N}\}$ be a product basis for $P$, and let $\varphi$ be defined by $\varphi(x) = \sum_{n \in \mathbb{N}} x_nv_n$ for $x \in P$, so that $\varphi(e_n) = v_n$, $n \in \mathbb{N}$. The uniqueness of expression of the elements of $P$ in terms of the product basis guarantees that $\varphi$ is an automorphism. By Corollary 3.2(ii), $\varphi$ corresponds to a unique element of $RF^*$. The converse is clear.

#### 4. Products in P

Topological techniques are helpful in studying the endomorphisms of $P$. Background material may be found in [6–8] and the references cited therein. For distinct $x, y \in P$, the distance between them, $d(x, y)$, is defined to be $1/2^k$, where $k$ is the first place at which $x_k \ne y_k$. Of course, $d(x, x) = 0$. It is easy to check that $d$ is a metric on $P$, which comports with the product topology on $P = \mathbb{Z}^{\mathbb{N}}$, when $\mathbb{Z}$ is discrete.

$P$ is a separable metric space having $S$ as a countable dense subset. Finally, $P$ is a complete metric space in which Cauchy sequences eventually become constant pointwise.
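The metric can be sketched as follows (equal-length truncations stand in for elements of $P$; the use of `Fraction` for exact values is an illustrative assumption):

```python
# Sketch of the metric on P: d(x, y) = 1/2**k, with k the first coordinate
# at which the (truncated) sequences x and y differ, and d(x, x) = 0.

from fractions import Fraction

def dist(x, y):
    """Ultrametric distance between two equal-length integer sequences."""
    for k, (a, b) in enumerate(zip(x, y)):
        if a != b:
            return Fraction(1, 2 ** k)
    return Fraction(0)
```

On such truncations one can check the strong triangle inequality $d(x, z) \le \max(d(x, y), d(y, z))$, which underlies the eventual pointwise constancy of Cauchy sequences.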

The important aspect of this topology is that all endomorphisms of $P$ are continuous.

Similarly, $RF$ is a complete separable metric space, with the distance between distinct matrices $A$ and $B$ defined as $1/2^k$, where $k = \min\{m + n : A_{mn} \ne B_{mn}\}$.

Theorem 4.1. *Every product in $P$ can be generated by a lower triangular matrix, the columns of which form a basis of the product.*

*Proof. *Let the columns of $A \in RF$ generate the product $A(P)$. The proof proceeds by transforming $A$ via column operations into a lower triangular matrix $T$ satisfying $T(P) = A(P)$. Since the case $A = 0$ is trivial, assume $A \ne 0$. Let $A_0 = A$ and $x_0 = x$ for $x \in P$, viewed as a column vector $(x_0, x_1, \ldots)^t$, where $t$ denotes transpose, and suppose $A_{mn} = 0$ for $n > N(m)$, $m \in \mathbb{N}$.

If row $0$ of $A$ consists entirely of $0$'s, simply set $A_1 = A$, $E_1 = I = F_1$, and proceed to the next step. Otherwise, perform column operations on the nonzero entries of row $0$ of $A$ to obtain a new matrix $A_1$ in which $(A_1)_{0n} = 0$ for all $n \ge 1$. This is tantamount to finding the gcd of the entries in row $0$, columns $0, \ldots, N(0)$, of $A$. Perform the same column operations on $I$ to produce $E_1$, so that $A_1 = AE_1$ with $E_1$ an invertible row and column finite matrix. Let $F_1$ denote the inverse of $E_1$. Since $A = A_1F_1$, $A(P) = A_1(P)$. Set $x_1 = F_1x$ so that $A_1x_1 = Ax$.

If $(A_1)_{1n} = 0$ for all $n \ge 2$, simply set $A_2 = A_1$, $E_2 = I = F_2$, and proceed. Otherwise, perform column operations on the nonzero entries of row $1$ in columns $1, 2, \ldots$ of $A_1$ to obtain a new matrix $A_2$ in which $(A_2)_{1n} = 0$ for all $n \ge 2$. As before, this is tantamount to finding the gcd of the entries in row $1$, columns $1, \ldots, N(1)$, of $A_1$. Note that $(A_2)_{m0} = (A_1)_{m0}$ for all $m \in \mathbb{N}$; that is, column $0$ of $A_2$ is the same as column $0$ of $A_1$.

Perform the same column operations on $I$ to obtain $E_2$, so that $A_2 = A_1E_2$ with $E_2$ an invertible row and column finite matrix. Note that since no operation was performed on column $0$ of $A_1$, none was performed on column $0$ of $I$. Moreover, row $0$ of $I$ remained unchanged (because of its $0$ entries in columns $\ge 1$), so that $(E_2)_{00} = 1$ and the rest of the row $0$ and column $0$ entries of $E_2$ are $0$. Thus the same is true of $F_2$, the inverse of $E_2$. Since $A_1 = A_2F_2$, $A_1(P) = A_2(P)$. Set $x_2 = F_2x_1$ so that $A_2x_2 = A_1x_1 = Ax$. Because row $0$ of $F_2$ is $(1, 0, 0, \ldots)$, it follows that $(x_2)_0 = (x_1)_0$.

Suppose that after $k$ iterations, row finite matrices $A_1, \ldots, A_k$ and invertible row and column finite matrices $E_1, \ldots, E_k$ with inverses $F_1, \ldots, F_k$ have been obtained, such that

(a) $(A_j)_{mn} = 0$ for $m < j$ and $n > m$, $1 \le j \le k$; that is, the matrices are becoming increasingly lower triangular;

(b) column $m$ of $A_j$ is identical to column $m$ of $A_k$ for $m < j \le k$; that is, after the $j$th iteration, rows and columns $0, \ldots, j-1$ do not change;

(c) $A_k = AE_1E_2 \cdots E_k$, so that $A_k(P) = A(P)$;

(d) $(E_j)_{mn} = \delta_{mn} = (F_j)_{mn}$ for $m < j-1$ or $n < j-1$; that is, the $E_j$'s and their inverses are becoming increasingly diagonal with $1$'s down the diagonal, tending toward the identity matrix;

(e) $A_kx_k = Ax$, where $x_k = F_k \cdots F_1x$, $x \in P$; that is, $Ax$ is always in the image of $A_k$;

(f) $(x_j)_m = (x_k)_m$ for $m < j-1 \le k-1$; that is, after the $j$th iteration, the first $j-1$ entries of the $x_j$'s do not change.

As before, if $(A_k)_{kn} = 0$ for all $n > k$, simply set $A_{k+1} = A_k$, $E_{k+1} = I = F_{k+1}$, and continue. Otherwise, perform column operations on the nonzero entries of the $k$th row in columns $k, k+1, \ldots$ of $A_k$ to obtain a new matrix $A_{k+1}$ in which $(A_{k+1})_{kn} = 0$ for all $n > k$. Because no operation is performed on columns $0$ through $k-1$, $(A_{k+1})_{mn} = (A_k)_{mn}$ for $n \le k-1$, $m \in \mathbb{N}$; that is, columns $0$ through $k-1$ of $A_{k+1}$ are the same as columns $0$ through $k-1$ of $A_k$.

Perform the same column operations on $I$ to produce $E_{k+1}$, so that $A_{k+1} = A_kE_{k+1}$ with $E_{k+1}$ an invertible row and column finite matrix having inverse $F_{k+1}$. Note that since no operation was performed on columns $0, \ldots, k-1$ of $A_k$, none was performed on columns $0, \ldots, k-1$ of $I$. Moreover, rows $0, \ldots, k-1$ of $I$ were not changed (because of their $0$ entries in columns $\ge k$), and the same is true of $E_{k+1}$ and $F_{k+1}$. (i) As a result, $(E_{k+1})_{mn} = \delta_{mn} = (F_{k+1})_{mn}$ for $m < k$ or $n < k$. Since $A_k = A_{k+1}F_{k+1}$, $A_k(P) = A_{k+1}(P) = A(P)$. (ii) Set $x_{k+1} = F_{k+1}x_k$ so that $A_{k+1}x_{k+1} = Ax$. (i) and (ii) imply that $(x_{k+1})_m = (x_k)_m$ for $m < k$.

The Cauchy sequence $\{x_k\}$ converges to some $x_\infty \in P$ because $P$ is a complete metric space. Since all endomorphisms of $P$ are continuous, the sequence $\{A_kx\}$ converges for each $x \in P$. Because of (b) and (f), the convergence is uniform. The Cauchy sequence of matrices $\{A_k\}$ thus obtained converges to a lower triangular matrix $T$, and $Tx_\infty = \lim_k A_kx_k = Ax$, so that $A(P) \subseteq T(P)$. From (b) it follows that $Ty = \lim_k A_ky \in A(P)$ for each $y \in P$, and so $T(P) \subseteq A(P)$.

It may, of course, happen that some of the diagonal terms of $T$ are $0$, so that the first nonzero entry in a column may be below the diagonal. Should that occur, the first nonzero entry in the next column will be in a lower row. If $T$ has only finitely many nonzero columns, the result is elementary. If all columns of $T$ are nonzero, it is clear that an infinite linear combination of its columns can be $0$ only if all coefficients are $0$, so that the columns of $T$ form a product basis.
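The column reduction used in the proof can be sketched on finite truncations. This is an illustrative implementation under stated assumptions, not the paper's procedure verbatim: repeated Euclidean column operations clear each row to the right of the diagonal.

```python
# Finite-dimensional sketch of the reduction in Theorem 4.1: integral
# column operations (a Euclidean gcd process) clear each row to the right
# of the diagonal, yielding a lower triangular matrix with the same
# integral column span.

def lower_triangularize(A):
    """Return a lower triangular matrix obtained from A by integral column operations."""
    A = [row[:] for row in A]
    size = len(A)
    for k in range(size):
        # Euclid on columns k..size-1 until row k has at most one nonzero entry there.
        while True:
            nz = [j for j in range(k, size) if A[k][j] != 0]
            if len(nz) <= 1:
                break
            j0 = min(nz, key=lambda j: abs(A[k][j]))
            for j in nz:
                if j != j0:
                    q = A[k][j] // A[k][j0]
                    for m in range(size):
                        A[m][j] -= q * A[m][j0]
        nz = [j for j in range(k, size) if A[k][j] != 0]
        if nz and nz[0] != k:
            j = nz[0]  # swap the surviving column into position k
            for m in range(size):
                A[m][k], A[m][j] = A[m][j], A[m][k]
    return A
```

On `[[4, 6], [1, 1]]` the surviving corner entry is the gcd $2$ of the first row, mirroring the gcd remark in the proof.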

*Remark 4.2. *From the results in Section 3, a matrix $A \in RF$ can be classified as epic or monic if the endomorphism of $P$ which it induces is epic or monic, respectively. (1) If $A$ has a row $m$ of $0$'s, it cannot be epic because $e_m$ cannot be reached. If it has a column $n$ of $0$'s, it cannot be monic because $e_n$ is in the kernel. Here $\{e_n\}$ denotes the usual basis of $S$. (2) Similarly, if a row (column) of $A$ is a multiple of another row (column), then $A$ cannot be epic (monic). (3) $T \in LT$ is invertible (two-sided) over the rationals if and only if each diagonal entry is nonzero [5, pages 19–20]. (4) $T \in LT$ is integrally invertible if and only if each diagonal entry is $\pm 1$.

Lemma 4.3. *$T \in LT$ is epic if and only if each diagonal entry of $T$ is $\pm 1$.*

*Proof. *If each diagonal entry of $T$ is $\pm 1$, then $Tx = y$ can be solved recursively for every $y \in P$, and so $T$ is epic. Conversely, let $y$ be a nonzero element of $P$. To solve $Tx = y$, induct on $m$. Let $T_m$ denote the finite matrix $(T_{ij})_{0 \le i, j \le m}$. If $T_{00} \ne \pm 1$ and $y_0 = 1$, then $T_{00}x_0 = y_0$ is not solvable. If $T_{00} = \pm 1$, then $x_0$ is uniquely determined. Suppose that the diagonal entries $T_{nn} = \pm 1$ for $n < m$, and suppose further that $T_{m-1}x = y$ is uniquely solvable, $x = T_{m-1}^{-1}y$, where $T_{m-1}^{-1}$ denotes the inverse of $T_{m-1}$. Express
$$T_m = \begin{pmatrix} T_{m-1} & 0 \\ r_m & T_{mm} \end{pmatrix}, \qquad T_m\begin{pmatrix} x \\ x_m \end{pmatrix} = \begin{pmatrix} y \\ y_m \end{pmatrix},$$
so that
$$T_{mm}x_m = y_m - r_mx,$$
with some abuse of notation for the sake of simplicity. Now by the induction hypothesis, $x = T_{m-1}^{-1}y$ uniquely. Thus the only solution must come from $T_{mm}x_m = y_m - r_mx$, which is impossible in general unless $T_{mm} = \pm 1$.
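The sufficiency direction of Lemma 4.3, that a lower triangular matrix with unit diagonal is epic, amounts to forward substitution. A minimal sketch on finite truncations (illustrative names, not from the paper):

```python
# Sketch of the first half of Lemma 4.3: when every diagonal entry of a
# lower triangular integral matrix is +1 or -1, T x = y is solved over the
# integers by forward substitution, so T is epic.

def solve_lower_unitriangular(T, y):
    """Solve T x = y for integral x, assuming T lower triangular with diagonal +-1."""
    x = []
    for m in range(len(y)):
        s = y[m] - sum(T[m][n] * x[n] for n in range(m))
        assert T[m][m] in (1, -1), "diagonal must be a unit for integral solvability"
        x.append(s * T[m][m])  # dividing by +-1 is the same as multiplying by +-1
    return x
```

When some $T_{mm} \ne \pm 1$, the same recursion shows that the $m$th step generally has no integral solution, which is the converse direction of the lemma.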

Invertible lower triangular infinite integral matrices may therefore be characterized as follows.

Theorem 4.4. *For $T \in LT$ the following are equivalent.* (1) *$T$ is epic.* (2) *$T_{nn} = \pm 1$ for all $n \in \mathbb{N}$.* (3) *$T$ is invertible in $LT$.*

By now the following result is clear.

Corollary 4.5. *$T \in LT$ is monic if each diagonal entry $T_{nn}$, $n \in \mathbb{N}$, is nonzero.*

#### 5. Infinite Matrix Examples and Counterexamples

The next examples and counterexamples serve to justify some of the assumptions about matrix rings made in this paper. They illustrate some of the pitfalls and limitations of working with infinite matrices and further illustrate results from Section 4. They are not new but continue to be discussed in various contexts: [9, section 2]; [10, page 437]; [7, pages 189–191].

As before, the elements of $P$ are viewed as column vectors and $\{e_n : n \in \mathbb{N}\}$ denotes the usual basis of $S$; $\{e_n\}$ is also the standard product basis of $P$. The same symbol is used for the matrix with the $e_n$ as columns, that is, the identity matrix $I$. The same liberty is taken with other bases and their matrices.

*Example 5.1. *$\{b_n : n \in \mathbb{N}\}$, where $b_n = e_0 + e_1 + \cdots + e_n$, also is a basis of $S$, having matrix
$$B = \begin{pmatrix} 1 & 1 & 1 & \cdots \\ 0 & 1 & 1 & \cdots \\ 0 & 0 & 1 & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix}.$$
Because the matrix $B$ is not row finite, $\{b_n\}$ cannot be a product basis of $P$.

*Example 5.2. *$\{e_1, e_0, e_3, e_2, \ldots\}$ also is a basis of $S$, having matrix
$$C = \begin{pmatrix} 0 & 1 & 0 & 0 & \cdots \\ 1 & 0 & 0 & 0 & \cdots \\ 0 & 0 & 0 & 1 & \cdots \\ 0 & 0 & 1 & 0 & \cdots \\ \vdots & \vdots & \vdots & \vdots & \ddots \end{pmatrix}.$$
Note that $C$ is row finite but not lower triangular.

*Example 5.3. *Let $u$ denote the column vector having all entries $1$ and let $v_n = u - b_n = e_{n+1} + e_{n+2} + \cdots$, with the $b_n$ as in Example 5.1. Note that $\{v_n : n \in \mathbb{N}\}$ is not a basis of $S$; indeed, no element $v_n$ is even contained in $S$. The accompanying lower triangular matrix is
$$V = \begin{pmatrix} 0 & 0 & 0 & \cdots \\ 1 & 0 & 0 & \cdots \\ 1 & 1 & 0 & \cdots \\ 1 & 1 & 1 & \ddots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix}.$$
Let $U$ denote the row finite matrix with columns $-e_0, e_0 - e_1, e_1 - e_2, \ldots$, and let $B$ be the matrix of Example 5.1. The following relationships hold:
$$Uu = 0, \qquad UV = I, \qquad VU \ne I, \qquad BU = -I.$$
Observe that, although both $U$ and $V$ are row finite and their columns are even pure independent, $U$ is not monic ($u$ is in the kernel), and $V$ is not epic ($e_0$ is not attained). In particular, $(BU)u = -u$, but $B(Uu) = B0 = 0$, so that multiplication is not associative. Despite the fact that $V$ is monic, $\{v_n\}$ nevertheless cannot serve as a product basis for $P$. The problem, of course, is that $V$ is not invertible in $RF$. Restriction to multiplicatively associative matrix rings helps to eliminate much of this infinite matrix pathology [5, pages 19–22].
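These pathologies can be checked numerically on truncations. In this entirely illustrative sketch, $U$ has $-1$ on the diagonal and $1$ on the superdiagonal, $V$ is strictly lower triangular of $1$'s, and $B$ is upper triangular of $1$'s; boundary rows of a truncation misbehave and are skipped.

```python
# Sketch: N x N truncations of the matrices of this example.  Note that B
# is NOT row finite as an infinite matrix, which is the source of the
# associativity failure.

N = 8
U = [[(1 if n == m + 1 else 0) - (1 if n == m else 0) for n in range(N)]
     for m in range(N)]
V = [[1 if m > n else 0 for n in range(N)] for m in range(N)]
B = [[1 if m <= n else 0 for n in range(N)] for m in range(N)]
I_N = [[1 if m == n else 0 for n in range(N)] for m in range(N)]

def matmul(X, Y):
    return [[sum(X[m][k] * Y[k][n] for k in range(N)) for n in range(N)]
            for m in range(N)]

UV = matmul(U, V)   # agrees with I away from the truncation boundary
VU = matmul(V, U)   # row 0 is zero, so VU != I
BU = matmul(B, U)   # equals -I exactly, even on truncations
```

Every full row of $U$ sums to $0$, so $Uu = 0$ in $P$; hence $B(Uu) = 0$ while $(BU)u = -u$.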

In the proof of Theorem 4.1, although the columns of the resulting lower triangular matrix $T$ obviously are independent, $T$ nevertheless may be singular, as illustrated by the matrix $V$ of Example 5.3. Moreover, it may not be the case that the products $E_1E_2 \cdots E_k$ in that proof will converge to a row finite matrix, as further illustrated by that example. Even though the matrix $U$ of Example 5.3 is not monic, the lower triangular matrix which results from column operations on $U$ is monic.

#### 6. Fuchs' Lemma 95.1 Redux

Products in $P$ were introduced in [2] by beginning with a countable subset $\{a_n : n \in \mathbb{N}\}$ of $P$, with no condition other than that all sums of the form $\sum_{n \in \mathbb{N}} c_na_n$, $c_n \in \mathbb{Z}$, are well defined. As mentioned, this means that the infinite matrix $A$ with columns $a_n$ must be row finite. Lemma 95.1 then was stated as follows in [2].

Let $G$ be a product in $P$. There are elements $b_n \in P$ and integers $k_n \ge 0$ such that $\{b_n\}$ is a basis of the product $\prod_n\langle b_n\rangle$ and $G = \prod_n\langle k_nb_n\rangle$, where (•) $k_n \mid k_{n+1}$ if $k_{n+1} \ne 0$, for all $n \in \mathbb{N}$.

The goal was to establish a product analog of the well-known result for stacked bases of free groups of finite rank [1, Lemma 15.4]. However, a counterexample has been produced [11].

Perhaps another approach is to utilize the weaker definition of stacked bases found in [12], which does not require divisibility as in (•). From Theorem 4.1, there would be no loss of generality in assuming that the $a_n$ form a basis of the product and that the matrix $A$ with columns $a_n$ is lower triangular.

Lemma 6.1. *Let $G = \prod_n\langle a_n\rangle$ be a product of infinite rank in $P$, let $H$ be the subgroup of $P$ which the $a_n$ generate, and let $H_*$ be the pure subgroup which they generate. Then there exist a product basis $\{b_n\}$ of $P$ and positive integers $k_n$ such that $G = \prod_n\langle k_nb_n\rangle$ if and only if there exist stacked bases of $H_*$ and $H$ such that the matrix $B$ formed by the basis of $H_*$ is invertible and the matrix $C$ formed by the basis of $H$ satisfies $C = BD$, where $D = \operatorname{diag}(k_0, k_1, \ldots)$.*

*Proof. *If $\{b_n\}$ is a product basis of $P$, then the $b_n$'s are pure independent. If the $k_n$ are positive integers, then $\prod_n\langle k_nb_n\rangle$ is a product in $P$, and if $G = \prod_n\langle k_nb_n\rangle$, then $H = \langle k_nb_n : n \in \mathbb{N}\rangle$ and $H_* = \langle b_n : n \in \mathbb{N}\rangle$, so that $\{b_n\}$ and $\{k_nb_n\}$ are stacked bases for the respective subgroups $H_*$ and $H$ of $P$. The matrix $B$ formed by the $b_n$'s is in $RF^*$ by Corollary 3.3; the matrix $C$ formed by the $k_nb_n$'s certainly satisfies $C = BD$.

Conversely, let $G = \prod_n\langle a_n\rangle$ be a product in $P$, let $H$ be the subgroup of $P$ which the $a_n$'s generate, and let $H_*$ be the pure subgroup which they generate. Suppose there are stacked bases $\{b_n\}$ of $H_*$ and $\{k_nb_n\}$ of $H$; because $H_*/H$ is torsion, all $k_n$'s must be positive. Suppose further that the matrix $B$ formed by the $b_n$'s is invertible; then the $b_n$'s are a product basis of $P$ and $\prod_n\langle k_nb_n\rangle$ is a product in $P$. Finally, if the matrix $C$ formed by the $k_nb_n$'s satisfies $C = BD$, the proof is concluded.

*Remark 6.2. *(1) If $G = \prod_n\langle a_n\rangle$ is a product in $P$ and $\{b_n\}$ is a basis of the subgroup generated by the $a_n$'s, the matrix formed by the $b_n$'s need not be row finite. (2) As previously noted, the lower triangular matrix $T$, obtained from the column operations in Theorem 4.1, may be singular. However, Nunke [8, page 199] has shown that every endomorphic image of $P$ is either free of finite rank or isomorphic to $P$ itself. In the latter case, there then exists a monic $W \in RF$ satisfying $W(P) = T(P)$. Column operations will, of course, reveal matrix rank, finite or infinite.

All products in $P$ may therefore be characterized as follows.

Theorem 6.3. *Every product in $P$ is generated by a lower triangular, infinite, integral matrix, the columns of which form a basis of the product, and the rank of which determines whether the product is free of finite rank or isomorphic to $P$.*

*Generalizations*

The author's initial draft of this paper was done for higher dimensions, along the lines of [13]. At the referee's suggestion, matrix dimension has been restricted to $2$, to improve readability.