The mathematization of cosmology from Kelvin to Einstein

Scott A. Walter, Faculty of Science and Technology, University of Nantes
To appear in the Supplement to the Boletim da Sociedade Portuguesa de Matemática

At the turn of the 20th century, William Thomson (Lord Kelvin) [12] observed that, by treating the stars in the sky as the molecules of a monatomic gas, one could apply kinetic gas theory to calculate the dimensions of the universe, given stellar velocities in the vicinity of the Solar System and an estimate of the average density of matter. Thomson’s insight gave rise to “stargas” models of star clusters, and of the observable universe, which were pursued until the early 1920s by the mathematical astronomers J. C. Kapteyn, Henri Poincaré, Arthur S. Eddington, Karl Schwarzschild, James Jeans, and C. V. L. Charlier, and by the theoretical physicist Albert Einstein.
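The order-of-magnitude reasoning behind Thomson’s estimate can be sketched in modern notation; the following is an illustrative reconstruction under standard assumptions (a bound, self-gravitating system of stars treated as a uniform sphere), not Thomson’s own derivation:

```latex
% Virial theorem for a bound, self-gravitating stellar system:
\[ 2T + U = 0, \qquad T = \tfrac{1}{2} M \langle v^2 \rangle, \qquad
   U = -\tfrac{3}{5}\,\frac{G M^2}{R} \quad \text{(uniform sphere)}, \]
% which fixes the mean square stellar velocity:
\[ \langle v^2 \rangle = \tfrac{3}{5}\,\frac{G M}{R}. \]
% Substituting M = \tfrac{4}{3}\pi\rho R^3 for an average matter density \rho:
\[ R = \sqrt{\frac{5\,\langle v^2 \rangle}{4 \pi G \rho}}. \]
```

In this form, observed stellar speeds near the Solar System together with an estimated mean density yield a characteristic scale $R$ for the system.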

Thomson’s suggestion arrived in a unique scientific context. From the 1860s to the turn of the twentieth century, positions and apparent magnitudes were cataloged for hundreds of thousands of stars. The director of the Munich Observatory, Hugo von Seeliger, sought to exploit this data by deriving a density law [11] for stars of a given magnitude in a solid angle. Concurrent progress in the theory of integral equations offered hope that Seeliger’s law would become more tractable, while further impetus to his statistical program was provided by Kapteyn’s announcement [5] at the Congress of Science and Arts during the World’s Fair in Saint Louis (1904) of his discovery of two star-streams. In Kapteyn’s audience was Henri Poincaré, who quickly seized on Thomson’s idea of using kinetic gas theory for modeling astronomical and cosmological phenomena.

Others soon followed, including A. S. Eddington and Karl Schwarzschild, who proposed dualist and unitary models, respectively, of the observed distribution of stellar velocities. Eddington [1] affirmed Kapteyn’s two-stream hypothesis on the basis of his analysis of the Groombridge stars, and claimed the streams were characterized by Maxwellian distributions with different constants. Shortly thereafter, Schwarzschild [10], working from a different dataset, held that there were not two star-streams but rather a single ellipsoidal velocity distribution. The two models were judged at first to represent the data equally well, and further efforts were called for to determine which was the better.

What Eddington and Schwarzschild provided in 1906–1907 were data models, and neither of them investigated the underlying dynamics. Early on, Poincaré suggested that the Milky Way was undergoing a rotation [7]; he developed this bold conjecture in the course of his Sorbonne lectures of 1910–1911 [8], the publication of which constituted the first theoretical treatise on cosmology. Notably, Poincaré derived the virial for a gaseous mass with Newtonian attraction, and took up the mixing problem. Like Poincaré, James Jeans challenged belief in the stationary state of the universe, based on his calculation of the angle of deflection of colliding stars [4]. A stargas model of globular nebulæ was investigated by Einstein in 1921 using Poincaré’s virial, as a means of fixing the value of the cosmological constant he had introduced into the field equations of general relativity in 1917 [2], and thereby obtaining an estimate of the size of the universe [3].
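The connection between the cosmological constant and the size of the universe can be stated compactly. The relation below is the standard one for Einstein’s 1917 static, homogeneous, matter-filled universe, given here for orientation; it is not a reconstruction of the 1921 virial argument itself:

```latex
% Einstein's static universe: the field equations with cosmological
% constant \Lambda admit a static, spatially spherical solution of
% mean matter density \rho only if
\[ \Lambda = \frac{4 \pi G \rho}{c^2} = \frac{1}{R^2}, \]
% so an estimate of \rho fixes \Lambda and, with it, the radius of
% curvature R = c / \sqrt{4 \pi G \rho} of the universe.
```

An independent estimate of the mean density, such as one drawn from a stargas model, thus translates directly into values for $\Lambda$ and $R$.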

Ultimately, stargas models of the universe did not stand up well against the data obtained from the powerful new telescopes at the Mount Wilson and Lowell observatories, as E. R. Paul pointed out [6]. Following the upheaval in knowledge of the structure of the Milky Way in the 1920s, stargas models remained in use in the narrower domains of star clusters and galaxies. The theorems and methods introduced in this context thus served as the foundation for research in stellar dynamics in later decades.