The oxygen isotope ratio ¹⁸O/¹⁶O (expressed as a δ¹⁸O_VSMOW value) in marine sedimentary rocks has increased by ~8‰ from the early Paleozoic to modern times. Interpretation of this trend is hindered by ambiguities in the formation temperature of the carbonate, the δ¹⁸O of seawater (δ¹⁸O_seawater), and the effects of postdepositional diagenesis. Carbonate clumped isotope measurements, a temperature proxy, offer constraints on this problem. This thermometer is thermodynamically controlled in cases where the carbonate achieves an equilibrium internal distribution of isotopes, and it is independent of the δ¹⁸O of the water from which the carbonate grew; it therefore rests on a relatively rigorous foundation in physical chemistry and can be applied in settings where the δ¹⁸O of the water is not known. We apply this technique to an exceptionally well-preserved Ordovician carbonate record from the Baltic Basin and present a framework for interpreting clumped isotope results and for reconstructing past δ¹⁸O_seawater. We find that Ordovician seawater had lower δ¹⁸O_seawater values than previously estimated, highlighting the need to reassess oxygen-isotope-based climate records, particularly where interpretations rest on assumptions about either δ¹⁸O_seawater or the temperature of deposition or diagenesis. We argue that an increase in δ¹⁸O_seawater contributed to the long-term rise in the δ¹⁸O of marine sedimentary rocks since the early Paleozoic. This rise might have been driven by a change in the proportion of high- versus low-temperature water–rock interaction in Earth's hydrosphere as a whole.
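
To make the reconstruction step concrete, the sketch below shows how an independently determined clumped isotope temperature can be combined with a measured carbonate δ¹⁸O to back out δ¹⁸O_seawater. It is a minimal illustration in Python, assuming the Kim and O'Neil (1997) calcite–water fractionation (1000 ln α = 18.03 × 10³/T − 32.42, with T in kelvin) and the standard VPDB-to-VSMOW scale conversion; the function names and example inputs are illustrative and are not the calibration or data used in this study.

```python
import math

def calcite_water_1000ln_alpha(T_kelvin):
    """Kim & O'Neil (1997) calcite-water fractionation:
    1000 ln(alpha) = 18.03 * (1000 / T) - 32.42, T in kelvin."""
    return 18.03 * (1000.0 / T_kelvin) - 32.42

def d18O_water_vsmow(d18O_carbonate_vpdb, T_celsius):
    """Infer the d18O of the water (VSMOW, permil) from carbonate d18O (VPDB, permil)
    and an independently known growth temperature, e.g. a clumped isotope temperature."""
    # Convert the carbonate value from the VPDB scale to the VSMOW scale.
    d18O_carbonate_vsmow = 1.03092 * d18O_carbonate_vpdb + 30.92
    alpha = math.exp(calcite_water_1000ln_alpha(T_celsius + 273.15) / 1000.0)
    # alpha = (1000 + d18O_carbonate) / (1000 + d18O_water), both on the VSMOW scale.
    return (1000.0 + d18O_carbonate_vsmow) / alpha - 1000.0

# Illustrative inputs: carbonate d18O of -4 permil (VPDB) and a clumped isotope
# temperature of 35 C imply a water value of about +0.35 permil (VSMOW).
print(round(d18O_water_vsmow(-4.0, 35.0), 2))
```

The same fractionation relation is conventionally rearranged the other way, solving for temperature under an assumed δ¹⁸O_seawater; fixing the temperature independently with clumped isotopes is what removes that circularity and lets δ¹⁸O_seawater itself be reconstructed.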