Shannon information (SI) and the ideal-observer receiver operating characteristic (ROC) curve are two different methods for analyzing the performance of an imaging system on a binary classification task, such as the detection of a variable signal embedded within a random background. In this work we describe a new ROC curve, the Shannon information receiver operating characteristic curve (SIROC), that is derived from the SI expression for a binary classification task. We then show that the ideal-observer ROC curve and the SIROC have many properties in common and are equivalent descriptions of the optimal performance of an observer on the task. This equivalence is described mathematically by an integral transform that maps the ideal-observer ROC curve onto the SIROC. It leads, in turn, to an integral transform relating the minimum probability of error, as a function of the odds against a signal, to the conditional entropy as a function of the same variable. This last relation establishes the complete mathematical equivalence between ideal-observer ROC analysis and SI analysis of the classification task for a given imaging system. We also find that there is a close relationship between the area under the ideal-observer ROC curve, which is often used as a figure of merit for imaging systems, and the area under the SIROC. Finally, we show that the relationships between the two curves result in new inequalities relating SI to ROC quantities for the ideal observer.
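For context, the standard quantities being compared here can be written as follows; the notation is introduced only for illustration (g for the image data, \(\Lambda\) for the likelihood ratio, t for the decision threshold, T for the binary truth state) and is not taken from the text above. The ideal observer thresholds the likelihood ratio, and SI is the mutual information between the truth state and the data:
\[
  \Lambda(\mathbf{g}) = \frac{p(\mathbf{g}\mid H_1)}{p(\mathbf{g}\mid H_0)},
  \qquad
  \mathrm{TPF}(t) = \Pr\!\left[\Lambda(\mathbf{g}) > t \mid H_1\right],
  \qquad
  \mathrm{FPF}(t) = \Pr\!\left[\Lambda(\mathbf{g}) > t \mid H_0\right],
\]
\[
  \mathrm{AUC} = \int_0^1 \mathrm{TPF}\,\mathrm{d}(\mathrm{FPF}),
  \qquad
  I(T;\mathbf{G}) = H(T) - H(T\mid \mathbf{G}),
\]
where \(H_0\) and \(H_1\) denote the signal-absent and signal-present hypotheses, \(H(\cdot)\) is the Shannon entropy, and \(H(T\mid\mathbf{G})\) is the conditional entropy referred to above. The SIROC itself and the integral transforms relating it to the ideal-observer ROC curve are constructions specific to this work and are not reproduced here.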