We consider a class of Riemannian optimization problems in which the objective is the sum of a smooth function and a nonsmooth function evaluated in the ambient space. This class of problems has important applications in machine learning and statistics, including sparse principal component analysis, sparse spectral clustering, and orthogonal dictionary learning. We propose a Riemannian alternating direction method of multipliers (ADMM) to solve this class of problems. Each iteration of our algorithm consists of easily computable steps. Under mild assumptions, we analyze the iteration complexity of the proposed algorithm for obtaining an ϵ-stationary point. Existing ADMMs for nonconvex problems either do not allow a nonconvex constraint set or do not allow a nonsmooth objective function. Ours is the first ADMM-type algorithm that minimizes a nonsmooth objective over a manifold, a particular nonconvex set. Numerical experiments demonstrate the advantages of the proposed method.

Funding: The research of S. Ma was supported in part by the Office of Naval Research [Grant N00014-24-1-2705]; the National Science Foundation [Grants DMS-2243650, CCF-2308597, CCF-2311275, and ECCS-2326591]; the University of California, Davis Center for Data Science and Artificial Intelligence Research Innovative Data Science Seed Funding Program; and a Rice University start-up fund.