For patients diagnosed with a chronic condition, appropriately timing treatment is critical. Many conditions, such as age-related macular degeneration (AMD), have a maximum safe treatment interval (MSTI) within which treatment must be administered to prevent disease progression. We introduce a Markov Decision Process (MDP) with an ordinal action space as a way to quickly and safely identify the MSTI for an individual patient. The MDP's ordinal action space allows users to update the expected outcomes of multiple actions with a single decision. We demonstrate that, even with dependent actions, the optimal decision policy can be calculated offline and the expected outcomes of each action can be presented to clinicians as a menu of treatment options. We describe conditions under which the ordinal MDP is guaranteed to find the optimal treatment interval under uncertainty. We illustrate the model's effectiveness through an application to AMD, showing that following the ordinal MDP can reduce patient exposure to symptoms by 38% and find the optimal interval 7% faster than current clinical protocols.
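
As a rough illustration of the offline computation and "menu" idea sketched above, the following toy example tabulates expected outcomes for a small MDP whose actions are ordered treatment intervals. All dimensions, transition probabilities, and rewards are illustrative placeholders, and this is standard value iteration over a generic finite MDP; it does not implement the paper's ordinal-update mechanism or its dependent-action analysis.

```python
import numpy as np

# Toy sketch (not the paper's model): states index a belief about the patient's
# true MSTI, actions are ordered treatment intervals (e.g. 4, 6, 8, 10 weeks).
# Dynamics and rewards below are random placeholders for illustration only.
n_states, n_actions = 6, 4
rng = np.random.default_rng(0)

# P[a, s, s'] -- transition probabilities; R[s, a] -- expected immediate reward
P = rng.dirichlet(np.ones(n_states), size=(n_actions, n_states))
R = rng.normal(size=(n_states, n_actions))
gamma = 0.95

# Offline value iteration: the resulting Q-table can be read as a "menu" of
# expected outcomes, one entry per candidate treatment interval in each state.
V = np.zeros(n_states)
for _ in range(1000):
    Q = R + gamma * np.einsum("asn,n->sa", P, V)
    V_new = Q.max(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-8:
        break
    V = V_new

policy = Q.argmax(axis=1)  # recommended interval index for each state
print("Expected outcome per (state, interval):\n", Q.round(2))
print("Recommended interval index per state:", policy)
```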