Finding a parking space has become not only painful but also expensive in most metropolitan cities. With the increase in the number of vehicles and limited resources such as manpower and space, the need for effective management of parking lots has grown. Improper management of parking lots can have negative consequences such as traffic congestion, time wasted searching for parking spaces, air pollution, and even loss of revenue for parking lot managers. Dynamic pricing is a powerful tool for influencing driver behavior by diverting drivers towards unoccupied and cheaper parking lots. Although several dynamic pricing strategies exist, determining the right prices is challenging due to the lack of knowledge of drivers' behavior and uncertainties such as harsh weather and special days. In this paper, a Reinforcement Learning (RL) technique called Q-learning is used to calculate dynamic prices for parking lots on an hourly basis without the need for prior information about the system. Crucial factors such as the distance of the parking lots from the city centers, weather, and holidays are considered in the proposed algorithm to achieve better accuracy. Price Elasticity of Demand (PED) is used in the proposed work to calculate the new state (occupancy) when an action (the dynamic price charged by the parking lot owner) is taken. Hourly prices are estimated using the proposed algorithm, and simulation results show that the calculated prices can efficiently manage parking occupancy during peak and off-peak hours. The simulation output also shows that the proposed algorithm can successfully increase the revenue of parking lot owners.
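To make the overall idea concrete, the following is a minimal illustrative sketch in Python, not the paper's exact formulation: occupancy is discretized into states, the actions are a small set of candidate hourly prices, the next occupancy is obtained from the chosen price through a constant price elasticity of demand, and the reward trades off revenue against a target occupancy. All parameter values (elasticity, candidate prices, penalty weight, capacity) and the stochastic hourly demand term are hypothetical placeholders introduced only for illustration.

```python
import numpy as np

# Illustrative Q-learning sketch for hourly parking prices.
# Assumptions (placeholders, not the paper's values): 10 occupancy bins,
# 5 candidate prices, a constant price elasticity of demand, and a reward
# that combines revenue with a penalty for deviating from a target occupancy.

N_STATES = 10                                  # occupancy bins: 0-10%, ..., 90-100%
PRICES = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # candidate hourly prices (assumed)
BASE_PRICE = 3.0                               # reference price for the PED model
ELASTICITY = -0.4                              # assumed price elasticity of demand
TARGET_OCC = 0.85                              # desired occupancy level (assumed)
CAPACITY = 100                                 # number of parking spaces (assumed)
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1          # learning rate, discount, exploration

Q = np.zeros((N_STATES, len(PRICES)))

def next_occupancy(occ, price):
    """New occupancy after charging `price`, via a constant-elasticity demand model."""
    pct_change = ELASTICITY * (price - BASE_PRICE) / BASE_PRICE
    return float(np.clip(occ * (1.0 + pct_change), 0.0, 1.0))

def reward(occ, price):
    """Hourly revenue minus a penalty for straying from the target occupancy."""
    revenue = price * occ * CAPACITY
    return revenue - 50.0 * abs(occ - TARGET_OCC) * CAPACITY

def to_state(occ):
    """Map a continuous occupancy in [0, 1] to a discrete state index."""
    return min(int(occ * N_STATES), N_STATES - 1)

rng = np.random.default_rng(0)
occ = 0.6                                      # initial occupancy
for hour in range(5000):                       # simulated hours
    s = to_state(occ)
    # Epsilon-greedy action selection over candidate prices.
    a = rng.integers(len(PRICES)) if rng.random() < EPSILON else int(np.argmax(Q[s]))
    # PED-driven transition plus a small random demand fluctuation.
    new_occ = float(np.clip(next_occupancy(occ, PRICES[a]) + rng.uniform(-0.05, 0.10), 0.0, 1.0))
    r = reward(new_occ, PRICES[a])
    s_next = to_state(new_occ)
    # Standard Q-learning update.
    Q[s, a] += ALPHA * (r + GAMMA * np.max(Q[s_next]) - Q[s, a])
    occ = new_occ

print("Learned hourly price per occupancy bin:", PRICES[np.argmax(Q, axis=1)])
```

In this toy setup the agent learns, purely from simulated interaction, to raise prices when occupancy approaches the target and lower them when the lot is underused; factors such as distance from the city center, weather, and holidays would enter as additional state variables in the full algorithm.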