In this work, we derive optimal transmission policies for an energy harvesting status update system. The system monitors a stochastic process that can be in one of two operating states: a normal state or an alarm state. We capture the freshness of status updates for each state of the monitored process by using a separate Age of Information (AoI) variable per state and by extending the definition of AoI to account for state changes of the process. We assume that the demand for status updates is higher when the process is in the alarm state, and employ a transition cost function that increases non-linearly with AoI in the alarm state and linearly otherwise. We formulate the problem as a Markov Decision Process (MDP), numerically evaluate the derived policies, and illustrate their effectiveness in reserving energy in anticipation of future alarm states.
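To make the formulation concrete, the following is a minimal value-iteration sketch of such an MDP. It is an illustration under assumed parameters, not the paper's model: for brevity it tracks a single AoI variable (the paper keeps one per process state), uses a quadratic alarm-state cost as one example of a non-linear cost, and all numerical constants (battery capacity `B_MAX`, AoI truncation `AOI_MAX`, the harvesting and state-change probabilities, the discount factor `GAMMA`) are hypothetical.

```python
# Minimal value-iteration sketch for an energy-harvesting status update MDP.
# All parameters below are illustrative assumptions, not values from the paper.
import itertools

B_MAX = 5        # battery capacity in energy units (assumed)
AOI_MAX = 10     # truncation of the AoI variable (assumed)
P_HARVEST = 0.3  # probability of harvesting one energy unit per slot (assumed)
P_N2A = 0.1      # normal -> alarm transition probability (assumed)
P_A2N = 0.4      # alarm -> normal transition probability (assumed)
GAMMA = 0.95     # discount factor (assumed)

def cost(delta, s):
    # Linear AoI cost in the normal state (s=0), quadratic in the alarm state
    # (s=1) as one example of a non-linearly increasing cost.
    return delta ** 2 if s == 1 else delta

def transitions(state, action):
    """Yield (probability, next_state) pairs.
    State: (battery b, AoI delta, process state s in {0: normal, 1: alarm})."""
    b, delta, s = state
    transmit = action == 1 and b > 0        # transmitting spends one energy unit
    new_delta = 1 if transmit else min(delta + 1, AOI_MAX)
    b_after = b - 1 if transmit else b
    p_switch = P_N2A if s == 0 else P_A2N   # process may change state
    for harvest, p_h in ((1, P_HARVEST), (0, 1.0 - P_HARVEST)):
        nb = min(b_after + harvest, B_MAX)
        yield p_h * p_switch, (nb, new_delta, 1 - s)
        yield p_h * (1.0 - p_switch), (nb, new_delta, s)

states = list(itertools.product(range(B_MAX + 1), range(1, AOI_MAX + 1), (0, 1)))

def q_value(V, st, a):
    # One-step cost plus discounted expected value of the next state.
    return cost(st[1], st[2]) + GAMMA * sum(p * V[nxt] for p, nxt in transitions(st, a))

V = {st: 0.0 for st in states}
for _ in range(500):  # synchronous value iteration until (approximately) converged
    V = {st: min(q_value(V, st, a) for a in (0, 1)) for st in states}

# Greedy policy: 0 = stay idle (reserve energy), 1 = transmit a status update.
policy = {st: min((0, 1), key=lambda a: q_value(V, st, a)) for st in states}
# Compare the action with one energy unit and AoI 3 in the normal vs. alarm state.
print(policy[(1, 3, 0)], policy[(1, 3, 1)])
```

Under parameters like these, such a sketch tends to reproduce the qualitative behavior described above: with few energy units in the battery, the policy idles in the normal state and transmits in the alarm state, i.e., it reserves energy in anticipation of alarms.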