The literature on tilting pad thrust bearings (TPTB) calls for flow reduction as an effective means to reduce drag power losses as well as oil pumping costs. However, the largest flow reduction a bearing can undergo while maintaining reliable operation is a key question that demands comprehensive analysis. This paper incorporates a model into an existing thermo-elasto-hydrodynamic (TEHD) computational analysis tool to deliver load performance predictions for TPTBs operating with reduced flow rates. For bearings supplied with either a reduced flow or an overflow condition, a sound model for the flow and thermal energy mixing in a feed groove determines the temperature of the lubricant entering a thrust pad. Under a reduced flow condition, the analysis shortens the effective arc length of a wetted pad until it matches the available flow. The discharge flow temperature rise and pad subsurface temperature rise predicted by the present model match measurements in the archival literature for an eight-pad bearing supplied with 150% to 25% of the nominal flow rate, i.e., the minimum flow that fully lubricates the bearing pads. A supply flow above the nominal rate increases the bearing drag power because the lubricant enters a pad at a lower temperature, yet it has little effect on a thrust pad's peak temperature rise or its minimum film thickness. A flow below the nominal rate produces lubricant starvation zones; thus, the minimum film thickness decreases substantially while the film and pad surface temperatures rise rapidly, producing significant thermal crowning of the pad surface. Compared to the bearing lubricated at the nominal rate, a starved-flow bearing produces a larger axial stiffness and a smaller damping coefficient. A reduction in drag power with less lubricant supplied brings an immediate energy efficiency improvement to bearing operation.
However, sustained long-term operation with overly warm pad temperatures could reduce the reliability of the mechanical element and lead to its ultimate failure.
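The two modeling steps named above (groove mixing setting the pad inlet temperature, and the wetted arc shrinking to match a reduced supply flow) can be illustrated with a minimal sketch. The function names, the adiabatic mixing assumption, the constant specific heat, and the linear wetted-arc scaling below are illustrative assumptions, not the paper's actual formulation, which the abstract does not detail.

```python
def groove_inlet_temperature(q_supply, t_supply, q_carryover, t_carryover):
    """Hypothetical feed-groove mixing model (assumption, not the paper's):
    adiabatic energy balance between fresh supply oil and hot oil carried
    over from the upstream pad. With a constant specific heat, the inlet
    temperature is the flow-weighted average of the two streams."""
    q_total = q_supply + q_carryover
    return (q_supply * t_supply + q_carryover * t_carryover) / q_total

def wetted_arc_fraction(q_available, q_nominal):
    """Crude proxy (assumption) for the starved-flow step: the fraction of
    the pad arc that remains wetted scales with the ratio of available to
    nominal flow, capped at 1 for overflow conditions. The actual TEHD
    analysis iterates on the film extent rather than scaling linearly."""
    return min(1.0, q_available / q_nominal)

# Equal supply and hot carry-over flows at 40 C and 80 C mix to 60 C;
# a bearing fed 25% of nominal flow wets only a quarter of the pad arc.
t_in = groove_inlet_temperature(1.0, 40.0, 1.0, 80.0)
frac = wetted_arc_fraction(0.25, 1.0)
```

Under these assumptions, supplying more cold oil lowers `t_in`, which raises lubricant viscosity and hence drag power, consistent with the overflow trend reported above; reducing `q_available` shrinks the wetted fraction, consistent with the onset of starvation zones.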