Abstract
Double-walled liners combining impingement and effusion cooling are commonly used to cool gas turbine combustion chambers. Engine ingestion of particulates such as dirt, sand, or ash leads to particulate deposition and blockage of cooling holes in these liners. Prior work on double-walled liners has shown that deposition leads to flow blockage; however, no experimental work has studied the reductions in cooling that result from the deposition. The goal of the present work was to quantify this reduction in cooling by evaluating the heat transfer coefficient on the cold side of an effusion plate with and without dirt buildup. Heat transfer coefficients were quantified over a range of Reynolds numbers, pressure ratios, plate-to-plate spacings, and dirt injection amounts. Injecting dirt onto the cold-side surface of the effusion plate resulted in flow blockages and cooling reductions as high as 55%, with the magnitude of these effects depending on pressure ratio and plate-to-plate spacing. Scan data of the dirty effusion plate were used to characterize the deposition thicknesses for the different parameters tested. Overall, deposition on the effusion plate degraded the heat transfer well beyond the insulating effect of the dirt layer itself. These results suggest that the effect of particulate buildup on the cooling flow field is an important driver of cooling performance in double-wall combustor liners.