Abstract
The volumetric specific surface (S/V) of a monolith influences contaminant leachability. Small samples, such as those used in laboratory leaching tests, have higher S/V ratios than the larger monoliths of controlled low-strength materials (CLSM) placed in utility and energy pipeline trenches. Quantitative relationships are derived herein for relating contaminant leaching rates from full-scale monolith-filled trenches in the field to laboratory leaching data. A field dimensional exposure modification factor, P, is derived with a magnitude range of 37.6 to 50.37 for trenches with lengths of 10–15 m and cross-sectional areas of 1–4 m². P, which increases with CLSM cross-sectional area, is then applied to copper, arsenic, and selenium leaching data for CLSM comprising portland cement, aggregate, water, and fly ash at 5–20% by weight. The results of leaching with deionized water and pH 5.5 leachant show that the computed field diffusion coefficients for the metals, D_ef, are higher than the values obtained for small samples in laboratory leaching tests. Furthermore, D_ef values are higher at low ash substitution levels than at higher levels. The highest D_ef values, which occur at 5% ash content, are 2.09 × 10⁻⁴ m²/s (deionized water), 7.29 × 10⁻¹¹ m²/s (pH 5.5), and 2.09 × 10⁻¹⁰ m²/s (pH 5.5) for copper, arsenic, and selenium, respectively. Apparently, at higher ash contents, cementation effects decrease monolith porosity, producing lower diffusion coefficients. Computed estimates of cumulative leaching fractions at high saturation are low for these contaminants and are directly proportional to D_ef values.
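The S/V scaling argument above can be illustrated numerically. The sketch below, which is not the paper's derivation of P, computes S/V for a hypothetical trench monolith and a laboratory cylinder, and estimates a cumulative leached fraction using the standard semi-infinite-medium diffusion result (as used in, e.g., ANSI/ANS-16.1-style analyses). All dimensions and the sample diffusion coefficient are assumed for illustration only.

```python
import math

def sv_ratio_box(length, width, depth):
    """S/V (1/m) of a rectangular monolith.
    Simplifying assumption: all six faces are exposed; in a real trench
    the exposed-face count depends on field conditions (hence factor P)."""
    surface = 2.0 * (length * width + length * depth + width * depth)
    volume = length * width * depth
    return surface / volume

def sv_ratio_cylinder(diameter, height):
    """S/V (1/m) of a laboratory cylindrical specimen (all faces exposed)."""
    r = diameter / 2.0
    surface = 2.0 * math.pi * r * height + 2.0 * math.pi * r ** 2
    volume = math.pi * r ** 2 * height
    return surface / volume

def cumulative_fraction_leached(De, t, sv):
    """Semi-infinite-medium diffusion estimate of the cumulative fraction
    leached: CFL = (S/V) * 2 * sqrt(De * t / pi). A generic model, not
    necessarily the exact formulation used in the paper."""
    return sv * 2.0 * math.sqrt(De * t / math.pi)

# Hypothetical dimensions: 10 m trench, 1 m x 1 m cross-section,
# versus a 5 cm diameter x 10 cm laboratory cylinder.
sv_field = sv_ratio_box(10.0, 1.0, 1.0)       # 4.2 1/m
sv_lab = sv_ratio_cylinder(0.05, 0.10)        # 100.0 1/m
print(f"field S/V = {sv_field:.1f} 1/m, lab S/V = {sv_lab:.1f} 1/m")

# Illustrative CFL after 1 year with an assumed De of 1e-12 m^2/s.
t = 365.0 * 24 * 3600
print(f"CFL (field geometry): {cumulative_fraction_leached(1e-12, t, sv_field):.4f}")
```

Note that the laboratory specimen's S/V is more than twenty times the trench monolith's, which is why laboratory leaching data must be dimensionally corrected (via P) before extrapolation to the field.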
