Predicting Wrist Osteoporosis from Excised Human Finger Bones Using Spatially Offset Raman Spectroscopy: A Cadaveric Study
Abstract
Osteoporosis and osteopenia remain vastly underdiagnosed. Current clinical screening relies almost exclusively on dual-energy X-ray absorptiometry (DXA), which measures bone mineral density (BMD) but fails to capture the compositional changes that lead to BMD loss. We investigated whether spatially offset Raman spectroscopy (SORS) applied to excised finger bones can probe subsurface biochemical markers that distinguish normal, osteopenic, and osteoporotic bone and predict wrist DXA T-scores. Raman spectra were acquired ex vivo on the mid-shaft of the proximal phalanx of the second digit from 25 female cadavers spanning the three T-score categories (n = 8 normal, n = 6 osteopenic, n = 11 osteoporotic) at spatial offsets of 0, 3, and 6 mm from the laser excitation spot. After normalizing each spectrum to the phosphate (PO₄³⁻) peak, group-averaged spectra acquired at the 3-mm offset showed clear differences among the three categories in the carbonate (CO₃²⁻), amide III, CH₂, and amide I bands. Quantitatively, four of the five mineral-to-matrix ratios differed significantly (p < 0.05) between normal and osteopenic bone and between osteopenic and osteoporotic bone, and all five ratios differed significantly between normal and osteoporotic bone. In contrast, the 0-mm offset showed diminished spectral contrast, and the 6-mm offset did not improve group discrimination beyond the 3-mm offset. A partial least squares regression model with leave-one-out cross-validation, built from the 3-mm spectra, predicted the distal radius DXA T-score with a Pearson correlation of r = 0.85 and a root-mean-square error of cross-validation of 1 T-score unit, correctly classifying 92% of specimens.
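The cross-validated regression described above can be reproduced in outline with standard chemometrics tooling. The following is a minimal sketch, not the authors' code, assuming scikit-learn's PLSRegression and leave-one-out splitting; the arrays spectra (specimens × wavenumbers, 3-mm offset, phosphate-normalized) and t_scores, the helper name loocv_pls_tscore, and the choice of five latent components are hypothetical placeholders.

    # Sketch of leave-one-out PLS regression of DXA T-scores from Raman spectra,
    # reporting Pearson r and RMSECV as in the abstract. Placeholder names only.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import LeaveOneOut
    from scipy.stats import pearsonr

    def loocv_pls_tscore(spectra, t_scores, n_components=5):
        """Predict each specimen's T-score from a PLS model trained on the others."""
        predictions = np.empty(len(t_scores), dtype=float)
        for train_idx, test_idx in LeaveOneOut().split(spectra):
            pls = PLSRegression(n_components=n_components)
            pls.fit(spectra[train_idx], t_scores[train_idx])
            predictions[test_idx] = pls.predict(spectra[test_idx]).ravel()
        r, _ = pearsonr(predictions, t_scores)                      # Pearson correlation
        rmsecv = np.sqrt(np.mean((predictions - t_scores) ** 2))    # RMSE of cross-validation
        return predictions, r, rmsecv

    # Example usage with random placeholder data (25 specimens, 1024 wavenumber bins):
    # spectra = np.random.rand(25, 1024)
    # t_scores = np.random.uniform(-4.0, 2.0, size=25)
    # preds, r, rmsecv = loocv_pls_tscore(spectra, t_scores)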