2022 (English) In: Journal of Biomedical Optics, ISSN 1083-3668, E-ISSN 1560-2281, Vol. 27, no. 3, article id 036004. Journal article (Refereed) Published
Abstract [en]
Significance: Developing algorithms for estimating blood oxygenation from snapshot multispectral imaging (MSI) data is challenging due to the complexity of sensor characteristics and photon transport modeling in tissue. We circumvent this using a method where artificial neural networks (ANNs) are trained on in vivo MSI data with target values from a point-measuring reference method.
Aim: To develop and evaluate a methodology where a snapshot filter mosaic camera is utilized for imaging skin hemoglobin oxygen saturation (SO2), using ANNs.
Approach: MSI data were acquired during occlusion provocations. ANNs were trained to estimate SO2 with MSI data as input, targeting data from a validated probe-based reference system. The performance of ANNs with different properties and training data sets was compared.
Results: The method enables spatially resolved estimation of skin tissue SO2. Results are comparable to those acquired using a Monte-Carlo-based approach when relevant training data are used.
Conclusions: Training an ANN on in vivo MSI data covering a wide range of target values acquired during an occlusion protocol enables real-time estimation of SO2 maps. Data from the probe-based reference system can be used as the target despite differences in sampling depth and measurement position.
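The approach described in the abstract amounts to supervised regression: per-pixel multispectral reflectance vectors as input, probe-measured SO2 values as targets. The sketch below illustrates that idea only; the network size, the 16-band input, and the synthetic data are assumptions for demonstration and do not reproduce the authors' actual architecture, sensor, or reference measurements.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for training data: each sample mimics one mosaic
# pixel's reflectance across 16 spectral bands; the target plays the role
# of the reference SO2 value (a fraction in (0, 1)) from the probe system.
n_samples, n_bands = 2000, 16
X = rng.uniform(0.0, 1.0, size=(n_samples, n_bands))
w_true = rng.normal(size=n_bands)
y = 1.0 / (1.0 + np.exp(-(X @ w_true - w_true.sum() / 2)))  # fake SO2

# Minimal one-hidden-layer MLP regressor, trained with full-batch
# gradient descent on mean squared error.
n_hidden = 32
W1 = rng.normal(scale=0.3, size=(n_bands, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.3, size=(n_hidden, 1))
b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)                      # hidden activations
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))    # sigmoid keeps SO2 in (0, 1)
    return h, out.ravel()

lr = 1.0
for epoch in range(500):
    h, pred = forward(X)
    err = pred - y
    # Backpropagate through the sigmoid output and tanh hidden layer.
    d_out = (err * pred * (1.0 - pred))[:, None] / n_samples
    gW2, gb2 = h.T @ d_out, d_out.sum(axis=0)
    d_h = d_out @ W2.T * (1.0 - h**2)
    gW1, gb1 = X.T @ d_h, d_h.sum(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(X)
rmse = np.sqrt(np.mean((pred - y) ** 2))
```

Once trained this way, the same per-pixel mapping can be evaluated over every pixel of a snapshot frame, which is what makes spatially resolved, real-time SO2 maps possible.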
Place, publisher, year, edition, pages
Bellingham, WA, United States: SPIE - International Society for Optical Engineering, 2022
Keywords
multispectral imaging, artificial neural networks, hemoglobin oxygen saturation, skin microcirculation, diffuse reflectance spectroscopy
National subject category
Medical Laboratory and Measurement Technologies
Identifiers
urn:nbn:se:liu:diva-184440 (URN), 10.1117/1.jbo.27.3.036004 (DOI), 000776555200006 (ISI), 35340134 (PubMedID), 2-s2.0-85127252219 (Scopus ID)
Note
Funding: This study was financially supported by VINNOVA grants via the Swelife and MedTech4Health programs (Grant Nos. 2016-02211, 2017-01435, and 2019-01522).
Available from: 2022-04-20 Created: 2022-04-20 Last updated: 2022-05-11 Bibliographically approved