
Domain adaptation and self-supervised learning for surgical margin detection

  • Original Article
International Journal of Computer Assisted Radiology and Surgery

Abstract

Purpose

One in five women who undergo breast-conserving surgery will need a second, revision surgery due to remaining tumor. The iKnife is a mass spectrometry modality that produces real-time margin information based on the metabolite signatures in surgical smoke. Using this modality and real-time tissue classification, surgeons could remove all cancerous tissue during the initial surgery, improving many facets of patient outcomes. An obstacle in developing an iKnife breast cancer recognition model is the destructive, time-consuming and sensitive nature of the data collection, which limits the size of the datasets.

Methods

We address these challenges by first building a self-supervised learning model from limited, weakly labeled data. In doing so, the model learns to contextualize the general features of iKnife data from a more accessible cancer type. Second, the trained model is applied to a cancer classification task on breast data. This domain adaptation allows the transfer of learned weights from models of one tissue type to another.
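The two-step recipe above (pretrain an encoder on the plentiful skin spectra, then freeze it and retrain only the classification head on the scarce breast spectra) can be sketched as follows. The tiny NumPy network, the synthetic spectra, the label rule and all hyper-parameters are illustrative assumptions, not the authors' architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy spectra: 100 intensity features per "burn"; source (skin) and
# target (breast) domains share an assumed underlying label rule.
X_skin = rng.normal(size=(320, 100))
y_skin = (X_skin[:, :10].sum(axis=1) > 0).astype(float)    # weak labels
X_breast = rng.normal(size=(144, 100))
y_breast = (X_breast[:, :10].sum(axis=1) > 0).astype(float)

# Step 1: pretrain a one-hidden-layer encoder + head on the larger
# skin dataset with plain gradient descent on a binary cross-entropy loss.
W_enc = rng.normal(scale=0.1, size=(100, 16))
w_head = rng.normal(scale=0.1, size=16)
for _ in range(200):
    h = relu(X_skin @ W_enc)
    p = sigmoid(h @ w_head)
    g = (p - y_skin) / len(y_skin)            # dLoss/dlogit for BCE
    w_head -= 0.5 * (h.T @ g)
    g_h = np.outer(g, w_head) * (h > 0)       # backprop through ReLU
    W_enc -= 0.5 * (X_skin.T @ g_h)

# Step 2: domain adaptation -- freeze the encoder and retrain only the
# head on the small breast dataset (a convex logistic-regression fit).
w_head = rng.normal(scale=0.1, size=16)
H = relu(X_breast @ W_enc)                    # frozen skin-learned features
for _ in range(500):
    p = sigmoid(H @ w_head)
    w_head -= 0.5 * (H.T @ ((p - y_breast) / len(y_breast)))

acc = ((sigmoid(H @ w_head) > 0.5) == y_breast).mean()
print(f"breast accuracy with frozen skin encoder: {acc:.2f}")
```

The design point is that only the 16 head weights are fit on the small target set; the 1,600 encoder weights come from the larger source domain.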

Results

Our datasets contained 320 skin burns (129 tumor, 191 normal) from 51 patients and 144 breast tissue burns (41 tumor, 103 normal) from 11 patients. We investigated the effect of different hyper-parameters on the performance of the final classifier. The proposed two-step method performed statistically significantly better than a baseline model (p-value < 0.0001), achieving an accuracy, sensitivity and specificity of 92%, 88% and 92%, respectively.
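Metrics like these follow directly from a confusion matrix over the 144 breast burns. The counts below are hypothetical, chosen only so the arithmetic lands near the reported values; the paper does not publish its confusion matrix.

```python
# Accuracy, sensitivity (tumor recall) and specificity (normal recall)
# from binary confusion-matrix counts.
def margin_metrics(tp, fn, tn, fp):
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    sensitivity = tp / (tp + fn)   # fraction of tumor burns caught
    specificity = tn / (tn + fp)   # fraction of normal burns cleared
    return accuracy, sensitivity, specificity

# Hypothetical split of the 41 tumor and 103 normal breast burns.
acc, sens, spec = margin_metrics(tp=36, fn=5, tn=95, fp=8)
print(f"accuracy={acc:.2f} sensitivity={sens:.2f} specificity={spec:.2f}")
```

With a 41/103 class imbalance, accuracy alone would be misleading (always predicting "normal" scores 72%), which is why sensitivity and specificity are reported separately.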

Conclusion

This is the first application of domain transfer to iKnife REIMS data. We showed that the limited number of breast data samples available for training a classifier can be compensated for by self-supervised learning and domain adaptation on a set of unlabeled skin data. We plan to confirm this performance by collecting new breast samples and to extend the approach to other cancer tissues.



Funding

We would like to thank the following sources of funding: Natural Sciences and Engineering Council of Canada (NSERC), the Canadian Institute for Health Research (CIHR), Southeastern Ontario Academic Medical Organization (SEAMO) Innovation Fund, Britton Smith Chair in Surgery to J. Rudan, and Canada Research Chair to G. Fichtinger.

Author information


Corresponding author

Correspondence to Alice M. L. Santilli.

Ethics declarations

Conflict of interest

The authors declare no conflicts of interest.

Ethical approval

This study was approved by the Queen’s University Health Sciences Research Ethics Board.

Informed consent

All patients that participated in the study gave informed verbal and written consent.



Cite this article

Santilli, A.M.L., Jamzad, A., Sedghi, A. et al. Domain adaptation and self-supervised learning for surgical margin detection. Int J CARS 16, 861–869 (2021). https://doi.org/10.1007/s11548-021-02381-6
