Accepted for/Published in: JMIR Biomedical Engineering

Date Submitted: Mar 27, 2024
Date Accepted: Sep 13, 2024

The final, peer-reviewed version of this preprint has been published as:

Enhancing Ultrasound Image Quality Across Disease Domains: Application of Cycle-Consistent Generative Adversarial Network and Perceptual Loss

Athreya S, Radhachandran A, Ivezić V, Sant V, Arnold CW, Speier W

Enhancing Ultrasound Image Quality Across Disease Domains: Application of Cycle-Consistent Generative Adversarial Network and Perceptual Loss

JMIR Biomed Eng 2024;9:e58911

DOI: 10.2196/58911

PMID: 39689310

PMCID: 11688586

Enhancing Ultrasound Image Quality Across Disease Domains: An Application of CycleGAN and Perceptual Loss

  • Shreeram Athreya; 
  • Ashwath Radhachandran; 
  • Vedrana Ivezić; 
  • Vivek Sant; 
  • Corey W. Arnold; 
  • William Speier

ABSTRACT

Background:

Several previous studies have explored ultrasound image enhancement using image processing approaches to bridge the gap between low-quality and high-quality ultrasound imaging equipment. Most of these studies work with datasets of registered input ultrasound image pairs and rely on organ-specific attributes to achieve comparable model performance.

Objective:

The objective of this work is to introduce an advanced framework designed to enhance ultrasound images, especially those captured by portable handheld devices, which often produce lower-quality images due to hardware constraints. Additionally, the framework can handle non-registered input ultrasound image pairs, addressing a common challenge in medical imaging.

Methods:

In this retrospective study, we used an enhanced cycle-consistent generative adversarial network (CycleGAN) model for ultrasound image enhancement across five organ systems. Perceptual loss, derived from the deep features of pretrained neural networks, was applied to ensure the human-perceptual quality of the enhanced images. The enhanced images were compared with paired images acquired from high-resolution devices to demonstrate the model's ability to generate realistic high-quality images across organ systems.
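The perceptual loss described above compares images in a deep feature space rather than pixel space. A minimal NumPy sketch of the idea follows; the fixed random convolution kernels here are a purely illustrative stand-in for the pretrained network's feature extractor mentioned in the text, and the image arrays are synthetic placeholders, not the study's data:

```python
import numpy as np

def extract_features(img, kernels):
    """Toy feature extractor: valid 3x3 convolutions with fixed kernels,
    standing in for the deep features of a pretrained network."""
    h, w = img.shape
    feats = []
    for k in kernels:
        out = np.zeros((h - 2, w - 2))
        for i in range(h - 2):
            for j in range(w - 2):
                out[i, j] = np.sum(img[i:i + 3, j:j + 3] * k)
        feats.append(out)
    return np.stack(feats)

def perceptual_loss(img_a, img_b, kernels):
    """Mean squared error between the feature maps of the two images."""
    fa = extract_features(img_a, kernels)
    fb = extract_features(img_b, kernels)
    return float(np.mean((fa - fb) ** 2))

rng = np.random.default_rng(0)
kernels = [rng.standard_normal((3, 3)) for _ in range(4)]
low = rng.random((16, 16))                          # stand-in low-quality frame
high = low + 0.1 * rng.standard_normal((16, 16))    # stand-in high-quality pair

print(perceptual_loss(low, low, kernels))           # 0.0 for identical inputs
print(perceptual_loss(low, high, kernels) > 0)      # True: features differ
```

The design point is that two images can differ noticeably at the pixel level while remaining close in feature space (and vice versa), which is why feature-space distances correlate better with human judgments of quality.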

Results:

Preliminary validation of the framework shows promising performance: the model generates images with a Structural Similarity Index (SSI) of 0.722, a Locally Normalized Cross-Correlation (LNCC) of 0.902, and a Peak Signal-to-Noise Ratio (PSNR) of 28.802.
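Two of the reported metrics can be sketched from their standard definitions. The snippet below implements PSNR and a simplified SSIM computed from global image statistics (standard SSIM, and presumably the paper's, averages this quantity over local windows); the toy arrays are illustrative, not the study's data:

```python
import numpy as np

def psnr(ref, test, max_val=255.0):
    """Peak Signal-to-Noise Ratio in decibels."""
    mse = np.mean((ref.astype(float) - test.astype(float)) ** 2)
    return float(20 * np.log10(max_val / np.sqrt(mse)))

def ssim_global(x, y, max_val=255.0):
    """Simplified SSIM from global statistics; the standard metric
    averages this quantity over local sliding windows."""
    c1, c2 = (0.01 * max_val) ** 2, (0.03 * max_val) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = np.mean((x - mx) * (y - my))
    return float(((2 * mx * my + c1) * (2 * cov + c2)) /
                 ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2)))

ref = np.zeros((8, 8))
test = np.full((8, 8), 10.0)
print(round(psnr(ref, test), 2))  # 28.13: MSE = 100, so 20*log10(255/10)
```

As a sanity check, SSIM of an image with itself is exactly 1, and higher PSNR means the enhanced output is closer to the high-resolution reference.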

Conclusions:

This work presents a significant advancement in medical imaging through the development of a CycleGAN model enhanced with Perceptual Loss (PL), effectively bridging the quality gap between ultrasound images from varied devices. By training on paired images, the model not only improves image quality but also ensures the preservation of vital anatomic structural content. This approach may improve equity in access to healthcare by enhancing portable device capabilities, although further validation and optimizations are necessary for broader clinical application.



© The authors. All rights reserved. This is a privileged document currently under peer review/community review (or an accepted/rejected manuscript). The authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC-BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.