Research on the method of brain magnetic resonance synthetic DWI generation based on the cycle generative adversarial network
XIA Liang  LIANG Zhipeng  ZHANG Jun 

Cite this article as: XIA L, LIANG Z P, ZHANG J. Research on the method of brain magnetic resonance synthetic DWI generation based on the cycle generative adversarial network[J]. Chin J Magn Reson Imaging, 2023, 14(7): 121-126. DOI:10.12015/issn.1674-8034.2023.07.021.


[Abstract] Objective: To achieve mutual conversion between water-suppressed T2WI images and diffusion weighted imaging (DWI) images using unpaired head MR image data and a cycle generative adversarial network (CycleGAN), and to evaluate the quality of the generated synthetic DWI images. Materials and Methods: Brain water-suppressed T2WI and DWI images of 200 cases were collected, with 100 cases in the training set and 100 cases in the test set, including 50 cases of acute cerebral infarction. The CycleGAN model consisted of two generators and two discriminators. First, two generators were constructed based on convolutional neural networks (CNN): one converted water-suppressed T2WI images into synthetic DWI (sDWI) images, and the other converted DWI images into synthetic T2WI images. Then, two CNN-based discriminators were constructed to distinguish the real images from the generated synthetic images and to update the network parameters. The generators and discriminators were trained alternately to complete the training of the CycleGAN model. The quality of the synthetic DWI images was evaluated with the mean absolute error (MAE), mean error (ME), peak signal-to-noise ratio (PSNR), structural similarity index measure (SSIM) and a subjective score. For the 50 cases of acute cerebral infarction, the infarct regions were segmented on both the real DWI and sDWI images, and the DICE coefficient was calculated. Results: The MAE, ME, PSNR and SSIM values between the synthetic and real DWI images were 34.991±0.989, 15.982±0.978, 26.642±3.428 and 0.927±0.039, respectively. More than 80% of the synthetic DWI images showed no or only slight distortion or artifacts. The DICE coefficients for infarct segmentation on the real DWI and synthetic DWI images were 0.898±0.324 and 0.849±0.259, respectively. Conclusions: The CycleGAN model trained with unpaired image data can generate high-quality synthetic DWI images and reduce scanning time for patients who require rapid magnetic resonance imaging.
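As a rough illustration of the training scheme summarized above (two CNN generators, two CNN discriminators, alternating updates with a cycle-consistency constraint), the following is a minimal PyTorch sketch. The network depths, loss weights, optimizer settings and module names are illustrative assumptions, not the authors' implementation.

```python
# Minimal CycleGAN training-step sketch (PyTorch). Architectures and
# hyperparameters are placeholders chosen for brevity, not the paper's.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Small CNN mapping one single-channel MR contrast to another."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )
    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    """Small CNN scoring whether an image patch looks real or synthetic."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 4, stride=2, padding=1),
        )
    def forward(self, x):
        return self.net(x)

# Two generators (T2WI -> sDWI, DWI -> sT2WI) and two discriminators.
G_t2_to_dwi, G_dwi_to_t2 = Generator(), Generator()
D_dwi, D_t2 = Discriminator(), Discriminator()

opt_G = torch.optim.Adam(
    list(G_t2_to_dwi.parameters()) + list(G_dwi_to_t2.parameters()), lr=2e-4)
opt_D = torch.optim.Adam(
    list(D_dwi.parameters()) + list(D_t2.parameters()), lr=2e-4)
adv_loss, cyc_loss, lambda_cyc = nn.MSELoss(), nn.L1Loss(), 10.0

def train_step(real_t2, real_dwi):
    """One alternating update on an unpaired (T2WI, DWI) batch."""
    # Generator update: fool both discriminators + enforce cycle consistency.
    opt_G.zero_grad()
    fake_dwi = G_t2_to_dwi(real_t2)      # synthetic DWI from T2WI
    fake_t2 = G_dwi_to_t2(real_dwi)      # synthetic T2WI from DWI
    pred_fake_dwi, pred_fake_t2 = D_dwi(fake_dwi), D_t2(fake_t2)
    loss_adv = (adv_loss(pred_fake_dwi, torch.ones_like(pred_fake_dwi)) +
                adv_loss(pred_fake_t2, torch.ones_like(pred_fake_t2)))
    loss_cyc = (cyc_loss(G_dwi_to_t2(fake_dwi), real_t2) +
                cyc_loss(G_t2_to_dwi(fake_t2), real_dwi))
    (loss_adv + lambda_cyc * loss_cyc).backward()
    opt_G.step()
    # Discriminator update: separate real images from synthetic images.
    opt_D.zero_grad()
    pr, pf = D_dwi(real_dwi), D_dwi(fake_dwi.detach())
    loss_d_dwi = (adv_loss(pr, torch.ones_like(pr)) +
                  adv_loss(pf, torch.zeros_like(pf)))
    pr, pf = D_t2(real_t2), D_t2(fake_t2.detach())
    loss_d_t2 = (adv_loss(pr, torch.ones_like(pr)) +
                 adv_loss(pf, torch.zeros_like(pf)))
    (loss_d_dwi + loss_d_t2).backward()
    opt_D.step()

# Example with random tensors standing in for unpaired image batches.
train_step(torch.randn(2, 1, 128, 128), torch.randn(2, 1, 128, 128))
```

Because the cycle-consistency terms only require that each image can be mapped to the other contrast and back, the two domains do not need to be paired slice by slice, which is what allows training on unpaired T2WI and DWI data.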
[Keywords] acute cerebral infarction; cerebral apoplexy; diffusion weighted imaging; cycle generative adversarial network; deep learning; magnetic resonance imaging
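The evaluation metrics named in the Results (MAE, ME, PSNR, SSIM) and the DICE overlap used for the infarct segmentations can be computed as in the sketch below. The paper does not state its exact formulas, intensity range, or mask thresholding, so the `data_range` and binarization used here are assumptions.

```python
# Illustrative NumPy/scikit-image implementations of the reported metrics.
# Intensity range and mask handling are assumptions, not taken from the paper.
import numpy as np
from skimage.metrics import structural_similarity

def mae(real, synth):
    return float(np.mean(np.abs(real - synth)))   # mean absolute error

def me(real, synth):
    return float(np.mean(real - synth))           # mean (signed) error

def psnr(real, synth, data_range=255.0):
    mse = np.mean((real - synth) ** 2)            # peak signal-to-noise ratio
    return float(10 * np.log10(data_range ** 2 / mse))

def ssim(real, synth, data_range=255.0):
    return float(structural_similarity(real, synth, data_range=data_range))

def dice(mask_a, mask_b):
    """DICE overlap of two binary infarct masks (assumes non-empty masks)."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

# Example on random data standing in for a real / synthetic DWI slice pair.
real = np.random.rand(256, 256) * 255
synth = real + np.random.randn(256, 256)
print(mae(real, synth), me(real, synth), psnr(real, synth), ssim(real, synth))
```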

XIA Liang   LIANG Zhipeng*   ZHANG Jun  

Department of Radiology, Sir Run Run Hospital Affiliated to Nanjing Medical University, Nanjing 211000, China

Corresponding author: Liang ZP, E-mail: 410312774@qq.com

Conflicts of interest   None.

Received  2022-10-12
Accepted  2023-06-25
DOI: 10.12015/issn.1674-8034.2023.07.021

