# Fixing Acceleration and Image Resolution Issues of Nuclear Magnetic Resonance

## Abstract


## 1. Introduction

## 2. MRI Sparse Sampling Schemes

**Figure 1.** Combining Compressed Sensing (CS) with the Hermitian symmetry property of k-space produces an effective sampling pattern. On the right: the two k-space locations marked in yellow, which are mirror images of each other across the origin of k-space, have identical amplitudes but opposite phases.
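The symmetry exploited in Figure 1 can be checked numerically: for a real-valued image, k-space is Hermitian-symmetric, so mirrored samples carry the same amplitude and opposite phase. A minimal NumPy sketch on synthetic data (not the paper's acquisition):

```python
import numpy as np

# For a real-valued image the DFT is Hermitian-symmetric:
# F(-k) = conj(F(k)), so mirrored k-space samples share the same
# amplitude and have opposite phase (cf. Figure 1, right).
rng = np.random.default_rng(0)
img = rng.random((64, 64))                   # synthetic real-valued image
ksp = np.fft.fftshift(np.fft.fft2(img))      # k-space, origin at the centre

c = 64 // 2                                  # index of the k-space origin
kx, ky = 10, 7                               # an arbitrary k-space offset
a = ksp[c + ky, c + kx]                      # sample at (+ky, +kx)
b = ksp[c - ky, c - kx]                      # its mirror at (-ky, -kx)

assert np.isclose(abs(a), abs(b))            # identical amplitudes
assert np.isclose(np.angle(a), -np.angle(b)) # opposite phases
```

In principle this halves the number of k-space samples that must be measured, since each acquired sample determines its mirrored counterpart.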

## 3. The Proposed Algorithm

- Produce a high-resolution estimate using the sub-method shown in Figure 2.
- Iterate until convergence: $|\Gamma_{L}^{n}-\Gamma_{L}^{n,\mathrm{observed}}|<\epsilon$.
- Estimate the noise: ${\theta}_{i}=\frac{\alpha +{N}_{q}-1}{\beta +{N}_{q}\overline{x}}$.
- Calculate the deformable image registration parameters and use them to align the image grid.
- Estimate the blur kernel: ${B}_{x}=\underset{{B}_{x}}{\operatorname{argmin}}\,{\theta}_{0}\parallel A{M}_{y}{B}_{x}-{\Gamma}_{L}^{n}\parallel +\xi \parallel \nabla {B}_{x}\parallel$.
- Enhance the high-resolution estimate ${\Gamma}_{H}^{n}$.
- Repeat the steps above until convergence (see Figure 3).
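The steps above can be sketched as a toy iterative back-projection loop in the spirit of the Irani–Peleg scheme cited in the references. This is an illustrative stand-in under strong simplifications, not the paper's implementation: blur, decimation, registration, and noise estimation are all collapsed into plain 2×2 block averaging, and every function name here is hypothetical.

```python
import numpy as np

# Toy iterative back-projection sketch of the loop above (an
# illustrative stand-in, not the paper's algorithm): the forward model
# is plain 2x2 block averaging; registration, noise and blur-kernel
# estimation are omitted.
def downsample(x, f=2):
    """Forward model: average f x f blocks (stand-in for blur + decimation)."""
    return x.reshape(x.shape[0] // f, f, x.shape[1] // f, f).mean(axis=(1, 3))

def upsample(x, f=2):
    """Back-projection: replicate each pixel into an f x f block."""
    return np.kron(x, np.ones((f, f)))

def super_resolve(lr_obs, n_iter=60, step=0.5, eps=1e-8):
    hr = np.zeros((lr_obs.shape[0] * 2, lr_obs.shape[1] * 2))  # HR estimate
    for _ in range(n_iter):
        residual = lr_obs - downsample(hr)   # Γ_L^n vs Γ_L^(n, observed)
        if np.abs(residual).max() < eps:     # the convergence test |·| < ε
            break
        hr += step * upsample(residual)      # enhance the HR estimate
    return hr

lr = downsample(np.outer(np.hanning(32), np.hanning(32)))
hr = super_resolve(lr)
assert np.allclose(downsample(hr), lr, atol=1e-6)  # model consistency reached
```

Each pass shrinks the low-resolution residual geometrically (by the factor 1 − step here), which is why the loop terminates well before the iteration cap.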

**Figure 4.** Magnetic resonance image registration. The procedure is applied to the entire set of low-resolution images.

## 4. Results

**Figure 5.** Phantom images from the experiments. Upper row: the unprocessed image corrupted by a simulated shift; the remaining images are reconstructions at various sampling rates (25%, 40% and 60% of the ground-truth samples).

**Figure 6.** A brain imaging example. From left to right: A: Image reconstructed from a partially sampled PROPELLER blade, B: Cartesian sampling grid without image registration applied (with no downsampling applied), C: B-spline cubic interpolation, D: Non-Rigid Multi-Modal 3D Medical Image Registration Based on Foveated Modality Independent Neighbourhood Descriptor [45], E: Enhanced deep residual networks for single image super-resolution [14], F: Image super-resolution using very deep residual channel attention networks [16], G: Residual dense network for image super-resolution [15], H: Super-resolution with the proposed sampling scheme and motion compensation (the proposed algorithm). The compression ratio is 50%. See Table 1 for the PSNR values at other compression ratios.

**Table 1.** Peak signal-to-noise ratio (PSNR) statistics for Figure 7. MAE stands for Mean Absolute Error.

Reconstruction Method | N | M | MAE | SD | t | p |
---|---|---|---|---|---|---|
down-sampled without image registration | 100 | 21.16 | 20.04 | 0.01 | 1.094 | 0.276 |
down-sampled with image registration | 100 | 24.52 | 19.34 | 0.01 | −0.779 | 0.438 |
motion-corrected regular sampling scheme (without subsampling) | 100 | 22.36 | 18.04 | 0.01 | 0.185 | 0.854 |
B-spline cubic interpolation (image registration applied) | 100 | 23.31 | 18.01 | 0.01 | 0.184 | 0.274 |
Non-Rigid Multi-Modal 3D Medical Image Registration Based on Foveated Modality Independent Neighbourhood Descriptor | 100 | 26.33 | 17.22 | 0.01 | −0.321 | 0.432 |
Enhanced deep residual networks for single image super-resolution | 100 | 29.14 | 16.55 | 0.01 | −0.362 | 0.412 |
Image super-resolution using very deep residual channel attention networks | 100 | 28.75 | 15.51 | 0.01 | −0.416 | 0.437 |
Residual dense network for image SR | 100 | 29.88 | 14.63 | 0.01 | −0.541 | 0.554 |
the proposed method | 100 | 30.39 | 14.02 | 0.01 | −0.588 | 0.558 |
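The PSNR and MAE columns reported throughout follow the standard definitions; the sketch below assumes an 8-bit peak of 255, a normalisation the paper does not state explicitly:

```python
import numpy as np

# Standard definitions of the two reported error metrics. The 8-bit
# peak value of 255 is an assumption, not stated in the paper.
def psnr(ref, img, peak=255.0):
    """Peak signal-to-noise ratio in dB."""
    mse = np.mean((np.asarray(ref, float) - np.asarray(img, float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def mae(ref, img):
    """Mean absolute error."""
    return float(np.mean(np.abs(np.asarray(ref, float) - np.asarray(img, float))))

ref = np.full((8, 8), 100.0)
img = ref + 10.0                       # uniform error of 10 grey levels
assert np.isclose(mae(ref, img), 10.0)
assert np.isclose(psnr(ref, img), 10.0 * np.log10(255.0 ** 2 / 100.0))
```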

**Figure 7.** Brain imaging results. From left to right: A: Image reconstructed from a partially sampled PROPELLER blade, B: Cartesian sampling grid without image registration applied (with no downsampling applied), C: B-spline cubic interpolation, D: Non-Rigid Multi-Modal 3D Medical Image Registration Based on Foveated Modality Independent Neighbourhood Descriptor [45], E: Enhanced deep residual networks for single image super-resolution [14], F: Image super-resolution using very deep residual channel attention networks [16], G: Residual dense network for image super-resolution [15], H: Super-resolution with the proposed sampling scheme and motion compensation (the proposed algorithm). The compression ratio is 50%. See Table 2 for the PSNR values at other compression ratios.

**Table 2.** Image Enhancement Metric (IEM) statistics for Figure 7.

Reconstruction Method | N | M | SD | t | p |
---|---|---|---|---|---|
down-sampled without image registration | 100 | 1.76 | 0.01 | 1.222 | 0.225 |
down-sampled with image registration | 100 | 1.99 | 0.01 | 0.505 | 0.615 |
motion-corrected regular sampling scheme (without subsampling) | 100 | 2.67 | 0.01 | 1.848 | 0.068 |
B-spline cubic interpolation (image registration applied) | 100 | 2.01 | 0.01 | 0.184 | 0.273 |
Non-Rigid Multi-Modal 3D Medical Image Registration Based on Foveated Modality Independent Neighbourhood Descriptor | 100 | 2.33 | 0.01 | −0.320 | 0.436 |
Enhanced deep residual networks for single image super-resolution | 100 | 2.14 | 0.01 | −0.361 | 0.411 |
Image super-resolution using very deep residual channel attention networks | 100 | 2.45 | 0.01 | −0.411 | 0.431 |
Residual dense network for image SR | 100 | 2.88 | 0.01 | −0.543 | 0.552 |
the proposed method | 100 | 3.99 | 0.00 | −1.901 | 0.061 |

**Figure 8.** Abdominal image processing. From left to right: A: Image reconstructed from a partially sampled PROPELLER blade, B: Cartesian sampling grid without image registration applied (with no downsampling applied), C: B-spline cubic interpolation, D: Non-Rigid Multi-Modal 3D Medical Image Registration Based on Foveated Modality Independent Neighbourhood Descriptor [45], E: Enhanced deep residual networks for single image super-resolution [14], F: Image super-resolution using very deep residual channel attention networks [16], G: Residual dense network for image super-resolution [15], H: Super-resolution with the proposed sampling scheme and motion compensation (the proposed algorithm). The compression ratio is 50%. See Table 3 for the PSNR values at other compression ratios.

**Table 3.** PSNR statistics for Figure 8. M denotes the mean of the observed PSNRs. MAE stands for Mean Absolute Error.

Reconstruction Method | N | M | MAE | SD | t | p |
---|---|---|---|---|---|---|
Image reconstructed from partially sampled PROPELLER blade | 100 | 19.16 | 19.22 | 0.01 | 1.654 | 0.101 |
no motion-corrected regular sampling scheme (with no downsampling applied) | 100 | 26.21 | 18.34 | 0.01 | 0.672 | 0.503 |
B-spline cubic interpolation (image registration applied) | 100 | 23.31 | 17.89 | 0.01 | 0.183 | 0.273 |
Non-Rigid Multi-Modal 3D Medical Image Registration Based on Foveated Modality Independent Neighbourhood Descriptor | 100 | 26.22 | 17.02 | 0.01 | −0.354 | 0.431 |
Enhanced deep residual networks for single image super-resolution | 100 | 29.65 | 16.44 | 0.01 | −0.384 | 0.411 |
Image super-resolution using very deep residual channel attention networks | 100 | 28.66 | 16.01 | 0.01 | −0.466 | 0.436 |
Residual dense network for image SR | 100 | 29.00 | 15.55 | 0.01 | −0.565 | 0.554 |
the proposed method | 100 | 29.28 | 14.07 | 0.01 | −1.002 | 0.554 |

**Figure 9.** Shepp-Logan phantom comparison. From the left: the PROPELLER sampling reconstruction output (PSNR = 29.84 dB, IEM = 2.04), the proposed algorithm result with enhanced resolution (PSNR = 34.58 dB, IEM = 3.64). The lower row shows detailed images. See Table 4 for the PSNR values at other compression ratios.

**Table 4.** PSNR statistics at different CS ratios for Table 5. M denotes the mean of the observed PSNRs. MAE stands for Mean Absolute Error.

CS Quality [%] | N | M | MAE | SD | t(99) | p |
---|---|---|---|---|---|---|
20 | 100 | 18.76 | 20.01 | 0.01 | 0.647 | 0.519 |
40 | 100 | 25.62 | 18.08 | 0.01 | 0.799 | 0.426 |
50 | 100 | 30.39 | 15.01 | 0.01 | 1.848 | 0.068 |
60 | 100 | 31.16 | 14.04 | 0.01 | 1.222 | 0.225 |
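The t(99) and p columns are consistent with t-tests over the N = 100 per-case scores (99 degrees of freedom). A paired-test sketch on synthetic scores follows; the test variant is an assumption, as the paper does not state which one was used:

```python
import numpy as np

# Paired t statistic over N = 100 observations, giving t(99) as in the
# tables. Synthetic PSNR scores; the paired variant is an assumption.
def paired_t(x, y):
    d = np.asarray(x, float) - np.asarray(y, float)
    n = d.size
    t = d.mean() / (d.std(ddof=1) / np.sqrt(n))   # t statistic
    return t, n - 1                               # (t, degrees of freedom)

rng = np.random.default_rng(1)
scores_a = 30.0 + 0.1 * rng.standard_normal(100)        # method A (synthetic)
scores_b = scores_a + 0.01 * rng.standard_normal(100)   # method B (synthetic)

t, df = paired_t(scores_a, scores_b)
assert df == 99   # matches the t(99) column in the tables
```

A two-sided p-value then follows from the Student t distribution with 99 degrees of freedom, e.g. `2 * scipy.stats.t.sf(abs(t), df)`.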

**Table 5.** PSNR statistics for Figure 12. M denotes the mean of the observed PSNRs. MAE stands for Mean Absolute Error.

Reconstruction Method | N | M | MAE | SD | t | p |
---|---|---|---|---|---|---|
Image reconstructed from partially sampled PROPELLER blade | 100 | 18.23 | 20.01 | 0.01 | −0.300 | 0.765 |
no motion-corrected regular sampling scheme (with no downsampling applied) | 100 | 26.86 | 19.72 | 0.01 | −1.554 | 0.121 |
B-spline cubic interpolation (image registration applied) | 100 | 23.30 | 18.91 | 0.01 | 0.181 | 0.275 |
Non-Rigid Multi-Modal 3D Medical Image Registration Based on Foveated Modality Independent Neighbourhood Descriptor | 100 | 26.31 | 1.23 | 0.01 | −0.323 | 0.437 |
Enhanced deep residual networks for single image super-resolution | 100 | 29.41 | 16.00 | 0.01 | −0.367 | 0.411 |
Image super-resolution using very deep residual channel attention networks | 100 | 28.12 | 15.61 | 0.01 | −0.412 | 0.432 |
Residual dense network for image SR | 100 | 29.93 | 15.01 | 0.01 | −0.542 | 0.558 |
the proposed method | 100 | 36.22 | 14.32 | 0.01 | 1.347 | 0.181 |

**Figure 10.** The bee swarm plots are one-dimensional scatter plots showing the "stripcharts" of the PSNR scores for all the methods on each dataset from Figure 7.

**Figure 11.** The bee swarm* plots are one-dimensional scatter plots showing the "stripcharts" of the IEM scores for all the methods on each dataset from Figure 7. *The bee swarm plot gives a better representation of the distribution of values, but it does not scale well to large numbers of observations.

**Table 6.** IEM statistics for Figure 7 at different CS ratios.

CS Quality [%] | N | IEM | SD | t(99) | p |
---|---|---|---|---|---|
20 | 100 | 1.79 | 0.00 | −1.421 | 0.158 |
40 | 100 | 1.88 | 0.01 | 1.654 | 0.101 |
50 | 100 | 3.99 | 0.01 | −1.179 | 0.241 |
60 | 100 | 4.31 | 0.00 | 0.000 | 1.000 |

**Figure 14.** The bee swarm plots are one-dimensional scatter plots showing the "stripcharts" of the PSNR scores for all the methods on each dataset from Figure 12.

**Figure 15.** The bee swarm plots are one-dimensional scatter plots showing the "stripcharts" of the IEM scores for all the methods on each dataset from Figure 12.

**Figure 16.** The bee swarm plots are one-dimensional scatter plots showing the "stripcharts" of the PSNR scores for all the methods on each dataset from Figure 12.

**Figure 17.** The bee swarm plots are one-dimensional scatter plots showing the "stripcharts" of the IEM scores for all the methods on each dataset from Figure 12.

**Figure 18.** The bee swarm plots are one-dimensional scatter plots showing the "stripcharts" of the PSNR scores for all the methods on each dataset from Figure 18.

**Figure 19.** The bee swarm plots are one-dimensional scatter plots showing the "stripcharts" of the IEM scores for all the methods on each dataset from Figure 17.

**Figure 20.** The bee swarm plots are one-dimensional scatter plots showing the "stripcharts" of the PSNR scores for all the methods on each dataset from Figure 17.

**Figure 21.** The bee swarm plots are one-dimensional scatter plots showing the "stripcharts" of the IEM scores for all the methods on each dataset from Figure 17.

**Figure 22.** The bee swarm plots are one-dimensional scatter plots showing the "stripcharts" of the PSNR scores for all the methods on each dataset from Figure 22.

**Figure 23.** The bee swarm plots are one-dimensional scatter plots showing the "stripcharts" of the IEM scores for all the methods on each dataset from Figure 22.

## 5. Discussion

**Table 7.** Acquisition parameters and total scan durations for the compared sampling schemes.

Scanning Pattern | TR | TE | FOV | Voxel (mm) | Total Scan Duration (s) | p |
---|---|---|---|---|---|---|
PROPELLER | 1200 | 180 | 290 | 0.96/0.96/1.00 | 360 | 0.159 |
SENSE | 1200 | 180 | 290 | 0.96/0.96/1.00 | 353 | 0.226 |
GRAPPA | 1200 | 180 | 290 | 0.96/0.96/1.00 | 320 | 0.136 |
the proposed algorithm | 1200 | 180 | 290 | 0.96/0.96/1.00 | 112 | 0.103 |

**Table 8.** Performance of the proposed algorithm at different CS ratios. MAE stands for Mean Absolute Error.

CS Ratio [%] | PSNR [dB] | MAE | IEM |
---|---|---|---|
20 | 18.76 | 19.82 | 1.79 |
40 | 25.62 | 18.01 | 1.88 |
50 | 30.39 | 15.66 | 3.99 |
60 | 31.16 | 14.20 | 4.31 |

**Table 9.** IEM statistics for Figure 12. M denotes the mean of the observed IEMs.

Reconstruction Method | N | M | SD | t | p |
---|---|---|---|---|---|
Image reconstructed from partially sampled PROPELLER blade | 100 | 1.56 | 0.01 | 1.133 | 0.260 |
no motion-corrected regular sampling scheme (with no downsampling applied) | 100 | 3.12 | 0.01 | 0.294 | 0.070 |
B-spline cubic interpolation (image registration applied) | 100 | 2.62 | 0.01 | 1.842 | 0.061 |
Non-Rigid Multi-Modal 3D Medical Image Registration Based on Foveated Modality Independent Neighbourhood Descriptor | 100 | 2.11 | 0.01 | 0.183 | 0.275 |
Enhanced deep residual networks for single image super-resolution | 100 | 2.32 | 0.01 | −0.327 | 0.436 |
Image super-resolution using very deep residual channel attention networks | 100 | 2.15 | 0.01 | −0.366 | 0.412 |
Residual dense network for image SR | 100 | 2.43 | 0.01 | −0.412 | 0.432 |
the proposed method | 100 | 2.82 | 0.01 | −0.51 | 0.551 |
Image reconstructed from partially sampled PROPELLER blade | 100 | 3.89 | 0.01 | −0.371 | 0.202 |

**Table 10.** Performance of the proposed algorithm at different CS ratios for Figure 12. MAE stands for Mean Absolute Error.

CS Ratio [%] | PSNR [dB] | MAE | IEM |
---|---|---|---|
20 | 18.44 | 20.01 | 1.75 |
40 | 28.42 | 18.03 | 1.92 |
50 | 36.22 | 16.05 | 3.89 |
60 | 38.11 | 14.55 | 4.31 |

**Table 11.** PSNR statistics for Figure 12. M denotes the mean of the observed PSNRs.

CS Quality [%] | N | M | SD | t(99) | p |
---|---|---|---|---|---|
20 | 100 | 18.44 | 0.01 | 0.139 | 0.889 |
40 | 100 | 28.42 | 0.01 | 0.728 | 0.469 |
50 | 100 | 36.22 | 0.01 | 1.789 | 0.077 |
60 | 100 | 38.11 | 0.01 | 1.830 | 0.070 |

**Table 12.** IEM statistics for Figure 12. M denotes the mean of the observed IEMs.

CS Quality [%] | N | M | SD | t(99) | p |
---|---|---|---|---|---|
20 | 100 | 1.75 | 0.01 | 1.000 | 0.329 |
40 | 100 | 1.92 | 0.01 | −0.865 | 0.389 |
50 | 100 | 3.89 | 0.01 | −0.672 | 0.503 |
60 | 100 | 4.31 | 0.01 | 0.961 | 0.339 |

**Table 13.** IEM statistics for Figure 17. M denotes the mean of the observed IEMs.

Reconstruction Method | N | M | SD | t | p |
---|---|---|---|---|---|
Image reconstructed from partially sampled PROPELLER blade | 100 | 1.56 | 0.01 | 1.133 | 0.260 |
no motion-corrected regular sampling scheme (with no downsampling applied) | 100 | 3.12 | 0.01 | 0.294 | 0.770 |
B-spline cubic interpolation (image registration applied) | 100 | 2.62 | 0.01 | 1.822 | 0.021 |
Non-Rigid Multi-Modal 3D Medical Image Registration Based on Foveated Modality Independent Neighbourhood Descriptor | 100 | 2.10 | 0.01 | 0.123 | 0.225 |
Enhanced deep residual networks for single image super-resolution | 100 | 2.31 | 0.01 | −0.317 | 0.426 |
Image super-resolution using very deep residual channel attention networks | 100 | 2.12 | 0.01 | −0.346 | 0.442 |
Residual dense network for image SR | 100 | 2.49 | 0.01 | −0.462 | 0.412 |
the proposed method | 100 | 2.80 | 0.01 | −0.511 | 0.521 |
Image reconstructed from partially sampled PROPELLER blade | 100 | 3.89 | 0.01 | −0.371 | 0.202 |

**Table 14.** Performance of the proposed algorithm at different CS ratios. MAE stands for Mean Absolute Error.

CS Ratio [%] | PSNR [dB] | MAE | IEM |
---|---|---|---|
20 | 19.31 | 20.01 | 1.67 |
40 | 24.14 | 18.23 | 1.91 |
50 | 29.28 | 16.02 | 3.68 |
60 | 31.08 | 15.65 | 4.19 |

**Table 15.** PSNR statistics for Figure 17. M denotes the mean of the observed PSNRs.

CS Quality [%] | N | M | SD | t(99) | p |
---|---|---|---|---|---|
20 | 100 | 19.31 | 0.01 | 0.542 | 0.589 |
40 | 100 | 24.14 | 0.01 | −0.713 | 0.478 |
50 | 100 | 29.28 | 0.01 | −1.044 | 0.299 |
60 | 100 | 31.08 | 0.01 | −1.021 | 0.310 |

**Table 16.** IEM statistics for Figure 17. M denotes the mean of the observed IEMs.

CS Quality [%] | N | M | SD | t(99) | p |
---|---|---|---|---|---|
20 | 100 | 1.67 | 0.01 | −0.575 | 0.566 |
40 | 100 | 1.91 | 0.01 | −0.134 | 0.894 |
50 | 100 | 3.68 | 0.01 | 0.542 | 0.589 |
60 | 100 | 4.19 | 0.01 | 0.588 | 0.558 |

**Table 17.** PSNR statistics for Figure 22. M denotes the mean of the observed PSNRs. MAE stands for Mean Absolute Error.

Reconstruction/Sampling Algorithm | N | M | MAE | SD | t(99) | p |
---|---|---|---|---|---|---|
the PROPELLER sampling reconstruction | 100 | 29.84 | 18.22 | 0.01 | −1.881 | 0.063 |
the proposed algorithm | 100 | 34.58 | 15.01 | 0.00 | 1.149 | 0.253 |

**Table 18.** IEM statistics for Figure 22. M denotes the mean of the observed IEMs.

Reconstruction/Sampling Algorithm | N | M | SD | t(99) | p |
---|---|---|---|---|---|
the PROPELLER sampling reconstruction | 100 | 2.04 | 0.01 | 0.139 | 0.889 |
the proposed algorithm | 100 | 3.64 | 0.00 | −1.228 | 0.222 |

**Table 19.** Performance of the proposed algorithm at different CS ratios. MAE stands for Mean Absolute Error.

CS Ratio [%] | PSNR [dB] | MAE | IEM |
---|---|---|---|
20 | 21.04 | 20.22 | 1.88 |
40 | 27.45 | 18.23 | 1.92 |
50 | 34.58 | 15.45 | 3.64 |
60 | 36.01 | 15.01 | 4.17 |

## Funding

## Conflicts of Interest

## References

1. Irani, M.; Peleg, S. Improving resolution by image registration. CVGIP Graph. Models Image Process. **1991**, 53, 231–239.
2. Malczewski, K.; Stasinski, R. High-resolution MRI image reconstruction from a PROPELLER data set of samples. Int. J. Funct. Inform. Pers. Med. **2008**, 1, 311–320.
3. Malczewski, K. Breaking the Resolution Limit in Medical Imaging Modalities. In Proceedings of the 2012 International Conference on Image Processing, Computer Vision, and Pattern Recognition (IPCV'12), Las Vegas, NV, USA, 16–19 July 2012.
4. Malczewski, K.; Stasinski, R. Super Resolution for Multimedia, Image, and Video Processing Applications. In Studies in Computational Intelligence; Springer: Berlin, Germany, 2009; Volume 231.
5. Kennedy, J.A.; Israel, O.; Frenkel, A.; Bar-Shalom, R.; Azhari, H. Super-resolution in PET imaging. IEEE Trans. Med. Imaging **2006**, 25, 137–147.
6. Freeman, W.T.; Pasztor, E.C. Learning to Estimate Scenes from Images. In Advances in Neural Information Processing Systems; Kearns, M.S., Solla, S.A., Cohn, D.A., Eds.; MIT Press: Cambridge, MA, USA, 1999; Volume 11, pp. 775–781.
7. Liu, C.; Sun, D. On Bayesian Adaptive Video Super Resolution. IEEE Trans. Pattern Anal. Mach. Intell. **2013**, 36, 346–360.
8. Kim, J.; Lee, J.K.; Lee, K.M. Accurate image super-resolution using very deep convolutional networks. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2016), Las Vegas, NV, USA, 27–30 June 2016; pp. 1646–1654.
9. Kim, J.; Lee, J.K.; Lee, K.M. Deeply recursive convolutional network for image super-resolution. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 1637–1645.
10. Tao, X.; Gao, H.; Liao, R.; Wang, J.; Jia, J. Detail-revealing deep video super-resolution. In Proceedings of the IEEE International Conference on Computer Vision (ICCV 2017), Venice, Italy, 22–29 October 2017; pp. 4482–4490.
11. Sajjadi, M.S.M.; Vemulapalli, R.; Brown, M. Frame-recurrent video super-resolution. In Proceedings of the 2018 IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2018), Salt Lake City, UT, USA, 18–22 June 2018; pp. 6626–6634.
12. Jo, Y.; Oh, S.W.; Kang, J.; Kim, S.J. Deep video super-resolution network using dynamic upsampling filters without explicit motion compensation. In Proceedings of the 2018 IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–22 June 2018.
13. Dong, C.; Loy, C.C.; He, K.; Tang, X. Learning a Deep Convolutional Network for Image Super-Resolution. In Proceedings of the European Conference on Computer Vision (ECCV), Zurich, Switzerland, 6–12 September 2014.
14. Lim, B.; Son, S.; Kim, H.; Nah, S.; Lee, K.M. Enhanced deep residual networks for single image super-resolution. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition Workshops, Honolulu, HI, USA, 21–26 July 2017; pp. 1132–1140.
15. Zhang, Y.; Tian, Y.; Kong, Y.; Fu, B.Y. Residual dense network for image super-resolution. In Proceedings of the 2018 IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–22 June 2018; pp. 2472–2481.
16. Zhang, Y.; Li, K.; Li, K.; Wang, L.; Zhong, B.; Fu, Y. Image super-resolution using very deep residual channel attention networks. In Proceedings of the Computer Vision—ECCV 2018, 15th European Conference, Munich, Germany, 8–14 September 2018; Part VII, pp. 294–310.
17. Haris, M.; Shakhnarovich, G.; Ukita, N. Deep back-projection networks for super-resolution. In Proceedings of the 2018 IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–22 June 2018; pp. 1664–1673.
18. Pruessmann, K.P.; Weiger, M.; Scheidegger, M.B.; Boesiger, P. SENSE: Sensitivity encoding for fast MRI. Magn. Reson. Med. **1999**, 42, 952–962.
19. Ding, Y.; Chung, Y.C.; Jekic, M.; Simonetti, O.P. A new approach to auto-calibrated dynamic parallel imaging based on the Karhunen-Loeve transform: KL-TSENSE and KL-TGRAPPA. Magn. Reson. Med. **2011**, 65, 1786–1792.
20. Küstner, T.; Würslin, C.; Gatidis, S.; Martirosian, P.; Nikolaou, K.; Schwenzer, N.F.; Schick, F.; Yang, B.; Schmidt, H. MR image reconstruction using a combination of compressed sensing and partial Fourier acquisition: ESPReSSo. IEEE Trans. Med. Imaging **2016**, 35, 2447–2458.
21. Ding, Y.; Chung, Y.C.; Jekic, M.; Simonetti, O.P. A method to assess spatially variant noise in dynamic MR image series. Magn. Reson. Med. **2010**, 63, 782–789.
22. Kim, D.; Dyvorne, H.A.; Otazo, R.; Feng, L.; Sodickson, D.K.; Lee, V.S. Accelerated phase-contrast cine MRI using k-t SPARSE-SENSE. Magn. Reson. Med. **2012**, 67, 1054–1064.
23. Pedersen, H.; Kozerke, S.; Ringgaard, S.; Nehrke, K.; Kim, W. k-t PCA: Temporally constrained k-t BLAST reconstruction using principal component analysis. Magn. Reson. Imaging **2009**, 63, 706–716.
24. Griswold, M.A.; Jakob, P.M.; Heidemann, R.M.; Nittka, M.; Jellus, V.; Wang, J.; Kiefer, B.; Haase, A. Generalized autocalibrating partially parallel acquisitions (GRAPPA). Magn. Reson. Med. **2002**, 47, 1202–1210.
25. Noll, D.C.; Nishimura, D.G.; Macovski, A. Homodyne detection in magnetic resonance imaging. IEEE Trans. Med. Imaging **1991**, 10, 154–163.
26. McGibney, G.; Smith, M.R.; Nichols, S.T.; Crawley, A. Quantitative evaluation of several partial Fourier reconstruction methods used in MRI. Magn. Reson. Med. **1993**, 30, 51–59.
27. Davenport, M. The Fundamentals of Compressive Sensing. IEEE Signal Processing Society Online Tutorial Library, 12 April 2013.
28. Cook, R.L. Stochastic sampling in computer graphics. ACM Trans. Graph. **1986**, 5, 51–72.
29. Jung, H.; Ye, J.C.; Kim, E.Y. Improved k-t BLAST and k-t SENSE using FOCUSS. Phys. Med. Biol. **2007**, 52, 3201–3226.
30. He, Z.; Cichocki, A.; Zdunek, R.; Xie, S. Improved FOCUSS method with conjugate gradient iterations. IEEE Trans. Signal Process. **2009**, 57, 399–404.
31. Deshmane, A.; Gulani, V.; Griswold, M.A.; Seiberlich, N. Parallel MR imaging. J. Magn. Reson. Imaging **2012**, 36, 55–72.
32. Feng, L.; Xu, J.; Kim, D.; Axel, L.; Sodickson, D.K.; Otazo, R. Combination of compressed sensing, parallel imaging and partial Fourier for highly-accelerated 3D first-pass cardiac perfusion MRI. In Proceedings of the 19th Annual Meeting of the International Society for Magnetic Resonance in Medicine (ISMRM), Quebec, QC, Canada, 7–13 May 2011; p. 4368.
33. Margosian, P.; Schmitt, F.; Purdy, D. Faster MR imaging: Imaging with half the data. Healthc. Instrum. **1986**, 1, 195–197.
34. Cuppen, J.; van Est, A. Reducing MR imaging time by one-sided reconstruction. Magn. Reson. Med. **1987**, 5, 526–527.
35. Otazo, R.; Kim, D.; Axel, L.; Sodickson, D. Combination of compressed sensing and parallel imaging for highly accelerated first-pass cardiac perfusion MRI. Magn. Reson. Med. **2010**, 64, 767–776.
36. Liang, D.; Liu, B.; Wang, J.; Ying, L. Accelerating SENSE using compressed sensing. Magn. Reson. Med. **2009**, 62, 1574–1584.
37. King, K.F.; Angelos, L. SENSE image quality improvement using matrix regularization. Int. Soc. Magn. Reson. Med. **2001**, 9, 1771.
38. Thüring, T.; Eggers, H.; Doneva, M.; Kozerke, S. A fast reconstruction method using compressive sensing and an additional phase constraint. In Proceedings of the 26th Annual Meeting of the European Society for Magnetic Resonance in Medicine and Biology (ESMRMB), Antalya, Turkey, 1–3 October 2009; p. 31.
39. Malczewski, K. Super-resolution with compressively sensed MR/PET signals at its input. Inform. Med. Unlocked **2020**, 18, 1–20.
40. Malczewski, K. Rapid Diffusion Weighted Imaging with Enhanced Resolution. Appl. Magn. Reson. **2020**, 51, 221–239.
41. Tsai, C.M.; Nishimura, D.G. Reduced aliasing artefacts using variable density k-space sampling trajectories. Magn. Reson. Med. **2000**, 43, 452–458.
42. Lucia, M.; Granata, G.; Ichcha, M.; Mario, M.; Guarracino, M.R. Glioma Grade Classification via Omics Imaging. In Proceedings of the 7th International Conference on Bioimaging, Vienna, Austria, 24–26 February 2020.
43. Heinrich, M.P.; Jenkinson, M.; Brady, M.; Schnabel, J.A. Globally optimal deformable registration on a minimum spanning tree using dense displacement sampling. In Medical Image Computing and Computer-Assisted Intervention (MICCAI 2012); Ayache, N., Delingette, H., Golland, P., Mori, K., Eds.; Springer: Berlin/Heidelberg, Germany, 2012; Part III.
44. Raj, A.; Singh, G.; Zabih, R.; Kressler, B.; Wang, Y.; Schuff, N.; Weiner, M. Bayesian parallel imaging with edge-preserving priors. Magn. Reson. Med. **2007**, 57, 8–21.
45. Yang, F.; Ding, M.; Zhang, X. Non-Rigid Multi-Modal 3D Medical Image Registration Based on Foveated Modality Independent Neighbourhood Descriptor. Sensors **2019**, 19, 4675.
46. Jaya, V.L.; Gopikakumari, R. IEM: A New Image Enhancement Metric for Contrast and Sharpness Measurements. Int. J. Comput. Appl. **2013**, 79, 1–9.

© 2020 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Malczewski, K.
Fixing Acceleration and Image Resolution Issues of Nuclear Magnetic Resonance. *Symmetry* **2020**, *12*, 681.
https://doi.org/10.3390/sym12040681
