Journal of Healthcare Engineering
Volume 2017, Article ID 2634389, 7 pages
Research Article

An Evaluation of the Benefits of Simultaneous Acquisition on PET/MR Coregistration in Head/Neck Imaging

IRCCS SDN, Naples, Italy

Correspondence should be addressed to Serena Monti; smonti@sdn-napoli.it

Received 24 February 2017; Revised 2 May 2017; Accepted 16 May 2017; Published 18 July 2017

Academic Editor: Pan Lin

Copyright © 2017 Serena Monti et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Coregistration of multimodal diagnostic images is crucial for qualitative and quantitative multiparametric analysis. While retrospective coregistration is computationally intensive and can be inaccurate, hybrid PET/MR scanners acquire implicitly coregistered images. The aim of this study is to assess the performance of state-of-the-art coregistration methods applied to PET and MR acquired as single modalities, comparing the results with the implicit coregistration of a hybrid PET/MR scanner in a complex anatomical region such as the head/neck (HN). A dataset consisting of PET/CT and PET/MR subsequently acquired in twenty-three patients was considered: the performance of rigid (RR) and deformable (DR) registration obtained with a commercial software package and an open-source registration package was evaluated. Registration accuracy was qualitatively assessed in terms of visual alignment of anatomical structures and quantitatively measured by the Dice scores computed on tumors segmented in PET and MRI. The resulting scores showed that hybrid PET/MR provided higher registration accuracy than retrospectively coregistered images, because of an overall misalignment after RR and unrealistic deformations and volume variations after DR. DR showed superior performance compared to RR, owing to the complex nonrigid movements of the HN district. Moreover, simultaneous PET/MR, when acquired together with PET/CT, offers unique datasets that can serve as ground truth for the improvement and validation of coregistration algorithms.

1. Introduction

The integration of multimodal information obtained from different diagnostic imaging techniques is essential for a comprehensive characterization of the region under examination. Image coregistration has therefore become crucial both for qualitative visual assessment [1] and for quantitative multiparametric analysis in research applications [2, 3] and in clinical diagnosis, staging, and follow-up. Coregistration of complex data, such as diagnostic images, is typically computationally intensive, and its result can also be inaccurate. This problem is intrinsically overcome by hybrid systems, which simultaneously acquire images that share the same coordinate system [4, 5].

In this field, the recently introduced integrated PET/MRI scanners represent the new frontier of molecular imaging. This technology provides in a single session both the functional information of positron emission tomography (PET) and the morpho-functional information, with excellent soft tissue contrast, of magnetic resonance imaging (MRI), while increasing patient compliance. The advantages of such a technology go beyond the mere combination of functional and morphological imaging: considering the wide range of MRI sequences and PET radiotracers available [6], the functional information of MRI and PET may complement each other; moreover, owing to the high spatial and contrast resolution of MRI, PET/MR imaging is becoming a straightforward clinical indication for local staging in complex anatomical regions such as the head/neck [7], where it can help delineate tumor extent and lymph node involvement from the surrounding tissue [8–11]. Furthermore, PET/MRI can be useful for radiation therapy and presurgical treatment planning in head and neck cancer patients [12, 13].

Compared with separate acquisition of PET and MR, hybrid systems certainly overcome the computational problem of PET-MR coregistration, producing at the same time PET and MR images of the same anatomical district that are, therefore, ideally coregistered.

Despite the undeniable advantages of hybrid solutions, their cost-effectiveness is still far from proven, and coregistration of multimodal information is frequently obtained retrospectively via software, combining images from a PET scanner with preexisting CT and MR. This reduces the cost of purchasing new technology while offering renewed opportunities to advance PET, especially in underserved areas or under increasing economic constraints [14]. Multimodal coregistration via software is commonly approached by algorithms consisting of an affine or rigid transformation followed by a free-form deformation, using mutual information [15–17] as similarity measure. When available, these algorithms coregister the anatomical CT component of PET/CT with the MR, and the PET component is then transformed with the resulting deformation field, in order to guarantee more accurate coregistration of the PET/MR data [13]. Retrospective coregistration via software has shown good performance also in the HN district [13, 18], but it is particularly challenging and technically demanding, mainly because of the varied patient positions used on the various scanners and the anatomical complexity of this region [10], which is subject to respiration, swallowing, and intrinsically nonrigid movements [19].
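As background for the similarity measure mentioned above, mutual information can be estimated from the joint intensity histogram of the two images. A minimal NumPy sketch (illustrative only; registration packages use their own, more sophisticated estimators):

```python
import numpy as np

def mutual_information(a, b, bins=64):
    """Estimate the mutual information (in nats) of two equally sized images."""
    # Joint histogram of corresponding voxel intensities.
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1)  # marginal distribution of image a
    py = pxy.sum(axis=0)  # marginal distribution of image b
    nz = pxy > 0          # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz])))
```

An aligned image pair shares far more information than a misaligned one, which is why an optimizer driving a rigid or free-form transformation can use mutual information as its objective even across modalities with very different intensity distributions.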

Given this scenario, our study aims to assess the performance of state-of-the-art coregistration methods between PET and MR acquired as single modalities, comparing them with the intrinsic coregistration provided by a hybrid PET/MR system, which is assumed to represent a ground truth for the assessment of retrospective coregistration. In particular, the performance of state-of-the-art rigid and deformable registration algorithms, implemented in a commercial software package and in an open-source registration package, was evaluated to assess their clinical suitability. Our work is based on a dataset of PET/MR and PET/CT of the HN district acquired during the same session, in order to exploit a single administration of the FDG PET radiotracer.

2. Materials and Methods

The study was approved by the Institutional Review Board: 23 patients with histologically confirmed HN malignancy (at early staging or in follow-up) were studied after written informed consent was obtained. Table 1 shows clinical details for each patient.

Table 1: Patient cohort examined in this study. For each patient, in addition to personal details, the site of malignancy is specified.
2.1. Imaging Protocol

All subjects underwent a single-injection dual imaging protocol including PET/CT and subsequent PET/MR, so that no additional injection was required and any additional radiation exposure for the patients was avoided. The examination protocol consisted of the following steps: patients fasted for at least 6 h before scanning; just before the injection, the blood glucose level was measured in order to ensure a value below 150 mg/dL; and the patients were injected with about 400 MBq of [18F]-FDG, depending on their body weight. After an uptake period of 80 minutes, patients underwent PET/CT scanning and, soon after PET/CT, they underwent PET/MR examination.
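As a back-of-envelope check on the protocol (only the ~400 MBq dose and the 80-minute uptake come from the paper; the half-life is the standard physical value), the activity remaining at the start of PET/CT follows directly from 18F decay:

```python
# Physical decay of the injected 18F-FDG over the uptake period.
injected_mbq = 400.0    # approximate injected activity (from the protocol)
half_life_min = 109.77  # physical half-life of fluorine-18
uptake_min = 80.0       # uptake period before PET/CT scanning

remaining_mbq = injected_mbq * 0.5 ** (uptake_min / half_life_min)
# roughly 240 MBq of physical activity remains when PET/CT starts
```

The subsequent PET/MR therefore images an even lower count rate, which sequential dual-imaging protocols typically offset with the longer PET acquisition times available on the hybrid scanner.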

2.1.1. PET/CT Acquisition

PET/CT acquisition was performed on a Gemini TF (Philips Medical Systems, Best, The Netherlands). PET data were acquired in sinogram mode for 15 minutes with a matrix size of 144 × 144. A 3-dimensional attenuation-weighted ordered-subsets expectation maximization iterative reconstruction algorithm (AW OSEM 3D) was applied with 3 iterations and 21 subsets, Gaussian smoothing of 4 mm FWHM, and a zoom of 1. The CT consisted of a low-dose scan (120 kV, 80 mA). Patients were positioned supine with arms resting at the sides.

2.1.2. PET/MR Acquisition

PET/MR was performed on a Biograph mMR (Siemens Healthcare, Erlangen, Germany). The bed position was chosen to obtain full coverage of the head/neck region. These PET data were likewise reconstructed with an AW OSEM 3D iterative reconstruction algorithm applied with 3 iterations and 21 subsets, Gaussian smoothing of 4 mm full width at half maximum, and a zoom of 1. MR attenuation correction was performed via a segmentation approach based on 2-point Dixon MRI sequences. The MRI protocol was performed with dedicated head and neck coils. The MRI sequence considered for this study was a T2-weighted short tau inversion recovery (STIR) acquired in the coronal direction (TR/TE/TI = 5000/84/220 ms, one signal average, voxel size = 0.4 × 0.4 × 3.5 mm).

2.2. Data Processing

Image registration strategies were based on the CT and MR data only, with the coronal STIR MR acquisition serving as the fixed image and the CT as the moving image; the PET component of the PET/CT data was only later transformed by the resulting deformation field into the coordinate system of the PET/MR. The entire registration process was performed twice, using two different tools: the freely available, open-source registration package Elastix [20] and the deformable image registration tool included in the commercial software XD3 (Mirada Medical Ltd., Oxford, United Kingdom) [21]. In the following, we refer to the set composed of the PET and MR acquired on the hybrid PET/MR scanner as PETMRo and to the set composed of the PET from PET/CT retrospectively coregistered to the MR from PET/MR as PETMRreg. Moreover, registration performed by means of Elastix or Mirada is superscripted with ELX or MRD, respectively, while the suffixes RR and DR indicate rigid and deformable registration, respectively.
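The final step above, warping the PET with the transform estimated on the anatomical images, amounts to resampling the PET volume through the CT-to-MR mapping. For the rigid/affine part this can be sketched with SciPy (illustrative only; the study used Transformix and Mirada's own resampler, and DR additionally involves a dense deformation field):

```python
import numpy as np
from scipy.ndimage import affine_transform

def warp_like_ct(pet, matrix, offset):
    """Resample a PET volume through an affine transform estimated on CT->MR.

    `matrix` and `offset` follow SciPy's convention: the value at each output
    voxel o is interpolated at input position matrix @ o + offset.
    """
    return affine_transform(pet, matrix, offset=offset, order=1)

# Toy example: a pure translation by one voxel along the first axis.
pet = np.zeros((4, 4, 4))
pet[2, 2, 2] = 1.0
warped = warp_like_ct(pet, np.eye(3), offset=(1.0, 0.0, 0.0))
```

Because the transform is estimated on the CT and MR, which share rich anatomical detail, and only afterwards applied to the PET, the low-contrast PET data never has to drive the registration itself.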

2.2.1. Image Registration with Elastix

Elastix is a command line-driven program based on the Insight Toolkit (ITK) registration framework (National Library of Medicine). The registration parameters were selected based on previous work on multimodal deformable image registration for the integration of PET/MR into radiotherapy treatment planning for the head and neck [13]. First, a rigid registration (RR) was performed, and the resulting transform was used as a starting point for deformable registration (DR) by means of a B-spline transform [15]. Both RR and DR were performed with a three-level multiresolution approach, using Gaussian smoothing (sigma = 8.0, 4.0, and 4.0 in the x and y directions and sigma = 2.0, 1.0, and 0.5 in the z direction to take into account voxel anisotropy) without downsampling. A localized version of mutual information (LMI) was used as similarity measure (Mattes mutual information) [22], and a stochastic gradient descent optimizer [23] was chosen to optimize it. In detail, for RR, the LMI metric was computed with 64 bins, 2000 samples, and a maximum of 500 iterations for each resolution. For DR, a bending energy penalty (BEP) term was added [15] to regularize the transformation; the metric (the sum of LMI and BEP) was computed with 60 bins, 10,000 samples, and a maximum of 5000 iterations for each resolution. After the deformation field mapping the CT into the coordinate space of the MR was computed, the PET from PET/CT was warped accordingly using Transformix, another command line-driven program based on ITK that applies a known transformation to an input image.
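The bending energy penalty that regularizes the B-spline transform is, in essence, the integral of the squared second derivatives of the displacement field. A 2-D NumPy sketch of the idea, using finite differences (Elastix evaluates the penalty analytically on the B-spline coefficients, so this is an approximation of the concept, not of the implementation):

```python
import numpy as np

def bending_energy(dx, dy):
    """Approximate bending energy of a 2-D displacement field (dx, dy).

    Sums the squared second derivatives of each displacement component; an
    affine (linear) field has zero bending energy, so only genuine curvature
    of the deformation is penalized.
    """
    energy = 0.0
    for d in (dx, dy):
        dxx = np.gradient(np.gradient(d, axis=0), axis=0)
        dyy = np.gradient(np.gradient(d, axis=1), axis=1)
        dxy = np.gradient(np.gradient(d, axis=0), axis=1)
        energy += np.mean(dxx**2 + 2.0 * dxy**2 + dyy**2)
    return energy
```

Adding this term to the similarity metric discourages exactly the kind of unrealistic deformations and volume variations that penalty-free deformable registration can produce.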

2.2.2. Image Registration with XD3

XD3 is Mirada's commercial platform providing a full suite of practical applications for multimodal image viewing, including rigid and deformable registration. After a first RR step between MR and CT, DR was performed using Mirada's multimodal deformable image registration algorithm, which optimizes a proprietary mutual information-based similarity function [15, 16, 24] over a radial basis function (RBF) transformation model. Default parameters for unsupervised MR-CT registration were used. Finally, once the transformation registering the CT to the MR was computed, the PET from PET/CT was automatically warped.

2.3. Image Evaluation

Registration accuracy was qualitatively and quantitatively evaluated on five sets of images for each patient: PETMRo, PETMRregELXRR, PETMRregELXDR, PETMRregMRDRR, and PETMRregMRDDR. Qualitative evaluation was performed by two clinical reviewers: a nuclear medicine physician also licensed in diagnostic radiology and a radiologist also licensed in nuclear medicine. Images were analyzed in the coronal plane on the freely available medical imaging platform medInria [25], which allows visualization and fusion of both NIfTI and DICOM files. The observers reviewed the five image sets for each patient, evaluating the alignment of the major anatomical structures. They then identified, in the PET and MR images, the localization and extent of the primary tumor and of metastases to regional lymph nodes and, on this basis, independently rated the registration quality of each tested method using the scoring system defined in Table 2. Neither reader was aware of the results of other imaging studies, histopathologic findings, or clinical data.

Table 2: Scoring system used to evaluate the registration quality of PET with MR images.

To obtain a quantitative evaluation of the registration accuracy as well, the two clinicians were asked to segment the primary tumor of each patient using the freely available software ITK-SNAP [26]. The radiologist manually contoured the lesion in the T2-weighted coronal image. The nuclear medicine physician used the user-guided 3D active contour segmentation implemented in ITK-SNAP to semiautomatically segment the primary tumor, after initializing the process with the placement of a spherical seed. For each patient, the operation was repeated five times (once for each PET of the five sets). The obtained segmentations were then used to compute the Dice similarity coefficient between the MR and the PET of each set.
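The Dice score of an MR-derived mask A and a PET-derived mask B is 2|A ∩ B| / (|A| + |B|), ranging from 0 (disjoint) to 1 (identical). A minimal sketch on hypothetical binary masks (ITK-SNAP produced the actual segmentations):

```python
import numpy as np

def dice_score(mask_a, mask_b):
    """Dice similarity coefficient of two binary masks of equal shape."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # two empty masks are trivially identical
    return 2.0 * np.logical_and(a, b).sum() / denom

# Toy 1-D example: 1 overlapping voxel out of 2 + 2 voxels -> Dice = 0.5
print(dice_score([1, 1, 0, 0], [0, 1, 1, 0]))  # 0.5
```

A Dice score close to the PETMRo value indicates that a retrospective registration recovered the tumor position about as well as the hybrid scanner's intrinsic alignment.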

2.4. Statistical Analysis

Friedman statistics with subsequent multiple-comparison analysis were used to evaluate the statistical differences in visual ratings between the implicitly coregistered PET/MR and the results of the coregistration software. Further comparisons between the single steps of the different coregistration software packages were evaluated by means of a Wilcoxon signed-rank test, as done in [27]. Similarly, statistical differences in Dice scores were tested with ANOVA and paired Student's t-tests. Statistical analysis was performed using Matlab (MATLAB R2014b, MathWorks, Natick, MA). Differences at p < 0.05 were considered statistically significant.
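The same family of tests can be sketched in Python with SciPy on made-up scores (the study used Matlab; the ratings below are hypothetical, with 23 paired observations per method as in the patient cohort):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
hybrid = rng.normal(3.8, 0.2, 23)  # hypothetical visual ratings, PETMRo
rr = rng.normal(2.0, 0.5, 23)      # hypothetical ratings after RR
dr = rng.normal(3.0, 0.5, 23)      # hypothetical ratings after DR

# Omnibus test across the three related samples.
chi2, p_friedman = stats.friedmanchisquare(hybrid, rr, dr)

# Paired follow-up comparisons.
w, p_wilcoxon = stats.wilcoxon(hybrid, rr)  # nonparametric, for ordinal ratings
t, p_ttest = stats.ttest_rel(hybrid, dr)    # parametric, as for Dice scores
```

The Friedman test plays the omnibus role described above, with the Wilcoxon signed-rank and paired t-tests used for pairwise follow-ups on the ratings and Dice scores, respectively.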

3. Results

Table 3 shows the mean qualitative scores over the two observers and the Dice score for each patient and for each set considered.

Table 3: Evaluation results: registration accuracy scores expressed as mean ± standard deviation.

Comparing PETMRo with the registrations performed by Elastix and Mirada, the differences were statistically significant both in the qualitative (Friedman test p values: PETMRo/PETMRregELXRR/PETMRregELXDR = 2.92 · 10−7, PETMRo/PETMRregMRDRR/PETMRregMRDDR = 4.56 · 10−7) and in the quantitative analysis (ANOVA p values: PETMRo/PETMRregELXRR/PETMRregELXDR = 4.83 · 10−11, PETMRo/PETMRregMRDRR/PETMRregMRDDR = 3.80 · 10−12). For each patient, the scores showed that the PETMRo set had higher (or at most equal) registration accuracy than the other fused image sets.

This superiority was statistically significant in all comparisons, both for the qualitative (Wilcoxon test p values: PETMRo/PETMRregELXRR = 1.25 · 10−7, PETMRo/PETMRregELXDR = 0.02, PETMRo/PETMRregMRDRR = 2.46 · 10−6, PETMRo/PETMRregMRDDR = 3.50 · 10−5) and the quantitative scores (t-test p values: PETMRo/PETMRregELXRR = 1.57 · 10−10, PETMRo/PETMRregELXDR = 2.22 · 10−8, PETMRo/PETMRregMRDRR = 1.45 · 10−10, PETMRo/PETMRregMRDDR = 1.20 · 10−9).

The registration of PET with MR images after the RR step showed an overall misalignment due to different patient positioning, for both Elastix and Mirada, with differences between the two methods that were not statistically significant for either the qualitative (Wilcoxon test p value PETMRregELXRR/PETMRregMRDRR = 0.73) or the quantitative scores (t-test p value PETMRregELXRR/PETMRregMRDRR = 0.75).

When a DR step was performed, a significant improvement could be obtained, but unrealistic deformations or moderate, smooth volume expansions and compressions could also occur, leading to good or sufficient alignment of the major anatomical structures but local misregistration of tumors. However, looking at the scores of the single patients, the accuracy of registration tended to improve after the DR step for both Elastix and Mirada. This improvement was statistically significant for registration performed with Elastix (Wilcoxon test p value PETMRregELXRR/PETMRregELXDR = 0.01, t-test p value PETMRregELXRR/PETMRregELXDR = 1.4 · 10−3) and not statistically significant for Mirada (Wilcoxon test p value PETMRregMRDRR/PETMRregMRDDR = 0.85, t-test p value PETMRregMRDRR/PETMRregMRDDR = 0.09).

Comparing the results obtained from the two registration tools, the scores of the DR output with Elastix were generally higher than those of the DR output with Mirada, but the differences between these two sets were statistically significant only in the qualitative assessment (Wilcoxon test p value PETMRregELXDR/PETMRregMRDDR = 0.02, t-test p value PETMRregELXDR/PETMRregMRDDR = 0.32).

In conclusion, in the quantitative assessment hybrid PET/MR definitely outperformed retrospective registration; in the qualitative assessment, in 25% of cases all retrospective coregistration methods showed results comparable with PETMRo in terms of alignment of the major anatomical structures and tumors (Figure 1). In the remaining 75% of cases, PETMRo exhibited overall superiority. In detail, 17% of these cases showed slightly better performance of the Elastix-based registration, in particular the DR step, compared with Mirada; one case showed better performance of Mirada; in the remaining cases, misalignment after the RR steps and/or volume variations after the DR steps were visible (Figure 2).

Figure 1: Example of qualitatively well-ranked coregistration results. From left to right: coronal MR image, fused PET/MR, and PET image from (a) PETMRo, (b) PETMRregELXRR, (c) PETMRregELXDR, (d) PETMRregMRDRR, and (e) PETMRregMRDDR. Both RR and DR with Elastix, (b) and (c), respectively, and Mirada, (d) and (e), respectively, show results comparable with the intrinsic coregistration of simultaneous PET/MR (a). The Dice scores for this case are PETMRo = 0.95, PETMRregELXRR = 0.85, PETMRregELXDR = 0.86, PETMRregMRDRR = 0.89, and PETMRregMRDDR = 0.90.
Figure 2: Example of poorly ranked coregistration results. From left to right: coronal MR image, fused PET/MR, and PET image from (a) PETMRo, (b) PETMRregELXRR, (c) PETMRregELXDR, (d) PETMRregMRDRR, and (e) PETMRregMRDDR. RR with Elastix (b) and Mirada (d) shows an overall misalignment of the brain contour between PET and MR. This misalignment is only partially recovered by DR with Mirada (e) and better recovered by DR with Elastix (c). However, the lymph node tumor is completely absent in the PET component of PETMRregMRDRR and its localization is not perfectly corresponding in PETMRregELXDR (c) as in PETMRo (a). The Dice scores for this case are PETMRo = 0.79, PETMRregELXRR = 0.26, PETMRregELXDR = 0.52, PETMRregMRDRR = 0.19, and PETMRregMRDDR = 0.25.

4. Discussion

In this work, four different strategies for the coregistration of PET and MR in the HN region were qualitatively and quantitatively evaluated, with the purpose of comparing them with the intrinsic coregistration of simultaneous PET/MR, which is assumed to represent a ground truth for the assessment of retrospective coregistration.

To our knowledge, this is the first study to investigate the validity of retrospectively coregistered PET/MR of the HN district, using images obtained from different modalities, in terms of localization and extent of the primary tumor and of metastases to regional lymph nodes, and to compare the accuracy of anatomical structure alignment and tumor localization with intrinsically coregistered simultaneous PET/MR.

Kanda et al. [9] assessed the clinical value of retrospective image coregistration of neck MRI and [18F]-FDG PET for locoregional extension and nodal staging of neck cancer in 30 patients, comparing it with PET/CT fusion. Although they used manual registration, they hypothesized that simultaneous PET/MR technology would minimize the drawbacks of the retrospective PET/MR coregistration strategy, such as local misregistration, generating better-quality fusion images, a hypothesis that our study confirms.

A similar question was addressed by Loeffelbein et al. [11], who compared their retrospective coregistration results, obtained with a commercial software package, with side-by-side analysis of single-modality PET and MRI in a group of thirty patients.

In neither of these studies did the authors have datasets from simultaneous PET/MR available to use as a gold standard for the evaluation of registration performance.

Leibfarth et al. [13] developed an accurate and robust registration strategy on a dataset of eight patients, consisting of an FDG PET/CT and a subsequently acquired PET/MR of the HN, with the aim of integrating combined PET/MR data into RT treatment planning. We started from this work for the implementation of the Elastix registration, but we took advantage of a wider dataset and also evaluated registration performed by a commercial software package optimized for a clinical workflow.

Our results showed that deformable registration outperformed rigid registration for both Elastix and Mirada. This is due to the complex movements of this region, which are intrinsically nonrigid and hence cannot be fully recovered by a rigid transformation with only six degrees of freedom. With regard to deformable registration, although Elastix showed better performance than Mirada, at least in the qualitative evaluation, it was computationally more intensive. The tested software packages use different similarity measures and different transformation models: although Mirada uses an RBF transformation model, which is more accurate and faster than the B-spline used here for the Elastix registration, it is completely embedded. Consequently, internal registration parameters, such as deformation field smoothness, degrees of freedom, and similarity function sensitivity, are automatically tuned by the software on the basis of the considered modalities. On the other hand, the registration scheme and parameters used for DR in Elastix are the result of a previous optimization study [13] and are designed for the specific application of MR-CT registration of the HN district; in particular, the B-spline parametrization in conjunction with BEP is chosen to favor a smooth and plausible transform. LMI is advantageous in the case of spatial intensity distortions and for multimodal registration when one intensity class corresponds to a specific tissue type in one imaging modality and to different tissue types in the other [13]. Moreover, as expected, the coregistered PET/MR images from the hybrid scanner yielded the best performance, since they are inherently free from the problems of misalignment, local misregistration, and unrealistic deformation fields.

We believe that the performance of software-based coregistration methods in districts subject to nonrigid movements, such as the HN, could undoubtedly be improved by means of support structures, such as head masks, designed to immobilize the patient during the acquisitions. In addition, registration algorithms could benefit from user supervision in a preliminary manual step, in order to start from an optimal rigid alignment that could improve the performance of the subsequent automatic deformable steps, making them more feasible in terms of computational time. Both these issues are outside the scope of this work, which, although aimed at evaluating the clinical suitability of retrospective coregistration in comparison with the intrinsic coregistration of simultaneous PET/MR, limits the investigation to a fully automated setting.

In conclusion, our findings show that, in the complex case of PET/MR of the HN district, there is ample room for improvement in software-based coregistration algorithms, since at present they are definitely outperformed by the intrinsic coregistration of simultaneous PET/MR, which overcomes the above-named problems of retrospective coregistration, as hypothesized in previous works [9, 10]. In this direction, simultaneous PET/MR, which offers unique datasets when acquired together with PET/CT during the same session, could also serve as ground truth for the validation of improved coregistration algorithms.

Conflicts of Interest

The authors declare that there is no conflict of interest regarding the publication of this paper.


Acknowledgments

This work was partially supported by grant RRC-2015-2360454 of the Italian Ministry of Health and by the Italian project PON03PE_00128_1 “eHealthNet: Software ecosystem for Electronic Health”.


References

1. M. Aiello, S. Monti, M. Inglese et al., “A multi-modal fusion scheme for the enhancement of PET/MR viewing,” EJNMMI Physics, vol. 2, 2015.
2. S. Monti, S. Cocozza, P. Borrelli et al., “MAVEN: an algorithm for multi-parametric automated segmentation of brain veins from gradient echo acquisitions,” IEEE Transactions on Medical Imaging, vol. 36, no. 5, pp. 1054–1065, 2017.
3. S. Monti, G. Palma, P. Borrelli et al., “A multiparametric and multiscale approach to automated segmentation of brain veins,” in 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), pp. 3041–3044, Milan, 2015.
4. H. Zaidi and A. D. Guerra, “An outlook on future design of hybrid PET/MRI systems,” Medical Physics, vol. 38, no. 10, pp. 5667–5689, 2011.
5. G. Delso, S. Fürst, B. Jakoby et al., “Performance measurements of the Siemens mMR integrated whole-body PET/MR scanner,” Journal of Nuclear Medicine, vol. 52, no. 12, pp. 1914–1922, 2011.
6. G. Antoch and A. Bockisch, “Combined PET/MRI: a new dimension in whole-body oncology imaging?” European Journal of Nuclear Medicine and Molecular Imaging, vol. 36, Supplement 1, pp. S113–S120, 2009.
7. P. Veit-Haibach, F. P. Kuhn, F. Wiesinger, G. Delso, and G. von Schulthess, “PET–MR imaging using a tri-modality PET/CT–MR system with a dedicated shuttle in clinical routine,” Magnetic Resonance Materials in Physics, Biology and Medicine, vol. 26, no. 1, pp. 25–35, 2013.
8. C. Buchbender, T. A. Heusner, T. C. Lauenstein, A. Bockisch, and G. Antoch, “Oncologic PET/MRI, part 1: tumors of the brain, head and neck, chest, abdomen, and pelvis,” Journal of Nuclear Medicine, vol. 53, no. 6, pp. 928–938, 2012.
9. T. Kanda, K. Kitajima, Y. Suenaga et al., “Value of retrospective image fusion of 18F-FDG PET and MRI for preoperative staging of head and neck cancer: comparison with PET/CT and contrast-enhanced neck MRI,” European Journal of Radiology, vol. 82, no. 11, pp. 2005–2010, 2013.
10. D. J. Loeffelbein, M. Souvatzoglou, V. Wankerl et al., “PET-MRI fusion in head-and-neck oncology: current status and implications for hybrid PET/MRI,” Journal of Oral and Maxillofacial Surgery, vol. 70, no. 2, pp. 473–483, 2012.
11. D. Loeffelbein, M. Souvatzoglou, V. Wankerl et al., “Diagnostic value of retrospective PET-MRI fusion in head-and-neck cancer,” BMC Cancer, vol. 14, no. 1, p. 846, 2014.
12. S. Partovi, A. Kohan, C. Rubbert et al., “Clinical oncologic applications of PET/MRI: a new horizon,” American Journal of Nuclear Medicine and Molecular Imaging, vol. 4, no. 2, p. 202, 2014.
13. S. Leibfarth, D. Mönnich, S. Welz et al., “A strategy for multimodal deformable image registration to integrate PET/MR into radiotherapy treatment planning,” Acta Oncologica, vol. 52, no. 7, pp. 1353–1359, 2013.
14. R. L. Bridges, “Software fusion: an option never fully explored,” Journal of Nuclear Medicine, vol. 50, no. 5, pp. 834–836, 2009.
15. D. Rueckert, L. I. Sonoda, C. Hayes, D. L. Hill, M. O. Leach, and D. J. Hawkes, “Nonrigid registration using free-form deformations: application to breast MR images,” IEEE Transactions on Medical Imaging, vol. 18, no. 8, pp. 712–721, 1999.
16. F. Maes, A. Collignon, D. Vandermeulen, G. Marchal, and P. Suetens, “Multimodality image registration by maximization of mutual information,” IEEE Transactions on Medical Imaging, vol. 16, no. 2, pp. 187–198, 1997.
17. D. Mattes, D. R. Haynor, H. Vesselle, T. K. Lewellen, and W. Eubank, “PET-CT image registration in the chest using free-form deformations,” IEEE Transactions on Medical Imaging, vol. 22, no. 1, pp. 120–128, 2003.
18. V. Fortunati, R. F. Verhaart, F. Angeloni et al., “Feasibility of multimodal deformable registration for head and neck tumor treatment planning,” International Journal of Radiation Oncology, Biology, and Physics, vol. 90, no. 1, pp. 85–93, 2014.
19. M. Becker and H. Zaidi, “Imaging in head and neck squamous cell carcinoma: the potential role of PET/MRI,” The British Journal of Radiology, vol. 87, no. 1036, 2014.
20. S. Klein, M. Staring, K. Murphy, M. A. Viergever, and J. P. Pluim, “Elastix: a toolbox for intensity-based medical image registration,” IEEE Transactions on Medical Imaging, vol. 29, no. 1, pp. 196–205, 2010.
21. M. J. Gooding, C. L. Eccles, M. Fuss et al., “Assessing the quality of deformable CT-MR registration for the purpose of multimodal radiotherapy contouring,” International Journal of Radiation Oncology, Biology, and Physics, vol. 81, no. 2, pp. S812–S813, 2011.
22. S. Klein, U. A. van der Heide, I. M. Lips, M. van Vulpen, M. Staring, and J. P. Pluim, “Automatic segmentation of the prostate in 3D MR images by atlas matching using localized mutual information,” Medical Physics, vol. 35, no. 4, pp. 1407–1417, 2008.
23. S. Klein, J. P. Pluim, M. Staring, and M. A. Viergever, “Adaptive stochastic gradient descent optimisation for image registration,” International Journal of Computer Vision, vol. 81, no. 3, pp. 227–239, 2009.
24. W. M. Wells III, P. Viola, H. Atsumi, S. Nakajima, and R. Kikinis, “Multi-modal volume registration by maximization of mutual information,” Medical Image Analysis, vol. 1, no. 1, pp. 35–51, 1996.
25. N. Toussaint, J.-C. Souplet, and P. Fillard, “MedINRIA: medical image navigation and research tool by INRIA,” in Proceedings of MICCAI Workshop on Interaction in Medical Image Analysis and Visualization, 2007.
26. P. A. Yushkevich, J. Piven, H. C. Hazlett et al., “User-guided 3D active contour segmentation of anatomical structures: significantly improved efficiency and reliability,” NeuroImage, vol. 31, no. 3, pp. 1116–1128, 2006.
27. L. Pace, E. Nicolai, A. Luongo et al., “Comparison of whole-body PET/CT and PET/MRI in breast cancer patients: lesion detection and quantitation of 18F-deoxyglucose uptake in lesions and in normal organ tissues,” European Journal of Radiology, vol. 83, no. 2, pp. 289–296, 2014.