Biomedical Paper

Real-time MRI-TRUS fusion for guidance of targeted prostate biopsies

Pages 255-264 | Received 04 Feb 2008, Accepted 03 Jun 2008, Published online: 06 Jan 2010

Abstract

Targeted prostate biopsy is challenging because no currently established imaging modality is both accurate for prostate cancer diagnosis and cost-effective for real-time procedure guidance. A system that fuses real-time transrectal ultrasound images with previously acquired endorectal coil MRI images for prostate biopsy guidance is presented here. The system uses electromagnetic tracking and intraoperative image registration to superimpose the MRI data on the ultrasound image. Prostate motion is tracked and compensated for without the need for fiducial markers. The accuracy of the system in phantom studies was shown to be 2.4 ± 1.2 mm. The fusion system has been used in more than 20 patients to guide biopsies with almost no modification of the conventional protocol. Retrospective clinical evaluation suggests that clinically acceptable spatial accuracy can be achieved.

Introduction

Prostate cancer is the most common non-skin cancer in American men and the second leading cause of cancer death among them Citation[1]. One in six men is affected by the disease during his lifetime. Transrectal ultrasound (TRUS)-guided needle biopsy is currently the standard technique for obtaining systematic histological specimens of the prostate due to its real-time nature, low cost, and simplicity Citation[2]. However, the use of TRUS to detect prostate cancer is limited by its relatively poor image quality and the low intrinsic contrast between tumor and non-tumor on ultrasound. In conventional gray-scale TRUS-guided biopsy procedures, the prostate is divided into six or more zones of equal volume, and a biopsy is obtained from each zone in a systematic but inherently undirected fashion, with reported false-negative rates of up to 30% Citation[3], Citation[4].

Prostate imaging limitations and potential alternatives to TRUS have been studied extensively and continue to be an active area of research. Magnetic resonance imaging (MRI) is among the most promising modalities for visualizing the prostate anatomy and focal lesions that are suspicious for prostate cancer. However, MRI is costly and typically not a real-time modality. Furthermore, the magnetic environment increases the complexity of interventional procedures, making the use of MRI for routine biopsy guidance problematic.

One solution to this problem is to fuse previously acquired MRI data with the real-time TRUS, thus exploiting the advantages of each modality Citation[5]. With the two complementary modalities combined, the needle can be guided under ultrasound into the suspicious regions identified on the MRI. Previous reports have described systems for fusing pre-interventional images (e.g., CT or MRI) with real-time ultrasound images Citation[6], Citation[7]. In these systems, the ultrasound probe is tracked by an exterior localizer that assigns a global coordinate system to the ultrasound images. The registration between the MRI image and the localizer can be obtained using skin-mounted fiducial markers before the surgery. After both the MRI and ultrasound are registered to the localizer, the multi-planar reconstruction (MPR) of the pre-interventional image can be computed and overlaid on the 2D ultrasound image in real time. While these systems can work well on organs such as the liver or lungs, they cannot be applied directly to the prostate without the use of fiducial markers, which are difficult to justify for a biopsy procedure alone. In our earlier work Citation[8], gold seeds were implanted into the prostate for MRI/TRUS registration prior to radiation therapy. This approach was abandoned because very few seeds could be identified in both MRI and TRUS images. In addition, the prostate often moves during the ultrasound procedure, and passive fiducials cannot provide feedback to allow compensation for this motion. Since the prostate is a small organ, its motion can easily result in a loss of accuracy in the MRI/TRUS fusion display, leading to inaccurate needle insertion when the fused display is used for image guidance. Therefore, new registration methods must be developed for image guidance in targeted prostate biopsies.

In this paper, an interventional navigation system is presented for targeted prostate biopsy Citation[9]. The system has been routinely used in patient studies under an IRB-approved research protocol. A hybrid registration approach is proposed for real-time MRI/TRUS image fusion, bringing the diagnostic information from the MRI to ultrasound procedures. The approach is based on both spatial tracking and intraoperative image registration, which allows compensation for prostate motion without the use of fiducial markers.

Materials and methods

An image registration and fusion system (Figure 1) was developed, consisting of a custom visualization and registration workstation with two 3.7-GHz Dual Core Intel® Xeon® CPUs interfacing with the Aurora electromagnetic tracking system (Northern Digital, Inc., Waterloo, Ontario, Canada). Endorectal biopsy guides (CIVCO, Kalona, IA) were equipped with electromagnetic tracking sensors (Traxtal Inc., Toronto, Ontario, Canada) compatible with the Aurora system, and were attached to a front-fire endorectal ultrasound probe (Philips C9-5ec). Real-time transrectal ultrasound images (iU22, Philips Medical Systems, Andover, MA) were transferred to the workstation using video frame-grabbing, enabling spatial localization of each ultrasound frame with the tracking system.

Figure 1. Navigation system for targeted prostate biopsy. (a) System components: localizer (L), tracked ultrasound probe (US) and prostate phantom (P). (b) System setup in patient study. [Color version available online.]


Clinical workflow

The diagnostic images of the prostate are first acquired with a 3-Tesla MRI scanner (Achieva, Philips Medical Systems). The MRI images can be obtained days, weeks, or even months before the biopsy. An endorectal coil is used to improve the MRI image quality and to approximate the deformation that the ultrasound probe later exerts through the rectal wall, although the degree of deformation is not identical. Suspicious lesions are identified on the T2-weighted, proton spectroscopy, diffusion-weighted and/or dynamic contrast-enhanced images. The MRI images are transferred to the workstation. The patient is then positioned on an examination table and the 2D TRUS probe with tracking sensors is placed in the rectum. At the beginning of the TRUS procedure, the operator performs a 2D axial sweep from the prostate's base to its apex such that the series of 2D ultrasound images covers the entire volume of the prostate. The images and corresponding data from the tracking sensors are transferred to the workstation in real time, and a volumetric ultrasound image is immediately reconstructed from them on the workstation Citation[10]. The MRI images and the ultrasound volume are then spatially aligned with each other Citation[8]. During the needle insertion, the physician manually holds the 2D probe to scan the prostate. Spatial tracking of the ultrasound probe, together with the registration of the MRI coordinate system to the tracking coordinate system, enables real-time fusion of the live ultrasound image with the spatially corresponding multi-planar reconstruction (MPR) from the MRI scan. The suspicious lesions identified on the MRI are automatically color-coded in the real-time ultrasound display, even if the corresponding ultrasound texture is iso-echoic. If prostate motion results in misalignment between the TRUS and MRI images, image-based registration is carried out to recover the correct MRI/TRUS fusion. The biopsy needle is deployed when the needle's guideline is aligned with the color-coded target on the ultrasound display.

MRI/TRUS fusion

MRI/TRUS fusion requires real-time image registration between the two datasets. This registration is straightforward if the target organ is a fixed rigid body:

$$T_{US \rightarrow MRI} = T_{localizer \rightarrow MRI} \cdot T_{US_{sensor} \rightarrow localizer} \cdot T_{US \rightarrow US_{sensor}} \qquad (1)$$

where $T_{US \rightarrow US_{sensor}}$ is the transformation from the ultrasound image space to the local coordinate system of the tracking sensors attached to the probe, as determined by ultrasound calibration; $T_{US_{sensor} \rightarrow localizer}$ is the real-time tracking data of the ultrasound probe; and $T_{localizer \rightarrow MRI}$ is determined by the registration between the patient and the localizer.
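For illustration, the chain in Equation 1 can be realized as a product of 4×4 homogeneous matrices. The following sketch uses identity placeholders standing in for the calibration, tracking, and registration matrices (all hypothetical values) to show how a pixel position in the live ultrasound frame would be mapped into MRI space:

```python
import numpy as np

def compose(*transforms: np.ndarray) -> np.ndarray:
    """Compose 4x4 homogeneous transforms; the rightmost is applied first."""
    out = np.eye(4)
    for T in transforms:
        out = out @ T
    return out

# Identity placeholders standing in for the three terms of Equation 1.
T_us_to_sensor = np.eye(4)         # fixed ultrasound calibration
T_sensor_to_localizer = np.eye(4)  # live pose from the EM tracker
T_localizer_to_mri = np.eye(4)     # patient-to-localizer registration

T_us_to_mri = compose(T_localizer_to_mri, T_sensor_to_localizer, T_us_to_sensor)

# Map a homogeneous pixel position from the live frame into MRI space.
p_us = np.array([10.0, 20.0, 0.0, 1.0])
p_mri = T_us_to_mri @ p_us
```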

Unfortunately, there is considerable prostate motion in the pelvic cavity, for several reasons: first, the patient may move involuntarily due to pain or pressure related to the needle insertion; second, the transrectal ultrasound probe can itself move and distort the prostate; and finally, the patient's respiratory motion may shift the prostate Citation[11]. To maintain correct image fusion, real-time prostate motion information must therefore be obtained. Skin fiducials are of little use for tracking this motion, so the only feedback available during the intervention is the 2D ultrasound itself, and image-based registration must be used. Among the technical challenges of image-based registration in this context are the following:

  1. It is a cross-modality image registration;

  2. The TRUS image is 2D while the MRI image is 3D; and

  3. The registration needs to be conducted in real time during the intervention.

In our system, a three-stage process is used for image registration so that these problems can be confronted separately. The first stage is carried out immediately before the biopsy procedure, with a 3D ultrasound image being acquired and then manually registered to the 3D MRI image ($T_{3DUS \rightarrow MRI}$ in Equation 2). In the second stage, intermittent registrations are performed between nearly real-time 2D ultrasound images and the 3D ultrasound image during the intervention ($T_{2.5DUS \rightarrow 3DUS}$ in Equation 2). Finally, the real-time 2D ultrasound image is registered to the 3D ultrasound image based on the result of the second step ($T_{US \rightarrow 2.5DUS}$ in Equation 2). These three steps establish the transformation chain between the real-time ultrasound and the pre-interventional MRI, allowing the two images to be fused during the intervention:

$$T_{US \rightarrow MRI} = T_{3DUS \rightarrow MRI} \cdot T_{2.5DUS \rightarrow 3DUS} \cdot T_{US \rightarrow 2.5DUS} \qquad (2)$$

Notably, the localizer (or electromagnetic tracker) plays no role in Equation 2, suggesting that, in theory, the image fusion could be achieved without spatial tracking. However, the ultrasound transducer is held manually in an arbitrary position and orientation, and image registration between the reference ultrasound volume and the real-time images can be extremely difficult if the spatial relationship between them is completely unknown. The tracking system thus provides a good starting point for the image registration. In addition, Equation 2 requires the image registration to be conducted in real time, which is very computationally expensive with current computer technology; spatial tracking allows the registration to be carried out only when significant prostate motion occurs. Since every term in Equation 2 comes from image-based registration, the final accuracy of the image fusion is determined only by the image-based registration. This leads to a significant feature of the system: the tracking error of the localizer (e.g., due to metallic distortion) and the calibration error of the probe are automatically corrected.
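As a rough illustration of how tracking could gate the expensive image-based registration, the sketch below compares the current tracked probe pose with the pose recorded at the last registration and flags large motion. The thresholds are hypothetical, and in the system described here re-registration is actually triggered manually when misalignment is observed; this is only one way such gating might be automated:

```python
import numpy as np

def pose_delta(T_a: np.ndarray, T_b: np.ndarray) -> tuple[float, float]:
    """Translation (mm) and rotation (degrees) between two 4x4 poses."""
    D = np.linalg.inv(T_a) @ T_b
    trans = float(np.linalg.norm(D[:3, 3]))
    c = np.clip((np.trace(D[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    return trans, float(np.degrees(np.arccos(c)))

# Hypothetical motion thresholds; not taken from the paper.
TRANS_MM, ROT_DEG = 3.0, 5.0

def needs_reregistration(T_last_reg: np.ndarray, T_now: np.ndarray) -> bool:
    """Flag significant probe/prostate motion since the last registration."""
    trans, rot = pose_delta(T_last_reg, T_now)
    return trans > TRANS_MM or rot > ROT_DEG
```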

Manual MRI/TRUS registration

A 3D ultrasound volume is obtained by sweeping the 2D TRUS probe across the prostate. Since the probe is spatially tracked, the reconstructed ultrasound volume is inherently registered to the tracking space. As an improvement over our earlier work Citation[8], the system automatically creates a rough alignment between the MRI and ultrasound datasets based on the orientation of the rectum and the segmentation of the prostate in the MRI image. This rough alignment is used as a starting point for manual refinement.

Currently, there is no fully automatic algorithm that is sufficiently robust for MRI/TRUS image registration of the prostate. Therefore, manual registration is used here as a practical solution, and is currently considered the most reliable method. In our protocol, the registration is conducted at the beginning of the ultrasound procedure, so the physician can use the time during the registration to examine the patient sonographically.

The manual registration sets up a baseline alignment for the MRI and the tracked ultrasound, allowing the initial position of the prostate to be located in the tracking space. However, the prostate may move after the manual registration. Image-based registrations are therefore needed to correct for motion during the intervention, where the ultrasound volume can be used as a reference for motion compensation.

Intermittent correction of rigid motion

Motion compensation is based on an image registration between the reference 3D ultrasound volume and the intraoperative 2D ultrasound images, which determines the prostate's motion relative to the reference ultrasound volume. It is initially assumed that the prostate does not move far from the position at which it was scanned during the 2D sweep. Therefore, the transformation between the current ultrasound image and the reference ultrasound volume can be estimated as

$$\hat{T}_{US \rightarrow 3DUS} = T^{-1}_{3DUS \rightarrow localizer} \cdot T_{US_{sensor} \rightarrow localizer} \cdot T_{US \rightarrow US_{sensor}} \qquad (3)$$

where $T_{3DUS \rightarrow localizer}$ is determined during the 3D reconstruction of the tracked 2D sweep.

The image registration can take this estimate as a starting point and perform numerical optimization. The registration is triggered manually when misalignment between the MRI and TRUS is observed. Equation 3 shows that the starting point can be determined solely from the tracking system. After the first image registration, every subsequent registration can use the result of the previous registration as a starting point. The tracking-based starting point also helps to recover the registration when the image-based initialization fails due to a lack of texture information in the real-time ultrasound images.

The image registration algorithm is based on minimizing the sum of squared differences (SSD) between the current ultrasound image and the reference ultrasound volume. In theory, SSD-based algorithms are most effective for registering images whose noise is additive and Gaussian. Although the raw ultrasound data is corrupted by multiplicative noise with a Rayleigh distribution, the logarithm of the raw data is taken for display, which transforms the multiplicative Rayleigh-distributed noise into additive, approximately Gaussian noise. Therefore, SSD-based algorithms can be applied at the end of the visualization pipeline to register ultrasound images. In terms of performance, the mathematical formulation of SSD allows the objective function to be optimized using the standard Gauss-Newton algorithm, making the registration very efficient for interventional use. The optimization stops when a local minimum is found or the maximum number of iterations is reached.
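The two ideas in this paragraph, log compression turning multiplicative speckle into additive noise and Gauss-Newton minimization of the SSD residuals, can be sketched in a few lines of Python. This is a 1D toy illustration under assumed signals and noise scales, not the system's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Log compression: multiplicative Rayleigh speckle becomes additive noise,
# since log(signal * speckle) = log(signal) + log(speckle).
signal = np.linspace(1.0, 10.0, 1000)
raw = signal * rng.rayleigh(scale=1.0, size=signal.shape)
log_img = np.log(raw)

def gauss_newton_shift(fixed: np.ndarray, moving: np.ndarray,
                       x: np.ndarray, n_iter: int = 10) -> float:
    """Estimate a 1D shift t minimizing the SSD between fixed(x) and
    moving(x + t). Gauss-Newton linearizes the residuals
    r(t) = moving(x + t) - fixed(x); their Jacobian with respect to t
    is the gradient of the resampled moving signal."""
    t = 0.0
    for _ in range(n_iter):
        warped = np.interp(x + t, x, moving)  # resample at shifted grid
        r = warped - fixed                    # SSD residuals
        g = np.gradient(warped, x)            # d(residual)/dt
        t -= float(g @ r) / max(float(g @ g), 1e-12)  # Gauss-Newton step
    return t

x = np.linspace(0.0, 10.0, 1000)
fixed = np.exp(-(x - 5.0) ** 2)
moving = np.exp(-(x - 5.7) ** 2)             # same profile, displaced
print(gauss_newton_shift(fixed, moving, x))  # converges to about 0.7
```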

Since the spatial tracking of the ultrasound probe assigns a physical location in space to each image pixel, the 2D image is actually a single-slice 3D image, allowing volume-to-volume registration to be conducted. However, the registration of a single-slice volume is very sensitive to noise: there are many local minima along the off-plane direction, which degrades the algorithm's performance. It is therefore helpful to use multiple image frames for the registration. In the example illustrated in Figure 2, four ultrasound image frames are registered together to the reference ultrasound volume. These four frames are selected from a series of image frames taken over a short time period (e.g., 2 seconds). Since the probe is held manually, the probability of it being absolutely static is almost zero, and the operator's motion helps to cover more prostate tissue in the off-plane direction. Using the probe's tracking information, the two frames with the largest translational separation (a and b in Figure 2) and the two frames with the largest separation in orientation (c and d) are selected from the image series. The registration between these frames and the reference ultrasound volume can be categorized as 2.5D to 3D registration. The objective function is given by

$$f(\mu) = \sum_{k=1}^{N} \sum_{x \in I_k} \big[ I_k(x) - V(T_k(x; \mu)) \big]^2 \qquad (4)$$

where N is the number of frames used in the registration, $I_k$ is the kth 2D frame, V is the reference ultrasound volume, and $T_k$ is a transformation model between $I_k$ and V with a parameter vector µ. Since the 2.5D image acquisition is fast, the relative prostate motion between the selected ultrasound frames is negligible. A rigid-body transformation is used in our current implementation to model the prostate's motion relative to the reference ultrasound volume; µ is therefore a vector of six components, comprising three translational and three rotational parameters.
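Assuming the tracked poses are available as a list of 4×4 matrices collected over the roughly 2-second window, the frame selection described above could be sketched as follows (the helper names are hypothetical):

```python
import numpy as np
from itertools import combinations

def select_frames(poses: list[np.ndarray]) -> list[int]:
    """Pick frames from a short tracked series for 2.5D/3D registration:
    the pair with the largest translational separation plus the pair
    with the largest rotational separation (four frames in general;
    fewer if the two pairs happen to coincide)."""
    def trans_sep(i: int, j: int) -> float:
        return float(np.linalg.norm(poses[i][:3, 3] - poses[j][:3, 3]))

    def rot_sep(i: int, j: int) -> float:
        # Relative rotation angle between the two probe orientations.
        R = poses[i][:3, :3].T @ poses[j][:3, :3]
        c = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
        return float(np.arccos(c))

    pairs = list(combinations(range(len(poses)), 2))
    a, b = max(pairs, key=lambda p: trans_sep(*p))
    c, d = max(pairs, key=lambda p: rot_sep(*p))
    return sorted({a, b, c, d})
```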

Figure 2. Selected image frames for 2.5D to 3D registration. [Color version available online.]


During the ultrasound procedure, the operator can apply an intermittent re-registration when misalignment between the MRI and TRUS images occurs. The registration can be performed incrementally to account for large motion. The position of the reference ultrasound volume is then updated to reflect the prostate's motion in the tracking space:

$$T^{new}_{3DUS \rightarrow localizer} = T_{US_{sensor} \rightarrow localizer} \cdot T_{US \rightarrow US_{sensor}} \cdot T_{3DUS \rightarrow US} \qquad (5)$$

where $T_{3DUS \rightarrow US}$ is the transformation between the reference ultrasound volume and the real-time TRUS image, which is approximated by $T_{3DUS \rightarrow 2.5DUS}$ to improve the robustness of the registration.

Continuous correction of in-plane motion

The images used for intermittent motion correction are acquired at different time points, and the prostate may be moving while being scanned. The 2.5D/3D registration therefore yields the average motion of the selected 2D frames rather than the exact motion of the current ultrasound frame. In addition, the prostate is often displaced by pressure from the ultrasound probe or the needle; when this motion is fast, the 2.5D/3D registration cannot effectively compensate for it in real time.

To address these problems, a continuous in-plane registration is implemented. The algorithm is based on the observation that the probe's pressure on the rectum causes mainly in-plane prostate motion, which in-plane image registration can usually compensate for. The in-plane registration is performed between a source-target 2D image pair: the target image is the current ultrasound frame, and the source image is the corresponding image of the current frame, reconstructed from the reference ultrasound volume using the previous 2.5D/3D registration result. To enhance the registration's accuracy and robustness, an attribute vector Citation[12–14] is used to evaluate the registration between the real-time ultrasound image and the reference ultrasound volume. The attribute vector contains two elements: the intensity of the pixel and the magnitude of the image gradient at that pixel. The similarity measure is defined by

$$E(I, J) = CC(I, J) + CC\left(\lVert \nabla I \rVert, \lVert \nabla J \rVert\right) \qquad (6)$$

where J is the source image, I is the target image, ∇ is the 2D image gradient operator, and CC represents the correlation coefficient. Equation 6 is optimized with the Powell algorithm.

The image gradients in Equation 6 are computed once at the beginning of the registration. The gradient-related term makes the similarity measure more discriminative at each pixel, helping to avoid local minima. This similarity measure cannot be used in the 2.5D/3D registration because it would require the partial gradients of the ultrasound volume to be computed at each iteration of the optimization, which would significantly reduce speed. The result of the 2.5D/3D registration is used to initialize the first continuous motion correction; after that, the result of the previous correction is used to initialize the current one.
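A minimal sketch of this in-plane correction, assuming pure translation for brevity (the system optimizes rigid in-plane motion, i.e., translation plus rotation) and using SciPy's Powell optimizer, might look like this:

```python
import numpy as np
from scipy.ndimage import shift as nd_shift
from scipy.optimize import minimize

def cc(a: np.ndarray, b: np.ndarray) -> float:
    """Correlation coefficient between two equally shaped images."""
    a = a.ravel() - a.mean()
    b = b.ravel() - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def grad_mag(img: np.ndarray) -> np.ndarray:
    """Magnitude of the 2D image gradient."""
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy)

def register_inplane(source: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Maximize the Equation 6 similarity over an in-plane translation.
    Gradients are computed once up front, as noted in the text; under a
    rigid in-plane motion the gradient magnitude simply moves with the
    image, so the warped gradient is a shifted copy."""
    source_grad = grad_mag(source)
    target_grad = grad_mag(target)

    def neg_similarity(params: np.ndarray) -> float:
        moved = nd_shift(source, shift=params, order=1)
        moved_grad = nd_shift(source_grad, shift=params, order=1)
        return -(cc(moved, target) + cc(moved_grad, target_grad))

    res = minimize(neg_similarity, x0=np.zeros(2), method="Powell")
    return res.x  # estimated (row, column) shift in pixels
```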

The continuous motion correction is optional in our workflow. The operator can either activate the continuous motion correction or trigger the 2.5D/3D registration manually when misalignment between MRI and ultrasound is observed. The continuous correction helps to maintain good MRI/TRUS fusion when a needle is being inserted manually. When a biopsy gun is used, the continuous motion correction is not necessary, since the insertion procedure is very fast and causes little prostate motion.

Experiments and results

Both phantom and patient studies were carried out at the National Institutes of Health Clinical Center to evaluate the system's performance. The 2D sweep at the beginning of the procedure took approximately 10 seconds to scan the prostate from base to apex. The reconstruction of the reference ultrasound volume took approximately 15 seconds using a speed-enhanced algorithm and parallel computing Citation[15]. An experienced operator then registered the MRI and ultrasound volumes to each other in 1–2 minutes based on pre-segmented MRI images. The intermittent 2.5D/3D registration algorithm took approximately 4 seconds to finish each time, and the in-plane motion correction took approximately 350 milliseconds. In patient studies, the average biopsy time for a new target from needle alignment to specimen sampling was 101 ± 68 seconds. The time for repeated biopsies on old targets was 62 ± 36 seconds.

Phantom studies

The accuracy of the system was validated using CIRS prostate biopsy phantoms (CIRS, Norfolk, VA).

Accuracy of motion compensation

A 6-DOF reference tracker was attached to a CIRS phantom. The global coordinate system was fixed on the phantom tracker and dynamic reference tracking Citation[16] was used; the prostate was therefore fixed relative to the reference tracker. After the ultrasound volume was reconstructed, intraoperative scans and 2.5D/3D registrations were carried out to measure the prostate's position. An artificial error of 5–15 mm, uniformly distributed in 3D space, was introduced at the starting point of the registration. As shown in the example in Figure 3, the registration starting point (Figure 3c) was significantly different from the intraoperative image (Figure 3a). Since the prostate was static in the reference coordinate system, a correct registration should recover the initial position. Figure 3b shows the corresponding image of Figure 3a in the reference ultrasound volume after registration. The registration error of each voxel was defined as the distance from the recovered position to its original position. A total of 20 measurements were taken in the experiment, resulting in an error of 2.3 mm with a variance of 0.9 mm.

Figure 3. Example of image-based motion compensation. (a) Real-time ultrasound. (b) Registration result of (a) in the reconstructed reference ultrasound volume. (c) Initial starting point of the registration.


Overall system accuracy

Another CIRS phantom containing three synthetic tumors was used to validate the system's overall accuracy. The tumors were marked on the MRI that was used to guide the needle insertions. The phantom was moved arbitrarily before each needle placement to simulate prostate motion. Once the needle was in place, the phantom was scanned with a CT scanner (see Figure 4). The accuracy of each needle placement was defined as the distance between the tumor center and the needle track. This distance was measured in the CT image, resulting in an error of 2.4 ± 1.2 mm, with a maximum error of 4.8 mm. The needle track rather than the needle tip was chosen as a reference because the insertion depth is controlled by the physician in biopsy procedures and is not monitored by the system. In addition, each biopsy obtains a linear core of tissue 10–20 mm in length, so accuracy along the needle axis is not critical.
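The reported error is a point-to-line measurement. A small sketch with hypothetical coordinates shows how such a distance could be computed from the tumor center and two points marked along the needle track in the CT image:

```python
import numpy as np

def point_to_line_distance(p: np.ndarray, a: np.ndarray, b: np.ndarray) -> float:
    """Perpendicular distance from point p to the line through a and b."""
    d = b - a
    return float(np.linalg.norm(np.cross(p - a, d)) / np.linalg.norm(d))

# Hypothetical coordinates (mm) read from the CT image: the tumor center
# and two points along the visible needle track.
tumor = np.array([31.0, 40.5, 22.0])
track_p1 = np.array([33.0, 41.0, 25.0])
track_p2 = np.array([38.0, 44.0, 60.0])
print(f"placement error: {point_to_line_distance(tumor, track_p1, track_p2):.1f} mm")
```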

Figure 4. CT image example showing needle tip in tumor. (a) XY section. (b) ZY section. (c) XZ section. [Color version available online.]


Patient studies

The system was evaluated in patient studies from three perspectives. Since no ground truth could be obtained in patient studies, visual feedback and the judgment of experienced physicians were used to evaluate the system's performance.

Capture range of 2.5D/3D registration

The effectiveness of the 2.5D/3D registration was tested first. Figure 5 shows the objective function near the global minimum with respect to two translation parameters, giving an indication of the smoothness of the objective function and the likely capture range.

Figure 5. 2D plots of the objective function near the global minimum with respect to two translation parameters. (a) is the result of registering one image frame, while (b) is the result of registering four image frames. The grid unit is 1 mm. [Color version available online.]


Figure 5a is the result of registering one image frame, and Figure 5b the result of registering four image frames. The meshes are constructed from the residual error (i.e., the average intensity difference) of the registration. Using multiple image frames clearly produces a smoother objective function; the numerical optimization is therefore less likely to be trapped in local minima, making the registration more robust.

Evaluation of motion compensation

The system was used prospectively in patient studies to compensate for prostate motion. As shown in Figure 6, the MRI volume is transformed to the 2D ultrasound image space. The red contours are the intersections of the prostate surface with the ultrasound image; the segmentation was based on the MRI image and obtained before the ultrasound procedure. The initial image fusion, which resulted from the manual registration, is shown in Figure 6a. After significant prostate motion was observed in the image fusion (Figure 6b), the image-based motion compensation was executed (without using continuous image registration). As shown in Figure 6c, the alignment between the TRUS and MRI images was well recovered.

Figure 6. Motion compensation using 2.5D/3D registration. The red contours show the prostate segmentation in the MRI image. The 3D MRI volume is pre-registered to a 3D ultrasound volume that is not shown. Top row: RTUS overlaid on MRI. Bottom row: MRI images. (a) and (a′) are the initial registration without patient motion; (b) and (b′) are the deteriorated registration after patient motion; and (c) and (c′) are the registration after motion compensation. [Color version available online.]


The ultrasound image series and probe motion in the patient studies were recorded for retrospective analysis; a total of 20 patient studies were analyzed. At the time point of the motion compensation, one ultrasound image and two MRI images (one each from before and after motion compensation) were saved for each patient. The prostate was then segmented from these 2D images by two radiologists and a radiation oncologist. The prostate segmentations of the MRI images before and after the motion compensation were compared to the corresponding ultrasound segmentation, and the overlapping area of the MRI and ultrasound segmentations was calculated. The results were normalized by the size of the prostate in the 2D ultrasound image. The analysis shows that the overlap of the prostate in the MRI and ultrasound images was 71 ± 18% before motion compensation and 90 ± 7% after motion compensation. The difference is statistically significant based on Student's t-test (p < 0.05).
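The overlap metric described above could be computed as in the following sketch; the masks and their sizes are illustrative, not taken from the study data:

```python
import numpy as np

def normalized_overlap(mri_mask: np.ndarray, us_mask: np.ndarray) -> float:
    """Area of the MRI/ultrasound segmentation overlap, normalized by
    the prostate area in the 2D ultrasound image."""
    overlap = np.logical_and(mri_mask, us_mask).sum()
    return float(overlap) / float(us_mask.sum())

# Toy boolean masks standing in for the two prostate segmentations.
us = np.zeros((100, 100), dtype=bool);  us[20:80, 30:70] = True
mri = np.zeros((100, 100), dtype=bool); mri[25:85, 32:72] = True
print(f"overlap = {normalized_overlap(mri, us):.0%}")
```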

Real-time image guidance

Figure 7 shows two screenshots of fused MRI/TRUS image guidance in a targeted prostate biopsy. Figure 7a shows the MRI image registered to the color-coded reference ultrasound volume; the green contours are the segmentation of the prostate based on the MRI image. Figure 7b shows the live ultrasound and the corresponding MRI MPR views at the time of needle deployment. The white line is the biopsy guideline, which is dictated by the needle guide and is fixed relative to the ultrasound image. The suspicious regions were identified on the T2-weighted MRI image and transformed to the real-time ultrasound display. An 18G Easy Core biopsy gun (Boston Scientific, Natick, MA) was used to obtain the biopsy sample; the gun was fired after the biopsy guideline was aligned with the pre-selected target.

Figure 7. Screenshots of fused MRI/TRUS image guidance. (a) T2-weighted MRI fused with reference ultrasound volume (color map), along with the real-time ultrasound and target information. (b) Corresponding real-time ultrasound (top) and MRI MPR views (bottom) at the time of needle deployment. [Color version available online.]


Discussion and conclusions

We have described an interventional navigation system using MRI for targeting and TRUS for image guidance. The system is used for targeted prostate biopsy and can be extended to focal therapy. Real-time fusion of MRI and ultrasound images is possible despite the presence of prostate motion. In addition, the localizer's tracking error can be accounted for automatically using the image-based registration. Since only pre-interventional MRI images and 2D ultrasound scans are used, the system does not require costly MRI room time for interventional procedures, providing a cheaper and faster solution for MRI-guided prostate biopsy procedures which nonetheless takes advantage of the ability of MR to detect prostate cancers. The registration between the pre-interventional MRI and the ultrasound sweep is currently performed manually because this is the most reliable approach and the registration time seems to be clinically acceptable. It has been noted that the physician can use the time needed for registration to examine the patient. Therefore, there is very limited change to the conventional biopsy protocol. Ultimately, we intend to automate this process so as to require minimal operator intervention. It is worth noting that the initial alignment between the reference ultrasound volume and the MRI image dictates the accuracy of real-time MRI/TRUS fusion. The function of the motion compensation algorithms is to recover the initial alignment, and the system's overall accuracy is the combined error of the initial alignment and the motion correction.

While the system is already very promising for targeted prostate biopsy guidance, one limitation must be addressed. The current system only compensates for rigid motion of the prostate, but the ultrasound probe often introduces deformation at the posterior side of the prostate, where prostate cancer frequently occurs, and this deformation cannot be effectively simulated in the phantom studies. Since it is almost impossible to account for deformation manually, automatic algorithms should be explored to improve the system's performance. Currently, a front-fire ultrasound probe is used in the system, which generally introduces larger deformation of the prostate than side-fire probes. We are extending our system to a bi-plane probe (BP10-5ec, Philips Medical Systems, Andover, MA) that includes a side-fire axial plane. It can also be envisioned that the bi-plane probe will significantly benefit motion compensation, since more off-plane information will be provided.

Declaration of interest: The authors report no conflicts of interest. The authors alone are responsible for the content and writing of the paper.

References

  1. American Cancer Society. Statistics for 2007. http://www.cancer.org/docroot/STT/stt_0_2007.asp
  2. Fichtinger G, Krieger A, Susil RC, Tanacs A, Whitcomb LL, Atalar E. Transrectal prostate biopsy inside closed MRI scanner with remote actuation, under real-time image guidance. In: Dohi T, Kikinis R, editors. Proceedings of the 5th International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI 2002), Tokyo, Japan, September 2002. Part I. Lecture Notes in Computer Science 2488. Berlin: Springer; 2002. pp 91–98.
  3. Rabbani F, Stroumbakis N, Kava BR, Cookson MS, Fair WR. Incidence and clinical significance of false-negative sextant prostate biopsies. J Urol 1998;159:1247–1250.
  4. Guichard G, Larré S, Gallina A, Lazar A, Faucon H, Chemama S, Allory Y, Patard J, Vordos D, Hoznek A. Extended 21-sample needle biopsy protocol for diagnosis of prostate cancer in 1000 consecutive patients. Eur Urol 2007;52(2):430–435.
  5. Kaplan I, Oldenburg NE, Meskell P, Blake M, Church P, Holupka EJ. Real time MRI-ultrasound image guided stereotactic prostate biopsy. Magn Reson Imaging 2002;20(3):295–299.
  6. Schlaier JR, Warnat J, Dorenbeck U, Proescholdt M, Schebesch KM, Brawanski A. Image fusion of MR images and real-time ultrasonography: evaluation of fusion accuracy combining two commercial instruments, a neuronavigation system and an ultrasound system. Acta Neurochir (Wien) 2004;146(3):271–276; discussion 276–277.
  7. Krücker J, Xu S, Viswanathan A, Shen E, Glossop N, Wood BJ. Clinical evaluation of electromagnetic tracking for biopsy and radiofrequency ablation guidance. Int J Comput Assist Radiol Surg 2006;1:169–171.
  8. Krücker J, Xu S, Glossop N, Guion P, Choyke P, Singh A, Wood BJ. Fusion of real-time trans-rectal ultrasound with pre-acquired MRI for multi-modality prostate imaging. In: Cleary KR, Miga MI, editors. Proceedings of SPIE Medical Imaging 2007: Visualization, Image-Guided Procedures, and Display, San Diego, CA, February 2007. Proc SPIE 2007;6509:650912.
  9. Xu S, Kruecker J, Guion P, Glossop N, Neeman Z, Choyke P, Singh AK, Wood BJ. Closed-loop control in fused MR-TRUS image-guided prostate biopsy. In: Ayache N, Ourselin S, Maeder AJ, editors. Proceedings of the 10th International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI 2007), Brisbane, Australia, 29 October–2 November 2007. Part I. Lecture Notes in Computer Science 4791. Berlin: Springer; 2007. pp 128–135.
  10. Trobaugh JW, Trobaugh D, Richard WD. Three-dimensional imaging with stereotactic ultrasonography. Comput Med Imaging Graph 1994;18(5):315–323.
  11. Malone S, Crook JM, Kendal WS, Szanto J. Respiratory-induced prostate motion: quantification and characterization. Int J Radiat Oncol Biol Phys 2000;48:105–109.
  12. Shen D, Davatzikos C. HAMMER: Hierarchical attribute matching mechanism for elastic registration. IEEE Trans Med Imaging 2002;21(11):1421–1439.
  13. Foroughi P, Abolmaesumi P. Elastic registration of 3D ultrasound images. In: Duncan JS, Gerig G, editors. Proceedings of the 8th International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI 2005), Palm Springs, CA, October 2005. Part I. Lecture Notes in Computer Science 3749. Berlin: Springer; 2005. pp 83–90.
  14. Baumann M, Mozer P, Daanen V, Troccaz J. Towards 3D ultrasound image based soft tissue tracking: a transrectal ultrasound prostate image alignment system. In: Ayache N, Ourselin S, Maeder AJ, editors. Proceedings of the 10th International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI 2007), Brisbane, Australia, 29 October–2 November 2007. Part II. Lecture Notes in Computer Science 4792. Berlin: Springer; 2007. pp 26–33.
  15. Xu S, Kruecker J, Glossop N, Wood BJ. Speed enhanced construction of 3D free-hand ultrasound. In: Computer Assisted Radiology and Surgery: Proceedings of the 21st International Congress and Exhibition (CARS 2007), Berlin, Germany, June 2007. Int J Comput Assist Radiol Surg 2007;2(Suppl 1):S470.
  16. Glossop N, Hu R, Dix G, Behairy Y. Registration methods for percutaneous image guided spine surgery. In: Lemke HU, Vannier MW, Inamura K, Farman AG, editors. Computer Assisted Radiology and Surgery: Proceedings of the 13th International Congress and Exhibition (CARS 1999), Paris, France, June 1999. Amsterdam: Elsevier; 1999. pp 746–755.
