Biomedical Paper

Development of an image-guided robot for small animal research

Pages 357-365 | Received 30 Apr 2007, Accepted 18 Sep 2007, Published online: 06 Jan 2010

Abstract

We developed a robot system that can be used for image-guided experimental procedures on small animals, where the goal is to perform physical actions at specific positions identified on a preoperative image. The animal is first placed in a fixture that is compatible with all imaging systems of interest, including PET, SPECT, CT and MRI. After imaging, the fixture is attached and registered to the robot system, where the image-guided intervention is performed. This system has been applied to perform pO2 measurements with physical probes based on tumor hypoxia images obtained in an animal PET scanner. This paper focuses on the design and validation of the robot system. The validation is performed using a phantom and includes a new method for estimating the Fiducial Localization Error (FLE) that is based on the measured Fiducial Distance Error (FDE). The results indicate that the robot system can position the measurement probe at a defined target with a mean error that is less than 0.4 mm.

Introduction

Hypoxic (oxygen-deficient) cells exist in human cancers and are resistant to radiation treatment. Therefore, we and others have been developing non-invasive imaging methods for identifying hypoxic cells in tumors with the goal of improving treatment, e.g., by tailoring the radiation dosage directed at them. However, these non-invasive imaging methods need to be validated. For example, by physically measuring the tissue oxygen tension (pO2) level of the cells (using an Oxylite probe) Citation[1] and correlating these measurements with PET scan data, it is possible to verify the efficacy of PET scans in locating hypoxic cancer cells Citation[2]. We have previously developed a manual method, based on the use of a stereotaxic template, to correlate PET scan data and pO2 measurements for rodent tumors Citation3, 4. However, that method was time consuming, labor-intensive, and restrictive in terms of probe placement (because of the location and configuration of the template).

The goal of this project was to design and build a robot system that facilitates image-guided procedures such as measurement of pO2 level with physical probes. This paper describes this image-guided robotic system and presents phantom test results that quantify the accuracy of the imaging (microPET) and robot subsystems. Another image-guided robot system for needle placement in small animals was reported by Waspe et al. Citation[5]. They developed a remote center of motion (RCM) robot, with a novel method for calibrating the needle tip position using a high-resolution CCD camera. Their system appears to be focused on intraoperative image guidance (e.g., by ultrasound) and their results measure only the robot positioning accuracy.

System design: Hardware

The robot system consists of a mobile cart that houses the electronics, provides a tabletop for the four-axis robot and display monitor, and contains a pull-out drawer for the keyboard and mouse (see Figure 1).

Figure 1. Robot system. Inset: Close-up view of end-effector showing cannula (driven by Z1) and probe attached to probe holder (driven by Z2). [Color version available online.]


Robot axes

The robot is composed of a two-degree-of-freedom X-Y horizontal platform and two vertical slides (Z1, Z2). A horizontal aluminum arm is mounted on the first vertical slide (Z1) and provides an attachment for either a registration probe or a cannula. The second vertical slide (Z2) is attached to the first vertical slide and contains a probe holder. This allows the system to insert the Oxylite probe through the cannula and into the tumor. Note that in this case the Z1 axis positions the cannula near the skin surface and the Z2 axis drives the measurement probe to the target.

We used the BiSlide MN10-0100-E0X-61 (Velmex, Bloomfield, NY), coupled with 14904S011 DC motors (Pittman, Harleysville, PA) for the X, Y and Z1 axes. The slides provide 250 mm (10 inches) of travel and have a specified repeatability and accuracy of 0.005 mm and 0.076 mm, respectively. In our experiments using a dial indicator (Mitutoyo model #543-693B, with accuracy of ±0.003 mm), we measured the repeatability and accuracy of the assembled system as 0.007 mm and 0.048 mm, respectively Citation[6]. We note, however, that the higher measured accuracy is most likely due to the limited dial indicator travel (12.7 mm) and does not truly reflect the accuracy of the stage over its entire range of motion.

For the Z2 axis, we used the BUG 1-A.083AB-DCG94_5E5-4 linear actuator (Ultramotion, Cutchogue, NY), which has 100 mm (4 inches) of travel and a specified repeatability of 0.008 mm. Our dial indicator experiments measured repeatability and accuracy as 0.030 mm and 0.075 mm, respectively Citation[6]. Here, the measured repeatability is worse than expected and may have been affected by other factors, such as the control system.

Rodent bed and registration markers

The rodent bed (Figure 2) fits inside the limited bore of the small animal imaging scanners and mounts onto the robot X-Y platform. It includes markers (fiducials) for the registration between image and robot coordinates. The markers are mounted on an adjustable bridge so that they can be positioned over the target region (tumor) and within the scanner field of view. The bridge is removed after registration to enable access to the rodent. We initially used the Acustar® marker system Citation[7], donated by Z-Kat, Inc. (Hollywood, FL), for the CT, MRI and robot markers, and a separate set of support tubes (offset by a known amount) for the radioactive PET markers. We are currently using a simpler marker system that consists of four small hemispherical holes (3 mm in diameter) drilled into the adjustable bridge (see Figure 2). These hemispheres are filled with an appropriate contrast agent prior to imaging. After imaging, the contrast agent is removed and the holes are physically located by the robot. For this procedure, the cannula is replaced by a registration probe, which is guided to the markers using a force control mode Citation[8]. Force control is possible because the system contains a two-axis sensor (XY) beneath the rodent bed and a single-axis sensor (Z1) near the attachment mechanism for the registration probe and cannula.

Figure 2. Rodent bed with marker bridge. Markers (fiducials) have been outlined with black circles for better visibility. [Color version available online.]


Controller electronics

The robot controller consists of a rackmount computer connected via Ethernet to a DMC-2143 controller board and AMP-20540 power amplifier (Galil Motion Control, Rocklin, CA). The controller provides low-level servo control of the four robot motors. Application software on the PC sends position goals via Ethernet to the controller, which then moves each joint to its goal.
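The paper does not include controller interface code, but the flavor of this PC-to-controller link can be sketched. The hypothetical Python fragment below assumes the Galil ASCII command interface over TCP (commands such as PA for absolute position and BG for begin motion are part of the Galil command language); the IP address, port, axis ordering, and counts-per-mm scaling are illustrative placeholders, not values from the paper.

```python
import socket

# Hypothetical sketch of the PC-to-controller interface: Galil DMC boards
# accept short ASCII commands (e.g., PA = position absolute, BG = begin
# motion) over TCP. The address, port, axis order, and counts-per-mm
# scaling below are illustrative assumptions, not values from the paper.

COUNTS_PER_MM = 1000  # assumed encoder resolution

def send_cmd(sock, cmd):
    """Send one command and return the controller's ASCII reply."""
    sock.sendall((cmd + "\r").encode("ascii"))
    return sock.recv(256).decode("ascii").strip()

def move_to(sock, x_mm, y_mm, z1_mm, z2_mm):
    """Command an absolute four-axis move and begin motion."""
    counts = tuple(int(v * COUNTS_PER_MM) for v in (x_mm, y_mm, z1_mm, z2_mm))
    send_cmd(sock, "PA %d,%d,%d,%d" % counts)  # set position goals
    send_cmd(sock, "BG ABCD")                  # begin motion on all axes

with socket.create_connection(("192.168.1.10", 23)) as sock:  # assumed address
    move_to(sock, 10.0, 25.0, 5.0, 0.0)
```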

Force sensors

The power amplifier provides eight analog inputs, three of which are used for the interface to the force sensors. For the XY force sensor, we used the DX-480 from Bokam Engineering (Santa Ana, CA), with a custom amplifier board that fits inside a shielded metal case below the sensor body. The Z1 force sensor is an L2357 sensor manufactured by Futek, Inc. (Irvine, CA).

System design: Software

Application software

We developed the application software using 3D Slicer (www.slicer.org), which provides visualization, registration and segmentation capabilities. This software guides the user through most steps of the procedure, which are as follows:

  1. Place the anesthetized tumor-bearing rodent in the rodent bed.

  2. Place the rodent bed in the scanner and obtain image data. For microPET, two scans are required: one with the marker bridge (Figure 2) and one without. The former is used for registration; the latter has better image quality for target definition.

  3. Move the rodent bed to the robot system and load image data into the computer.

  4. Register the image coordinates to the robot coordinates:

    a. Manually guide the robot's registration probe into contact with each of the four markers.

    b. Use a semi-automatic image processing procedure to locate the corresponding image markers.

    c. Perform a registration to determine the transformation that aligns the two sets of marker positions (robot and image).

  5. Remove the registration probe from the Z1 axis and attach the cannula.

  6. Attach the measurement (Oxylite) probe to the Z2 axis and zero its position.

  7. Identify target regions (sets of vertical tracks) in the image (Figure 3).

  8. Transform the track points to robot coordinates and move the robot so that it positions the cannula at the entry point of the first probe track.

  9. Prompt the user to manually puncture the skin by inserting a needle through the cannula and then removing it.

  10. Command the robot to move the measurement probe through the cannula and into the tumor (Figure 4). The robot moves the probe vertically inside the tumor in user-defined increments (typically 0.5–1.0 mm), recording data at each position.

  11. When the measurement probe reaches the end of the current track, command the robot to retract the probe back inside the cannula, move it to the starting point of the next track, and repeat the above cannula insertion and probe measurement sequence until the entire grid pattern has been traversed (a simplified sketch of this measurement loop follows the list).
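As a concrete illustration of steps 7–11, a minimal sketch of the measurement loop is shown below. The robot and probe interface functions are hypothetical stubs standing in for the real motion-control and Oxylite I/O layers, not part of the actual system software.

```python
# Minimal sketch of the probe measurement loop (steps 7-11). The interface
# functions below are hypothetical stubs for the robot and Oxylite drivers.

def move_cannula_to(entry_xyz):  # X-Y-Z1: place cannula at the skin entry point
    pass

def insert_probe_to(depth_mm):   # Z2: advance the probe to this depth
    pass

def retract_probe():             # Z2: pull the probe back inside the cannula
    pass

def read_oxylite():              # sample pO2 from the Oxylite probe
    return 0.0

def run_tracks(tracks, step_mm=0.5):
    """Traverse each vertical track, recording pO2 at fixed increments.

    tracks: list of (entry_xyz, depth_mm) pairs in robot coordinates.
    """
    results = []
    for entry, depth in tracks:
        move_cannula_to(entry)
        input("Puncture skin through cannula, remove needle, then press Enter")
        depth_now = 0.0
        while depth_now <= depth:
            insert_probe_to(depth_now)                      # step down the track
            results.append((entry, depth_now, read_oxylite()))
            depth_now += step_mm
        retract_probe()                                     # ready for next track
    return results
```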

Figure 3. 3D Slicer screen showing defined measurement track (dark blue cylinder) and tumor (bright white object) in PET image. [Color version available online.]


Figure 4. Robotic insertion of measurement probe into tumor. The X-Y-Z1 axes position the cannula at the skin surface, and the Z2 axis drives the probe (the thin line) into the tumor. [Color version available online.]


Force control algorithm

Force-controlled guidance is typically implemented as an admittance controller, which uses an admittance gain, G, to convert a measured force, F, into a desired joint velocity: V = G * F. As noted previously Citation[8], this linear relationship between force and velocity has shortcomings; in particular, a low admittance gain is necessary for fine motion control or to preserve stability when contacting stiff objects, but it results in sluggish performance for gross positioning tasks. One solution is to allow the user to select a different admittance gain for each situation, but this additional interaction can be a nuisance. The solution presented in reference 8, and adopted here, is to use a nonlinear gain to provide fine positioning without sacrificing maximum motion speed, i.e., V = G(F) * F. The force control program is written in an interpreted language that is downloaded to the controller. A simplified representation of the nonlinear force control equation is given by

Vi = Gi * (1 − cos(9 * Fi)) * Fi        (1)

where Vi, Fi and Gi * (1 − cos(9 * Fi)) are the commanded velocity (in mm/sec), measured force (in Volts; we did not calibrate the force sensor to obtain readings in Newtons), and nonlinear admittance gain for axis i, respectively. The implementation described in reference 8 used an experimentally determined exponential to provide the nonlinearity; we use the cosine function because it was available in the vendor's software library. This is shown graphically in Figure 5, with Gi set to 1 (in our implementation, it was approximately 5 mm/(V * sec) for the X and Y axes and 10 mm/(V * sec) for the Z axis). The multiplication by 9 ensures that the maximum force reading (10 Volts) produces the largest commanded velocity, since the cosine argument 9 * Fi (in degrees) then reaches 90°. The nonlinear factor 1 − cos(9 * Fi) is relatively flat for low forces but has a higher slope for large forces. This is similar to the exponential proposed in reference 8, which is shown as the thin line in Figure 5. The actual implementation includes a deadband (so that sensor noise does not cause unwanted motion) and another nonlinear factor that is a function of the distance to the travel limit (so that the robot slows down as it approaches the limit).

Figure 5. Illustration of nonlinear gains for force controller. The thick line represents our implementation using 1 − cos(9 * Fi), Equation (1). The thin line represents implementation using exp(1 − 10/Fi) Citation[8].

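The production force controller runs as an interpreted program on the Galil board, but the control law of Equation (1) is straightforward to sketch in Python. The deadband threshold and slow-zone width below are assumed values; note that the argument 9 * Fi is interpreted in degrees, so the full 10-Volt reading maps to 90°, where the nonlinear factor reaches 1.

```python
import math

# Sketch of the nonlinear admittance law of Equation (1), including the
# deadband and travel-limit slowdown mentioned in the text. The deadband
# and slow-zone values are assumptions; 9*Fi is interpreted in degrees.

DEADBAND_V = 0.1  # assumed noise threshold (Volts)

def commanded_velocity(force_v, gain, dist_to_limit_mm, slow_zone_mm=10.0):
    """Return the commanded axis velocity (mm/s) for a force reading (V)."""
    mag = abs(force_v)
    if mag < DEADBAND_V:
        return 0.0                                  # deadband: ignore noise
    nonlinear = 1.0 - math.cos(math.radians(9.0 * mag))
    v = gain * nonlinear * mag                      # Eq. (1): V = G*(1-cos(9F))*F
    v *= min(1.0, dist_to_limit_mm / slow_zone_mm)  # slow near the travel limit
    return math.copysign(v, force_v)

# Example: a 5 V push on X (gain ~5 mm/(V*s)) far from the limits:
print(commanded_velocity(5.0, 5.0, 100.0))  # ~7.3 mm/s
```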

Image to robot registration

The transformation between image coordinates and robot coordinates is obtained by performing a best-fit estimation between the two 3D point sets (4 image markers and 4 robot markers). We initially implemented an iterative technique, based on a variation of Powell's Method, but for the experiments described here we adopted a closed-form solution that computes a singular value decomposition (SVD) to obtain the rotation matrix Citation[9], Citation[10]. This solution minimizes the sum of the squared L2-norms, and the definition of the Fiducial Registration Error (FRE) is based on this minimum value:

FRE² = (1/N) * Σk |T(Pk) − Rk|²        (2)

where Pk and Rk are the positions of marker k in image and robot coordinates, respectively, and T is the fitted transformation.
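A minimal sketch of this closed-form registration, following Arun et al. Citation[9] with the standard determinant correction to exclude reflections, might look as follows; numpy is assumed for the linear algebra, and the routine also returns the FRE of Equation (2).

```python
import numpy as np

# Sketch of closed-form rigid registration (Arun et al.): given corresponding
# 3D marker positions P (image) and R (robot), find the rotation Rm and
# translation t minimizing sum ||Rm @ p + t - r||^2, and report the FRE.

def register(P, R):
    """P, R: (N, 3) arrays of corresponding points. Returns (Rm, t, fre)."""
    Pc, Rc = P.mean(axis=0), R.mean(axis=0)
    H = (P - Pc).T @ (R - Rc)                     # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # guard vs. reflection
    Rm = Vt.T @ D @ U.T                           # rotation (image -> robot)
    t = Rc - Rm @ Pc                              # translation
    res = (P @ Rm.T + t) - R                      # per-marker residuals
    fre = np.sqrt((res ** 2).sum(axis=1).mean())  # RMS residual = FRE, Eq. (2)
    return Rm, t, fre
```

In use, the four registration markers would be passed to register(), after which any image target p maps to robot coordinates as Rm @ p + t.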

Phantom experiments

We performed several tests to evaluate the accuracy of the overall system, as well as the accuracy of the major subsystem components (the imaging system and robot). Following the terminology of Maurer et al. Citation[7], we focus on the following categories of error:

  • Fiducial localization error (FLE): the error in determining the position of a fiducial marker in image coordinates (FLEI) or in robot coordinates (FLER).

  • Target registration error (TRE): the mean error for locating markers or other features that were not used for registration. We measure TRE for both the registration probe (TRER) and the Oxylite measurement probe (TREM). The latter measurement is most relevant to the application, but is more difficult to obtain.

Design of phantom

We designed a phantom () that has 20 small hemispherical holes (1–20), 2 cylindrical holes (C1–C2) and 4 large registration holes (R1–R4) arranged at 5 different heights. The four large registration holes (R1–R4) were not used for any of the tests reported here and will not be discussed further. The 20 small holes each have a diameter of 3 mm and are therefore equivalent to the markers on the rodent bed. Four of these holes (1, 3, 10 and 11) are arranged in the same pattern as the four registration markers on the rodent bed. The two cylindrical holes have a diameter of 3 mm and a depth of 6 mm. We machined a small plug, with an accurately centered hole 0.35 mm in diameter, to fit in the cylindrical hole. This was used for system accuracy testing. We chose to use Delrin for the phantom because it is compatible with all image modalities of interest (PET, CT, MRI). The phantom is 120 mm × 50 mm × 45 mm, which is small enough to fit inside all small animal scanners. It was machined on a CNC (Computer Numerical Control) machine with a known accuracy of ±0.0005 inches (±0.0127 mm). Considering the material properties of Delrin, we estimate the overall accuracy of the phantom to be ±0.05 mm. Because our “hole finding” procedure with the robot is a manual task involving hand-eye coordination, we darkened the edges of the holes to obtain sufficient visual contrast in the white Delrin.

Figure 6. Phantom with 20 small hemispherical holes (1–20), 2 cylindrical holes (C1–C2) and 4 large registration holes (R1-R4) arranged at 5 different heights.

Figure 6. Phantom with 20 small hemispherical holes (1–20), 2 cylindrical holes (C1–C2) and 4 large registration holes (R1-R4) arranged at 5 different heights.

For microPET imaging of the phantom, the hemispherical and cylindrical holes were filled with positron-emitting radioactive tracer. After scanning, the application software was used to find the centroid of each marker in the image. For the robot measurements, the robot's registration probe was manually guided to each accessible hole. The data was analyzed to determine the fiducial localization errors (FLEI and FLER) and the target registration error for the registration probe (TRER). The target registration error for the measurement probe (TREM) was obtained by guiding the measurement probe to specific features on the phantom, as described below.

Fiducial localization error (FLE)

FLE can be simply defined as “the error of determining the positions of the markers” Citation[7], but it can be difficult to measure directly. One approach is to infer FLE from FRE, using the following approximation Citation[11]:

FLE² ≈ FRE² * N/(N − 2)        (3)

This result was derived from earlier work by Sibson Citation[12], who applied perturbation theory to derive a general result for K dimensions. The N/(N − 2) term can be explained by considering that, in three dimensions, N points with independent errors have 3N degrees of freedom, but registration reduces this to 3N − 6 degrees of freedom.

This approach is especially useful with an accurately machined (CNC) phantom because it can be individually applied to each measurement subsystem (image and robot). Furthermore, the phantom can contain a large number of markers and thereby produce a robust estimate of FLE.

We developed an alternate method for estimating FLE that is based on the Fiducial Distance Error (FDE), which we define as the difference, for each pair of markers, between the measured distance and the known distance. For example, if Pab is the measured distance between markers a and b (in image or robot coordinates) and Cab is the distance between those markers in CNC coordinates, then FDE is |Pab − Cab|. Note that for N markers there are a total of N * (N − 1)/2 measurements. Although distance measurements may be novel in the field of computer-assisted surgery, they are often used for verification of Coordinate Measuring Machines (CMMs) Citation[13] and for calibration of CMM or CNC machines Citation[14], Citation[15].
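Computing FDE requires no registration at all, only pairwise distances. A minimal sketch, assuming numpy arrays of measured and CNC marker coordinates:

```python
import numpy as np
from itertools import combinations

# Sketch of the FDE computation: compare each measured inter-marker distance
# (image or robot coordinates) against the known distance from the CNC
# machining coordinates. No registration is required.

def fde(measured, cnc):
    """measured, cnc: (N, 3) arrays. Returns the N*(N-1)/2 pairwise errors."""
    errs = [abs(np.linalg.norm(measured[a] - measured[b])
                - np.linalg.norm(cnc[a] - cnc[b]))
            for a, b in combinations(range(len(measured)), 2)]
    return np.array(errs)

# Example: fde(robot_readings, cnc_coords).mean() gives the mean FDE.
```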

The relationship between FLE (assumed to be identical at each marker) and the average FDE depends on the geometrical arrangement of the markers. We performed simulations to obtain an empirical relationship. Each simulated data set was created by adding zero-mean Gaussian noise, with a specified standard deviation (applied isotropically), to the CNC coordinates of each phantom marker. For each standard deviation, we created 10,000 simulated data sets and computed the average FDE. We performed separate simulations for the image markers and robot markers because, even though we used the same phantom, 14 markers were in the PET field of view and 16 markers were accessible by the robot. In both cases, we obtained a linear relationship between FLE and FDE, with the ratio FLE/FDE being approximately equal to 1.5.
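A sketch of this calibration simulation is shown below. The marker layout here is a random placeholder rather than the actual phantom geometry, and FLE is taken as the RMS localization error (√3 times the per-axis standard deviation for isotropic noise); under these assumptions the ratio comes out near 1.5, consistent with the reported value.

```python
import numpy as np
from itertools import combinations

# Sketch of the FLE/FDE calibration simulation. The marker layout is a
# random placeholder; the paper used the actual CNC coordinates of the 14
# (image) or 16 (robot) accessible phantom markers. FLE is taken as the
# RMS localization error, sqrt(3)*sigma, for isotropic per-axis noise sigma.

rng = np.random.default_rng(0)
markers = rng.uniform(0.0, 100.0, size=(14, 3))   # placeholder coords (mm)
ia, ib = np.array(list(combinations(range(len(markers)), 2))).T
true_dist = np.linalg.norm(markers[ia] - markers[ib], axis=1)

def mean_fde(sigma, trials=10000):
    """Average FDE over many noisy realizations of the marker set."""
    total = 0.0
    for _ in range(trials):
        noisy = markers + rng.normal(0.0, sigma, size=markers.shape)
        dist = np.linalg.norm(noisy[ia] - noisy[ib], axis=1)
        total += np.abs(dist - true_dist).mean()
    return total / trials

for sigma in (0.05, 0.10, 0.20):
    ratio = np.sqrt(3.0) * sigma / mean_fde(sigma)
    print(f"sigma={sigma:.2f}: FLE/FDE = {ratio:.2f}")   # ~1.5 in each case
```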

Target registration error (TRE)

TRE for registration probe (TRER)

It is first necessary to find the four registration markers in PET and robot coordinates and compute the transformation between these two coordinate systems. The TRER is then determined by transforming all other markers (those not used for registration) into a common coordinate system and computing the distance between each set of corresponding points; for example, if Pk and Rk are the positions of marker k in PET and robot coordinates, respectively, and T is the transformation from PET (image) to robot coordinates, the TRER for point k is given by |Rk − T(Pk)|. Note that TRER does not include any error due to the Z2 axis because the registration probe is attached to the Z1 axis.
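Given the transform from the registration step, this evaluation is a one-liner. The sketch below assumes the register() routine sketched earlier and applies its result to the held-out markers:

```python
import numpy as np

# Sketch of the TRE_R evaluation: apply the fitted image-to-robot transform
# (Rm, t) to markers withheld from registration and measure the residual
# distances |R_k - T(P_k)|.

def tre(Rm, t, P_targets, R_targets):
    """Per-target error for held-out markers; inputs are (N, 3) arrays."""
    return np.linalg.norm(R_targets - (P_targets @ Rm.T + t), axis=1)
```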

TRE for measurement probe (TREM)

The data to estimate TREM is collected by using the application software to register the robot to the PET image (e.g., by locating the four registration markers) and then defining measurement tracks based on markers visible in the image. The cylindrical holes (C1, C2) and associated plug (see the Design of phantom section above) were created specifically for this purpose. If a track is defined through the center of the cylinder and the plug with the centered 0.35-mm hole is installed, the robot should, in the absence of error, drive the Oxylite probe (head diameter 0.25–0.30 mm) into the centered hole. The X and Y components of TREM can be determined by measuring the distance of the Oxylite probe from the centered hole. Rather than attempting to measure this directly, we put the robot in force control mode and manually move it until the probe is centered inside the hole, as determined visually under magnification. TREM(XY) is given by the amount that the robot is moved in force control mode. The Z position error is more difficult to measure using this technique because the cylindrical holes were not completely filled by the tracer liquid. Therefore, it is determined by defining measurement tracks at selected hemispherical holes (markers). The application software moves the Oxylite probe to each target track and the error, TREM(Z), is measured once again by noting the amount that the robot is moved in force control mode to position the probe tip at the top of the hole. Assuming that the XY and Z components are independent, the value of TREM can be estimated as follows:

TREM = √(TREM(XY)² + TREM(Z)²)        (4)

Although TREM includes the error contribution from all four robot axes, it is not expected to be as precise as TRER because it is based on fewer measurements and has an even greater dependence on visual estimation, especially for the Z component. Furthermore, it should be noted that neither TRE measurement includes errors due to probe compliance/bending or to the motion that would be expected with an actual target (tumor).

Phantom test results

Setup of phantom

The hemispherical and cylindrical markers were filled with 5 and 20 µl of 18F-FDG (26.5 and 106 µCi), respectively. Only 14 hemispherical markers (1–14) and 1 cylindrical hole (C1) were visible in the microPET scanner's limited field of view. On the robot, access to the four deepest markers (2, 7, 13 and 17) proved too difficult and they were therefore eliminated from the testing.

PET image fiducial localization error (FLEI)

One microPET scan of the phantom was acquired and the application software was used to find all 14 imaging markers (filled holes) in the field of view. The PET pixel size was 0.833 mm. We used a corrected value (1.229 mm) for the PET slice spacing that had been experimentally determined during earlier testing. The 14 markers resulted in 91 computed distance errors and yielded an FDEI of 0.17 ± 0.12 mm (mean ± standard deviation). The largest distance error was 0.50 mm. Our simulations produced the empirical relationship FLEI = 1.49 * FDEI, which estimates FLEI to be 0.26 mm.

For comparison, we applied the least-squares registration technique to all 14 imaging markers. The resulting FRE was 0.24 mm, which according to Equation (3), with N = 14, estimates FLEI to be 0.26 mm.
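Written out, the arithmetic behind this estimate is:

```latex
\mathrm{FLE}_I \approx \mathrm{FRE}\sqrt{\tfrac{N}{N-2}}
              = 0.24~\mathrm{mm}\times\sqrt{\tfrac{14}{12}}
              \approx 0.26~\mathrm{mm}
```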

Robot fiducial localization error (FLER)

The robot was used to locate the 16 accessible markers in three different trials. Each trial yielded 120 computed distance errors, with an FDER value (over all three trials) of 0.12 ± 0.10 mm (mean ± standard deviation). The largest distance error was 0.46 mm. Our simulations produced the empirical relationship FLER = 1.51 * FDER, which estimates FLER to be 0.18 mm.

Once again, we compared our result to the one obtained by applying Equation (3) to the FRE from the least-squares registration of all 16 markers. For the data obtained from the three trials, the FRE was 0.17, 0.13 and 0.12 mm, for a mean value of 0.14 mm. This estimates FLER to be 0.15 mm.

Target registration error for registration probe (TRER)

Eleven markers, including all four registration markers, were found both in the PET image and by the robot. Therefore, it was possible to register the PET data set to each of the three robot data sets and compute TRER for the 7 target markers (see Table I). The combined TRER was 0.29 ± 0.10 mm (mean ± standard deviation).

Table I.  Target registration error, TRER (units: mm)

Target registration error for measurement probe (TREM)

The TREM(XY) measurement was performed using cylinder C1, with the plug installed. The central axis of C1 was used to define a measurement track and the software subsequently moved the Oxylite probe to the start of the track. The robot was then moved in force control mode to visually align the probe with the 0.35-mm hole in the plug (Figure 7). In three trials, with three different registrations, the robot was moved by 0.10, 0.21, and 0.19 mm, respectively, which yields an average TREM(XY) of 0.17 mm. The measurement resolution is approximately ±0.05 mm (fitting a probe head of 0.25–0.30 mm inside a 0.35-mm hole). The TREM(Z) measurement was performed 8 times by positioning the probe at the top of four target points for two different registrations. Our measured Z errors ranged from −0.15 mm to 0.20 mm, with an overall mean ± standard deviation of 0.13 ± 0.06 mm. From Equation (4), we estimate the mean TREM, at C1, to be 0.21 mm.

Figure 7. Measuring the XY component of TREM (the black plug is shown).


Discussion

The previous sections presented the results of the accuracy tests that were performed with the phantom. We quantified the fiducial localization error for the PET imaging system (FLEI) and for the robot (FLER), then determined the target registration error for both the registration probe (TRER) and the measurement probe (TREM).

In general, there is no direct way to measure FLE and so many researchers estimate it from the fiducial registration error (FRE) using Equation (3), which was popularized by Fitzpatrick and colleagues Citation[11]. We developed a new method for estimating FLE from the average fiducial distance error (FDE), which is the average difference between the measured distance and the known distance between each pair of markers. Our simulations yielded a simple linear relationship between FLE and FDE and produced estimates that agreed closely with those obtained from Equation (3), as shown in Table II. The advantage of this new method is that it is independent of the registration technique and can be more easily computed, once the linear relationship is established.

Table II.  Fiducial localization error, FLE (units: mm).

It is important to note that both Equation (3) and our method were derived assuming independent, normally distributed errors, which is convenient for mathematical analysis and simulations. In practical cases, both FRE and FDE will include the contribution of other types of error, such as distortion, that cannot be modeled this way. In these cases, both methods will produce an FLE estimate that qualitatively “averages” these errors over the fiducials. While not mathematically precise, we believe this is better than ignoring them, which would occur if, for example, the FLE were estimated by taking multiple readings at a single point.

As shown in Table II, FLEI (0.26 mm) is higher than FLER (0.15–0.18 mm), which is not surprising given the PET pixel size and slice spacing of 0.833 mm and 1.229 mm, respectively, compared to the specified robot accuracy of 0.132 mm (0.076 mm for each of X, Y and Z1). In fact, it was somewhat surprising that FLEI was as good as it was; we believe this is due to the averaging of pixels during the localization process.

The most obvious method for measuring TRE is to obtain the image-to-robot transformation and then pick a target on the image and command the robot to move to that location. The TRE would be given by the distance between the robot's position and the actual target. It is difficult, however, to make precise measurements of this distance, so we adopted the strategy of guiding the robot to the actual target position and then computing TRE from the difference between the robot's final position (at the actual target) and the position obtained by transforming the image target to robot coordinates. This technique requires markers that can be easily seen in the image and by the user who must guide the robot to the correct position. It works well for TRER because the marker was designed to mate with the registration probe.

In contrast, it is more difficult to measure TREM because it is difficult to precisely position the measurement probe with respect to the marker. As described in the Design of phantom section, we constructed a small plug that precisely fitted the cylindrical hole and used that to estimate the XY component of TREM. Our method for measuring the Z component was less precise and relied on visual alignment of the probe tip with the top of the marker (hole). This imprecision is evident in the large range of measurements (-0.15 mm to 0.20 mm) for the Z component of TREM.

Our results indicate that TREM (0.21 mm) is less than TRER (0.29 mm), which appears to be counter-intuitive because the former includes the error due to an additional robot axis (Z2). This discrepancy is explained by the following factors:

  1. The TREM measurement method is less precise than the TRER method, as described in the Target registration error (TRE) section above.

  2. The TREM measurement was performed in a limited region of the image and robot measurement volumes. In particular, TREM(XY) was only measured at cylinder C1, which happens to be near the registration markers.

We can estimate a more realistic value (i.e., one that would apply to the entire measurement workspace) by adding the mean error of the Z2 axis, previously measured (with a dial indicator) to be 0.075 mm Citation[6], to TRER. If we assume that TRER is isotropic (which seems reasonable given our data), the error in each coordinate direction is TRER (0.29 mm) divided by √3, which evaluates to 0.17 mm. Adding 0.075 mm to the Z component and recombining the individual components produces a TREM estimate of 0.34 mm. This is a conservative estimate because it assumes that the two sources of error in the Z direction are additive, whereas in reality some cancellation of error is likely.
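Written out, this recombination is:

```latex
\mathrm{TRE}_M \approx \sqrt{\left(\tfrac{0.29}{\sqrt{3}}\right)^{2}
                           + \left(\tfrac{0.29}{\sqrt{3}}\right)^{2}
                           + \left(\tfrac{0.29}{\sqrt{3}} + 0.075\right)^{2}}
              \approx \sqrt{0.029 + 0.029 + 0.060}
              \approx 0.34~\mathrm{mm}
```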

Conclusion

We completed the design and testing of an image-guided robot system to assist with cancer research and performed phantom experiments to measure its accuracy. Our results indicate that the robot can position a measurement probe at a defined target with a mean error of less than 0.4 mm, with even better results (approximately 0.2 mm) when the target is near the registration markers. Because these experiments were performed with a phantom, rather than a tumor-bearing rodent, the results do not consider errors due to tumor motion or probe deflection. The latter error can be significant for the current fiber-optic Oxylite probe, but can be greatly reduced in future by using a stiff, needle-based probe. In the phantom experiments, the largest error source was marker localization in the microPET image (FLEI), followed by marker localization by the robot (FLER). We anticipate even better results when using more accurate imaging modalities, such as the microCT, a CT scanner dedicated to animal studies. In addition, this device can be easily adapted to perform other image-guided procedures, e.g., obtaining rodent tumor biopsies or injecting anti-tumor reagents such as viral vectors used in gene therapy.

We introduced a new method for estimating FLE that is based on the distance errors (FDE) and found good agreement with the existing method that is based on the registration error (FRE). Our method has the advantage of being independent of the registration method used by the system. It does, however, currently require a simulation study to determine the relationship between FLE and FDE. Both methods were derived assuming independent, normally distributed errors, but other types of error, such as distortion, that affect FRE or FDE will also influence the FLE estimate. Qualitatively, the estimated FLE will “average” these errors over the fiducials; a more rigorous mathematical analysis should be the subject of future work.

Experiments with small animals demonstrate that PET image-guided pO2 measurement is feasible using this prototype image-guided robot system Citation[16]. This robot system should improve the efficiency and accuracy of needle-based procedures for in-vivo measurement, biopsy, and injection in small animals.

References

  • Urano M, Chen Y, Humm J, Koutcher J, Zanzonico P, Ling C. Measurements of tumor tissue oxygen tension using a time-resolved luminescence-based optical Oxylite probe: Comparison with a paired survival assay. Radiat Res 2002; 158(2): 167–173.
  • Cherry SR, Shao Y, Siegel S, Silverman RW, Meadors K, Young J, Jones WF, Newport D, et al. MicroPET: A high resolution PET scanner for imaging small animals. IEEE Trans Nucl Sci 1997; 44(3): 1161–1166.
  • Humm JL, Ballon D, Hu YC, Ruan S, Chui C, Tulipano PK, Erdi A, Koutcher J, et al. A stereotactic method for the three-dimensional registration of multi-modality biologic images in animals: NMR, PET, histology, and autoradiography. Med Phys 2003; 30: 2303–2314.
  • O'Donoghue J, Zanzonico P, Pugachev A, Wen B, Smith-Jones P, Cai S, Burnazi E, Finn R, et al. Assessment of regional tumor hypoxia using 18F-fluoromisonidazole and 64Cu(II)-diacetyl-bis(N4-methylthiosemicarbazone) positron emission tomography: Comparative study featuring microPET imaging, pO2 probe measurement, autoradiography, and fluorescent microscopy in the R3327-AT and FaDu rat tumor models. Int J Radiat Oncol Biol Phys 2005; 61: 1493–1502.
  • Waspe A, Cakiroglu H, Lacefield J, Fenster A. Design and validation of a robotic needle positioning system for small animal imaging applications. Proceedings of the 28th IEEE/EMBS Annual International Conference, New York, NY, August 2006, 412–415.
  • Li JC, Balogh E, Iordachita I, Fichtinger G, Kazanzides P. Image-guided robot system for small animal research. Proceedings of the First International Conference on Complex Medical Engineering (CME 2005), Takamatsu, Japan, May 2005, 194–198.
  • Maurer C, Fitzpatrick J, Wang M, Galloway R, Maciunas R, Allen G. Registration of head volume images using implantable fiducial markers. IEEE Trans Med Imag 1997; 16(4): 447–462.
  • Kazanzides P, Zuhars J, Mittelstadt B, Taylor R. Force sensing and control for a surgical robot. Proceedings of the IEEE International Conference on Robotics and Automation, Nice, France, May 1992, 612–617.
  • Arun K, Huang T, Blostein S. Least-squares fitting of two 3-D point sets. IEEE Trans Pattern Anal Machine Intell 1987; 9(5): 698–700.
  • Umeyama S. Least-squares estimation of transformation parameters between two point patterns. IEEE Trans Pattern Anal Mach Intell 1991; 13(4): 376–380.
  • Fitzpatrick J, West J, Maurer C. Predicting error in rigid-body point-based registration. IEEE Trans Med Imag 1998; 17(5): 694–702.
  • Sibson R. Studies in the robustness of multidimensional scaling: Perturbational analysis of classical scaling. J Roy Stat Soc B 1979; 41: 217–229.
  • ISO 10360-2. Geometrical Product Specifications (GPS) - Acceptance and reverification tests for coordinate measuring machines (CMM) - Part 2: CMMs used for measuring size. 2001.
  • Kruth JP, Vanherck P, De Jonge L. Self-calibration method and software error correction for three-dimensional coordinate measuring machines using artefact measurements. Measurement 1994; 14: 157–167.
  • Florussen GHJ, Delbressine FLM, van de Molengraft MJG, Schellekens PHJ. Assessing geometrical errors of multi-axis machines by three-dimensional length measurements. Measurement 2001; 30: 241–255.
  • Chang J, Wen B, Kazanzides P, Zanzonico P, Finn RD, Ling CC. PO2 measurements in animal tumors using an image-guided robotic system (abstract). Proceedings of the 48th AAPM Annual Meeting, Orlando, FL, July 30-August 3, 2006. Med Phys 2006; 33(6): 2240.
