numbers to drop out when decoding audio, reducing
the effective frame rate.
There are two basic operating modes: capture
and playing. Combining them allows recording
the movements of the expert surgeon and reproducing
them to train the novice surgeon, capturing the
movements of the novice surgeon for evaluation,
guidance, etc.
In capture mode, the surgeon reproduces the
movements related to the video sequence. The
sensor mock-up receives the audio of the training
video and decodes the frame numbers, which are
periodically distributed. When a frame number is
decoded, the mock-up captures the values of the
position encoders and stores them together with the
frame number. When the capture ends, the mock-up
sends the data to the PC, where they can be saved as
the training data file related to the video, or
evaluated against previously stored data.
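The capture loop described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the audio decoder, the chunk format, and the encoder-reading callback are all assumed names introduced here for clarity.

```python
def decode_frame_number(audio_chunk):
    """Stand-in for the audio-channel decoder (hypothetical): each chunk
    either carries an embedded frame number or nothing."""
    return audio_chunk.get("frame")

def capture_session(audio_chunks, read_encoders):
    """Capture-mode loop: whenever a frame number is decoded from the
    training video's audio, sample the position encoders and store the
    (frame_number, encoder_values) pair for later upload to the PC."""
    samples = []
    for chunk in audio_chunks:
        frame = decode_frame_number(chunk)
        if frame is not None:
            samples.append((frame, read_encoders()))
    return samples

# Simulated run: frame markers arrive periodically among plain audio chunks.
chunks = [{"frame": 1}, {}, {"frame": 25}, {}, {"frame": 50}]
data = capture_session(chunks, read_encoders=lambda: (0.0, 0.0, 0.0))
print(data)  # three (frame, encoder_values) pairs
```

The resulting list plays the role of the training data file that is saved on the PC or compared against a previously stored session.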
In playing mode, all the motor mock-ups
store the training data beforehand and wait for the
start command from the sensor mock-up. When the
PC starts the video, the sensor mock-up decodes the
first frame number and timestamps it with the global
clock. It then broadcasts that information to the
motor mock-ups, which can compute the timestamps
for the subsequent frames and synchronize playback.
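Given one broadcast (frame, global-time) pair, each motor mock-up can extrapolate the time of any later frame, assuming a constant frame rate. A minimal sketch (the 25 fps rate and all names are assumptions, not values stated in the paper):

```python
FRAME_PERIOD = 1.0 / 25.0  # assumed 25 fps video; illustrative value only

def schedule_from_broadcast(ref_frame, ref_time, target_frame):
    """Given the (frame, global-time) pair broadcast by the sensor
    mock-up, compute when target_frame should be reproduced so that all
    motor mock-ups move in step with the video."""
    return ref_time + (target_frame - ref_frame) * FRAME_PERIOD

# Example: frame 10 was decoded at global time 1000.0 s; frame 35 is due
# 25 frame periods (1 s at 25 fps) later.
print(schedule_from_broadcast(10, 1000.0, 35))  # -> 1001.0
```

Because every mock-up derives its schedule from the same broadcast pair and the same global clock, no per-frame messages are needed during normal playback.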
The sensor mock-up periodically broadcasts
frame-time pairs to prevent errors and to detect
unexpected values in the frame number sequence,
which indicate a change in video playback
(pausing the video for an explanation,
advancing the video, looping over some technique, etc.).
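The detection step can be reduced to comparing the decoded frame number against the expected one; any mismatch triggers a resynchronization from the newly broadcast pair. The function below is an illustrative sketch with assumed names, not the paper's protocol code:

```python
def check_frame(expected_frame, decoded_frame, global_time):
    """On each periodic broadcast, compare the decoded frame number with
    the expected one. A mismatch signals a change in video playback
    (pause, seek, loop) and yields the new (frame, time) reference pair
    from which the motor mock-ups resynchronize."""
    if decoded_frame != expected_frame:
        return ("resync", decoded_frame, global_time)
    return ("ok", decoded_frame, global_time)

print(check_frame(100, 100, 5.0))  # sequence as expected -> ("ok", ...)
print(check_frame(101, 250, 6.0))  # video was advanced -> ("resync", ...)
```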
6 CONCLUSIONS
In this paper we have discussed two different
synchronization issues faced during the
development of a video-surgery learning utility.
Video synchronization was performed with a cost-effective
and simple method that resorts to the audio
channel. It allows accurate synchronization without
the need for a complex system. It is even possible to
eliminate the PC by using a dedicated video player and
controlling playback from the sensor mock-up.
Wireless synchronization between mock-ups was
also analyzed, using criteria similar to those of wireless
sensor networks. We use a synchronization protocol
over Bluetooth (we also tested ZigBee, with similar
results) that largely meets our requirements. The
method avoids accessing the lower layers of the
protocol stack while achieving accuracy similar to
methods that use the MAC layer.
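The paper does not detail the protocol, but sensor-network synchronization schemes of this family (e.g. TPSN-style two-way exchanges) estimate clock offset and link delay from four timestamps. A generic sketch of that computation, under the standard symmetric-delay assumption:

```python
def offset_and_delay(t1, t2, t3, t4):
    """Classic two-way timestamp exchange:
    t1 = request sent (local clock), t2 = request received (reference clock),
    t3 = reply sent (reference clock), t4 = reply received (local clock).
    Returns (clock offset, one-way delay), assuming symmetric links."""
    offset = ((t2 - t1) + (t3 - t4)) / 2.0
    delay = ((t4 - t1) - (t3 - t2)) / 2.0
    return offset, delay

# Example: reference clock runs 5 time-units ahead, 1-unit propagation delay.
print(offset_and_delay(0.0, 6.0, 7.0, 3.0))  # -> (5.0, 1.0)
```

Because only application-level timestamps are exchanged, this computation needs no access to the MAC layer, which matches the design goal stated above.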
With the strategies described in this article, we
obtain a surgical training classroom in which the
displayed images are correctly synchronized
with the sensor information and the mock-up
movements, so that novice surgeons can acquire the
needed skills in a realistic environment without
harming any patient. This is achieved at low cost by
building the surgical classroom from off-the-shelf
components.
ACKNOWLEDGEMENTS
This work has been partially supported by the
Spanish Ministry of Science and Technology under
CICYT project numbers TIC2003-07766 and
TIN2006-15617-C03-02.
BIODEVICES 2008 - International Conference on Biomedical Electronics and Devices