Golf Swing
The repetitive and high-impact nature of the golf swing may contribute to lower spine degeneration and chronic low back pain. This project aims to analyze the biomechanical loading of the lumbar spine during the golf swing through advanced motion capture and modeling techniques. A high-fidelity golf simulator combined with a mobile phone-based motion capture system will be used to evaluate swing mechanics. In Part A, state-of-the-art pose estimation models will be tested for their accuracy in extracting 3D motion data from monocular videos. In Part B, biomechanical analysis will integrate the pose data into an individualized OpenSim model to estimate spinal joint reaction forces and muscle activity. The ultimate goal is to develop a smartphone-based tool capable of real-time swing analysis, providing insight into injury prevention and technique optimization for golfers.
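As an illustration of the Part A pipeline, the sketch below shows how per-frame 3D keypoints could be extracted from a monocular swing video. It assumes MediaPipe Pose and OpenCV purely as example tools, along with a hypothetical input file swing.mp4; the project does not prescribe a specific pose-estimation model.

```python
import cv2
import mediapipe as mp

VIDEO_PATH = "swing.mp4"  # hypothetical monocular recording of a swing

mp_pose = mp.solutions.pose
cap = cv2.VideoCapture(VIDEO_PATH)

world_landmarks = []  # one list of (x, y, z) tuples per frame, in metres
with mp_pose.Pose(model_complexity=2, smooth_landmarks=True) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV decodes frames as BGR
        result = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.pose_world_landmarks:
            world_landmarks.append(
                [(lm.x, lm.y, lm.z) for lm in result.pose_world_landmarks.landmark]
            )
cap.release()

# For Part B, these keypoint trajectories could be exported as .trc markers and
# passed to OpenSim inverse kinematics to drive the individualized spine model.
```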
Improving SLAM in the Operating Room Using Event Cameras
This project focuses on enhancing SLAM (Simultaneous Localization and Mapping) in operating rooms using event cameras, which offer higher dynamic range, reduced motion blur, and finer temporal resolution than traditional cameras. By leveraging these capabilities, the project aims to develop a robust, real-time SLAM system tailored for surgical environments, addressing challenges such as high-intensity lighting and motion blur induced by head movement.
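One common way to make event streams usable by conventional SLAM front ends is to accumulate short time slices of events into image-like frames. The minimal NumPy sketch below illustrates that idea; it assumes events arrive as arrays of timestamps, pixel coordinates, and polarities, which is only one possible representation and not something the project specifies.

```python
import numpy as np

def events_to_frame(t, x, y, p, height, width, t_start, dt):
    """Accumulate polarity-signed events from [t_start, t_start + dt) into an image.

    t, x, y, p: 1-D arrays of timestamps (s), pixel coordinates, and polarity.
    """
    frame = np.zeros((height, width), dtype=np.float32)
    sel = (t >= t_start) & (t < t_start + dt)
    signs = np.where(p[sel] > 0, 1.0, -1.0)
    # unbuffered add so repeated events at the same pixel accumulate correctly
    np.add.at(frame, (y[sel], x[sel]), signs)
    return frame

# Example use: 10 ms slices of a recording could be fed to a frame-based
# feature tracker or used as an intensity-change image in a SLAM front end.
```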
Intraoperative Ultrasound-X-ray Calibration
Real-time navigation of complex orthopedic surgeries faces challenges due to the dynamic surgical environment and limited visibility of patient anatomy. Intraoperative imaging modalities such as ultrasound and X-ray are commonly used to achieve some level of guidance, although often in a purely visual form [1, 2]. Ultrasound provides high-frequency, radiation-free imaging but is limited to localized areas and is prone to noise [3]. X-ray, on the other hand, offers a wider field of view with less noise but introduces radiation, restricting the number of images that can be safely captured during a typical surgery. Combining ultrasound and X-ray data could balance these strengths, enhancing the quality of intraoperative anatomical reconstruction while reducing radiation exposure, which is vital for surgical navigation. However, to our knowledge, no existing setup or dataset currently integrates both modalities for this purpose. This project focuses on developing a setup that enables sensor fusion of ultrasound and X-ray images to improve intraoperative surgical navigation. Alongside the hardware setup shown in Fig. 1, a key objective is to establish a practical calibration method between an ultrasound probe and a C-arm X-ray machine. This will lay the foundation for a paired X-ray-ultrasound dataset that can enable many downstream applications involving these modalities. The final goal is to explore novel calibration techniques and system configurations that balance calibration accuracy with setup simplicity, facilitating efficient collection of joint ultrasound and X-ray datasets.
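If corresponding fiducials can be localized in both the ultrasound and C-arm coordinate frames, one standard building block for such a calibration is a least-squares rigid registration of the two point sets. The sketch below uses the Arun/Kabsch SVD method as an illustrative assumption; it is not the calibration procedure the project will necessarily adopt.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst (Arun/Kabsch method).

    src, dst: (N, 3) arrays of corresponding fiducial positions, N >= 3, non-collinear.
    """
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_mean).T @ (dst - dst_mean)   # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = dst_mean - R @ src_mean
    return R, t

# Residuals after applying (R, t) to the ultrasound-space fiducials give a quick
# fiducial-registration-error estimate for judging calibration quality.
```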
