Personalised rehabilitation of upper limb disorders

The project aimed to create user-friendly workflows to assess, manage, and monitor upper-limb disorders. The approach combined computational modelling, medical imaging, wearable sensors, and robotic devices to generate personalised digital twins capable of virtually testing treatment strategies for movement-related impairments.

Upper-limb movement disorders—such as stroke-related paralysis, brachial plexus injuries, and rotator-cuff damage—impose a significant societal and clinical burden. These conditions often lead to lifelong functional deficits, and physical inactivity or disease can further result in joint degeneration and osteoarthritis, for which total joint replacement (arthroplasty) remains the most effective end-stage treatment.

Traditionally, clinicians have relied on qualitative assessments such as observational movement analysis and subjective testing. This project sought to augment those methods by developing quantitative tools that integrate imaging, sensor data, and computational models to improve diagnosis, treatment planning, and rehabilitation monitoring—particularly for patients undergoing shoulder arthroplasty.

According to the New Zealand Orthopaedic Joint Registry, approximately one in five shoulder-arthroplasty patients requires costly revision surgery, often due to dislocation or implant loosening. Choosing the correct implant size, position, and orientation is complex because of the shoulder's large range of motion and intricate muscular coordination.

To address this, the team developed a population model of shoulder anatomy, allowing rapid generation of personalised biomechanical models from clinical CT scans. These models were used to simulate functional movements, estimate forces within bone and soft tissue, and provide insights into implant selection and positioning that best match a patient’s anatomy and biomechanics.
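Population models of bone anatomy are commonly built as statistical shape models, in which principal component analysis captures the main modes of shape variation across subjects and new, personalised geometries are generated by weighting those modes. The sketch below illustrates that general idea only; the variable names and synthetic data are hypothetical and do not represent the project's actual model.

```python
# Minimal sketch of a PCA-based statistical shape model (SSM), a common
# approach for population models of bone anatomy. All data here are
# synthetic and the implementation is illustrative, not the project's.
import numpy as np

rng = np.random.default_rng(0)

# Training set: each row is one subject's bone surface, flattened as
# (x1, y1, z1, x2, y2, z2, ...) over corresponding landmark points.
n_subjects, n_points = 20, 50
shapes = rng.normal(size=(n_subjects, n_points * 3))

# 1. Mean shape of the population.
mean_shape = shapes.mean(axis=0)

# 2. Principal modes of shape variation via SVD of the centred data.
centered = shapes - mean_shape
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
modes = Vt                                  # each row is one mode
weights_std = S / np.sqrt(n_subjects - 1)   # std dev along each mode

# 3. Generate a personalised shape by weighting the first few modes,
#    e.g. after fitting mode weights to landmarks from a patient CT scan.
b = np.array([1.5, -0.5, 0.2])              # weights in standard deviations
new_shape = mean_shape + (b * weights_std[:3]) @ modes[:3]
```

In practice, the mode weights would be optimised so the reconstructed surface matches the segmented CT geometry, rather than chosen by hand as above.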

Wearable sensors (inertial measurement units, IMUs) were used to measure and monitor patient motion, supporting both pre-operative assessment and post-operative rehabilitation.
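A typical way to turn IMU readings into a clinically meaningful quantity is to compare the orientations of two body-segment sensors and extract a joint angle. The sketch below, with hypothetical function names and example quaternions, shows one standard computation: the rotation angle between a torso frame and an upper-arm frame.

```python
# Hedged sketch: estimating a joint angle from two IMU orientations,
# expressed as unit quaternions. Illustrative only; not the project's
# actual sensor-fusion pipeline.
import numpy as np

def quat_to_matrix(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def joint_angle(q_torso, q_arm):
    """Angle between torso and upper-arm frames, in degrees.

    The relative rotation R = R_torso^T @ R_arm maps arm coordinates
    into the torso frame; its rotation angle is a simple scalar summary
    of arm elevation relative to the torso.
    """
    R = quat_to_matrix(q_torso).T @ quat_to_matrix(q_arm)
    cos_theta = (np.trace(R) - 1.0) / 2.0
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Example: arm rotated 90 degrees about the x-axis relative to the torso.
q_identity = (1.0, 0.0, 0.0, 0.0)
q_90x = (np.cos(np.pi / 4), np.sin(np.pi / 4), 0.0, 0.0)
print(round(joint_angle(q_identity, q_90x)))  # prints 90
```

Tracking such angles over repeated sessions gives clinicians a quantitative record of range-of-motion recovery during rehabilitation.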

Figure 1. Workflow of personalised biomechanical modelling of the upper limb. Following a doctor visit, the first step is to segment the bones of interest from a clinical CT scan. We then compare the current, morbid bone geometry with our population-based pre-morbid bone geometry to automatically classify the bone deformity (Step 2). Biomechanical simulations can then be made to understand the current state (Step 3) and make predictions for surgical outcomes with different size and position implants (Step 4). Finally, a plan is provided to the surgeon (Step 5).

Clinical translation of our research tools requires these processes to be streamlined into what we call a Workflow. Figure 1 illustrates our Workflow, with its component tools connected across the patient journey.
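The five steps in Figure 1 can be sketched as a simple software pipeline. Every function and label below is a hypothetical placeholder standing in for the project's actual segmentation, classification, and simulation tools; the sketch only shows how the steps chain together.

```python
# Illustrative pipeline for the Figure 1 workflow. All functions are
# placeholders, not the project's real implementations.
from dataclasses import dataclass

@dataclass
class Plan:
    deformity_class: str
    best_implant: str

def segment_bones(ct_scan):                       # Step 1: segment CT
    return {"scapula": None, "humerus": None}     # placeholder geometry

def classify_deformity(bones, population_model):  # Step 2: vs pre-morbid
    return "deformity_class_A"                    # placeholder label

def simulate_current_state(bones):                # Step 3: current state
    return {"joint_force": 1.0}                   # placeholder metrics

def predict_outcomes(bones, implants):            # Step 4: try implants
    # Score each candidate implant with a (dummy) simulated outcome
    # and keep the best one.
    scores = {implant: 1.0 for implant in implants}
    return max(scores, key=scores.get)

def build_plan(ct_scan, population_model, implants):  # Step 5: plan
    bones = segment_bones(ct_scan)
    deformity = classify_deformity(bones, population_model)
    simulate_current_state(bones)
    implant = predict_outcomes(bones, implants)
    return Plan(deformity, implant)
```

Structuring the workflow this way keeps each step independently testable while the final output, a surgical plan, is assembled from all of them.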