Graduation Year


Document Type




Degree Name

MS in Mechanical Engineering (M.S.M.E.)

Degree Granting Department


Major Professor

Stephanie Carey, Ph.D.

Co-Major Professor

Rajiv Dubey, Ph.D.

Committee Member

Redwan Alqasemi, Ph.D.


Keywords

Amputation, Simulation, Virtual Environment, Rehabilitation, Model


Abstract

The purpose of this study was to develop intuitive software that aids prosthetic training and rehabilitation by creating an individualized visualization of joint angles. This software, titled the Prosthetic Training Software (PTS) for individualized joint angle representation, enables the individualized portrayal of predicted or pre-recorded joint angles. The PTS is an intuitive program for clinicians and prosthesis users that produces an animation of a virtual avatar reflecting the user's segment lengths and amputation for rehabilitation and training purposes.

The PTS consists of a graphical user interface (GUI) and a 3D visualization of the information entered into the GUI. The software was developed in C# using Microsoft Visual Studio (Microsoft, Redmond, Washington) and the Unity game engine (Unity Technologies, San Francisco, California). Four GUI tabs were created: a patient input tab, a patient measurements tab, a prosthesis view and search tab, and a tab for editing a list of prostheses. Code was developed to use the information entered into these tabs to create an individualized 3D human model for the visualization. Twenty-four models were created to allow unique portrayal of the input data, spanning small, medium, and large sizes; male and female genders; and able-bodied, left transradial, and right transradial amputation variations. A generic transradial prosthesis was created for use with these model variations. An additional six stick figure models were generated to give further perspective on the portrayed joint angles. Code was also developed to animate these models accurately according to the joint angles sent to them. Playback speed, viewing orientation, and perspective control functionalities were developed to aid comprehension of the displayed joint angles.
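As an illustration of how the model selection and animation described above might be structured in Unity, the following C# sketch maps the GUI inputs to an avatar variant and applies a recorded joint angle to a joint transform. All names here (`BodySize`, `AmputationType`, `AvatarSelector`, and the index scheme) are assumptions for illustration, not the actual PTS code.

```csharp
using UnityEngine;

// Hypothetical enums mirroring the GUI inputs described in the abstract.
public enum BodySize { Small, Medium, Large }
public enum Gender { Male, Female }
public enum AmputationType { AbleBodied, TransradialLeft, TransradialRight }

public class AvatarSelector : MonoBehaviour
{
    // One prefab per model variant, assigned in the Unity editor.
    public GameObject[] avatarPrefabs;

    // Map a (size, gender, amputation) triple to an index into the prefab array.
    public static int VariantIndex(BodySize size, Gender gender, AmputationType amp)
    {
        return ((int)size * 2 + (int)gender) * 3 + (int)amp;
    }

    // Apply one recorded joint angle (Euler angles in degrees) to a joint transform.
    public static void ApplyJointAngle(Transform joint, Vector3 eulerAngles)
    {
        joint.localRotation = Quaternion.Euler(eulerAngles);
    }
}
```

Calling `ApplyJointAngle` once per frame with successive samples from a pre-recorded joint-angle sequence would produce the kind of playback animation the abstract describes.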

The PTS is not meant to be standalone software; however, the functionalities it needed to encapsulate in order to work in conjunction with research currently being conducted at USF were tested. The intuitiveness of the GUI and visualization was evaluated through ease-of-use surveys and volunteer commentary to determine how easily the interface can be operated in a home setting without the oversight of an experienced operator. On average, subjects agreed that the PTS was intuitive to use, both for inputting information and for utilizing the visualization. Feedback from these surveys will be used to further improve the PTS. The feasibility of learning from the PTS visualization output was tested by comparing motions from five able-bodied subjects before and after they were taught three motions composed of pre-recorded joint angles animated by the PTS. Joint angles were calculated from recorded marker positions. After viewing the animation, subjects' joint angles were markedly closer to the joint angles portrayed to them, showing that the PTS is capable of presenting joint angles in a comprehensible way. Future work will include additional testing of these functionalities, including testing with prosthesis users, as well as the introduction and testing of new features for prosthesis recommendation and predictive joint angle production when combined with future research.