This paper introduces a sensor-suite framework for the partial automation of prosthetic arm control, enabling high-level control while reducing the cognitive burden placed on the user. The automation aims to replicate hand-eye coordination through the synergy of a virtual 7-DOF arm prosthesis and a gaze-tracking system. The interactions between the elements of the suite are detailed, and a selection of sensors is implemented to control a simple simulation. A novel tongue-control system provides discrete input to the system. Initial tests of the system, together with a user's ability to learn to operate it, yielded promising user feedback on ease of interaction and the potential for reduced cognitive burden.