The Study Of Eye Movements Psychology Essay


23 Mar 2015


The study of eye movements began long before the technological advances of the modern PC. Initial studies, dating back to the late 1800s, used highly invasive techniques involving mechanical contacts with the eye. In the early 1900s, methods were developed to track eye movements less invasively by observing light reflections from the cornea. Throughout the 1900s, eye tracking was used passively for observation purposes only; it was not until the development of minicomputers that the resources necessary to track eye movements in real time became available, opening the way to human computer interaction based on eye movements [1].

Eye tracking is the method used to monitor eye movements with respect to head position; it is also used to determine where one is looking (the gaze point). Commonly used eye tracking techniques can be broadly separated into four categories: contact lens, pupil or corneal reflection, image-based Video-Oculography (VEOG) and Electrooculography (EOG) [2]. Two main types of eye movement are observed through these technologies: smooth pursuits and saccades. Smooth pursuit movements are slow eye movements which occur when gaze is fixed on a moving target [3]. Saccades are fast eye movements that occur when a person quickly changes their visual target; they typically last between 30 and 120 milliseconds [4], and their amplitude is proportional to the angular distance moved by the eyes. Fixations, where the eye is stable and fixed on an object, typically follow a saccade and can last between 200 and 600ms [5].

Contact lens eye tracking is one of the most precise methods; however, it requires the subject to have a device such as a coil or a mirror attached to a contact lens inserted into the eye. This method has extremely high accuracy, down to 0.08° [6], but it is extremely invasive and thus inappropriate for use in human machine interfacing. Pupil and corneal reflection tracking is achieved by directing infrared or video cameras at the eye to capture reflected light; the amount of light reflected back to a receiver relates to the eye position. Accuracies range from 0.5° to 2° for both horizontal and vertical eye movements [7]. This technique is not as invasive as the contact lens method; however, it requires a direct, unobstructed line of sight to the naked eye, and accuracies suffer for subjects who wear glasses. The subject must not move and the measuring apparatus must be directly in front of the eye; these factors make this type of eye tracking unsuitable for human machine interfacing. Image based techniques (VEOG) use cameras and complex algorithms to determine gaze positions. Accuracies from 0.5° to 1° have been quoted [6]; however, additional cameras and/or tracking techniques must be incorporated into the system to achieve them [8,9]. VEOG systems are the least invasive of the four groups and are often used for human observation studies [10,11]; however, there are significant drawbacks in terms of the processing power needed for high quality images, lengthy calibration routines and the need for recalibration if the user moves or ambient light levels change. An eye controlled mouse based on VEOG is commercially available, but at a very high cost, typically $7000 (~£4616) [12]. The most prevalent eye tracking technology used in human machine interfacing is the Electrooculogram (EOG). It is slightly more invasive than VEOG as it requires electrodes to be attached to the face around the eyes.
However, the cost and processing required are the lowest of all the categories of eye tracking. The technique requires only simple measurements that can be taken with the head in any position, and accuracies comparable to the above technologies have been published [13-15]. The following section provides details on these accuracies, how the EOG is measured and a brief review of HMI applications based on EOG signals.

The Electrooculogram

The EOG is a signal resulting from a differential measurement across one or both eyes, on either a horizontal or a vertical plane. The front of the eye, the cornea, is at a more positive potential with respect to the back of the eye, the retina. When the eyes move, a change in electric potential results, due to the electrostatic field surrounding them. As the eyes rotate, the electrostatic dipole rotates with them, so the displacement voltage can be measured across the eyes in both the vertical and horizontal directions. Historically this was done using a system of many DC electrodes placed around the eyes, depending on how much detail was required from the EOG signals [7]; the more electrodes used, the more precise the achievable detail on eye position. Recorded potentials are small, usually in the microvolt range, and have been quoted between 15µV and 200µV [7] and between 50µV and 3500µV [16]. Results vary due to electrode type, position and, of course, differences in individual biopotentials or physiological make-up. It is possible to record the monopolar behaviour of the eye with the electrode positioned at a distance from the eye of up to three times its diameter [17], which dictates a practical limitation for this technique; clearly, the closer the electrode is to the eye, the larger the potential difference will be. For any electrode placement within this limitation, eye behaviour has been shown to be linear in the range of ±70° [7]; the sensitivity is highly stable in the range of ±40°, becoming progressively worse at greater excursions. Sensitivities have been measured between 4µV/degree and 20µV/degree; there is a trade-off between sensitivity and the number of electrodes [7]. In practice, horizontal sensitivities are more stable than vertical sensitivities due to muscle movements and eyelid interference. These linear characteristics of EOG signals make them an easily exploitable physiological signal for use as a Human Machine Interface (HMI) input.
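Given a calibrated sensitivity, the linear region described above maps an EOG amplitude change directly to a gaze angle. A minimal sketch in Python, using an illustrative sensitivity of 10 µV/degree (the function name and defaults are hypothetical, not part of the original system):

```python
def eog_to_angle(delta_uv, sensitivity_uv_per_deg=10.0, linear_limit_deg=40.0):
    """Estimate gaze angle (degrees) from an EOG amplitude change (microvolts).

    sensitivity_uv_per_deg is subject-dependent (published values span roughly
    4-20 uV/degree); the linear model is only trusted out to about +/-40 degrees.
    Returns the angle estimate and whether it falls in the stable linear region.
    """
    angle = delta_uv / sensitivity_uv_per_deg
    in_linear_range = abs(angle) <= linear_limit_deg
    return angle, in_linear_range
```

For example, a 100 µV step under this assumed sensitivity corresponds to a 10° saccade, well inside the stable range.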

Figure 1. Typical facial electrode positions

Typical EOG systems use five wet gel electrodes [15,18-21]: two on either side of the eyes, (L) and (R), positioned equidistant from the centre of the eye for horizontal measurements, and two for vertical measurements, one above an eye (U) and the second below that same eye (Lo). The fifth electrode is a reference electrode (Ref) and is placed on an electrically isolated position on the subject's body, such as the forehead, neck or earlobe. The frequency range in which the EOG signals lie is from DC to 40Hz, hence the need for an isolated point of reference; this minimizes system noise from other physiological signals, such as EMG and EEG, that lie in the same frequency range.

When using EOG signals for HMI, the user's experience, in terms of comfort and usability, is of prime importance. A minimum number of electrodes is therefore desirable, minimizing visual obstruction and preparation time. The eye typing systems presented in [18] and [19] show a highly usable experience with short learning times; however, the use of five electrodes is not conducive to comfort, as one electrode is placed on the lower eyelid. A solution to the under-eyelid electrode was presented in [14], where EOG gaze detection was achieved by attaching two arrays of wet gel electrodes to a pair of headphones. In that experiment accuracies were relatively poor: horizontal and vertical accuracies were quoted as 4.4° and 8.3° respectively. While this system leaves the eyes unobstructed, it will suffer the effects of DC drift and long term signal degradation as the electrodes dry out. Excellent levels of wheelchair control have been achieved [14,15], with the highest degree of accuracy in the detection of saccadic movement being 2° in [15]. However, the wet gel electrodes used are not suitable for long term use, since they dry out and can also cause skin irritation. Three-electrode systems have been investigated: a wheelchair control system was presented in [22]. However, this system relied on acquiring only horizontal EOG signals for left and right control, the third sensor serving only as a reference; the remaining system control was based on Electromyography.

The system designed and discussed in the remainder of this chapter relies on three electrodes to detect both horizontal and vertical EOG signals.

Method

A three dry electrode configuration was used; the sensors were spring mounted to a headband for easy application. The physical arrangement of the sensor placement can be seen in Figure 1. Two sensors are positioned on either side of the eyes at the temples, and the third sensor is placed in the centre of the forehead. This configuration eliminates the need for any additional facial electrodes, and the arrangement could easily be transferred onto a pair of glasses or goggles providing the required skin contact points for adequate signal detection.


Figure 1. EOG headband sensor configuration

The sensors used for these experiments have been described in Chapter X. Physically, each comprises a 5mm electrode with a 2.5mm guard around it, making the total sensor surface in contact with the user 10mm in diameter. Internally, the sensor hardware consists of a two stage amplifier providing x114 amplification of the EOG signals acquired at the skin. A first order highpass filter and a first order lowpass filter in the sensor provide a bandwidth from 0.1 Hz to 86 Hz. The output signals from the sensors were taken to a control box with additional variable amplification and anti-aliasing filtering before being converted from analogue to 16-bit digital signals by a National Instruments USB-6212 real-time data acquisition card. LabVIEW software [23] was used to acquire the data and perform all subsequent signal processing; the program code is shown in Appendix A. In software, a 3rd order 40Hz Butterworth lowpass filter was applied to each data channel prior to extracting the EOG signals, eliminating unwanted noise due to movement artifacts or EMG signals. Another lowpass smoothing filter was applied to the final EOG signals in LabVIEW before they were displayed on the custom graphical user interface. The signals were also recorded to data files for further interpretation. The horizontal signal was determined using a simple differential measurement: the right sensor (R) minus the left sensor (L). The vertical signal required a more complex measurement: the centre sensor (C) was referenced against a conditioned signal generated by summing the right sensor (R) and the left sensor (L). Figure 1. shows the block diagram of the entire system.
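The channel arithmetic can be sketched as follows. The function name is hypothetical, and referencing the centre channel against the mean of the left and right channels is one plausible reading of the conditioned sum; it is not the original LabVIEW implementation:

```python
import numpy as np

def derive_eog_channels(left, right, centre):
    """Combine three sensor channels into horizontal and vertical EOG traces.

    Horizontal: simple differential, right minus left.
    Vertical: centre sensor referenced against the mean of the left and
    right channels (an assumed form of the conditioned L+R reference).
    """
    left, right, centre = (np.asarray(a, dtype=float) for a in (left, right, centre))
    horizontal = right - left
    vertical = centre - 0.5 * (right + left)
    return horizontal, vertical
```

Averaging L and R gives a common-mode reference at roughly eye level, so the vertical trace is largely insensitive to purely horizontal saccades.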

Figure 1. EOG experimental block diagram

An experiment was designed to measure EOG signals over a gaze range of ±40° on both the horizontal and vertical planes. Initial experiments were carried out on a single user. A series of horizontal and vertical eye movements was made: gaze was directed to the circles at 10°, 20°, 30° and 40° to the left of centre, starting at centre and returning to the centre position after each excursion. The procedure was repeated with the same eye movements to the right, up and down from centre. Figure 1. shows the gaze position grid used during the experiments. The grid was on a flat surface, so the gaze points are not linearly spaced on the diagram; the points correspond to the angle the eye rotates through to change gaze position.

Figure 1. Gaze position grid

Results

Both the recorded horizontal and vertical EOG signals were plotted using Matlab [24]. In Figure 1.(a), data are presented for eye movements to the left of the centre position followed by eye movements to the right of the centre position. Figure 1.(b) shows the vertical EOG signals for eye movements up, then down, from the centre position. The initial falling edge in Figure 1.(a) at three seconds is the recorded saccadic eye movement corresponding to a gaze angle of 10° to the left of centre. Each subsequent falling edge increases in amplitude until the gaze direction is shifted to the right of centre (at 12 seconds). It should be noted that the decay in signal amplitude seen directly after each saccade is associated with gaze fixations; the decay is due to the capacitive nature of sensors coupled to the skin.

Figure 1. EOG signals at 10° intervals from 0° to ±40°: (a) Horizontal EOG (b) Vertical EOG

It is easily seen from these graphs that as gaze angle increases, EOG amplitude also increases, and that a relationship exists between EOG amplitude and gaze position. To identify this relationship, the step change in amplitude is plotted against eye displacement. Figure 1. shows this relationship for both left and right eye movements. The data are highly correlated, with a linear relationship between step change and gaze angle over the entire range of ±40°, and there is good agreement between the left and right data sets. From these data, the accuracy of the measured saccadic movements is 1° for the horizontal axis.
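The fit behind such a plot is an ordinary least-squares line. The sketch below uses invented illustrative data, not the recorded measurements; it shows how a sensitivity (slope) and an accuracy figure in degrees can be extracted:

```python
import numpy as np

# Illustrative (invented) data: saccadic step amplitude (uV) at each gaze angle.
angles_deg = np.array([-40, -30, -20, -10, 10, 20, 30, 40], dtype=float)
steps_uv = np.array([-410, -310, -205, -105, 100, 205, 310, 405], dtype=float)

# Least-squares linear model: step = sensitivity * angle + offset.
sensitivity, offset = np.polyfit(angles_deg, steps_uv, 1)

# Residuals, expressed in degrees via the slope, give an accuracy estimate.
residuals_deg = (steps_uv - (sensitivity * angles_deg + offset)) / sensitivity
accuracy_deg = float(np.max(np.abs(residuals_deg)))
```

Here `sensitivity` plays the role of the µV/degree figure quoted in the text, and `accuracy_deg` bounds the worst-case deviation of the linear model.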

System noise level was assessed during this experiment: the user held the central gaze position so that the resting potentials generated at the sensor inputs could be observed. Noise in the system is attributed both to electronic noise in the sensor and to any muscle noise picked up by the sensors; the muscle noise is limited by the system bandwidth. Throughout this experiment the noise level was less than 10µV.

Figure 1. Step size versus angular displacement for eye movement left and then right.

Figure 1. shows the corresponding data set for the vertical axis, where gaze angles up to ±40° are plotted against the step change in EOG amplitude of the measured saccades. Again there is a strong correlation and a linear relationship between the saccadic amplitudes and angles, for this subject, over the entire range. However, it should be highlighted that the accuracy for the vertical data, at 2.5°, is not as good as for the horizontal. This is in agreement with much of the published literature, which also finds the vertical response less linear at the extremities of vision, usually past 30° of eye deflection [7,15,16,25].

Figure 1. Step size versus angular displacement for eye movements up and then down.

These initial experiments yielded sensitivities for the horizontal and vertical eye movements of 10.3µV/degree and 9.9µV/degree, with standard deviations of 2.6° and 5.6° respectively. These sensitivities fall within the range of published findings [7,16]; however, an EOG controlled system cannot be developed based on a single set of sensitivity measurements from one individual.

Further Experiments

To determine the potential capability of the Electric Potential Sensor for an EOG based human machine interface, the above experiment was repeated across a range of subjects. Some users wore glasses during the experiment, some wore contact lenses and others had no eyesight problems at all. Users were instructed to place the headband with the EOG sensors on their own heads, with no assistance, so that the electrodes sat on the temples and forehead. The users were positioned in front of the gaze position grid shown in Figure 1. and instructed to make a series of horizontal and vertical eye movements, directing their gaze to the circles at 10°, 20°, 30° and 40° to the left of centre, starting at centre and returning to the centre position after each excursion. Each user was asked to repeat the procedure with the same eye movements to the right, up and down from centre.

Each subject carried out three trials on each gaze direction in order to investigate the repeatability of measurements. Each trial was recorded to data files for further interpretation.

Results

The step change in amplitude at each 10° increment in eye position was extracted from the trial data for each subject, and the relationship between EOG amplitude and eye gaze position was analysed. Figure 1. shows the experimental results for the horizontal data for all eight users. It is clear from the graph that a linear relationship holds across a range of users; however, the EOG data vary greatly in sensitivity. The maximum sensitivity found was 16.6µV/degree (red line), compared with a minimum of 4.7µV/degree (blue line). The deviation from the linear best fit model averaged 1.9°, with an average accuracy of less than 1°.

Figure 1. Horizontal EOG amplitude versus angular displacement across eight subjects

Similar results were achieved on the vertical plane: the deviation from the linear best fit model was 5.3° and the average resolution was 2°. The corresponding minimum change in EOG signal amplitude per degree of eye displacement was 3.52µV/degree (blue line), and the maximum was 9.86µV/degree (red line), as seen in Figure 1.

Figure 1. Vertical EOG amplitude versus angular displacement across eight subjects

System noise level was also assessed during this experiment: the users held the central gaze position so that the resting potentials generated at the sensor inputs could be observed. Noise levels during these observations never exceeded 10µV for any user.

Results across the subjects show that the linear relationship between EOG saccades and angular displacement of the eye holds true. To exploit this relationship, the EOG signals must be normalised so that these variations do not disrupt levels of control within an HMI system. A calibration routine was developed and is discussed in the following section.

Calibration and Normalisation

When the headband (Figure 1.) is initially applied, the sensors need around one second to settle to a constant level due to their high input impedance. Once a constant level below the noise threshold is achieved, the calibration routine runs automatically. The routine consists of a circle that moves to several locations on the screen; these gaze locations correspond to the maximum eye deflections, on both the horizontal and vertical planes, that are required for the system. Figure 1. shows the positions of the circle on screen.

Figure 1. During the calibration routine the user is instructed to make a series of eye movements; left, right, down and up starting and ending at the central resting position in the middle of the screen.

Figure 1. shows a typical EOG signal for both the horizontal and vertical eye movements recorded during the calibration routine. There is no crosstalk between signals: they are completely independent of one another. The peak signal excursion is used to compute the deviation from the screen centre point; the amplitude of these peaks will depend on the position of the electrodes and the position of the user with respect to the screen. The signal-to-noise ratio for the vertical signals is clearly lower than for the horizontal, consistent with the lower angular resolutions seen both with this method and with others reported in the literature.

Figure 1. EOG Signals recorded during a calibration routine: (a) Horizontal EOG, negative signal shows eyes looking far left, positive shows eyes far right, (b) Vertical EOG, negative is down, positive is up.

During the calibration routine, the real-time EOG signals are buffered in software until the routine comes to an end, at which point the user has completed the maximum eye deflections for the system. These maximum deflections can be used to normalise the EOG signals for use in an HMI system. Depending on the position of the user, the maximum and minimum deflections might not be the same: if a user sits slightly to the right of the origin, the left deflection will be larger than the right, and the same relationship holds for the vertical plane. Thus, saccadic eye movements must be normalised to their individual maximum and minimum values, corresponding to gaze locations far right, left, up and down. Figure 1. shows the flow chart for how the calibration routine runs on program start-up to obtain maximum and minimum EOG values for both the horizontal and vertical planes.
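The per-direction normalisation can be sketched as below; the function name and the clipping behaviour are assumptions made for illustration:

```python
def normalise_saccade(amplitude, cal_max_positive, cal_max_negative):
    """Normalise a saccade amplitude to [-1, 1] using calibration extremes.

    Positive and negative deflections are scaled separately because the
    calibrated maxima need not be symmetric (e.g. the user sits off-centre).
    Values beyond the calibrated range are clipped to +/-1.
    """
    if amplitude >= 0:
        value = amplitude / cal_max_positive
    else:
        value = amplitude / abs(cal_max_negative)
    return max(-1.0, min(1.0, value))
```

With a calibrated right maximum of 300 µV and left maximum of -200 µV, a 150 µV saccade maps to 0.5 and a -100 µV saccade maps to -0.5.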

Figure 1. Calibration routine flow chart

Feature Extraction

It is important that valid features of an EOG signal can be extracted for use in an HMI system. Valid features are the saccadic changes in EOG amplitude corresponding to changes in gaze location, as well as eye blinks, which have not been discussed until this point. Eye blinks are another exploitable characteristic of EOG signals that makes them exceptionally well suited to HMI control systems. Blinks are a combination of EMG signals and vertical saccades; they are greater in amplitude than the maximum intentional eye movement deflection on the vertical plane, and their time course differs markedly from that of normal saccadic eye movements. A typical eye blink consists of a quick rise and fall in amplitude, the entire process lasting between 100 and 200 milliseconds [26]. Since the characteristics of blinks differ greatly from saccades, they can easily be used as a control signal, for example for mouse clicks [18,27]. Figure 1. shows EOG signals for some eye movements that occur during human computer interaction.

Figure 1. Example EOG signals occurring during Human Machine Interaction: (a) Horizontal (b) Vertical

The data seen in Figure 1. were recorded from LabVIEW; the real-time signals were fed into the feature extraction algorithm shown in Figure 1. The algorithm determines the amplitude of the saccades, and thus the change in gaze angle and the direction of eye movement made by the user; it also determines whether the user blinks. Features are extracted by identifying valid saccades: a valid saccade has occurred if the change in amplitude lasts longer than 30ms, as discussed in Section 1.1. The saccades are divided by the maxima determined during calibration, giving values normalised to ±1. Blinks also pass this timing threshold, but are further identified by their amplitude being greater than the maximum vertical deflection.
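The decision logic can be sketched as a simple classifier. The thresholds follow the text (30 ms minimum duration; blink amplitude above the calibrated vertical maximum, lasting up to roughly 200 ms), but the function itself is a hypothetical stand-in for the LabVIEW code:

```python
def classify_event(duration_ms, amplitude, cal_max_vertical):
    """Classify one detected EOG amplitude change (illustrative sketch).

    - Short, large vertical excursions beyond the calibrated maximum are blinks.
    - Other changes shorter than 30 ms are rejected as noise.
    - Sustained changes within the calibrated range are valid saccades.
    """
    if abs(amplitude) > cal_max_vertical and duration_ms <= 200:
        return "blink"
    if duration_ms < 30:
        return "noise"
    return "saccade"
```

A saccade classified here would then be divided by the calibration maximum to yield a normalised ±1 control value.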

Figure 1. EOG feature extraction algorithm

The output from the feature extraction algorithm for the sequence of eye movements in Figure 1. is outlined as a time series in Table 1.

Table 1. EOG feature extraction results

Time (seconds) | Horizontal EOG   | Vertical EOG     | Gaze Indication
0 to 2         | No change > 30ms | 2x Blinks        | Control Signal
2.5            | Rising edge      | Falling edge     | Right and Down
2.5 to 3.5     | No change > 30ms | No change > 30ms | Fixation
3.5            | Falling edge     | Rising edge      | Left and Up
3.5 to 5       | No change > 30ms | No change > 30ms | Fixation
5 to 6         | No change > 30ms | 2x Blinks        | Control Signal
6.5            | Falling edge     | Falling edge     | Left and Down
6.5 to 9       | No change > 30ms | No change > 30ms | Fixation

Conclusion

This chapter has demonstrated the ability to acquire high quality, repeatable EOG signals across a variety of subjects using three electric potential sensors. It has been shown that, with the configuration used, EOG signals are comparable in resolution and linearity with previously published systems requiring more than three electrodes [14,15]. Users were required to put on the apparatus themselves, minimizing preparation time; with little instruction, all were capable of placing the headband in a position appropriate for adequate EOG signal acquisition.

The linearity of EOG signals with respect to eye displacement throughout all of the trials shows that these EOG signals are highly suitable for use as an HMI control signal. However, due to the variability in sensitivities, it is necessary to incorporate calibration and normalisation techniques so that any user can benefit from the same control algorithms.

A method for calibrating and normalising the EOG signals was described. This technique demonstrates how to overcome the variability in individual eye movements. Once normalised to ±1, the detected eye movements can be converted for any control algorithm: scaled into pixel locations, used as binary control inputs, or used as state machine variable control inputs. A feature extraction method was also presented that discriminated between valid saccadic eye movements and eye blinks. It was shown that blinks are an easily exploitable attribute of the eyes, and subsequently of the EOG signals, since they occur only on the vertical plane and display different characteristics from those of simple eye movements. Blinks can then be used for control signals such as mouse clicks or on/off control instructions.

The EOG signal acquisition, calibration and feature extraction techniques described here could easily be applied to any system where human interaction is involved, including communication, mobility and control systems.

Position Sensing

Introduction

Position and movement sensing applications are wide ranging; systems extend from occupancy monitoring for energy efficiency [28], smart homes and care of the elderly [29-31] to security and offender management [32]. Of particular interest with respect to Human Machine Interfacing are hand position and gesture sensing applications [33-38]. Hand tracking can be achieved by many different methods, including wearable sensors such as accelerometers and bending sensors [39], and external sensors including video cameras [34], infrared proximity sensors [35] and electric field sensors [38]. The most prevalent technology used in hand tracking systems is video, due to its commercial availability; however, it comes with drawbacks in terms of overall usability. These systems require the user to stand directly in front of the sensing area, with the hand placed in a position easily detected by the camera. If the ambient light changes, a rapid hand movement occurs or other skin coloured objects are detected within the scene, the system will cease to operate as desired [33]. Infrared reflection techniques suffer the same disadvantages; although these can be reduced by increasing the number of infrared receivers used, this increases cost, and the working range is still small compared with that of video tracking systems [35]. While wearable sensors can overcome these disadvantages, they come with drawbacks of their own: they require the user to wear electronic sensors, which need either lengthy cables or batteries for power. When running on batteries, they are limited by the battery life, quoted as three hours in [40]. Larger, longer life batteries can be used; however, there is a trade-off in size and weight. The use of electric field sensors has been well known since Leon Theremin's musical instrument [41].
There was a long pause in the development of this technology until the 1990s, when researchers sought to overcome the problems discussed above and field sensing technologies became more affordable [42]. Electric field sensing has advantages over the other technologies: it does not rely on line of sight, and can thus detect movement within any light or dark space. Since the sensors rely solely on detecting changes in electric fields, the processing power and data storage necessary are far smaller than those of infrared and video systems [42]. This method of hand tracking does not require the user to wear any type of sensor, although a system has been developed in which the user wears an electrode array [38]. Small scale applications, 5 to 150 square centimetres, exist consisting of a transmitter and a receiver electrode, where a signal is induced into the tracking area creating a dipole field. The signal strength at the receiver varies as the hand moves further into the field; using this configuration, absolute position sensing is achievable [42]. While this technology overcomes many of the problems associated with video, infrared and wearable tracking technologies, it requires additional power consumption and electronic complexity for the transmitting and receiving electrodes [43]. The Electric Potential Sensor has already been proven capable of recognizing human movement passively by detecting changes in the amplitude of the ambient 50Hz electric field [44,45], without requiring an induced electric field. Absolute position sensing was achieved in a room sized space, again using only the perturbations in the ambient electric field arising from human movement [46]. This chapter investigates the scalability of the EPS position sensing system for use as a hand tracking application.

Position Sensing with the EPS

Method

In the case of the movement sensing presented in [46], an assumption was made that the object of interest is conducting and in contact with the earth. This assumption allows for the realisation that as the earthed conductor, or human subject, moves around, the distortion to the ambient electric field will be significant. Due to the ultra high input impedance of the EPS, it can be considered an ideal voltmeter, having no effect on the ambient electric field lines. Then, as the conducting object approaches and moves away, the sensor output should vary in amplitude according to the near field fall-off relationship of 1/r, where r is the distance between the sensor and the conducting object. The same assumption can be applied to the hand position sensing experiment; hand position may then be inferred from the respective output amplitudes of sensors A and B, as shown in Figure 2.

Figure 2. Theoretical hand position diagram

The voltages at sensor outputs A and B, for a hand at position x between sensors placed at ±d, are then defined by Equations (2.):

V_A = k_A/(d + x),    V_B = k_B/(d − x)    (2.)

where V_A and V_B are the sensor output voltages and k_A and k_B are normalisation constants. If the sensors are placed symmetrically such that k_A = k_B = k, the two simultaneous equations can be solved for x:

x = (d² − x²)(V_B − V_A)/2k    (2.)

Constants k_A and k_B are determined by taking a measurement of the ambient 50Hz signal with no conducting presence in the sensor test area. This equation is further simplified since, for small deviations about the origin, the denominator product (d² − x²) is approximately constant:

x ≈ (d²/2k)(V_B − V_A)    (2.)
Based on this theoretical approach, an experiment was designed to measure hand position in an active area 300mm in length. Two electric potential sensors were used, providing x10 amplification over a bandwidth from 30Hz to 20kHz. The outputs of the sensors were digitised at a rate of 5 kSamples/sec into 16-bit signals by a National Instruments USB-6212 real-time data acquisition card. A 3rd order Butterworth bandpass software filter, with a bandwidth from 45Hz to 65Hz, was applied to each channel individually; this narrow bandwidth was chosen since only the perturbations of the 50Hz ambient field were of interest. A program was developed using LabVIEW software (Appendix B - Position Sensing Labview Code) to view the RMS displacement voltages occurring when a hand was placed in the active measurement area. Real-time data were also recorded to data files for further off-line processing. Figure 2. shows a block diagram of the experimental set-up.
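Assuming a symmetric near-field 1/r model for the two sensor outputs, V_A = k/(d + x) and V_B = k/(d − x), the constant k can be eliminated by taking the ratio of the difference to the sum of the two amplitudes. This sketch (function name and the 150 mm half-width default are assumptions, not the original LabVIEW code) illustrates the idea:

```python
def hand_position_mm(v_a, v_b, d_mm=150.0):
    """Infer hand position x (mm) between two sensors from their output
    amplitudes, under the symmetric model V_A = k/(d + x), V_B = k/(d - x).

    (V_B - V_A)/(V_A + V_B) = x/d, so the constant k cancels and no
    per-session calibration of k is needed for this estimate.
    """
    return d_mm * (v_b - v_a) / (v_a + v_b)
```

Under this model, equal amplitudes place the hand at the origin, and a larger V_B pulls the estimate towards sensor B.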

Figure 2. Position sensing block diagram

Results

Normalisation and differential measurements were performed in software; the results shown in Figure 2. correspond to a hand movement from left to right (sensor A towards sensor B) in the test area, at 30mm increments. The hand was placed in the sensor area with the palm to the side, for reference to the increments. Since the sensors are normalised to the ambient electric field, the position equidistant from sensors A and B corresponds to the origin, zero millimetres.

Figure 2. Differential measurement on horizontal plane across field of interest for 30mm incremental hand positions

The results show a linear relationship between ±90mm, with a standard deviation of 5mm. The increased amplitude change at the extremities of the sensor test area results from the hand's proximity to the sensors: as the hand approaches a sensor, the reduction of the 50Hz amplitude corresponds to the steeper-gradient region of the 1/d curve.

Similar measurements were taken on the vertical plane: the sensor test area was rotated 90° and a hand, palm facing towards the body, was placed at 30mm increments from sensor A to sensor B. Sensor B was positioned closer to the body of the subject than sensor A. The results are shown in Figure 2.; it can be seen that when the hand is close to sensor A, a similar change in sensitivity occurs as in the horizontal experiment. When the hand is brought closer to sensor B, and subsequently closer to the body, the sensitivity of the sensor is reduced due to the proximity of the large conductive body behind the sensor. Like the horizontal results, the vertical results show a linear relationship between ±90mm, with a standard deviation of 15.23mm.

Figure 2. Differential measurement on vertical plane across field of interest for 30mm incremental hand positions

Two Dimensional Position Sensing

With the experimental relationship between target position and differential output investigated and supported by the results on both one-dimensional axes x and y, the next logical step was to combine the two axes to create a two-dimensional hand sensing area. The immediate approach was to configure an area supporting triangulation of hand position using two pairs of sensors (Figure 2.), each pair defining a measurement axis.

Figure 2. Four sensor two-dimensional hand tracking configuration

A two-point calibration technique was used to normalise the system: first, measurements of the ambient electric field with no conducting objects in the target area were taken to determine the constants for each sensor. The user was then instructed to place their hand midway between sensor pair A,B and sensor pair C,D. Sensor maxima were calculated to be twice the midpoint measurements, and each sensor value was normalised to these maxima before the differential measurements were carried out. Target position was then displayed on an XY Graph on a custom Labview GUI (see Appendix B - Position Sensing Labview Code). Initial observations using this configuration were that the sensitivity was limited; this was because the hand is not an ideal target. The hand has complex geometric properties and is attached to an arm; it is not simply an earthed conducting object within the target area. When the hand was moved around the target area there were positions where the arm saturated one of the sensors, mainly sensor D closest to the body, resulting in an inaccurate estimation of target position. There were also issues inferring target position in the four corners, since there the hand was not placed between either of the deterministic sensor pairs. Figure 2. shows the resulting position vectors inferred by the four sensor configuration when the hand was traced horizontally across the sensing area at three vertical intervals. The vertical intervals relate to normalised y positions +0.6, 0 and -0.6 at the centre of the palm.
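The two-point calibration and normalised differential measurement can be sketched as follows. This is a hedged sketch of the procedure described above, not the Labview implementation; the function and variable names are chosen here for illustration.

```python
def sensor_maxima(midpoint_readings):
    """Each sensor's maximum is taken to be twice its reading with the
    hand held at the midpoint of the sensing area (two-point calibration)."""
    return [2.0 * m for m in midpoint_readings]

def differential_position(v_a, v_b, max_a, max_b):
    """Normalise each sensor to its calibrated maximum, then difference
    the pair; the result is a dimensionless position, roughly in [-1, +1],
    with 0 at the point equidistant from both sensors."""
    return v_a / max_a - v_b / max_b

# Example: sensors with slightly different midpoint readings still agree
# on the origin once normalised.
max_a, max_b = sensor_maxima([1.0, 1.2])
print(differential_position(1.0, 1.2, max_a, max_b))  # 0.0
```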

Figure 2. Inferred hand position for four sensor configuration

To overcome the issues displayed by the four sensor configuration, the use of a single y-axis sensor was investigated. Figure 2. shows the results of a single sensor one-dimensional experiment in which the hand was moved away from the sensor at 30mm increments. The sensing area was positioned directly in front of the user, with the palm facing towards the body.

Figure 2. Single sensor measurements on vertical plane across field of interest for 30mm incremental hand positions

The results show a clear 1/d reduction in RMS field voltage using a single sensor, with a standard deviation of 20.88mm. While this is not a linear response, as is evident using the differential measurements, it is still a useful relationship. Using a single sensor has the advantage that disturbances in the far field will have little to no effect on the hand sensing area; thus a person moving around behind or beside the user will not cause interference to the system. In order to exploit the single sensor response to hand movement, a more complex position determination technique was required to get as close as possible to a linear relationship between hand position and sensor voltage. The single sensor was calibrated in the same way as the differential pair, with two points measured: first the ambient field, to determine the normalisation constant, and second with the hand placed at the centre of the target area, this measurement being taken as half the sensor's maximum voltage for the sensing area. The normalised signal of the single sensor response is centred on positive one, swinging between the limits of zero and two. For appropriate position inference this needs to be centred about zero, corresponding to the origin of the sensing area; this scaling is achieved by subtracting one from the normalised sensor value. The positive vertical portion of the signal is capable of adequate coverage of the sensor area; however, the negative vertical portion needs additional scaling to reach the bottom extremity of the sensing area. It was found that an additional scaling factor of four was sufficient to cover the sensing distance required. Full vertical coverage was achieved by employing the following relationships:

y = v_n - 1,  for v_n ≥ 1    (2.)

y = 4(v_n - 1),  for v_n < 1    (2.)

where v_n is the normalised sensor value and y the inferred vertical position.
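The piecewise scaling just described can be sketched as a single function (an illustrative reconstruction; symbol names are chosen here, not taken from the Labview code):

```python
def vertical_position(v_norm):
    """Map a normalised single-sensor value (range 0..2, centre = 1) onto
    a symmetric vertical position estimate: subtract one to centre the
    signal on zero, then stretch the negative half by a factor of four."""
    centred = v_norm - 1.0
    return centred if centred >= 0.0 else 4.0 * centred

print(vertical_position(1.0))  # 0.0  (hand at the centre of the area)
print(vertical_position(1.5))  # 0.5  (positive half, unscaled)
print(vertical_position(0.9))  # ≈ -0.4 (negative half, scaled by four)
```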

Figure 2. shows the results of normalising and calibrating the single sensor response from Figure 2. using the relationships described by Equations (2.) and (2.). We can see the results are close to a linear model, with a standard deviation of 11.19mm.

Figure 2. Normalised and calibrated vertical sensor response

Following suitable results for inferring hand position using a single sensor, an improved two-dimensional sensor configuration was developed, shown in Figure 2. Three single vertical sensors were employed, along with a pair of differential sensors for horizontal position detection; the full coverage sensing area was 300mm by 250mm.

Figure 2. Five sensor two-dimensional hand tracking configuration

Calibration of this system was achieved by the same two-point calibration technique described throughout this chapter. An amplitude priority algorithm was developed to accomplish accurate position determination with full coverage over the sensing area. To achieve this, the sensing area was divided into quadrants in order to infer where the hand was. Figure 2. shows the control algorithm flowchart outlining the priority method; the Labview code can be found in Appendix B - Position Sensing Labview Code. If the x position is negative, only sensor Y1 or Y2 can infer hand position on the y axis; when the x position is positive, only sensor Y2 or Y3 can infer the y position. The vertical sensors are scaled as per Equations (2.) and (2.), based on whether the hand is in a positive or negative y position. Normalised hand position was displayed on an XY Graph on a custom Labview GUI. Figure 2. shows the resulting position vectors inferred by the five sensor configuration when the hand was traced horizontally across the sensing area at three vertical intervals. The vertical intervals relate to normalised y positions +0.6, 0 and -0.6 at the centre of the palm. It can be seen from the graph that there is a clear crossover point between the positive and negative x halves of the sensing area. It should be noted that the appearance of noise on the signals is due to the use of single sensors on the y axis; the noise increases as the hand is positioned further from the sensors. This common-mode noise does not appear in the four sensor configuration because of the differential measurements. The trace at +0.6 on the y axis shows some non-linearity, which is consistent with the relationship shown in Figure 2.
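The sensor-selection step can be sketched as below. The text states only that the sign of x restricts the choice to Y1/Y2 or Y2/Y3; how the algorithm arbitrates within each pair is not spelled out, so this sketch assumes the larger-amplitude (closer) reading takes priority, in line with the "amplitude priority" name, and assumes a Y1=left, Y2=centre, Y3=right placement.

```python
def select_y_reading(x_pos, y1, y2, y3):
    """Pick which vertical sensor infers the y position.

    Negative x restricts the choice to sensors Y1 or Y2; positive x to
    Y2 or Y3. Within each pair the stronger reading is assumed to win
    (amplitude priority) - an assumption for illustration only.
    """
    candidates = (y1, y2) if x_pos < 0.0 else (y2, y3)
    return max(candidates)

# Hand in the left half, closest to Y1: the Y1 reading is used.
print(select_y_reading(-0.5, 1.4, 1.1, 0.9))  # 1.4
```

The selected reading would then be scaled with the piecewise relationship of Equations (2.) and (2.) to give the final y position.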

Figure 2. Absolute position determination flowchart

Figure 2. Inferred hand position for five sensor configuration

Mouse Control

To finalise the absolute hand position tracking system for use as a Human Machine Interface, interaction with a mouse cursor was preferred over a simple XY graphical implementation. Labview allows communication with the computer operating system such that mouse cursor movement and click functionality can be controlled. The first requirement for scaling the x,y positions into pixel locations was to determine the screen resolution of the device being used, which was accomplished using Labview. Once determined, the x,y position was re-scaled to lie within a range between zero and one, then multiplied by the screen resolution, resulting in an on-screen pixel location. Generating a mouse click event was achieved by holding the cursor, or hand, in a sensitive area for a set amount of time. The sensitive area was scalable such that it could change with pixel resolutions on different machines; optimum results were achieved with the active click area between 50 and 100 square pixels. The time required to hold the position to initiate a mouse click was 500ms. However, these variables could be changed for more or less resolution. Figure 2. shows the program flowchart for controlling the mouse cursor; the Labview code can be found in Appendix A - EOG Labview Code.
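The cursor mapping and dwell-click logic can be sketched as follows. This is an illustrative reconstruction, not the Labview implementation: the screen resolution, dwell radius and sign conventions are example values, and the OS interaction is omitted.

```python
DWELL_MS = 500        # hold time required to trigger a click (per the text)
DWELL_RADIUS_PX = 50  # side of the click-sensitive pixel area (assumed value)

def to_pixels(x_norm, y_norm, width=1920, height=1080):
    """Map normalised positions in [-1, 1] to screen pixel coordinates."""
    px = int((x_norm + 1.0) / 2.0 * (width - 1))
    py = int((y_norm + 1.0) / 2.0 * (height - 1))
    return px, py

def dwell_click(samples):
    """samples: list of (t_ms, px, py) cursor samples in time order.
    Returns True once the cursor has stayed within the sensitive area
    for DWELL_MS milliseconds."""
    t0, x0, y0 = samples[0]
    for t, x, y in samples[1:]:
        if abs(x - x0) > DWELL_RADIUS_PX or abs(y - y0) > DWELL_RADIUS_PX:
            t0, x0, y0 = t, x, y  # moved outside the area: restart the timer
        elif t - t0 >= DWELL_MS:
            return True
    return False

print(to_pixels(0.0, 0.0))  # (959, 539): the centre of a 1920x1080 screen
```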

Figure 2. Mouse pointer control flowchart

Usability Testing

Due to the complex physical properties of the hand it is difficult to quantify errors for this system, thus a usability study was completed. A group of 15 students at the University of Sussex volunteered to try the hand tracking Human Machine Interface. The usability test consisted of using the mouse control interface and a large icon predictive text typing program called Dynamic Keyboard shown in Figure 2.. Click sensitivity was set to 100 square pixels and the click time was set to 500ms. All tests were carried out in an open laboratory.

Figure 2. Screen shot of the Dynamic Keyboard, developed by CanAssist [47]

Students were given a short demonstration of the mouse control system with instructions on how the mouse click functionality was implemented. The system was calibrated to the user, and they were given as much time as they needed to become comfortable with the interface and the Dynamic Keyboard functionality. Three students were standing up when using the system; the remaining students sat in a chair and either positioned themselves directly in front of the sensing area or slightly to the side. The first task was to spell 'hello' with no errors; the second task was to spell the sentence 'the car is red'. Users were asked to rate the system numerically between one and five (1=very difficult, 5=very easy) in terms of overall ease of 'Learning', 'Control', and 'Comfort'. All users were also asked some questions on the cursor coverage area, the click sensitive area and the click event timing requirement. Results of the usability test are tabulated below. All users were able to spell 'hello' with no errors in under one minute; the sentence was a more difficult task, with 75% of users taking up to five minutes to complete it, while the remaining 25% were unable to complete or gave up on the task. All users commented that spelling out a sentence with their hand hovering for mouse functionality caused tiredness in their arm. Additionally, all users agreed that the click sensitive area and time could be decreased once familiar with the functionality of the system.

Table 2. Results of hand sensing mouse control usability test

Metric      Score: 1    2    3    4    5    Total
Learning           -    -    5    7    3    77.3%
Control            -    2    6    5    2    69.3%
Comfort            7    4    3    1    -    37.3%

The total score for each category was determined by multiplying each score by the number of users who chose it and summing; with 15 users, a maximum of 75 points is available for each category. From the results we can see that most of the users were happy with the learning process and the control of the system; however, comfort of use scored very low.
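The scoring rule can be checked with a few lines of Python. Note that the assignment of the listed user counts to score columns is inferred here from the stated totals (15 users, 75-point maximum), since the original table layout was flattened.

```python
def category_total(counts_by_score, n_users=15, max_score=5):
    """counts_by_score maps each score (1-5) to the number of users who
    chose it; the total is the weighted sum as a percentage of the
    maximum possible points (n_users * max_score = 75)."""
    points = sum(score * n for score, n in counts_by_score.items())
    return 100.0 * points / (n_users * max_score)

print(round(category_total({3: 5, 4: 7, 5: 3}), 1))        # 77.3 (Learning)
print(round(category_total({2: 2, 3: 6, 4: 5, 5: 2}), 1))  # 69.3 (Control)
print(round(category_total({1: 7, 2: 4, 3: 3, 4: 1}), 1))  # 37.3 (Comfort)
```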

Discussion and Further Work

[Note for further work: more on applications to healthcare, room monitoring for the elderly, bed monitoring etc., to tie in with the thesis title.]

In this chapter it has been demonstrated that the EPS is capable of small scale hand position sensing using an entirely passive measurement technique, by sensing changes to the ambient electric field. It has been shown that single-axis hand position can easily be determined by a pair of sensors and a simple differential measurement. A two-dimensional hand tracking system using two orthogonal differential sensor pairs was presented, which exhibited some impractical limitations, as discussed. A single sensor solution was then presented; although the sensor response was non-linear with respect to hand position, it was still a viable relationship to exploit for hand position sensing in a small area. Agreeable results were achieved using a five sensor configuration: a differential pair for horizontal position detection, and three single y-axis sensors for vertical detection. Full coverage hand tracking over a 300mm by 250mm space was achieved in an open laboratory environment.

A method for controlling the mouse cursor was presented, and a usability test interfacing with large-icon predictive text typing software was carried out. Results of the usability test show that the mouse control system was easy to learn and use, and that ease of use increased over time. However, in general it was thought that this was not an ideal substitute for a mouse, due to having to hover one's hand in the sensing area. This valuable feedback will lead to further work on the system to incorporate gesture recognition capability, so that functionality is achieved through simple, discrete hand movements rather than sustained hovering.


