Graduation Year

2019

Document Type

Dissertation

Degree

Ph.D.

Degree Name

Doctor of Philosophy (Ph.D.)

Degree Granting Department

Geography

Major Professor

Joni Firat, Ph.D.

Committee Member

Steven Reader, Ph.D.

Committee Member

Philip Van Beynen, Ph.D.

Committee Member

Elizabeth Schotter, Ph.D.

Keywords

Eye Tracking, GIScience, Similarity Analysis, Spatiotemporal

Abstract

Similarity analysis is the subject of several studies that primarily collect, process, and analyze movement data of various kinds, including the movements of animals, humans, vehicles, hurricanes, and eyes. These movement studies typically focus on the characteristics of a mobile entity (moving object) over time and space. Such studies are usually interested in tracking changes to the moving object over time; the object's size is therefore irrelevant, and each mobile object is treated as a point moving through time and space. All of these points share two attributes: location and the timestamp of that location. Despite the similarities between these movement datasets, there is, to date, little evidence of collaborative research across the fields that study them. This dissertation sets out to bridge the gap between the various studies of movement similarity. The goal is to create a method that can be applied to several types of datasets, including geospatial moving entities (e.g., animal movements) and non-geospatial moving entities (e.g., eye movements). By examining the existing similarity analysis methods used by scholars of these two types of moving entities, a framework is proposed upon which the Mobile Event Similarity Index (MESI) is developed.

MESI is a method of analysis that quantifies the similarity between two or more mobile events. It measures how similar the movement datasets are based on a set of user-defined parameters, which can be sample-based or trial-based. Possible parameters include distance, velocity, direction, type of environment, total trajectory length, and reaction time to external stimuli, among others. The method produces a similarity index at the local, total, and global levels for each selected parameter, and the user is free to choose whichever parameters are desired. Each similarity index is a value between 0 and 1, reflecting low to high similarity. MESI is flexible enough to be applied to datasets with multiple trials; in that case, each individual trial receives a similarity index in addition to each parameter. The combination of all trial and parameter similarity indices is termed the total similarity index, and the overall similarity of the two movement datasets is termed the global similarity index; this global similarity index is the Mobile Event Similarity Index (MESI) of the two datasets. This flexibility allows the method to be applied to a wide variety of movement datasets.
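
The abstract does not spell out MESI's formulas, so the following Python sketch is only one possible reading of the local/total/global hierarchy described above, assuming that each parameter's samples are compared by an absolute difference normalized to the observed value range and that indices are combined by simple averaging. The function names (parameter_similarity, mesi) and the data layout (one dictionary of parameter values per trial) are illustrative assumptions, not the dissertation's implementation.

import numpy as np

def parameter_similarity(a, b):
    # Hypothetical per-parameter (local) similarity: compare overlapping
    # samples and map the absolute difference, normalized by the combined
    # value range, into [0, 1] (1 = identical, 0 = maximally different).
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    n = min(len(a), len(b))
    a, b = a[:n], b[:n]
    combined = np.concatenate([a, b])
    span = combined.max() - combined.min()
    if span == 0:
        return 1.0  # both series are the same constant value
    return float(np.mean(1.0 - np.abs(a - b) / span))

def mesi(trials_a, trials_b, parameters):
    # Hypothetical aggregation: a local index per (trial, parameter) pair,
    # a total index per trial (mean over parameters), and a single global
    # index (mean over trials) standing in for the overall MESI value.
    n_trials = min(len(trials_a), len(trials_b))
    local = {(t, p): parameter_similarity(trials_a[t][p], trials_b[t][p])
             for t in range(n_trials) for p in parameters}
    total = {t: float(np.mean([local[(t, p)] for p in parameters]))
             for t in range(n_trials)}
    global_index = float(np.mean(list(total.values())))
    return local, total, global_index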

In the current research, MESI is applied to two different types of datasets. First, it is applied to a short bird-tracking dataset, in which four parameters were selected and MESI was used to measure the similarity of two female Mallards tracked over one hour. As expected, the results showed that the Mallards have very similar velocities when they are actually moving, while the habitat types they occupied during tracking were very different, a distinction MESI is able to capture. The second demonstration was performed on a full eye tracking dataset from a Visual World Paradigm task collected by a language perception lab. The data were collected from 64 participants in four groups of 16, and each subject completed a total of 36 trials. For this study, forty-seven trial-based (global) parameters and eight sample-based parameters were selected. The trial-based parameters include first fixation duration, first fixation region, first fixation time, total time in each region, the number of times the eyes moved from one region to another, the number of times the eyes remained in the same region, and others; all trial-based parameters are identified as Eye Movement Parameters in this research. The sample-based parameters are fixation duration, fixation time, saccade duration, saccade distance, saccade direction, and three saccade velocity parameters (average, minimum, and maximum).
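
Continuing the same assumptions as the sketch above, two subjects could be compared on a subset of the sample-based parameters listed here; the values below are invented placeholders for illustration only, not data from the study.

# Invented placeholder data: each trial maps a sample-based parameter
# to its per-sample values for that trial.
subject_a = [{"saccade_duration": [42, 38, 51], "saccade_distance": [1.8, 2.4, 0.9]},
             {"saccade_duration": [36, 44, 40], "saccade_distance": [2.1, 1.5, 1.2]}]
subject_b = [{"saccade_duration": [45, 35, 48], "saccade_distance": [2.0, 2.2, 1.0]},
             {"saccade_duration": [39, 41, 43], "saccade_distance": [1.9, 1.6, 1.4]}]

local, total, global_index = mesi(subject_a, subject_b,
                                  parameters=["saccade_duration", "saccade_distance"])
print(global_index)  # a single value in [0, 1]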

MESI generates individual-level similarity indices for each pair of datasets across all subjects, as well as local, total, and global similarity indices for trials and parameters. The parameters that played the most critical role in the similarity of the movement datasets are the Eye Movement Parameters, saccade maximum velocity, and saccade duration, all of which showed a higher average similarity across all groups and trials. On the other hand, saccade direction and minimum saccade velocity showed a much lower average similarity and higher variability. Overall, this research demonstrates a robust method of spatiotemporal similarity analysis, grounded in GIScience, that provides multilevel results for analysis across different datasets. MESI has the potential to be applied to other types of movement datasets in many fields.
