# Read Me - Smart Phone Gyro Data Analysis

### Brief Summary of Data and Methodology

For this project I worked with smartphone gyroscope and accelerometer data. I took the data from the training and test sets and combined them into a single summarized, clean, and tidy data set. The high-level steps of the analysis are listed below, followed by an illustrative R sketch:
- Read in the training data from the X_train.txt file. This contained all the measurements but not the labels.
- Read in the features.txt file, which contains the column headings for the measurement data, and used it to set the column names to something descriptive.
- Read in the subjects of the training data from subject_train.txt.
- Read in the activities of the training data from y_train.txt.
- NOTE: each entry in the subjects and activities files corresponds to the same row in the training data.
- Combined the subject and activity ids into a single data frame using cbind.
- Combined the subject/activity data with the measurement data using cbind.
- Repeated the above process for the test data.
- Combined the test and training data using rbind.
- Found the required columns by partial matching on both mean and std.
- NOTE: only matched on mean() and std(), which excludes columns where mean or std appears elsewhere in the name, so that only the pure mean and standard deviation measurements are kept.
- Combined the mean and std logical vectors, set the first two entries (the subject and activity id columns) to TRUE, and coerced the result back to a logical vector.
- Applied the logical vector to the data set to filter out the unnecessary columns.
- Read in the activity labels.
- Merged the labels into the data using the activity ids as the key.
- Re-arranged the columns to get them into the proper order.
- Wrote the complete data set to a file.
- Used the aggregate function to calculate the mean of each measurement for each subject and activity.
- Wrote this summarized data to a file.
- This resulted in a 180 x 68 data frame (30 subjects x 6 activities = 180 rows; 66 averaged mean/std measurements plus the subject and activity columns = 68 columns).
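The R sketch below is a minimal illustration of the steps above, assuming the 'UCI HAR Dataset' folder has been unzipped into the working directory. The `read_set()` helper and the output file names (`tidy_data.txt`, `tidy_summary.txt`) are placeholders, not necessarily the exact names used by the actual analysis script.

```r
# Illustrative sketch only; paths, the helper name, and output file names are assumptions.
features   <- read.table("UCI HAR Dataset/features.txt", stringsAsFactors = FALSE)
activities <- read.table("UCI HAR Dataset/activity_labels.txt",
                         col.names = c("activityid", "activityname"))

read_set <- function(set) {
  # Read the measurements, subjects, and activity ids for "train" or "test"
  x <- read.table(file.path("UCI HAR Dataset", set, paste0("X_", set, ".txt")),
                  col.names = features$V2, check.names = FALSE)
  subject <- read.table(file.path("UCI HAR Dataset", set, paste0("subject_", set, ".txt")),
                        col.names = "subject")
  activity <- read.table(file.path("UCI HAR Dataset", set, paste0("y_", set, ".txt")),
                         col.names = "activityid")
  # Each row of the subject and activity files lines up with the same row of X
  cbind(subject, activity, x)
}

# Stack the training and test sets
full <- rbind(read_set("train"), read_set("test"))

# Keep the subject, activity id, and only the mean()/std() measurement columns
keep <- grepl("mean\\(\\)|std\\(\\)", names(full))
keep[1:2] <- TRUE
full <- full[, keep]

# Attach descriptive activity names, then put the identifier columns first
full <- merge(full, activities, by = "activityid")
measure_cols <- setdiff(names(full), c("subject", "activityname", "activityid"))
full <- full[, c("subject", "activityname", measure_cols)]
write.table(full, "tidy_data.txt", row.names = FALSE)

# Average every measurement for each subject/activity pair (180 x 68 result)
summary_data <- aggregate(full[, measure_cols],
                          by = list(subject = full$subject, activity = full$activityname),
                          FUN = mean)
write.table(summary_data, "tidy_summary.txt", row.names = FALSE)
```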
### Data Summary

Please see the code book for a detailed summary of the data fields.
# Original Read Me
Human Activity Recognition Using Smartphones Dataset - Version 1.0

Jorge L. Reyes-Ortiz, Davide Anguita, Alessandro Ghio, Luca Oneto. Smartlab - Non Linear Complex Systems Laboratory DITEN - Università degli Studi di Genova. Via Opera Pia 11A, I-16145, Genoa, Italy. [email protected] www.smartlab.ws
The experiments have been carried out with a group of 30 volunteers within an age bracket of 19-48 years. Each person performed six activities (WALKING, WALKING_UPSTAIRS, WALKING_DOWNSTAIRS, SITTING, STANDING, LAYING) wearing a smartphone (Samsung Galaxy S II) on the waist. Using its embedded accelerometer and gyroscope, we captured 3-axial linear acceleration and 3-axial angular velocity at a constant rate of 50Hz. The experiments have been video-recorded to label the data manually. The obtained dataset has been randomly partitioned into two sets, where 70% of the volunteers was selected for generating the training data and 30% the test data.
The sensor signals (accelerometer and gyroscope) were pre-processed by applying noise filters and then sampled in fixed-width sliding windows of 2.56 sec and 50% overlap (128 readings/window). The sensor acceleration signal, which has gravitational and body motion components, was separated using a Butterworth low-pass filter into body acceleration and gravity. The gravitational force is assumed to have only low frequency components, therefore a filter with 0.3 Hz cutoff frequency was used. From each window, a vector of features was obtained by calculating variables from the time and frequency domain. See 'features_info.txt' for more details.
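As a rough, illustrative sketch of this preprocessing (not part of the original README), the R snippet below applies a 0.3 Hz Butterworth low-pass filter to a synthetic signal to split gravity from body acceleration, then cuts the result into 2.56 s windows with 50% overlap. The filter order and the toy signal are assumptions, not the original authors' implementation.

```r
library(signal)  # assumed available; provides butter() and filtfilt()

fs <- 50                                # sampling rate used in the experiments (Hz)
t  <- seq(0, 10, by = 1 / fs)           # 10 s of synthetic data, purely illustrative
total_acc <- sin(2 * pi * 1.5 * t) + 1  # fake body motion plus a constant "gravity" offset

# Butterworth low-pass at 0.3 Hz separates the gravity component (order 3 is an assumption)
lp      <- butter(3, 0.3 / (fs / 2), type = "low")
gravity <- filtfilt(lp, total_acc)
body    <- total_acc - gravity

# Fixed-width sliding windows: 2.56 s (128 samples) with 50% overlap (64-sample step)
win_len <- 128
starts  <- seq(1, length(body) - win_len + 1, by = win_len / 2)
windows <- sapply(starts, function(i) body[i:(i + win_len - 1)])  # one column per window

# Per-window summary features, analogous to the mean()/std() variables in features.txt
window_mean <- colMeans(windows)
window_sd   <- apply(windows, 2, sd)
```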
For each record it is provided:

- Triaxial acceleration from the accelerometer (total acceleration) and the estimated body acceleration.
- Triaxial Angular velocity from the gyroscope.
- A 561-feature vector with time and frequency domain variables.
- Its activity label.
- An identifier of the subject who carried out the experiment.
The dataset includes the following files:

- 'README.txt'
- 'features_info.txt': Shows information about the variables used on the feature vector.
- 'features.txt': List of all features.
- 'activity_labels.txt': Links the class labels with their activity name.
- 'train/X_train.txt': Training set.
- 'train/y_train.txt': Training labels.
- 'test/X_test.txt': Test set.
- 'test/y_test.txt': Test labels.
The following files are available for the train and test data. Their descriptions are equivalent.
- 'train/subject_train.txt': Each row identifies the subject who performed the activity for each window sample. Its range is from 1 to 30.
- 'train/Inertial Signals/total_acc_x_train.txt': The acceleration signal from the smartphone accelerometer X axis in standard gravity units 'g'. Every row shows a 128 element vector. The same description applies to the 'total_acc_y_train.txt' and 'total_acc_z_train.txt' files for the Y and Z axis.
- 'train/Inertial Signals/body_acc_x_train.txt': The body acceleration signal obtained by subtracting the gravity from the total acceleration.
- 'train/Inertial Signals/body_gyro_x_train.txt': The angular velocity vector measured by the gyroscope for each window sample. The units are radians/second.
Notes:

- Features are normalized and bounded within [-1,1].
- Each feature vector is a row on the text file.
For more information about this dataset contact: [email protected]
Use of this dataset in publications must be acknowledged by referencing the following publication [1]
[1] Davide Anguita, Alessandro Ghio, Luca Oneto, Xavier Parra and Jorge L. Reyes-Ortiz. Human Activity Recognition on Smartphones using a Multiclass Hardware-Friendly Support Vector Machine. International Workshop of Ambient Assisted Living (IWAAL 2012). Vitoria-Gasteiz, Spain. Dec 2012
This dataset is distributed AS-IS and no responsibility implied or explicit can be addressed to the authors or their institutions for its use or misuse. Any commercial use is prohibited.
Jorge L. Reyes-Ortiz, Alessandro Ghio, Luca Oneto, Davide Anguita. November 2012.