
Capstone project: Bellabeat Case Study

My first Data Analytics project. The first one just needs to exist; do not seek perfection!

The Assignment (Challenge):

My Why:

To go through the whole cycle of a Data analytics project, which will allow me to:

  • To apply, consolidate, and fill the gaps in the knowledge I gained while studying for the Google Data Analytics Professional Certificate | Coursera
  • To create a first Case study for my Portfolio and upload it to my website: www.lukasdusek.com
  • To complete the first run and further refine the Data Science Project template that I created in Notion while studying this course.
  • To have fun

"To be a data analyst is not only to be a scientist but that it's also to be an artist. The entire world is your canvas. ~ Rishie"

1st Phase: ASK

Goals of the ASK phase:

It’s impossible to solve a problem if you don’t know what it is. These are some things to consider:

  • Define the problem you’re trying to solve
  • Make sure you fully understand the stakeholder’s expectations
  • Focus on the actual problem and avoid any distractions
  • Collaborate with stakeholders and keep an open line of communication
  • Take a step back and see the whole situation in context

Defining the project:

What is the primary question?

  • How can we unlock new growth opportunities for the company? By analyzing smart device data to gain insight into how consumers use their smart devices, particularly one of Bellabeat's products, we can derive insights to help guide the company's marketing strategy. Three questions will guide the analysis:
    1. What are some trends in smart device usage?
    2. How could these trends apply to Bellabeat customers?
    3. How could these trends help influence Bellabeat’s marketing strategy?

Who are the primary and secondary stakeholders?

  • Primary:
    • Urška Sršen: Bellabeat’s co-founder and Chief Creative Officer
    • Sando Mur: Mathematician and Bellabeat's co-founder; a key member of the Bellabeat executive team
  • Secondary:
    • Bellabeat marketing analytics team

What type of problem am I solving?

  • The objective is to identify themes and patterns in users' behaviour, using them as a basis to provide recommendations for marketing campaigns, products, and overall business strategies.
    • Identifying themes
    • Finding patterns
    • Making Predictions

How will I measure the success of the project?

Case Study:

  • At least one prediction or recommendation is derived from the analysis, which can be promptly implemented into the marketing strategy, followed by a test run of the campaign on a smaller scale.

Personal:

  • The project is finished, published, and posted on LinkedIn.

What have I learned during this phase:

When I initially started the assignment, I felt a sense of overwhelm. However, as soon as I began following the Project Template I had created, the task became more manageable, broken down into smaller, approachable segments. Step by step.

Just as the saying goes:

"How do you eat an elephant? One bite at a time!"

I am grateful for the time and effort I invested in creating this template; tackling the assignment would have been significantly more challenging without it.

2nd Phase: PREPARE

Goals of the PREPARE phase:

To decide what data needs to be collected to answer the project questions, and how to organize it so that it is useful.

  • Deciding what metrics to measure
  • Locating data in the database
  • Creating security measures to protect that data

Questions to ask yourself in this step:

1. What do I need to figure out how to solve this problem?

2. What research do I need to do?

Collecting and Organising data:

Data collection and organization were minimal in this case, given that a dataset was provided with the case study assignment.

About the dataset (Fitbit Fitness Tracker Data):

This dataset was generated by respondents to a distributed survey via Amazon Mechanical Turk between 03.12.2016 and 05.12.2016. Thirty eligible Fitbit users consented to the submission of personal tracker data, including minute-level output for physical activity, heart rate, and sleep monitoring. Individual reports can be parsed by export session ID (column A) or timestamp (column B). Variation between output represents the use of different types of Fitbit trackers and individual tracking behaviours/preferences.

Securing data:

I skipped this step as the dataset is public.

Inspecting data:

The main focus of this phase in this particular project was to inspect the provided data. This gave me an overall idea of what kind of data I'm working with.

My inspection consisted of:

  • Content of datasets
  • Source
  • Data format
  • Data structure
  • Different types of Bias
  • ROCCC
  • Data ethics and privacy & Open Data
  • Anonymization
  • Tool Used

The results are summarized below.

Data:
  • dailyActivity - quantitative (data about time, intensity, distance, and calories; no data about the quality or type of activity)
  • dailyCalories - quantitative (number of calories)
  • dailyIntensities - quantitative (intensities, distance)
  • dailySteps - quantitative (total steps)
  • sleepDay - quantitative (total sleep time, total time in bed, number of sleep sessions per day)
  • weightLogInfo - quantitative (weight in kg/pounds; fat (only two entries - unusable); BMI; lacking water, muscle mass, and bone data)
  • hourlyCalories - quantitative (number of calories)
  • hourlyIntensities - quantitative (total and average intensities)
  • hourlySteps - quantitative (number of steps)
  • minuteCaloriesNarrow / minuteCaloriesWide - quantitative (number of calories)
  • minuteIntensitiesNarrow / minuteIntensitiesWide - quantitative (intensity per minute)
  • minuteMETsNarrow - quantitative (metabolic equivalent of task; one MET is the energy used while sitting quietly and is used to indicate the intensity of an activity)
  • minuteSleep - quantitative (sleep in minutes; we lack the "depth" of sleep, whether it was interrupted, etc.)
  • minuteStepsNarrow / minuteStepsWide - quantitative (steps per minute)
  • heartrate_seconds - quantitative (heart rate)

Source: FitBit Fitness Tracker Data | Kaggle, CC0: Public Domain
Storage: personal Google Drive
Data format: secondary data; continuous + discrete; quantitative; nominal
Data structure: structured data (.csv/.xlsx)
Sampling bias? Yes: the source does not provide detailed information about how the sample was created, so I will work with the assumption that the dataset is sampling-biased.
Observer bias? No: the data is gathered by trackers, so there is no observer bias.
Interpretation bias? No: the data is gathered by trackers, so there is no interpretation bias.
Confirmation bias? No: the data is gathered by trackers, so there is no confirmation bias.
Bias-free? No: for lack of detailed information about the sampling, we work with the assumption that sampling bias is present.
ROCCC?
  • Reliable = N/A: missing detailed information about how the sample was created
  • Original = Yes
  • Comprehensive = N/A: for a first portfolio case study, yes; in the real world I would look for additional data sources
  • Current = No (03.12.2016-05.12.2016)
  • Cited = No
Ethics & privacy: ownership: publicly shared, anonymized; transaction transparency: -; consent: -; currency: -; privacy: -; openness: Yes
Anonymization: Yes
Tool used: Amazon Mechanical Turk, 03.12.2016-05.12.2016

What have I learned during this phase:

An initial inspection of a dataset provides an overview and can indicate early whether the dataset is suitable for analysis. Given how outdated this data is and the uncertainty surrounding it, in a real-life scenario I would consider seeking an alternative dataset. However, since the primary objective of this case study is its completion, I will proceed to the next step.

3rd Phase: PROCESS

Goals of the PROCESS phase:

A strong analysis depends on the integrity of the data, and clean data is the best data. In this phase, the main focus is to clean up the data to get rid of any possible errors, inaccuracies, or inconsistencies. At this stage it is also crucial to become completely familiar with the dataset and identify its potential and limitations.

  • Checklists are most helpful for being thorough.
  • Keep track of the changes with a changelog.

Snippet of the change log:

CCS2: Change Log (changer and approver for all entries: Lukáš Dušek)

| Category | Change | Version | Date of the Change | Description | Reason for Change |
| --- | --- | --- | --- | --- | --- |
| Changed | Date format to YYYY-MM-DD HH:MM:SS AM/PM | all data tables: v01 | November 2, 2023 | Changed the date format of the data tables to YYYY-MM-DD, adding HH:MM:SS AM/PM where time is included | To use a universal date/time format |
| Added | Filtering on the first (header) row | all data tables: v01 | November 2, 2023 | Added filtering to the header row | To be able to check and filter |
| Removed | Column Fat | weightLogInfo v01 | November 2, 2023 | There were just two entries in this column and no way to gather the missing values, so it was deleted | To get rid of missing values |
| Removed | Weight-in-pounds column | weightLogInfo v01 | November 2, 2023 | There were two weight columns in weightLogInfo, one in kg and one in pounds | I will work with kg |
| Changed | Distance columns rounded to two decimals | all tables with distance columns | November 2, 2023 | In all tables with distance columns, I rounded the values to two decimals | We do not need such high resolution in distance for this analysis; this way the data is easier to work with |
| Changed | Weight rounded to two decimals | weightLogInfo v01 | November 2, 2023 | Weight column rounded to two decimals | We do not need such high resolution in weight for this analysis |
| Changed | Calories rounded to four decimals | all tables with calories v01 | November 2, 2023 | Calories rounded to four decimals | Two decimals would probably be sufficient, but I wanted to try four to see how they are to work with; we do not need such high resolution in calories |
| Added | New tab with copied data: NameOfTable_transformed | all tables v01 (except the hourly data tables, which were copied to the source folder) | November 2, 2023 | Additional tab in the spreadsheets | To work with the data efficiently while keeping the source data; the hourly data tables are too large, so I archived those original spreadsheets in the source folder instead |
| Changed | Renamed the original tab to NameOfTable_source | all tables v01 (except the hourly data tables, which were copied to the source folder) | November 2, 2023 | | To keep the source data intact and be able to compare it with the transformed data |
| Removed | User 4057192912 | all tables v01 | November 3, 2023 | We have only a few entries from this user (4 in Daily), so 33 users - 1 = 32 | |
| Added | DayOfWeek column | Daily activities merge | November 3, 2023 | Added the day of the week so we can follow how active people are depending on the day | |
| Added | Attribute SumUpSleepActivities | CCS2_dailyActivityMerged_v02 | November 3, 2023 | To track whether time in bed plus the other activities sums up to 24 hours | |
| Removed | Entries with missing data | CCS2_dailyActivityMerged_v02 | November 3, 2023 | Removed all entries [75] that had missing data (all data were missing except sedentary time) | |
| Removed | Outliers | CCS2_dailyActivityMerged_v02 | November 3, 2023 | Removed extreme entries [4] with very low data (very low step counts combined with the summed activity and sleep times) | |
| Changed | Calculated the SumUpTimes | CCS2_dailyActivityMerged_v03 | November 3, 2023 | The times of many entries do not add up to 1440 minutes when calculated; some were over and some under. =IFS(U8>1440, O8-(U8-1440), U8<1440, O8+(1440-U8), U8=1440, O8) | |
| Removed | Additional outliers (steps & calories with 1440 sedentary minutes) | CCS2_dailyActivityMerged_v04 | November 3, 2023 | Removed outliers [7]: entries that had counted steps and calories alongside 1440 minutes of sedentary activity | |
| Removed | Additional outliers (entries with an extremely low number of steps) | CCS2_dailyActivityMerged_v04 | November 3, 2023 | Removed outliers [8]: entries with an extremely low step count (<100) | |
| Added | Column SedetaryWithSleep | CCS2_dailyActivityMerged_v04 | November 3, 2023 | Because we do not have in-bed and sleeping times for all entries, I created the additional column SedetaryWithSleep to be able to compare all data | |
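
The SumUpTimes correction above is a spreadsheet formula. For readers who prefer code, here is a minimal Python sketch of the same logic (in the formula, U is the day's summed activity and sleep minutes and O is the sedentary minutes; the function and variable names below are my own):

```python
def reconcile_sedentary(sum_of_times: int, sedentary: int) -> int:
    """Mirror of =IFS(U8>1440, O8-(U8-1440), U8<1440, O8+(1440-U8), U8=1440, O8):
    shift the sedentary minutes by however far the day's summed times
    miss 1440 minutes, so that every entry adds up to one full day."""
    return sedentary - (sum_of_times - 1440)

# A 1500-minute day loses 60 sedentary minutes; a 1400-minute day gains 40.
assert reconcile_sedentary(1500, 800) == 740
assert reconcile_sedentary(1400, 800) == 840
assert reconcile_sedentary(1440, 800) == 800
```

The three IFS branches collapse into a single expression, since adding the shortfall and subtracting the excess are the same operation with opposite signs.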

3.1. Data integrity & alignment:

| Data Constraint | Initial State | Final State Check | Definition | Examples |
| --- | --- | --- | --- | --- |
| Data type/format | ⛔ Some of the tables were not in the same format as the rest | ✅ | Values must be of a certain type: date, number, percentage, Boolean, etc. | If the data type is a date, a single number like 30 would fail the constraint and be invalid |
| Data range | ✅ | ✅ | Values must fall between predefined maximum and minimum values | If the data range is 10-20, a value of 30 would fail the constraint and be invalid |
| Mandatory | ✅ | ✅ | Values can't be left blank or empty | If age is mandatory, that value must be filled in |
| Unique | ✅ | ✅ | Values can't have a duplicate | Two people can't have the same mobile phone number within the same service area |
| Regular expression (regex) patterns | ✅ | ✅ | Values must match a prescribed pattern | A phone number must match ###-###-#### (no other characters allowed) |
| Cross-field validation | ✅ | ✅ | Certain conditions for multiple fields must be satisfied | Values are percentages and values from multiple fields must add up to 100% |
| Primary-key | ⏸️ | ⏸️ | (Databases only) value must be unique per column | A database table can't have two rows with the same primary key value; a primary key is an identifier that references a column in which each value is unique |
| Set-membership | ⏸️ | ⏸️ | (Databases only) values for a column must come from a set of discrete values | Value for a column must be set to Yes, No, or Not Applicable |
| Foreign-key | ⏸️ | ⏸️ | (Databases only) values for a column must be unique values coming from a column in another table | In a U.S. taxpayer database, the State column must be a valid state or territory with the set of acceptable values defined in a separate States table |
| Accuracy | ✅ | ✅ | The degree to which the data conforms to the actual entity being measured or described | If values for zip codes are validated by street location, the accuracy of the data goes up |
| Completeness | ✅/⛔ weightLogInfo: only 8 respondents; sleepDay: only 24 respondents; dailyActivity: many entries have 0 values and 1440 sedentary minutes, which means either that people sat the whole day 😃 or that the data was not measured | ✅/⛔ weightLogInfo: only 8 respondents; sleepDay: only 24 respondents; will not use weight | The degree to which the data contains all desired components or measures | If data for personal profiles required hair and eye color, and both are collected, the data is complete |
| Consistency | ✅ | ✅ | The degree to which the data is repeatable from different points of entry or collection | If a customer has the same address in the sales and repair databases, the data is consistent |

| Question | Check State |
| --- | --- |
| Are the data aligned with the objective? | ✅ Yes, we can draw some conclusions |
| Are there some other valuable variables? | ⛔ For this sample, no |
| Are there some missing variables? | ✅/⛔ For this case study there are enough, but a more thorough analysis would benefit from additional variables |
| Are there some alternative variables? | ⛔ For this sample, no |
| Can/should the objective be expanded or modified based on the current data? | ✅ The objective is very broad for the available data |

3.2. Data Insufficiency & Errors Decision tree

Insufficiencies

  • Comes from only one source = Yes, it is from one source
  • Continuously updates and is incomplete = No, data is static
  • Is outdated = Yes, data is from 2016 (People’s behaviour probably changed during the pandemic)
  • Is geographically limited = Yes/No - We do not have information about the regions the sample is drawn from

To deal with insufficient data, we can:

  • Identify trends within the available data
  • Wait for more data if time allows
  • Discuss with stakeholders and adjust your objective
  • Search for a new dataset
DATA ERRORS

| Question | If Yes | If No |
| --- | --- | --- |
| Can you fix or request a corrected dataset? | Perform the analysis after the data has been corrected | 👇🏻 |
| Do you have enough data to omit the wrong data? | Perform the analysis without the wrong data | 👇🏻 |
| Not enough data: can you proxy the data? | Perform the analysis with the proxied data | 👇🏻 |
| Can you collect more data? | Perform the analysis after data collection | Modify the business objective if possible |

3.3. Calculating Sample Size

‼️ As we do not have information about the location the sample was drawn from, for this exercise I will assume it represents women in the European Union aged 15-64:
  • The population size is 142,572,761 (Source: https://data.worldbank.org/indicator/SP.POP.1564.FE.IN?locations=EU)
  • For this population, we would need a sample size of 96 respondents to achieve a 95% confidence level with a 5% margin of error, and 91 for a 90% confidence level with a 5% margin of error.
  • To calculate the margin of error, I chose a confidence level of 90%.

| Population | Confidence Level | Margin of Error | Sample Size |
| --- | --- | --- | --- |
| All_Other | 90% | 14.37% | 33 |
| sleepDay | 90% | 16.85% | 24 |
| weightLogInfo | 90% | 29.17% | 8 |
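
For transparency, here is a minimal Python sketch of the margin-of-error calculation behind the table, assuming the worst-case proportion p = 0.5 and a z-score of roughly 1.65 for the 90% confidence level (the population is large enough that the finite-population correction is negligible; tiny differences from the table come from rounding):

```python
import math

def margin_of_error(n: int, z: float = 1.65, p: float = 0.5) -> float:
    """Margin of error for a proportion estimated from n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

for name, n in [("All_Other", 33), ("sleepDay", 24), ("weightLogInfo", 8)]:
    print(f"{name}: n={n}, MoE = {margin_of_error(n):.2%}")
# All_Other: n=33, MoE = 14.36%
# sleepDay: n=24, MoE = 16.84%
# weightLogInfo: n=8, MoE = 29.17%
```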

3.4. Cleanup Checklist: Check for & Clean Dirty Data

  • Back up your data prior to data cleaning - Done. It is always good to be proactive and create your data backup before you start your data clean-up. If your program crashes, or if your changes cause a problem in your dataset, you can always go back to the saved version and restore it. The simple procedure of backing up your data can save you hours of work and, most importantly, a headache.
  • Document errors - Done. Documenting your errors can be a big time saver, as it helps you avoid those errors in the future by showing you how you resolved them. For example, you might find an error in a formula in your spreadsheet: some of the dates in one of your columns haven't been formatted correctly. If you make a note of this fix, you can reference it the next time your formula is broken and get a head start on troubleshooting. Documenting your errors also helps you keep track of changes in your work, so that you can backtrack if a fix didn't work.
  • Keep track of business objectives - Done. When you are cleaning data, you might make new and interesting discoveries about your dataset, but you don't want those discoveries to distract you from the task at hand. For example, if you were working with weather data to find the average number of rainy days in your city, you might notice some interesting patterns about snowfall, too. That is really interesting, but it isn't related to the question you are trying to answer right now. Being curious is great! But try not to let it distract you from the task at hand.
  • Account for data cleaning in your deadlines/process - Done. All good things take time, and that includes data cleaning. It is important to keep that in mind when going through your process and looking at your deadlines. When you set aside time for data cleaning, it helps you get a more accurate estimate of ETAs for stakeholders, and can help you know when to request an adjusted ETA.
  • Analyze the system prior to data cleaning - Done. If we want to clean our data and avoid future errors, we need to understand the root cause of the dirty data. Imagine you are an auto mechanic: you would find the cause of the problem before you started fixing the car, right? The same goes for data. First, figure out where the errors come from. Maybe it is a data entry error, a missing spell check, a lack of formats, or duplicates. Once you understand where bad data comes from, you can control it and keep your data clean.
  • Fix the source of the error - Done. Fixing the error itself is important, but if that error is actually part of a bigger problem, you need to find the source of the issue. Otherwise, you will have to keep fixing that same error over and over again. For example, imagine you have a team spreadsheet that tracks everyone's progress. The table keeps breaking because different people are entering different values. You can keep fixing all of these problems one by one, or you can set up your table to streamline data entry so everyone is on the same page. Addressing the source of the errors in your data will save you a lot of time in the long run.
  • Check the size of the dataset - Done.
  • Check the number of categories or labels - Done.
  • Check for the different data types - Done.
  • Look for all of the relevant data - Done. It is important to think about all of the relevant data when you are cleaning. This helps make sure you understand the whole story the data is telling, and that you are paying attention to all possible errors. For example, if you are working with data about bird migration patterns from different sources, but you only clean one source, you might not realize that some of the data is being repeated. This will cause problems in your analysis later on. If you want to avoid common errors like duplicates, each field of your data requires equal attention.
  • Check for incomplete data - Done. Any data that is missing important fields. Possible causes: improper data collection or incorrect data entry. Potential harm: decreased productivity, inaccurate insights, or inability to complete essential services.
  • Check for missing values - Done. Missing values in your dataset can create errors and give you inaccurate conclusions. For example, if you were trying to get the total number of sales from the last three months, but a week of transactions were missing, your calculations would be inaccurate. As a best practice, try to keep your data as clean as possible by maintaining completeness and consistency.
  • Check for duplicate data - Done. Any data record that shows up more than once. Possible causes: manual data entry, batch data imports, or data migration. Potential harm: skewed metrics or analyses, inflated or inaccurate counts or predictions, or confusion during data retrieval.
  • Check for outdated data - Done. Any data that is old and should be replaced with newer, more accurate information. Possible causes: people changing roles or companies, or software and systems becoming obsolete. Potential harm: inaccurate insights, decision-making, and analytics.
  • Check for inconsistent data - Done. Any data that uses different formats to represent the same thing. Possible causes: data stored incorrectly or errors inserted during data transfer. Potential harm: contradictory data points leading to confusion, or inability to classify or segment customers.
  • Check for incorrect/inaccurate data - Done. Any data that is complete but inaccurate. Possible causes: human error during data input, fake information, or mock data. Potential harm: inaccurate insights or decision-making based on bad information, resulting in revenue loss.
  • Check for spelling errors - Done. Misspellings can be as simple as typing or input errors. Most of the time the wrong spelling or common grammatical errors can be detected, but it gets harder with things like names or addresses. For example, if you are working with a spreadsheet table of customer data, you might come across a customer named "John" whose name has been input incorrectly as "Jon" in some places. The spreadsheet's spellcheck probably won't flag this, so if you don't double-check for spelling errors and catch this, your analysis will have mistakes in it.
  • Check for misfielded values - Done. A misfielded value happens when values are entered into the wrong field. These values might still be formatted correctly, which makes them harder to catch if you aren't careful. For example, you might have a dataset with columns for cities and countries. These are the same type of data, so they are easy to mix up. But if you were trying to find all of the instances of Spain in the country column, and Spain had mistakenly been entered into the city column, you would miss key data points. Making sure your data has been entered correctly is key to accurate, complete analysis.
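
A few of these checks are quick to run in code. Here is a minimal pandas sketch; the file and column names are modeled on the Kaggle export of this dataset and should be treated as assumptions:

```python
import pandas as pd

# File and column names modeled on the Kaggle export of this dataset.
df = pd.read_csv("dailyActivity_merged.csv")

# Duplicate data: full-row duplicates
print("Duplicate rows:", df.duplicated().sum())

# Incomplete data / missing values: nulls per column
print(df.isna().sum())

# Inconsistent data: normalize all dates to YYYY-MM-DD
df["ActivityDate"] = pd.to_datetime(df["ActivityDate"]).dt.strftime("%Y-%m-%d")

# Suspicious entries: a full sedentary day (1440 minutes) that still logs steps
mask = (df["SedentaryMinutes"] == 1440) & (df["TotalSteps"] > 0)
print("Full sedentary days with steps:", mask.sum())
```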

3.5. Verification Checklist: Comparing the original unclean data set with the clean one.

Measure twice, cut once.

| Verification Checklist | Status |
| --- | --- |
| Sources of errors: Did you use the right tools and functions to find the source of the errors in your dataset? | ✅ |
| Null data: Did you search for NULLs using conditional formatting and filters? | ✅ |
| Misspelled words: Did you locate all misspellings? | ✅ |
| Mistyped numbers: Did you double-check that your numeric data has been entered correctly? | ✅ |
| Extra spaces and characters: Did you remove any extra spaces or characters? | ✅ |
| Duplicates: Did you remove duplicates? | ✅ |
| Mismatched data types: Did you check that numeric, date, and string data are typecast correctly? | ✅ |
| Messy (inconsistent) strings: Did you make sure that all of your strings are consistent and meaningful? | ✅ |
| Messy (inconsistent) date formats: Did you format the dates consistently throughout your dataset? | ✅ |
| Misleading variable labels (columns): Did you name your columns meaningfully? | ✅ |
| Truncated data: Did you check for truncated or missing data that needs correction? | ✅ |
| Business logic: Did you check that the data makes sense given your knowledge of the business? | ✅ |

3.6. Review the goal of the project

Once you have finished these data-cleaning tasks, it is a good idea to review the goal of your project and confirm that your data is still aligned with that goal. This is a continuous process throughout the project, but here are three steps to keep in mind:

1. Confirm the business problem.
2. Confirm the goal of the project.
3. Verify that the data can solve the problem and is aligned with the goal.

What have I learned during this phase:

The biggest finding from examining and cleaning the dataset is that plenty of entries and data are missing: many people do not track their daily activities consistently. I'll explore this further in the analysis and the conclusions that follow.

4th & 5th Phase: ANALYZE & SHARE

Goals of the ANALYZE & SHARE phase:

The focus is on thinking analytically about the data. At this stage, we might sort and format data to make it easier to:

  • Perform calculations
  • Combine data from multiple sources (see the merge sketch after this list)
  • Create tables with the results
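
To make the "combine data from multiple sources" step concrete, here is a minimal pandas sketch merging the daily activity table with the sleep table per user and day. The file and column names follow the Kaggle export and are assumptions, not the exact workflow used for this study:

```python
import pandas as pd

# File and column names modeled on the Kaggle export.
activity = pd.read_csv("dailyActivity_merged.csv", parse_dates=["ActivityDate"])
sleep = pd.read_csv("sleepDay_merged.csv", parse_dates=["SleepDay"])

# SleepDay carries a midnight timestamp; strip it so the dates line up.
sleep["SleepDay"] = sleep["SleepDay"].dt.normalize()

# A left join keeps activity days that have no sleep entry (common here).
merged = activity.merge(sleep, how="left",
                        left_on=["Id", "ActivityDate"],
                        right_on=["Id", "SleepDay"])
print(merged[["Id", "ActivityDate", "TotalSteps", "TotalMinutesAsleep"]].head())
```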

Questions to ask yourself in this step:

  1. What story is my data telling me?
  2. How will my data help me solve this problem?
  3. Who needs my company’s product or service? What type of person is most likely to use it?

Everyone shares their results differently, so we need to be sure to summarize our results with clear and enticing visuals, using tools like graphs or dashboards. It is our chance to show the stakeholders that we have solved their problem and how we got there. Sharing will help the team:

  • Make better decisions
  • Make more informed decisions
  • Lead to better outcomes
  • Successfully communicate our findings

Questions to ask yourself in this step:

  1. How can I make 'what I present' to the stakeholders engaging and easy to understand?
  2. What would help me understand this if I were the listener?

I have decided to share three findings from my analysis: Tracking across the users; Relationships between tracker variables; and Hourly average intensity for the sample.

Tracking across the users

Here, I have looked closely into the findings from the previous step, which revealed a significant absence of entries. The discrepancies in tracking daily activities are significant, with only 25% of respondents tracking consistently and 46.9% classified as "High Trackers", monitoring 21-30 days out of 31. When examining sleep tracking, the disparities are even more pronounced: merely 9.4% of respondents tracked daily, and only 28.1% qualified as "High Trackers". Notably, 28.1% of respondents did not track their sleep at all.

Three questions came to my mind:

  • How can we help users be more consistent?
  • How can we make tracking as simple and frictionless as possible?
  • Why do people track their sleep much less than their daily activities?
| Users per category | Not Tracked (=0) | Low Tracker (>=1, <=10) | Moderate Tracker (>=11, <=20) | High Tracker (>=21, <=30) | All-time Tracker (=31) |
| --- | --- | --- | --- | --- | --- |
| Days Tracked | 0 | 0 | 8 | 15 | 9 |
| Days Sleep Tracking | 9 | 8 | 3 | 9 | 3 |
| USER (ID) | Entries for Month | Entries % Month | Asleep Entries Month | Asleep % Month |
| --- | --- | --- | --- | --- |
| 2026352035 | 31 | 100.00% | 28 | 100.00% |
| 4558609924 | 31 | 100.00% | 5 | 100.00% |
| 7086361926 | 29 | 100.00% | 24 | 100.00% |
| 6775888955 | 16 | 100.00% | 0 | 90.32% |
| 1644430081 | 30 | 100.00% | 0 | 90.32% |
| 2347167796 | 17 | 93.55% | 14 | 90.32% |
| 2022484408 | 31 | 96.77% | 0 | 87.10% |
| 2320127002 | 31 | 90.32% | 1 | 83.87% |
| 4388161847 | 30 | 96.77% | 23 | 80.65% |
| 4702921684 | 30 | 90.32% | 27 | 80.65% |
| 6962181067 | 31 | 93.55% | 31 | 77.42% |
| 8877689391 | 27 | 96.77% | 0 | 74.19% |
| 8053475328 | 30 | 74.19% | 3 | 61.29% |
| 6117666160 | 23 | 61.29% | 19 | 48.39% |
| 8583815059 | 25 | 54.84% | 0 | 45.16% |
| 4020332650 | 15 | 48.39% | 8 | 25.81% |
| 8792009665 | 19 | 100.00% | 15 | 16.13% |
| 2873212765 | 31 | 54.84% | 0 | 16.13% |
| 7007744171 | 24 | 96.77% | 2 | 12.90% |
| 1503960366 | 30 | 96.77% | 25 | 9.68% |
| 8378563200 | 31 | 58.06% | 31 | 9.68% |
| 5553957443 | 31 | 77.42% | 31 | 6.45% |
| 1624580081 | 30 | 100.00% | 4 | 3.23% |
| 8253242879 | 18 | 100.00% | 0 | 0.00% |
| 6290855005 | 24 | 100.00% | 0 | 0.00% |
| 4445114986 | 31 | 96.77% | 28 | 0.00% |
| 1927972279 | 17 | 87.10% | 5 | 0.00% |
| 5577150313 | 28 | 80.65% | 26 | 0.00% |
| 4319703577 | 28 | 77.42% | 25 | 0.00% |
| 3372868164 | 20 | 64.52% | 0 | 0.00% |
| 3977333714 | 29 | 58.06% | 28 | 0.00% |
| 1844505072 | 18 | 51.61% | 3 | 0.00% |
| Mean | 26.125 | 84.27% | 12.6875 | 40.93% |
| Median | 29 | 93.55% | 6.5 | 20.97% |
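
As a sketch of how the bucketing above can be reproduced, assuming a per-user count of tracked days out of the 31-day window (thresholds taken from the category table):

```python
def tracker_category(days: int) -> str:
    """Bucket a user's number of tracked days (out of 31) into the
    categories used in the tables above."""
    if days == 0:
        return "Not Tracked"
    if days <= 10:
        return "Low Tracker"
    if days <= 20:
        return "Moderate Tracker"
    if days <= 30:
        return "High Tracker"
    return "All-time Tracker"

print(tracker_category(28))  # High Tracker
print(tracker_category(3))   # Low Tracker
```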

Relationship analysis

Next, I was curious whether any metrics showed a relationship with sleep. Below we will look at two examples.

Source Data Table for Relationship Analysis:

| USER (ID) | ActivityDate | DayOfWeek | Sedentary Minutes | Total Minutes Asleep | Total Steps |
| --- | --- | --- | --- | --- | --- |
| 2026352035 | 2016-05-10 | Tuesday | 1043 | 357 | 254 |
| 1927972279 | 2016-04-13 | Wednesday | 986 | 398 | 356 |
| 5553957443 | 2016-04-17 | Sunday | 992 | 350 | 655 |
| 1927972279 | 2016-04-12 | Tuesday | 610 | 750 | 678 |
| 4445114986 | 2016-05-12 | Thursday | 881 | 483 | 768 |
| 2026352035 | 2016-04-17 | Sunday | 882 | 437 | 838 |
| 1927972279 | 2016-04-15 | Friday | 890 | 475 | 980 |
| 5553957443 | 2016-04-30 | Saturday | 513 | 775 | 1202 |
| 8792009665 | 2016-04-14 | Thursday | 853 | 486 | 1219 |
| 4319703577 | 2016-05-01 | Sunday | 873 | 484 | 1251 |
| 8792009665 | 2016-04-13 | Wednesday | 806 | 531 | 1320 |
| 6962181067 | 2016-04-14 | Thursday | 819 | 508 | 1551 |
| 8792009665 | 2016-05-01 | Sunday | 834 | 503 | 1619 |
| 1927972279 | 2016-04-28 | Thursday | 1167 | 166 | 1675 |
| 8792009665 | 2016-04-27 | Wednesday | 900 | 423 | 1758 |
| 5553957443 | 2016-04-24 | Sunday | 696 | 553 | 1807 |
| 8792009665 | 2016-05-02 | Monday | 916 | 415 | 1831 |
| 5553957443 | 2016-05-07 | Saturday | 874 | 442 | 1868 |
| 4020332650 | 2016-04-16 | Saturday | 1222 | 77 | 1982 |
| 4445114986 | 2016-04-19 | Tuesday | 895 | 388 | 2064 |
| 4445114986 | 2016-04-20 | Wednesday | 841 | 439 | 2072 |
| 8378563200 | 2016-04-17 | Sunday | 756 | 525 | 2132 |
| 4445114986 | 2016-04-17 | Sunday | 1219 | 98 | 2268 |
| 4319703577 | 2016-04-18 | Monday | 821 | 515 | 2276 |
| 8792009665 | 2016-05-04 | Wednesday | 848 | 439 | 2283 |
| 1624580081 | 2016-04-29 | Friday | 1163 | 119 | 2390 |
| 8792009665 | 2016-05-03 | Tuesday | 739 | 516 | 2421 |
| 2026352035 | 2016-04-19 | Tuesday | 759 | 498 | 2424 |
| 2026352035 | 2016-04-21 | Thursday | 773 | 477 | 2467 |
| 8792009665 | 2016-04-15 | Friday | 937 | 363 | 2483 |
| 2026352035 | 2016-04-16 | Saturday | 723 | 524 | 2547 |
| 8792009665 | 2016-04-12 | Tuesday | 831 | 458 | 2564 |
| 1844505072 | 2016-05-01 | Sunday | 397 | 590 | 2573 |
| 5553957443 | 2016-04-20 | Wednesday | 614 | 658 | 2713 |
| 4702921684 | 2016-05-12 | Thursday | 930 | 404 | 2752 |
| 2026352035 | 2016-04-22 | Friday | 733 | 520 | 2915 |
| 4445114986 | 2016-05-04 | Wednesday | 897 | 337 | 2923 |
| 8378563200 | 2016-05-08 | Sunday | 695 | 545 | 2943 |
| 8378563200 | 2016-04-30 | Saturday | 764 | 468 | 2946 |
| 4445114986 | 2016-04-13 | Wednesday | 840 | 370 | 2961 |
| 5553957443 | 2016-05-12 | Thursday | 903 | 438 | 3121 |
| 8792009665 | 2016-04-20 | Wednesday | 744 | 528 | 3147 |
| 4445114986 | 2016-04-12 | Tuesday | 787 | 429 | 3276 |
| 2026352035 | 2016-04-14 | Thursday | 675 | 545 | 3335 |
| 6117666160 | 2016-05-06 | Friday | 609 | 658 | 3365 |
| 4445114986 | 2016-04-25 | Monday | 916 | 328 | 3385 |
| 6117666160 | 2016-04-28 | Thursday | 883 | 393 | 3403 |
| 1624580081 | 2016-05-08 | Sunday | 1134 | 137 | 3427 |
| 4558609924 | 2016-05-01 | Sunday | 1121 | 115 | 3428 |
| 2026352035 | 2016-04-24 | Sunday | 685 | 555 | 3490 |
| 7086361926 | 2016-04-24 | Sunday | 611 | 681 | 3520 |
| 6962181067 | 2016-05-12 | Thursday | 792 | 516 | 3587 |
| 2026352035 | 2016-05-01 | Sunday | 703 | 527 | 3609 |
| 4319703577 | 2016-05-08 | Sunday | 649 | 602 | 3672 |
| 4319703577 | 2016-04-21 | Thursday | 1341 | 59 | 3702 |
| 8378563200 | 2016-04-24 | Sunday | 813 | 458 | 3703 |
| 5553957443 | 2016-04-18 | Monday | 693 | 520 | 3727 |
| 1927972279 | 2016-04-26 | Tuesday | 933 | 296 | 3761 |
| 7086361926 | 2016-05-12 | Thursday | 916 | 444 | 3789 |
| 4445114986 | 2016-05-05 | Thursday | 734 | 462 | 3800 |
| 4445114986 | 2016-04-21 | Thursday | 756 | 436 | 3809 |
| 2026352035 | 2016-04-15 | Friday | 679 | 523 | 3821 |
| 1844505072 | 2016-04-15 | Friday | 303 | 644 | 3844 |
| 4445114986 | 2016-05-10 | Tuesday | 783 | 405 | 3915 |
| 4445114986 | 2016-04-16 | Saturday | 716 | 462 | 3945 |
| 4445114986 | 2016-04-14 | Thursday | 717 | 441 | 3974 |
| 1844505072 | 2016-04-30 | Saturday | 295 | 722 | 4014 |
| 5577150313 | 2016-05-11 | Wednesday | 841 | 431 | 4038 |
| 8792009665 | 2016-04-22 | Friday | 817 | 391 | 4068 |
| 4319703577 | 2016-04-24 | Sunday | 742 | 467 | 4081 |
| 5553957443 | 2016-04-23 | Saturday | 443 | 631 | 4112 |
| 6962181067 | 2016-05-08 | Sunday | 706 | 541 | 4188 |
| 2026352035 | 2016-05-07 | Saturday | 690 | 511 | 4193 |
| 5553957443 | 2016-05-04 | Wednesday | 746 | 447 | 4249 |
| 4020332650 | 2016-05-06 | Friday | 855 | 385 | 4369 |
| 2026352035 | 2016-04-12 | Tuesday | 702 | 503 | 4414 |
| 8378563200 | 2016-05-07 | Saturday | 769 | 459 | 4468 |
| 6117666160 | 2016-05-09 | Monday | 721 | 492 | 4477 |
| 4445114986 | 2016-04-28 | Thursday | 762 | 419 | 4493 |
| 4020332650 | 2016-05-03 | Tuesday | 934 | 322 | 4496 |
| 4319703577 | 2016-04-22 | Friday | 694 | 533 | 4500 |
| 4445114986 | 2016-05-06 | Friday | 809 | 374 | 4514 |
| 8378563200 | 2016-05-12 | Thursday | 778 | 496 | 4561 |
| 7007744171 | 2016-04-16 | Saturday | 1155 | 79 | 4631 |
| 4388161847 | 2016-04-17 | Sunday | 598 | 619 | 4660 |
| 4445114986 | 2016-04-29 | Friday | 1106 | 106 | 4676 |
| 2026352035 | 2016-04-30 | Saturday | 600 | 573 | 4729 |
| 4319703577 | 2016-04-16 | Saturday | 724 | 506 | 4744 |
| 5553957443 | 2016-04-13 | Wednesday | 726 | 455 | 4832 |
| 5553957443 | 2016-05-11 | Wednesday | 759 | 463 | 4926 |
| 4319703577 | 2016-04-23 | Saturday | 485 | 692 | 4935 |
| 2026352035 | 2016-04-13 | Wednesday | 637 | 531 | 4993 |
| 6962181067 | 2016-04-24 | Sunday | 727 | 511 | 5029 |
| 5577150313 | 2016-04-13 | Wednesday | 812 | 432 | 5077 |
| 2320127002 | 2016-04-23 | Saturday | 1129 | 61 | 5079 |
| 6117666160 | 2016-04-18 | Monday | 689 | 493 | 5153 |
| 5553957443 | 2016-05-01 | Sunday | 517 | 622 | 5164 |
| 4445114986 | 2016-05-07 | Saturday | 866 | 401 | 5183 |
| 5577150313 | 2016-05-04 | Wednesday | 631 | 603 | 5206 |
| 4445114986 | 2016-05-01 | Sunday | 741 | 439 | 5232 |
| 8792009665 | 2016-04-23 | Saturday | 795 | 339 | 5245 |
| 4445114986 | 2016-05-09 | Monday | 641 | 457 | 5275 |
| 5577150313 | 2016-04-26 | Tuesday | 812 | 354 | 5325 |
| 8378563200 | 2016-04-28 | Thursday | 724 | 506 | 5417 |
| 2347167796 | 2016-04-28 | Thursday | 761 | 408 | 5439 |
| 6962181067 | 2016-05-01 | Sunday | 799 | 411 | 5454 |
| 2347167796 | 2016-04-17 | Sunday | 581 | 556 | 5472 |
| 2026352035 | 2016-05-08 | Sunday | 614 | 541 | 5528 |
| 4020332650 | 2016-05-10 | Tuesday | 740 | 442 | 5546 |
| 6962181067 | 2016-04-15 | Friday | 837 | 370 | 5563 |
| 7007744171 | 2016-05-01 | Sunday | 1142 | 58 | 5600 |
| 6962181067 | 2016-04-13 | Wednesday | 587 | 630 | 5652 |
| 4319703577 | 2016-04-15 | Friday | 721 | 465 | 5664 |
| 8378563200 | 2016-04-23 | Saturday | 684 | 565 | 5709 |
| 5553957443 | 2016-04-16 | Saturday | 466 | 651 | 5771 |
| 7086361926 | 2016-04-13 | Wednesday | 763 | 451 | 5813 |
| 4020332650 | 2016-05-08 | Sunday | 775 | 364 | 5862 |
| 6962181067 | 2016-05-06 | Friday | 679 | 443 | 5908 |
| 2347167796 | 2016-04-26 | Tuesday | 723 | 436 | 5980 |
| 2026352035 | 2016-04-25 | Monday | 649 | 506 | 6017 |
| 8378563200 | 2016-05-02 | Monday | 846 | 351 | 6064 |
| 5553957443 | 2016-05-08 | Sunday | 611 | 568 | 6083 |
| 2026352035 | 2016-04-27 | Wednesday | 609 | 508 | 6088 |
| 3977333714 | 2016-04-21 | Thursday | 686 | 332 | 6093 |
| 4702921684 | 2016-04-26 | Tuesday | 786 | 421 | 6108 |
| 8792009665 | 2016-04-28 | Thursday | 714 | 402 | 6157 |
| 8378563200 | 2016-04-29 | Friday | 696 | 527 | 6175 |
| 4445114986 | 2016-04-30 | Saturday | 797 | 322 | 6222 |
| 4445114986 | 2016-04-26 | Tuesday | 839 | 353 | 6326 |
| 2026352035 | 2016-04-28 | Thursday | 564 | 513 | 6375 |
| 5577150313 | 2016-04-25 | Monday | 666 | 421 | 6393 |
| 1624580081 | 2016-04-30 | Saturday | 1040 | 124 | 6474 |
| 4702921684 | 2016-04-15 | Friday | 918 | 253 | 6506 |
| 4702921684 | 2016-04-21 | Thursday | 817 | 425 | 6530 |
| 4558609924 | 2016-05-08 | Sunday | 967 | 123 | 6543 |
| 2026352035 | 2016-05-04 | Wednesday | 535 | 538 | 6564 |
| 4388161847 | 2016-04-16 | Saturday | 724 | 426 | 6580 |
| 8378563200 | 2016-05-10 | Tuesday | 821 | 342 | 6582 |
| 4702921684 | 2016-04-19 | Tuesday | 756 | 457 | 6708 |
| 2347167796 | 2016-04-19 | Tuesday | 537 | 465 | 6711 |
| 6962181067 | 2016-05-11 | Wednesday | 730 | 452 | 6722 |
| 5577150313 | 2016-04-27 | Wednesday | 719 | 424 | 6805 |
| 6962181067 | 2016-05-07 | Saturday | 778 | 298 | 6815 |
| 4445114986 | 2016-04-22 | Friday | 706 | 388 | 6831 |
| 4702921684 | 2016-04-13 | Wednesday | 752 | 400 | 6877 |
| 4445114986 | 2016-05-02 | Monday | 667 | 502 | 6910 |
| 4702921684 | 2016-05-06 | Friday | 767 | 404 | 6943 |
| 2026352035 | 2016-05-02 | Monday | 542 | 511 | 7018 |
| 8378563200 | 2016-05-06 | Friday | 840 | 323 | 7045 |
| 4702921684 | 2016-04-27 | Wednesday | 736 | 432 | 7047 |
| 3977333714 | 2016-04-28 | Thursday | 804 | 261 | 7114 |
| 6117666160 | 2016-04-17 | Sunday | 778 | 336 | 7150 |
| 1624580081 | 2016-05-02 | Monday | 300 | 796 | 7155 |
| 8792009665 | 2016-04-30 | Saturday | 749 | 343 | 7174 |
| 3977333714 | 2016-04-27 | Wednesday | 714 | 349 | 7193 |
| 4445114986 | 2016-04-15 | Friday | 711 | 337 | 7198 |
| 4702921684 | 2016-04-12 | Tuesday | 738 | 425 | 7213 |
| 2026352035 | 2016-04-20 | Wednesday | 603 | 461 | 7222 |
| 4445114986 | 2016-04-27 | Wednesday | 839 | 332 | 7243 |
| 4445114986 | 2016-05-08 | Sunday | 733 | 361 | 7303 |
| 6117666160 | 2016-05-08 | Sunday | 519 | 555 | 7328 |
| 6117666160 | 2016-05-07 | Saturday | 521 | 498 | 7336 |
| 8378563200 | 2016-04-27 | Wednesday | 640 | 531 | 7359 |
| 5577150313 | 2016-05-02 | Monday | 595 | 525 | 7439 |
| 4445114986 | 2016-05-03 | Tuesday | 725 | 417 | 7502 |
| 5577150313 | 2016-05-05 | Thursday | 1153 | 74 | 7550 |
| 2026352035 | 2016-04-29 | Friday | 578 | 490 | 7604 |
| 6117666160 | 2016-04-24 | Sunday | 711 | 353 | 7623 |
| 8378563200 | 2016-04-12 | Tuesday | 848 | 338 | 7626 |
| 5577150313 | 2016-04-23 | Saturday | 767 | 384 | 7638 |
| 3977333714 | 2016-04-14 | Thursday | 801 | 291 | 7641 |
| 2347167796 | 2016-04-22 | Friday | 737 | 405 | 7804 |
| 4558609924 | 2016-04-29 | Friday | 1020 | 171 | 7833 |
| 4702921684 | 2016-04-14 | Thursday | 754 | 384 | 7860 |
| 8378563200 | 2016-05-04 | Wednesday | 767 | 441 | 7875 |
| 5577150313 | 2016-04-29 | Friday | 657 | 459 | 7924 |
| 4319703577 | 2016-05-07 | Saturday | 600 | 507 | 7937 |
| 4319703577 | 2016-04-29 | Friday | 581 | 523 | 7990 |
| 5577150313 | 2016-04-12 | Tuesday | 760 | 419 | 8135 |
| 4702921684 | 2016-05-04 | Wednesday | 736 | 412 | 8161 |
| 2026352035 | 2016-05-06 | Friday | 509 | 524 | 8198 |
| 6117666160 | 2016-04-22 | Friday | 492 | 480 | 8206 |
| 4319703577 | 2016-04-30 | Saturday | 684 | 490 | 8221 |
| 4702921684 | 2016-05-09 | Monday | 717 | 435 | 8232 |
| 2347167796 | 2016-04-18 | Monday | 650 | 500 | 8247 |
| 5577150313 | 2016-04-20 | Wednesday | 744 | 447 | 8330 |
| 8792009665 | 2016-04-29 | Friday | 634 | 398 | 8360 |
| 8378563200 | 2016-05-09 | Monday | 786 | 359 | 8382 |
| 4020332650 | 2016-04-12 | Tuesday | 549 | 501 | 8539 |
| 8378563200 | 2016-05-05 | Thursday | 740 | 381 | 8567 |
| 2026352035 | 2016-05-11 | Wednesday | 557 | 523 | 8580 |
| 7086361926 | 2016-04-15 | Friday | 864 | 377 | 8585 |
| 5577150313 | 2016-04-14 | Thursday | 619 | 477 | 8596 |
| 4702921684 | 2016-05-05 | Thursday | 758 | 414 | 8614 |
| 8378563200 | 2016-05-03 | Tuesday | 725 | 405 | 8712 |
| 4388161847 | 2016-04-15 | Friday | 615 | 499 | 8758 |
| 4702921684 | 2016-04-20 | Wednesday | 681 | 454 | 8793 |
| 3977333714 | 2016-04-12 | Tuesday | 777 | 274 | 8856 |
| 5577150313 | 2016-05-10 | Tuesday | 543 | 504 | 8869 |
| 2026352035 | 2016-05-12 | Thursday | 612 | 456 | 8891 |
| 3977333714 | 2016-04-22 | Friday | 731 | 355 | 8911 |
| 6117666160 | 2016-05-01 | Sunday | 468 | 507 | 8915 |
| 4319703577 | 2016-04-19 | Tuesday | 631 | 461 | 8925 |
| 4319703577 | 2016-04-20 | Wednesday | 573 | 523 | 8954 |
| 3977333714 | 2016-04-15 | Friday | 644 | 424 | 9010 |
| 4702921684 | 2016-04-28 | Thursday | 724 | 442 | 9023 |
| 4702921684 | 2016-04-18 | Monday | 885 | 293 | 9105 |
| 4445114986 | 2016-05-11 | Wednesday | 622 | 499 | 9105 |
| 7086361926 | 2016-04-14 | Thursday | 708 | 472 | 9123 |
| 4319703577 | 2016-05-11 | Wednesday | 550 | 529 | 9129 |
| 8378563200 | 2016-05-11 | Wednesday | 787 | 368 | 9143 |
| 4558609924 | 2016-04-26 | Tuesday | 983 | 103 | 9148 |
| 4702921684 | 2016-04-25 | Monday | 769 | 370 | 9167 |
| 5577150313 | 2016-04-22 | Friday | 782 | 338 | 9172 |
| 4319703577 | 2016-04-25 | Monday | 536 | 488 | 9259 |
| 4319703577 | 2016-05-02 | Monday | 579 | 478 | 9261 |
| 8378563200 | 2016-04-20 | Wednesday | 772 | 381 | 9388 |
| 6117666160 | 2016-04-27 | Wednesday | 425 | 542 | 9411 |
| 4702921684 | 2016-05-03 | Tuesday | 754 | 327 | 9454 |
| 4388161847 | 2016-04-26 | Tuesday | 833 | 319 | 9461 |
| 7086361926 | 2016-04-21 | Thursday | 798 | 390 | 9469 |
| 2347167796 | 2016-04-24 | Sunday | 621 | 442 | 9471 |
| 2347167796 | 2016-04-25 | Monday | 666 | 433 | 9482 |
| 4319703577 | 2016-05-10 | Tuesday | 579 | 487 | 9487 |
| 4319703577 | 2016-05-06 | Friday | 618 | 450 | 9524 |
| 7086361926 | 2016-05-11 | Wednesday | 752 | 451 | 9572 |
| 6117666160 | 2016-04-29 | Friday | 343 | 600 | 9592 |
| 4388161847 | 2016-05-05 | Thursday | 695 | 471 | 9603 |
| 5553957443 | 2016-05-06 | Friday | 747 | 400 | 9632 |
| 4319703577 | 2016-05-03 | Tuesday | 607 | 474 | 9648 |
| 1503960366 | 2016-04-17 | Sunday | 506 | 700 | 9705 |
| 7086361926 | 2016-04-22 | Friday | 722 | 428 | 9753 |
| 1503960366 | 2016-04-15 | Friday | 726 | 412 | 9762 |
| 5553957443 | 2016-05-02 | Monday | 710 | 409 | 9769 |
| 6117666160 | 2016-05-05 | Thursday | 538 | 392 | 9799 |
| 4702921684 | 2016-05-11 | Wednesday | 788 | 354 | 9810 |
| 1503960366 | 2016-04-21 | Thursday | 838 | 325 | 9819 |
| 5577150313 | 2016-04-28 | Thursday | 777 | 361 | 9841 |
| 5577150313 | 2016-04-18 | Monday | 611 | 527 | 9893 |
| 4319703577 | 2016-04-26 | Tuesday | 607 | 505 | 9899 |
| 4702921684 | 2016-04-29 | Friday | 695 | 433 | 9930 |
| 3977333714 | 2016-04-13 | Wednesday | 754 | 295 | 10035 |
| 1503960366 | 2016-04-24 | Sunday | 709 | 430 | 10039 |
| 7086361926 | 2016-05-02 | Monday | 750 | 440 | 10052 |
| 4388161847 | 2016-04-21 | Thursday | 691 | 442 | 10055 |
| 1503960366 | 2016-05-08 | Sunday | 574 | 594 | 10060 |
| 4388161847 | 2016-04-28 | Thursday | 720 | 428 | 10074 |
| 2347167796 | 2016-04-21 | Thursday | 699 | 460 | 10080 |
| 6962181067 | 2016-04-30 | Saturday | 731 | 422 | 10081 |
| 7086361926 | 2016-04-25 | Monday | 756 | 446 | 10091 |
| 4388161847 | 2016-05-02 | Monday | 823 | 368 | 10096 |
| 2347167796 | 2016-04-14 | Thursday | 696 | 445 | 10129 |
| 4702921684 | 2016-04-30 | Saturday | 636 | 479 | 10144 |
| 6962181067 | 2016-04-17 | Sunday | 681 | 427 | 10145 |
| 6962181067 | 2016-05-04 | Wednesday | 650 | 442 | 10147 |
| 4388161847 | 2016-04-19 | Tuesday | 817 | 329 | 10181 |
| 6962181067 | 2016-04-12 | Tuesday | 800 | 366 | 10199 |
| 4388161847 | 2016-05-11 | Wednesday | 664 | 469 | 10201 |
| 4319703577 | 2016-04-14 | Thursday | 534 | 535 | 10210 |
| 4388161847 | 2016-05-09 | Monday | 1092 | 62 | 10218 |
| 4388161847 | 2016-04-24 | Sunday | 584 | 552 | 10243 |
| 4020332650 | 2016-05-04 | Wednesday | 499 | 478 | 10252 |
| 4388161847 | 2016-05-01 | Sunday | 575 | 547 | 10255 |
| 7086361926 | 2016-05-03 | Tuesday | 734 | 456 | 10288 |
| 4388161847 | 2016-05-10 | Tuesday | 771 | 354 | 10299 |
| 6962181067 | 2016-04-27 | Wednesday | 649 | 455 | 10320 |
| 2347167796 | 2016-04-13 | Wednesday | 663 | 467 | 10352 |
| 4319703577 | 2016-05-09 | Monday | 485 | 535 | 10378 |
| 7086361926 | 2016-04-26 | Tuesday | 705 | 485 | 10387 |
| 3977333714 | 2016-05-01 | Sunday | 609 | 383 | 10414 |
| 3977333714 | 2016-04-17 | Sunday | 600 | 381 | 10415 |
| 6962181067 | 2016-04-26 | Tuesday | 697 | 441 | 10433 |
| 6117666160 | 2016-04-20 | Wednesday | 553 | 474 | 10449 |
| 2347167796 | 2016-04-15 | Friday | 627 | 452 | 10465 |
| 6962181067 | 2016-05-05 | Thursday | 608 | 467 | 10524 |
| 5553957443 | 2016-04-27 | Wednesday | 811 | 347 | 10538 |
| 1503960366 | 2016-04-20 | Wednesday | 818 | 360 | 10544 |
| 4388161847 | 2016-04-20 | Wednesday | 700 | 421 | 10553 |
| 1503960366 | 2016-05-01 | Sunday | 730 | 369 | 10602 |
| 4702921684 | 2016-05-10 | Tuesday | 709 | 416 | 10613 |
| 3977333714 | 2016-04-29 | Friday | 744 | 333 | 10645 |
| 7086361926 | 2016-05-08 | Sunday | 724 | 481 | 10677 |
| 2026352035 | 2016-05-09 | Monday | 483 | 531 | 10685 |
| 7086361926 | 2016-04-19 | Tuesday | 697 | 472 | 10688 |
| 6962181067 | 2016-04-22 | Friday | 702 | 425 | 10725 |
| 1503960366 | 2016-04-13 | Wednesday | 776 | 384 | 10735 |
| 6962181067 | 2016-04-19 | Tuesday | 634 | 476 | 10742 |
| 6962181067 | 2016-04-29 | Friday | 597 | 433 | 10762 |
| 4319703577 | 2016-04-27 | Wednesday | 780 | 286 | 10780 |
| 4319703577 | 2016-04-28 | Thursday | 538 | 497 | 10817 |
| 5577150313 | 2016-04-21 | Thursday | 644 | 414 | 10830 |
| 5553957443 | 2016-04-25 | Monday | 709 | 433 | 10946 |
| 7086361926 | 2016-05-04 | Wednesday | 812 | 420 | 10988 |
| 4388161847 | 2016-04-18 | Monday | 1062 | 99 | 11009 |
| 5577150313 | 2016-05-03 | Tuesday | 580 | 508 | 11045 |
| 7086361926 | 2016-04-27 | Wednesday | 755 | 469 | 11107 |
| 6117666160 | 2016-04-19 | Tuesday | 468 | 465 | 11135 |
| 4702921684 | 2016-04-16 | Saturday | 672 | 382 | 11140 |
| 3977333714 | 2016-04-25 | Monday | 781 | 262 | 11177 |
| 1503960366 | 2016-04-29 | Friday | 815 | 341 | 11181 |
| 4388161847 | 2016-04-27 | Wednesday | 640 | 439 | 11193 |
| 8378563200 | 2016-04-16 | Saturday | 524 | 611 | 11207 |
| 7086361926 | 2016-04-12 | Tuesday | 697 | 514 | 11317 |
| 3977333714 | 2016-04-26 | Tuesday | 797 | 250 | 11388 |
| 5553957443 | 2016-04-28 | Thursday | 756 | 421 | 11393 |
| 6962181067 | 2016-04-18 | Monday | 689 | 442 | 11404 |
| 8378563200 | 2016-05-01 | Sunday | 693 | 475 | 11419 |
| 2347167796 | 2016-04-27 | Wednesday | 611 | 448 | 11423 |
| 6117666160 | 2016-04-23 | Saturday | 389 | 492 | 11495 |
| 3977333714 | 2016-05-07 | Saturday | 703 | 237 | 11550 |
| 7086361926 | 2016-04-28 | Thursday | 810 | 354 | 11584 |
| 5553957443 | 2016-04-12 | Tuesday | 667 | 441 | 11596 |
| 5553957443 | 2016-05-09 | Monday | 614 | 453 | 11611 |
| 3977333714 | 2016-04-20 | Wednesday | 871 | 152 | 11658 |
| 3977333714 | 2016-04-18 | Monday | 605 | 412 | 11663 |
| 3977333714 | 2016-05-06 | Friday | 676 | 323 | 11677 |
| 5553957443 | 2016-04-22 | Friday | 776 | 322 | 11682 |
| 4020332650 | 2016-05-05 | Thursday | 916 | 226 | 11728 |
| 6962181067 | 2016-04-21 | Thursday | 689 | 451 | 11835 |
| 5553957443 | 2016-04-26 | Tuesday | 691 | 412 | 11886 |
| 1503960366 | 2016-05-07 | Saturday | 833 | 331 | 11992 |
| 1503960366 | 2016-05-09 | Monday | 835 | 338 | 12022 |
| 3977333714 | 2016-04-23 | Saturday | 724 | 235 | 12058 |
| 5577150313 | 2016-04-15 | Friday | 659 | 392 | 12087 |
| 6962181067 | 2016-05-03 | Tuesday | 698 | 394 | 12109 |
| 4388161847 | 2016-04-22 | Friday | 1033 | 82 | 12139 |
| 1503960366 | 2016-05-06 | Friday | 754 | 334 | 12159 |
| 2026352035 | 2016-05-05 | Thursday | 480 | 468 | 12167 |
| 8378563200 | 2016-04-22 | Friday | 687 | 441 | 12200 |
| 1503960366 | 2016-05-10 | Tuesday | 746 | 383 | 12207 |
| 5577150313 | 2016-04-17 | Sunday | 461 | 549 | 12231 |
| 3977333714 | 2016-05-05 | Thursday | 680 | 318 | 12312 |
| 6962181067 | 2016-05-09 | Monday | 648 | 489 | 12342 |
| 5553957443 | 2016-04-21 | Thursday | 743 | 399 | 12346 |
| 2026352035 | 2016-04-23 | Saturday | 454 | 522 | 12357 |
| 5577150313 | 2016-04-30 | Saturday | 592 | 412 | 12363 |
| 4388161847 | 2016-05-04 | Wednesday | 719 | 390 | 12375 |
| 8378563200 | 2016-04-13 | Wednesday | 654 | 447 | 12386 |
| 7086361926 | 2016-05-01 | Sunday | 730 | 388 | 12390 |
| 8378563200 | 2016-04-25 | Monday | 764 | 388 | 12405 |
| 3977333714 | 2016-04-19 | Tuesday | 738 | 219 | 12414 |
| 7086361926 | 2016-05-06 | Friday | 834 | 322 | 12461 |
| 4388161847 | 2016-04-30 | Saturday | 653 | 409 | 12533 |
| 5577150313 | 2016-04-19 | Tuesday | 733 | 449 | 12574 |
| 6962181067 | 2016-04-28 | Thursday | 653 | 440 | 12627 |
| 1503960366 | 2016-04-16 | Saturday | 806 | 340 | 12669 |
| 4702921684 | 2016-04-17 | Sunday | 448 | 591 | 12692 |
| 5553957443 | 2016-04-29 | Friday | 614 | 450 | 12764 |
| 1503960366 | 2016-05-11 | Wednesday | 824 | 285 | 12770 |
| 7086361926 | 2016-05-07 | Saturday | 621 | 530 | 12827 |
| 5553957443 | 2016-05-03 | Tuesday | 709 | 380 | 12848 |
| 6962181067 | 2016-05-02 | Monday | 588 | 466 | 12912 |
| 8378563200 | 2016-04-19 | Tuesday | 726 | 387 | 13070 |
| 3977333714 | 2016-05-10 | Tuesday | 676 | 312 | 13072 |
| 1503960366 | 2016-04-28 | Thursday | 782 | 366 | 13154 |
| 1503960366 | 2016-04-12 | Tuesday | 728 | 327 | 13162 |
| 6962181067 | 2016-04-16 | Saturday | 741 | 357 | 13217 |
| 4388161847 | 2016-04-23 | Saturday | 661 | 478 | 13236 |
| 3977333714 | 2016-04-30 | Saturday | 769 | 237 | 13238 |
| 6962181067 | 2016-04-25 | Monday | 677 | 400 | 13239 |
| 8378563200 | 2016-04-14 | Thursday | 667 | 424 | 13318 |
| 5577150313 | 2016-05-01 | Sunday | 598 | 379 | 13368 |
| 3977333714 | 2016-04-16 | Saturday | 663 | 283 | 13459 |
| 3977333714 | 2016-05-04 | Wednesday | 852 | 213 | 13559 |
| 7086361926 | 2016-05-09 | Monday | 714 | 427 | 13566 |
| 3977333714 | 2016-05-08 | Sunday | 688 | 259 | 13585 |
| 8378563200 | 2016-04-18 | Monday | 688 | 398 | 13630 |
| 4558609924 | 2016-04-21 | Thursday | 844 | 126 | 13743 |
| 1503960366 | 2016-04-26 | Tuesday | 833 | 245 | 13755 |
| 6962181067 | 2016-04-20 | Wednesday | 667 | 418 | 13928 |
| 6117666160 | 2016-04-15 | Friday | 515 | 391 | 14019 |
| 1503960366 | 2016-05-05 | Thursday | 857 | 247 | 14070 |
| 3977333714 | 2016-04-24 | Sunday | 660 | 310 | 14112 |
| 5577150313 | 2016-04-16 | Saturday | 597 | 406 | 14269 |
| 5553957443 | 2016-05-05 | Thursday | 662 | 419 | 14331 |
| 3977333714 | 2016-05-03 | Tuesday | 594 | 292 | 14335 |
| 7086361926 | 2016-04-20 | Wednesday | 693 | 492 | 14365 |
| 4702921684 | 2016-05-07 | Saturday | 407 | 520 | 14370 |
| 1503960366 | 2016-04-23 | Saturday | 732 | 361 | 14371 |
| 6117666160 | 2016-04-16 | Saturday | 502 | 380 | 14450 |
| 8378563200 | 2016-04-15 | Friday | 634 | 513 | 14461 |
| 7086361926 | 2016-04-30 | Saturday | 584 | 485 | 14560 |
| 1503960366 | 2016-04-30 | Saturday | 712 | 404 | 14673 |
| 1503960366 | 2016-05-02 | Monday | 798 | 277 | 14727 |
| 4702921684 | 2016-04-24 | Sunday | 543 | 480 | 15050 |
| 1503960366 | 2016-05-03 | Tuesday | 816 | 273 | 15103 |
| 8053475328 | 2016-04-20 | Wednesday | 695 | 486 | 15108 |
| 4702921684 | 2016-04-23 | Saturday | 482 | 465 | 15126 |
| 8378563200 | 2016-04-21 | Thursday | 725 | 396 | 15148 |
| 1503960366 | 2016-04-25 | Monday | 814 | 277 | 15355 |
| 6962181067 | 2016-05-10 | Tuesday | 588 | 469 | 15448 |
| 5553957443 | 2016-04-19 | Tuesday | 684 | 357 | 15482 |
| 1503960366 | 2016-04-19 | Tuesday | 775 | 304 | 15506 |
| 5577150313 | 2016-04-24 | Sunday | 409 | 543 | 15764 |
| 8378563200 | 2016-04-26 | Tuesday | 587 | 550 | 16208 |
| 5553957443 | 2016-05-10 | Tuesday | 631 | 418 | 16358 |
| 3977333714 | 2016-05-02 | Monday | 713 | 230 | 16520 |
| 5553957443 | 2016-04-15 | Friday | 696 | 377 | 16556 |
| 2347167796 | 2016-04-23 | Saturday | 627 | 374 | 16901 |
| 5553957443 | 2016-04-14 | Thursday | 664 | 357 | 17022 |
| 4388161847 | 2016-05-08 | Sunday | 576 | 529 | 17298 |
| 6117666160 | 2016-04-21 | Thursday | 566 | 508 | 19542 |
| 8053475328 | 2016-05-07 | Saturday | 1076 | 74 | 19769 |
| 6962181067 | 2016-04-23 | Saturday | 448 | 528 | 20031 |
| 8053475328 | 2016-04-23 | Saturday | 741 | 331 | 22359 |
| 4388161847 | 2016-05-07 | Saturday | 508 | 472 | 22770 |

1. Relationship between sedentary minutes and minutes asleep.

The analysis revealed a negative correlation between time spent sitting and time spent asleep. Inconsistencies in tracking may influence the accuracy of the results, and to draw a more definitive conclusion, analysing a better sample would be necessary. Nonetheless, this finding remains interesting.

[Scatter plot: sedentary minutes vs. minutes asleep]

2. Relationship between number of steps and minutes asleep.

On the other hand, I found no relationship between the number of steps and time spent asleep.

[Scatter plot: total steps vs. minutes asleep]
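
Both relationships can be quantified with a Pearson correlation over the source table above. A minimal sketch; the file name is a placeholder for wherever the merged table is stored, and the column names match the table as shown:

```python
import pandas as pd

# Placeholder file name for the merged source table shown above.
df = pd.read_csv("CCS2_dailyActivityMerged_v04.csv")

r_sedentary = df["Sedentary Minutes"].corr(df["Total Minutes Asleep"])
r_steps = df["Total Steps"].corr(df["Total Minutes Asleep"])

print(f"Sedentary vs. asleep: r = {r_sedentary:.2f}")  # negative
print(f"Steps vs. asleep:     r = {r_steps:.2f}")      # near zero
```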

Hourly Average Intensity

In the following graph, we observe the level of activity throughout the day. The highest activity levels are typically recorded between 17:00 and 19:00, with two noticeable dips around 11:00 and 15:00.

| Hour | Average Intensity |
| --- | --- |
| 00:00 | 0.04 |
| 01:00 | 0.02 |
| 02:00 | 0.02 |
| 03:00 | 0.01 |
| 04:00 | 0.01 |
| 05:00 | 0.08 |
| 06:00 | 0.13 |
| 07:00 | 0.18 |
| 08:00 | 0.24 |
| 09:00 | 0.26 |
| 10:00 | 0.29 |
| 11:00 | 0.28 |
| 12:00 | 0.33 |
| 13:00 | 0.31 |
| 14:00 | 0.31 |
| 15:00 | 0.26 |
| 16:00 | 0.30 |
| 17:00 | 0.36 |
| 18:00 | 0.37 |
| 19:00 | 0.36 |
| 20:00 | 0.24 |
| 21:00 | 0.20 |
| 22:00 | 0.15 |
| 23:00 | 0.08 |
[Chart: average intensity by hour of the day]
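
Such a per-hour profile can be derived from the hourly intensities table by grouping on the hour of the day. A minimal sketch, again with file and column names modeled on the Kaggle export (treat them as assumptions):

```python
import pandas as pd

# File and column names modeled on the Kaggle export.
hourly = pd.read_csv("hourlyIntensities_merged.csv", parse_dates=["ActivityHour"])

# Average intensity across all users and days, per hour of the day.
by_hour = (hourly.assign(hour=hourly["ActivityHour"].dt.hour)
                 .groupby("hour")["AverageIntensity"]
                 .mean()
                 .round(2))
print(by_hour)
```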

6th Phase: ACT

Now it's time to act on your data: take everything you have learned from your data analysis and put it to use. This could mean providing your stakeholders with recommendations based on your findings so they can make data-driven decisions.

Questions to ask yourself in this step:

  1. How can I use the feedback I received during the share phase (step 5) to actually meet the stakeholder's needs and expectations?
