We encourage individual researchers (often learners developing a dissertation) and institutions to conduct research using their SmarterMeasure data. Such research not only further validates the assessment but also adds to a growing body of research on online student readiness.
Use the menu on the left to explore the following content:
 Research Results – View summaries and full-text white papers of research conducted at peer institutions.
 Assessment Details – Contains an overview and definition of each of the constructs measured by the assessment.
 Construct Validity – Measurements of the degree to which the assessment is measuring the defined constructs.
 Item Reliability – Measurements of the consistency of the assessment items.
 Case Studies – Detailed descriptions of the implementation and benefit of using SmarterMeasure at select institutions.
 Usage Patterns – Measurements of the most common implementation patterns for utilizing the assessment.
 Accreditation – Link to statements made by several major accrediting agencies concerning the requirement to measure online student readiness.
If you are interested in conducting research related to online student readiness using SmarterMeasure data, please contact your SmarterMeasure account manager or contact Dr. Mac Adkins using the information on the Contact page of this website. The table below provides some suggested research strategies.
Academic Success – What is the correlation between SmarterMeasure™ scores and learners' grades?
Analysis: Correlation and Analysis of Variance (ANOVA). Stronger relationships may be found between scores on individual attributes and academic achievement; other case studies have found individual attributes to be the strongest indicators of academic success.

Student Engagement – What is the correlation between SmarterMeasure™ scores and metrics of student engagement?
Analysis: Correlation, independent-samples t-tests, discriminant analysis. Stronger relationships may be found with scores in technical competency and technical knowledge, especially for learners in their first term of enrollment. As demonstrated in the National Student Readiness Report, scores on technical competency and knowledge improve as students gain experience studying online or in a technology-rich environment. First-time students are often confused about how to participate in their courses. See "Engagement Metrics" for more information.

Student Satisfaction – What is the relationship between SmarterMeasure™ scores and metrics of student satisfaction?
Analysis: Analysis of Variance (ANOVA), independent-samples t-tests, discriminant analysis, structural equation modeling. Responses to end-of-course survey items such as "I would enroll in another online course" could be used to segment students into groups, and the means of the SmarterMeasure™ scale scores could then be compared across the groups.

Student Retention – What is the relationship between SmarterMeasure™ scores and metrics of student retention?
Analysis: Correlation, independent-samples t-tests, discriminant analysis, multiple regression. SmarterMeasure™ scale scores could be compared between retained and non-retained students.

Quantitative Student Feedback – What is the relationship between SmarterMeasure™ scores and quantitative points of student feedback?
Analysis: Correlation. Students typically take SmarterMeasure™ near the beginning of their enrollment. After students have completed their first term, encourage them to submit a survey about their experiences in online or technology-rich courses. Correlations between these reported experiences and the students' initial SmarterMeasure™ scores can then be calculated as a measure of the construct validity of SmarterMeasure. Questions appropriate for this survey are provided below.

Qualitative Student Feedback – What is the relationship between SmarterMeasure™ scores and qualitative points of student feedback?
Analysis: Comparison. Assemble a focus group of students for a one-hour conversation about topics such as the construct of learner readiness; a list of possible discussion-starting questions is presented below. Compare the students' observations either to their individual SmarterMeasure™ scores or to aggregate scores from the general population of students who have taken SmarterMeasure.

Integration Plan Comparison – Is there a difference in SmarterMeasure™ scores between schools with strong implementation plans and schools with weaker ones?
Analysis: Independent-samples t-test. Considerable variance exists among the implementation plans of different schools and campuses, and the impact of SmarterMeasure™ may depend on the strength of the implementation plan. Results from the research strategies suggested above could be compared between schools with different implementation plans.

Comparison to Other Standardized Tests – What is the relationship between SmarterMeasure scores and scores on other standardized instruments such as the Myers-Briggs, SAT, COMPASS, or ACCUPLACER?
Analysis: Correlation. Calculate correlations between measurements of student readiness and other measurements of student aptitude collected through admissions assessments.

Rationale for Withdrawal – What is the relationship between SmarterMeasure™ scores and the reasons students give for stopping or dropping out of courses or degree programs? At one institution, 55% of the reasons provided for dropping out were attributed to "life happening."
Analysis: Descriptive statistics for qualitative data and correlation for quantitative data. Analyze SmarterMeasure™ Life Factors subscale scores against stated reasons for withdrawal.

Extraneous Factors to Consider
When conducting research using SmarterMeasure data, one should be aware of the possible impact of these extraneous factors.
Self-Selection Bias – At some institutions students are given the option to take the SmarterMeasure Learning Readiness Indicator. In that case it can be assumed that, at some level, a student demonstrates a degree of diligence simply by completing the 125-item assessment. The school's data set may therefore be over-populated with data from "diligent" students and under-populated with data from "non-diligent" students. If this skewed data set were then used, for example, to compute a correlation with grades, while the grades were drawn from the entire population of students, including both "diligent" and "non-diligent" students, the comparison would not be valid. To compensate for this, schools are encouraged to require all students to complete the assessment.
Treatment Effect – We stress that the SmarterMeasure Learning Readiness Indicator is an indicator of levels of non-cognitive readiness for learning online or in a technology-rich environment; it does not currently provide the remediation. Much like a thermometer, it measures the fever but is not the medicine. While we do link to several resources for remediation, schools are encouraged to link to and provide their own remediation resources. Some schools excel at individually following up with students about their SmarterMeasure scores and providing rich remediation resources. This is exactly what we want to happen; however, it can also serve to invalidate the assessment scores. For example, suppose a student shows low levels of readiness as indicated by SmarterMeasure. The school then diligently provides rich remediation resources in an orientation course, and the student improves in several areas. Because of this remediation, the student achieves higher levels of academic success, engagement, satisfaction, and retention than he or she would have otherwise. If the school then correlates SmarterMeasure scores to these key student performance indicators, the relationship will appear weaker than it truly is, because the remediation has changed the very outcomes the scores were meant to predict.