Research Results


Research Results from Individual Schools

We encourage our client institutions to conduct research on the relationship between the readiness variables measured by SmarterMeasure and indicators of student success such as academic achievement, engagement, retention, and satisfaction. Summaries of some of these research projects appear below. A compilation of these research findings is also provided for easier viewing.

Middlesex Community College - Middletown, Connecticut

To determine whether SmarterMeasure scores relate to students' grades in online learning, a correlation study examined the relationships between SmarterMeasure scores and course grades. The preliminary study, conducted in Spring 2009 and Summer 2009 on 750 cases, showed a significant positive correlation between the personal attributes score and grades: the higher a student's personal attributes score, the higher the grade the student tended to receive. This result implies that personal attributes, represented by self-motivation, self-discipline, and time management, play a very important role in online student success. A subsequent study in Fall 2010 analyzed grades for 3,228 cases collected across six academic terms and confirmed the significant correlation between the personal attributes score and students' grades. Middlesex Community College used these findings to modify the student services it provides to online learners. This pattern of assessing learner readiness and then providing services matched to identified deficiencies resulted in substantial gains in student retention. Before SmarterMeasure was implemented, 6% to 13% more students failed online courses than on-ground courses; after implementation, the gap narrowed to 1.3% to 5.8%. View full case study.
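As a minimal sketch of this type of correlation analysis (not the College's actual procedure), the following Python snippet shows how such a relationship could be computed; the file name and column names are hypothetical.

```python
# A minimal sketch, assuming a hypothetical CSV export with a numeric
# SmarterMeasure "personal_attributes" score and a numeric "final_grade"
# for each student.
import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("smartermeasure_grades.csv")  # hypothetical export
df = df.dropna(subset=["personal_attributes", "final_grade"])

r, p = pearsonr(df["personal_attributes"], df["final_grade"])
print(f"n = {len(df)}, r = {r:.3f}, p = {p:.4f}")
# A significant positive r would echo the finding above: higher
# personal-attributes scores go with higher grades.
```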

Argosy University - Chicago, Illinois

Argosy University enhances the student experience by integrating SmarterMeasure into its Freshman Experience course. As a course activity, students reflect on their SmarterMeasure scores and articulate areas for improvement as part of the Personal Development Plan they develop. Also during the course, students are grouped with peers who have similar traits, as identified by SmarterMeasure, to reflect on their readiness for online education.

Argosy University designed a four-part research project to Compare, Explore, Trend, and Apply findings from an analysis of SmarterMeasure data.

  • Compare – Argosy University provides SmarterMeasure to students in both its online and hybrid courses. The University had been operating on the general assumption that students' traits and competencies were parallel across these two delivery systems. Using SmarterMeasure data, the University compared the traits, attributes, and skills of online and hybrid students. The analysis found substantial differences between the two groups, and as a result, changes were made to the instructional design process for each delivery system.
  • Explore – The University conducted a correlational analysis to measure the relationships between SmarterMeasure scores and measures of student satisfaction, retention, and academic success. The findings revealed positive, statistically significant relationships among these constructs; in particular, statistically significant relationships were identified between retention and the SmarterMeasure constructs of Technical Competency, Motivation, and Availability of Time.
  • Trend – The University conducted an aggregate analysis of SmarterMeasure data to identify mean scores for incoming students and gauge changes in the student body. In addition to the mean scores for its student population each term, a comparison was made to the national mean scores published each year in the Student Readiness Report, which provides aggregate data for roughly 300 higher education institutions.
  • Apply – These analyses were not conducted and then shelved. The findings were shared with the instructional design and student services groups, and process improvements were made. For example, because technical competency scores increase as students take more online courses, the instructional designers purposefully limited the first courses students take to basic forms of technology.

J. Sargeant Reynolds Community College - Richmond, Virginia

As part of its Quality Enhancement Plan (QEP), J. Sargeant Reynolds Community College adopted SmarterMeasure, a tool that assesses student readiness for learning in the online classroom. An analysis was conducted to determine the relationship between SmarterMeasure sub-scale scores and students' grades. The factors that demonstrated the highest correlation between SmarterMeasure performance and students' academic success were the following:

  • Skills - The results indicated that 66% of the students who scored Medium-High to High in the Skills factor succeeded in their online classes. By contrast, only 5% of students who scored Low-Medium in the Skills section were successful.
  • Time - Of those who scored Medium-High to High by demonstrating that they had adequate time available, 62% were academically successful; only 10% of those who scored Low-Medium to Low were similarly successful.
  • Resources - The results indicated that 66% of the students who scored Medium-High to High in the Resources factor succeeded in their online classes, and only 5% of students who scored Low or Low-Medium in the Resources section were successful.
  • Place - Among those who scored Medium-High to High in the Place factor, 72% were successful in their online courses.

North Central Michigan College - Petoskey, Michigan

Leaders at North Central Michigan College recognize the value of multiple assessments of students in the admissions process. In addition to using SmarterMeasure to measure online student readiness, they use the COMPASS exam (provided by ACT) to measure incoming students' skills in reading, writing, and math. To determine the degree of relationship between measures of online learner readiness and measures of academic readiness, they computed correlations between the scores on the two exams. Statistically significant correlations were found between four of the six SmarterMeasure scales and sections of the COMPASS exam.

                          COMPASS
  SMARTERMEASURE        | Math | English | Reading | E-Write
  ----------------------|------|---------|---------|--------
  Learning Styles       |      |    X    |    X    |    X
  Reading               |  X   |    X    |    X    |    X
  Individual Attributes |      |         |    X    |
  Life Factors          |      |         |    X    |    X

X = Statistically Significant Correlation (p < 0.05)
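As a hedged illustration (not the College's actual procedure), the following Python sketch shows how a significance table like the one above could be generated from a merged score export; the file name and column names are hypothetical.

```python
# A minimal sketch, assuming a hypothetical merged export with one row per
# student and numeric columns for four SmarterMeasure scales and four
# COMPASS sections. Cells are marked "X" when the Pearson correlation is
# significant at p < 0.05, mirroring the table above.
import pandas as pd
from scipy.stats import pearsonr

sm_scales = ["learning_styles", "reading_sm", "individual_attributes", "life_factors"]
compass_sections = ["math", "english", "reading_compass", "e_write"]

df = pd.read_csv("sm_compass_merged.csv")  # hypothetical export

flags = pd.DataFrame("", index=sm_scales, columns=compass_sections)
for scale in sm_scales:
    for section in compass_sections:
        pair = df[[scale, section]].dropna()
        r, p = pearsonr(pair[scale], pair[section])
        if p < 0.05:
            flags.loc[scale, section] = "X"
print(flags)
```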

The providers of SmarterMeasure encourage schools to conduct research with SmarterMeasure data regarding their own students. When schools plan an analysis of their SmarterMeasure data, they often plan first to correlate SmarterMeasure scores with students' grades in the course. This is a welcome analysis and typically yields statistically significant findings. A 2008 study conducted by Atanda Research analyzed the SmarterMeasure scores of 2,622 randomly selected students representing over 300 schools; correlations significant at the .05 level or higher were found between 11 of the 15 SmarterMeasure score variables and students' grades. However, this analysis is not the most appropriate way to measure the construct validity of SmarterMeasure scores, because students' grades are affected by a myriad of variables (prior academic experiences, IQ, etc.). SmarterMeasure is not designed to be an indicator of academic success; several tools, such as the ACT, SAT, and GRE, serve that purpose. SmarterMeasure does not measure content knowledge in areas such as math, science, or history, so using SmarterMeasure solely as a predictor of academic success is not the most appropriate application.

In addition to correlating SmarterMeasure scores to grades, we recommend three other types of analysis that may be more valid measurements of the applicability of SmarterMeasure.

  • Compare dropouts to persisters. Identify students who dropped out of their courses and compare the means of their SmarterMeasure scores to the means of the scores of students who persisted. The intent of SmarterMeasure is to identify students who are "at-risk" of not being a good fit for distance or technology-rich learning, and it is these students who are more likely to drop out. The real benefit of SmarterMeasure comes when schools can identify "at-risk" students and then provide the encouragement, remediation, and support those students need to remain in the course and be successful. (A minimal sketch of this comparison appears after this list.)
  • Survey goodness of fit. After students have completed their first online, hybrid, or technology-rich course, survey them about how well that format fit them. Ask how they did keeping up with the volume of reading in the course, the degree to which they could find time to participate in course activities, the level of frustration they had with their computer and the Internet, and so on. Then correlate their responses back to their SmarterMeasure scores. This type of study is very appropriate because SmarterMeasure is intended to predict goodness of fit for distance or technology-rich education. In the 2008 study conducted by Atanda Research, 63 of the 90 correlations calculated between measures of goodness of fit and SmarterMeasure scores were statistically significant at the .05 level or higher.
  • Conduct a qualitative study. Interview, individually or in focus groups, students who persisted in online or technology-rich courses and students who withdrew. Compare the factors that influenced their decisions to remain or withdraw against the means of your students' SmarterMeasure scores.
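The following Python sketch illustrates the first analysis type, comparing mean scores of students who withdrew against those who persisted; the file name, the "motivation" scale column, and the "withdrew" flag are hypothetical.

```python
# A minimal sketch, assuming a hypothetical export with one SmarterMeasure
# score column ("motivation") and a boolean "withdrew" outcome per student.
import pandas as pd
from scipy.stats import ttest_ind

df = pd.read_csv("smartermeasure_outcomes.csv")  # hypothetical export
dropped = df.loc[df["withdrew"], "motivation"].dropna()
persisted = df.loc[~df["withdrew"], "motivation"].dropna()

# Welch's t-test: compares the group means without assuming equal variances.
t, p = ttest_ind(persisted, dropped, equal_var=False)
print(f"persisted mean = {persisted.mean():.1f}, "
      f"dropped mean = {dropped.mean():.1f}, t = {t:.2f}, p = {p:.4f}")
```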

To assist schools in planning a research project using their SmarterMeasure data, we provide a Research Plan document that describes various research methodologies schools have used to analyze their SmarterMeasure data. Not only do we support additional analysis of SmarterMeasure data, we will support you in the effort. If your school would like to conduct a study like this, contact Dr. Mac Adkins for assistance in designing the study, exporting the correct data, and conducting the statistical analysis.

2013 Student Readiness Report

SmarterServices, LLC, the provider of the SmarterMeasure Learning Readiness Indicator, annually analyzes, in aggregate, the SmarterMeasure data of all students who took the assessment in the prior year. No data specific to individual students or individual schools is made publicly available. Data in the 2013 report came from 639,324 unique students at 275 higher education institutions who took the SmarterMeasure assessment between June 1, 2012 and May 31, 2013. Highlights of the report include the following statistically significant differences in means across gender, ethnicity, institution type, age range, and number of prior online courses taken, as they relate to student readiness for online learning.

  • Gender: Females had significantly higher means on the constructs of individual attributes, typing rate, and life factors. Males had significantly higher means on the constructs of reading rate and technical knowledge.
  • Ethnicity: Statistically significant differences in means were reported in all eight constructs based on ethnicity. African-Americans reported the highest mean for Individual Attributes. Caucasian/White students reported the highest means for Reading Recall, Typing Accuracy, Technical Competency, and Life Factors. Asian or Pacific Islander students reported the highest mean for Typing Rate. Students reporting Other Race had the highest mean for Technical Knowledge.
  • Age Range: Significant differences existed in six of the eight constructs measured. Generally speaking, age matters: for constructs related to personal maturity, older students had the highest means, while for constructs related to technical matters, younger students had the highest means. This was consistent with the prior four years' findings.
  • Number of Courses: The results demonstrated that experience with online learning matters. In each of the eight constructs measured, readiness measures improved as students took more online courses, and the differences in means were statistically significant in six of the seven scales. The greatest difference in means between students with no prior online course experience and those who had taken five or more courses continued (for the third consecutive year) to be in the area of technical knowledge, indicating that with experience students can learn to use the technology required for online courses.
  • Institution Type: An analysis of variance (ANOVA) was calculated to determine whether differences exist between students at different types of institutions; a minimal sketch of such an analysis follows this list. Significant differences existed between institution types on all six individual-attributes constructs measured. Associate's Colleges had the highest means for Locus of Control and Persistence. Master's Colleges and Universities had the highest means for Academic Attributes, Help Seeking, Procrastination, and Time Management. Comparisons were also made between for-profit and not-for-profit institutions. Public institutions had the highest means for Persistence, Procrastination, and Time Management. Private not-for-profit institutions, which historically are more selective in admissions, had the highest means for Academic Attributes, Help Seeking, and Locus of Control.
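As a hedged sketch of the ANOVA described in the last bullet (not the report's actual computation), the snippet below compares the means of one construct across institution types; the file and column names are hypothetical.

```python
# A minimal sketch, assuming a hypothetical export with an
# "institution_type" label and a numeric "time_management" construct score.
import pandas as pd
from scipy.stats import f_oneway

df = pd.read_csv("smartermeasure_by_institution.csv")  # hypothetical export
groups = [g["time_management"].dropna()
          for _, g in df.groupby("institution_type")]

F, p = f_oneway(*groups)  # one-way ANOVA across institution types
print(f"F = {F:.2f}, p = {p:.4f}")
print(df.groupby("institution_type")["time_management"].mean().round(1))
```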

This is the fifth year that the Student Readiness Report has been produced. For five years in a row, females have had significantly higher means in Individual Attributes and Academic Attributes, and males have had significantly higher means for Technical Knowledge. Caucasians have had significantly higher means in Technical Knowledge for five years. Students who have taken five or more online courses have reported significantly higher means in Individual Attributes and Technical Knowledge for four years.

A full copy of the 2013 report is available here. Student Readiness Reports for previous years: 2012 | 2011 | 2010 | 2009.

Brief Review of Literature on the Need for SmarterMeasure

With the shift toward online learning, it is important to explore the adoption of online education. Previous studies found that 64 percent of academic leaders believe it takes more discipline for a learner to succeed in an online course (Sloan Consortium, 2006), which places additional responsibility on students to be self-directed learners. Before the start of an online program or course, it should be determined whether a learner's instructional need can be met through a distance education approach (Willis & Lockee, 2004). Assessing the prerequisite skills of the distance learner is critical (Hsiu-Mei & Liaw, 2004; Simonson et al., 2003): learners need sufficient technological proficiency and a strong motivation to learn through technology (Hsiu-Mei & Liaw, 2004). In a study of twenty highly engaged institutions, Kuh et al. (2005) found that one common characteristic was to know the students: "where they came from, their preferred learning styles, their talents, and when and where they need help" (p. 301). Because of the difficulty of accommodating a group of learners with a wide range of acquired skills, requirements for prerequisite skills should be set (Falvo & Solloway, 2004). One researched approach examines online readiness through three aspects: (a) the student's preference for online instructional delivery as compared to traditional face-to-face instruction; (b) the student's confidence in using electronic communication for learning, including competence and confidence in the use of the Internet and computer-mediated communication; and (c) the ability to engage in autonomous learning (P. J. Smith et al., 2003). Hall (2008, para. 27) stated that "the primary value of the surveys may lie in raising awareness for any student considering enrolling in a distance education course."

Pamela Dupin-Bryant of Utah State University - Tooele conducted a study, published in The American Journal of Distance Education and titled "Pre-entry Variables Related to Retention in Online Distance Education," that identified pre-entry variables related to course completion and non-completion in university online distance education courses. Four hundred sixty-four students enrolled in online distance education courses participated in the study. Discriminant analysis revealed that six pre-entry variables were related to retention: cumulative grade point average, class rank, number of previous courses completed online, training in searching the Internet, training in operating systems and file management, and training in Internet applications. The results indicate that prior educational experience and prior computer training may help distinguish between individuals who complete university online distance education courses and those who do not. SmarterMeasure measures all of the variables that this study identified as indicators of success except class rank.
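As a hedged sketch of a discriminant analysis in this spirit (not Dupin-Bryant's actual procedure), the snippet below fits a linear discriminant model on six hypothetical pre-entry variables to separate completers from non-completers.

```python
# A minimal sketch, assuming a hypothetical export in which all six
# pre-entry variables are numeric and "completed" is coded 0/1.
import pandas as pd
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

predictors = ["gpa", "class_rank", "prior_online_courses",
              "search_training", "os_training", "internet_app_training"]

df = pd.read_csv("pre_entry_variables.csv").dropna()  # hypothetical export
lda = LinearDiscriminantAnalysis().fit(df[predictors], df["completed"])

# The sign and relative magnitude of the discriminant weights suggest which
# variables separate completers from non-completers.
for name, weight in zip(predictors, lda.coef_[0]):
    print(f"{name:>22}: {weight:+.3f}")
print(f"accuracy = {lda.score(df[predictors], df['completed']):.2%}")
```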

Developmental Students

When developmental students enroll in distance classes, they bring with them the same need for support that they have in a conventional classroom (Caverly and MacDonald, 1998; Rhoda and Burns, 2005), and surprisingly little research has been done on how best to facilitate the progress of underprepared students in an online class (Perez and Foshay, 2002). Distance education requires more self-directed learning and higher levels of personal motivation, independence and self-discipline (Sampson, 2003), in addition to the technical skills required for participation in an online class (Caverly and MacDonald, 1998). These are all skills in which underprepared students might be lacking. Fortunately, the same technology that delivers the class can deliver the support systems.

Additional Research Requests

Additional research on SmarterMeasure is welcomed. If you are interested in conducting research on the topic of online student readiness using SmarterMeasure data, please send a brief research request to Dr. Mac Adkins describing the purpose and plan for your research, including the proposed subjects, timeline, and plans for dissemination of the research. All research done using SmarterMeasure data must comply with our privacy statement: we never release to third parties any data that identifies individual students or specific schools.

Reference List

  • United States Distance Learning Association. (2004). Retrieved March 10, 2004, from http://www.usdla.org
  • Caverly, D., & MacDonald, L. (1998). Techtalk: Distance developmental education. Journal of Developmental Education, 2. Retrieved October 12, 2007, from Academic Search Premier.
  • Dupin-Bryant, P. A. (2004). Pre-entry variables related to retention in online distance education. American Journal of Distance Education, 18(4), 199-206.
  • Falvo, D. A., & Solloway, S. (2004). Constructing community in a graduate course about teaching with technology. TechTrends: Linking Research & Practice to Improve Learning, 48(5), 56.
  • Hall, M. (2008, Fall). Predicting student performance in web-based distance education courses based on survey instruments measuring personality traits and technical skills. Online Journal of Distance Learning Administration, 11. Retrieved April 20, 2009, from http://www.westga.edu/%7Edistance/ojdla/fall113/hall113.html
  • Hsiu-Mei, H., & Liaw, S.-S. (2004). Guiding distance educators in building web-based instructions. International Journal of Instructional Media, 31(2), 125.
  • Kuh, G. D., Kinzie, J., Schuh, J. H., Whitt, E. J. & Associates (2005). Student success in college: Creating conditions that matter. San Francisco: Jossey-Bass.
  • Perez, S., & Foshay, R. (2002). Adding up the distance: Can developmental studies work in a distance learning environment? T.H.E. Journal, 29, 16+. Retrieved May 22, 2007, from Questia.
  • Rhoda, K. R., & Burns, C. N. (2005). Developing an online writing center for distance learning courses. Paper presented at the 21st Annual Conference on Distance Learning and Teaching. Retrieved October 13, 2007, from http://www.uwex.edu/disted/conference/Resource_library/proceedings/05_1923.pdf
  • Sampson, N. (2003). Meeting the needs of distance learners. Language Learning & Technology, 7, 103+. Retrieved June 13, 2007, from Questia.
  • Simonson, M., Smaldino, S., Albright, M., & Zvacek, S. (2003). Teaching and learning at a distance. Upper Saddle River, NJ: Pearson Education, Inc.
  • Smith, P. J., Murphy, K. L., & Mahoney, S. E. (2003). Towards identifying factors underlying readiness for online learning: An exploratory study. Distance Education, 24(1), 57.
  • Willis, L. L., & Lockee, B. B. (2004). A pragmatic instructional design model for distance learning. International Journal of Instructional Media, 31(1), 9.