E-TRIALS leverages ASSISTments, one of the few evidence-based online learning interventions in the nation, to provide a unique platform for randomized controlled experiments in authentic learning environments
Like shared scientific instruments in other domains (e.g., the Hubble Space Telescope), E-TRIALS (formerly known as the ASSISTments Testbed) provides the infrastructure educational researchers need to push the boundaries of inquiry
E-TRIALS gives scientists the freedom to run experiments in authentic learning environments, revolutionizing how learning science research is conducted
What Is E-TRIALS?
Ed Tech Research Infrastructure to Advance Learning Science
What Can E-TRIALS Do For You?
Learn more about how you can use this revolutionary tool to conduct educational research
We collaborate with researchers to design and conduct cutting-edge learning science experiments
Help Students Learn
Our projects have been associated with increased student learning in rural, urban, and suburban schools
Support Quality Research
Our infrastructure is easy to use and supports researchers as they design their experiments
Collect Rigorous Data
ASSISTments had 100,000 users who completed 12 million problems in the 2017-2018 academic year; these data are de-identified and made available to researchers
Case Study: NSF provides $1 million in funding for a study conducted in E-TRIALS
Dr. Candace Walkington of Southern Methodist University received $1 million from the NSF to collaborate with the E-TRIALS team. In her project, Dr. Walkington is examining the effects of tailoring algebra question content to students’ career interests. For more info, see Walkington's website.
Research Made Simple
E-TRIALS is easy to navigate; experimental conditions and content can be customized to address almost any research question
"Quite simply, E-TRIALS is a dream come true for researchers interested in student learning."
- Virginia Clinton, University of North Dakota
Peer-Reviewed Publications Made Possible by E-TRIALS
Smith, H., Harrison, A., Chan, J. C., & Ottmar, E. (under review). Dynamic vs. static: Which worked examples work best? Poster submitted to the 2020 meeting of the Mathematical Cognition and Learning Society. [Pre-registration]
Harrison, A., Smith, H., Hulse, T., & Ottmar, E. (2020). Spacing out!: Manipulating spatial features in mathematical expressions affects performance. Journal of Numerical Cognition, 6(2), 186-203. DOI: 10.5964/jnc.v6i2.243
Harrison, A., Smith, H., Hulse, T., & Ottmar, E. (2020). Spacing out: Manipulating spatial features in math expressions affects performance. Paper to be presented in a roundtable session on "Design Considerations in Mathematics Learning" at the 2020 American Educational Research Association Annual Meeting, San Francisco, CA.
Duquennois, C. (2019). Fictional money, real costs: Impacts of financial salience on disadvantaged students. PhD paper.
Walkington, C., Clinton, V., & Sparks, A. (2019). The effect of language modification of mathematics story problems on problem-solving in online homework. Instructional Science, April 2019. DOI: 10.1007/s11251-019-09481-6
Hurst, M., & Cordes, S. (2018). Labeling common and uncommon fractions across notation and education. In Proceedings of the Annual Conference of the Cognitive Science Society. ISBN: 978-0-9911967-8-4
McGuire et al. (2017). Counterintuitive effects of online feedback in middle school math: Results from a randomized controlled trial in ASSISTments. Educational Media International, 54(3).
Fyfe, E. R. (2016). Providing feedback on computer-based algebra homework in middle-school classrooms. Computers in Human Behavior, 63, 568-574.
Koedinger, K. R. & McLaughlin, E. A. (2016). Closing the Loop with Quantitative Cognitive Task Analysis. Proceedings of the 9th International Conference on Educational Data Mining.