The rising cost of delivering training to employees has driven many corporate managers to migrate traditional face-to-face instruction to e-learning environments. However, the conversion from synchronous to asynchronous delivery often poses the challenge of allocating sufficient time and resources to the ADDIE phases of instructional design. Limited time and budgetary constraints often force instructional designers to adopt alternative design processes that allow them to deploy instruction rapidly, evaluate data in real time, and make dynamic design changes intuitively. This paper explores the iterative development process of an e-learning training environment that used User Experience evaluations to drive design revisions. All design revisions were based on the analysis of data from a series of User Experience evaluations aimed at measuring both the usability and the learnability, or “learner experience,” of the e-learning environment. Synchronous, remote usability testing was conducted following Steve Krug’s framework for web usability, and asynchronous, embedded learner experience evaluations followed Kirkpatrick’s Four-Level Training Evaluation Model. A total of three iterative cycles occurred within a three-month period. The results of eight usability tests and 138 learner experience evaluations are discussed.
Jessica Leauanae, University of Hawaii, Provo, UT, USA