Simulations: What Do You Need to Know?

Have you been hearing all of the talk about simulations? It seems that the focus of every learning analyst and every recent industry magazine article is software simulation. There are simulation shootouts at conference events and white papers from one e-learning guru after another.

Time for a sanity check. The reality is that most of the talk centers on the features and functions of these products and on lists of vendors that have been in the field for less than two years. There is not enough information about how to design and build effective training simulations using proven instructional methodology. The current hoopla around simulations is reminiscent of the emergence of learning management systems (LMS) just a few years ago: a new technology that everyone thought would be the answer to their needs. It's only in the past year that analysts and LMS customers have realized that tracking, scoring and administration features are useful only when the content is compelling and plentiful. In the meantime, companies are discovering the true cost of implementation and the actual time to return on their investment.

We should learn from these experiences and avoid the same issues with simulation technology. Currently, there is a disconnect between the industry buzz and questions you may have about how to meet your business and training needs through simulation. The “buzz” would have you believe that simulation should be the dominant form of training and that the higher the fidelity—the level of simulation realism—the better. Yet there has been virtually no discussion about two essential questions that must be considered in order to build effective software simulations:

  • What differentiates the design of an effective simulation from the design of any other effective learning experience?
  • What’s the right balance between fidelity and bandwidth, fidelity and timeline, and fidelity and learning?

The design of an effective software simulation is very similar to the design of any other effective learning event. Learning objectives, task analysis and audience analysis are crucial in developing simulations with complexity, support, relevance and feedback. The type and depth of simulation need to be related to performance objectives, task complexity and user skill level in order for the simulation to provide an effective means of training.

The biggest difference between effective simulation and effective non-simulation design is the issue of fidelity. In most learning events, a certain level of fidelity is required: If you are using a matching game with screen components and definitions, it’s important that the images used in the matching game accurately represent the screen images in the real application; otherwise, they would not serve any instructional purpose. In a software simulation, this issue goes much deeper. If the user needs to select from a drop-down list, should the list items represent the actual items in the application’s drop-down list? Should the user be able to use the arrow keys to move through the list as they could in the application? Should the user be able to select an item by clicking and pressing “Enter”? The central issue is this: To what extent does the fidelity of the simulation impact learning?
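To make the fidelity question concrete, here is a minimal sketch, written in TypeScript purely as an illustration. The SimulatedDropdown class and its behavior are hypothetical and do not represent any particular product's API; they simply contrast a low-fidelity drop-down that accepts only a mouse click with a higher-fidelity one that also mirrors the real application's arrow-key and Enter behavior.

    // Hypothetical sketch: two fidelity levels for a simulated drop-down list.
    // "low" fidelity accepts only a mouse click on an item;
    // "high" fidelity also reproduces the real application's keyboard behavior
    // (arrow keys move the highlight, Enter selects).

    type Fidelity = "low" | "high";

    class SimulatedDropdown {
      private highlighted = 0;

      constructor(
        private items: string[],   // list items copied from the real application
        private fidelity: Fidelity
      ) {}

      // Mouse path: supported at every fidelity level.
      click(index: number): string {
        this.highlighted = index;
        return this.items[index];
      }

      // Keyboard path: only simulated when fidelity is "high".
      press(key: "ArrowDown" | "ArrowUp" | "Enter"): string | null {
        if (this.fidelity === "low") return null;  // low fidelity ignores the keyboard
        if (key === "ArrowDown") {
          this.highlighted = Math.min(this.highlighted + 1, this.items.length - 1);
        } else if (key === "ArrowUp") {
          this.highlighted = Math.max(this.highlighted - 1, 0);
        } else {
          return this.items[this.highlighted];     // Enter selects, as in the real UI
        }
        return null;
      }
    }

    // Usage: the high-fidelity version lets a learner rehearse the same
    // keystrokes they will use in the live application.
    const list = new SimulatedDropdown(["Open", "Save", "Print"], "high");
    list.press("ArrowDown");
    console.log(list.press("Enter")); // "Save"

The higher-fidelity version lets learners practice the exact interactions they will perform in the live application; the trade-off is the additional effort required to capture and reproduce that behavior, which is precisely the balance the next section addresses.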

Once you have determined the appropriate level of simulation fidelity, the next logical questions are about balance: What is the right balance between timeline, bandwidth, learning and fidelity? In other words, if your optimum level of fidelity requires a million-dollar budget and your budget is a third of that, you may need to rethink fidelity. Similarly, if you have six weeks to make 20 simulations, you may need to tailor the fidelity level, even if the result is a less robust learning experience.

Historically, organizations have had two options: use low-end, off-the-shelf tools that can produce very rudimentary simulations quickly, or custom-develop simulations at a steep cost in both time and money. Most off-the-shelf tools quickly hit a ceiling on fidelity, while high-end custom development quickly hits a ceiling on cost and timeline. Generally, this means either that learners settle for less-than-optimum simulation depth so the business can meet timeline and cost constraints, or that organizations make a significant investment while hoping the application doesn't change before the custom simulation is done. Neither is an ideal situation. There must be a balance between fidelity, time to produce and cost.

What is required is a high-fidelity, off-the-shelf tool that can cost-effectively meet the needs of projects with short timelines and modest budgets. This tool needs to provide the means to stay within established timeline and cost constraints while simultaneously exceeding the threshold for fidelity and learning. With this type of tool, the designer has the freedom and flexibility to concentrate on the essential components—performance objectives, task analysis and audience analysis. These are the factors that should drive the level of fidelity, not timeline, bandwidth or cost. Ideally, the designers should be free to focus on creating the most effective training for their learners, rather than making concessions because of business constraints.

The good news is that vendors have answered the call, and next-generation simulation-creation tools are now hitting the market that:

  • Create higher levels of fidelity than most custom-developed simulations.
  • Empower subject-matter experts, not programmers, to create robust simulations in minutes rather than days or weeks.
  • Enable subject-matter experts to build instructionally sound simulations, an important component of an effective training solution.

Dave Wilkins, director of Tools and Technology at Knowledge Impact, has more than 10 years of experience in education, application training and e-learning technologies. He is responsible for leading the engineering team at Knowledge Impact.

Annette DiLello, director of Product Development at Knowledge Impact, has more than 12 years of experience in education, application training and e-learning technologies. She is responsible for leading the product development team at Knowledge Impact.
