Transforming Surveys into Leadership Tools

A well-designed survey, as part of a measurement process, can yield quantifiable data to aid decision making.

If you need business intelligence right now, do you have unlimited money, time and personnel to get it? If you answered “no,” then this article can help you as a learning and development or talent management practitioner. If you answered “yes,” then let me know where I can send you my resume!

Studies indicate that users of business intelligence want timely data for informed decision making, often to validate a gut instinct. The value of business intelligence diminishes as time lags, so timely information is critical.

As a learning and development professional, you likely get asked questions after a key program or right before the budget cycle. Data is an objective way to state your business case, validate a program or forecast a future investment. Not surprisingly, many managers instead make decisions based on anecdotal evidence, which may be harmful in the long run.

The key is to have “roughly reasonable” data at your fingertips to make timely decisions. The data doesn’t need to be statistically precise or come from a 100 percent response rate; provided it is based on reasonable assumptions and represents a sample of the population, it can be a very good indicator to help make decisions.
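To put “roughly reasonable” in rough numbers: the worst-case margin of error of a simple random sample shrinks quickly with sample size. The following is a minimal sketch in Python (the language I’ll use for illustration throughout); the 95 percent confidence level and simple-random-sample assumption are mine, not a feature of any survey tool:

```python
import math

def margin_of_error(n, z=1.96):
    """Worst-case margin of error for a proportion estimated from a
    simple random sample of size n, at ~95% confidence (z = 1.96)."""
    return z * math.sqrt(0.25 / n)  # p*(1-p) peaks at p = 0.5

for n in (100, 400, 1000):
    print(f"n = {n:4d}: roughly +/- {margin_of_error(n):.1%}")
# n =  100: roughly +/- 9.8%
# n =  400: roughly +/- 4.9%
# n = 1000: roughly +/- 3.1%
```

In other words, a few hundred responses, not a census, is usually plenty for a directional read.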

This is where surveys come into play. A well-designed survey that is part of a measurement process can yield excellent sources of quantifiable data to aid in decision making. It can save the day as well. For example, a professional services firm held a leadership program last year. A senior partner called the CLO and stated he had been talking to another senior partner who had a negative experience at the last program. He requested that it be revamped.

The CLO went to his learning analytics tool and queried the evaluations from that program and all versions of it in the past year. He benchmarked the results against other programs within the corporate university and, using the same analytics tool, against an external benchmark of leadership programs outside his organization.

He then summarized the results in an e-mail to the senior partner, pointing out that the hundreds of senior managers and partners who completed the program ranked it in the top three of all programs in their offerings, and that it had a high rate of on-the-job applicability based on follow-up surveys. He also pointed out that it was in line with, and slightly higher than, leadership programs at other corporate universities based on the external benchmark. The senior partner responded with a brief reply to the effect of, “I’ll talk to my colleague and suggest he talk to you, as this appears to be an isolated incident. Let’s tweak things based on his feedback but not revamp it.”

How many of you would have been forced to start revamping a perfectly good program? If you don’t have credible data in a timely manner, you might find yourself on the defensive, and that is not a good situation to be in. Nor is it appropriate to respond that you’ll conduct a six-month analysis of the program; no senior executive wants to hear that.

Now that I’ve captured your attention, I hope to further orient you toward the power of well-designed surveys for learning and talent management. If certain risks are mitigated, and if surveys are done right and leverage standards and technology, they can be powerful tools that your team’s leadership and your organization can use to make data-driven decisions, not decisions based on rules of thumb or on whoever can yell the loudest.

Human Capital Surveys
First, let’s briefly cover the talent management and human capital spectrum where surveys can be used for informed decision making. The following is a process classification scheme that takes a mutually exclusive and collectively exhaustive approach to defining the processes that comprise human capital management:

1. Manage Deployment of Personnel
a. Forecast Workforce Requirements
b. Recruit, Select and Hire
c. Succession Planning
d. International Assignment
e. Mobile Workforce
f. Employee Turnover
2. Manage Competencies and Performance
a. Competency Management
b. Performance Appraisal
3. Develop and Train Employees
a. New Hire/Onboarding
b. Learning and Development
c. Coaching and Mentoring
d. Leadership Development
e. Knowledge Management
4. Motivate and Retain Employees
a. Compensation and Benefits
b. Employee Satisfaction
c. Employee Engagement
d. Work/Life Balance
e. Workforce Diversity

In each of the aforementioned processes, standard surveys can be developed to measure the performance of these functions regularly or periodically. The measurement process is practical, scalable and replicable. It is not resource intensive, but it can have powerful and timely implications for decision making and analysis.

Let’s go over a few examples:

A government entity was facing the retirement of a large portion of its workforce and needed to assess the competencies of the existing workforce to address the issue. There were more than 100,000 personnel in the workforce holding a variety of jobs. Lacking the money, time and staff for a full assessment, the entity took a crawl-walk-run approach. It built a generic 28-question survey covering the core competencies of the managerial employee of the future: business acumen, financial acumen, creativity, decisiveness, external awareness and the like. It devised a rating scale ranging from absent to expert and had a sample of 6,000 future managers self-assess.

The organization got a great deal of data back from this survey within 10 days. It downloaded the data into a pivot table and ran a red/yellow/green scorecard across multiple cuts of the data, such as grade level, years of service, agency and job function.
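The mechanics behind such a scorecard can be as simple as one pivot over the raw export. Here is a hypothetical sketch using Python’s pandas library; the column names, sample values and red/yellow/green thresholds are illustrative assumptions, not the agency’s actual ones:

```python
import pandas as pd

# Hypothetical raw export: one row per respondent per competency,
# self-rated 0 (absent) through 4 (expert).
responses = pd.DataFrame({
    "agency":     ["A", "A", "B", "B"],
    "competency": ["business acumen", "creativity", "business acumen", "creativity"],
    "rating":     [1, 3, 2, 4],
})

# Average self-rating per competency, sliced by agency (any other cut,
# such as grade level or years of service, works the same way).
scorecard = responses.pivot_table(
    index="competency", columns="agency", values="rating", aggfunc="mean"
)

# Assumed thresholds: green >= 3.0, yellow >= 2.0, otherwise red.
def rag(score):
    return "green" if score >= 3.0 else "yellow" if score >= 2.0 else "red"

print(scorecard.apply(lambda col: col.map(rag)))
```

The point is the low ceremony: a flat file, one pivot and a simple traffic-light rule are enough to brief leadership.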

Many areas within human capital management used this to help them plan for the future. For example, recruiting used the data to understand the biggest gaps that needed to be filled from outside the organization so they could start their workforce planning. Learning used the data to understand what future programs needed to be built to ramp up the competencies of existing employees who would be future managers. Finally, the people leading the performance management process used the data to help set goals for individual and organizational performance in the years ahead, which emphasized where the organization needed improvement.

Now, let’s turn our attention to learning in particular.

Learning Surveys and Evaluations
Nearly all learning organizations do a “smile” sheet. However, few do anything with it beyond a local perspective. The first step toward improvement is to make it a “smart” sheet.

Unlike a smile sheet, a smart sheet gathers far more than Kirkpatrick Level 1 data. It forecasts learning effectiveness, impact on the job, linkage to business results and even ROI, with the Phillips process built right into it. A smart sheet takes two to three minutes to complete and can be a very powerful forecasting tool when benchmarked within your learning organization and outside of it.
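To make the contrast concrete, one smart-sheet record might carry fields like those in the sketch below. The names and scales are my illustrative assumptions, not a standard instrument; the point is that a single short form captures forecasts across the Kirkpatrick levels plus the Phillips ROI level, not just satisfaction:

```python
from dataclasses import dataclass

@dataclass
class SmartSheetResponse:
    """One learner's end-of-course evaluation (hypothetical schema)."""
    course_id: str
    business_unit: str        # tag used later for benchmarking and scrap analysis
    satisfaction: int         # Level 1 (reaction), e.g., a 1-7 scale
    learning_gain: int        # Level 2 proxy: self-reported knowledge gain, 1-7
    will_apply_pct: int       # Level 3 forecast: % of content learner expects to apply
    business_impact: int      # Level 4 forecast: expected impact on results, 1-7
    forecast_roi_pct: float   # Level 5 (Phillips): learner-estimated ROI

r = SmartSheetResponse("LEAD-101", "finance", 6, 5, 80, 5, 120.0)
```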

A great example of the smart sheet in action is its use in identifying scrap learning: education that is not applied on the job. Data shows scrap learning is the result of the environment learners come from and go back to. But in the absence of data, identifying scrap often devolves into a war of words.

For instance, an insurance company recently collected data on a smart sheet and gathered a little more on a follow-up smart sheet. The forms asked whether the learner would apply the learning and to what degree, and then asked the same question 60 days later. Each response was tagged to a business unit within the organization.

One day, a learning manager was reviewing this data and noticed the finance group had the lowest impact (or, conversely, the highest “scrap”) relative to other groups, such as claims, underwriting, HR and IT.
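Spotting that pattern requires nothing exotic. Below is a hypothetical sketch of the comparison, assuming each response is tagged to a business unit and that scrap is operationalized as the share of forecast application that never materialized by the 60-day follow-up:

```python
import pandas as pd

# Hypothetical paired responses: forecast application at course end
# ("will_apply") vs. reported application 60 days later ("did_apply").
followups = pd.DataFrame({
    "business_unit": ["finance", "finance", "claims", "underwriting", "HR", "IT"],
    "will_apply":    [80, 70, 75, 85, 70, 80],   # % forecast on the smart sheet
    "did_apply":     [20, 30, 65, 75, 60, 70],   # % reported on the follow-up
})

by_unit = followups.groupby("business_unit")[["will_apply", "did_apply"]].mean()
by_unit["scrap_pct"] = 100 * (1 - by_unit["did_apply"] / by_unit["will_apply"])

# Finance tops the list in this made-up data, mirroring the example above.
print(by_unit.sort_values("scrap_pct", ascending=False))
```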

The learning manager went to the CFO with the data. The CFO was embarrassed because finance was supposed to be the poster child of stewardship with company funds, yet it was the biggest source of waste. He was also a data-driven CFO: He viewed the data as being from an objective and reasonable source, and saw the learning manager’s argument not as a complaint but as a legitimate concern.

The CFO called a meeting with the controller. The meeting revealed that finance managers were conducting their own training and thus discouraging their teams from using training from the corporate university. This was because finance managers didn’t feel they had enough involvement in the corporate university’s content.

The meeting fostered a dialogue. The end result was that the CFO and the CLO agreed to work more collaboratively, and also that the corporate university would design and deliver training exclusively. There would no longer be duplication by finance.

In the absence of this data, the scrap would have continued to grow. The CFO would have found his own personnel more credible and trustworthy in a war of words. In this example, being armed with timely data gathered through a day-to-day process with internal benchmarks made a world of difference. The key is to have the process in place, create the standard smart sheets and analyze the data regularly to spot the scrap.

As with the human capital examples, the learning example is simple, easy and cost effective. It is neither statistically valid nor precise, but it is roughly reasonable. If roughly reasonable is in the budget, doesn’t it make sense to do it? It could save a multimillion dollar program down the road. Further, it evokes a sense of stewardship in the measurement process that any C-level executive would champion in a heartbeat, especially if the cost to do it was less than a rounding error in the overall learning budget, which it is.

Challenges
By now, you may be thinking surveys are great. But they do come with pitfalls, chief among them survey fatigue, low response rates and customization.

Survey fatigue exists because employees are bombarded with surveys. How do you get around it? The key is coordination: Know when others are doing surveys and be sensitive to that. Make surveys optional but highly encouraged, with positive reinforcement. Finally, don’t survey the whole population; use a sample instead.

As for response rates, most learning professionals grimace when they go from paper to electronic evaluations because they fear 100 percent response rates will drop to 30 percent (or worse). In actuality, the rate typically falls to about two-thirds. It can be higher if you emphasize the importance of the evaluation in the class by handing out a notice with course objectives that explains when, where, how and why the evaluation is necessary. Reminder messages also are a nice way to keep rates high.

Additionally, giving away an iPod once a month is just enough incentive to get people to fill out the evaluations. In the end, electronic data collection not only saves tons of administrative time and cost, but also yields better comments and more objective data.

Finally, customization is a challenge. I’ve been in rooms where learning subgroups argued over whether to use a five-point or seven-point scale, or whether to call the teacher of a course a facilitator or an instructor on the form. The key is to reach consensus on a standard set of questions. Smart survey tools can account for tweaks in verbiage or scale and still maintain data aggregation and consistency. Further, these tools can support conditional or course-specific questions to provide some flexibility.

Try this: Ask the entire team to use a standard smart sheet for three months, then invite them to come back with the changes they originally wanted. In my experience, 75 percent of the time or more, nobody comes back with any. So, be flexible but firm on revisions.

Conclusion
Surveys make great leadership tools if done right and on a regular basis with the right tools to collect and analyze the data. They are excellent sources of business intelligence for the resource-strapped organization and help make data-driven decisions. They’re not perfect, but neither is anything in business.