
The Secret to Analytics Isn’t Bigger Data, It’s Finding the Right Data

In 1954, Darrell Huff published How to Lie with Statistics, which went on to become one of the most famous business primers in history. His intent was to explain, in simple terms, the abstruse concepts of statistical methods, their increasing presence in commerce and society, and how they’re interpreted. Although modern MBA programs probably don’t include Huff’s masterpiece in their required reading lists, they should. It was ahead of its time when released and remains tremendously relevant in our current culture -- one obsessed with Big Data. Every industry and market is consumed by an urgent need to amass more and more data. This is particularly true in our industry. The persistent issue, however, has little to do with accessing or collecting information -- it’s about how we interpret it. Consider this real-life story.

Statistics Storytime

This brief tale comes from one of our team members. Last June, his son’s close friend was preparing to graduate from eighth grade. However, he was informed that he’d been excluded from attending the promotion ceremony because of a failing mark. The child, let’s call him Sam, is exceptionally bright. The F he received surprised everyone.

The school Sam attended uses a computerized grading system. Its benefit, educators say, is that it gives parents and students real-time access to progress. However, teachers are not contractually obligated to update it, and in this case the F didn’t appear until it was too late to act. The system does something else -- it makes recommendations to teachers based on the scores. In this instance, Sam’s parents were told they should consider enrolling their son in a special needs program. That’s when our colleague decided to analyze the data.

To help Sam’s parents understand what happened, our colleague extracted all of Sam’s grades from the system. He then ran three scenarios. In the first, he simply removed all the zeros, which the system records for incomplete or missing work; nearly all of Sam’s zeros came from homework. In the second, he calculated only test and project scores. And in the third, he analyzed test and project grades but factored in the small number of related zeros, since Sam did, on rare occasions, fail to turn in a major project. Shame on Sam. Here’s what our colleague found.

  • Scenario 1: Removing all zeros from the system, Sam would have achieved a GPA of 86.69 percent.
  • Scenario 2: Counting only test scores and projects, Sam’s GPA would have been 89.65 percent, a very high B or low A, depending on the school’s measurements.
  • Scenario 3: Scoring tests and projects that included the zeros, Sam still would have earned a passing grade at 68.50 percent, a fairly high C.
  • Daily homework assignments, despite much lower point values than tests or projects, weighed shockingly heavily in the overall grade.
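
The three scenarios amount to recomputing one weighted average over different slices of the same records. The sketch below shows the method; the grade data is invented for illustration, so the percentages it prints will not match Sam's actual figures above.

```python
# Recompute a grade average under the article's three scenarios.
# The records below are hypothetical placeholders; only the method
# mirrors the analysis described, not Sam's real numbers.

grades = [
    # (category, points_earned, points_possible)
    ("homework", 0, 10), ("homework", 0, 10), ("homework", 8, 10),
    ("test", 92, 100), ("test", 88, 100),
    ("project", 90, 100), ("project", 0, 100),  # one missing project
]

def average(records):
    """Percentage average weighted by each item's point value."""
    earned = sum(score for _, score, _ in records)
    possible = sum(maximum for _, _, maximum in records)
    return 100.0 * earned / possible

# Scenario 1: drop every zero (incomplete or missing work).
scenario1 = average([r for r in grades if r[1] != 0])

# Scenario 2: tests and projects only, zeros excluded.
scenario2 = average([r for r in grades if r[0] != "homework" and r[1] != 0])

# Scenario 3: tests and projects only, zeros included.
scenario3 = average([r for r in grades if r[0] != "homework"])

print(f"Scenario 1: {scenario1:.2f}%")
print(f"Scenario 2: {scenario2:.2f}%")
print(f"Scenario 3: {scenario3:.2f}%")
```

Even with made-up numbers, the pattern holds: excluding homework zeros lifts the average dramatically, because zeros drag a weighted mean far harder than low scores do.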

The school’s analysis of the grades merely involved an assessment of the cumulative GPA, followed by the conclusion that Sam may have a learning disability or intellectual shortcoming. The reality? Sam is lazy and has poor study habits.

The boy’s parents confessed that Sam had lied about completing his homework because he wanted to hang out with his friends and play. They said Sam also considered the daily assignments unnecessary “busy work” since he was receiving As or Bs on all his exams and projects. Sam’s parents presented these findings to school principals. The happy ending is that he didn’t end up in a special needs program. In fact, he’s now in an advanced academic program in high school.

Data Too Big, Interpretations Too Broad

Huff’s book brims with similar examples. In some cases, he discovered, data were intentionally misconstrued or manipulated: “The secret language of statistics, so appealing in a fact-minded culture, is employed to sensationalize, inflate, confuse, and oversimplify.” The rest of the time, the conclusions contained bias or overly broad interpretations. Getting to the truth requires identifying the right data.

In one of the book’s first examples, Huff cites an infamously flawed statistic from Time Magazine about the Yale graduating class of 1924. “THE AVERAGE Yaleman, Class of ‘24,” the article begins, “makes $25,111 a year.” It’s a weirdly specific number. It’s also inaccurate. Adjusted for inflation, that salary today would be $343,316.72, an astonishing amount of money. Huff illustrates the successive problems with this representation.

  • There’s a really slim probability that the “average” income of any group can be reckoned down to the dollar.
  • The incomes reported were not entirely based on salary -- “people in that bracket are likely to have well-scattered investments.”
  • Most importantly, Huff notes, “This lovely average is undoubtedly calculated from the amounts the Yale men said they earned.”
  • Not every graduate was willing to report his income, decreasing the size and composition of the sampling group.
  • Averages are broad. Bill Gates, for example, earned $11.5 billion this year. The typical software developer in his state took home about $75,000. However, when combined, one could trot out a statistic that claims their average salary is somewhere near $5.7 billion. And we know that isn’t true.
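
Huff's last point is easy to demonstrate in a few lines. This sketch extends the article's two-person example to a small hypothetical group: one outlier year pooled with nine typical salaries. The mean is dominated by the outlier, while the median still reflects the typical earner.

```python
from statistics import mean, median

# One outlier income pooled with nine typical $75K salaries
# (illustrative numbers, reusing the article's figures).
incomes = [11_500_000_000] + [75_000] * 9

print(f"mean:   ${mean(incomes):,.0f}")    # dominated by the outlier
print(f"median: ${median(incomes):,.0f}")  # reflects the typical earner
```

This is why a careful analysis reports the median, or the distribution itself, rather than a single "broad" average.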

Many companies consider themselves data driven, and they rely heavily on information gathered from a variety of sources -- their clients, workers, suppliers and more. Yet, too often we find that their interpretations of the data are biased, oversimplified, overly broad or inductively reasoned to prove a hypothesis rather than deductively analyzed to uncover a reality.

Developing a Meaningful Analytics Program

People analytics and Big Data remain relatively new processes in staffing. Obstacles persist in interpreting and using the data. As hiring managers and HR leaders struggle to grasp the complexities of people analytics, they frequently fall back on “gut instincts.” Yet the problem is larger than that. As Michael Skapinker observed in the Financial Times, “It is not just our biases that get in the way but that past performance cannot predict results.”

Before embarking down the path to utilizing big data, we need to prepare for a mindset shift. This is the first step contingent workforce professionals should consider when approaching clients. People analytics are not reactive -- used properly, they provide illumination rather than validation. That means we should approach data with curiosity and impartiality, not as a vehicle to prove something we already believe, or that others believe. In the end, the results of a careful interpretation might not be what we had hoped, but they will point us in the best direction.

Identify the Right Data

Know the objectives and what could be different or changed because of the results. Ask your team a few simple questions to uncover the right data.

  • What are we trying to achieve?
  • What information do we ideally need to make a decisive choice or course correct our current direction?
  • What is the real business problem we’re trying to tackle?

By identifying the answers to these questions, we can work backward to uncover the data our clients need.

Build Thoughtful Samples

  • Drive thinking that extends beyond a single department or division. Consider how the data affect the organization and its talent as a whole.
  • Defend against confirmation biases that can arise from like perspectives or people who think the way we do. Approach the analysis as one of the researchers on “MythBusters.” Attempt to disprove accepted norms. Be receptive to risks, failures and unexpected outcomes -- all of these situations are critical learning experiences that will improve the process.
  • Use good data: reliable, valid, clean and complete. The data should be objective, not based on a specific business group, category of talent, company division, or hiring manager.
  • Design comparisons across groups and over time.
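
The last two points above can be sketched concretely. Assuming a flat list of hypothetical hiring records, the idea is to drop incomplete rows first, then compare a metric across groups and over time instead of reporting one blended number.

```python
from collections import defaultdict

# Hypothetical hiring records: (quarter, business_group, days_to_fill).
# A None value marks an incomplete record.
records = [
    ("Q1", "engineering", 34), ("Q1", "finance", 21),
    ("Q2", "engineering", 41), ("Q2", "finance", None),
    ("Q2", "finance", 25),     ("Q1", "engineering", 30),
]

# Step 1: keep only clean, complete data.
clean = [r for r in records if r[2] is not None]

# Step 2: compare across groups and over time,
# rather than collapsing everything into one average.
buckets = defaultdict(list)
for quarter, group, days in clean:
    buckets[(quarter, group)].append(days)

for (quarter, group), values in sorted(buckets.items()):
    avg = sum(values) / len(values)
    print(f"{quarter} {group}: avg {avg:.1f} days to fill")
```

A per-group, per-period breakdown like this surfaces trends a single blended average would hide, which is the point of designing comparisons up front.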

Enlist Partners Early

Even the most thoughtful and expertly performed analysis can fail if stakeholders are not informed and included in the process. We should strive to bring others along with us on this journey of discovery, and solicit their input. The decision makers will be more likely to participate, review the research, understand its value and implement the recommended changes. Otherwise, the entire effort can be jeopardized. Without prior knowledge and inclusion, other stakeholders in the process may feel as though they’re being told how to do their jobs, especially if they think things are going well at the moment they’re handed a substantive report outlining all the things they need to change.

Despite the best intentions, recipients in this scenario will feel blindsided. And when that happens, crucial plans languish on a shelf, unimplemented and collecting dust, which amounts to wasted opportunities, squandered time and sunk costs.

Assemble the Right Team

Designing the right team is imperative and should take place before any data collection or analysis occurs. Although MSPs and contingent workforce program managers have mountains of useful data in their systems, the effort must be more expansive and collaborative to succeed. The best teams include a broad swath of representatives. In an outsourced workforce program, that would incorporate professionals from the client organization, the MSP, the VMS and staffing partner firms. These subject matter experts will be required to address the Whys, the Whats and the Hows of the project.

  • Why: hiring managers, operational leaders and executives to provide the business expertise.
  • What: staffing partners, procurement leaders and HR officers to provide expertise on the talent.
  • How: Data analytics specialists from the MSP, staffing firm, client organization or technology provider (e.g., VMS) who understand the information, how to gather it and how to interpret it into meaningful results that decision makers can act upon.

Open Minds and Open Eyes

The benefits delivered by people analytics are unparalleled. And while the process might seem alien and overwhelming now, in the not-too-distant future Big Data can open our eyes to a world of exceptional talent and new generations of innovators we didn’t see before. We just need to make sure we’re looking in the right places and turning over the proper stones. As we’ve covered previously, sometimes the right data is big and sometimes it’s small.

Sunil Bagai
Sunil is a Silicon Valley entrepreneur, thought leader and influencer who is transforming the way companies think about and acquire talent. Blending vision, technology and business skills honed in the most innovative corporate environments, he has launched a new model for recruitment called Crowdstaffing, which is being tapped successfully by top global brands. Sunil is passionate about building a company that provides value to the complete staffing ecosystem, including clients, candidates and recruiters.
