Big Data became a big topic in 2015. As staffing professionals, we're relying on people analytics more than ever. Through advances in data science, we can generate more accurate metrics, improve forecasting, better predict needs and potential pitfalls, effectively determine optimal candidate qualities, and vet prospective talent intelligently. A data-based hiring process is an undeniably important trend for the industry in 2016.
And yet Big Data can present its share of big problems and big temptations. By the end of 2014, researchers at the Ponemon Institute estimated that 110 million Americans -- about half the adult population of the country -- had fallen prey to cyber criminals who exploited supposedly secure systems to expose their victims' financial, transactional and personal details. Now imagine if the hackers had been authorized to look at those records.
We in the employment industry process enormous amounts of private data about our clients and workers through VMS, ATS and other HR software. Data security and privacy protections are matters we must take seriously, both as corporate citizens and as ethical business leaders. That's why everyone involved in the staffing industry -- in the very business of people -- should care about the confrontation between Apple and the FBI.
Apple vs. the FBI in Data Privacy Limits
As you may have read in the news, Apple is embroiled in a contentious battle with federal investigators and the United States District Court for the Central District of California over the privacy rights of iPhone users. After the tragic mass shooting that rocked San Bernardino, California, this past December, FBI investigators seized the iPhone used by the suspect. The phone had been issued to the shooter by his employer, and that employer consented to having the device's contents searched; the bureau also held a warrant. Legally, the FBI can and should analyze the data. The problem, however, is more nuanced.
Gizmodo's Kate Knibbs offers a useful analogy, likening the phone to a house the FBI has a warrant and permission to search. She writes, "But if the FBI comes across a safe in that house, the warrant and permission do not mean it can force the company that manufactures the safe to create a special tool for opening its safes, especially a tool that would make other safes completely useless as secure storage. That's the situation that Apple's dealing with here."
The FBI was unable to crack the user's passcode, and a certain number of unsuccessful attempts triggers a failsafe in iOS that erases all of the data. So the bureau asked a district court to issue an order that would require Apple to build malware into its own system, relying on a novel interpretation of an 18th-century law -- the All Writs Act of 1789 -- that is becoming popular with government agencies seeking access to user data.
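For readers who want a feel for how such a failsafe works, here is a minimal sketch in Python. It illustrates the general wipe-after-N-failures pattern only; the attempt limit of 10 and the wipe routine are assumptions for illustration, not Apple's actual implementation.

```python
# Minimal sketch of a wipe-after-N-failures passcode failsafe.
# Illustrative only -- not Apple's implementation; the limit and
# the wipe routine below are assumptions.
MAX_ATTEMPTS = 10

failed_attempts = 0

def wipe_device_key() -> None:
    """Stand-in for securely erasing the device's encryption key."""
    # Once the key is gone, the encrypted data is permanently unreadable.
    print("Device key erased; stored data is now unrecoverable.")

def try_unlock(entered_code: str, real_code: str) -> bool:
    """Check a passcode and trigger the failsafe after too many failures."""
    global failed_attempts
    if entered_code == real_code:
        failed_attempts = 0  # a correct entry resets the counter
        return True
    failed_attempts += 1
    if failed_attempts >= MAX_ATTEMPTS:
        wipe_device_key()
    return False

# Example: nine wrong guesses are tolerated; the tenth triggers the wipe.
for guess in ["0000"] * 10:
    try_unlock(guess, "1234")
```

This is why brute-forcing the passcode was not an option for the bureau: enough wrong guesses, and the evidence destroys itself.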
Apple CEO Tim Cook, in his letter to customers, highlighted the company’s cooperation with the government’s investigation. Apple complied with subpoenas and search warrants in the case, and it made engineers available to advise the FBI. Despite that, the government is demanding much more.
Cook explained the request: “Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software -- which does not exist today -- would have the potential to unlock any iPhone in someone’s physical possession. The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.”
Data Privacy in Peril
The argument Apple is making is an important one. The company isn't pushing back to hinder a government investigation or to protect its own trade secrets. Tim Cook is a wise and compassionate man, and he has made countless resources available to aid the FBI. Where he draws the line is at turning over unlimited access to his customers' data. This is an issue that affects our industry as well.
“The implications of the government’s demands are chilling,” Cook writes. “If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data.”
Consider the recent loophole some employers have exploited in their corporate wellness programs. HR professionals know it’s illegal to ask employees to share their personal health information -- the one exception being wellness programs. As Rachel Gillett reports for Business Insider, “Some companies are now taking advantage of this loophole in a major way, and it may clue them in to more of your personal life than you thought possible.”
As it turns out, the Americans with Disabilities Act allows employers to perform voluntary medical examinations and request details of medical histories as part of employee health programs. The Kaiser Family Foundation, Gillett notes, discovered that half of the country’s large employers have asked workers to submit to medical tests and fill out health risk assessments as a prerequisite to participating in wellness programs. And it doesn't end there.
“Employee wellness firms like Castlight Healthcare and insurers are mining various employee data like past pharmaceutical and medical insurance claims, shopping and voting habits, credit scores, and search history within the health apps.”
There are enormous risks and legitimate concerns about these tactics -- and no discernible privacy protections. In a wellness program run by the City of Houston, employees found that the contracts authorized companies to transmit their data to third-party vendors acting on their behalf. The fine print also stated that the data could be made public. So what happens to workers who opt out? They're forced to pay an additional $300 out of pocket for coverage. Though presented as voluntary, the program appears coercive.
We Have an Obligation to Protect the Privacy of Our Clients and Talent
As Big Data gets bigger -- and as reliance on it grows -- staffing and HR professionals can set a positive example in the midst of what's becoming a cautionary tale. Our industry champions diverse talent: exceptional professionals from all walks of life, ethnic backgrounds, religious beliefs, sexual orientations, physical abilities and more. We must ensure that the data we access or distribute is never used as a tool for inclusion or exclusion. The need isn't just pressing; it's imperative. DCR Workforce, in a recent article on maintaining compliance during data mining, offers some top-notch advice.
The Importance of Data Ethics
Our clients and our talent have placed their trust in our ability to keep their information secure, accurate and free from misuse. Here are some simple ways to ensure that your teams excel.
No two companies will have exactly the same standards, but establishing and enforcing ethics within those standards is critical. They must be transparent, agreed upon, communicated and monitored. To deliver the superior service users expect, it falls on us to make sure that our ethical data standards reflect the values and promises we champion in our products -- and that those standards apply to every individual who relies on our platforms: hiring managers, MSPs, staffing providers, recruiters, executives and workers.
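To make that commitment concrete, here is a minimal sketch, in Python with the open-source cryptography library, of one basic safeguard: encrypting a sensitive candidate field before it ever reaches a data store. The record, field names and key handling are simplified assumptions for illustration, not a prescription for any particular VMS or ATS.

```python
# Minimal sketch: field-level encryption of candidate PII before storage.
# Assumes the open-source `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet

# In a real system the key would live in a dedicated secrets manager,
# never alongside the data or in source code.
key = Fernet.generate_key()
cipher = Fernet(key)

# Hypothetical candidate record -- the field names are illustrative.
candidate = {"name": "Jane Doe", "ssn": "000-00-0000"}

# Only the ciphertext is written to the ATS/VMS database.
encrypted_ssn = cipher.encrypt(candidate["ssn"].encode())

# Authorized services decrypt on demand instead of storing plaintext.
assert cipher.decrypt(encrypted_ssn).decode() == candidate["ssn"]
```

Simple habits like this -- plaintext never at rest, keys held separately, access granted only on demand -- are how an ethical data standard shows up in day-to-day practice.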