Is technology moving too fast? The question has been debated since the early 2000s, and never has the debate been more heated. Automation and AI are evolving at a breakneck pace, kids are growing up attached to their phones, self-driving cars are on the horizon, and new technology companies sprout up every day. On the one hand, tech does make our lives easier in many ways. On the other, its unchecked growth could become a problem. The answer lies in learning how to leverage technology properly – and ethically.
The past three weeks have been roller-coaster news weeks for the tech world. First, we had the bitcoin craze that exploded across the globe before quickly fizzling out. Days later, Apple dropped a bombshell by admitting what many of us had long suspected: it deliberately slows down older iPhones. Though the company explains that this is to prevent aging lithium batteries from shutting down completely, many Apple customers theorize that it’s actually planned obsolescence – a way to force iPhone users to shell out big bucks for a newer model.
Regardless of the real reason behind the slowdown, Apple is now facing a whirlwind of lawsuits, with claims ranging from breach of contract to consumer protection violations. In an effort to remedy the situation, Apple is offering discounted battery replacements and new features that allow users to monitor whether their phones’ batteries are affecting performance. Arguably, that’s what Apple should have done in the first place.
The headlines don’t stop there, either. This past week, a New York Times report found that more than 250 games across the Google Play and Apple App stores are using monitoring software to – get this – listen to users while they watch TV and movies. The software, called Alphonso, collects TV viewing data for advertisers, allowing them to target ads and even find out which commercials were successful. In other words, it listens to what you’re watching and can tell if a commercial made you run to a store and buy the product. Scary stuff, right?
Though this particular use of the software may seem innocuous enough, it’s the software’s potential and how the data was collected that has users up in arms. In the NYT report, Alphonso revealed that its software is embedded in more than 1,000 games and apps. Even more troubling, a vast number of users had no idea the software was there, even after allowing the game or app to access their microphone. Theoretically, the software can even listen while running in the background, or from your pocket.
It’s this type of news that has consumers worried about the safety of their information and the amount of control they have over technology. We like to think technology makes our lives better, our content experiences richer, and our day-to-day tasks easier. Technology should be there to complement our lives, not run them. And it certainly shouldn’t be an invasion of privacy. Yet, with AI and automation moving so quickly, it seems that our understanding of how to leverage technology ethically and properly hasn’t quite caught up.
Even an industry as “traditional” as recruiting is feeling the effects of too much technology. Automation software, meant to streamline the hiring process and centralize information, has become something of a hindrance. One survey found that 82% of job-seekers feel recruiting is overly automated, and nearly all of them (95%) think that technology should only be used in a supplementary manner. AI and Big Data are necessary in recruiting, especially when it comes to things like letting a candidate know where they are in the pipeline – something that recruiters often struggle with. However, it’s evident that technology has become a crutch that has depersonalized the job search process and left a stain on candidate experience.
Ethical data collection and usage is another major concern in the staffing industry, and has been since the Big Data boom in 2015. From corporations exploiting medical wellness program data to recruiters using social media to research a candidate’s personal information, Big Data is riddled with ethical pitfalls. As technology continues to grow and we’re able to glean more insights from AI, it’s imperative that we use technology correctly. Not just for ethics’ sake, but for the sake of candidate experience, too.
Before we succumb to explosive growth – or implode because we’re not prepared for it – we should remain mindful of the lessons other companies have taught us over the past three years. It’s not just about data security, either. It’s also about data integrity and ethical use. Our own industry isn’t immune to these threats. We’ve seen cases go to court where staffing providers have been accused of falsifying candidate records, attempting to manipulate the E-Verify system, misrepresenting claims, and more.
The world of talent acquisition has become fertile ground for tech innovators. In fact, 2018 is poised to become a hallmark year for software developers who are focusing on launching advanced automation to refine outdated hiring processes. Yet with the advent of new platforms, data security, privacy, and ethical usage will only grow in importance.
When data violations occur, the problems are almost always human in nature. They can be unwitting mistakes such as substandard, poorly implemented or outdated security protocols. In rarer cases, they can be intentional. Regardless of the cause or the figurehead who must shoulder the blame, there exists an entire network of people who contributed (knowingly or not) to the issue: executives, software developers, engineers, data analysts, product managers, data storage and processing specialists, and other people responsible for the code and algorithms that facilitated the violation. The good news is that because it’s a human problem, there’s a human solution.
In many ways, problems that lead to data breaches are more human than technical in nature. For example, credit card details don’t appear in an enterprise data warehouse without an invitation. Neither do sensitive candidate details. After the information is stored, software is written to link a user’s profile to the personal data collected. The technical staff is involved both in designing the algorithms and implementing the code. Developers are engaged to perform the work based on the business need. The point is that every aspect of data accumulation, storage, and use is an intentional process undertaken deliberately by humans. It stands to reason, then, that misuse and abuse fall within that realm as well.
Kaiser Fung, who leads the Applied Analytics program at Columbia University, observed that other business needs often take precedence over data ethics in the decision-making process: “Managers debate topics such as product innovation, user experience, resource requirements, competitive strategies, and return on investment.” Educating tech teams on the ethical standards of processing that data, however, seldom takes place.
Our clients and our talent have placed their trust in our ability to keep their information secure, accurate, and free from misuse. Here are some simple ways to ensure that technology leaders in the contingent workforce industry excel.
No two companies will necessarily have the same processes, yet establishing and enforcing ethics standards is critical. They must be transparent, agreed upon, communicated, and monitored.
To deliver the superior service users expect, it falls on us to make sure that our ethical data standards reflect the values and promises we champion in our products – and that those standards apply to every individual who relies on our platforms: hiring managers, contingent workforce program leaders, staffing providers, recruiters, executives, and workers.