This is a writing sample from Scripted writer Jasmine Henry
IT pros are about to come up against a new form of user resistance: technology fears. It's likely your users are a little freaked out about the data your team needs to function, and it could make your job more difficult.
A recent poll by Pew Research, discussed by Live Science, found that Americans are afraid of several emerging medical technologies, including gene editing, brain chips, and synthetic blood. Of those surveyed, 68 percent are either "somewhat" or "very" worried about gene editing on human embryos, while 69 percent fear brain chip implants intended to improve cognitive abilities. Another 63 percent are afraid of synthetic blood. The study also found that people are especially wary of medical tech designed to enhance humans beyond natural limits.
While it's unlikely your IT department is asking employees for blood samples or exploring brain chip implants as a productivity tool, there's some valuable insight behind this research. So, why are people afraid of emerging tech? And how should your IT team navigate these fears?
Why is technology so scary?
Humans aren't afraid of technology that could improve their lives. In fact, 48 percent of the people polled by Pew said they wouldn't resist gene-editing technology on embryos if it could reduce their children's risk of genetic disease. What they're likely afraid of is tech with the potential to make them lose control.
The abundance of high-profile tech data breaches could be partially responsible for this fear of privacy loss. Even younger generations, who tend to be less anxious about privacy, remain wary of how their data will be used in the future.
Forbes' Bernard Marr recently pointed out that even our ever-present smartphones aren't strictly "required to survive in society; most of us choose to lease them and pay with our privacy because we like the convenience more than we like living 'off the grid' or becoming a technophobe." However, Marr fears a future where our choices are far more limited, stating, "What happens … when we want to watch television, or talk on the phone, or surf the internet, or even borrow a book from the library, our actions are recorded as a matter of course?"
Staring down a digitally controlled future
The Catholic University of America Columbus School of Law explains that this idea is known as digital feudalism, a concept in which humans experience a loss of power and rights in a networked ecosystem. Futurist, science fiction author, and cyberpunk movement co-founder Bruce Sterling is just one of many people who fear the Internet of Things (IoT) will lead us into a world of "all-purpose electronic automation through digital surveillance by wireless broadband," as reported by Live Mint.
Sterling isn't the only high-profile tech-head who's openly expressed fears about exactly where artificial intelligence (AI) and other technologies are taking the human race. Bill Gates posted in a Reddit Ask Me Anything (AMA), discussed by The Washington Post, that he believes the greatest existential threat to the human race is AI, calling for "regulatory oversight" in case we accidentally "summon a demon" that can't be controlled. Yikes.
While your end users may be the most fearful in your workplace about privacy concerns, even tech geniuses fear digital feudalism. As your IT department is called upon to collect, store, and analyze increasing amounts of data, how do you deal with employee technology fears?
Calming data privacy fears at work
Managing your employees' technology fears isn't simple. In fact, IT pros are increasingly required to collaborate with compliance and legal teams to form workplace policies. But as your organization considers biometrics, IoT, and other sources of streaming data, in ways that comply with laws and your counsel's advice, how do you calm employees?
Step one? Commit to proper protection. Some of the scariest data breaches have involved the leak of employee data. With increasing amounts of personally identifiable information (PII) and electronic protected health information (ePHI) required to comply with the Affordable Care Act (ACA) and other regulatory measures, it's important to communicate to your employees that you're committed to keeping their Social Security numbers and other key identifiers safe.
Communicating that personal data protection is a top priority could be a powerful component of company-wide security awareness efforts. By demonstrating that you're working hard at all levels to protect assets, you can open the door to conversations about why tech innovation doesn't always hurt.
Beating fear to foster innovation
Case studies abound demonstrating how fear can trigger a "race to the bottom." Jeff Bezos of Amazon is a firm believer that organizations fail when their culture doesn't foster experimentation. The Harvard Business Review notes that given the right environment, employees will naturally innovate in every area within their influence.
Convincing your employees that IT's technology experimentation, including data collection, isn't a surefire recipe for a breach requires smart data collection practices and ongoing conversations about your efforts to protect data. However, convincing your employees that biometrics and artificial intelligence won't remove their ability to make decisions is equally critical.
Your employees fear what they don't understand. By demonstrating the remarkable potential of AI, biometrics, and IoT in the workplace, you can ignite excitement, while providing clear boundaries about how you will—and won't—use their data.
Is it reasonable to fear the potential of robotics and gene editing? Quite possibly. But by expanding the focus of your security awareness training and demonstrating the possibilities of exciting new tech, you can help your colleagues stop fearing the worst.
Jasmine Henry is a Seattle-based freelance writer specializing in technology, analytics, software, and related fields. She holds an MS degree in Informatics & Analytics and a Graduate Certificate in Health Care Informatics from Lipscomb University in Nashville, TN. Her work has appeared on Forbes, HP Nucleus, IBM Big Data Hub, Time, ADP Spark, Reuters, and more.