Leaders Can’t Ignore Ethics of Data-Driven Applications

Leaders hoping to catch the next tech wave will need to wrestle with privacy, support and security issues in new data-driven applications.

Every new generation of technology brings unanticipated legal and ethical challenges. The latest issues arise from three complementary technology trends for continuously capturing, presenting and analyzing data:

Big data. Huge amounts of data are processed across thousands of servers with open-source tools. Big-data projects can mine data on servers that an organization can access, but does not necessarily own.

Internet of Things (IoT). IoT devices collect data from animals, plants and inanimate objects and deliver it to the Internet for processing by computers and mobile devices. IoT tech is being used for tasks such as tracking animal behavior, monitoring soils, recording driving behavior and controlling indoor climates.

Wearables. Glasses, watches, fitness bands, jewelry, fabric and other items worn on the body that provide data to Internet-connected devices. Many of the current uses relate to health.

Our new data-centric world generates multiple legal and ethical complications that leaders must pay attention to:

> Data is shared by multiple owners.

> Privacy issues are increasingly complex.

> Some people see advantages in giving up privacy.

> Companies need to secure their data.

> Customer support requires accountability.


Shared data may have many owners

Economics will sometimes force organizations to use applications that share data, explained Bill Weinberg, director of open-source strategy at Black Duck Software.

Sometimes, that simply means using publicly available information or using software as a service. Other times, however, it means joint ownership of data, which creates strange bedfellows.

“Imagine you’re doing soil quality or moisture sampling or something like that,” he said. “The farmer’s going to use that to determine fertilizer and watering, and know what his yield is and be able to predict in combination with weather data what the yield will be like and be able to impact his finances, but it’s actually quite expensive for a farmer, so it’s very likely that that particular activity will be subsidized by the bank or by a consortium of farms or by the state.

“You have to make some decisions about the ownership of that data, the exclusivity of that data. On the flip side, the farmer might not want his bank to know that much about his productivity because they might be about to foreclose on him or he might not want the state to know because there might be tax implications.”

Privacy concerns become more complex

Not only is the data more sensitive, but as part of a more connected system, with better data mining tools, it’s available to more eyes.

People like to keep their information in silos. They’re OK with giving limited information to specific organizations, but as data becomes shared, they have less control. For example, a driver might give her insurance company permission to connect a monitor to her car engine in exchange for a discount on her premium, but might be less willing to give that information to police.

“A lot of people say, ‘That’s great. I love a discount,’ but ultimately, it impinges on my privacy,” Weinberg said. “It means I can’t drive the way I want to, or if I have to speed up, it means I have to explain everything I do.”

The new data-driven world will take privacy concerns up a notch.

“When we’re just giving email addresses or identity information, that’s way benign compared to the day when we’re giving genetic information or health information,” Weinberg said.

Similarly, health information might be fine when shared with a doctor, but in the hands of prospective employers, it could be used to discriminate against candidates, said Norm Matloff, professor of computer science at the University of California-Davis.

Anonymity has its limits

What’s more, it’s possible to draw inferences and figure out who people are, even when information is anonymized, Matloff said. “If I have enough data, I can deduce things about you that you probably want to keep private,” he said. Even such things as Amazon browsing history could lead to conclusions about a person’s health, he said.
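The kind of inference Matloff describes is often a simple linkage attack: an "anonymized" dataset still carries quasi-identifiers (ZIP code, birth year and the like) that can be joined against a public directory. The sketch below, using entirely made-up data and hypothetical field names, illustrates the idea:

```python
# Hypothetical illustration of a linkage attack. All names, ZIP codes and
# conditions below are invented for the example.

# "Anonymized" health records: names removed, quasi-identifiers kept.
health_records = [
    {"zip": "95616", "birth_year": 1984, "condition": "diabetes"},
    {"zip": "95618", "birth_year": 1990, "condition": "asthma"},
]

# A public directory (e.g. a voter roll) with the same quasi-identifiers.
public_directory = [
    {"name": "Alice Smith", "zip": "95616", "birth_year": 1984},
    {"name": "Bob Jones",   "zip": "95618", "birth_year": 1990},
]

def reidentify(records, directory):
    """Attach a name to every record whose quasi-identifiers match
    exactly one person in the public directory."""
    matches = []
    for rec in records:
        candidates = [p for p in directory
                      if p["zip"] == rec["zip"]
                      and p["birth_year"] == rec["birth_year"]]
        if len(candidates) == 1:  # a unique match defeats the anonymization
            matches.append((candidates[0]["name"], rec["condition"]))
    return matches

print(reidentify(health_records, public_directory))
```

The more columns the two datasets share, the more records match uniquely, which is why stripping names alone rarely provides real anonymity.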

Leaders will need to take steps to let customers know what information is being shared, and with whom, Weinberg pointed out. However, he added that while companies regularly post and update privacy policies, there are no regulations requiring them to keep those promises. So data originally submitted for one purpose can later be used for another.

“We’ve seen social media dataset owners from one day to the next change their privacy policies and decide to market to the membership using the data they have, or to sell the dataset even if on day one they swore black and blue they wouldn’t,” said Weinberg.

Matloff said it was important for leaders rather than engineers to make the call about pivoting business directions in ways that involve customers’ data. Engineers, he said, tend to focus on the feasibility of actions rather than their social consequences.

Not everybody is concerned about privacy

Nevertheless, giving up privacy is not without its advantages. While older generations view the lack of privacy as invasive, millennials don’t see it that way, and willingly make information about themselves available, said James Hughes, executive director of the Institute for Ethics and Emerging Technologies.

“I’m not sure we need or want that much privacy,” Hughes said. “Young people are giving information out without much thought. Anonymous communication can turn pretty nasty.” Many blogs and newspaper comment systems have moved away from anonymous access to more transparent systems using Facebook and other social media.

Hughes suggested that going forward, we will need to make leaders more transparent in the same way that these systems have made the public more civil.

Security risks become ever-present

Related to privacy is security. Break-ins happen.

Companies have traditionally tried to assure customers that their data is secure, but Matloff said it’s better to have a plan in place for when data is breached, rather than promising that it won’t be.

“Unauthorized access is inevitable,” he said. “Tech companies have a responsibility to anticipate that and minimize the damage.”

Somebody has to be accountable for customer support

One final consideration for leaders planning systems with IoT, wearables and big data is customer support. Businesses will need to coordinate to support customers when something goes wrong.

“If you don’t have an end-to-end big picture, you’re going to have a lot of self-satisfied ecosystem players who say ‘I’ve done my part,’ ” Weinberg said. “Getting them to accept best practices and getting this responsibility to be sufficiently shared as to be realistic is a huge challenge.”

In the end, customers don’t care whose fault a problem is; they just want to have it fixed.
