Data Trust and Privacy – The Risks of Constant Connectivity
As the world becomes more connected and data-driven, notions of trust and privacy are evolving. Organizations striving for resilience are designing human-machine interactions into their systems and structuring new processes to accommodate the needs of both people and machines. As they do, most are keeping a watchful eye on the known and unknown risks of bio-digital convergence.
The most astute organizations are considering what bio-digital convergence means for their decision-making structures and work, as well as for security and global society. Here’s why that vigilance is so critical. Bio-digital convergence is the most intimate form of connectivity. Connectivity means machines are collecting human data, and as soon as the system collects it, that data belongs to the company, not to the human user who generated it.
Fundamentally, we all know this. It’s when we think about the scale of bio-digital convergence through the lens of ownership that the urgency of this conversation rises fast. Think about all the ways you’re connected to the network right now: your watch, car, home theater, online applications, medical devices, home security system. In exchange for the behavioral data you provide, you gain information, functionality, and a tailored experience. All of these devices feed your personal information into the network, where it becomes the organization’s responsibility to organize, store, analyze, and employ.
Organizations of every size and sector are responsible for the private data they hold. For all of them, this massive obligation is a moving target. Mistakes and breaches happen every day, at every scale. Clearly, neither awareness of the responsibility nor the effort to manage it is yet sufficient.
Unwittingly Exposing Our Deepest Secrets
In January 2018, Strava, a popular social fitness-tracking platform for amateur and hobbyist athletes, inadvertently turned secretive military bases and patrol routes into public data. The company had updated its GPS-based global heat map, which visualized a billion exercise activities (like running and cycling) performed by users who had linked a Fitbit or other wearable fitness tracker. In doing so, it exposed the locations and movements of subscribers on sensitive military bases around the world, making it possible to discern intelligence operations and even the names of individuals on the bases.
The situation highlighted a problem. Military branches, including the U.S. Army, have promoted the use of fitness trackers for the wellness of their people and organizations. Those fitness trackers fed private data into a series of networks. Neither the military, the fitness tracker manufacturers, nor the company creating the athlete heat maps anticipated this exposure. The intent was good. The vulnerabilities were unanticipated and dangerous.
One of the takeaways is that security is too often reactive rather than proactive. Companies frequently fail to understand the implications of their data-gathering processes until they are responding to their failures. It’s not that different for citizens. We share private data every day, knowingly putting ourselves at risk but opting in anyway because large companies promise rewards. Companies like Strava – or Facebook.
By now, you’re probably well familiar with the news that broke in early March 2018 about Facebook and Cambridge Analytica. The social platform used around the world to share everything from personal highlights and great meals to pictures of new babies and memorials for loved ones exposed the private data of approximately 87 million people. The government is involved. Stocks have plummeted. And people are trying to delete their accounts, not understanding that unspooling the network of their private information may be nearly impossible and could take as long as 90 days.
It’s an urgent, stark, and profoundly uncomfortable reminder that all those milestones, photos, comments, and activities users shared on Facebook are no longer theirs. Now they are trying to get that information – and more importantly, that control – back.
Disconnecting is Not an Option
Of course, consumers have the choice to stay off social media, avoid fitness trackers, drive entirely manual cars, etc. But not really. Entirely disconnecting is the modern equivalent of living like a survivalist in a bunker in the woods. To be in today’s world is to be connected in some way, shape, or form. On the organizational side, shutting off data systems to create and protect privacy isn’t an option either. There’s no turning back.
That leaves us with changing expectations of privacy. For organizations, that means transparency and clarity in communicating their processes for bio-digital convergence. For users, it means actually paying attention to those communications, realizing that data can be collected whether or not you have opted in, and doing the diligence needed to decide whether to give explicit or tacit consent.
Connectivity is on the Rise
Ideally, we are moving toward a proactive, well-regulated relationship in which consumers and companies share a notion of privacy and trust. Getting to that point means nothing less than bridging chasms. We don’t have to fast-forward all that far to see a world with near-complete convergence between man and machine. It’s office buildings replete with sensors that track energy use and identify the best places for people to work to optimize sustainability. It’s smart cities. It’s cloud-connected fitness trackers that are not just wearable but embedded in your body, like a pacemaker.
In 2016, the average person generated 650MB of data a day through PCs, mobile phones, and wearables. By 2020, that daily rate is expected to reach 1.5GB. That number may sound high, but consider that a single autonomous vehicle generates 4,000GB a day – and that data is not just about the car or the road; it’s about the people in the car.
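The gap between those figures is easier to appreciate as a quick back-of-the-envelope calculation. A minimal sketch in Python, assuming decimal units (1GB = 1,000MB, which the source does not specify):

```python
# Rough comparison of the daily data-generation figures cited above.
# Assumption: decimal units, 1 GB = 1,000 MB.

MB_PER_GB = 1000

person_2016_mb = 650                   # avg person per day, 2016
person_2020_mb = 1.5 * MB_PER_GB       # avg person per day, projected 2020
vehicle_mb = 4000 * MB_PER_GB          # one autonomous vehicle, per day

# Personal data generation roughly doubles in four years...
growth = person_2020_mb / person_2016_mb

# ...while a single vehicle out-generates thousands of people.
vehicle_vs_person = vehicle_mb / person_2020_mb

print(f"Personal daily data, 2016 to 2020: {growth:.1f}x growth")
print(f"One autonomous vehicle = {vehicle_vs_person:,.0f} people's daily data")
```

On these assumptions, one autonomous vehicle produces as much data each day as roughly 2,700 connected people – which is why the car, not the phone, may become the defining privacy surface.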
In the very near future, data will be collected about us without our even knowing it. We won’t have to opt in. We will just have to be in the world. Wary consumers will need to be as vigilant about their data as the companies collecting, storing, and using it.
Great Risk – and Great Promise
Bio-digital convergence challenges the notion of trust and what it means to be human. If data is shaping how we live, we are forced to consider who actually controls our decisions and behaviors. There will be winners and losers. How should we prepare for this future state? Toffler Associates thinks about this evolving opportunity and risk every day. We propose that a new role has emerged: a “Trust Architect” – someone responsible for working cross-functionally within the organization to design and build a system of privacy, intelligence-use, and trust standards specifically for the bio-digitally converged world. Whether or not this role exists in the immediate term, your company will have to evolve as notions of trust and privacy change.
Industry giants are rising and falling on the soundness of their data practices. Now is the time to learn, plan, and begin to shape how man-machine connectivity will benefit each and all of your stakeholders.
Now is the time to create the reasons why your bio-digitally connected customers can trust you.
Sources
“The Strava Heat Map and the End of Secrets,” Wired, January 2018
“Some Facebook Quitters Face Technical Obstacles,” Wall Street Journal, April 5, 2018
“Facebook is under fire for not protecting your personal data better,” Recode, 2018
“The world’s most sustainable building is also the smartest,” Wired, 2017
“My pacemaker is tracking me from inside my body,” Nextgov, 2018
“Data is the New Oil in the Future of Automated Driving,” Intel, 2016