Data collection by firms has long had one thing missing: trust. Part of this stems from growing user awareness, new data-localisation laws, and big tech's shifting technological focus in response to the ongoing privacy debate. It remains to be seen whether there is a conflict of interest in the non-profits likely to be propped up by big tech itself. The onslaught of "self-regulation" and "standards" amounts to gatekeeping, keeping regulators at bay, while lobbyists serve the purpose of "educating the regulators". The bigger problem, though, is that the laws around data retention remain opaque.
That curtain has since been lifted, and a convergence of consumer, government, and market forces is now giving users more control over the data they generate. Instead of treating personal data as a resource that can be freely harvested, countries in every region of the world have begun to treat it as an asset owned by individuals and held in trust by firms.
The challenges will be manifold. At present, the onus falls on several overlapping leadership roles:
From the write up:
For established companies, these changes present a new set of data challenges on top of the ones they already have. Most large firms already suffer from a series of internal tensions over customer data. They typically have a Chief Information Officer whose role is to keep data in: collect it, encrypt it, and secure it from hackers. They also have a Chief Digital Officer whose role is to push data out: mine it, model it, and use it to entice users. Some have also added Chief Data Officers — a notably unstable position due, unsurprisingly, to lack of definition for the job — as well as Chief Information Security Officers and Chief Privacy Officers.
My concern is that this will further bloat the bureaucracy, turning a data-driven organisation into a tangle of "accountabilities". Let's hope the real silos are broken down so that the context of AI application in healthcare is properly understood.