We mentioned as long ago as June last year, in our third update of the Periodic Table, that the conversation surrounding AI and Privacy had already gained serious momentum.

Today, that momentum seems unstoppable:

Dec 2019: The European Commission’s new president, Ursula von der Leyen, pledges GDPR-style legislation to govern the “human and ethical” implications of personal data in AI applications  

Dec 2019: The UK’s ICO releases a first draft of regulatory guidance on the use of AI, demanding in particular that organisations provide clear explanations of how decisions are made

Jan 2020: At the World Economic Forum in Davos, Salesforce calls for a national US privacy law for personal data, and Satya Nadella asserts that data and privacy should be treated as human rights under the banner of “data dignity”. Both interventions reflected privacy’s standing as a major theme at Davos, second only to climate change.

The conversation has now reached critical mass. These are some of the most notable industry bodies and regulators, plus some of the most powerful technology leaders on the planet, some speaking from arguably the most influential global platform at Davos, all urging businesses and regulators alike to reach the same conclusion and act more responsibly.

Clearly, the need for tighter links between privacy and AI is more than industry wishful thinking or empty philosophising. It’s a real and tangible need, even a requirement. But as with many of the conversation threads around AI, the businesses we talk to find plenty of discussion and debate about how to make AI projects privacy-sensitive, yet little practical guidance on how to achieve it.

This is because most businesses we encounter are searching for the wrong, and arguably an impossible, solution. They are considering rolling out AI projects and looking for ways to make their pre-ordained plan adhere to data privacy and ethics. But retrofitting privacy into data-driven projects is almost impossible without either curtailing a project’s performance and restricting its value, or achieving only partial privacy adherence. Instead, they should be looking to create a privacy-centric business and data culture, and then consider what applications, AI or otherwise, they can apply on top.

Businesses are thinking about it the wrong way around: “I want to implement AI, so I had better ensure my AI project adheres to my privacy obligations.”

Instead, they should think of it as:

 “I have obligations to ensure continuous adherence to data privacy requirements. If this is achieved, I then give myself and the business the confidence to roll out an AI project – and potentially many more data-driven initiatives.”

The answer to the above is Privacy by Design.

Privacy by Design is a strategic approach that requires any interaction with personal data to be performed with data protection and privacy in mind at every step. Such initiatives give businesses a far more granular understanding and control of their data than generic data governance programmes.
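
To make the idea concrete, here is a minimal, purely hypothetical sketch (not from any particular product or framework) of what “privacy in mind at every step” can look like in code: a data-access layer that releases records only under a declared purpose, drops any field that purpose does not justify, and pseudonymises direct identifiers before downstream analytics or AI code ever sees them. Every name in it (release, ALLOWED_FIELDS, SECRET_SALT) is illustrative.

```python
import hashlib

# Purely illustrative sketch of a "privacy at every step" data-access
# layer. All names here are hypothetical; a real deployment would use
# vetted key management and a proper consent/purpose register.

SECRET_SALT = "rotate-me-and-keep-in-a-vault"  # placeholder secret

# Fields each declared processing purpose is allowed to see
# (data minimisation): anything not listed is never released.
ALLOWED_FIELDS = {
    "churn_model": {"plan", "tenure_months", "monthly_spend"},
    "support_analytics": {"plan", "ticket_count"},
}

def pseudonymise(identifier: str) -> str:
    """One-way pseudonym for a direct identifier (sketch only)."""
    return hashlib.sha256((SECRET_SALT + identifier).encode()).hexdigest()[:16]

def release(record: dict, purpose: str) -> dict:
    """Release only the fields the declared purpose justifies."""
    allowed = ALLOWED_FIELDS.get(purpose)
    if allowed is None:
        raise PermissionError(f"No registered basis for purpose {purpose!r}")
    out = {k: v for k, v in record.items() if k in allowed}
    # Keep records joinable without exposing who they belong to.
    out["subject_id"] = pseudonymise(record["customer_id"])
    return out

raw = {
    "customer_id": "C-1042",
    "plan": "pro",
    "tenure_months": 18,
    "monthly_spend": 42.0,
    "ticket_count": 3,
}
print(release(raw, "churn_model"))
```

The design point is that privacy is the default path: applications must opt in under a named purpose, rather than privacy being bolted on after the project is built, which is precisely the retrofitting problem described above.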

And the fine-grained control that Privacy by Design offers brings enormous possibilities. Because your data becomes so well structured, so visible, under such oversight, and on such a firm ethical and regulatory footing, you can be totally confident in your authority to use it, and can overlay almost any application you choose for whatever end your business deems necessary.

For many, those applications will sit in the analytics, insights and automation camps. Others will have determined that their business objective warrants the use of machine learning (you can discover whether it really does with the help of this guide on Finding the right use case for AI).

Whatever the initiative or the necessary technology, by implementing Privacy by Design in advance you will have addressed one of the key concerns in the debates above: whether the data you hold can legitimately and ethically be put to the automated, intelligent use you have in mind.