The technology of privacy is breaking ground at a steady pace as industries seek to navigate the brave new world of data. We highlight some of the key developments in cloud, regulation, and encryption for financial services from the Privitar In:Confidence conference recently held in London.
Data management is one of the major considerations in moving to the cloud, and we’ve heard frustration from practitioners over obstacles associated with permission systems, which are stalling cloud use for a variety of tasks, even modeling in a test environment.
Speaking at the conference, HSBC’s Shane Lamont, program director for Google Cloud adoption, explained how a strategy that is “context aware” helps with complexity and decisions about access at the bank. “We want to improve customer service through great analytics, but we need to ensure we have appropriate controls,” he explained during his presentation.
In balancing data access and privacy, context defines what someone is allowed to access. It operates at an organizational level, identifying data sets and roles – at its simplest, a UK analyst can see UK data. There is also the content side, in other words, what someone is allowed to see, such as names and dates of birth. Units tasked with financial crime are additionally allowed to see details like national ID, address and occupation, for example.
The approach as outlined by Lamont happens in steps: identify critical data elements; catalog and map data to them; identify roles and map them; add context to roles and data; create an access control layer; and access data only through that layer. In this way, an access layer that brokers data requests is combined with a response that provides gateway control, explained Lamont.
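The broker pattern Lamont describes can be sketched in a few lines. This is an illustrative toy, not HSBC's actual schema: the role names, regions and fields are invented, but the shape is the same – roles carry both context (which data sets, here by region) and content (which fields), and every request passes through a single access layer.

```python
# Toy access layer: context decides which rows a role may see,
# content decides which fields. All names here are illustrative.

RECORDS = [
    {"region": "UK", "name": "A. Smith", "dob": "1980-01-01",
     "national_id": "QQ123456C", "balance": 1200},
    {"region": "FR", "name": "B. Dupont", "dob": "1975-06-15",
     "national_id": "175067512345", "balance": 3400},
]

ROLES = {
    # context: regions the role may access; content: fields it may see
    "uk_analyst": {"regions": {"UK"}, "fields": {"region", "balance"}},
    "fin_crime": {"regions": {"UK", "FR"},
                  "fields": {"region", "name", "dob", "national_id"}},
}

def access_layer(role, records=RECORDS):
    """Broker every data request: filter rows by context, columns by content."""
    policy = ROLES[role]
    return [
        {k: v for k, v in rec.items() if k in policy["fields"]}
        for rec in records
        if rec["region"] in policy["regions"]
    ]

print(access_layer("uk_analyst"))
# the UK analyst gets only the UK row, with only the permitted fields
```

Note that even the broadest role here cannot see everything: `fin_crime` sees identity fields across regions but not balances, mirroring the point that no single role sees the whole.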
Moreover, solutions can be greatly simplified by considering the different environments: production, testing and development. Production environments hold real, valuable data, which carries real risks and so needs protection. In test environments, data is real or pre-built synthetic and is used to verify that the system works as intended. And in development, data is synthetic or anonymized and is used by developers to test software; there may be many such environments, each existing for a short period of time.
Not surprisingly, the financial crime unit has the most access, but there is no one role that can access all the data, all the time, explained Lamont. Indeed, the CEO is not even the individual with the most access. So, while considering data context and content enables granular roles, there is no one overarching view that can see the whole.
GDPR meets PSD2
While there is much to say about the push to develop ethical guidelines across many industries, in financial services the priority seems to be reconciling the arrival of the European regulation GDPR (General Data Protection Regulation) with the incoming PSD2 (Payment Services Directive), also known as open banking. Traditional banks have pointed out that the restrictions of GDPR may clash with the demands of open banking.
Guy Cohen, policy and strategy lead at Privitar, denies that GDPR and open banking are necessarily at odds with one another. In a conversation with Fintech Capital Markets, he explained that GDPR has rights embedded within it referencing data portability, which is not unlike what the mechanics of PSD2 are trying to achieve.
Article 20 of the GDPR is called “Right to data portability”, and it states that: “The data subject shall have the right to receive the personal data concerning him or her, which he or she has provided to the controller, in a structured, commonly used and machine-readable format and have the right to transmit those data to another controller without hindrance from the controller to which the personal data have been provided.”
“The principle is that every individual has the right to request from anyone who holds their data, or data provided by those individuals or organization, to transfer it to any other organization of their choice in a common machine-readable format,” said Cohen. “You should be able to move your data the same way you move your money.”
What’s lacking in GDPR are common standards that achieve those aims. Cohen said the challenge is ensuring that products and services developed by two different companies are interoperable with respect to data transfers. That can be done by standardizing how data is ‘ported’, but the GDPR doesn’t specify standards.
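Article 20's requirement of a "structured, commonly used and machine-readable format" can be illustrated with a minimal sketch. The schema label and field names below are hypothetical – the GDPR itself mandates no particular format, which is exactly the standardization gap Cohen describes – but JSON is a plausible candidate for such an export.

```python
import json

# Hypothetical Article 20-style export: personal data the subject provided,
# serialized in a structured, machine-readable format (JSON here) so another
# controller can ingest it. Schema and field names are illustrative only.

def export_portable_data(customer):
    return json.dumps(
        {"schema": "example.portability/v1", "subject": customer},
        indent=2, sort_keys=True,
    )

record = {"name": "A. Smith",
          "transactions": [{"date": "2019-05-01", "amount": -42.50}]}
payload = export_portable_data(record)
restored = json.loads(payload)  # a receiving controller parses it back
```

Without an agreed schema, each controller would invent its own field names – which is why Cohen expects codes of conduct to fill in what the regulation leaves open.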
He expects that in a few years we will see codes of conduct that specify what data portability means and what standardization might look like, enabling “APIs between fintech and bank equivalence”, for example.
Tech advances for data privacy
Some of the advances in computing on private data allow firms with conflicting interests to perform collaborative analyses without exposing their data to each other or anyone else, explained Adrià Gascón, research fellow at the Alan Turing Institute, speaking to Fintech Capital Markets.
This set of technologies is related to a subfield of cryptography called multi-party computation. For a finance-specific example, it means banks with mutually exclusive datasets, like loan defaults or fraud, can put them together for more accurate models.
A simple example is the task of computing the number of common elements in two lists, each held by one of two firms. There is a rich literature on cryptographic solutions for this problem, called private set intersection, he noted. In particular, Google researchers have published a solution tailored to the application of attributing ad conversion.
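The private set intersection problem can be sketched with a toy Diffie-Hellman-style protocol, one of the classic constructions in that literature (this is not necessarily the scheme Google's researchers used). Each party blinds the hash of its elements with a secret exponent; because exponentiation commutes, the double-blinded values match exactly when the underlying elements match, so the parties learn the overlap without revealing anything else. The parameters below are toy-sized and not production-safe.

```python
import hashlib
import secrets

P = 2**127 - 1  # a Mersenne prime; far too small for real security

def h(element):
    """Hash an element into the multiplicative group mod P."""
    return int.from_bytes(hashlib.sha256(element.encode()).digest(), "big") % P

def blind(elements, key):
    """A party raises each hashed element to its secret exponent."""
    return {pow(h(e), key, P) for e in elements}

def psi_size(set_a, set_b):
    ka = secrets.randbelow(P - 2) + 1  # party A's secret exponent
    kb = secrets.randbelow(P - 2) + 1  # party B's secret exponent
    # each party blinds its own set, then re-blinds the other's blinded set;
    # (h^ka)^kb == (h^kb)^ka, so matches survive the double blinding
    a_double = {pow(x, kb, P) for x in blind(set_a, ka)}
    b_double = {pow(x, ka, P) for x in blind(set_b, kb)}
    return len(a_double & b_double)

print(psi_size({"alice", "bob", "carol"}, {"bob", "carol", "dave"}))  # 2
```

In the ad-attribution setting below, one set would be the advertiser's purchasers and the other the ad supplier's viewers; the output is just the size of the overlap.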
An online advertiser wants evidence of money well spent. A useful metric is the number of customers that saw an ad online and then made a purchase at a store. However, this information is split between the advertiser and the ad supplier, and neither wants to disclose its data to the other due to privacy concerns. Cryptographic techniques provide a way to resolve this tension.
Besides multi-party computation, an emerging technology worth noting is homomorphic encryption, which allows users to perform operations directly on ciphertexts without prior decryption. This is theoretically possible, but still limited in terms of commercial availability. He pointed to Microsoft Research as a leader in this field for its work on the open source SEAL (Simple Encrypted Arithmetic Library).
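The homomorphic property can be demonstrated with a toy implementation of the Paillier cryptosystem, an additively homomorphic scheme (a different, older construction than the lattice-based schemes behind SEAL): multiplying two ciphertexts yields an encryption of the sum of the plaintexts, with no decryption in between. The fixed primes below are far too small for real use; this only shows the principle.

```python
import math
import secrets

# Toy Paillier: Enc(m1) * Enc(m2) mod n^2 decrypts to m1 + m2.
# Key sizes are illustrative only; real deployments use a vetted library.

def rand_unit(n):
    """Random r in [1, n) coprime with n (required for Paillier)."""
    while True:
        r = secrets.randbelow(n - 1) + 1
        if math.gcd(r, n) == 1:
            return r

def keygen():
    p, q = 1000003, 1000033       # small fixed primes, toy only
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)          # simple form, valid because g = n + 1
    return (n, n + 1), (lam, mu, n)

def encrypt(pub, m):
    n, g = pub
    r = rand_unit(n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(priv, c):
    lam, mu, n = priv
    u = pow(c, lam, n * n)
    return ((u - 1) // n * mu) % n

pub, priv = keygen()
c1, c2 = encrypt(pub, 17), encrypt(pub, 25)
c_sum = (c1 * c2) % (pub[0] ** 2)   # homomorphic addition on ciphertexts
print(decrypt(priv, c_sum))          # 42
```

Paillier supports only addition (and scalar multiplication) on ciphertexts; fully homomorphic schemes such as those in SEAL also support multiplication, which is what makes them attractive for richer analytics on encrypted data.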
Gascón is quick to note that the many techniques of cryptography are like “tricks in a big bag”. Data privacy is all about how those techniques are mixed as opposed to seeing development as any one technology being a standalone solution. “This is the way cryptographers think when they develop efficient protocols for a concrete task,” he said.
He also said that quantum computing has become a part of his world as industries struggle to prepare themselves for the potential impact of quantum machines, which could enable attacks on encryption schemes used daily on the internet. This awareness is being spearheaded by the National Institute of Standards and Technology (NIST).
Incidentally, most of the encryption schemes that support homomorphic encryption are quantum resistant, meaning there are no known quantum attacks on the hardness assumptions underlying them. Those assumptions stem from a problem with roots in machine learning, known as “learning with errors”, which is believed to take exponential time to solve.
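A toy encryption of a single bit shows the shape of the learning-with-errors problem: the ciphertext reveals only noisy inner products with the secret, and recovering the secret from such noisy samples is the problem believed to be hard even for quantum computers. The parameters below are far too small for security (the modulus is borrowed from a real lattice scheme purely for flavour); this is an illustration, not a usable cipher.

```python
import secrets

Q = 3329          # modulus, toy-sized here
N = 16            # secret dimension

def keygen():
    return [secrets.randbelow(Q) for _ in range(N)]

def encrypt(s, bit):
    a = [secrets.randbelow(Q) for _ in range(N)]    # public random vector
    e = secrets.randbelow(5) - 2                    # small noise in [-2, 2]
    # the message bit is pushed far from the noise by scaling it to Q // 2
    b = (sum(x * y for x, y in zip(a, s)) + e + bit * (Q // 2)) % Q
    return a, b

def decrypt(s, ct):
    a, b = ct
    d = (b - sum(x * y for x, y in zip(a, s))) % Q
    # the noise is tiny, so d sits near 0 for bit 0 and near Q // 2 for bit 1
    return 1 if Q // 4 < d < 3 * Q // 4 else 0

s = keygen()
print(decrypt(s, encrypt(s, 1)))  # 1
```

Without the secret `s`, an attacker sees only pairs `(a, b)` where `b` is a noisy inner product; stripping that noise is the learning-with-errors problem Gascón refers to.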
“The reason we can encrypt in a way that no one without a key can decrypt is usually because we can prove that you being able to break an encryption would mean that you are able to solve a computational problem that in principle no one in the world knows how to solve,” he explained. “But if all of a sudden quantum technology meant that such problem stopped being hard, we would have to update our encryption schemes so that we can withstand attackers that can run quantum machines.”
*Featured photo courtesy of Privitar