December 9, 2019 by Roisin Hutchinson, Senior Associate, Walker Sorensen LLP
Today, the term “insurtech” is widely used to refer to the use of technology and digital innovation to create value and efficiencies in the sale and delivery of insurance and reinsurance. One of the key benefits of insurtech is the ability to efficiently process customer information and use that information to improve product offerings and the user experience.
Property and casualty insurance companies have increasingly come to view insurtechs as valuable partners in the distribution of insurance products. If you are thinking of entering into a contractual arrangement with an insurtech, here are three key issues to consider that relate to the use and exchange of data:
• Artificial intelligence (AI) and consumer data
• Consent, collection and use of data
• Limitation of liability
AI and defining “customer data”
From a business perspective, insurtech and the rapid adoption of AI and machine learning represent a new frontier of opportunity for the insurance industry. However, collaboration with insurtech companies and the increasing use of AI in nearly all aspects of the insurance business may raise new and complex legal issues for insurers, reinsurers, and agents.
Pay attention to how “customer data” is defined in any agreement in which personal information is being provided to develop or train an AI algorithm. No third party can “own” an individual’s personal information, but a third party can have the contractual right to use it. On the other hand, an end product such as a predictive algorithm can be owned, and is therefore subject to intellectual property rights and obligations.
When negotiating ownership and use rights, the parties should consider whether the objective is to own the information used by the algorithm or the algorithm itself, and to ensure defined terms and ownership are clear. One approach in this situation may be to define different types of data based on the various stages of the product cycle — for example, pre-existing information, information resulting from processing, and the final data product.
Consent and collection
The federal law governing the collection, use and disclosure of personal information in Canada is the Personal Information Protection and Electronic Documents Act (PIPEDA). Certain provinces, such as Alberta, British Columbia and Québec, have substantially similar provincial privacy legislation that applies instead of PIPEDA.
PIPEDA requires organizations to obtain the meaningful consent of the individual for the collection, use and disclosure of personal information. That information cannot be used or disclosed for purposes other than those for which it was collected, except with the consent of the individual. When the information being collected and used is considered to be sensitive, organizations must generally obtain express consent.
In its Guidelines for Obtaining Meaningful Consent, the Office of the Privacy Commissioner of Canada noted that the level of sensitivity of personal information can change depending on what it can reveal when combined with other personal information about the individual. The Supreme Court of Canada has confirmed that financial information, in particular, “is generally extremely sensitive,” as it is “one of the types of private information that falls at the heart of a person’s ‘biographical core.’”
Use of the data
If the parties intend to use personal information to develop tools such as anti-fraud predictive algorithms, then the consents must be examined carefully to confirm that this is permitted. PIPEDA is clear that an organization cannot use or share personal information for a given purpose unless it disclosed that purpose to the individual when it obtained their original consent. If the consent does not expressly disclose the development of anti-fraud predictive algorithms as a purpose for the collection, then fresh consent may be required. There is an exception: fresh consent may not be necessary if it is reasonable in the circumstances for the individual to expect that their information would be used for developing anti-fraud predictive algorithms.
PIPEDA also requires that organizations protect personal information using appropriate security safeguards. Recent amendments to PIPEDA require organizations to report any breach of security safeguards involving personal information if it is reasonable in the circumstances to believe the breach creates a real risk of significant harm to the individual. If you are the party providing customer data to the other, then ensure the written agreement requires the other party to have appropriate data security safeguards in place and to provide notice if there is a breach of those safeguards.
Limitation of liability
Insurtech agreements are often collaborative endeavours that can benefit both parties, but consideration should be given to which party should be liable in the event of a breach of the agreement, and whether there will be any limit to their liability. Limiting liability to a multiple of fees paid may be appropriate in certain circumstances. However, it may also be important to draft exceptions to the limitation of liability related to breaches of privacy, data security, and confidentiality obligations that give rise to consequential damages beyond a multiple of fees.
Roisin Hutchinson is a senior associate at Walker Sorensen LLP, a business boutique law firm that focuses on advising insurers, reinsurers and agents.