Wednesday, April 13, 2022

Video: Privacy and start-ups ... what founders need to know

In my legal practice, I have seen businesses fail because they did not take privacy into account. I’ve seen customers walk away from deals because of privacy issues, and I’ve seen acquisitions fall apart in due diligence because of privacy.

Today, I’m going to be talking about privacy by design for start-ups, to help embed privacy into growing and high-growth businesses.

Episode 2 of Season 4 of HBO’s “Silicon Valley” provides a good case study on the possible consequences of not getting privacy compliance right.

Privacy means different things to different people. And people have wildly variable feelings about privacy. As a founder, you need to understand that and take that into account.

In some ways, privacy is about being left alone, not observed and surveilled.

It is about giving people meaningful choices and control. They need to understand what is happening with their personal information, and they should have control over it: what they share and how it is used. They should get to choose whether something is widely disseminated or not.

Privacy is also about regulatory compliance. As a founder you need to make sure your company complies with the regulatory obligations imposed on it. If you are in the business to business space, you will need to understand the regulatory obligations imposed on your customers. I can guarantee you that your customers will look very, very closely at whether your product affects their compliance with their legal obligations. And they’ll walk away if there’s any realistic chance that using your product puts their compliance at risk.

Privacy is about trust in a number of ways. If you are in the business to consumer space, your end-users will only embrace your product if they trust it: if they know what the product is doing with their information and they trust you to keep it consistent. If you are in the business to business space, your customers will only use your product or service if they trust you. If you’re a start-up, you don’t yet have a track record or wide adoption to speak on your behalf. A deal with a start-up is always a leap of faith, and trust has to be built. And there are a bunch of indicators of trustworthiness. I have advised clients to walk away from deals where the documentation and the responses to questions don’t suggest privacy maturity. If you have just cut and pasted your privacy policy from someone else, we can tell.

Privacy is not just security, but security is critical to privacy. Diligent security is table stakes, and a lack of security is the highest-risk area. We seldom see class-action lawsuits over getting the wrong kind of consent, but most privacy or security breaches are followed by class-action lawsuits. Your customers will expect you to safeguard their data with the same degree of diligence as they would themselves. In the business to business space, they should be able to expect you to do it better.

You need to make sure there are no surprises. Set expectations and meet them.

In my 20+ years working with companies on privacy, one thing is clear: people don’t like it when something is “creepy”. On its own, that is usually a useless word, since the creepy line is drawn very differently by different people. But what I’ve learned is that where the creepy line sits depends on expectations. Things are always creepy or off-putting when something happens with your personal information that you did not expect.

As a founder, you really have to realize that regardless of whether or not you care about privacy yourself, your end users care about privacy. Don't believe the hype: privacy is far from dead.

If you are in the business to business arena, your customers are going to care very deeply about the privacy and security of the information that they entrust you with. If you have a competitor with greater privacy diligence or a track record, you have important ground to make up.

And, of course, for founders, getting investment is critical to the success of the business. The investors in your friends and family round, or even your seed round, might not be particularly sophisticated when it comes to privacy. But mark my words: sophisticated funds carry out due diligence and know that privacy failures can often equal business failures. I have seen investments go completely sideways because of privacy liabilities hidden in the business. And when it comes time to make an exit via acquisition, every single due diligence questionnaire has an entire section, if not a chapter, on privacy and security matters. The weeks leading up to a transaction are not the time to be slapping Band-Aids on privacy problems that were built into the business or the product from the very first days. As a founder, you want to make sure that potential privacy issues are, at least, identified and managed long before that point.

The borderless world

I once worked with a founder and CEO who often said that if you are a startup in Canada and your ambition is the Canadian market, you have set your sights too low and you are likely to fail. The world is global, and digital is more global than any other sector. You might launch your minimum viable product or experiment with product-market fit in the local marketplace, but your prospective customers are around the world. This also means that privacy laws around the world are going to affect your business.

If your product or services are directed at consumers, you will have to think about being exposed to and complying with the privacy laws of every single jurisdiction where your end users reside. That is just the nature of the beast.

If you're selling to other businesses, each of those businesses is going to be subject to local privacy laws that may differ significantly from what you're used to. Once you get into particular niches, such as processing personal health information or educational technology, the complexity and the stakes rise significantly.

You definitely want to consult with somebody who is familiar with the alphabet soup of PIPEDA, PIPA, CASL, PHIA, GDPR, COPPA, CCPA, CPRA, HIPAA.

You're going to want to talk carefully and deeply with your customers to find out what their regulatory requirements are, which they need to push down onto their suppliers.

The consequences of getting it wrong can be significant. You can end up with a useless product or service, one that cannot be sold or that cannot be used by your target customers. I’ve seen that happen.

A privacy incident can cause significant reputational harm, which can be disastrous as a newcomer in a marketplace trying to attract customers.

Fixing issues after the fact is often very expensive. Some privacy and security requirements may mandate a particular way to architect your back-end systems. Some rules may require localization for certain customers, and if you did not anticipate that out of the gate, implementing those requirements can be time and resource intensive.

Of course, there's always the possibility of regulatory action resulting in fines and penalties. Few things stand out on a due diligence checklist like having to disclose an ongoing regulatory investigation or a hit to your balance sheet caused by penalties.

All of these, individually or taken together, can be a significant impediment to closing an investment deal or a financing, and can be completely fatal to a possible exit by acquisition.

So what's the way to manage this? It's something called privacy by design, which is a methodology that was originally created in Canada by Dr. Ann Cavoukian, the former Information and Privacy Commissioner of Ontario.

Here's what it requires at a relatively high level.

First of all, you need to be proactive about privacy and not reactive. You want to think deeply about privacy, anticipate issues and address them up front rather than reacting to issues or problems as they come up.

Second, you need to make privacy the default. Think about privacy holistically, focusing particularly on consumers and user choice, and set your defaults to be privacy protective so that end users get to choose whether or not to deviate from them. (There is a small sketch of what that can look like right after this list of principles.)

Third, you need to embed privacy into your design and coding process. Privacy should be a topic at every project management meeting. I'll talk about the methodology for that in a couple minutes.

Fourth, you need to think about privacy as a positive-sum game rather than a zero-sum game. Too often, people think about privacy versus efficiency, or privacy versus innovation, or privacy versus security. You need to be creative and think about privacy as a driver of efficiency, innovation and security.

Fifth, you need to build in end-to-end security. As I mentioned before, security may in fact be the highest-risk area, given the possibility of liability and penalties. You need to think about protecting end users from themselves, from their carelessness, and from all possible adversaries.

Sixth, you need to build in visibility and transparency. Just about every single privacy law out there requires that an organization be open and transparent about its practices. In my experience, an organization that is proactive in talking about privacy and security, and how it addresses them, has a significant “leg up” over anybody who is not.

Seventh, and finally, you need to always be aware that end users are human beings who have a strong interest in their own privacy. They might make individual choices that differ from your own privacy comfort levels, but that is human. Always look at your product and all of your choices through the eyes of your human end users. Think about how you would explain your product and services to an end user, and whether the choices that you have made in its design can be justified to them.
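To make the second principle, privacy as the default, a little more concrete, here is a minimal sketch of what privacy-protective defaults might look like in code. The setting names are hypothetical and purely illustrative; the point is simply that every optional use of personal information starts switched off and the end user opts in.

    // Hypothetical user privacy settings for a new account (illustrative names only).
    // Every optional use of personal information starts OFF; the user opts in.
    interface PrivacySettings {
      shareUsageAnalytics: boolean;   // analytics beyond what the service strictly needs
      allowMarketingEmail: boolean;   // promotional contact
      makeProfilePublic: boolean;     // public visibility of the user's profile
      shareWithPartners: boolean;     // disclosure to third-party partners
    }

    // Privacy-protective defaults: the user chooses to deviate, not the product.
    const defaultPrivacySettings: PrivacySettings = {
      shareUsageAnalytics: false,
      allowMarketingEmail: false,
      makeProfilePublic: false,
      shareWithPartners: false,
    };

Something this simple also gives you an artifact you can point to in due diligence to show that user choice was designed in from day one.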

A key tool to implement this is to document your privacy process and build it iteratively into your product development process. For every single product or feature of a product, you need to document what data from or about users is collected. What data is generated? What inferences are made? You will want to get very detailed, knowing every single data field that is collected or generated in connection with your product.

Next, you need to carefully document how each data element is used. Why do you need that data? How do you propose to use it? Is it necessary for that product or feature? If it is not a “must have” but only a “good to have”, how do you build that choice into your product?

You need to ask: is this data ever externally exposed? Does it go to a third party to be processed on your behalf? Is it ever publicly surfaced? Are there any ways that the data might be exposed to a bad actor or adversary?

In most places, privacy regulations require that you give individual users notice about the purposes for which personal information is collected, used or disclosed. You need to give users control over this. How are the obligations for notice and control built into your product from day one? When a user clicks a button, is it obvious to them what happens next?
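One practical way to capture the answers to the questions in the last few paragraphs is a per-field data inventory kept alongside your product documentation. The sketch below is only an illustration, with hypothetical field names and categories; the value is in forcing these questions to be answered for every single data element.

    // A hypothetical data-inventory entry, one per data field the product
    // collects or generates. A typed record like this is easy to review at
    // project meetings and to generate user-facing notices from.
    interface DataInventoryEntry {
      field: string;                                   // e.g. "email_address"
      source: "collected" | "generated" | "inferred";
      purposes: string[];                              // why the data is needed
      necessity: "must have" | "good to have";
      userChoice: boolean;                             // can the user decline or toggle this use?
      noticeGivenAt: string;                           // where the purpose is disclosed to the user
      sharedWith: string[];                            // processors or other third parties, if any
      publiclySurfaced: boolean;                       // is it ever exposed publicly?
    }

    // Example entry (illustrative values only).
    const emailAddress: DataInventoryEntry = {
      field: "email_address",
      source: "collected",
      purposes: ["account login", "transactional notices"],
      necessity: "must have",
      userChoice: false,
      noticeGivenAt: "sign-up form and privacy policy",
      sharedWith: ["email delivery provider (processor)"],
      publiclySurfaced: false,
    };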

You will then need to ask: where is the data? Is it stored locally, on a device or server managed by the user or the customer? Is it on servers that you control? Is it a combination of the two? Is the data safe wherever it resides? To some people, local on-device storage and processing is seen as more privacy protective than storage with the service provider, but in some cases those endpoints are less secure than a data center environment, which comes with its own, different risks.

Finally, think about life cycle management for the data. How long is it retained? How long do you or the end user actually need that information for? If it's no longer needed for the purpose identified to the end user, it should be securely deleted. You'll also want to think about giving the end user control over deleting their information. In some jurisdictions, this is a legal requirement.
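As an illustration of that last point, a simple retention rule per data category, plus a check for data that has outlived its purpose, might look something like the sketch below. The categories and retention periods are placeholders, not recommendations; the right values depend on your purposes and on the laws that apply to you.

    // Hypothetical retention rules, one per category of personal information.
    interface RetentionRule {
      category: string;              // e.g. "usage_analytics"
      storage: "on-device" | "our-servers" | "both";
      retainDays: number;            // how long the data is kept after it was last needed
      deletableByUser: boolean;      // can the end user delete it on demand?
    }

    const retentionRules: RetentionRule[] = [
      { category: "account_profile", storage: "our-servers", retainDays: 30, deletableByUser: true },
      { category: "usage_analytics", storage: "our-servers", retainDays: 90, deletableByUser: true },
    ];

    // Flags a record that has outlived its retention period and should be
    // queued for secure deletion.
    function isDueForDeletion(rule: RetentionRule, lastNeeded: Date, now: Date = new Date()): boolean {
      const ageInDays = (now.getTime() - lastNeeded.getTime()) / (1000 * 60 * 60 * 24);
      return ageInDays > rule.retainDays;
    }

Writing the rules down, even this informally, makes it much easier to answer the retention and deletion questions when a customer or regulator asks them.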

Everybody on your team needs to understand privacy as a concept and how privacy relates to their work function. Not everybody will become a subject matter expert, but a pervasive level of awareness is critical. Making sure that you do have subject matter expertise properly deployed in your company is important.

You also have to understand that it is an iterative process. Modern development environments can sometimes be likened to building or upgrading an aircraft while it is in flight. You need to be thinking of flight worthiness at every stage.

When a product or service is initially designed, you need to go through that privacy design process to identify and mitigate all of the privacy issues. No product should be launched, even in beta, until those issues have been identified and addressed. And then any add-ons or enhancements to that product or service need to go through the exact same scrutiny, to make sure that no new issues are introduced without having been carefully thought through and managed.

I have seen too many interesting and innovative product ideas fail because privacy and compliance simply were not on the founder’s radar until it was too late. I have seen financing deals derailed and acquisitions tanked for similar reasons.

Understandably, founders are often most focused on product-market fit and a minimum viable product to launch. But you need to realize that a product that cannot be used by your customers, or that carries significant regulatory and compliance risk, is not a viable product.

I hope this has been of interest. The discussion was obviously at a pretty high level, but my colleagues and I are always happy to talk with startup founders to help assess the impact of privacy and compliance on their businesses.

If you have any questions or comments, please feel free to leave them below. I read them all and try to reply to them all as well. If your company needs help in this area, please reach out.

And, of course, feel free to share this with anybody in the startup community for whom it may be useful.
