A wire fence

Data privacy is so hot right now

We get it. When you’re a start-up, there are so many things vying for your time and attention. Funding, hiring, innovating… Which means there’s a chance that your data privacy might get pushed further and further down your priority list as more visible, visceral issues appear. Because there are more important areas you can compete in, right?

Absolutely. You know your business better than anyone. But if you’re not seeing privacy as an area in which you can compete, you’re doing yourself - and your users - a massive disservice. Data privacy is huge right now, and for good reasons. And your users care about it. A lot.

So let’s bust some of those privacy-isn’t-for-us-just-yet myths…

Myth 1: Privacy is too complex for a start-up

Now, we’re not saying that great privacy is easy. But we are very much saying that it’s not that complicated, either.

Particularly if you’ve got it baked in from the very start. That means deciding on a compliant privacy policy and building it into your app, site, or store, so they’re as secure as possible on day one. But it also means that your software has to be easily updatable as laws and, let’s be honest, bad-faith actors evolve. Which brings us neatly to our next myth.

Myth 2: We’re too small to be a target

Any company that collects user data, be it health info, credit card numbers, addresses, spending and shopping habits, or political affiliations, can be subject to privacy laws. And that’s regardless of company size, turnover, or seed stage.

In addition, as those laws change to keep pace with the tech, every company has to be able to keep up with them. Because the fines for breaching them are real.

The average penalty imposed on companies in breach of Europe’s GDPR currently stands at around half a million dollars. And even the smaller fines clock in at around $60k or more. Loss of capital as a result of privacy fines is completely unnecessary when solutions are so easily found and implemented.

Then there are those bad-faith actors we mentioned earlier. No amount of data is too small to sell on the dark web, where it can fetch huge amounts of money, as several companies have learned to their cost. And whether you’re held to ransom or fined for allowing the breach in the first place, regaining your users’ trust can be an impossible task.

Myth 3: Users don’t care about privacy

Wrong. This myth is massively wrong, and frankly, borders on dangerous. Users are far more savvy these days about data collection, and the abuse of that data, thanks in large part to some huge and very public breaches. Flo, we’re very much looking at you. Talk about sensitive data… According to America’s FTC, Flo handed that data over to Facebook and Google for, essentially, free. Data that included period dates and intention to conceive. And although they somehow escaped a huge fine, Flo’s owners settled out of court, and that settlement included the FTC telling everyone what they’d done and that the practice only ceased as a result of the negative publicity. As, we guess, penance, they’ve since launched the at-least-they’re-trying Anonymous Mode.

Flo had tens of millions of active users at the time, so the amount of data they had access to is mind-boggling. And it’s one of the first cases your users are likely to think of when it comes to how well you’re going to look after their data. Just ask all the women who deleted the app following that humiliating settlement.

Myth 4: Privacy gets in the way

Quite the opposite: there are plenty of cases where privacy can actually make a big difference to users’ lives and security. Let’s say you’re the CEO of a start-up that collects, ahem, information of a sensitive nature. Are you one of those CEOs who believes that adding a consent box makes your users uncomfortable enough to drop out of the sign-up process? Well, you’re here, so we’re guessing you’re not.

But we’ve actually heard that argument from health start-ups, and it’s about as tone-deaf as it’s possible to get. Protecting user data, particularly when it comes to health, menstrual cycles, fertility treatments, or mental health, is front and centre for a growing number of users.

And those users do their research, looking out for companies that have a bad rep, or huge fines under their belts. So if your start-up has a privacy policy that’s clear and honest, you won’t lose users. In fact, you’re more likely to see an uplift in those funnel metrics and not a drop.

That’s because being honest from the start can instil trust and a feeling of control in your users, and that is invaluable. Good privacy, and the positive PR that comes with it, also means you’ll see those effects at the top of your sign-up funnel, not just inside it.

Data privacy: It’s less complicated than you think

As far as we’re concerned, there are three foundational practices that can help your start-up stay ahead of both future laws and bad-faith actors. Stick to them, and you’ll always have the protection you need:

  1. Data minimization: Keep your data collection and storage to the absolute minimum necessary to maintain the functionality of your app, site, or store. For instance, if you use age bands on your app or site, you need only ask for year of birth. You don’t need your users’ exact birth date. And they’ll know that.

  2. Transparency and choice: Remember when Gamestation changed its terms and conditions so that they owned your soul? Neither does anyone else, but for a while there, Gamestation owned more souls than games. Make your privacy policies short and easy to read, give opt-outs and data deletion rights - all requirements for GDPR compliance - and your users will love you. And they’ll keep their souls, which is a bonus.

  3. Encryption: Yes, yes, we know it should be a given. But data encryption - both at rest and in transit - is crucial. And it’s the first thing your users want to hear. Get the basics right, and the rest will fall into place.

What about LLMs and privacy?

As you may have noticed, we’re big fans of LLMs here at DataFenix. They’re smart and powerful, but they can come with a few considerations where privacy is concerned. For instance, in the early stages, there will likely be a trade-off as you balance the functionality you need from your specially trained LLM against privacy compliance. And that generally breaks down to this:

Managed vs. local: You can either run a specially trained, local LLM on your own servers or your users’ devices, or use a managed LLM, likely provided by OpenAI, Google, or another big tech company.

Managed LLMs have several advantages. There’s great security, as the providers have the space and money for it. They also offer high performance levels and advanced features for speedy innovation and solution implementation.

That said, sending your data to, or storing it on, external servers can mean that you don’t have the same level of control that self-hosting or on-device storage brings. Of course, you’ll always have full control over how much and what data you send, including its level of sensitivity.
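On that control point, one common pattern is to redact obvious identifiers before a prompt ever leaves your infrastructure for a managed provider. The sketch below is purely illustrative - the helper name and the two toy regexes are our own, and a production system would lean on a dedicated PII-detection library rather than a couple of patterns:

```python
import re

# Two deliberately simple example patterns - real PII detection
# covers far more (names, addresses, IDs, free-text health details).
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "PHONE": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}

def redact(text: str) -> str:
    """Replace each match with a placeholder tag before the text
    is sent to an external, managed LLM."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text
```

Run user input through something like this at the boundary, and the managed provider only ever sees `[EMAIL]` and `[PHONE]` placeholders instead of the real thing.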

But use a local LLM, and the tech could put limitations on your functionality. LLMs need room and power to run at their full potential, and no one can guarantee that user handsets, tablets, and average PCs will be able to handle it. This area is evolving so rapidly, though, that we’re sure these comments will be out of date by the time you’re reading this!

In conclusion

An honest, genuine attempt at privacy protection and the flexibility to evolve are crucial building blocks for any start-up, because laying that groundwork now can give you a robust framework for a data-protected future.