
Your data

Privacy is more than a policy page.

Most AI products promise privacy and rely on you to trust them. We don’t think that’s enough. The way data is protected matters more than the way it’s described. Below is what Impela actually does — what we store, how it’s protected, what you can see and change, and what you can do if you ever want to leave.

What we store

To do coaching, Impela has to remember. Some of what it stores:

  • Your account: name, email, password (hashed, not stored as readable text), phone number if you opt in to SMS.
  • Your conversations: every coaching session, with timestamps and message content.
  • Your profile: personality observations, values, strengths, growth areas — built up across sessions.
  • Your goals and commitments: what you said you’d do, when, and what happened.
  • Your relationships: people you discuss in coaching, their relationship to you, the context.
  • Patterns: things Impela has noticed about how you work, decide, and react.
  • Documents and artifacts: anything you upload, anything coaching produces.

There’s no shadow profile. Everything Impela has on you is in your account, and you can see all of it.

What we don’t do

We don’t sell your data. Not to anyone. Not for any reason.

We don’t share your conversations or profile with employers, insurers, family members, advertisers, data brokers, or anyone outside Impela.

We don’t train general-purpose AI models on your data. Your sessions are not in some training set making future models smarter. They’re yours.

We don’t read your conversations. No human at Impela — including the founder — can read what you’ve said in coaching unless you explicitly grant permission for a specific support situation, and that permission is time-limited.

How that’s possible

You might wonder: if Impela stores all of this, how can we not read it?

Because storing and reading are different things. Your data sits on our servers in encrypted form — locked. The keys to unlock it are tied to your active session, created when you log in with your email and password, and erased when your session ends. Without an active session, the data is encrypted bytes that nobody — including us — can make sense of.

We never have your password. Passwords are turned into a one-way hash the moment you create them; even we can’t reverse them. So when we say “we can’t read your data,” it isn’t a promise we’re trusting ourselves to keep. It’s a property of how the system is built.
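A minimal sketch of that separation, assuming a scrypt-based scheme (function names and parameters here are illustrative, not Impela's actual implementation): the server keeps only salts and a one-way login hash, while the data key is derived fresh at login and discarded when the session ends.

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes) -> bytes:
    # One-way: scrypt output cannot be reversed to recover the password.
    return hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)

def derive_session_key(password: str, key_salt: bytes) -> bytes:
    # A separate derivation (different salt) yields the data key, so the
    # stored login hash never doubles as an encryption key.
    return hashlib.scrypt(password.encode(), salt=key_salt, n=2**14, r=8, p=1)

# Account creation: the server keeps only the salts and the login hash.
login_salt, key_salt = os.urandom(16), os.urandom(16)
stored_hash = hash_password("correct horse battery staple", login_salt)

# Login: verify the password, then derive the short-lived data key.
attempt = "correct horse battery staple"
assert hmac.compare_digest(stored_hash, hash_password(attempt, login_salt))
session_key = derive_session_key(attempt, key_salt)  # exists only in memory

# Session end: the key is discarded. Nothing the server stored can
# re-create it, because the password itself was never kept.
session_key = None
```

The point of the two salts is that knowing the stored login hash tells you nothing about the data key; both depend on the password, which only you have.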

One honest caveat: in any cloud-hosted system, an administrator with deliberate intent and full server access could theoretically work around this. We mention it because nothing in any cloud product is absolutely impossible. But the architecture means casual access — an employee browsing your conversations, a feature accidentally surfacing private content, a stray query pulling private data into a report — is structurally prevented. To read your data, someone would have to deliberately circumvent multiple layers, and the attempt would be logged. The bar is high, deliberately so.

Three commitments, built into the code

  1. Your data is not shared with marketers, advertisers, or any third party beyond the subprocessors needed to operate the service — the AI provider, the email and SMS providers, and the billing provider. Each is under binding data-protection agreements. (See the full list below.)
  2. No human at Impela can read your data without your explicit, scoped, time-bounded permission. When you need support help, you grant access through Settings — choosing exactly which items the support team can see (specific sessions, memories, patterns, or other records), and for how long (up to seven days). When the grant expires or you revoke it, access is cut off automatically and the cached data is purged.
  3. These rules are enforced architecturally, not just by policy. A policy that says “we won’t” depends on the company keeping its word. An architecture that says “we can’t” depends on math. We’ve built it the second way.
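The second commitment, a scoped and time-bounded grant, could be modeled like this (a sketch with hypothetical names, not Impela's actual schema):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class SupportGrant:
    item_ids: frozenset      # exactly which sessions/memories are shared
    expires_at: datetime     # hard cap, e.g. up to seven days out
    revoked: bool = False

    def allows(self, item_id: str, now: datetime) -> bool:
        # Access requires all three: not revoked, inside the window,
        # and the specific item was actually granted.
        return (not self.revoked
                and now < self.expires_at
                and item_id in self.item_ids)

now = datetime.now(timezone.utc)
grant = SupportGrant(frozenset({"session-42"}), now + timedelta(days=7))

assert grant.allows("session-42", now)      # in scope, in window
assert not grant.allows("session-99", now)  # never granted
grant.revoked = True
assert not grant.allows("session-42", now)  # revocation cuts access off
```

Because the check runs on every read, expiry and revocation need no cleanup step: an expired grant simply stops answering yes.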

Memory you can see, edit, and own

Here’s the part most AI products get wrong.

To coach you well, Impela has to remember a lot — what you’ve said, what you value, what patterns it’s noticed, what you’re working on. Without memory, every session starts from zero and nothing compounds.

But memory you can’t see is untrustworthy. Most AI tools have opaque memory: the system has formed views of you that you have no way to inspect or correct. Impela does it differently.

Everything is visible. Every memory the system holds about you is on a page you can open. There’s nothing hidden in a profile we keep but you don’t see.

Everything is sourced. Every memory shows where it came from: which session, which message, what reasoning produced it, and how confident the system is. You can trace any belief back to the conversation that produced it.

Everything is versioned. When a memory updates — because new evidence came in, or because you corrected it — the old version isn’t overwritten. It’s preserved as a prior version, linked to the new one. You can see what Impela used to believe about you and why it changed its mind.
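A sketch of what a sourced, versioned memory record might look like (field names are assumptions for illustration): each version carries its provenance, and an update links back to the prior version instead of replacing it.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Memory:
    text: str
    source_session: str                # which conversation produced it
    confidence: float                  # how sure the system is
    prior: Optional["Memory"] = None   # full history, never overwritten

v1 = Memory("Prefers morning deep work", "session-12", 0.6)
v2 = Memory("Prefers morning deep work, except Mondays",
            "session-30", 0.8, prior=v1)

assert v2.prior is v1                             # old belief preserved
assert v2.prior.source_session == "session-12"    # and still traceable
```

Walking the `prior` chain from any memory recovers the entire history of that belief, which is what makes "why did it change its mind" answerable at all.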

You can edit it. Anything you don’t agree with, you can change. Anything you don’t want stored, you can delete.

The AI can’t overwrite what you’ve edited. This is a structural invariant. If the AI’s view of you ever conflicts with something you’ve explicitly set, the AI does not silently update. It proposes a change — visible to you as a pending review — and waits. You accept, edit, or reject. If you reject, the AI doesn’t re-propose the same change for at least thirty days. This is enforced at the data layer, not in the AI’s prompt — it’s not something the system can be talked into ignoring.
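Enforcing that invariant at the data layer amounts to a check in the write path itself. A sketch (class and field names are hypothetical, and the real cooldown presumably tracks the specific rejected change, which this simplifies to per-memory):

```python
from datetime import datetime, timedelta, timezone

COOLDOWN = timedelta(days=30)

class MemoryStore:
    def __init__(self):
        self.values = {}        # memory_id -> (text, user_edited)
        self.pending = {}       # memory_id -> proposed text awaiting review
        self.rejected_at = {}   # memory_id -> when the user last rejected

    def ai_update(self, memory_id, text, now):
        _, user_edited = self.values.get(memory_id, ("", False))
        if not user_edited:
            self.values[memory_id] = (text, False)   # free to update
            return "applied"
        last = self.rejected_at.get(memory_id)
        if last is not None and now - last < COOLDOWN:
            return "suppressed"                      # inside the cooldown
        self.pending[memory_id] = text               # waits for user review
        return "proposed"

    def user_reject(self, memory_id, now):
        self.pending.pop(memory_id, None)
        self.rejected_at[memory_id] = now

store = MemoryStore()
now = datetime.now(timezone.utc)
store.values["m1"] = ("Values autonomy", True)       # user-edited, locked

assert store.ai_update("m1", "Values security", now) == "proposed"
store.user_reject("m1", now)
assert store.ai_update("m1", "Values security", now) == "suppressed"
assert store.ai_update("m2", "New observation", now) == "applied"
```

Because the gate lives in `ai_update` rather than in a prompt, no instruction to the model can route around it: a write to a user-edited memory can only become a pending proposal.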

Old versions stay in your record. When a memory is replaced, the prior version isn’t deleted — it’s preserved alongside the current version, with the full history of when it changed and why. You can see it any time in your data export.

Your story is yours. Impela can produce a narrative version of your structured memory — a prose summary of how the system understands you. The narrative is built entirely from your structured memories — every fact in it has a corresponding memory in your record that you can see and edit. And when you rewrite a section in your own words, your version is locked from AI override on the next regeneration. The user’s version of their own story wins.

Personality, visible too

If Impela is going to form a view of your personality, you should be able to see what view it’s formed.

The Personality view shows each trait, the confidence level, and the specific evidence — which sessions and observations contributed. If you’ve uploaded a validated assessment, that’s visible too, with how it factored in. You can retake or re-upload anytime.

There’s no black-box profile that gets used in coaching but never shown to you.

How it’s protected

The mechanical layer:

  • Encryption in transit. Everything between your device and our servers travels over TLS.
  • Encryption at rest. Sensitive data is encrypted in the database, including conversation content.
  • Password hashing. Passwords are cryptographically hashed — we don’t store anything readable, even internally.
  • Access controls. Production access is restricted to essential service operations and audited.
  • Cloud infrastructure with enterprise-grade certifications. No system is uncrackable, but the setup is at the level a careful engineer would design.

These are table stakes. We mention them because if they weren’t here, the rest wouldn’t matter.

What you can do, anytime

  • See everything. Your profile, your conversations, your memories, your patterns, your story, your personality view — all visible from inside the app.
  • Edit any of it. Anything Impela has wrong, you correct. Edits to your story are locked from AI override.
  • Delete any of it. A specific session, a goal, a person you don’t want stored, a pattern observation — delete it.
  • Export everything. A one-click download of all your data, available in Settings.
  • Opt out of features. Turn off pattern detection, turn off memory storage, opt out of follow-up SMS — granular controls, not all-or-nothing.
  • Delete your account. Permanent, complete erasure from our active systems. Not “marked as deleted” — actually gone.

Who we work with

A few subprocessors are necessary to operate Impela. Each is under binding data-protection terms.

  • AI provider (Anthropic and/or OpenAI for coaching responses). Conversation content is processed to generate responses but is not retained for training.
  • Email and SMS delivery (Resend for email, Twilio for SMS). Used only to send notifications you’ve opted into.
  • Billing (Stripe). Handles payment processing. We don’t store credit card numbers; Stripe does.
  • Cloud hosting (Vercel for the application, managed Postgres for data). Standard infrastructure.

That’s it. No analytics trackers, no advertising pixels, no third-party retargeting.

What happens if Impela goes away

If we’re acquired, the acquirer inherits the same privacy commitments. If your data would be used differently under new ownership, you’ll get notice and the chance to export and delete before the transition.

If we shut down, you’ll get notice in advance, the ability to export your data, and a permanent deletion of everything that remains.

You’re not trapped here.

Bottom line

Most AI products treat privacy as something they have to say. We treat it as something we have to build. Memory is visible, editable, versioned. Humans can’t read your data without your time-bounded permission. The AI can’t quietly form a view of you that you’d disagree with if you saw it. Your data is yours, and leaving with all of it is one click away.

For the legal version of all this — the binding policy, the formal commitments, the regulatory specifics — read our Privacy Policy.
