We Open-Sourced Our Python SDK.
Here's How We Built It.
It started as six files tucked inside our Telegram bot's repo. A thin wrapper around httpx so we'd stop writing the same API calls by hand. Then we realised: if we needed this, other developers integrating FaceVault probably do too. So we extracted it, added types, wrote tests, and published it to PyPI. Now at v1.0.0 with full trust engine, proof of address, identity credentials, and sanctions screening support. This is the story of that process — and what's new.
Why Build an SDK at All
The FaceVault API is simple. Create a session, get a URL, check the result. Three endpoints cover 90% of use cases. You could integrate it with raw requests or fetch in an afternoon.
So why wrap it in a library?
Because the API is only part of the story. There's the auth header you have to remember (X-FaceVault-Api-Key, not X-API-Key, not Authorization: Bearer). There's the webapp URL you have to construct from the session ID and token. There's the webhook signature you have to verify with HMAC-SHA256 over a canonicalised JSON body. There are the error responses that come back as {"detail": "..."} sometimes and {"error": "..."} other times, depending on whether FastAPI or our middleware caught it first.
None of these are hard. All of them are things you'd rather not think about. An SDK is a promise: we've thought about it so you don't have to.
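It's worth seeing what "we've thought about it" actually replaces. Here's a stdlib-only sketch of verifying a webhook signature by hand, assuming canonicalisation means re-serialising the JSON body with sorted keys and no extra whitespace (check the API docs for the exact scheme FaceVault uses); verify_webhook_by_hand is a hypothetical name, not part of the SDK:

```python
import hashlib
import hmac
import json

def verify_webhook_by_hand(body: bytes, signature: str, secret: str) -> bool:
    """Sketch of raw HMAC-SHA256 webhook verification.

    Assumes canonicalisation means re-serialising the JSON body with
    sorted keys and no extra whitespace; the real scheme may differ.
    """
    canonical = json.dumps(json.loads(body), sort_keys=True, separators=(",", ":"))
    expected = hmac.new(secret.encode(), canonical.encode(), hashlib.sha256).hexdigest()
    # compare_digest avoids leaking timing information through string comparison
    return hmac.compare_digest(expected, signature)
```

Every integrator would have to get each of these details right, including the constant-time comparison. That's the boilerplate the SDK's webhook helpers absorb.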
The Embedded Phase
The SDK started life as a facevault/ directory inside our Telegram bot monorepo. Six files:
facevault/
├── __init__.py # exports, version
├── _client.py # sync client (httpx)
├── _async_client.py # async client (httpx)
├── models.py # dataclasses
├── exceptions.py # typed errors
└── webhook.py # HMAC verification
The Dockerfile just did COPY facevault ./facevault and Python's import system handled the rest. It worked. For months, it was fine. The bot imported from facevault import AsyncFaceVaultClient, created sessions, and everybody was happy.
But as we started thinking about other developers integrating FaceVault — people building their own bots, their own backends, their own verification flows — the embedded SDK became a problem. You can't pip install a subdirectory of someone else's private GitLab repo. And asking developers to copy six files into their project and keep them updated manually is how you get ten incompatible forks.
Design Decisions That Mattered
Before extracting the SDK, we made a few deliberate choices. Some of them were obvious in hindsight; others came from getting things wrong first.
1 httpx, not requests
Our bot is async (aiogram). Using requests would mean blocking the event loop on every API call, or doing the asyncio.to_thread() dance. httpx gives us both sync and async clients with an almost identical API surface. One dependency, both worlds.
We pin httpx>=0.24,<1 — wide enough to not cause conflicts in other people's projects, narrow enough that we won't break on a major version bump.
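To make the trade-off concrete, here's a stdlib-only sketch of the to_thread dance a blocking HTTP client forces on async code. fetch_session_blocking is a stand-in for a blocking requests call, not a real SDK function; with httpx's async client, none of this ceremony is needed:

```python
import asyncio
import time

def fetch_session_blocking(session_id: str) -> dict:
    # Stand-in for a blocking requests.get(...) call
    time.sleep(0.01)
    return {"session_id": session_id, "status": "completed"}

async def fetch_session(session_id: str) -> dict:
    # The to_thread dance: push the blocking call onto a worker thread
    # so the event loop keeps serving other coroutines in the meantime.
    return await asyncio.to_thread(fetch_session_blocking, session_id)

result = asyncio.run(fetch_session("sess-123"))
```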
2 Dataclasses, not dicts
Every API response is parsed into a typed dataclass. Session, SessionStatus, WebhookEvent. No reaching into nested dicts. No data["session_id"] and hoping the key exists. Your editor's autocomplete shows you exactly what fields are available.
We considered Pydantic, but it felt heavy for what we needed. Dataclasses are stdlib, zero dependencies, and they work with py.typed out of the box. The SDK has exactly one dependency: httpx. That's it.
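As an illustration of the approach (not the SDK's actual model definitions), a minimal Session dataclass with a tolerant constructor might look like this. The tolerance matters: dropping unknown keys means new API fields don't break older SDK versions:

```python
from dataclasses import dataclass, fields
from typing import Any, Optional

@dataclass
class Session:
    """Illustrative shape; the real SDK model may carry more fields."""
    session_id: str
    webapp_url: str
    status: str = "pending"
    trust_score: Optional[int] = None

    @classmethod
    def from_dict(cls, data: dict[str, Any]) -> "Session":
        # Keep only the keys the dataclass knows about, so fields added
        # to the API later don't crash an older client.
        known = {f.name for f in fields(cls)}
        return cls(**{k: v for k, v in data.items() if k in known})
```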
3 Typed exceptions, not status codes
A 401 raises AuthError. A 404 raises NotFoundError. A 429 raises RateLimitError. Everything else raises FaceVaultError with the status code attached. You catch what you care about and let the rest bubble up.
This sounds obvious, but it changes how you write integration code. Instead of checking response.status_code == 401 in every call site, you wrap the whole thing in a try/except and handle each case once. Less code, fewer places to forget.
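A minimal sketch of how such a hierarchy can be wired up. The class names match the ones described above, but the implementation here is illustrative, not the SDK's actual code:

```python
from typing import Optional

class FaceVaultError(Exception):
    """Base error; carries the HTTP status code when one is available."""
    def __init__(self, message: str, status_code: Optional[int] = None):
        super().__init__(message)
        self.status_code = status_code

class AuthError(FaceVaultError): ...
class NotFoundError(FaceVaultError): ...
class RateLimitError(FaceVaultError): ...

_STATUS_MAP = {401: AuthError, 404: NotFoundError, 429: RateLimitError}

def raise_for_status(status_code: int, message: str) -> None:
    # Map known status codes to typed errors; everything else falls
    # through to the base class with the code attached.
    if status_code < 400:
        return
    exc_class = _STATUS_MAP.get(status_code, FaceVaultError)
    raise exc_class(message, status_code=status_code)
```

Call sites then catch AuthError once at the edge of the integration instead of checking status codes everywhere.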
4 The webapp URL is built for you
When you call create_session(), the response includes a webapp_url that's ready to send to your user. The SDK constructs it from the session ID and token so you never have to remember the URL format. This is the kind of small thing that saves people from filing a support ticket at 2am.
Extracting to a Standalone Package
The extraction process was mechanical, but getting the packaging right took some care. Python packaging in 2026 is better than it's ever been — but "better" still means there are four build backends, three ways to specify dependencies, and a PEP for every permutation.
We settled on:
src/ layout
The package lives in src/facevault/, not at the repo root, with PEP 621 metadata in pyproject.toml. This prevents accidental imports of the local source when you're running tests — you always test the installed package.
Hatch as build backend
Dynamic version read from __init__.py. One source of truth. Bump the version string, push, build. No syncing between files.
py.typed marker
An empty file that tells mypy and pyright "this package ships inline type annotations." No stubs needed, no types-facevault package. Your editor just works.
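Putting those three decisions together, the pyproject.toml for such a package would look roughly like this. This is an approximate sketch, not the SDK's actual file:

```toml
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
name = "facevault"
dynamic = ["version"]
requires-python = ">=3.9"
dependencies = ["httpx>=0.24,<1"]

# Hatch reads __version__ from the package itself: one source of truth
[tool.hatch.version]
path = "src/facevault/__init__.py"
```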
On the bot side, the migration was a one-line change. Replace COPY facevault ./facevault in the Dockerfile with facevault>=0.1.0 in requirements.txt. The import paths stayed exactly the same. from facevault import AsyncFaceVaultClient worked before, works after. No code changes in the bot itself.
What It Looks Like in Practice
Install it:
pip install facevault

Create a verification session (sync):
from facevault import FaceVaultClient
client = FaceVaultClient("fv_live_your_api_key")
# Create a verification session
session = client.create_session(external_user_id="user-123")
print(session.webapp_url) # send this to your user
# Or require proof of address for this session
session = client.create_session(
    external_user_id="user-456",
    require_poa=True,
)
# Check results — trust engine gives you a single score
status = client.get_session(session.session_id)
print(status.trust_score) # 0-100
print(status.trust_decision) # "accept", "review", "reject"
print(status.face_match_passed)
client.close()

Or async, if you're in an aiogram/FastAPI/aiohttp context:
from facevault import AsyncFaceVaultClient
async with AsyncFaceVaultClient("fv_live_your_api_key") as client:
    session = await client.create_session(external_user_id="user-123")
    print(session.webapp_url)

Verify a webhook:
from facevault import verify_signature, parse_event
body = request.body
signature = request.headers["X-Signature"]
if verify_signature(body, signature, secret="whsec_your_secret"):
    event = parse_event(body)
    # Trust engine decision — one field, no guesswork
    if event.trust_decision == "accept":
        print(f"User {event.external_user_id} verified!")
    elif event.trust_decision == "review":
        print(f"Manual review needed — score: {event.trust_score}")
    else:
        print(f"Rejected — sanctions hit: {event.sanctions_hit}")

That's the entire surface area. Two clients, three models, four exceptions, two webhook helpers. No configuration objects, no builder patterns, no middleware chains. You import what you need and call the method. The trust engine does the thinking — you just read the decision.
Why Open Source
This one's easy. An SDK is a trust boundary. When you install a package that handles your API keys and your users' verification data, you should be able to read the source. All of it. Not a minified blob, not a compiled extension — plain Python files you can open in your editor and understand in 15 minutes.
The SDK is MIT-licensed. Fork it, vendor it, modify it, ship it. If our default client doesn't fit your architecture, subclass it. If you want to add retry logic or caching, the _raise_for_status method is right there to extend. We're not precious about it.
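For instance, retry logic doesn't have to touch the SDK's internals at all. Here's a stdlib-only sketch of wrapping calls with exponential backoff; with_retries is a hypothetical helper, and in practice you'd pass the SDK's RateLimitError as retry_on rather than the broad default:

```python
import time

def with_retries(call, *, attempts=3, backoff=0.5, retry_on=(Exception,)):
    """Hypothetical helper: retry a zero-argument callable with backoff.

    retry_on should be a narrow tuple of exception types, e.g. the
    SDK's RateLimitError; the broad default here is for illustration.
    """
    for attempt in range(attempts):
        try:
            return call()
        except retry_on:
            if attempt == attempts - 1:
                raise  # out of attempts: let the last error bubble up
            time.sleep(backoff * (2 ** attempt))
```

Usage would be with_retries(lambda: client.get_session(session_id), retry_on=(RateLimitError,)).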
There's also a practical reason: open-source SDKs get bug reports faster. If a developer hits a weird edge case with our webhook signature verification, they can open an issue or a PR instead of emailing support and waiting for us to reproduce it. The feedback loop is tighter. The code gets better.
What's New in 1.0.0
The SDK started at v0.1.0 with the basics: create a session, check the status, verify a webhook. That was enough for our Telegram bot. It wasn't enough for the platform FaceVault has become. v1.0.0 brings the SDK into full parity with the API — every signal, every score, every decision the engine makes is now a typed field on your response objects.
Trust engine
status.trust_score (0–100) and status.trust_decision ("accept", "review", "reject") on every session. One number, one decision. No more parsing face_match + anti_spoofing + document_fraud yourself.
Proof of address
create_session(require_poa=True) enables per-session proof of address collection. The status.poa dict gives you extraction results, name matching, and fraud scoring.
Identity credentials & sanctions
status.credential for reusable identity credentials (verify once, prove forever). event.sanctions_hit on webhooks. The full anti-spoofing breakdown lands in status.anti_spoofing.
Security audit
Full security audit before going open source. FaceVaultClient.__repr__ now redacts the API key. get_session() validates session IDs against path traversal. verify_signature() handles None gracefully instead of crashing.
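To illustrate the repr redaction, a minimal sketch (not the SDK's actual implementation) might look like this:

```python
class FaceVaultClient:
    """Minimal sketch showing only the repr-redaction behaviour."""

    def __init__(self, api_key: str):
        self._api_key = api_key

    def __repr__(self) -> str:
        # Show just enough of the key prefix to identify it,
        # never the secret itself (as it would end up in logs).
        if len(self._api_key) > 8:
            redacted = self._api_key[:8] + "..."
        else:
            redacted = "***"
        return f"FaceVaultClient(api_key='{redacted}')"
```

The point is that repr(client) can now land in a traceback or a log line without leaking the credential.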
The upgrade is non-breaking. Every new field has a default value. Your existing code keeps working — you just get more data when you want it.
Go Build Something
The SDK is live on PyPI and the source is on GitHub. Install it, read it, break it, improve it. If you're building a verification flow — whether it's a Telegram bot, a Django backend, a FastAPI service, or something we haven't thought of yet — this is the fastest way to get started.
And if you build something cool with it, let us know. We'd love to see it.
Links
facevault-python on GitHub — source code, MIT license
facevault on PyPI — pip install facevault
FaceVault API Docs — quickstart guide and API reference
From Zero to First Verification in 10 Minutes — step-by-step tutorial with curl, Python, and Node.js