Meetings are where decisions happen. But the knowledge disappears.
Vernix exists to fix that.
Vernix started with a question: What if an AI agent could sit in your meeting, build a memory of everything said, and answer questions on the spot?
Not a recording you never rewatch. Not notes biased toward whoever happened to be typing. An actual participant that listens, understands, and remembers — across every call you have.
The idea came from a hackathon. The first version was rough: a bot that joined Google Meet calls, transcribed them in real time, embedded every sentence into a vector database, and answered questions by voice. It worked. People asked it things during live meetings and got answers grounded in what had actually been said — not just in the current call, but across their entire meeting history.
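The core loop of that first version (embed each transcribed sentence, then answer questions by retrieving the closest matches) can be sketched in miniature. Everything below is an illustrative toy, not Vernix's actual implementation: the bag-of-words "embedding" and the `MeetingMemory` class stand in for a learned sentence-embedding model and a real vector database.

```python
import math
from collections import Counter


def embed(sentence: str) -> Counter:
    # Toy "embedding": a term-frequency vector over lowercase tokens.
    # A real system would use a learned sentence-embedding model.
    return Counter(sentence.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-frequency vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


class MeetingMemory:
    """Minimal in-memory stand-in for a vector database."""

    def __init__(self):
        self.entries = []  # (meeting_id, sentence, vector)

    def add(self, meeting_id: str, sentence: str) -> None:
        # Index each transcribed sentence as it arrives.
        self.entries.append((meeting_id, sentence, embed(sentence)))

    def search(self, query: str, k: int = 1):
        # Return the k sentences most similar to the query,
        # across every meeting ever indexed.
        qv = embed(query)
        ranked = sorted(self.entries, key=lambda e: cosine(qv, e[2]), reverse=True)
        return [(meeting, sent) for meeting, sent, _ in ranked[:k]]


memory = MeetingMemory()
memory.add("standup-04-01", "We decided to ship the billing fix on Friday")
memory.add("standup-04-02", "Design review moved to next Tuesday")
print(memory.search("when do we ship the billing fix"))
```

In a live pipeline the retrieved sentences would then be handed to a language model to ground its spoken answer; the sketch stops at retrieval, which is the part that makes answers span the whole meeting history rather than just the current call.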
That was the moment it clicked. Meetings are the primary decision-making channel for most teams, but the knowledge generated in them is trapped — in people's heads, in forgotten recordings, in notes nobody reads. Decisions, context, and action items disappear the moment the call ends.
Vernix turns that around. Every conversation becomes permanent, searchable, actionable knowledge. Summaries write themselves. Action items are extracted automatically. Your documents become part of the agent's context. And you can ask it anything — during the meeting or after — and get an answer grounded in what was actually said.
The vision
Vernix becomes your organization's collective brain. Not just meeting notes — meeting intelligence. Every decision, discussion, and insight permanently searchable and actionable.
We're building this in the open, one feature at a time. Voice agent, silent mode, knowledge base, cross-meeting search, MCP integration — each piece makes the whole thing smarter.
Built by
Tim Borovkov
Founder & developer · @timborovkov on GitHub
Built with ♥ in Europe.
Want to see what Vernix can do?
Try Vernix Free