We’re a small team of engineers and lawyers who got tired of reading 50-page contracts line by line. Reviewing NDAs, SaaS agreements, and employment contracts was slow, repetitive, and error-prone; even with templates, humans miss things.
Why we built it
Last year, we were working with a few early-stage startups that were drowning in vendor and customer contracts. Lawyers were expensive; founders didn’t have time; and the AI tools available at the time were either too generic (“chat with your PDF”) or too opaque to trust. So we built a tool that:
- Understands legal context (not just raw text)
- Explains clauses clearly (“This indemnity clause exposes you to X risk”)
- Highlights risks, missing terms, and inconsistencies
- Lets you compare versions to see what changed
Under the hood:
- We use LLMs fine-tuned on contract-specific datasets (public agreements + anonymized real samples).
- A clause-level parser breaks down contracts into semantic sections (e.g., Termination, Liability, IP).
- We run a hybrid rule-and-LLM pipeline: rules handle structure and entities; the model handles nuance and reasoning (a toy sketch of this split follows below).
- Everything is stateless and localizable, with optional on-prem deployment for privacy-sensitive orgs.
- We also built a comparison engine that highlights redlines across different drafts, useful for negotiation (a small diff example follows below).
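To make the clause split and the rule/LLM hand-off concrete, here is a toy sketch in Python. It is heavily simplified and not our real code: it assumes headings mark clause boundaries, and the function names and the `model` argument are placeholders for whatever text-in/text-out LLM call you have.

    # Toy sketch of clause-level splitting plus a hybrid rule/LLM pass.
    # Simplified for illustration; not the production pipeline.
    import re
    from typing import Callable, Dict, List

    # Rule side: headings that commonly open a clause in NDA/SaaS-style agreements.
    CLAUSE_HEADINGS = [
        "Term and Termination", "Termination", "Limitation of Liability",
        "Indemnification", "Intellectual Property", "Confidentiality",
    ]
    HEADING_RE = re.compile(
        r"^\s*(?:\d+\.\s*)?(" + "|".join(re.escape(h) for h in CLAUSE_HEADINGS) + r")\b",
        re.IGNORECASE | re.MULTILINE,
    )

    def split_into_clauses(text: str) -> Dict[str, str]:
        """Rule-based pass: cut the contract into named sections at headings."""
        hits = list(HEADING_RE.finditer(text))
        sections: Dict[str, str] = {}
        for i, m in enumerate(hits):
            end = hits[i + 1].start() if i + 1 < len(hits) else len(text)
            sections[m.group(1).title()] = text[m.end():end].strip()
        return sections

    def review(text: str, model: Callable[[str], str]) -> List[dict]:
        """Hybrid pass: rules find structure, the model explains risk per clause."""
        clauses = split_into_clauses(text)
        findings = [
            {"clause": name,
             "analysis": model(f"Explain the risk this {name} clause creates "
                               f"for the customer and flag anything unusual:\n{body}")}
            for name, body in clauses.items()
        ]
        # Rules can also flag expected clauses that are missing entirely.
        for expected in ("Limitation Of Liability", "Termination"):
            if expected not in clauses:
                findings.append({"clause": expected, "analysis": "Missing from this draft."})
        return findings

The point of the split is that deterministic rules decide where each clause begins and ends, so the model only ever reasons about one clause at a time and structural checks (like a missing liability cap) never depend on the model at all.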
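The comparison engine, at its simplest, is a word-level diff over matched clauses. The snippet below is a standalone illustration using only Python’s standard library (the redline function and its markers are made up for this example); a production version would also need to align clauses across drafts before diffing them.

    # Word-level redline between two drafts of a clause, standard library only.
    import difflib

    def redline(old: str, new: str) -> str:
        """Return the new draft with insertions marked {+ +} and deletions [- -]."""
        old_words, new_words = old.split(), new.split()
        sm = difflib.SequenceMatcher(None, old_words, new_words)
        out = []
        for tag, i1, i2, j1, j2 in sm.get_opcodes():
            if tag == "equal":
                out.extend(new_words[j1:j2])
            elif tag == "insert":
                out.append("{+ " + " ".join(new_words[j1:j2]) + " +}")
            elif tag == "delete":
                out.append("[- " + " ".join(old_words[i1:i2]) + " -]")
            else:  # replace
                out.append("[- " + " ".join(old_words[i1:i2]) + " -]")
                out.append("{+ " + " ".join(new_words[j1:j2]) + " +}")
        return " ".join(out)

    print(redline(
        "Either party may terminate this Agreement with thirty (30) days notice.",
        "Either party may terminate this Agreement with sixty (60) days written notice.",
    ))

Run on the example above, this prints the second draft with “thirty (30)” struck and “sixty (60)” and “written” marked as insertions, which is the kind of change you want surfaced during negotiation.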
Nice work! I use Claude for tasks like this with a simple prompt. I’m not a lawyer, so I’m certain my current process is risky.
I think your site needs to call out that problem more clearly. The two pains you solve for me:
- Saving money on a lawyer for first-pass reviews and basic contracts
- Reducing the risk of my current “vibe redlining”
I also think you need to establish your credibility on the site. My first thought was “did some kid vibe-code this, or is this someone with an actual JD?”