How a score actually gets made.
This is the unglamorous version of what happens between "we picked a category" and "the review went live." If you ever wondered whether you can trust the numbers next to the products, this page exists for you.
Affiliate commission rates do not move scores. A product that pays us a 30% referral fee is held to the exact same rubric as one that pays us nothing. If you ever spot a review that looks suspiciously soft on a tool we make money from, email us and we will reopen the file. Our full disclosure policy has the rest.
What Claripick is, and what it is not.
Claripick is a small editorial outfit that publishes structured, opinionated reviews of B2B software. We are not a consulting firm. We do not deploy tools at customer sites. The implementation timelines you see in our reviews are built from vendor documentation, common patterns reported by real users, and our own time spent inside the product.
What we offer is structure and consistency. Every tool gets weighed against the same six categories, in the same order, with the same questions. Twenty hours of research, condensed into an honest twenty-minute read.
We use modern tools, including AI assistance, to read documentation faster and cross-reference user reviews at scale. Every review is then read, edited, and signed off by a human before it goes anywhere near publish. If a sentence sounds like a press release, we cut it.
Where the research comes from.
Vendor docs
Pricing pages, API references, onboarding guides, support articles. The bits the salesperson does not bring up on the call.
User reviews
Verified G2, Capterra, and Trustpilot reviews, read in volume. We pay particular attention to one- and two-star reviews from accounts more than 90 days old; that is where the truth lives.
Reddit and forums
r/sales, r/marketing, r/projectmanagement, vendor-specific subreddits and Slack communities. Frustration in the wild tells you a lot about what tomorrow looks like with this tool.
Pricing checks
We re-verify every price, every plan, every add-on, in the first week of each month. When a vendor moves something quietly, we note the date and the change.
Changelog reading
A vendor that ships boring infrastructure improvements every two weeks is a different animal to one shipping a new logo every quarter. We pay attention to which is which.
Integration audits
Vendors love to brag about "1,000+ integrations." We open the directory and count the ones that actually do useful work and the ones that need a paid Zapier seat.
How we weigh a score.
Every product is scored across six categories. Each category is weighted, then combined into the overall verdict you see on the review page. Higher weight on the things that bite teams in month six, lower weight on the things every vendor checks the box on.
| Category | Weight | What we are actually measuring |
|---|---|---|
| Real value at real prices | 25% | What the bill actually looks like in month 13. Hidden costs, contract flexibility, what the free tier really covers. |
| Time to first useful day | 20% | How long until the team is doing real work in this thing, not still in a setup wizard. Migration friction, training load, defaults that make sense. |
| How well it does the job | 20% | Feature coverage against the rest of the category. Depth where it matters. The features that look great in a demo and fall apart on Tuesday. |
| Support that picks up | 15% | Response times, documentation quality, whether the community forum is a graveyard. Being on a 12-hour ticket queue at 3pm on a Tuesday counts. |
| Plays nicely with others | 10% | Native integrations that work, an API that is documented and stable, the option to leave without losing your data. |
| Security and compliance | 10% | SOC 2, GDPR, data residency, SSO. Whether it is gated behind a fictional "contact sales for pricing" tier matters too. |
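For the curious, the combination step is simple arithmetic: each category score is multiplied by its weight from the table above and the products are summed. The sketch below shows that calculation in Python; the weights are the real ones, but the per-category scores and the 0–10 scale are invented for illustration.

```python
# Weights from the scoring table above (they sum to 1.0).
WEIGHTS = {
    "Real value at real prices": 0.25,
    "Time to first useful day": 0.20,
    "How well it does the job": 0.20,
    "Support that picks up": 0.15,
    "Plays nicely with others": 0.10,
    "Security and compliance": 0.10,
}

def overall_score(category_scores: dict[str, float]) -> float:
    """Combine per-category scores (illustrative 0-10 scale) into one verdict."""
    # Sanity check: the weights must cover the whole pie.
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9
    weighted = sum(WEIGHTS[cat] * score for cat, score in category_scores.items())
    return round(weighted, 1)

# Hypothetical product: strong on the job itself, weaker on support.
example = {
    "Real value at real prices": 8.0,
    "Time to first useful day": 8.0,
    "How well it does the job": 9.0,
    "Support that picks up": 6.0,
    "Plays nicely with others": 7.0,
    "Security and compliance": 8.0,
}
print(overall_score(example))
```

Note how the weighting shifts the verdict: the same six numbers averaged equally would land differently, because the categories that bite in month six count for more here.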
Updates and corrections.
- We re-verify pricing every month and stamp the date on the page.
- A meaningful product update triggers a partial rescore, usually within two weeks.
- A reader-reported error gets investigated within seven days, with a dated correction note when we change anything.
- If a vendor offers us money, free service, or "exclusive partnerships" in exchange for a softer review, we say no and write about the offer.
- We do not write sponsored reviews. If we ever do, they will be unmissably labelled and clearly separated from the editorial reviews.
Disagree with a score?
We genuinely want to hear it. If you have used a tool we reviewed and the numbers do not match your experience, write in. If we got it wrong, we will say so on the page.