How we test software

Paddy Stobbs
Co-Founder & CEO
Stackfix's team of experts curate and test software products, so you don’t have to. Here’s how.

Stackfix’s mission is to help businesses buy the right software. A key part of that is working out what software is (a) good, and (b) right for different users and use cases.

We believe that the only way to do that is via objective, independent testing of each product to produce expert insights and ratings. Here's how we do it:

1. We select the top products in each category

We pinpoint the ~50 top products in a software category (e.g. CRM), so that we can focus our testing. To do this we:

  • Hunt: We spend days searching the web and private communities (on Slack, Discord and WhatsApp) for the products that leading startups are raving about. We also track which products the best VCs and angel investors are funding.
  • Consult: We speak with 20+ members of our Product Advisory Board (a panel of operators from the leading Seed and Series A startups) to understand what they're using and what products are on their radar.

2. Our experts test each product - by interviewing experienced users and by getting our hands dirty

i. We interview experienced users of each product

We then interview independent, experienced users of each product - people who have used the product in the wild, at scale. These are not faceless stooges paid by the vendor to say glowing things - they are independent practitioners with exacting quality bars.

ii. We speak with each vendor

We also hold calls with each software vendor directly - to query functionality, pricing and their roadmap.

iii. We get our hands dirty by testing the products ourselves

Finally, we spend a minimum of 3 days (though often it's months) testing each product ourselves - putting it through its paces and seeing if it walks the walk. No quarter given.

3. We score the products and explain why

Our interviews and testing generate thousands of data points and pages of textual analysis.

For each product we score each objective feature (e.g. "Reporting") and subjective quality (e.g. "Ease-of-use") out of 10, apply a category-wide relative weighting to each score, and combine the weighted scores to create an overall score for each tool.
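The scoring step above is a weighted average. As a minimal sketch - the feature names, scores, and weights below are illustrative, not Stackfix's actual data - it might look like this:

```python
# Hypothetical sketch of the weighted-scoring step. Criterion names,
# scores, and weights are illustrative examples, not real Stackfix data.

def overall_score(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Combine per-criterion scores (out of 10) into one weighted overall score."""
    total_weight = sum(weights[k] for k in scores)
    return sum(scores[k] * weights[k] for k in scores) / total_weight

# One product, scored on two objective features and one subjective quality.
scores = {"Reporting": 8.0, "Integrations": 6.5, "Ease-of-use": 9.0}

# Category-wide relative weights, shared by every product in the category
# so that scores stay comparable across tools.
weights = {"Reporting": 0.3, "Integrations": 0.3, "Ease-of-use": 0.4}

print(round(overall_score(scores, weights), 2))  # → 7.95
```

Because the weights are fixed per category rather than per product, two tools in the same category can be compared on their overall scores directly.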

We then write an analysis of the product, calling out its strengths, weaknesses, who it is typically a good fit for, and who it is typically not a good fit for.

The cumulative analysis goes through multiple rounds of review with both Stackfix experts and our Product Advisory Board, to ensure accuracy and standardisation.

The result? A best-in-class, high quality, quantified, expert profile of a given software product.

4. Finally, we keep the data fresh

One of the magical things about software is that it evolves - in some cases very fast.

So a key part of our job is ensuring our data and analysis stay accurate, rather than drifting out-of-date.

To do this, we refresh our data at least every 3 months. Painful - but essential ✨
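The refresh rule reduces to a simple staleness check. As a sketch - the 90-day cutoff is our reading of "at least every 3 months", and the function and dates are hypothetical:

```python
# Hypothetical sketch of the "refresh at least every 3 months" rule.
# The 90-day interval and the example dates are illustrative assumptions.
from datetime import date, timedelta

REFRESH_INTERVAL = timedelta(days=90)  # roughly 3 months

def is_stale(last_refreshed: date, today: date) -> bool:
    """A product profile is due for re-testing once the interval has elapsed."""
    return today - last_refreshed > REFRESH_INTERVAL

print(is_stale(date(2024, 1, 3), date(2024, 6, 1)))  # → True (about 150 days old)
print(is_stale(date(2024, 1, 3), date(2024, 2, 1)))  # → False (within 90 days)
```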

