I run a software agency. That means I have an obvious conflict of interest in writing this post. I also have an unusual vantage point — I've watched other agencies work, I've inherited projects from agencies that failed clients, and I've talked to hundreds of founders who got burned.
So take this for what it is: a checklist from someone with skin in the game, trying to be honest about what actually matters when you're choosing a development partner. Some of this will make Teamseven look good. Some of it applies to us too. I'll try to be clear about which is which.
Why this matters more than you think
A bad agency relationship doesn't just cost money. It costs time — usually 3–6 months of it. It costs confidence — yours, your team's, your investors'. And it often leaves you with a codebase that's hard to maintain, hard to hand off, and hard to build on.
The founders who have the worst experiences aren't the ones who hired bad agencies. They're the ones who skipped proper due diligence, missed the warning signs early, and by the time they knew something was wrong, were six figures in with no clear way out.
This checklist is designed to surface problems before you sign, not after.
Before you even talk to an agency
Check 1 — Do they have verified reviews on Clutch, GoodFirms, or Trustpilot?
Anyone can put testimonials on their website. Verified reviews on third-party platforms are harder to fake. Clutch in particular verifies reviews directly with clients — many reviewers get on a call with a Clutch analyst and answer specific questions about the project.
Look for:
- Volume: 10+ reviews is meaningful. 2–3 reviews is a thin sample.
- Recency: Reviews from 3+ years ago don't tell you much about who the agency is today.
- Specificity: Generic praise ("great team, highly recommend") is less useful than specific detail ("delivered the mobile app on time, fixed three critical bugs within 24 hours post-launch").
- Response to negative reviews: How an agency responds to criticism tells you more about their character than five-star reviews.
Red flag: No third-party reviews at all, or all reviews are from the same time period (suggests a one-time review collection drive, not consistent client satisfaction).
Check 2 — Do their case studies have real outcomes?
"We built a logistics platform for a supply chain company" is not a case study. A real case study tells you what the client's problem was, what was built, how long it took, and what changed as a result.
Look for:
- Named clients (or industries with enough detail to be credible)
- Before/after metrics where relevant (time saved, costs reduced, users onboarded)
- Technical specifics (stack, architecture decisions, integration challenges)
- Timeline and budget honesty
Red flag: All case studies are screenshots and vague descriptions. No metrics. No named clients, even where the client would plausibly have given permission.
Check 3 — Is their portfolio consistent with your project type?
An agency that has built 50 e-commerce stores isn't necessarily the right agency for your enterprise SaaS. Look for work that's genuinely similar to what you're building — similar complexity, similar industry, similar user types.
Red flag: Portfolio is broad but shallow. Lots of different project types, none with depth. Or a portfolio that looks impressive but the work is all simple CRUD apps dressed up to look complex.
Check 4 — How long have they been operating?
Longevity is an imperfect signal but it's a real one. An agency that's been running for 5+ years has navigated actual challenges — client disputes, technical failures, market downturns. New agencies aren't automatically bad, but they haven't been tested in the same way.
Red flag: Agency founded within the last 12–18 months with no verifiable history of the founders working together or in the industry.
During the first conversation
Check 5 — Do they ask more questions than they answer?
In your first call with an agency, the ratio of questions to answers tells you something important. An agency that jumps straight to proposing solutions — tech stack, timeline, team composition — before deeply understanding your problem is optimizing for the sale, not the outcome.
Good agencies ask things like:
- What does success look like 6 months after launch?
- Who is your primary user and what's the most important thing they need to be able to do?
- What's been tried before and why didn't it work?
- What happens if the project runs 4 weeks over schedule?
Red flag: The first call is mostly a pitch. They're selling, not listening.
Check 6 — Do they push back on your scope?
If you describe your product and an agency just nods along, agreeing that everything you've outlined is necessary and achievable in your timeline, be careful. A good agency will challenge your scope — not to be difficult, but because they've seen how products actually get built.
Healthy pushback sounds like:
- "Do you actually need X in v1, or is that something that can wait until you have users?"
- "That integration is more complex than it looks — have you spoken to their API team?"
- "Your 8-week timeline is aggressive for this scope. Here's what I'd cut to make it work."
Red flag: Zero pushback. They agree with everything. They tell you what you want to hear.
Check 7 — Are they honest about what they don't know?
No agency knows everything. The ones who pretend otherwise are the dangerous ones. A trustworthy agency will say "I'm not sure — let me find out" rather than making something up to look competent.
Watch for overconfidence about:
- Exact timelines (complex projects have inherent uncertainty)
- Specific technologies they haven't worked with before
- Integration complexity with third-party systems they haven't used
Red flag: Confident answers to every question, including ones that require research or discovery to answer accurately.
Check 8 — Do they mention risks proactively?
Good agencies tell you what could go wrong before you sign — not to scare you off, but because they've been through enough projects to know where the bodies are buried. If an agency presents only the upside of working with them, they're either inexperienced or they're hiding something.
Things a good agency will proactively mention:
- Typical causes of scope creep in your type of project
- Risks specific to your tech stack or integration requirements
- What happens if key team members leave during the project
- What their escalation process is when something goes wrong
Red flag: No mention of risk, challenges, or what happens when things don't go according to plan.
Evaluating the proposal
Check 9 — Is the scope specific enough to hold them accountable?
A proposal should be specific enough that both parties agree on exactly what's being built. Vague scope isn't just frustrating — it's how agencies protect themselves when you inevitably disagree about whether something was included.
A good proposal includes:
- Specific features with enough detail to be unambiguous
- Explicit list of what's excluded (as important as what's included)
- Clear definition of "done" for each deliverable
- Specific timeline with milestones
- Clear payment schedule tied to milestones
Red flag: Proposal describes deliverables at a high level with phrases like "full-featured platform," "complete backend," or "all necessary integrations." These mean nothing.
Check 10 — Is the pricing model clear?
There are two main pricing models: fixed price and time & materials. Both are legitimate. Both have implications you should understand.
Fixed price: You pay an agreed amount for an agreed scope. Good for well-defined projects. Risk: if the scope isn't well-defined, you'll either get a watered-down product or pay for change orders.
Time & materials: You pay for hours worked, usually at an agreed hourly rate. Good for projects with evolving requirements. Risk: costs can exceed initial estimates if scope isn't managed carefully.
Whichever model you use, make sure you understand it fully and that it's documented clearly.
Red flag: Unclear pricing structure. Hidden costs. Vague language about what's included in the base price vs. what's extra.
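To make the trade-off concrete, here's a back-of-the-envelope sketch of how the two models diverge when a project slips. All figures are hypothetical examples, not real agency rates:

```python
# Hypothetical comparison of fixed-price vs. time & materials (T&M)
# pricing when a project runs over its original estimate.
# Every number below is made up for illustration.

def fixed_price_cost(base_price: float, change_orders: float = 0.0) -> float:
    """Fixed price: you pay the agreed amount, plus any change orders
    triggered by scope that wasn't nailed down in the proposal."""
    return base_price + change_orders

def tm_cost(hourly_rate: float, estimated_hours: float,
            overrun_pct: float = 0.0) -> float:
    """T&M: you pay for every hour actually worked, so overruns
    flow straight into your bill."""
    return hourly_rate * estimated_hours * (1 + overrun_pct)

# A project estimated at 800 hours, quoted at $60k fixed or $75/hour T&M.
print(tm_cost(75, 800))                    # 60000.0 — on budget, same as fixed
print(tm_cost(75, 800, overrun_pct=0.30))  # 78000.0 — a 30% overrun you absorb
print(fixed_price_cost(60_000, 12_000))    # 72000.0 — overrun arrives as change orders
```

The point isn't that one model is cheaper — under the same overrun, both cost more than the quote. The point is *where* the risk lands: in T&M it shows up as hours, in fixed price it shows up as change orders, which is why vague scope is dangerous under either model.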
Check 11 — Do they explain their architecture decisions?
Any proposal for a technical project should include — or at least reference — the technical approach. What stack are they recommending and why? How will they handle your specific scaling requirements? What does the deployment architecture look like?
You don't need to understand every technical detail. But you should be able to ask "why are you recommending Node.js over Python for this?" and get a real answer, not "we're most comfortable with it."
Red flag: Zero technical detail in the proposal. "We'll use the right technology for the job" without specifying what that technology is.
Check 12 — Who actually builds your project?
Many agencies sell you a senior team and deliver a junior team. Some outsource work to subcontractors without telling you. Some use your project to train new developers.
Ask directly:
- Who will be working on my project day-to-day?
- What are their experience levels?
- Can I meet them before signing?
- Do you use subcontractors?
- Will the same team work on my project for its full duration?
Red flag: Evasive answers about team composition. Reluctance to introduce you to the people who will actually do the work. Mention of "resource allocation" that suggests they assign people based on availability rather than fit.
During the project
This checklist is mostly pre-engagement, but here are three things to watch for once you've started:
Watch 1 — Are demos getting harder to schedule?
Regular demos are how you verify progress. If an agency starts making excuses for why this week's demo needs to be pushed, or if demos become infrequent, something is usually wrong.
Watch 2 — Is communication getting slower?
Response time is often a leading indicator of project health. When things are going well, communication is easy and fast. When things are going badly, agencies go quiet while they try to figure out how to explain the problem.
Watch 3 — Are change requests accumulating before anything is delivered?
Some change requests are legitimate. A pattern of change requests before any deliverable is ready often means the original scope wasn't well-understood. Figure out whether the changes are coming from genuine scope evolution or from the agency not understanding the requirements in the first place.
The meta-checklist
After going through all of this, step back and ask yourself three questions:
Do I trust these people? Not just their work, but their character. Are they honest when they don't know something? Do they tell me things I don't want to hear? Have they proactively flagged any risks?
Do they understand my business? Not just the technical requirements, but the problem you're solving, who your users are, and what success looks like for your company. The best technical execution in the world is wasted if it's solving the wrong problem.
Do their incentives align with mine? An agency is incentivized to win projects and get paid. Sometimes that aligns with your interests (they want you to be happy so you come back and refer others). Sometimes it doesn't (they want to close the deal even if they're not the right fit). Think about how they've behaved when alignment was tested — did they tell you something that risked losing the deal, or did they tell you what you wanted to hear?
One honest thing about Teamseven
We fail some of these checks too. We've had projects run over timeline. We've had communication gaps when things got complicated. We've made architecture decisions we'd make differently in retrospect.
What I can say is that when we've gotten things wrong, we've said so — and then fixed them. That's not a boast. It's just what we think the minimum standard looks like.
If you're evaluating us against this checklist, you should. Ask us the hard questions. If our answers aren't good enough, work with someone else. We'd rather lose a project than take one we can't deliver on.
Muhammad Nabeel is the co-founder of Teamseven, a software development agency based in Lahore, Pakistan, building custom software since 2017. If you want to put us through this checklist, we welcome it.