This guide is your shortcut to turning Demodesk's AI Scorecards module into an everyday coaching asset for your team. Whether you're a rep, manager, or admin, you'll learn how to get the most out of this specific module, from setup to coaching workflows.
You'll find:
Ready-to-use templates (Scorecards, QBRs, Prompts)
Role-specific playbooks for Admins, Managers, and Users
Swipeable checklists to apply today
Common pitfalls to avoid and what “good” looks like
Admins
As an Admin, your job is to set teams up for success. When scorecards are built around your company’s sales methodology and coaching needs, managers can coach more consistently, and users always know what “good” looks like.
What You’ll Do
Create custom scorecards from scratch or clone existing ones
Use the question library to pull in SPICED, MEDDIC, BANT, and more
Automate scorecard assignment by meeting type, title, or groups
Do It Today
Meet with different departments to gather scorecard needs
Configure auto-assignment rules
What a Great Scorecard Looks Like
Each scorecard should include 5–8 high-impact questions that reflect your team’s talk track, product positioning, and customer goals.
Ready to Use Scorecard Templates
💡 Tip: You can copy and paste these question sets directly into your custom scorecards in Demodesk. They’re structured for immediate use. No edits needed.
Example Structure for CSM: Upsell / Expansion Scorecard
Was a clear agenda presented at the beginning, and were expectations set and discussed with the customer?
Did they demonstrate a good understanding of the customer's needs and obstacles?
(e.g. through deep SPICED discovery or information from previous calls)
Did they pose probing questions that uncover deeper insights?
(SPICED)
Were they fully engaged and actively listening during the conversation?
Did the rep pinpoint how the customer could benefit from using your product to grow or achieve their objectives?
(based on what was discussed previously)
If objections arose, were they addressed skillfully?
Was a clear and specific next step established for future engagement?
(e.g. meeting booked, pilot start discussed, etc.)
Example Structure for AE: Closing Call Scorecard
(Use for late-stage calls focused on closing deals)
Did the AE confirm that all decision-makers have been identified and involved?
e.g., validated buying committee or confirmed economic buyer.
Did they clearly articulate ROI or success outcomes from the customer’s POV?
e.g., not just features, but measurable value.
Were pricing, contract terms, and objections handled clearly and confidently?
e.g., no ambiguity left about scope, timeline, or costs.
Was the mutual action plan updated and agreed upon?
e.g., dates, legal checkpoints, internal approvals.
Did the AE reinforce urgency tied to a business trigger or deadline?
e.g., quarter close, project start, hiring wave.
Was a clear verbal or written commitment established before the call ended?
e.g., next legal steps, verbal yes, signature timeline.
Example Structure for Marketing: Voice of Customer Review Scorecard
(Use for analyzing marketing interviews, event feedback, or VOC calls)
Was the customer’s role and context clearly established at the start?
e.g., title, team, product usage history.
Did the interviewer uncover emotional drivers or key pains?
e.g., “What frustrated you before using Demodesk?”
Were specific success stories or metrics shared?
e.g., “We saved 3 hours per week” or “Ramp time cut in half.”
Was brand language or phrasing used by the customer captured accurately?
e.g., “We love how intuitive the setup is.”
Were quotes tagged as public-safe, internal-use, or NDA-only?
e.g., helpful for content production later.
Did the interview reveal any use-case expansions or upsell opportunities?
e.g., “We’re planning to roll this out to Europe next quarter.”
Example Structure for Product: Feature Feedback Call Scorecard
(Use for product validation or beta testing interviews)
Was the problem the feature aims to solve clearly discussed?
e.g., Did the user articulate the challenge in their own words?
Did the tester use or review the feature during the session?
e.g., live walkthrough, prototype test, or screen share.
Did they identify any confusing UX elements or blockers?
e.g., misclicks, unclear labels, extra steps.
Was feedback categorized as bug, UX issue, or feature gap?
e.g., important for dev follow-up.
Were feature requests discussed in terms of impact or frequency?
e.g., “We’d use this weekly — it would save hours.”
Was the participant willing to continue in beta or refer others?
e.g., shows engagement and market fit.
Example Structure for Enablement / RevOps: Certification Call Review
(Use for final enablement check-ins or certification sessions)
Did the rep clearly articulate the product’s value proposition?
e.g., tailored to a specific persona or use case.
Were demo flows smooth and aligned with best practices?
e.g., not too technical, focused on outcomes.
Did the rep handle a mock objection confidently?
e.g., pricing pushback, competitor mention.
Was the scorecard used correctly throughout the role-play?
e.g., followed structure, asked clarifying questions.
Did the rep adapt based on customer responses or cues?
e.g., showed listening and flexibility.
Was the rep self-aware during debrief (e.g., acknowledged gaps)?
e.g., “Next time I’ll anchor pricing earlier.”
Example Structure for Support: Escalation Review Call Scorecard
(Use for QA or training on high-impact or complex support cases)
Was the issue summarized clearly and without blame?
e.g., timeline, tools involved, customer impact.
Did the agent ask precise diagnostic questions before escalating?
e.g., “Is this happening on all browsers?”
Were internal notes and tags used to document the case accurately?
e.g., engineering handoff, bug reference, workaround.
Did the agent keep the customer updated on progress and timelines?
e.g., set expectations, replied proactively.
Were empathy and professionalism maintained even during tense moments?
e.g., “I completely understand your frustration.”
Did the call end with a clear resolution or next step?
e.g., fix confirmed, ticket closed, follow-up promised.
Common Use Cases for Scorecards
1. Sales Teams
Use Case: Discovery Calls (SPICED / MEDDIC)
✅ Best practice: Use a structured scorecard to coach reps on uncovering business pain, urgency, and decision process.
❌ Pitfall: Superficial discovery without impact or next steps.
Example questions:
Did the rep uncover core pain points?
Was the impact framed in business terms?
Did the rep confirm the decision-making process?
Use Case: Demo Reviews
✅ Best practice: Evaluate if the demo connects to the customer’s story.
❌ Pitfall: Generic walkthroughs with no clear alignment.
Example questions:
Was the demo tailored to the use case?
Were objections proactively addressed?
Did the rep validate alignment mid-way?
Use Case: Closing Call Audits
✅ Best practice: Review how reps confirm stakeholders, urgency, and close deals cleanly.
❌ Pitfall: Leaving objections or pricing ambiguous.
Example questions:
Were all decision-makers confirmed?
Was pricing discussed transparently?
Were final objections resolved?
2. Customer Success Teams
Use Case: Onboarding Consistency
✅ Best practice: Ensure early alignment on goals, metrics, and milestones.
❌ Pitfall: Technical onboarding without customer context.
Example questions:
Were success metrics discussed?
Was first value clearly defined?
Were roles and responsibilities aligned?
Use Case: QBR Effectiveness
✅ Best practice: Coach CSMs to connect product impact with strategic goals.
❌ Pitfall: Relying solely on usage stats.
Example questions:
Was ROI presented using hard data?
Were risks discussed transparently?
Was a mutual action plan defined?
Use Case: Churn Risk Detection
✅ Best practice: Use scorecards to flag soft signals of dissatisfaction.
❌ Pitfall: Waiting for low usage metrics to catch issues.
Example questions:
Did the customer raise concerns or frustrations?
Was renewal discussed proactively?
Did the customer seem aligned on value?
3. Product & Marketing Teams
Use Case: Feature Rollout Monitoring
✅ Best practice: Validate how reps pitch new features and how customers respond.
❌ Pitfall: Features not mentioned or poorly positioned.
Example questions:
Was the new feature introduced?
Was the value explained clearly?
Were customer concerns noted?
Use Case: Voice-of-Customer Collection
✅ Best practice: Use comments in scorecards to gather qualitative feedback.
❌ Pitfall: Relying only on post-call summaries or CRM notes.
Example questions:
Were product requests or complaints captured?
Was feedback tagged for follow-up?
4. Enablement & RevOps
Use Case: Sales Methodology Compliance
✅ Best practice: Use scorecard data to check if reps follow SPICED, MEDDIC, or BANT.
❌ Pitfall: Teams trained but not applying methodology live.
Use Case: Training Effectiveness
✅ Best practice: Compare scores before/after enablement to measure impact.
❌ Pitfall: No follow-up measurement after workshops.
Automations
Once your scorecards are set up, the next step for any admin is to define automations that ensure the right scorecard is applied to the right meeting, without manual work.
We recommend using a combination of filters like meeting title, team group, and audience type to make sure each meeting is evaluated with the most relevant scorecard.
A well-structured automation setup might look like this:
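For example (illustrative only — the scorecard names and filter values below are placeholders to adapt to your own teams and naming conventions):

| Meeting Type | Filter | Scorecard Applied |
| --- | --- | --- |
| Discovery call | Title contains “Discovery” or “Intro” | SPICED Discovery Scorecard |
| Product demo | Title contains “Demo” | Demo Review Scorecard |
| QBR | Host is in the CSM team group | QBR Effectiveness Scorecard |
| Closing call | Title contains “Proposal” or “Contract” | AE Closing Call Scorecard |

If your meeting titles vary a lot, group-based filters make a good fallback so no meeting goes unscored.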
Managers
Scorecards aren’t just a compliance tool. They’re how you turn calls into coaching. Used consistently, they surface what top performers do well and where others need targeted support.
What You’re Aiming For
As a manager, your goal isn’t just to score calls; it’s to develop people.
Scorecards help you:
Spot coachable moments in real calls
Detect patterns across reps, teams, or call types
Evaluate consistently across different managers
Benchmark rep growth over time
Connect specific behaviors to deal progress
Best Practice: Weekly Coaching Rhythm
Establish a recurring rhythm for reviewing and using scorecards:
Review 3 calls per rep per week
Prioritize high-impact deals or missed opportunities.
Use comments to coach in context
Instead of generic feedback, leave specific notes tied to each question: “You set the agenda clearly, but missed clarifying the decision-maker — consider using the ‘authority’ prompt next time.”
Bring scorecard trends to 1:1s
Use the overall scores and repeated feedback patterns to fuel performance discussions, rather than gut feel.
Spot & scale what works
Notice that one rep excels at uncovering urgency? Tag and share their call as a best practice across the team.
Use Scorecards to Answer Strategic Questions:
| Coaching Question | How Scorecards Help |
| --- | --- |
| “Why is this deal stuck?” | Check if discovery was thorough and objections addressed |
| “Which reps need support?” | Compare average scores across criteria |
| “Is our new messaging working?” | Look at adoption rates in scored conversations |
| “Are we demoing effectively?” | Review how value was positioned and tied to needs |
Pro Tip: Mix Manager + Peer Reviews
Encourage peer scorecard reviews once a week. It helps reps develop a critical eye for call quality and creates shared accountability across the team.
Manager Actions (At a Glance)
✅ Review top & bottom-scoring calls weekly
✅ Leave contextual feedback in scorecards
✅ Tag best-in-class calls for team learning
✅ Use scorecard data to plan training sessions
✅ Coach on behavior, not just results
Users
Scorecards aren’t just for managers. They’re your personal growth toolkit.
What to Use Them For
Get specific feedback tied to moments in your own calls
Spot your strengths and areas to improve
Study what top reps do differently
Track your personal growth over time
Prep smarter for high-stakes calls or renewals
Best Practice: Build a Personal Feedback Loop
Use scorecards to develop a consistent self-review habit:
After Every Key Meeting:
Request feedback from your manager
Rewatch your call and self-score
Reflect on what you’d do differently next time
“I realized I jumped into the demo without confirming the pain point again. Next time, I’ll ask a quick check-in question first.”
Once a Week:
Review your own trends in Insights
Pick one behavior to improve (e.g. asking layered questions, setting next steps)
Try it in your next meeting and track how it went
Pro Tip: Learn by Example
Ask your manager to tag top-performing calls for you. Watch how those reps open the meeting, ask discovery questions, or handle objections. Notice what’s different, then try it out in your own calls.
User Actions (At a Glance)
✅ Self-review 1 call per week
✅ Request feedback from your manager on key deals
✅ Rewatch tagged best-practice recordings
✅ Track your own scorecard trends
✅ Reflect → improve → repeat
Explore Other Coaching & AI Modules
Looking to get more out of Demodesk’s Coaching & AI seat? Explore our other best practice guides.