How to Avoid Groundhog Day Syndrome in Software Demonstrations

As your organization’s project manager for a software selection, you’ve spent the last month or two gathering requirements, reviewing RFPs, and managing the team’s expectations. The time for the software demonstrations has finally arrived, and as you walk into the presentation room, you may think you can at last sit back with a donut and a cup of coffee and watch the show. While I don’t want to come between you and that donut…your job is far from that of a passive observer. Below are some tactics I’ve used to properly manage, evaluate, and score demonstrations:

  • Building the Script: When building your demonstration script (i.e., what the vendor will follow to show that the software can meet your organization’s needs), keep the business users in mind and make the script scenario- or business-process-focused. For example, the vendor should demonstrate taking an order from order entry through fulfillment, shipping, and invoicing. This lets the team see the system in a business process context and evaluate its overall usability.
  • Estimating the Demo Schedule: Two to four minutes per requirement is a good rule of thumb for estimating demonstration time. If the demo looks too lengthy based on the volume of requirements to score, consider moving the financials, technology, and reporting components to the week prior to the demonstration week, covered by a subset of your full evaluation team.
  • Make Demos Realistic: Send the vendor sample master data (i.e., parts, customers, suppliers) and reports a couple of weeks in advance of the demonstration so that it can be leveraged in the demonstration. This way you aren’t stuck with trying to compare your business to a bicycle manufacturer (unless you are a bicycle manufacturer).
  • Structuring the Demonstration Week: In my experience, software evaluation processes have the best results when each of the vendor finalists completes their demonstration in the same week. It can be either consecutive days or with an ‘off day’ between demonstrations, but having each vendor’s capabilities fresh in your mind allows for more efficient comparisons between software. Remember to hold the demonstrations offsite, away from your own facilities, to eliminate distractions and maximize participation. Don’t forget the caffeine!
  • Structuring Individual Vendor Demonstrations:
    • Allow the vendor 60 minutes at the start of the day to do whatever they choose (if asked, advise the vendors to focus on a brief navigational overview). Software companies invest a lot of money in training their pre-sales teams and their skills should not be taken for granted. This is also the only time the vendor should be allowed to use PowerPoint slides.
    • The bulk of the day is spent executing and grading the demonstration script. The facilitator’s job is to make sure the evaluation team is keeping up with the questions and the vendor is staying on time.
    • Use a 5/3/1 scale to score the demonstrations, where 5 = Software Exceeds Requirement, 3 = Software Meets Requirement, 1 = Software Does Not Meet Requirement/Gap. Note: do not use Yes/No for grading. The 5/3/1 scale creates a greater spread in the scores, and too many ‘Exceeds Requirement’ scores can be a bad thing.
    • Give a 30-minute break for lunch. It’s tempting to make this a working lunch, but giving everyone time to clear their heads is more beneficial than powering through the script.
    • The evaluation team should score the entire script, not just their area of responsibility. This produces a better average score, helps your company learn how inputs affect downstream processes, and gives the future leaders of your company an opportunity to think outside their silo.
    • Use a ‘Parking Lot’ when you find yourself spending too much time on a single requirement or question. It’s perfectly acceptable to not answer a question in the moment and schedule a follow-up session with the vendor.
    • Debrief as a team immediately following the demonstration to reach consensus on gaps (i.e., where the vendor did not meet the requirement) and follow-up questions. Default to the requirement’s owner when there is debate over a specific question.
  • What to Grade On
    • The software vendor’s ability to show that the system meets the specific requirement
    • If a vendor uses a third-party tool to meet a demo requirement, this is acceptable; the added cost will be reflected in the vendor’s total cost of ownership analysis (common areas are credit card processing, sales tax reporting, and manifest tools)
  • What Not to Grade On
    • Vendor claims regarding system functionality – vendors must show they can meet the requirement
    • Ability or “attractiveness” of the presenter (yes, this happens)
    • Look and feel of the system or navigation (this should be evaluated as a separate Usability survey grade) – Note: a Usability survey that asks the evaluation team to rank each software’s ‘look and feel’ should be sent after all vendors have demonstrated.
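The schedule estimate (two to four minutes per requirement) and the 5/3/1 team-average scoring above are simple enough to sketch in code. This is an illustration only: the requirement count, team size, and scores below are made-up examples, and the helper names are my own, not from any selection toolkit.

```python
# Sketch of the demo math above: a schedule estimate at 2-4 minutes per
# requirement, and a 5/3/1 score averaged across the whole evaluation team.

def estimate_demo_minutes(num_requirements, minutes_per_req=(2, 4)):
    """Return the (low, high) estimate in minutes for a demo script."""
    low, high = minutes_per_req
    return num_requirements * low, num_requirements * high

def average_score(scores):
    """Average 5/3/1 scores; every evaluator grades every requirement."""
    valid = {5, 3, 1}
    if any(s not in valid for s in scores):
        raise ValueError("scores must use the 5/3/1 scale")
    return sum(scores) / len(scores)

# A hypothetical 90-requirement script works out to a 3-6 hour demo day,
# which is why moving financials/technology/reporting earlier can help.
low, high = estimate_demo_minutes(90)
print(f"Demo length: {low / 60:.1f} to {high / 60:.1f} hours")

# One requirement scored by a six-person evaluation team:
print(average_score([5, 3, 3, 3, 1, 3]))  # 3.0
```

A quick sanity check like this also shows why Yes/No grading is weaker: a binary scale collapses ‘exceeds’ and ‘meets’ into the same score, so the averages spread far less between vendors.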

Software demonstration weeks are long and can feel like you’re in the movie ‘Groundhog Day’. As the facilitator, it is your responsibility not only to ensure the week runs smoothly, but to keep the team excited. You set the example, so do whatever it takes to keep yourself enthusiastic. The best motivation is to imagine the company six months from now, in the throes of implementation. Do you want everyone beating down your door asking “How did we ever pick this software?!” because the demonstrations went poorly and the wrong software was chosen? Probably not.
