Top Five Reasons Lead Scoring Will Fail
- Not enough leads. I read a recent MarketingSherpa article about a company that produced some awesome lead gen results without a CRM or lead scoring. You can get away with this if you're a small company starting out and you're still ramping up your demand generation efforts. You really need a large supply of leads to justify lead scoring. Lead scoring is disruptive for organizations because you're automating how leads get prioritized for follow-up. If there are not enough leads, then sales or the tele-prospecting team will jump on every lead regardless of the best lead scoring system put in place. People need to make their quotas.
- There is no end goal. Lead scoring can't be something you set up just to cross it off your marketing automation checklist. It must be set up with a goal in mind. Typically it has a dual purpose: it helps sales prioritize the follow-up of qualified leads, and it helps marketing determine the type of offers that should be sent, since lead scoring is tied to the different stages of a buyer's journey. Possible goals include increasing the velocity of leads through the sales funnel, increasing the overall revenue generated by sales, or launching a specific lead nurturing campaign geared to leads with a low lead score. You should also consider the emotional side (have you read Switch?): making sales people happy and giving more power and respect to marketing. These aren't as flashy to the CEO, but they make things more enticing for both sales and marketing. Besides winning a Markie (a special award for marketers), the biggest reward I've seen a marketer receive was recognition from the sales team for a job well done.
- Unrealistic expectations. Lead scoring is still not a science. It will not slice bread. It will not make your product more attractive. It will not generate more leads (it may actually reduce the number of leads). Many marketers and sales teams see it as the bright shiny toy on the shelf that they have to have and think it will solve all of their problems. It won't. It will prompt some great conversations and it can help you improve your lead generation engine, but that will take time. Because lead scoring is not a science, it will require some tweaking and fine-tuning. Don't expect it to move mountains after a few days.
I also see sales teams that want to take over the lead scoring discussions and build very elaborate, complicated scoring models around the traditional BANT criteria they have used for years to qualify leads. I don't recommend that lead scoring be used for that purpose, and I'm not alone (see: Why BANT No Longer Applies for B2B Lead Qualification). Automated lead scoring is best used for an initial qualification of leads that meet a minimal, agreed-upon definition of a "qualified lead" between sales and marketing, with a tele-prospecting team then doing further qualification to see if a real sales opportunity exists. At that stage, there are different tools sales can use to determine whether a prospect is really engaged or not.
- Too many lead scoring criteria at the beginning. To start off, I recommend five criteria to track the profile of an individual (job role, industry, company size, etc.) and five criteria to track engagement (recency and frequency of web visits, social media activity, email responses, form completions, etc.). Test the criteria and see if they're working after three months. You may find that you can add some additional criteria based on discussions with your sales team. Build in a schedule for a regular review of your lead scoring criteria. A minimal sketch of this kind of starter model follows after the list.
- Not enough data. I still help review about one lead scoring model a week on average, and one of the biggest issues I see is programs that score on data points that don't exist. Your lead scoring criteria can't be created in a vacuum. For example, if you're going to score leads based on industry, review your web forms to determine whether you are actually collecting that data, and review your live events to see if you ask that question at booths and/or roadshows. The score is only as good as the data you are collecting; you may even need additional data tools to clean your records and append missing information. A quick completeness check like the second sketch below can tell you whether a criterion is worth turning on.
In addition, you need to review your lead scoring model every 3-6 months to ensure that it's still producing enough quality leads. If you don't do this, sales will ignore the lead score and you will be back to square one.
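To make the "start with ten criteria" point concrete, here is a minimal sketch of what a starter scoring model could look like. The field names and point values are purely illustrative assumptions, not tied to any particular marketing automation platform; your own criteria and weights should come out of the conversation with your sales team.

```python
# A minimal sketch of a starter lead scoring model: five hypothetical profile
# criteria and five hypothetical engagement criteria. All field names and
# point values are illustrative assumptions, not a vendor's defaults.

PROFILE_CRITERIA = {
    "job_role":         {"Director": 10, "Manager": 5},
    "industry":         {"Software": 10, "Finance": 8},
    "company_size":     {"1000+": 10, "100-999": 5},
    "region":           {"North America": 5},
    "budget_authority": {"Yes": 10},
}

ENGAGEMENT_CRITERIA = {
    "web_visits_last_30_days": lambda v: 10 if v >= 5 else 3 if v >= 1 else 0,
    "email_clicks":            lambda v: 5 if v >= 2 else 0,
    "form_completions":        lambda v: 15 if v >= 1 else 0,
    "social_mentions":         lambda v: 5 if v >= 1 else 0,
    "webinar_attended":        lambda v: 10 if v else 0,
}

def score_lead(lead: dict) -> int:
    """Return a combined profile + engagement score for one lead record."""
    score = 0
    for field, points_by_value in PROFILE_CRITERIA.items():
        value = lead.get(field)              # a missing data point contributes 0
        score += points_by_value.get(value, 0)
    for field, rule in ENGAGEMENT_CRITERIA.items():
        score += rule(lead.get(field, 0))
    return score

# Example lead record
lead = {"job_role": "Director", "industry": "Software",
        "web_visits_last_30_days": 6, "form_completions": 1}
print(score_lead(lead))  # 45
```

Keeping the criteria in two small tables like this also makes the quarterly review easy: adding, retiring, or re-weighting a criterion is a one-line change.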
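And for the "not enough data" point, here is a quick sketch of a completeness check you could run before turning a criterion on. It assumes your leads can be exported to a CSV file with one column per field; the file name and field names are placeholders.

```python
# A quick sketch of a data-completeness check: what share of lead records
# actually have each field populated? Assumes a CSV export of your database.
import csv
from collections import Counter

def field_fill_rates(csv_path: str, fields: list[str]) -> dict[str, float]:
    """Return the percentage of lead records with each field populated."""
    filled = Counter()
    total = 0
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            total += 1
            for field in fields:
                if row.get(field, "").strip():
                    filled[field] += 1
    return {field: round(100 * filled[field] / total, 1) for field in fields} if total else {}

# Example: check the fields you plan to score on before you score on them.
rates = field_fill_rates("leads_export.csv", ["industry", "job_role", "company_size"])
print(rates)  # e.g. {"industry": 34.2, "job_role": 81.0, "company_size": 12.5}
```

If a field like industry is only filled in on a small fraction of records, score something else or fix your forms first.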
This is not meant to discourage you, but it's something to keep in mind when thinking about a lead scoring project. Many companies I work with concentrate on getting their database in order first, and on creating nurturing paths to help enrich it, before tackling scoring. It comes down to your priorities.
Articles to consider: Lead Scoring Has Drastically Changed – How do You Measure up?
Image courtesy of Beachhead Marketing
3 comments:
Thanks for the post Chad, I'm glad you liked the meme. Would you mind adding a do-follow link to our blog? http://land.beachheadmarketing.com/blog
Thanks,
Steven
Hi Chad. Great article. I would be curious to know how you are measuring Social Media engagement for Lead Scoring.
@Steven - done
@Kurt - Here is an example: companies are pulling in social media data where their company is mentioned. They are then matching the unique ID of the Twitter handle with an actual contact in their database. What is needed on the contact record is the Twitter ID to make that match.
Another example is "social sign-on," where you can extract data from social media tools like Facebook or LinkedIn, pull it into your database, and score on it. For example: your interests listed on LinkedIn.
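To make that matching step concrete, here is a minimal sketch. The contact records and the list of mentioning handles are hypothetical; in practice both would come from your marketing automation platform and a social listening tool.

```python
# A minimal sketch of matching social mentions back to contacts by Twitter
# handle and bumping their score. All records and point values are made up.

contacts = [
    {"email": "jane@example.com", "twitter_handle": "@jane_doe", "lead_score": 42},
    {"email": "sam@example.com",  "twitter_handle": None,        "lead_score": 17},
]

mentions = ["@jane_doe", "@someone_else"]  # handles that mentioned your company

# Index contacts by handle; contacts without a handle simply can't be matched.
by_handle = {c["twitter_handle"].lower(): c
             for c in contacts if c.get("twitter_handle")}

for handle in mentions:
    contact = by_handle.get(handle.lower())
    if contact:                      # only score contacts we can actually match
        contact["lead_score"] += 5   # illustrative point value for a mention

print(contacts[0]["lead_score"])  # 47
```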