Hone

How to Use Data to Identify Key Skills Gaps with Jenny Dearborn

What's covered

Join host Tom Griffiths in an illuminating episode featuring special guest Jenny Dearborn, a seasoned learning leader, executive, Board member, and accomplished author.

In this conversation, you'll uncover:

  • Analyzing substantial data sets to pinpoint key skills gaps
  • Crafting impactful learning interventions for measurable business outcomes
  • Aligning learning and corporate strategies to propel sales and leadership success at major entities like HP, SuccessFactors, and SAP

Unveil the art of narrating compelling stories that resonate with your executive team, showcasing learning's impact on business results.

Dive into this transformative conversation to enhance learner engagement, revolutionize learning, and elevate your organization's growth. Join us for an insightful shift in your learning and leadership paradigm.

 

About the speakers


Jenny Dearborn

Board Director | Leadership Development & Growth Advisor

Jenny Dearborn is an entrepreneurial leader, driving people transformation aligned to company strategy and objectives to achieve growth targets. She is an award-winning thought leader, speaker, and author on data-driven sales and leadership. Jenny is a painter, glass blower, metal sculptor (aka welder), film producer, beekeeper, and competitive runner, and is neurodiverse.


Tom Griffiths

CEO and Co-founder Hone

Tom is the co-founder and CEO of Hone, a next-generation live learning platform for management and people-skills. Prior to Hone, Tom was co-founder and Chief Product Officer of gaming unicorn FanDuel, where over a decade he helped create a multi-award winning product and a thriving distributed team. He has had lifelong passions for education, technology, and business and is grateful for the opportunity to combine all three at Hone. Tom lives in San Diego with his wife and two young children.

Tom regularly speaks and writes about leadership development, management training, and the future of work.

Episode transcript

Tom Griffiths

Today I'm excited to be joined by my friend and our advisor at Hone, Jenny Dearborn. Jenny is an author, an executive, an advisor, and a board member, and has really seen it all when it comes to talent development, both at the small scale with startups and at a huge global scale with multinationals like SAP.

She's a five-time Chief Learning Officer, Chief Talent Officer, and Chief People Officer at companies you may have heard of, like HP, SuccessFactors, and SAP, where she was CLO in charge of internal training for over 70,000 people globally. She's written two books: Data Driven: How Performance Analytics Delivers Extraordinary Sales Results, and The Data-Driven Leader: A Powerful Approach to Delivering Measurable Business Impact Through People Analytics.

So we're gonna dig into some exciting things around measurement. We've known each other for a few years, and she's been a tremendous help to me and Hone as we've built our company in this space. Can't wait to dig into some really meaty topics and some amazing experiences with Jenny. Jenny, thanks for joining us.

Jenny Dearborn

Thanks, Tom. Thanks for having me.

Tom Griffiths

So, Jenny, one of the things that we originally hit it off around was the subject of using data in talent development. You've obviously written a couple of books on that subject, so I know it's close to your heart. I was just curious to hear: how did that come to be?

What got you originally so interested in the uses and importance of data in talent development?

Jenny Dearborn

I know exactly the turning point, the pivotal moment, that turned me from a regular learning leader into a data-obsessed learning leader. I was at Hewlett Packard, and I was responsible for global sales enablement.

I remember very distinctly presenting at a QBR and putting up a slide; it had been hard to gather all the data for it. And it said something like, 2,000 sales reps took this one-day class and gave it a 4.5 out of five stars, and 1,500 sales reps took this two-day class and gave it a 4.7 out of five stars.

And I thought that was the end of my job, because I was presenting the volume and the transactions. I was also feeling really good because it was hard to gather that information. And the CEO at the time, Mark Hurd, stopped me mid-presentation and said, stop, stop, stop, stop right there.

Because all I know for sure from what you're presenting is that you've wasted a lot of time and a lot of money. I don't actually have any evidence that anything you do matters and makes our sales reps more productive. So why don't you go away and come back when you can actually prove that something you do matters?

And I went, oh. Oh yeah. Good idea. So I sort of slunk away and was like, how do I actually do that? How do I prove that what I do matters? Because he was just saying that nothing I did had any value. So I started to try to gather the data that he was asking for, like CRM data and all this stuff.

And I couldn't, because it was above my pay grade; I wasn't allowed to see the data. So then I was pretty sure that I couldn't, at that company, deliver the results that the CEO was asking for. So I started on a path to move to a smaller company where I would have access to the data.

To be a bigger fish in a smaller pond. So then I went to SuccessFactors, which was about 1,500 employees, to run sales enablement there, with the understanding that I would have access to all of the performance data.

The first book, Data Driven, which was published in 2015, is the story, with the names changed, of my journey through my first year at SuccessFactors. The book came about because I would go to small conferences and present. A lot of people would just stand up and say, this is what I'm working on, and here's my progress.

And there would be lots of us sharing. So I would share, like, here's my progress, and people would be like, how did you do that? I'm like, well, we just made it up and figured it out. And then I got invited to bigger and bigger conferences, and I was presenting the work I was doing on a bigger stage, and I would just get thronged afterward.

And people were asking me, where did you learn how to do that? Well, my team and I just figured it out; really, it was my team. And then conference attendees would say, can you please write it down? So that's what became book one: the journey I went through my first year at SuccessFactors.

We just sort of fictionalized that, and that became Data Driven, which is a book about how to use learning and performance improvement tools to predict the performance of sales reps. The second book uses the same algorithms, because we used some fancy math, to predict the effectiveness of leaders in leadership roles.

It was a dare. Somebody said, okay, you can do this with sales because there's so much data in sales; after a while it was like, well, obviously you can do this in sales, but can you do it in leadership? Like a dare. And I was like, I totally can. So then came the second book, The Data-Driven Leader.

That's how that story was born. And that's the story, again a fictionalized version, of what we went through at SAP.

Tom Griffiths

That's phenomenal. There's so much in that. I love it. I mean, there are so many common challenges we hear all the time. Like, I can't get access to the data, or I don't know how to prove what I'm doing.

And then, yeah, it's easy in things like sales that have got dollar numbers on them, but what about fuzzy things like leadership and management? So, yep, you've done it all. This podcast is now gonna be five and a half hours long as we dig into it all.

It would be interesting to dig a little deeper and understand a little more about the methodology, starting with the sales side: how you went about going beyond just those level-one ratings that got shot down by the CEO, and got to things that really did prove the business case, or proved to the executive leadership that this was working.

Jenny Dearborn

Well, I give a lot of credit to my partner in this; she's the math brain of it, and she's the CEO of a company called Emplay, who I partnered with. What we did was really start with looking at the CRM, and we pulled out the different stages. At the time, I think we had five sales cycle stages; every company sort of has something different in how they configure their CRM.

We would look at CRM data, and we would look at where deals fall out of the sales cycle. First of all, we had to tell everybody that we were doing this: be sure to put your deals in the CRM and put them in accurately, because garbage in, garbage out. You needed a sales team that had really good discipline in how to use the CRM.

Then we would pull CRM data on where deals were stalling, where deals would fall out of the sales cycle. Then we just studied the CRM data and found the sales reps who were the best at each stage; you know, just because somebody is great at closing, maybe they're only great at that last stage.

So we found the sales reps that were best at each stage. We did a lot of interviews of sales reps too, and a lot of shadowing, sort of like silent on the call. You know, double jack; this was before Gong, so we were double jacking on calls.

Like an archeologist: what exactly are they doing? How much do they prep? And so we really, really understood what the knowledge, skills, behaviors, competencies, and habits were at every stage of the sales cycle. Then we could say, okay, at stage one, you need these 10 skills.

At this stage, you need these 10 skills. And then we would say, okay, how do we assess for those skills? Because the skills are gonna be different at every stage of the sales cycle. So how do we assess those skills, and then how do we teach those skills? We basically took a problem and broke it apart.

We analyzed all of the pieces, like putting a thousand-piece puzzle on the ground, studying everything, and then slowly put it back together and really, really understood it coming together. Then we could really finely calibrate. And what we found were 167 variables, so, puzzle pieces, when we broke the whole thing apart, that were distinctive; some of them were the same across stages.

Then we developed training and learning, or performance support tools, or some other support mechanism like a coach, for every single stage. And then we just kept running the process until it got faster and faster, and we could shorten the sales cycle.

And then we got to the point where, I think, we had like a 5x increase in new-hire sales reps meeting quota. We just started hitting our stride, and sales reps were just crushing it. And then this learning mechanism became part of the pitch to recruit new sales reps, right?

We'd say, come to SuccessFactors. Don't go to Workday, our little itty-bitty competitor that had just started on the other side of the bay, or some other competitor; come to us. Because we can basically guarantee, not literally, but basically guarantee, that you'll make quota. From a sales rep's comp structure, they have to make quota or they're gone, right?

You know, their base is actually pretty low. So anything a company can do to help that sales rep make quota, so they're not just on their own with a laptop in their home office, then the sales rep's like, okay, I wanna go there 'cause I know I'll be successful there.

So it became part of this whole pitch for the company.
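The stage-by-stage CRM analysis Jenny walks through can be sketched in a few lines of Python. Everything here, the stage names, the field layout, and the toy deal records, is invented for illustration; her actual work pulled real CRM exports and ended up with 167 variables, not the two metrics shown:

```python
# Illustrative sketch: where do deals fall out of the cycle, and which
# reps convert best once a deal reaches the late stages? All data is made up.
from collections import Counter, defaultdict

STAGES = ["prospect", "qualify", "demo", "negotiate", "close"]

deals = [
    # (rep, last_stage_reached, won)
    ("ana", "close", True),
    ("ana", "negotiate", False),
    ("ben", "demo", False),
    ("ben", "close", True),
    ("cai", "qualify", False),
    ("cai", "close", True),
]

# Step 1: where do deals stall? Count the last stage of every lost deal.
drop_offs = Counter(stage for _, stage, won in deals if not won)

# Step 2: find the reps who are best once a deal reaches a given stage.
reached = defaultdict(int)   # rep -> deals that reached "negotiate" or later
won_from = defaultdict(int)  # rep -> deals won from those
for rep, stage, won in deals:
    if STAGES.index(stage) >= STAGES.index("negotiate"):
        reached[rep] += 1
        if won:
            won_from[rep] += 1

late_stage_rate = {rep: won_from[rep] / reached[rep] for rep in reached}
```

From there, the reps with the highest per-stage rates are the ones to shadow and interview, exactly the "find the best rep at each stage" step she describes.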

Tom Griffiths

At the high level, look at all of those benefits: not just driving sales results, but engaging employees in their own development, and even going so far as to attract talent more effectively because of this certainty you could have around the performance and the development that they would get.

It was really interesting to hear where you started as well, because when we think about how we use data in learning and development, oftentimes that's as you first described: after the program, oh, what can we go look at to see if it had an impact? But actually, you started with the data to understand where the drop-off was in the sales process.

Let's break that apart into the skills, then let's assess those skills. And only at that point do we start doing any interventions or training. Of course, there are output metrics beyond that. But it's awesome to see that you've also done that kind of upfront data analysis.

Jenny Dearborn

Yeah, and to the point where I would say to my teams: if you don't know what you're gonna measure before teaching a class or a skill or anything, don't do it. Like, if you can't say, this is red on our dashboard, we're gonna put this learning intervention in place, and then we're gonna measure this going from yellow to green; if you don't have that structure set up upfront, don't waste people's time. A lot of people don't have time to waste, but sales reps especially are on the clock. For every sales day that you waste by asking them to do a training class instead of being out in the field selling, that affects the bottom line of your company.

And then everybody's jobs are at risk because you've just wasted somebody's time. So if you're gonna call up a sales rep and ask them to stop selling to prospects because you want that sales rep to do your learning thing instead, you better damn well know, with fantastic certainty, that that thing's gonna work before you waste somebody's time.

Tom Griffiths

Ideally, when you've got the results, you can show, here's the lift in performance, and therefore it more than pays for itself, even counting the time off the floor. But how did you actually get that? Was it a phased rollout? Did you try it with a smaller group of reps first?

Jenny Dearborn

Yeah, just to get the data and then have the confidence to roll it out further. So always start with a pilot, then measure the heck out of it. So a ton of measurement upfront so you know exactly what you're expecting, then start with a small group of people who are willing, and you want it to be an even sample.

So it can't be just top performers looking to get incrementally better, because then that's gonna skew all your data. You want it to be an even sample of performance levels, but a small group, and you want a pilot short enough; really not more than a quarter: a quarter of intervention and then a quarter of results.

Sometimes when we do pilots, we're like, I don't know if it's gonna work, and I don't know how long to wait until I get the results. So you have to have that research design upfront before starting, because you just don't want to waste people's time.

Go slow up front to go fast later.
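The "even sample" Jenny insists on is, in research-design terms, a stratified sample across performance levels. A minimal sketch, with made-up rep names and quota-attainment numbers:

```python
# Illustrative sketch: pick a pilot group that evenly represents performance
# levels, so results aren't skewed by sampling only top performers.
import random

# Hypothetical reps with quota attainment (0.4 = 40% of quota, etc.).
reps = [("rep%02d" % i, attainment) for i, attainment in
        enumerate([0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 1.1, 1.2, 1.3, 1.4, 1.5])]

def stratified_pilot(reps, per_bucket, seed=0):
    """Rank reps by attainment, split into quartiles, sample each equally."""
    rng = random.Random(seed)
    ranked = sorted(reps, key=lambda r: r[1])
    quartile = len(ranked) // 4
    pilot = []
    for q in range(4):
        bucket = ranked[q * quartile:(q + 1) * quartile]
        pilot.extend(rng.sample(bucket, per_bucket))
    return pilot

pilot = stratified_pilot(reps, per_bucket=1)  # one rep from each quartile
```

Comparing this pilot group against a similarly stratified control group over one quarter of intervention and one quarter of results is the experiment design she describes.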

Tom Griffiths

Yeah, a hundred percent. And, as you say, you're doing a science experiment: you're picking a control group and a test group, and you're proving a hypothesis that this works. That's great. So I know it was a dare to move this over to leadership, and that's where our hearts are. We'd love to hear how that translated, what some of the differences or challenges of doing that were, and how you overcame them.

Jenny Dearborn

Yeah, so the harder part for leadership was: what really is success? What are the success measures? And so we were pulling a lot from the employee engagement survey.

But we would look at specific questions, because you don't wanna teach to the test, right? With sales you do, because a sale is a sale. But with leadership, a lot of things go into it; like, I'm telling people bad news, and even though I have to say it, now they're mad.

The hypothesis going in was that, even in difficult times, a good or great leader can still have good employee engagement survey results: even in bad times, even during a layoff, even in the darkest times of a company. That's the time when a leader can step up and still really perform their leadership strengths.

Then you just need to go into the survey and pull out specific questions. Taking anything at the aggregate, or the average of things, can be dangerous.

So you wanna go into very specific things. We would pull out specific questions, like trust, and we would really study how things were phrased. So: do I trust my leader to tell me the truth? Not: did I like what I heard? But: did I feel like my leader was acting, in a balanced way, in the best interests of the corporation, the employees, the customers, and the shareholders, things like that.

We made sure that we didn't factor in any questions like, am I happy with what my leader had to tell me? Not that we were in bad times at SAP, because we weren't, but we were just really careful about what the measures were.

And I'm telling the story backward, because we kind of had to start with the leadership principles of SAP. What does it mean to be a leader here? What are the leadership values? Because leadership is measured differently in different places.

So our definition of leadership at SAP was being a talent magnet, and growing and developing talent underneath you, because we had a strong culture of promotion from within. Very informed by the fact that SAP is a German company, so internal talent development and leadership development were super important.

It's not the kind of company where you're just gonna keep pulling people in from the top while people at the bottom don't move. There's this expectation that people are moving up the levels all the time. And so we had to go in and analyze: how many high performers did you have?

Did your high performers move? Did you promote them to other functional areas? We put in talent rotation programs, so there was a question like: did you encourage your top talent to leave your team and go someplace else? And were you creating high-performing people to come up underneath?

So we had to do a lot of maneuvering in our talent management systems to be able to pull the data. And, no offense to SuccessFactors, which is the tool we used, but SuccessFactors couldn't do this. So we had to build a separate data lake, pull reports, and then manipulate the data in a separate system that was custom-built for the purpose of doing the research.

And then, the research became the book.

Tom Griffiths

I think it's really interesting to look at the differences there, right? In the sales case, we're looking at pipeline data to see, through the sales stages, where the issues are, then diving into understanding competencies and training on those.

In the leadership case, it's coming more from the values of the company, the principles that are important, and then mapping those somewhat fuzzy principles to real hard data in the employee system of record to see if folks are progressing or rotating. That's really neat. Then I'm sure you were able to create some leadership development interventions and training.

We'd love to hear the end of the story in terms of how it actually played out.

Jenny Dearborn

So, similar to the sales example, we broke apart the competencies, skills, behaviors, characteristics, and habits of the best leaders.

From a data perspective, our best leaders were the ones that were talent magnets and were creating a talent pipeline of developing people while continuing to meet their business performance metrics, and they had high engagement scores. Then you went within that, and you could break down trust and other behaviors and characteristics.

And then also moved talent through, so really a talent pipeline: they had an efficient way of growing, mentoring, coaching, developing, and then exporting talent to higher roles. And so we found the leaders that were the best at that and, again, sort of shadowed and double jacked on their calls and analyzed and just researched the heck out of these best example leaders. Then we made sure that we stripped out bias and things like that, because we were so cross-cultural; we were in like 180 countries or something. And then we used those formulas and those models to inform the development of our new leadership curriculum.

So then, for the leadership program we put in place, there were five leadership levels for the 10,000 or so people managers. I think there were 6,000 first-level managers, and then it just kind of went up from there. It was a three-day face-to-face program to prep you for the level; so, for high-performing individual contributors who wanted to be ready to be people managers.

They went through that program, and then they were put in a pool, and first-level managers were interviewed and pulled from the pool of graduates of the leadership readiness program. And then they were put in the first-level manager role, and then they were put through a three-day program there.

And then, after three or so years, they could elect to go through another learning program: a readiness for a manager-of-managers program. They went through the readiness program, then they were promoted to the next level, and then they went through that level's program. So what we found was super successful was not getting the job and then learning about the job.

It was learning about the job you're trying to get, then getting the job, and then taking another learning program that says, okay, now you're in the job. Because what flamed leaders out most was: I think I want that job, and then I go get that job, and then I'm in the job, and then I'm like, oh, I don't want this job.

So what do I do to get myself ready? Like sales: before you pick up the phone and call a prospect, you do a lot of role-playing, a lot of practice. You are training; you don't go out and try to run a marathon without training. It's all of the readiness things that you can do.

Before you start prospecting, before you start managing people, how do you prepare? The most successful leaders have spent a lot of time in preparation.

Tom Griffiths

It sounds like a lot of this was done at scale, which is great, and we'll talk about that in a little bit.

But with 70,000 employees and something like 6,000 to 10,000 managers and leaders at various levels, there are good sample sizes there to be able to do a lot of this data analysis, which is great. Just curious, for some of the folks listening who've got much smaller employee populations than that.

Say a few hundred to the low thousands; they might have a hundred or a couple of hundred managers. What advice would you give them? Is anything different about doing this at a smaller scale?

Jenny Dearborn

Yeah. I think it's easier at scale.  But the steps are gonna be the same.

At a smaller size, don't just jump in with a solution: oh, I heard about this training program on the radio on my drive in, so I'm gonna implement it. Like, whoa, whoa, whoa. What are you trying to do? What are the outcomes you wanna achieve? How do you know it's broken?

What would it look like if it was fixed? Whatever it is, how are you gonna measure your success? Because regardless of the money you're spending on learning, it's people's time, and that is the most precious thing. It's heart-crushing if you waste people's time, and you're also burning your reputation, right?

You're burning the bridge, because the next time you say, hey guys, I have another training program I want you to try, they're gonna be like, ugh, here's the learning lady again, always chasing the next hottest distraction and using us as guinea pigs.

We're the employees: she doesn't know what she's doing; she's just signing us up for stuff, telling us we have to do it, and sending us all these nasty email reminders.

So the more time you spend upfront understanding the problem, taking the problem apart, understanding the knowledge, skills, behaviors, competencies, and habits that feed success and challenges, the better you'll be able to train toward those specific skills and behaviors and then put that problem back together into a learning solution. Those are the same things whether it's 10,000 people managers or a hundred people managers. Always start with a pilot.

Be able to prove the impact and the value of your pilot, and then you have credibility. You can go back to the leadership of your department or your company and say, here are the results that we have from this pilot of a mixed random group, here's their fantastic success, and I'm very confident I'm gonna get that same success level when I go from this group to the larger population.

Tom Griffiths

It's such good advice because it's the right thing to do, right? Follow that logical process. But to your point, when you get in front of executives, the story writes itself if you've followed the process.

Jenny Dearborn

You look extremely credible, ahead of, I would say, 90% of learning leaders, if you've really got that problem-solution mindset: here's the problem, and here's some preliminary evidence and data to show why we should scale the solution.

Tom Griffiths

That's such great advice. Just curious if you see common mistakes or challenges that learning leaders make when they're working with data?

Jenny Dearborn

I think, overreacting to level one data.

I feel so old sometimes; I cannot believe we are still so focused as an industry on four out of five stars. It's crazy. I remember a project at SuccessFactors; it was sales training, new-hire sales rep orientation.

Beautiful 11th floor of our San Francisco high-rise, looking out at the bay, and sales reps would fly in from all over the world. It was like 50 sales reps a week, everybody coming in for this one-week thing: a case study, guest speakers, projects and presentations and homework and all this stuff.

It was great, sales reps went off, and then we started hiring at volume. So one week it was on the 11th floor in this beautiful conference room, and the next week in the basement, which was cavernous and freezing and had no windows and all that stuff, and then back and forth, week after week.

And then there were enough of these examples that my team came to me and said, we have to stop teaching in the basement because the level one evaluation scores were so bad. And I said, well, what do you guys suggest?

And they said, we suggest slowing down hiring so that we only have the volume of people the 11th floor can hold. And I said, you seriously wanna slow down hiring sales reps? Well, we had enough data: we had like 200 sales reps that had gone through on the 11th floor and 200 that had gone through in the basement.

And I said, all right, well, let's look at the data. So then we put these two groups side by side, and we looked at discount rate, call rate, closure rate, how many other products they were putting in the bill of materials, how long the sales cycle was.

We pulled everything out of the CRM, all of the pieces. And guess what? It was exactly the same. Exactly the same. I was hoping we could say the basement group was better, and it was gonna be: you should do your training in the basement. But you couldn't tell them apart.

So then we took those 200 people and sent them a separate survey: gosh, you're crushing it in your sales numbers just like everybody else; give us feedback about your experience. And what came back was specific to the basement: it was cold. It was cold. That's why they were so mad. So yeah, level one evaluation.

And then we did this bigger research project, and we found that what really drives those level one evaluations is: was the instructor funny? Was the catering good? And was the temperature of the room comfortable? That's what drives level one evaluation scores. At the end of the day, I'm like, I don't care if you're cold.

Bring a parka; put a note in the email that says, bring a parka. You've gotta be really, really careful about what is driving those scores. Way back in the day at Hewlett Packard, the evaluation scores were fantastic, through the roof.

Because that was a day that they didn't have to be on the phones. Right. They got a break from the call center, and they loved it. They don't care. You train me on anything you want, I don't care. It's not gonna make a damn bit of difference in how I do my job. I am just so glad that I am not in the bullpen on the phones because it was seen as a vacation day.

So it's like, what are you really trying to accomplish? It can be a fallacy to overfocus on those scores, or to report them as positive if they do come back well. You've gotta go to those higher levels: level two on learning, level three on behavior change, level four on ROI, to prove to the business that it's working, and just be good at your job, so you know that the things you're spending your days on are actually worthwhile.
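The floor-versus-basement comparison boils down to putting the two cohorts' business metrics side by side. A sketch with invented numbers chosen to mirror the story (identical outcomes, wildly different ratings), not real data:

```python
# Illustrative sketch: two training cohorts, same CRM outcomes, very
# different level-one ("smile sheet") scores. All figures are made up.
from statistics import mean

floor_11 = {"closure_rate": [0.21, 0.19, 0.23, 0.20], "rating": [4.7, 4.6, 4.8, 4.7]}
basement = {"closure_rate": [0.20, 0.22, 0.19, 0.21], "rating": [2.1, 2.4, 2.0, 2.3]}

def gap(metric):
    """Absolute difference in cohort means for one metric."""
    return abs(mean(floor_11[metric]) - mean(basement[metric]))

# Business outcomes are indistinguishable between the cohorts...
outcome_gap = gap("closure_rate")
# ...while the level-one scores differ by more than two full stars.
rating_gap = gap("rating")
```

Which is exactly the point of the anecdote: the level-one scores were measuring the room temperature, not the training.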

Tom Griffiths

I love that anecdote. I guess finally, for this segment, talking of ROI at the higher level, you know, 2023 is an interesting year for businesses everywhere. What would you say it's important to keep in mind as a data-driven learning leader in 2023?

Jenny Dearborn

Yeah, I would say this is anticlimactic: I'd say stick with the basics. Like, really understand your pitch to your executive team. You want to put in an intervention: you're trying to address this particular problem or metric, and you wanna put this learning intervention or performance support tool or something in place.

The cost of this intervention is X, and the cost of the problem is this other number. The value of fixing it, like if we could shorten the sales cycle by a week, or if we could add two more products into every deal, or whatever it is you're trying to solve: be able to quantify that.

Then you should have kind of an easy story: here's the cost of this problem, here is the cost of the solution. And if you can't articulate it that way, I would say go back and do more homework. Really, in tough economic times, no stakeholder wants to hear, this will make employees happy, and this will increase engagement.

Unless you can directly tie a learning intervention to attrition, positioning it as attrition prevention. But remember: in their hearts, people come to work for purpose and meaning and for making a difference.

And a deeper level of joy comes from a job well done and workplace satisfaction. I would be really cautious about chasing happiness. Of course we want people to be happy at work, but happiness is fleeting; it comes and goes. I think it's dangerous to chase because it's almost like a drug.

You'll just have to keep amping it up to continue delighting and stimulating people. So as a learning leader, it's better to drive learning interventions toward purpose, meaning, and those deeper levels of joy at work, and that really comes from helping people become higher performers.

So I would really stick to the basics: identify the problem, the cost of the problem, and the cost of the solution or intervention, and present it that way.
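The cost-of-problem versus cost-of-solution framing described above can be sketched in a few lines of code. The function name and all figures below are hypothetical illustrations, not numbers from the conversation:

```python
# A minimal sketch of the cost-of-problem vs. cost-of-solution pitch.
# All figures are hypothetical placeholders.

def intervention_roi(problem_cost_per_year: float,
                     expected_reduction: float,
                     intervention_cost: float) -> float:
    """Return the net annual benefit of a learning intervention."""
    savings = problem_cost_per_year * expected_reduction
    return savings - intervention_cost

# Example: a one-week-shorter sales cycle is worth $500k/year,
# we expect to capture 40% of that, and the program costs $120k.
net = intervention_roi(500_000, 0.40, 120_000)
print(net)  # 80000.0
```

If the net comes out negative, that is the signal to "go back and do more homework" before pitching the executive team.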

Tom Griffiths

Well said. That will be so clear for the executive team and the CFO, who is increasingly involved in all of these decisions, if we can map it back to dollars.

I think one of the things you espouse so well is the related topic of connecting the learning strategy to the higher context of the corporate strategy. That also allows you to tell a story of how learning links to business outcomes and to get the executive team's attention.

What are the steps involved in that alignment process between corporate strategy and learning strategy?

Jenny Dearborn

Yeah. Those steps are really what motivated me to go from a learning leader, a role I loved, to realizing, hold on, learning is a piece of talent.

So then I would be a talent leader. And then, hold on, talent is part of the whole people strategy, so I went from Chief Talent Officer to Chief People Officer. But it all came from asking how to make learning more impactful, starting with the bigger-picture corporate strategy: what is the purpose of the corporation in the world, and what is the strategy to achieve that purpose?

This is all work the learning leader doesn't do; it's the work of your C-suite, your board, your CEO, or your head of corporate strategy: why the company exists in the world, the purpose of the company, and the strategy to achieve that purpose.

Then come the goals, objectives, and measurable metrics to achieve that strategy. From those business metrics, the Chief People Officer asks: what overall people goal needs to be in place to achieve those corporate goals?

And then you do your people strategy. At SAP, it was to be a talent magnet and to grow and develop talent from within; I think the tagline was something like "your best first job." To really grow people. From your people strategy comes your talent strategy, and the talent strategy includes things like compensation.

What will we pay at the 50th percentile of our peers, or the 70th, or whatever? What is the bigger talent strategy? And within talent, learning is one of the biggest pieces. In most companies, it's the largest budget and, except for recruiting, the largest number of humans within your HR team.

But in order to achieve the learning strategy, you have to see how it fits within those bigger things. And honestly, it's challenging as a learning leader if you come into a company super data-driven and organized and everybody upstream from you is a mess.

It gets really hard to do your job. People would come to me and say, here are all these challenges, and I'd tell them, you need to be in a different company. If the people at the top don't have a clear strategy, if they're barking directions that feel contradictory and flip-flopping about what market you're going to be in, it feels messy at the top. And then you don't have a clear people strategy. What's our strategy? Hire people and fire them when they're no good.

Wait, hold on, that's not a strategy. And what kind of learning do we need? Just compliance, whatever is legally required; let's try not to get sued. And you're like, um, okay.

After a while, you think, maybe I should be someplace else, because everybody upstream from me is reactive and doesn't have their piece together, and that makes it really hard to do your piece.

Because you're at the end of the process, right? Learning is the knowledge, skills, and abilities you need to achieve goals, and goals align with a strategy. If those other things aren't in place, how do you know what you should be doing? So in some ways, you find yourself in the role of asking questions.

Some of those questions may have answers, others won't, and you may have to find different pieces of the answer or make some assumptions to fill in the blanks, because there isn't a fully baked, or at least stable, strategy. If you're staying at that company, you get by by finding the commonalities between the different directions.

Tom Griffiths

How do you manage that uncertainty or vagueness, if that's possible?

Jenny Dearborn

Yeah, I would go to your stakeholder, whoever you report to and whoever signs off, and lay it out: these are the assumptions I'm working from, and this is the particular business goal I'd like to address.

If that's not clear, you don't want to insult people, but you just say, this is what I know I can control. Because I know, from industry-standard data, that a really strong employee onboarding program is going to reduce attrition at the six-month mark.

You can get that externally, right? You don't need an organized internal team to know that. As a learning practitioner, you can know which programs you can put in place and the metrics they will impact: which programs affect attrition, retention, and engagement.

You can operate without clarity above you in a lot of areas, just by doing good research and getting industry metrics from the Association for Talent Development, the Institute for Corporate Productivity, Bersin, or wherever. There are lots of things you can do to back up your programs.

Tom Griffiths

And, like you say, looking at benchmarks and how you stack up. But again, if you follow the right process, you're going to be able to tell a very powerful story and build your credibility if you're upfront, like you said, about the assumptions you're making.

The questions you'd like answers to but weren't able to get: I understand we're moving fast, but here are my assumptions, and therefore here are the programs we're initiating. Then if something changes over the course of the program, you can trace back to those assumptions and explain why you're in this situation, as opposed to being expected to predict the future and have a learning program ready for when things change.

Tom Griffiths

So I think that's great advice; so much wisdom in all of that. Jenny, I've just got a few rapid-fire questions to finish up here. The first one is a stop-start-continue. Given where we're at as an industry, I'd love to hear what you think learning leaders should stop doing right now.

Jenny Dearborn

I think we get distracted by fads. A lot of money, for learners, budgets, and organizations, is spent on the coolest, newest, jazziest thing, and it's a hundred percent a distraction. I think people naively believe it makes them look good, current, going with the latest thing, but actually it's the opposite.

And what makes you credible is following that logical process of problem, solution, and result.

Tom Griffiths

I think that's a great call-out. What are folks not doing that they should start doing?

Jenny Dearborn

Spending more time really understanding the needs of your learners and your clients. What I hear consistently is that the learning team is hiding behind the scenes, shy, worried about getting called out, and doesn't want to be out in front, because they're afraid of making a mistake or of somebody calling their bluff, that they don't know what they're doing.

They don't spend enough time getting to know their clients and their clients' needs, and they should.

Tom Griffiths

Yeah, agreed. And then, to give some credit: out in the world, what are you seeing folks do that's good, that they should definitely continue doing right now?

Jenny Dearborn

I think you guys are doing a great job with your ROI calculator and driving that with clients. I talk about you a lot when I'm out in the world with clients, even when what they're doing is something different that has nothing to do with learning.

I was doing some consulting on a sales process and a sales tool, and I said, why aren't you using an ROI calculator? This other company, a learning company, uses one. I just think it's super helpful to lead your customers to think about things differently, to make your customers smarter so they can be better advocates for themselves internally with their stakeholders.

Tom Griffiths

Right on. Thank you; that's been a real winner for us. What we're trying to do, as you know, is take a lot of this great wisdom that's out in the world and make it easy, in a productized way, for folks to deploy it and get a sense of ROI against the skills and capabilities they want to develop.

So, appreciate the reference there. Last one, and we've talked about this a bunch, so we've probably already given people a dozen tips, but what's one takeaway for a learning leader who's aspiring to progress in their career and wants to stand out to their team and their executive leadership? What's one thing they could do?

Jenny Dearborn

A super easy thing would be taking a class on how to present data visually, the visual storytelling of data and information. There are a couple of books written about it and, I think, a couple of online classes; you guys might even have a class, I don't know. When I was actively in the role, I put everybody on my team through the program, and I found they were incredibly powerful at communicating what they were trying to say to their internal stakeholders when they could do it in a visual way.

Tom Griffiths

It's such a great one, because it's how executives think. My previous company, FanDuel, had an executive team, I think half of them had been at McKinsey, and they were excellent at that and responded well to well-presented data, as do all executives. So I think that's such a great tip, especially applicable to learning and development, as we've been talking about. Jenny, thank you so much for your time. This was a phenomenal conversation with so much wisdom shared. I always learn something when we speak, so thank you for being with us today.

Jenny Dearborn

It was absolutely a blast. Thanks, Tom. It was really fun.

Tom Griffiths

Today I'm excited to be joined by my friend and our advisor at Hone, Jenny Dearborn. Jenny is an author, an executive, an advisor, and a board member, and has really seen it all when it comes to talent development, both at the small scale with startups and at a huge global scale with multinationals like SAP.

She's a five-time Chief Learning Officer, Chief Talent Officer, and Chief People Officer, including at companies you may have heard of, like HP, SuccessFactors, and SAP, where she was CLO in charge of internal training for over 70,000 people globally. She's written two books: The Data Driven Leader: A Powerful Approach to Delivering Measurable Business Impact Through People Analytics, and Data Driven: How Performance Analytics Delivers Extraordinary Sales Results.

So we're going to dig into some exciting things around measurement. We've known each other for a few years, and she's been a tremendous help to me and to Hone as we've built our company in this space. I can't wait to dig into some really meaty topics and some amazing experiences with Jenny. Jenny, thanks for joining us.

Jenny Dearborn

Thanks, Tom. Thanks for having me.

Tom Griffiths

So, Jenny, one of the things we originally hit it off around was the subject of using data in talent development. You've obviously written a couple of books on that subject, so I know it's close to your heart. I'm curious how that came to be.

What originally got you so interested in the uses and importance of data in talent development?

Jenny Dearborn

I know exactly the turning point, the pivotal moment, that turned me from a regular learning leader into a data-obsessed learning leader. I was at Hewlett-Packard, responsible for global sales enablement.

I remember very distinctly presenting at a QBR and putting up a slide it had been hard to gather all the data for. It said something like: 2,000 sales reps took this one-day class and gave it 4.5 out of five stars, and 1,500 sales reps took this two-day class and gave it 4.7 out of five stars.

I thought that was the end of my job, because I was presenting the volume, the transactions, and I was feeling really good because it had been hard to gather that information. And the CEO at the time, Mark Hurd, stopped me mid-presentation and said, stop, stop, stop right there.

Because all I know for sure from what you're presenting is that you've wasted a lot of time and a lot of money. I don't actually have any evidence that anything you do matters and makes our sales reps more productive. So why don't you go away and come back when you can actually prove that something you do matters?

And I, oh. Oh yeah. Good idea. So I slunk away thinking, how do I actually do that? How do I prove that what I do matters? He was essentially saying nothing I did had any value. So I started trying to gather the data he was asking for, like CRM data.

And I couldn't, because it was above my pay grade; I wasn't allowed to see the data. So I was pretty sure I couldn't deliver the results the CEO was asking for at that company, and I started down a path to move to a smaller company where I would have access to the data.

To become a bigger fish in a smaller pond. So I went to SuccessFactors, which was about 1,500 employees, to run sales enablement there, with the understanding that I would have access to all of the performance data.

The first book, Data Driven, published in 2015, is the story, with the names changed, of my first year at SuccessFactors. The book came about because I would go to small conferences and present. A lot of people would stand up and say, this is what I'm working on, and here's my progress.

There would be lots of us sharing, so I would share my progress, and people would ask, how did you do that? I'd say, we just made it up and figured it out. Then I got invited to bigger and bigger conferences, presenting the work on a bigger stage, and I would get thronged afterward.

People would ask where I learned how to do that. Well, my team and I just figured it out; really, it was my team. And conference attendees would say, can you please write it down? So that's what became book one: my journey through my first year at SuccessFactors.

We fictionalized that, and it became Data Driven, a book about how to use learning and performance improvement tools to predict the performance of sales reps. The second book uses the same algorithms, because we used some fancy math, to predict the effectiveness of leaders in leadership roles.

It started as a dare. People said, of course you can do this with sales, because there's so much data in sales, but can you do it in leadership? Like a dare. And I said, I totally can. So that's how the second book, The Data Driven Leader, was born.

And that story, again, is a fictionalized version of what we went through at SAP.

Tom Griffiths

That's phenomenal. There's so much in that. I love it. These are such common challenges we hear all the time: I can't get access to the data, or I don't know how to prove what I'm doing.

And it's easy in things like sales that have dollar numbers attached, but what about fuzzy things like leadership and management? You've done it all. So this podcast is now going to be five and a half hours long as we dig into it all.

It would be interesting to dig a little deeper into the methodology, starting with the sales side: how did you go beyond those level-one ratings that got shot down by the CEO, to things that really did prove the business case to the executive leadership?

Jenny Dearborn

Well, I give a lot of credit to my partner in this; she's the math brain of it, the CEO of a company called Emplay, and I partnered with her. What we did was really start with the CRM. We pulled out the different stages; at the time, I think we had five sales cycle stages, and every company configures its CRM a little differently.

We would look at CRM data to see where deals fall out of the sales cycle. First, we had to tell everybody we were doing this and make sure they put their deals into the CRM accurately, because garbage in, garbage out. You need a sales team with really good discipline in how it uses the CRM.

Then we would pull CRM data on where deals were stalling, where they would fall out of the sales cycle. We studied the data and found the sales reps who were best at each stage, because somebody who's great at closing may only be great at that last stage.

We did a lot of interviews with sales reps, too, and a lot of shadowing, sitting silently on calls, double-jacked in; this was before Gong, so we were literally double-jacking on calls.

Like archaeologists: what exactly are they doing? How much do they prep? We really, really understood the knowledge, skills, behaviors, competencies, and habits at every stage of the sales cycle. Then we could say, okay, at stage one, you need these ten skills.

At this stage, you need these ten skills. Then: how do we assess for those skills? Because the skills are different at every stage of the sales cycle. How do we assess them, and how do we teach them? We basically took a problem and broke it apart.

We analyzed all the pieces, like spreading a thousand-piece puzzle on the ground, studying everything, then slowly putting it back together until we really understood how it fit. Then we could calibrate very finely. When we broke the whole thing apart, we found 167 distinct variables, puzzle pieces; some of them were the same across stages.

Then we developed training and learning, performance support tools, or some other support mechanism, like a coach, for every single stage. And we just kept running the process, faster and faster, until we could shorten the sales cycle.

We got to the point where, I think, we had a five-x increase in new-hire sales reps meeting quota. We hit our stride, and sales reps were crushing it. Then this learning mechanism became part of the pitch to recruit new sales reps.

We'd say, come to SuccessFactors; don't go to Workday, our little itty-bitty competitor that had just started on the other side of the bay, or some other competitor. Come to us, because we can basically guarantee, not literally, but basically, that you'll make quota. From a sales rep's comp structure, they have to make quota or they're gone, right?

Their base is actually pretty low, so anything a company can do to help that sales rep make quota, so they're not just on their own with a laptop in their home office, means the rep says, okay, I want to go there, because I know I'll be successful there.

So it became part of the whole pitch for the company.
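The first step Jenny describes, pulling CRM data to see where deals stall, can be sketched as a stage-to-stage conversion analysis. The stage names, deal records, and numbers below are invented for illustration, not SuccessFactors data:

```python
# A toy sketch of stage-drop-off analysis: count how many deals reach
# each sales-cycle stage, then compute stage-to-stage conversion to see
# where deals stall. Stage names and deal records are made up.
from collections import Counter

STAGES = ["prospect", "qualify", "demo", "negotiate", "close"]

# Each CRM record: the furthest stage the deal reached.
deals = ["close", "demo", "qualify", "close", "demo",
         "negotiate", "prospect", "demo", "qualify", "close"]

furthest = Counter(deals)
# A deal that reached stage i also passed through stages 0..i-1.
reached = [sum(furthest[s] for s in STAGES[i:]) for i in range(len(STAGES))]

for i in range(1, len(STAGES)):
    rate = reached[i] / reached[i - 1]
    print(f"{STAGES[i-1]} -> {STAGES[i]}: {rate:.0%}")
```

On this toy data, the demo-to-negotiate transition has the lowest conversion, so that is the stage where you would go find the best reps, shadow them, and break down the skills involved.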

Tom Griffiths

At the high level, look at all of those benefits: not just driving sales results, but engaging employees in their own development, and even attracting talent more effectively because of the certainty you could offer around performance and development.

It was really interesting to hear where you started, too. When we think about how to use data in learning and development, oftentimes it's as you first described: after the program, what can we look at to see if it had an impact? But you actually started with the data, to understand where the drop-off was in the sales process.

Then you broke that apart into skills, assessed those skills, and only at that point started doing any interventions or training. Of course, there are output metrics beyond that. But it's awesome to see that you also did that kind of upfront data analysis.

Jenny Dearborn

Yeah, and to the point where I would say to my teams: if you don't know what you're going to measure before teaching a class or a skill or anything, don't do it. If you can't say, this is red on our dashboard, we're going to put this learning intervention in place, and then we're going to measure that it goes from yellow to green, if you don't have that structure set up upfront, don't waste people's time.

Most people don't have time to waste, but sales reps especially are on the clock. Every sales day you waste by asking them to sit in a training class instead of being out in the field selling affects the bottom line of your company.

And then everybody's jobs are at risk because you've wasted somebody's time. So if you're going to call up a sales rep and ask them to stop selling to prospects so they can do your learning thing instead, you'd better damn well know that thing is going to work.

Tom Griffiths

Ideally, when you've got the results, you can show the lift in performance and that it more than pays for the time off the floor. But how did you actually get there? Was it a phased rollout? Did you try it with a smaller group of reps first?

Jenny Dearborn

Yeah, just to get the data and then have the confidence to roll it out further. Always start with a pilot, then measure the heck out of it: a ton of measurement upfront, so you know exactly what you're expecting. Then start with a small group of people who are willing, and you want it to be an even sample.

It can't be just top performers looking to get incrementally better, because that's going to skew all your data. You want an even sample of performance levels, but a small group, and a pilot short enough, really not more than a quarter of intervention and then a quarter of results.

Sometimes when we do pilots, we don't know whether it's going to work or how long to wait for results. So you have to have that research design upfront before starting, because you just don't want to waste people's time.

Go slow up front to go fast later.
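The "even sample" Jenny recommends is essentially a stratified sample: draw the same number of pilot participants from each performance level so top performers don't skew the results. A minimal sketch, with made-up names, levels, and a hypothetical helper function:

```python
# A sketch of even-sample pilot selection: stratify reps by performance
# level so the pilot isn't dominated by top performers. All names and
# groupings are invented for illustration.
import random

reps = {
    "high":   ["ana", "bo", "cy", "dee"],
    "middle": ["eli", "fay", "gus", "hal"],
    "low":    ["ivy", "jon", "kim", "lou"],
}

def stratified_pilot(groups, per_level, seed=0):
    """Pick the same number of reps from each performance level."""
    rng = random.Random(seed)  # fixed seed keeps the draw reproducible
    pilot = []
    for level, members in groups.items():
        pilot.extend(rng.sample(members, per_level))
    return pilot

pilot = stratified_pilot(reps, per_level=2)
print(pilot)  # two reps from each level, six in total
```

The same draw, repeated on the remaining reps, would give a matched control group for the before-and-after comparison.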

Tom Griffiths

Yeah, a hundred percent. As you say, you're running a science experiment: you're picking a control group and a test group and proving a hypothesis that this works. That's great. So I know it was a dare to move this over to leadership, and that's where our hearts are. We'd love to hear how it translated, what some of the differences or challenges were, and how you overcame them.

Jenny Dearborn

Yeah, so the harder part for leadership was defining what success really is. What are the success measures? We were pulling a lot from the employee engagement survey.

But we would look at specific questions, because you don't want to teach to the test with leadership. You do with sales, because a sale is a sale. But with leadership, a lot of things go into it: I'm telling people bad news, I have to say it, and now they're mad.

The hypothesis going in was that even in difficult times, a good leader, a great leader, can still get good employee engagement survey results, even during a layoff, even in the darkest times of a company. That's when a leader can step up and really demonstrate their leadership strengths.

Then you need to go into the survey and pull out specific questions, because taking anything at the aggregate, or the average of things, can be dangerous.

So you want to go into very specific things. We would pull out specific questions, like trust, and really study how things were phrased. Do I trust my leader to tell me the truth? Not, did I like what I heard, but did I feel my leader was acting in a balanced way in the best interests of the corporation, the employees, the customers, and the shareholders?

We made sure we didn't factor in questions like, am I happy with what my leader had to tell me? Not that we were in bad times at SAP, because we weren't, but we were just really careful about what the measures were.

And I'm telling the story backward, because we really had to start with the leadership principles of SAP. What does it mean to be a leader here? What are the leadership values? Because leadership is measured differently in different places.

Our definition of leadership at SAP was being a talent magnet and growing and developing the talent underneath you, because we had a strong culture of promotion from within. SAP is a German company, so internal talent development and leadership development were super important.

It's not the kind of company where you just keep pulling people in from the top and people at the bottom don't move; there's an expectation that people are moving up the levels all the time. So we had to go in and analyze: how many high performers did you have?

Did your high performers move? Did you promote them into other functional areas? We put in talent rotation programs, so there were questions like, did you encourage your top talent to leave your team and go someplace else, and were you creating high performers coming up underneath?

We had to do a lot of maneuvering in our talent management systems to be able to pull the data. No offense to SuccessFactors, which was the tool we used, but it couldn't do this. We had to build a separate data lake, pull reports, and manipulate the data in a separate system custom-built for the research.

And then, the research became the book.

Tom Griffiths

I think it's really interesting to look at the differences there. In the sales case, you're looking at pipeline data across sales stages to see where the issues are, then diving into the underlying competencies and training on those.

In the leadership case, it comes more from the values of the company, the principles that matter, and then mapping those somewhat fuzzy principles to real, hard data in the employee system of record to see whether folks are progressing or rotating. That's really neat. And then I'm sure you were able to create leadership development interventions and training.

We'd love to hear the end of the story in terms of how it actually played out.

Jenny Dearborn

Similar to the sales example, we broke apart the competencies, skills, behaviors, characteristics, and habits of the best leaders.

From a data perspective, our best leaders were the ones who were talent magnets, creating a pipeline of developing people while continuing to meet their business performance metrics, and who had high engagement scores. Within that, you could break down trust and other behaviors and characteristics.

They also moved talent through, a real talent pipeline, with an efficient way of growing, mentoring, coaching, developing, and then exporting talent to higher roles. We found the leaders who were best at that and, again, shadowed them, double-jacked on their calls, and researched the heck out of these best-example leaders.

Then we made sure we stripped out bias and things like that, because we were so cross-cultural, operating in something like 180 countries. And we used those formulas and models to inform the development of a new leadership curriculum.

So then, the leadership program that we put in place was, for every, there were five leadership levels for the 10,000 or so people, managers. I think there were 6,000 level one first-level managers. And then it just kind of went up from there. It was a three-day face-to-face program to prep you for the level so for high-performing individual contributors that want to be ready to be people managers.

They went through that program and were put in a pool, and first-level manager openings were filled by interviewing and pulling from that pool of graduates of the leadership readiness program. Then, once they were in the first-level manager role, they went through a three-day program there.

Then, after three or so years, they could elect to go through another learning program, a readiness program for a manager-of-managers role. They went through the readiness program, were promoted to the next level, and then went through that level's program. What we found was super successful was not just getting the job, but learning about the job first.

It was learning about the job you're trying to get before you get it, and then taking another learning program that says, okay, now you're in the job. Because what we found flamed leaders out most was: I think I want that job, then I go get that job, and then I'm in the job and I'm like, oh, I don't want this job.

So what do I do to get myself ready? Like sales: before you pick up the phone with a prospect, you do a lot of role-playing, a lot of practice. You're training; you don't go out and try to run a marathon without training. It's all of the readiness things that you can do.

Before you start prospecting, before you start managing people, how do you prepare? The most successful leaders have spent a lot of time in preparation.

Tom Griffiths

It sounds like a lot of this was done at scale, which is great, and we'll talk about that in a little bit.

But with 70,000 employees and, it sounds like, 6,000 to 10,000 managers and leaders at various levels, there are good sample sizes there for a lot of this data analysis. Just curious, for some of the folks listening who've got much smaller employee populations, say a few hundred to the low thousands, with maybe a hundred or a couple hundred managers: what advice would you give them, if anything is different about doing this at a smaller scale?

Jenny Dearborn

Yeah. I think it's easier at scale.  But the steps are gonna be the same.

At a smaller size, don't just jump in with a solution. Oh, I heard about this training program on the radio during my drive in, so I'm gonna implement it. Like, whoa, whoa, whoa. What are you trying to do? What are the outcomes you wanna achieve? How do you know it's broken?

What would it look like if it was fixed? How are you gonna measure your success? Because whatever it is, regardless of the money you're spending on learning, it's people's time, and that is the most precious thing. It's heart-crushing if you waste people's time, and you're also burning your reputation, right?

You're burning the bridge, because the next time you say, hey guys, I have another training program I want you to try, they're gonna be like, ugh, here's the learning lady, always chasing the next hottest distraction and using us as guinea pigs.

We're the employees; she doesn't know what she's doing. She's just signing us up for stuff, telling us we have to do it, and sending us all these nasty email reminders. The more time you spend upfront understanding the problem, taking it apart, understanding the knowledge, skills, behaviors, competencies, and habits that feed success and challenges, then training toward those specific skills and behaviors, and then putting the problem back together into a learning solution, the better. Those are the same steps whether it's 10,000 people managers or a hundred. And always start with a pilot.

Be able to prove the impact and value of your pilot, and then you have credibility. You can go back to the leadership of your department or your company and say, here are the results from this pilot with a randomly mixed group, here's their fantastic success, and I'm very confident I'm gonna get that same level of success when I go from this random group to the larger population.

Tom Griffiths

It's such good advice because it's the right thing to do, right? Follow that logical process. But to your point, when you get in front of executives, the story writes itself if you've done the process.

Jenny Dearborn

You look extremely credible, ahead of, I would say, 90% of learning leaders, if you've really got that problem-solution mindset: here's some preliminary evidence and data to show why we should scale it.

Tom Griffiths

That's such great advice. Just curious if you see common mistakes or challenges that learning leaders make when they are working with data?

Jenny Dearborn

I think, overreacting to level one data.

I feel so old sometimes; I cannot believe we are still so focused as an industry on four out of five stars. It's crazy. So I remember a project at SuccessFactors: it was sales training, a new-hire sales rep orientation.

Beautiful 11th floor of our San Francisco high-rise, looking out at the bay, and sales reps would fly in from around the world, like 50 sales reps a week, for this one-week thing: a case study, guest speakers, projects, presentations, homework, and all this stuff.

It was great; sales reps went off. Then we started hiring at volume, so one week the program was on the 11th floor in this beautiful conference room, and the next week in the basement, which was cavernous and freezing and had no windows.

It alternated, the 11th floor one week and the basement the next. There were enough of these examples that my team came to me and said, we have to stop teaching in the basement because the level one evaluation scores are so bad. And I said, well, what do you guys suggest?

And they said, we suggest slowing down hiring so that we can keep the volume of people down. And I said, you seriously wanna slow down hiring sales reps? By then we had enough data: about 200 sales reps that had gone through on the 11th floor and 200 that had gone through in the basement.

So I said, all right, let's look at the data. We put these two groups side by side and looked at discount rate, call rate, closure rate, how many other products they were putting in the bill of materials, and how long the sales cycle was.

We pulled everything out of the CRM. And guess what? It was exactly the same. Exactly the same. I was hoping we could say the basement group was better, and the takeaway would be, you should do your training in the basement. But you couldn't tell them apart.
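That kind of cohort comparison, lining up the two training groups on the same CRM metrics, can be sketched in a few lines. This is purely an illustration: the metric names and every number below are made up, not taken from the actual SuccessFactors data.

```python
# A minimal sketch of the cohort comparison described above: compare the
# 11th-floor and basement training groups on the same CRM metrics.
# All metric names and numbers are hypothetical, for illustration only.

def mean(xs):
    return sum(xs) / len(xs)

floor_11 = {"closure_rate": [0.21, 0.19, 0.23, 0.20], "cycle_days": [42, 45, 40, 44]}
basement = {"closure_rate": [0.20, 0.22, 0.19, 0.22], "cycle_days": [43, 41, 46, 42]}

for metric in floor_11:
    a, b = mean(floor_11[metric]), mean(basement[metric])
    rel_diff = abs(a - b) / a
    print(f"{metric}: 11th floor={a:.3f}, basement={b:.3f}, relative diff={rel_diff:.1%}")
```

A near-zero difference on every metric is the "you couldn't tell them apart" result; a real analysis would also want larger samples and a significance test before concluding anything.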

So then we sent those 200 people a separate survey and said, gosh, you're crushing your sales numbers just like everybody else; give us feedback about your experience. The feedback was specific to the basement, and it was the cold. It was cold. That's why they were so mad. So yeah, level one evaluation.

Then we did a bigger research project and found that what really drives those level one evaluations is: was the instructor funny, was the catering good, and was the temperature of the room comfortable? That's what drives level one evaluation scores. At the end of the day, I'm like, I don't care if you're cold.

Bring a parka; put a note in the email that says, bring a parka. You've just gotta be really, really careful about what is driving those scores. I mean, way back in the day at Hewlett Packard, the evaluation scores were fantastic, through the roof.

Because that was a day that they didn't have to be on the phones. Right. They got a break from the call center, and they loved it. They don't care. You train me on anything you want, I don't care. It's not gonna make a damn bit of difference in how I do my job. I am just so glad that I am not in the bullpen on the phones because it was seen as a vacation day.

So it's like, what are you really trying to accomplish? It can be a fallacy to overfocus on those scores, or to report them as positive if they do come back well. You've gotta go to the higher levels, level two on learning, level three on behavior change, level four on ROI, to prove to the business that it's working, and just be good enough at your job to know that the things you're spending your days on are actually worthwhile.

Tom Griffiths

I love that anecdote. I guess finally, for this segment, talking of ROI at the higher level, you know, 2023 is an interesting year for businesses everywhere. What would you say it's important to keep in mind as a data-driven learning leader in 2023?

Jenny Dearborn

Yeah, I would say this is anti-climactic.

I'd say stick with the basics. Really understand your pitch to your executive team: you're trying to address this particular problem or metric, and you wanna put this learning intervention or performance support tool in place.

The cost of this intervention is X, and the cost of the problem is this other number. Like, if we could shorten the sales cycle by a week, or add two more products into every deal, whatever it is you're trying to solve, be able to quantify that.

Then you have an easy story: here's the cost of the problem, and here's the cost of the solution. If you can't articulate it that way, I would say go back and do more homework. In tough economic times, no stakeholder wants to hear "this will make employees happy" or "this will increase engagement."

Unless you can directly tie a learning intervention to attrition and make it an attrition prevention thing. But remember, in their hearts, people come to work for purpose and meaning and for making a difference.

A deeper level of joy comes from a job well done and workplace satisfaction. I would be really cautious about chasing happiness. We want people to be happy at work, but happiness is fleeting; it comes and goes. I just think it's super dangerous to chase that because it's almost like a drug.

You'll just have to keep amping it up to continue delighting and stimulating people. As a learning leader, it's better to drive learning interventions that go toward purpose and meaning and deeper levels of joy at work, and that really comes from helping people become higher performers.

So I would really stick to the basics: identify the problem, the cost of the problem, and the cost of the solution or intervention, and present it that way.
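The cost-of-problem versus cost-of-intervention framing reduces to simple arithmetic. A hypothetical sketch, with every figure invented for illustration:

```python
# Hypothetical cost-of-problem vs. cost-of-intervention framing.
# Every figure below is an assumption made for this sketch, not from the episode.

avg_deal_value = 50_000            # dollars per closed deal (assumed)
deals_per_rep_per_year = 12        # assumed
num_reps = 100                     # assumed
sales_cycle_weeks = 8              # current cycle length (assumed)

# If shortening the sales cycle by one week lets each rep close
# proportionally more deals, the revenue left on the table per year is:
extra_deals = num_reps * deals_per_rep_per_year * (1 / sales_cycle_weeks)
cost_of_problem = extra_deals * avg_deal_value

cost_of_intervention = 250_000     # assumed training program cost

roi = (cost_of_problem - cost_of_intervention) / cost_of_intervention
print(f"Problem: ${cost_of_problem:,.0f}/yr; intervention: ${cost_of_intervention:,.0f}; ROI: {roi:.1f}x")
```

With these made-up numbers the problem costs thirty times the intervention, which is exactly the kind of two-number story an executive team can act on.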

Tom Griffiths

Well said. That will be so clear for the executive team and the CFO, who's increasingly involved in all of these decisions, if we can map it back to dollars. I think one of the things you espouse so well is the related topic of connecting the learning strategy to the higher context of the corporate strategy. That also allows you to tell a story of how learning links to business outcomes and get the executive team's attention.

What are the steps involved in that alignment process between corporate strategy and learning strategy?

Jenny Dearborn

Yeah. Those steps are really what motivated me to go from being a learning leader, which I loved, to realizing, hold on, learning is a piece of talent.

So then I would be a talent leader. And then, hold on, talent is part of the whole people strategy, so I went from Chief Talent Officer to Chief People Officer. But really, it all came from asking how I make learning more impactful, starting with the bigger-picture corporate strategy: what is the purpose of the corporation in the world, and what is the strategy for that corporation to achieve that purpose?

This is all work that the learning leader doesn't do; it's the work your C-suite, your board, your CEO, or your head of corporate strategy are doing. Why does the company exist in the world, what is its purpose, and what is the strategy to achieve that purpose?

Then come the goals, objectives, and measurable metrics to achieve that strategy. From those business metrics, that's the job of the Chief People Officer: what is the overall people goal that needs to be in place to achieve those corporate goals?

Then you do your people strategy. At SAP, it was to be a talent magnet and to grow and develop talent from within; I think the tagline was something like "your best first job," to really grow people. So your people strategy, then your talent strategy from that, and the talent strategy includes things like compensation.

Will we pay at the 50th percentile of our peers, or the 70th, or whatever? What is that bigger talent strategy? Within talent, learning is one of the biggest pieces. In most companies, it's the largest budget, and it's the largest number of humans within your HR team except for recruiting.

In order to achieve the learning strategy, you have to see how it fits within those bigger things. And honestly, it's challenging as a learning leader if you come into a company and you're super data-driven and organized, and everybody upstream from you is a mess.

It gets really hard to do your job. People would come to me and say, here are all these challenges, and I'm like, you need to be in a different company. If people at the top don't have a clear strategy, if they're barking contradictory directions and flip-flopping about what market you're gonna be in, it feels messy at the top. And then you don't have a clear people strategy. What's our strategy? Hire people and then fire them when they're no good.

Wait, hold on, that's not a strategy. And then, what kind of learning do we need? Well, just compliance, what's legally required; let's try not to get sued. You're like, um, okay.

After a while, you're like, maybe I should be someplace else, because everybody upstream from me is reactive and doesn't have their piece together, and that makes it really hard to do your piece.

You're at the end of the process, right? Learning is the knowledge, skills, and abilities you need to achieve goals, and goals align with a strategy. If those other things aren't in place, you're like, how do I know what I should be doing?

Tom Griffiths

So in some ways, you find yourself in the role of asking questions. Some of them may have answers and others don't, and maybe you have to find different pieces of the answer or make some assumptions just to fill in the blanks, because there isn't a fully baked, or at least stable, strategy. If you're staying at that company, you kind of get by finding the commonalities between the different directions. How do you manage that uncertainty or vagueness, if it's possible?

Jenny Dearborn

Yeah, I would go to your stakeholder, whoever signs off, whoever you report to, and lay it out and say, these are the assumptions I'm working toward. I'd like to be able to address this particular business goal.

If that's not clear, you don't wanna insult people, but you just say, this is what I know I can control. Because I know, from industry-standard data, that a really strong employee onboarding program is going to reduce attrition at the six-month mark.

You can get that externally, right? You don't need an organized internal team to know that. As a learning practitioner, you can know what programs you can put in place and the metrics they will impact: what programs affect attrition, retention, and engagement.

You can operate without clarity above you in a lot of areas, just by doing good research and getting industry metrics from the Association for Talent Development, the Institute for Corporate Productivity, Bersin, or whatever. There are lots of things you can do to back up your programs.

Tom Griffiths

And, like you say, looking at benchmarks and how you stack up. But again, if you follow the right process, you're gonna be able to tell a very powerful story and build your credibility if you're upfront, like you said, about the assumptions you're making and the questions you'd like answers to but weren't able to get. I understand that we're moving fast, but here are my assumptions, and therefore here are the programs we're initiating. Then if something changes over the course of the program, you can track back to those assumptions and explain why you're in this situation, as opposed to being expected to predict the future and already have a learning program in place for when things change.

So I think that's great advice. So much wisdom in all of that. Jenny, I've just got a few rapid-fire questions to finish up here. The first one is around stop, start, continue. Given where we're at as an industry, I'd love to hear what you think learning leaders should stop doing right now.

Jenny Dearborn

I think that we get distracted by the fads. A lot of learner time, budget, and organizational money gets spent on the coolest, newest, jazziest distraction.

Tom Griffiths

A hundred percent, and it's a great call out, because I think naively people sometimes think that chasing the latest thing makes them look good and current, but actually it's the opposite. What makes you credible is following that logical process of problem, solution, and result. What are folks not doing that they should start doing?

Jenny Dearborn

Spending more time really understanding the needs of your learners and the needs of your clients. What I hear consistently is that the learning team is hiding behind the scenes, worried about getting called out, shy, and doesn't wanna be out in front, because they're afraid of making a mistake or of somebody calling their bluff that they don't know what they're doing. They don't spend enough time getting to know their clients and their clients' needs, and they should do that.

Tom Griffiths

Yeah, agreed. And then, to give some credit out in the world, what are you seeing folks doing that you think is good and that they should definitely continue doing right now?

Jenny Dearborn

I think you guys are doing a great job with your ROI calculator and driving that with clients. I talk about you guys a lot when I'm out in the world talking with clients, even ones doing something completely different that has nothing to do with learning.

You know, I was doing some consulting about a sales process and a sales tool, and I said, why aren't you guys using an ROI calculator? This other company, a learning company, uses one. I just think it's super helpful to lead your customers to think about things differently, to make your customers smarter so they can be better advocates for themselves internally with their stakeholders.

Tom Griffiths

Right on. Thank you, that's been a real winner for us. What we're trying to do, as you know, is take a lot of the great wisdom that's out in the world and make it easy, in a productized way, for folks to deploy it and get a sense of ROI against the skills and capabilities that they want to develop.

So, appreciate the reference there. Last one, and we've talked about this a bunch, so we've probably already given people a dozen tips, but what's one takeaway for a learning leader who's aspiring to progress in their career and wants to stand out to their team and their executive leadership team? What's one thing they could do?

Jenny Dearborn

A super easy thing would be taking a class on how to present data visually, on visual storytelling of data and information. There are a couple of books written about it, and I think a couple of online classes; you guys might even have a class, I don't know. When I was actively in the role, I put everybody on my team through the program, and I found they were incredibly powerful at communicating to their internal stakeholders what they were trying to say once they could do it in a visual way.

Tom Griffiths

It's such a great one, because it's how executives think. My previous company, FanDuel, had an executive team where I think half of them had been at McKinsey, and they were just excellent at that; they also responded well to well-presented data, as do all executives. It's such a great tip, especially applicable to learning and development, as we've been talking about. So Jenny, thank you so much for your time. This was a phenomenal conversation with so much wisdom shared. I always learn something when we speak, so thank you so much for being with us today.

Jenny Dearborn

It was absolutely a blast. Thanks, Tom. It was really fun.