Digital Transformation

Almost every organization in the world has come to the realization that, while it may provide other services and products, becoming a technology-based company is the only way to survive such rapid, radical changes in the market. The biggest challenge that comes with that realization is that your talent pool likely wasn’t selected with that end goal in mind. As a result, many companies are undergoing a “digital transformation” of sorts to upskill their staff for the transition – whether that’s the extreme of literally replacing bodies with automation or AI, or teaching front-line employees how to use company apps and technology so they can in turn teach and support customers.

In 2014, Capital One found itself in just such a situation. While a dramatic investment had to be made in engineering, design, and product skill sets to get through the transition, another 45,000 associates had to undergo a radical transformation that started with the way they thought and then cascaded out to more technical and design-focused skills. Our Principal, Matthew Daniel, was put in a position to lead that effort as a full-time associate of Capital One. The sections below detail some of the work that went into creating a differentiated learning experience for Capital One’s associates.

Assessments

While some folks recognize their technical skill gaps, others are far from acknowledging their areas of opportunity for growth. We used opt-in assessments to help folks recognize where they were in their digital fluency. The challenge with writing such assessments is that it’s easy to make the learner feel as though they’ll never catch up – we were intentional about this tool delivering a message that acknowledged the user still had a long way to go without making the goal feel out of reach. Two different assessments were created along the journey – one targeted at the masses and one oriented to those transitioning into a more product-focused role.

Get your dScore

The “dScore” was developed to help the masses assess whether the user was (1) working, (2) living, and (3) banking digitally. With 45 questions – 15 covering each area – we asked behavior-based questions like “Do you still receive paper statements?” or “How are you accessing the news?” At the completion of the assessment, the user was given a score in each area and a dynamic results page, based on their scores, that pointed them to a “Digital 101” site with content related to working, living, and banking digitally. To be honest, the level of “assessment” tied to this tool wasn’t terribly complicated – it was more of a self-rater, allowing the user to get a sense of their gaps in the digital journey.
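To make the mechanics concrete, here is a minimal sketch of how a rater like this could score and branch. It is purely illustrative – the names (`Area`, `scoreByArea`), the normalization, and the score bands are my assumptions, not Capital One’s actual implementation:

```ts
// Illustrative sketch only – not the actual dScore implementation.
type Area = "working" | "living" | "banking";

interface Question {
  area: Area;
  prompt: string;
  score: (answer: string) => number; // normalized 0..1; higher = more digital
}

// Sum each question's normalized score into its area (15 questions per
// area, so each area's total falls between 0 and 15).
function scoreByArea(questions: Question[], answers: string[]): Record<Area, number> {
  const totals: Record<Area, number> = { working: 0, living: 0, banking: 0 };
  questions.forEach((q, i) => {
    totals[q.area] += q.score(answers[i]);
  });
  return totals;
}

// The "dynamic" results page simply branches on score bands per area.
function resultMessage(area: Area, score: number): string {
  if (score >= 12) return `You're already ${area} digitally – nice work.`;
  if (score >= 6) return `You're partway there on ${area} – see Digital 101.`;
  return `Big opportunity in ${area} – start with the Digital 101 basics.`;
}
```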

We were hyper-focused on user experience when designing the rater, intentionally dropping features that made it feel like an eLearning course and working hard toward an experience that more closely resembled a Facebook quiz about which Harry Potter character you most resemble. We used bright colors, ditched the next button, and dropped some of the obligatory intro and exit pages to keep the experience moving. We also allowed users to email their results to themselves from within the course, and the results page contained links to content that would help them grow.

Over 60% of Capital One associates opted in for this tool, many taking it multiple times to see how their score changed over time. We started seeing associates post their “dScore” across our company social platform and even saw team leaders modeling the same behavior and encouraging their organizations to do the same.

The level of “academic” learning is debatable, but the ultimate goal – helping people figure out where they were in their digital journey and actively take steps to increase their digital fluency – was certainly achieved as excitement grew. Associates enjoyed the ability both to figure out where they were in their “digital journey,” a term that could feel incredibly esoteric, and to access content and action steps to help their digital fluency grow.

The Product Mindset Rater

For a more advanced audience, we decided we needed a more advanced “rater,” again framed as a self-evaluation tool, not a formal Kirkpatrick evaluation plan. This was again a bit of a “pre-assessment,” with the goal of aligning the learner with a realistic picture of where they needed growth. Instead of scores, this time they received a “hot spot” view: darker blue in the product-mindset competencies where they needed to grow, and lighter blue in the areas that were already well developed.
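As a rough illustration of the hot-spot idea (the scale and the colors here are assumptions, not the actual palette), mapping a competency rating to a blue intensity can be as simple as:

```ts
// Illustrative only: rating runs 0 (big gap) to 1 (well developed);
// a bigger gap renders as a darker blue "hot spot."
function hotSpotColor(rating: number): string {
  const gap = 1 - Math.min(1, Math.max(0, rating)); // clamp, then invert
  const lightness = 85 - gap * 50; // 85% (light blue) down to 35% (dark blue)
  return `hsl(215, 70%, ${lightness}%)`;
}

// e.g. hotSpotColor(0.2) → "hsl(215, 70%, 45%)" – a noticeably dark spot
```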

For this audience, the goal wasn’t just culture change and easing into “digital” – we needed to align skill gaps with learning programs and paths. In this case, clicking on those hot spots would trigger an API call back into our LMS and actually register the learner for a program of informal content (ranging from 45 minutes to a few hours) aligned with their gaps. While it was conceptually very simple, writing an API call into a course was new territory for us (though common now with xAPI), and it created an extraordinarily simple experience that moved the user from rating themselves to actually committing to learning on their learning plans. When demoing this concept at a conference, my favorite response to the simplicity of the experience was an experienced learning designer in the front row mumbling aloud: “Are you f—— kidding me?”
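For flavor, a click handler along these lines could do the registration. This is a hedged sketch: the endpoint, token, and payload are entirely hypothetical, since every LMS exposes registration differently (and today you would likely reach for xAPI):

```ts
// Hypothetical sketch – endpoint, token, and payload are illustrative,
// not a real LMS API.
declare const LMS_TOKEN: string; // assumed to be injected by the course player

async function registerForProgram(learnerId: string, programId: string): Promise<void> {
  const res = await fetch("https://lms.example.com/api/learning-plans/register", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${LMS_TOKEN}`,
    },
    body: JSON.stringify({ learnerId, programId }),
  });
  if (!res.ok) throw new Error(`Registration failed: ${res.status}`);
}

// Wired to a hot spot, one click moves the learner straight from
// "rating" to "registered":
// hotSpot.addEventListener("click", () => registerForProgram(user.id, spot.programId));
```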

Gathering Content

When given five (5) heads to build out our digital learning effort, I knew immediately that I wanted to test the waters with two full-time roles we had never had: a curator and a video genius (we had used contractors, but never doubled down on bringing the talent in house).

Curation

While I had seen the writings of folks like Elliott Masie on the use of curators, I had never actually met a learning curator. The road with the curator was rocky – finding the right talent was challenging. We needed the library-science rigor around what was curated, how often, and by what criteria. We also needed that role to be about the end goal of the learner experience, not the content. Additionally, finding a way for more traditional learning designers to work with a curator was dicey. We were all stumbling to find our places, but we got there. Our curator ironed out all of the processes we needed to select and “weed” our content regularly.

In order to embrace curation, we had some simple but large hurdles – we needed to get YouTube whitelisted for all associates so we could leverage external content, and we needed to let go of responsibility for users navigating to videos or blog posts with content other than what we intended.

Media & Brand

We started rough. Really rough. No talent in the video space – just a video animation tool (explainer videos were just getting hot at that time), iPhones for voiceover, and a lot of passion for executing. I believe it was this scrappy start that put the effort into high gear and cleared the way for me to get the headcount I was looking for. We learned the software as we went and pushed out crude videos that nevertheless got the point across – we were leveraging the agile concept of the Minimum Viable Product (MVP) and acknowledging we would need to iterate. It bought us what we needed: time and talent.

As we progressed to more advanced content, our user research told us that users were comfortable with external content and videos on cutting-edge topics, but our learners really wanted to hear from the experts at Capital One. That was going to require creating a ton of videos with high-profile SMEs. Our video talent had to be far more than an editor or videographer – we needed a leader who could edit, someone who was comfortable giving VP+ direction and managing vendor video crews. We also needed a creative element to help us establish the “brand” for our content and capture its voice.

Creating an Experience

This entire initiative was driven by “what if…” statements:

  • “What if we could give everyone in the company a ‘digital score’ so they could see their level of digital fluency?”
  • “What if we used animation tools to create videos since we can’t do anything else?”
  • “What if we tried recording voice-over on our phones?”
  • “What if we got rid of the next button?”
  • “What if we gave users different (and dynamic) assessment results?”
  • “What if we automatically registered people for a program based on their assessment results?”
  • “What if we hired a full-time curator?”
  • “What if we gave users the ability to navigate through a formal learning path AND free roaming through a website?”
  • “What if we take this through the user lab and get user responses?”
  • “What if we created coaching kits so teams could self-organize?”
  • “What if we broke away from company brand to create a differentiated experience?”
  • “What if we put a coach on video for teams that are awkward and don’t have anyone willing to facilitate?”

Those are just some examples off the top of my head. As our team grew more and more comfortable with design thinking, we opened the door to more ideas and more iteration. There were so many other concepts that came up – some worked, some didn’t – but we were a team organized and energized by creating a different experience to help our learners tackle really challenging content.

In addition to applying design thinking, I saw this as an amazing opportunity to lend credibility to the project by applying some of the concepts we were teaching our learners – like user experience. With internal user labs available, we took multiple segments of this solution through the lab – the dScore assessment, the Digital 101 site, the more advanced DigU site. Each piece came back with brand-new insights, always something surprising and often something very simple – bring things above the fold, color-coordinate categories, arrange content by modality, list the length of videos, etc.

User Insights + Data Analytics

Additionally, we started using data to make minute-by-minute changes. When we launched the first site, we noticed within the first few hours that most visitors were bouncing quickly. It turned out they weren’t sure where to start. We literally added a (1), (2), and (3) next to the steps; the bounce rate dropped sharply and dScore assessment completions jumped dramatically. We quickly realized that neither data insights after launch nor user insights before launch were enough on their own – we had to find ways to get at both with every project to be successful.
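The metric itself is simple to compute from raw page-view events. Here is a small sketch of the kind of check involved – the event shape and field names are my assumptions:

```ts
// A session "bounces" if it records exactly one page view.
interface PageView { sessionId: string; path: string; ts: number; }

function bounceRate(views: PageView[]): number {
  const perSession = new Map<string, number>();
  for (const v of views) {
    perSession.set(v.sessionId, (perSession.get(v.sessionId) ?? 0) + 1);
  }
  let bounces = 0;
  for (const count of perSession.values()) {
    if (count === 1) bounces++;
  }
  return perSession.size === 0 ? 0 : bounces / perSession.size;
}
```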

Our use of crude design thinking approaches combined with data insights drove an experience that became the benchmark for future learning launches. We would regularly hear something along the lines of “we want a DigU for leadership,” or “we want a DigU on D&I,” etc. Also, in one of my favorite moments in my career, someone recognized my profile picture from DigU and stopped me in an elevator to thank me for how “cool” the site was. I was incredibly grateful for that moment and so glad to share it with the amazing team that executed on the effort!