An interview with Paul Hlivko, the Vice President and Chief Technology Officer at Wellmark Blue Cross and Blue Shield.
Using open models is still a novel idea for most companies, but I’ve had the good fortune of interacting with a few leaders who have been experimenting with open models in business for more than a decade. One of those leaders is Paul Hlivko, now the Vice President and Chief Technology Officer at Wellmark Blue Cross and Blue Shield, who has been experimenting with open models since the early 2000s. I recently spoke with Paul about how crowdsourcing was going at Wellmark.
What I found fascinating in our conversation was Paul’s insight into how open-sourcing has stimulated interest and engagement from both internal employees and external talent in solving issues for the business. Paul mentioned something that I think is critically important in the initial conversations companies are having about trying out open-source platforms: “There’s a prevailing myth that internal stakeholders are the best quality control. From our experience, that myth exists across many sectors. That simply hasn’t proven to be correct,” he says. Here’s more wisdom from Paul about the power of open sourcing.
John Winsor: How are you utilizing open models at Wellmark?
Paul Hlivko: We’re crowdsourcing two streams of work. First, we have a platform that we have built for our external stakeholders that runs multiple scrum teams alongside a new crowd-powered team that is able to take on story assignments and work with an integrated pipeline back into our platform code base. Second, we’re using the data science crowd to help solve for our future state product design by iterating through models that correlate product utilization, product performance and variation across our customer base. For both efforts, we are very close to counting on the output from the crowd to deliver these key investments we’re making.
Winsor: What were some of the pain points?
Hlivko: When people are first introduced to the crowd, they often go toward concerns about work-product quality and control of the end deliverables. The irony is the underlying assumption that the few hundred employees of the company are the best control mechanism for quality, versus a crowd of thousands competing and being measured with data and algorithms. I haven’t found a scenario where a crowd’s quality control failed to meet expectations if the challenge is defined appropriately, with the right engagement from the company and the right incentive model for the work.
Winsor: How did you prepare your teams?
Hlivko: To accomplish this, we defined a manual internal crowdsourcing challenge that was open to all of our developers. I wanted to get the mental model of our leadership and team members acclimated to how crowds work before we began our partnership with Topcoder, which was the platform we chose.
Winsor: What are some of the business results you’ve seen?
Hlivko: Platform models and crowds allow a traditional business model to stay aligned with how venture-backed businesses are scaling and accessing talent and partner ecosystems to deliver their value proposition. They can also drive internal innovation, improve work-life integration, lead to continuous up-skilling and break down some of the internal hierarchical constraints many employees face in a larger organization when they want to try alternative work as part of their growth plan.
Winsor: What advice would you offer to businesses that are building out open-source programs?
Hlivko: Before you launch, you’ll start running into engineering challenges, funding questions, dev/ops pipeline integration, branding considerations and a host of other small items to work through. I would recommend identifying key talent, both leadership and team members, who are good at leaning into the model and thinking through these problems horizontally. A small core internal team can advocate for the crowd and evangelize it when ready.