Our guest Andrzej Wolosewicz has years of experience helping companies define and build machine learning and analytical solutions that have a measurable impact on the business, and he shares his experience and expertise with us, including the biggest pitfalls he sees companies fall into over and over as they try to implement these initiatives.
The problem was there was a lot of activity every month that they were doing, but in terms of progressing their analytic capabilities, or really being able to grow and be more effective, they weren’t able to do that. As the saying goes, they had a lot of action but not a lot of progress.
Ginette Methot: I’m Ginette.
Curtis Seare: And I’m Curtis, and you are listening to Data Crunch, a podcast about how applied data science, machine learning, and artificial intelligence are changing the world.
Andre Wolosewicz: My name is Andre Wolosewicz. I am currently the director of sales at HEXstream. We are a Chicago-based analytics and data consultancy. But this is kind of the latest step on my journey. I actually started out coming straight out of college into a predictive modeling startup, and this would have been in the late nineties. Artificial intelligence at the time was a big buzzword, as it is today. We were looking at being able to do fairly advanced modeling of systems, but actually looking at the data as being the model. So whether you were looking at a jet engine, the human body, or complex refineries, we didn’t necessarily understand all the nuances of how they ran, but we had all the data, and so we would use that data to build out those models. And then I ended up flipping into the other side of the world, around program management.
So, not so much doing the analysis, but understanding how the analytics and designs and all of those steps fit together to actually deliver a finished product. And that was very useful, because it taught me that you may find things that are interesting, but on the business side of the world you have all of the constraints that analysts may not always be aware of, or may not really want to take into consideration, like budgets, schedules, things of that nature. And so I learned how to operate with that. And then, in another interesting twist of fate, I met somebody who knew somebody who was looking for someone who could provide that line-of-business experience, but actually selling a business intelligence platform. Not necessarily someone who knew how all the software worked, and if you click here, this happens, if you click here, that happens, but who could sit across the table from somebody in a line of business and say, I understand the business problem you’re having.
I understand how to solve it, and here’s how the technology can be applied. Because the reality is, technology in and of itself will never solve a problem. It needs people, it needs processes, it needs the people to use it. My dad used to like to look at a rake and say, well, the yard’s not going to rake itself. The rake does the job, but it needs somebody to use it. After about five or six years actually selling and being involved with the BI platform, the opportunity to join HEXstream came up, and for me, this was a combination of all of the past experiences, because it gave me the opportunity to engage with clients and engage with our internal teams on what is it that you’re trying to do. So going back to my first experience, what is the project? What is the model?
What is the data that you’re trying to work with and build on? But then I also had to understand why that was relevant. Why would a client engage with a company like HEXstream to undertake a project? How is that project measured? There are a lot of things that, over the years, I’ve found people would love to do, but that doesn’t actually translate to a project; that doesn’t translate to work. So how do you bundle all of those pieces together, whether it’s the technology, the people, or the process? How do you put a bow around all of that and say, well, this is how we’re actually going to move forward? One of the analogies that’s usually a go-to for me is an hourglass shape, and I’ve found this time and time again: especially early on, you’re at the top of the hourglass and everything’s wide open in the conversations, especially around what you can do with data. And in today’s world, it’s not just how do you show it, what report can you write.
It’s how can I leverage machine learning? How can I leverage AI? How can I leverage all of the other data sources that are available to me? And the world’s your oyster; there’s all sorts of stuff. And inevitably the conversation narrows and narrows and narrows toward that crossover point in the hourglass. That’s the nexus at which you’ve now identified something that’s tangible, something where you can say, we can absolutely do this. Unfortunately, a lot of times the conversation blows right through that, and you end up at the bottom, the wide-open aperture of the hourglass, where people say, you know what, that one idea was really good, now let’s build on it. And so you’ve gone from something that was executable, tangible, and measurable, that you could implement, back into, well, let’s try to do all of this. And then somebody looks at that project and says, we can’t do it. There’s no way. It’s too big. And I’ve seen this with a lot of clients and customers over the years, where they have that moment, they have that identification of what to do, and then they just keep going right through it. They don’t stop and say, “Well, let’s execute this and then move forward.”
Curtis: Do you think that’s because people are really excited about all the potential possibilities, and maybe it takes a little bit of restraint to say, “Okay, we have something now, let’s run with this first”? Why do we tend to do that?
Andre: Absolutely. I think there’s an element of restraint, and I think the other is an element of discipline, and being able to, again, balance that. It’s not just what the technology could do, it’s not just what an opportunity might be. You have to balance that curiosity, that “what could we do with this” aspect, against the almost completely nontechnical question: what is the business need? What’s the business goal we’re trying to achieve, and how do those two balance against each other? So I would say that discipline and restraint are actually a really big component, a leading indicator in my experience of companies that are successful with these kinds of initiatives, because it’s not that they discard the grand vision, but they understand that the grand vision’s not going to be realized all at once, that you have to build the path to get there.
Curtis: Got it. Are there any examples that come to mind, cases where you’ve really seen this play out, that could help the audience get this concretely?
Andre: Yes. I’ll withhold the names to protect the guilty. This was a consumer packaged goods (CPG) firm that we were working with, and specifically we were talking to the marketing department. The marketing department had a fairly involved manual process where they would be collecting all of this internal data they had on product sales and all the internal information about it: the testing, their market surveys, all of that. And they would also be collecting a lot of third-party data, so your traditional Nielsen reports, and then other social media scraping they were doing, things of that nature. And then every month they would go through and collect and mine and manually review that data so that they could provide reports to their sales teams. So as their sales teams went out into the field, they knew which customers might be at risk, which customers might be open to new ideas, which customers could grow, things of that nature.
And the problem was, there was a lot of activity every month that they were doing, but in terms of progressing their analytic capabilities, or really being able to grow and be more effective, they weren’t able to do that. As the saying goes, they had a lot of action but not a lot of progress. And so what we did when we first talked with them was understand, “Well, what is all of the data that you’re collecting? What could that do?” And we talked about the ability to really do customer 360. We talked about the ability to feed back into their R&D group and provide additional customer feedback as well as distributor feedback. We talked about how it could potentially impact their manufacturing processes, where they were sourcing some of their raw materials from, all of that.
Going back to the hourglass analogy, that was the top; everything was open. And what we did is we identified that the platform we were selling at the time could ingest all of that data. It was basically designed to be data agnostic: structured data, unstructured data. So I could mix my sales numbers, my highly relational, quantifiable data, with social media text, and I could do some mining and sentiment analytics on that and understand what people liked, and I could pull all that together. And so we started to narrow down the discussion around the ability of the platform to go in and, in a fraction of the time it was taking manually, mine this data and identify what are some of the trends and, more importantly, what are some of the changing data points from month over month. So rather than just getting a monthly report,
the sales team would still get a monthly report, but also see how that report changed over time. And unfortunately, the conversation didn’t stop there. Once we said, “Okay, this is what we want to do: we’re going to focus this report on those changing trends over time, and we’re going to let the technology pull all that information together,” we went right through that into, “Well, if we can do that with the report, what about integrating this data back in? What about not just giving the salespeople a report, but let’s also bring in the manufacturing team. Now let’s also bring in the product development team, and let’s talk about all this information and this trending that we’re going to be able to do.” And so now, instead of having that focused conversation about taking all of this data and applying it to a specific need that the sales team had, around the table we had the sales team, we had the product development team, we had the manufacturing team.
As you can imagine, they all loved the idea, but now they all wanted to take the application and implementation of that idea to the benefit of their own department. And so we went from what was a very well-defined project, one with a very measurable ROI and a very measurable return, because I can look at my sales performance changing over time, to “Well, now if we implement this platform . . .” It almost became, everybody’s going to get everything. And so, as we continued to say, okay, we need to identify what we can do, we created a circumstance where, as we tried to go up to get executive approval, everybody from those different departments said, “Well, I can get this, and I can get this, and I can get this.” And what was a manageable and measurable project became an overly encumbering initiative, one where the executive said, “Look, I’m going to take a step back. We’re not going to move forward with this, because it sounds really good, I just don’t see how I’m going to get there.” And at that point we had missed our chance to give them that well-defined opportunity.
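To make the month-over-month trend idea concrete, here is a minimal sketch in Python of the general pattern Andre describes, not the actual platform: blend structured sales figures with a toy sentiment score over social media text, then flag the customers whose trajectory has turned. All column names and the `sentiment_of` scorer are hypothetical stand-ins; a real pipeline would use a proper NLP model.

```python
import pandas as pd

def sentiment_of(text: str) -> float:
    """Placeholder sentiment scorer in [-1, 1]; a real pipeline would use an NLP model."""
    positive = {"love", "great", "fresh"}
    negative = {"stale", "awful", "expensive"}
    words = text.lower().split()
    score = sum(w in positive for w in words) - sum(w in negative for w in words)
    return max(-1.0, min(1.0, score / max(len(words), 1) * 5))

# Structured sales data and unstructured social chatter, keyed by customer and month.
sales = pd.DataFrame({
    "customer": ["A", "A", "B", "B"],
    "month": pd.to_datetime(["2024-01-01", "2024-02-01", "2024-01-01", "2024-02-01"]),
    "revenue": [120_000, 95_000, 80_000, 88_000],
})
posts = pd.DataFrame({
    "customer": ["A", "B"],
    "month": pd.to_datetime(["2024-02-01", "2024-02-01"]),
    "text": ["product felt stale and expensive this month", "love the fresh redesign"],
})
posts["sentiment"] = posts["text"].apply(sentiment_of)

# One row per customer-month, then month-over-month deltas rather than a static report.
monthly = sales.merge(posts[["customer", "month", "sentiment"]],
                      on=["customer", "month"], how="left")
monthly["revenue_mom"] = (monthly.sort_values("month")
                                 .groupby("customer")["revenue"]
                                 .pct_change())

# Customers with falling revenue *and* negative chatter get flagged as at risk.
at_risk = monthly[(monthly["revenue_mom"] < 0) & (monthly["sentiment"] < 0)]
print(at_risk[["customer", "month", "revenue_mom", "sentiment"]])
```

The point of the sketch is the scope: one focused, measurable output (a changing-trends flag for the sales team), not a platform for every department at once.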
Curtis: So there’s the concept of having, I don’t know if you want to call it a quick win first, but some sort of small, defined, measurable project that actually does something for the business. And then of course, once that’s successful, you can expand it from there. Is that,
Andre: Absolutely.
Curtis: that’s kind of the concept there. That’s really interesting. And the whole idea of scope creep, right, that software engineers are always dealing with. Is that the biggest challenge companies have when they’re thinking of doing analytical initiatives, or are there other things that come up in your experience?
Andre: So I see that probably about, and I’ll use rough numbers, 45 percent of the time. Forty-five percent of the challenges we see will fall into that bucket, where, whether it’s scope creep, poor definition, or just lack of discipline, an initiative ends up not being successful when it’s executed on, or just doesn’t even get executed on to begin with. The other problem we run into, kind of the other really big bucket, and it’s really more on the advanced analytics side that we see it, with some of the newer technologies coming out, is when companies look at these initiatives and, instead of looking at them from an enterprise standpoint, they get looked at from a departmental standpoint, or from an individual standpoint.
And the challenge we see there is that at an enterprise level, you have the architecture, your IT architecture that’s running your firm, right, that’s doing all the day-to-day work. And then over time you end up with these individual, whether you want to call it shadow IT, whether you want to call it data silos, whatever label you give it, you end up with these pockets of side architectures or side projects that are disconnected from what that overall IT strategy is. As an example, we’re working with, I’ll call it an entertainment firm, and they are looking at really improving what a person’s experience is at an event. Part of the challenge is there’s a lot of information that they’ve collected over history, in terms of what the expectations are, what attendance at the event has been, what the demographics are, those kinds of long macro trends, and that is, as you would expect, your traditional relational data warehouse kind of data.
But what they’re also looking at is, instead of having an idea of what will happen and then reacting to things on the fly, they would like the ability to react in near real time. I’m not going to say real time, but within a couple of minutes, something where they can change it during the course of the event. And that was the part they were looking at, and our initial conversations were only around that real-time data architecture. What would that look like? What is the data coming in? But as the conversation progressed, we learned that they had this secondary warehouse. And so we asked, “Well, how are you doing the analytics right now on the batch side? Because the last thing we want is to be reacting in real time to one kind of analytic, one kind of data, and then separately, as you’re planning for your future events, looking at completely different information.”
Everything from as simple as, am I aggregating my information the same way? Am I going to be able to go back upstream and understand how my data decomposes into the individual elements? Or am I doing my planning based on these long moving-average trends, but then reacting completely differently in real time? So our view is, and where we see clients have challenges is when they don’t really have that singular view of their data. Now, that’s not saying there’s one answer for everything, but we use the term “one data.” There’s only one data that gets generated. It gets used differently throughout the organization, but it’s not different data at each one of those points. So if you’re collecting it in real time to do some analytics on it, and then you’re going to be doing historical batch analysis on that same data, that should be a single flow.
There should be a single thought process that goes into, how do I tap into it in real time? How do I clean it? How do I store it? So as I’m doing my batch analysis or my other analysis, people don’t lose trust in the data. The challenge we’ve seen is when those two are disconnected; that’s when you run into the proverbial ask two people the same question and get two different answers, because they’re running the queries two different ways. If there’s another big bucket of challenges, or early indicators that a project is going to run into difficulty, it’s that those individual solutions, individual architectures, and individual thought processes are developed truly independently of what the enterprise architecture philosophy is. That doesn’t mean you can only use the same tools or the same methodologies that are already employed. You want to be able to change those. You want to be able to grow. But you can’t do it in isolation from those philosophies.
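The “one data” point lends itself to a small illustration. Here is a minimal sketch, with a hypothetical event schema, of how a single shared metric definition can serve both the near-real-time path and the historical batch path, so the two can never drift apart and give two different answers:

```python
from dataclasses import dataclass
from typing import Iterable

@dataclass
class Event:
    venue: str
    attendees: int
    spend: float

def spend_per_attendee(events: Iterable[Event]) -> float:
    """The single, shared definition of the metric. Both the near-real-time
    path and the batch path call this same function, so the aggregation
    logic is identical no matter who asks the question."""
    events = list(events)
    total_attendees = sum(e.attendees for e in events)
    return sum(e.spend for e in events) / total_attendees if total_attendees else 0.0

# Near-real-time path: a small window of events during the show.
live_window = [Event("north_gate", 40, 612.0), Event("south_gate", 25, 300.5)]
print("live:", spend_per_attendee(live_window))

# Batch path: the full history replayed from the warehouse uses the same code.
history = live_window + [Event("north_gate", 900, 11_400.0)]
print("batch:", spend_per_attendee(history))
```

The design choice is simply that the definition lives in one place; the real-time and warehouse pipelines differ in how they collect and store events, not in what the metric means.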
Curtis: I think you’re right on there. You deal with a lot of clients and see this problem over and over again. What are your strategies to mitigate it? Is it technology that needs to be applied? Is it a process issue?
Andre: So it’s really a little bit of both, but in our experience, the majority is around process. A lot of times what we see is a client will decide they’re going to implement tool A from this vendor, and they’ll implement it, and then once it’s implemented, they’ll see, okay, now it’s in, now what happens? We have actually incorporated an alignment step, and this may seem like project management or sales 101, but doing discovery before you implement a tool is a step that, more often than not, and it surprises me a little bit, gets overlooked in the initial rush or urgency or desire to implement a technology. And so a lot of times what we’ll see is that a tool will get implemented, and, I’m going to take back the word implemented.
A tool will be enabled, or it will be installed, and then people go, okay, well, now what do I do with it? Oh, well, the out-of-the-box configuration actually needs to be changed to match our process. Or, out of the box, the definition of on-time delivery, as an example, is defined this way, but we actually define it another way because we go through a distributor. And so there are a lot of potential nuances, a lot of, again, we use the word “alignment,” that has to happen between what the technology can do and what is needed in the organization. And going back to what we talked about initially, the concepts of discipline and restraint, sometimes we see that not happening. And so it becomes, we’ll put the technology in and that will solve my problems and help me get better. And if that initial alignment isn’t done, that’s where you run into these other challenges.
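The on-time-delivery example is worth making concrete. Here is a hedged sketch, with hypothetical rules and field names, of why alignment matters: the same order can be “late” under a vendor’s out-of-the-box definition and “on time” under an organization’s distributor-based definition, and making the definition an explicit, swappable function is the alignment artifact:

```python
from datetime import date

Order = dict  # keys: promised, delivered, handed_to_distributor (all dates)

def on_time_default(order: Order) -> bool:
    """Hypothetical vendor out-of-the-box rule: delivered by the promised date."""
    return order["delivered"] <= order["promised"]

def on_time_via_distributor(order: Order) -> bool:
    """Hypothetical company rule: on time if the distributor received it by the
    promised date, since last-mile timing is outside the company's control."""
    return order["handed_to_distributor"] <= order["promised"]

order = {
    "promised": date(2024, 3, 1),
    "delivered": date(2024, 3, 4),
    "handed_to_distributor": date(2024, 2, 28),
}
# Same order, two different answers -- exactly the gap discovery has to surface
# before the tool is rolled out, not after.
print("default rule:    ", on_time_default(order))         # False
print("distributor rule:", on_time_via_distributor(order))  # True
```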
Curtis: That’s really interesting. So we’ve talked a lot about process and discipline and how you approach these. I also wanted to touch a little more on the data analytics and modeling side, and hear from you what kinds of modeling and technologies people are implementing. Is it more, you know, 80 percent of the problems you see can be solved by fairly basic, well-understood techniques? Or is it more, we really have to delve into some deep learning techniques and get some AI experts to build a sophisticated model? What are you seeing in terms of technology being applied to problems?
Andre: So honestly, what we see a lot in the field is, I’ll say, halfway between the two ranges you gave. A lot of basic reporting from an analytics standpoint, a lot of basic operational reporting, a lot of basic, I’ll call it business intelligence, looking at trends over time and changes: for a lot of the clients we deal with, in fact most of them, those tools are there. They’re part of the daily process, and the companies are realizing the value of them and utilizing what those technologies offer on a day-to-day basis: year-over-year sales performance, what the customer behavior is, all of those kinds of analytics from a business standpoint. Where we see the real opportunity now is not necessarily going all the way to the level of having to hire data scientists and the really specialized people around AI and deep learning and machine learning. And don’t get me wrong, there are some very, very exciting developments for the companies that have gone to that second level.
That is absolutely the next step we’re seeing, and the area where we’re seeing a lot of growth is around customer modeling and customer engagement, because as those capabilities grow, the variety of data that can be pulled into those models is incredible, and the granularity, the depth of understanding that companies can get about their clients, is incredible to see. A lot of companies know their customers way better than we know ourselves. But in the middle is where we’re seeing a lot of capability, where, again, companies have integrated that basic BI reporting, but they’re not realizing the full value because a lot of those analytics are still siloed. They’re siloed within departments, and it’s the cross-functional analytics where we’re seeing a lot of interest and a lot of, “Okay, what’s the next step I take?” Again, everyone would love to get to the end point and all of that advanced analytics, but where we’re seeing companies start to gain the next level of value is, let’s say, for example, I’m looking at HR data. So in your traditional HR analytics, you’re looking at retention rates, recruitment rates, where do I get the best people from, how do I improve my hiring and performance?
How do I not lose people? All of that kind of information. But take that same set of data: now if I marry it with sales data, say I’m looking at a sales team, or with manufacturing and quality data, say I’m looking at how people are performing, now I can start to look at performance over time correlated against where I’m recruiting people from. So now I’m not just measuring recruitment success based on what my acceptance percentage is, or what my retention after a year is; now I’m actually looking at job performance. Now let me blend all three together. So now I can look at: how are people performing on the job? Where did they come from, through what channel did they learn about our company or did I recruit them from? And then what is happening on my actual deliverable, whether it’s a service deliverable or a product deliverable?
So now I can start to see, well, a certain person or a certain set of traits helps me develop new products very well, products that provide a direct impact to my bottom line. And now I’m starting to cut across the data from three different departments, in terms of understanding, from a clustering example, where are those segments, those groups, that I really want to amplify or extend. And the reason I call it the middle level is because now, once I start to go across those different business areas, those different parts of my business, now I can start to add the advanced tools. So as I’m looking at those cross-functional analytics, now I want to do some machine learning to say, okay, are there other aspects of the business I need to pull into this to really identify the core capabilities, or the core characteristics, of the employees I’m looking for?
And that continues to build on itself. The jump from traditional reports straight to advanced analytics can be problematic, because there are a lot of steps in between, and sometimes if I make that jump, I haven’t looked at all of the correlations, all of the inputs, to understand how they’re related. But if I stop in the middle of that journey and take those traditional analytic tools, and apply them across my organization in combinations that may not have been done before, I break out of those individual business-area silos. That’s where we see companies currently starting to get a lot more value. And then I can build on that and say, now how do I apply machine learning, how do I apply deep learning, to continue to identify these new relationships and this new capability, whether it’s now looking for new requirements for product development or new skill sets that I need to recruit for? I’ve got the foundation to build on.
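As a rough illustration of this cross-functional step, here is a minimal sketch, with hypothetical column names throughout, that joins HR, sales, and quality data on employee and then clusters on the performance features to surface the segments worth amplifying, such as which recruiting channels produce strong on-the-job performers:

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Three departmental silos, each keyed by employee (all values illustrative).
hr = pd.DataFrame({
    "employee_id": [1, 2, 3, 4, 5, 6],
    "channel": ["referral", "job_board", "campus", "referral", "campus", "job_board"],
    "tenure_months": [30, 8, 14, 26, 16, 6],
})
sales = pd.DataFrame({
    "employee_id": [1, 2, 3, 4, 5, 6],
    "quota_attainment": [1.15, 0.70, 0.95, 1.20, 1.05, 0.65],
})
quality = pd.DataFrame({
    "employee_id": [1, 2, 3, 4, 5, 6],
    "defect_rate": [0.01, 0.06, 0.02, 0.01, 0.02, 0.07],
})

# One blended view across the silos.
blended = hr.merge(sales, on="employee_id").merge(quality, on="employee_id")

# Cluster on standardized performance features, then read the clusters back
# against the recruiting channel to see which segments to amplify.
features = StandardScaler().fit_transform(
    blended[["tenure_months", "quota_attainment", "defect_rate"]])
blended["segment"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(blended.groupby(["segment", "channel"]).size())
```

The blend itself uses only traditional joins; the machine learning layer (here, a simple k-means) comes after the cross-functional foundation exists, which is the ordering Andre describes.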
Curtis: And organizationally, is that usually some sort of analytical center of excellence that can bring all that together and help each department?
Andre: Absolutely. So what we see inevitably coming out of that is a data governance center of excellence, an analytics center of excellence. There’s a myriad of different structures, whether they’re centralized or distributed throughout an organization; that could be a separate conversation in and of itself. But we see companies recognizing that they need that structure to build on, and to have some discipline around, well, what are they doing with this, and how are they doing it, so that they truly get the value out of that investment.
Curtis: Before we close here, is there anything else you’d like to share with the audience that you feel is important?
Andre: You know, I think as companies look at really leveraging some of these new capabilities, and in technology over the years, whether it’s cloud or artificial intelligence, which I’ll pick on just because I’ve seen it come and go, there’s a lot of temptation to follow the latest trend, the newest thing to do. What’s really exciting these days is that the ability of the technology has caught up with a lot of the words. What people have described this technology as being able to do, a lot of that language has been very similar going back twenty or thirty years; the difference now is that the physical technology we have is able to execute on it. But to me, the most important thing companies can do as they undertake these journeys is not lose sight of the need for restraint, the need for discipline, the need for that process control.
Because again, there is no new technology that will solve all these problems. I would love to have, pick your cliche, the silver bullet, the magic bullet, whatever it is that solves these problems. So the application of that technology, done in a well-structured, well-governed manner, is far and away the biggest indicator of success. The companies that have that discipline, that control, that governance are almost always successful with these initiatives, and they reap rewards that are magnitudes greater than what they initially expected. Companies that don’t have that discipline may, in all other facets, be identical; but without that control, without that discipline, it’s very hard to succeed.
Contact Information
Email address: Andre.Wolosewicz@hexstream.com
Attributions
Music
“Loopster” Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 3.0 License
http://creativecommons.org/licenses/by/3.0/