Education and AI

For David Guralnick, education, AI, and cognitive psychology have always held promise. With many years of experience in this niche, David runs a company that designs education programs employing AI and machine learning for large companies, universities, and everything in between.

David Guralnick: Somehow, what’s happened in a lot of the uses of technology in education to this point is we’ve taken the mass education system that was there only to solve a scalability problem, not because it was the best educational method. So we’ve taken that and now we’ve scaled it even further online, because it’s easy to do and easy to track.

Ginette Methot: I’m Ginette,

Curtis Seare: and I’m Curtis,

Ginette: and you are listening to Data Crunch,

Curtis: a podcast about how applied data science, machine learning, and artificial intelligence are changing the world.

Ginette: Data Crunch is produced by the Data Crunch Corporation, an analytics training and consulting company.

Curtis: First off, I’d like to thank everyone who has taken the Tableau fundamentals zombie course that we announced last episode. We’ve been getting a lot of great feedback from you. It’s fun to see how people are enjoying the course, finding it fun and clear, and how it’s helping them learn the fundamentals of Tableau. The reason we made that course is that Tableau and data visualization are really important skills: they can help you get a better job, and they can help you add value to your organization. So we hope the course is helping people out. Also, based on the feedback we’ve received, we’ve made a couple of enhancements to the course, so there are now quizzes to test your knowledge.

There are quick tips with each of the videos to help you go a little further than what the videos teach. We’ve also included a way to earn badges and a certificate so that you can show off your skills to your employer or whoever. And we’ve thrown in a couple of other bonuses. One is our hundred-plus-page manual that we actually use to train at Fortune 500 companies, with screenshots, tutorials, and tips and tricks on the Tableau fundamentals. We have also included a checklist and a cheat sheet, both of which we actually use internally in our consulting practice to help us do good work. One of them will help you know which kind of chart to use in any given scenario you may encounter, whether that’s a bar chart, a scatter plot, or any number of other more advanced charts.

And the other is a checklist you can run down and say, “Do I have this, this, this, and this in my visualization before I present it to someone, to make sure it’s going to be a good experience?” So hopefully all of that adds up to something that will really help you, and something where you can learn Tableau and have fun doing it, saving the world from the zombie apocalypse. The price has risen a little bit since last time, but for our long-time listeners, if you use the code “podcastzombie,” without any spaces in the middle, that’ll take 25% off the list price that’s currently on the page. So hopefully more of you can take it and keep giving us feedback so we can keep improving it. We would love to hear from you.

Ginette: Now onto the show. Today we chat with David Guralnick, president and CEO of Kaleidoscope Learning.

David: I’ve had a long-time interest in both education and technology going way, way back. I was lucky enough to go to an elementary school outside of Washington, DC, called Green Acres School in Rockville, Maryland, which was very project-based. So it was non-traditional education: you worked on projects, you worked collaboratively with people, and your teacher’s role was almost as much advisor and mentor as traditional teacher. It wasn’t a person in front of the room talking at you; you really learned how to think creatively, pursue your own interests, and learn by doing. All of that stayed with me as I got older. I also developed an interest in technology from a really young age. I had my first computer at 13, at a time when people did not normally have a computer at 13, and through that I became interested in how computers could learn and what artificial intelligence meant.

And it was a field that was a bit of a mystery. As I was finishing college, I got interested in the work of an artificial intelligence professional named Roger Schank, who was at Yale. Roger was just then leaving Yale with some faculty to start an institute at Northwestern University that brought together cognitive psychology, computer science, AI, and education to apply artificial intelligence techniques to education. So I did my PhD in that program and ended up being asked to focus particularly on business problems in the corporate world and work with some corporate clients through Accenture, which was then Andersen Consulting. And that’s the work that continues to this day.

Curtis: Yeah, that’s great. Around what years were you doing your PhD, just so I get a sense?

David: My PhD started in ’89 and wrapped up in ’94, so late ’80s, early ’90s.

Curtis: Before the AI wave hit everything, right. You guys were working on this stuff on the cutting edge it sounds like.

David: Yeah, absolutely. We were considered a cutting-edge lab; we were written up in the early days of Wired magazine and all that kind of stuff. It was a really interesting place to be, with a tremendous group of people, some of whom I still work with to this day. We had people who were excellent writers, and people who were really cutting-edge thinkers in AI, in education, and in cognitive psychology, which sometimes gets left out of the cognitive science side, right? How do you think and learn? How do you understand what you’re experiencing? All of that goes into designing the experience.

So yeah, it was a really fascinating place to be, and it built on a lot of the principles I’d believed in from my formative years. It couldn’t have worked out any better.

Curtis: Yeah, that’s awesome. Now, you’ve seen this whole progression of AI and machine learning. What’s your perspective on that, since you’ve lived this entire cycle now?

David: Yeah, I’ve lived a few cycles. When I first started doing it, it was almost the dying days of AI at one point, right? We were doing really interesting things, I think, in applying it to education, but as a field, AI was considered a failure. The years after my PhD were mostly what’s considered the AI winter; the field had had high hopes that just didn’t pan out.

We expected to be in a Jetsons-like world, and we’re not. What happened? Now I’ve seen the renaissance, and the renaissance has certainly been interesting to see. There’s obviously a lot more computing power now, which has helped. There’s a lot more public interest in and understanding of what AI could be, and that’s probably more good than bad, though sometimes it’s a little scary. We’re also in danger of being over-hyped once again, and I think that’s the thing we look at. I’ll talk to people sometimes about what’s possible, what kinds of conversations online systems can have with people, and there’s usually an overstatement of what the reality is. So I think that’s something to be cautious of as we move forward: keep thinking about where AI techniques and machine learning, which to me, as a traditionalist, is a subset of AI, can fit in, and don’t overstate, and don’t necessarily feel like the goal has to be a fully functional human replacement. I don’t know that that’s a societal goal for a lot of reasons, but even in terms of technology, it’s not clear that that’s what we need. And in particular, in the world of education, it’s not clear that that’s what we would want.

Curtis: Right. Now, can we talk a little bit about cognitive psychology and the angle that takes in your work? That’s not a topic we hit very often on this show, but I think it’s really interesting as it applies to the work you’re doing.

David: Yeah, absolutely. To me, it’s always been a critical part of what we do. You’re not just putting technology out there; you’re looking at technology that, on one side, might mirror some human thought processes. That’s part of what we were doing back in my old research lab at Northwestern: thinking about how technology could reflect human thought processes. But then on the end-user side, the more practical side, we need to develop technology experiences that really do help people accomplish their goals, whether those are educational goals or otherwise. In order to do that, we need an understanding of how people think, how they learn, how they process information, how they acquire skills. Some of that borders on education research, but a lot of it is the cognitive side, and to me it really is all interdisciplinary, right? You understand what you can do with technology, how people think and learn, and how they can use it, plus some of the educational principles, learning by doing and other things. All of that goes together to produce the technology and education that I think would be ideal.

Curtis: Great. Yeah. Maybe to get a little bit more concrete here, so the company you have now, Kaleidoscope Learning, what are you guys doing? What are you guys seeing as being effective to help people learn and how are you applying AI to that?

David: Yeah. At Kaleidoscope Learning, we work with a mix of clients: big corporate clients, smaller nonprofits, occasionally universities, and everything in between. Our overall goal is to produce learning technology that helps people really improve performance. The business world is most of what we do; for the universities, obviously, it’s more traditional education. But particularly when you’re looking at businesses and job performance, we’re really trying to improve skills. So it’s not about memorization, it’s not about whether you know certain facts, but about what you’re going to be able to do. To do that, from the educational side, we focus very heavily on learning by doing: put people in a situation where they can make decisions that are complex. So you design an experience where they have to make decisions.

Even something simple. If it’s a customer service training system, you’re putting them in a situation where you’ve got real customers coming up in video, and potentially, though we haven’t rolled anything out for a client yet, in virtual reality. You’ve got a customer coming up to you: what do you do from that point? What do you say? Maybe the customer is angry; how do you calm the customer? Maybe the customer is unhappy about something; how do you solve the problem? So that’s not just memorizing the principles, it’s applying them. Learning by doing is one of the core features there. On the AI side, we look to collect data on what people are doing and use it to assess strengths and weaknesses and where they might go next. Staying with the customer service example, if you’re having particular problems with understanding company policy, then we might send you to another customer who has a particularly difficult policy issue, and you have to deal with it.

Or, depending on what we know, maybe an easier policy issue until you can get better at it; it depends on a lot of factors. Within the experience, we also want to use what I would call AI techniques to customize the experience. So if you are within a learn-by-doing simulation, for example, which is the model I was talking about here, we have a lot of data, and we can use it to customize not just which customer you deal with next, but what happens within the situation. We can use it to help determine how the situation plays out, and we can particularly determine what kinds of feedback, I wouldn’t quite use the word tutoring, but guidance, you get from the system. So there might be a coaching component that says, maybe you need to think a little bit about how you might handle the situation, think a little bit about how you would calm somebody. You can create a little Socratic dialogue with the learner. You can choose certain video clips to come up, based on data, that are appropriate for the situation. It’s a more sophisticated model than simply, “here’s the step you took, here’s what happens.” So those are a couple of ways we’re using AI techniques behind the scenes, and I think it should only become more and more common as time goes on.
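As a rough illustration of the adaptive sequencing David describes, tracking where a learner is struggling and routing them to an easier or harder scenario, here is a minimal sketch. The skill names, scenarios, and the 50% error threshold are all hypothetical, not from any actual Kaleidoscope system:

```python
# Minimal sketch of rule-based adaptive scenario selection:
# track per-skill correct/incorrect counts, then route the learner
# to an easier or harder scenario for that skill.
from dataclasses import dataclass, field

@dataclass
class LearnerModel:
    attempts: dict = field(default_factory=dict)  # skill -> (right, wrong)

    def record(self, skill, correct):
        right, wrong = self.attempts.get(skill, (0, 0))
        self.attempts[skill] = (right + correct, wrong + (not correct))

    def error_rate(self, skill):
        right, wrong = self.attempts.get(skill, (0, 0))
        total = right + wrong
        return wrong / total if total else 0.0

# Hypothetical scenario catalog, tagged by skill and difficulty.
SCENARIOS = [
    {"name": "simple_refund", "skill": "company_policy", "difficulty": 1},
    {"name": "policy_edge_case", "skill": "company_policy", "difficulty": 3},
    {"name": "angry_customer", "skill": "de_escalation", "difficulty": 2},
]

def next_scenario(model, skill):
    """Pick an easier scenario while the learner is struggling with
    a skill, a harder one once they are doing well."""
    pool = [s for s in SCENARIOS if s["skill"] == skill]
    struggling = model.error_rate(skill) > 0.5
    pick = min if struggling else max
    return pick(pool, key=lambda s: s["difficulty"])
```

A real system would weigh many more factors, as David notes, but the core loop, record decisions, estimate per-skill strength, choose the next situation, is this simple in outline.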

Curtis: That’s interesting. Can you talk to us a little bit about how you approach the design of a system like that? Is it more about taking someone who understands the curriculum and mapping out the different scenarios and what should happen in certain situations, so more rules-based, with AI applied on top to make it more intelligent? What does the design approach look like?

David: A little bit, but it starts off more open-ended. It starts with what we call a content collection phase, a content analysis phase, where our team sits down with the subject matter experts with the goal of understanding what people really need to do. A lot of times subject matter experts are accustomed to thinking in terms of principles, in terms of “here are the things we need to teach,” and we want to flip that on its head and think about what it is people need to be able to do, and therefore what they will need to experience in order to do that. So you start with open-ended questions about real-life situations and start to understand what really happens in this job. You’re really trying to get a feel for the job.

What are the situations people encounter? What are the situations that are particularly difficult? What are the ones everybody should be able to handle? What are the common mistakes people make, and the common misconceptions people have? Within that, you also get a lot of real-life stories, and so you get real experiences, whether it’s something on the simpler side, like customer service, or something complicated, such as interviewing legal clients as a pro bono attorney, which is another one we worked on recently that I can speak to a little bit. In any and all of these cases, you start by gathering information, and from that our team will generally put together something more along the lines of what you’re describing: a content document that structures the content in terms of what skills we need to cover and what we used to call teaching points.

What is it that we really want people to be able to do when they’re done with this? From that, we can go ahead and build the scenarios. You take the key things you want people to be able to do, and then, based on, among other things, the stories you’ve heard from the subject matter experts and ideas that come out of your head as a creative person, you design scenarios that cover those goals and address those skills. That’s the short version of how, in our process, you get to the structure of a course. And it’s not all learning by doing; what’s interspersed in there depends on the course and the audience. The initial phase also involves a lot of understanding the audience and what they need.

There are other decisions about everything from overall length to the lengths of different segments. We love putting you in an immersive simulation, but not all jobs really allow that. Sometimes it’s really nice if you have an hour to get into something in a lot of depth, but sometimes people are doing it at work and that’s just not going to happen, and then you have to design around that as well. But from the content to the technology and AI side, it really comes from the early interviews and the stories and builds from there.

Curtis: That’s a great process, and focusing on what people need to do, I think, is really strong. So once you get there and you’ve designed this document, how do you then apply the machine learning or the AI to what you’ve built? What I’m trying to get at is helping people understand the process of applying these principles to the product you’ve created.

David: Yeah, part of that early design phase is designing the overall experience. Then within that, you’re thinking, what can I do? If I want someone to become an expert at a certain type of conversation, you may first design a little module that’s going to help people have that conversation, and think about what it’s going to look like. What’s going to be on the screen? Are there choices you can make? Or, a little more open-ended, are there scenes that you analyze? All of those are tools and techniques we might use. Once you have that part of the experience sketched out, and we usually work with the graphic artists to sketch out the screens, that’s usually the time to get down to the level of technical detail.

Okay, we really want this particular thing to be customized; we really want it recommending next steps based on past experience in the program; here’s what we want. That’s the point that’s more of a partnership between the online instructional designers, who are at least very aware of technology, and the technical people who can really put it into action. From our team’s standpoint, that’s where there’s the strongest partnership between instructional design and technology: what kind of algorithms do we need to create the experience we want? And we’ll have enough of a framework at that point that everybody understands where we are and where we need to go.

Curtis: Is it fair to say, then, that conceptually this is something of a recommendation engine for learning? Much like Netflix recommends what you may want to watch next, this recommends the next step you need to take to get to the understanding you need to complete this action. Is that fair?

David: I think that is one of several possibilities, one of several things we can do, and that’s absolutely one of them. Another might be the kind of guidance you get from an online teaching component. It may just be in text: text will pop up. Back to customer service: you have a customer, you say the wrong thing, and you get something coming up. It could be text that says, “Hey, this isn’t really the right thing, because you’re violating the core principle of always calm the customer first, even if you think the customer’s wrong. You violated that principle; let’s talk about why.” And then it could even branch to a little dialogue that might help you understand: “Okay, what do you think you could have done better?” Those are harder to build, right? You’re in a really tightly scoped situation from a technology standpoint right now, though the vision down the line is that that’s going to get a lot easier. But those are both other ways we can use this data and technology. So it’s not always as simple as just “what’s next”; it’s actually affecting your experience at a much more granular level.
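The principle-violation feedback David describes is, at its simplest, a set of rules pairing a condition on the learner’s action with a coaching prompt. A minimal sketch, with an illustrative principle and entirely made-up action fields, might look like this:

```python
# Sketch of rule-triggered coaching: each rule pairs a predicate on the
# learner's action with the principle it violates; the coach returns the
# prompt for the first violated principle, or None if the action is fine.
PRINCIPLES = {
    "calm_first": "Always calm the customer first, even if you think "
                  "the customer is wrong.",
}

COACHING_RULES = [
    # (predicate on the learner's chosen action, violated principle key)
    (lambda a: a.get("argued_back") and not a.get("acknowledged_feelings"),
     "calm_first"),
]

def coach(action):
    """Return coaching feedback for the first violated principle, or None."""
    for predicate, principle in COACHING_RULES:
        if predicate(action):
            return ("This isn't quite right: you violated the principle "
                    f"'{PRINCIPLES[principle]}' Let's talk about why. "
                    "What do you think you could have done better?")
    return None
```

The branch into a Socratic dialogue would hang off the same rule: each violated principle can carry its own follow-up question tree.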

Curtis: Okay. That’s really cool. I’m glad we clarified that. So that’s even getting into some natural language processing, understanding what they’re saying, how to respond to them, these kinds of things.

David: Exactly. And that, back to my comment earlier about hype: within really tightly scoped situations, we can do some things that are interesting. You do have to be really careful, because if you’re having a simulated conversation like that with someone and it doesn’t feel real, if they say something out of bounds or unexpected and the system doesn’t respond, you’ll lose a lot of credibility. So we’ve really tried to scope that tightly and use it in very specific situations. We did one in the world of accounting that I think worked pretty well, partly because the task was a little more clear, they’re solving an accounting problem, and the scope was really confined. I absolutely see that getting better over time, although it has been 30 years of us saying it’s getting better over time; now it really is getting better.
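One common way to keep a simulated conversation “tightly scoped,” in the spirit of what David describes, is to match the learner’s free-text reply against a small set of expected intents and fall back gracefully when nothing matches, rather than respond incorrectly. This is a generic keyword-matching sketch, not Kaleidoscope’s actual technique; the intents and keywords are hypothetical:

```python
# Tightly scoped intent matching with an explicit out-of-scope fallback.
# Real systems use richer NLP, but the scoping idea is the same: only
# answer when the reply clearly matches something the system expects.
INTENTS = {
    "apologize": {"sorry", "apologize", "apologies"},
    "offer_refund": {"refund", "money", "reimburse"},
}

def classify(reply):
    """Return the best-matching intent, or None if the reply is out of
    scope (so the system can deflect instead of answering wrongly)."""
    words = set(reply.lower().split())
    scores = {intent: len(words & kws) for intent, kws in INTENTS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None
```

The `None` branch is the important part: it is the system admitting the reply is out of bounds, which is what preserves credibility.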

Curtis: Right, yeah. So it’s still true that machine learning is good at very, very narrow applications, and it’s not anywhere near general.

David: No, absolutely. And that’s where I feel like, I really like that we can do things like this, but I also don’t want to over-hype it and leave the impression that this is having a freeform conversation the way a coach or a teacher would, because it’s not at all doing that yet.

Curtis: Right, yeah. Well, I appreciate that; the whole point of this conversation is to help people understand the possibility space and the limits, all of those things. So that’s great. But with that said, you have been able to leverage it, and you have seen gains from it. If you were to characterize the gains you’ve seen because of the machine learning techniques you’re leveraging, how would you say they’re affecting your business and the students who are learning?

David: Yeah, that’s a really good question. We have not had a chance to quantify it; it’s always a challenge in our business to do that, and to get clients to give you the time to study everything as much as you’d want. We’ve certainly seen improvements. With one of the customer service clients I mentioned, a major retail chain, we saw that their customer service scores as a company went up significantly. There’s always a bit of a credit assignment problem: this rolled out over a period of months, there were other things happening, it was part of a big initiative, so we can’t take all the credit, but it certainly seemed to be a factor. We also have really nice anecdotal evidence that people enjoy the process, enjoy the learning experience, which is often not true in corporate training.

So you get something out of that: a perhaps harder-to-quantify but, I think, significant morale boost, for lack of a better term. People would find themselves talking about the experience they had because they thought it was pretty cool, and that gave them a better impression of the company they work for and made them feel good about their jobs. And you’re paying a lot more attention when you’re learning by doing, when you’re emotionally connected to the experience, as opposed to trying to click through a page and answer the questions so you can get done and move on to something you find more interesting. So those are definitely a couple of ways we’ve been able to see, at least qualitatively, that these methods have been successful.

Curtis: Yeah, that’s great. I’m just curious, do you have maybe a favorite example or case you’ve done that you can share? Just tell us what the program was and what the result was.

David: Let me think; a few come to mind. Certainly the customer service one I just mentioned, which had really vivid characters as well as the technology behind it. The legal education one I mentioned before is another that comes to mind. It keeps coming up as one people still talk about, partly, again, because of the dramatic situations that are realistic to the people we’re evaluating, but also because it has several different pieces of technology behind it. One of them helps you understand where to go next in the program, what might help you. The other is that it has discussion forums, so you’re interacting with other people, and there’s technology behind that which points out not just the basics, like “Hey, somebody responded to your post, you may want to take a look,” though it does do that, but also looks for other posts that are conceptually similar to ones you’ve been active in. That’s a helpful thing as well: trying to use what the system knows about you, in a noninvasive way, to recommend things that might bring about more participation and get people more interested in the non-required parts of the program.
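Finding “conceptually similar” posts is classically done by comparing posts as word vectors; a real system would likely use TF-IDF weighting or embeddings, but a plain bag-of-words cosine similarity shows the idea. The post texts and the 0.3 threshold here are illustrative only:

```python
# Sketch of forum-post recommendation by bag-of-words cosine similarity:
# posts whose word overlap with the learner's active post is high enough
# are surfaced, most similar first.
import math
from collections import Counter

def cosine(a, b):
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(ca[w] * cb[w] for w in ca)
    norm = (math.sqrt(sum(v * v for v in ca.values())) *
            math.sqrt(sum(v * v for v in cb.values())))
    return dot / norm if norm else 0.0

def similar_posts(active_post, all_posts, threshold=0.3):
    """Recommend other posts whose similarity to the learner's active
    post exceeds a threshold, most similar first."""
    scored = [(cosine(active_post, p), p)
              for p in all_posts if p != active_post]
    return [p for score, p in sorted(scored, reverse=True)
            if score >= threshold]
```

The noninvasive quality David mentions comes from the inputs: the system only uses the learner’s own public posts, not anything outside the program.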

Curtis: Got it. Yeah, that’s really cool. And that brings up the point that there are a lot of places you could take this, a lot of things you could do with it. So I’m curious: where do you see this going in the next five or ten years? Where do you hope to take all of this?

David: Yeah, I think we’re at a great point, and I hope it’s only going to get better. The advent of other types of technologies is really going to help us. I mentioned virtual reality briefly before; I think that’s one, and we’re starting to see more and to experiment more. Some of the things we’ve done, even in a simulation model on a computer where you’re staring at a screen, have been very successful, but when you’re in a more immersive environment, I think it’s even more successful, right? You’re really fully immersed in whatever it is you’re doing, whether it’s fixing an air conditioner, helping a customer, or working with a client. Augmented reality as well.

You have the chance to give people just-in-time information, things that layer over the real world, and bring that all together. If you have an intelligent coaching slash tutoring slash teaching component behind all of this, it gets a lot easier to imagine a very interesting educational world. There’s a keynote talk I give where I lay out a bit of this vision of how all of these technologies might work together. The 30-second version of one example: you start off as a student studying ancient Greece, in kind of a simulated world, with an intelligent coaching component that helps guide you toward what might be interesting and why, and can answer some of the questions you have.

But then maybe you take a trip to Greece, modern-day Greece, where there’s a lot to see from ancient Greece. While you’re there, the coaching component is aware of exactly where you are. It can refer you to the things you learned before. It can connect you with other people who might have interesting things to say. It can suggest interesting activities: hey, while you’re exploring the Parthenon, here’s something you might want to do. All of that can happen with an intelligent guiding component behind you, one that’s aware of your own experiences, at least within the program, and your own interests, and can make use of them. That’s a very short version, but those are things that are either not far away or not technically impossible. There’s certainly effort required to get there, but they don’t seem outside the realm of possibility at all in the next five or ten years.

Curtis: That’s really cool to think about: where education is going to go, and all the really interesting things we can do with technology. Just one other question for you. This may be a little off topic, but I’m curious, from your perspective as an educator who’s been working in the corporate education world for a while: education is changing. We have the university model, and now all of a sudden education in certain areas is becoming cheaper and more ubiquitous, all these things. What do you think education will look like?

David: Yeah, that’s a really good question. A couple of things come to mind. One is, I would imagine education becoming a little more segmented, where it’s easier to acquire skills when you need them. So the question is how this is going to connect directly to the job market without someone needing to spend quite as much time, and often money.

And obviously, some of this will depend on how things evolve, whether college costs continue as they are or end up changing over time. But right now it’s expensive, it’s often out of reach for people, and people have a lot of difficulty getting some of the skills they need for jobs they otherwise absolutely have the capabilities for. I think there’s an opportunity there, and that’s one area where technology can help, because what technology does so well is scale up: you build an AI-based component to help people learn something better, and it can handle a million people as well as it can handle one person. I think that’s the opportunity we really have. As far as overall education, there’s also been a fair amount of movement toward being able to more easily take tests online, that kind of thing.

And that, to me, doesn’t have a tremendous amount of value, right? That’s the easy way to scale something. What we really want to look at is how you can take the experiences and have those scale up, and that gets me back to where technology can help. I can imagine an educational world that is much more technology-based, but in ways that are both individual and collaborative, that put people in a position to learn things they find interesting and can connect with emotionally, and that help them become better prepared for jobs, in both the current and future worlds. All of that, I think, is related to ways in which technology can really help us.

If you think back on the history of education, here’s a really quick summary. In the really old days, we had apprenticeships, right? People learned by apprenticing. They’d work with a quote-unquote master, they’d work together, they would learn by doing, they would be coached, they would have actual experiences, they would learn skills, they would learn by practicing: all the things we believe in, in modern education, and all the things we believed in at the time. What really happened over time was that when education had to scale up to accommodate masses of workers, we ended up with classrooms, because there was no other way: you didn’t have enough experts to work individually with everybody anymore, so you needed mass education systems. And somehow, what’s happened in a lot of the uses of technology in education to this point is we’ve taken the mass education system that was there only to solve a scalability problem.

Not because it was the best educational method, this person in front of the room with a bunch of people listening, taking tests, memorizing facts. We’ve taken that and scaled it even further online, because it’s easy to do and easy to track. What I want to do is go the opposite way: go back to the learning by doing and the apprenticing and all of that, in a different sense, because we can do it with technology. Technology does let you scale up: you build something once and it can support millions, and that’s as true of a great experience as it is of a test. That’s what I’d like to see more of, and that’s why I do think we will see more of it.

Curtis: We would like to give a big thanks to David Guralnick for being on the show, and as always, a huge thank-you to you, our listeners. It means a lot to us that you keep coming back month after month and listening to our episodes. If you’ve been listening for a while and you’ve found value in the shows, whether it’s helped you in your work or your career, or just in understanding data and opening your mind a little bit, it would be great if you could leave us a review. Take a couple of seconds and review us on iTunes or Google Play or wherever you get your podcasts. It helps us out a lot and also helps other people find the show so they can learn about data too.

Ginette: As always, for our transcript and links to attributions, head to datacrunchcorp.com/podcast, and we’ll see you next time.