Without a doubt, the digital revolution has brought about a plethora of world-changing technologies that have significantly improved everyday lives and enhanced business performance and healthcare across the globe. During the recent Covid-19 pandemic, the world witnessed the power and agility of digital technology and the benefits it can bring to society. Yet, on the flip side, many of its darker implications have become hard to ignore in recent times. As the foothold of modern technology continues to expand, it is becoming a matter of urgency that all users – from politicians to business leaders to ordinary people – are aware of its drawbacks. It goes without saying that technology is here to stay, but if it is to remain a force for good, we can no longer turn a blind eye to its downsides. How can we make the future secure? Who needs to act? Have we created a tech monster? And can the power of technology be curtailed?
One thought leader and tech evangelist addressing these issues is author and futurist Ben Pring. He joins ISG hosts Steve Hall and Karen Collyer for this episode of the Imagine Your Future podcast to discuss his recent book Monster: A Tough Love Letter on Taming the Machines that Rule our Jobs, Lives, and Future (Wiley, 2021). Together they debate some of the tribulations that have arisen from the use of digital technology, and how we can all step up to counteract the drawbacks.
Transcript
Steve Hall
Welcome to the Imagine Your Future podcast, I'm Steve Hall, your host for this episode. And with me I have my favorite co-host and fellow book nerd Karen Collyer. I've been working with technology for over 30 years, 15 of those years with ISG, where I've supported some of the largest enterprises in the world on their digital journeys. Our aim with this podcast is to inform, educate, and engage our listeners on technology and digital-related topics. We're lucky to have a great lineup of senior execs and thought leaders from various industries, so please subscribe and stay tuned to our socials for great content. And as always, I'm pleased to be sharing the mic with my friend and colleague Karen Collyer. Karen, why don't you introduce yourself?
Karen Collyer
Hey Steve. I've been in the tech business for about 25 years and have held a variety of client-focused roles, from global product management to program management to software design and implementation. I'm hoping to bring all of that experience to bear today as we talk with the author of Monster: A Tough Love Letter on Taming the Machines that Rule our Jobs, Lives, and Future. Welcome, Ben Pring. So great to have you on the podcast.
Ben Pring
Thank you guys. Great to be with you.
Karen Collyer
If Ben's name sounds familiar, there's a pretty good chance that you have read either Code Halos or What to Do When Machines Do Everything. Both of these books, co-authored with Paul Roehrig and Malcolm Frank, are on the ISG digital reading list. So, let's start today off with a little bit of background. Ben, if you were talking to a roomful of Millennials or Gen Xers, what would you tell them in terms of your journey? What were your stepping stones from the beginning of your IT career to becoming a global IT thought leader, author, and futurist?
Ben Pring
Gosh, that's a great question to start, Karen. I think my journey has been one where I've put my surfboard on a big wave. That's the way I think about it. I was lucky enough when I was graduating to come into a world that was becoming more computerized. Personal computers were beginning to show up in businesses, in the workforce. So, I'm talking here about the mid '80s. And although my degree wasn't in computer science, it was actually in philosophy, I'd had from an early age an interest in science fiction. I was a big Trekkie, and I always loved that intersection of ideas, philosophy, science, and technology. And that was the beginning of a big wave. If you fast forward 35 or 40 years, we've seen that wave just gather momentum, and it's still gathering momentum, and I was lucky enough, smart enough, a combination of both, to realize this was going to be a big deal. And so, I started working in the tech industry, initially as a kind of tea boy, and then found my way into a couple of big consulting companies: Coopers & Lybrand back then, part of PwC now. Eventually, I found my way into Gartner, the IT analyst firm, and started writing about what I thought were interesting trends. And luckily enough, I spotted the emergence of the Cloud, Cloud Computing. I moved to San Francisco to write about that more closely, and I met a guy writing code in his apartment who was just starting up a company. I wrote a piece about his company; that was Marc Benioff. So, I started writing about the Cloud, the early days of software as a service, always with this sort of curiosity about what was going to happen next, what was the next disruption. And that then brought me into Cognizant and up to the present day, where I continue to follow that curiosity, continue to try and paddle as furiously as I can on the wave, and continue to be focused on what comes next.
And I think if you reverse engineer that, or if you try and extract the essence out of it over a 35- or 40-year career, it's really about being open to, interested in, and in fact actively pursuing what's next, rather than being afraid of the future, which I think the majority of people are. It's wanting to embrace that future. If you have that sense of curiosity and you're energized by it, then this is probably the best time in human existence to be alive, because there is so much disruption, so much of it is technologically driven, and so much is changing. For young people coming into the workforce today, there's huge change ahead, huge disruption ahead, huge new exciting frontiers that we're all going to pursue.
Steve Hall
I think, Ben, you and I are of a similar generation, and I certainly remember all the Star Treks as well and was an equal geek on that. Watching a young Marc Benioff code in his apartment, though, has got to be a great story in itself. For those that don't know, Marc has clearly built a huge business under the name of Salesforce.com. So, I think he's been a bit successful there. His success has been quite something to watch.
Ben Pring
Yeah, I know it’s an amazing story. He's one of the giants of the industry, obviously in my lifetime, and it's been amazing watching his journey and his progress. Very exciting!
Steve Hall
That's good. Well, Ben, let me start right off and say congratulations on the book. As we started off saying, it was really a great read, a really entertaining read. Not only was it a good read, but you guys use some great techniques to really draw the reader into the story. I felt like we were collaboratively problem solving throughout the whole read, which is rare for a book of this nature. So, really well done! A little Plato and Socrates, I think, were embedded in the whole book. Your definition of the monster is really intriguing: it's a hackable, vulnerable, wealth-concentrating, civics-destroying monster, growing faster than a beanstalk and set to create many wonderful things, but also many terrible things. I love that description of what you call the monster and how it sets the stage. And I think there were really two themes that you carried through it. First, you start with the diagnosis, the problem: our love of technology has created this monster. It's a highly addictive monster that's consuming our lives; you compare it to any other addictive behavior, and I think you call it digital fentanyl throughout the book. And then you give us a bit of a prescription, what you guys call the manifesto, for how to tame it. You don't really suggest a cure, but for our listeners, can you go through the first part of the book, define the problem and some of the conditions, and we'll just have a bit of discussion on it.
Ben Pring
Sure, thank you! Really appreciate those nice words. Building on what I said in my introduction, the first line of the book is "we love technology." Paul Roehrig, my co-author, and I have very similar stories; we both love technology, we've been in technology our whole lives, so the idea of talking about technology in a seemingly negative way is somewhat counterintuitive. But it's because we love technology that we're concerned about what's going on. And rather than run away from the issue, run away from these challenges, we think it behooves people like us, and you Steve, and you Karen, and anybody listening to this who works in technology and believes in the positive power of technology, to step up and grapple with these issues, which are probably no surprise to anyone. Rather than ignoring them, it's important that we address them. And just to summarize those; again, I don't think they're going to be any surprise to people nowadays. When we started writing this book, three or four years ago, perhaps people didn't fully have the sense that we do now. But you know social media; the bullying that's going on in social media, the hate speech. Steve, we were just talking before we came online about the reaction after England lost in the Euro finals; the racism, the hatred that comes out, the trolling, the bullying that comes out on social media. The fact that kids are addicted to selfies on social media, and there's all this body dysmorphia, and suicide rates are going off the charts in most societies where kids have completely grown up with this and are completely addicted to it. The disinformation wars that are going on. The deep fake wars going on. The cyber insecurity that we're all feeling, as nation-states and the proverbial 400-pound guy on the sofa hack our lives to bits.
I mean, these are real issues that are destabilizing different aspects of society, individually and societally. They boil down to the fact that this technology, developed in relatively recent times, is so powerful that it's at the center of every aspect of our lives and of our work. And every aspect of the discussion about the future now really centers around this monster. And there's no one single technology that we're saying is a monster. We're not saying social media is a monster. We're not singling out anything individually, but if you step back from what's going on, you can see that the downside of it is monstrous. That's the monster that we're talking about, and that's really the framework. That's the context for thinking about it; well, nobody's going to turn this stuff off. It would be ridiculous to turn this stuff off, because we all know the good news story. Particularly in the pandemic, the fact that we've been able to continue to work, and we've been able to continue to connect with our families and our friends on these platforms and tools. So, there's a great good news story in here. But unless we deal with the downsides and the dark side of this, our fear, Paul's fear, my fear, is that this dark side is going to wash over the good stuff, and it's going to wash it away.
Steve Hall
Well, you guys spent a lot of time in the book talking about how addictive it is. And the thing that I found so fascinating, some of it I knew, some of it I didn't, is how intentionally addictive it is. Organizations are using analytics, psychological games, gamification; so many different tools to drive that level of addiction. You talk about the selfies; I think we've got a very narcissistic society growing because of that. And Cognizant, I think it was probably seven years ago, coined the acronym SMAC, talking about social, mobile, analytics, and cloud, and bringing everything together. The amount of power that we have on our Apple or Android devices now, to click nonstop and do everything on that one device. And again, we don't want to say that the device is the monster, but all of the apps, all of the things we do that make it so easy now and so time consuming; it was amazing how you guys described it, and how many aspects of our lives are completely controlled by it.
Ben Pring
Yeah, and I think there's a little bit of a mea culpa, to be honest, even in writing this book and framing it the way we did. Because when we wrote that book that you mentioned earlier on, Code Halos, which came out in 2014, and we started writing in 2011 or 2012, so 10 years ago; I don't think many people at that stage really had a sense of what was going on and the notion of the digital fingerprint. This idea that every interaction we were having was creating data, and that there were companies that were mining that data and making sense of that data. I think that idea, as it dawned on people; certainly the way we framed it in the book; was that it was an incredibly powerful business tool. And I think we've been right in saying that. That is at the heart of the story of the FANG vendors. It's at the heart of the story of the incredible explosion of wealth that people in that world have enjoyed. But I don't think that we fully appreciated, in writing that book, just how dark this dark side was going to be. We sort of touched on it. We had a chapter in that book called "Don't Be Evil 2.0"; you know, how to avoid getting caught up in the dark side of this. We didn't fully appreciate how powerful this was going to be. And as I said, there's an element of mea culpa there, because I think if we had rolled the tape back and said, "wow, this is what's going on, let's play out these scenarios," we would have focused more on these issues as they were being generated, rather than focusing more on the commercial power of it. And I'm sure you guys know the book by Shoshana Zuboff, The Age of Surveillance Capitalism. Shoshana Zuboff is an ex-professor at Harvard Business School. I'm not sure if you guys have talked to her, but that's a very powerful and very important book, I think.
And reading that a few years ago was quite enlightening to us as authors in, as you say Steve, really drilling into, digging into the detail of how the algorithms have been designed, developed, and engineered to, as we put it in the book, seek heat, and then double down when they find heat, and then in essence throw nitroglycerin on that heat to further inflame it. When we look at that now and what's gone on in the last 15 years, that's really the root of the issue we're talking about. Because that is so powerful; that's at the heart of what is destabilizing the world. And again, we're not saying that Facebook or Google or anybody who has done that have acted in bad faith. We're not saying that they were wrong in doing that. They are incredibly genius people who've engineered this incredible modern technology. But, again, if we ignore the downside of it, I think that's going to put us in a pretty bad spot, frankly.
Steve Hall
Yeah, no, I think you're spot on, Ben. There are three angles that we should explore. There's certainly the social angle that we're talking about now, and what it means to society. At the same time that The Age of Surveillance Capitalism came out, Code Halos came out, and a series of books around that, there was also the book on the platform revolution. And it essentially gave the roadmap for every company to go out and create their own network effects, monetize their data, and use these technologies to accelerate their own business cases. And so that became a mantra at the G2000, at the board level: create platforms to further monetize, and, I guess I'll say, fuel the monster, if you will. Second, you guys really talk about trust. When you try to monetize data, when you think about your monetization strategy, you have to think about the trust you have and the trust you're building with your clients, your customers, your employees, your colleagues, all the way through the value chain. And then third, I think, is the governmental aspect of this, and I know you talked a lot about regulation, but one of the things I think you guys explored probably better than anybody I've read in a long time is your concept of mutually assured digital destruction. The fact that, in one way, we're probably already at war digitally with nation-states, as you mentioned. We saw the Colonial Pipeline and JBS ransomware attacks just in the last several weeks. President Biden and President Putin have already met twice, on what I understand to be two really intense calls, regarding cybersecurity and ransomware. Any perspective from your side on how this all plays out across those three big underlying issues? But even more specifically, on the nation-state, cybersecurity, and ransomware aspects of it?
Ben Pring
Yeah, well, again it comes back to this question; you and Karen, and again us, and anybody listening to this who works in the tech industry; we have a vested interest in making this future secure. And we all know, if we're honest, and you don't have to be that controversial to say this nowadays, that the future is incredibly insecure, because this foundation of digitalization that we've built is not on rebar, it's on shifting sand, frankly. It's amazing what's going on, really. And as Putin said, the country that controls AI is going to control the world. So, the stakes for this couldn't be any higher. The stakes for creating secure foundations couldn't be any higher. Again, we've sort of stumbled into this in the last generation in a pretty clumsy, ad hoc, ill-thought-through way; from a programmatic perspective, from the design of systems, from thinking about the broader security aspects of maintaining the cloud as this foundation for the future. With the perspective I have as a cloud analyst of many years, a cloud evangelist: if the cloud is going to continue to scale, which people believe is the case, its cybersecurity aspects are going to have to be strengthened immeasurably at a business level. You see this, I'm sure, in your day-to-day work; it's always confused me, it's surprised me, that a multibillion-dollar market capitalization rests on an extremely minuscule amount of spending on cybersecurity in a company. If you reverse engineer the logic: a company worth $100 billion in market cap maybe has revenues of $30 billion; maybe its IT spending is 7% of that $30 billion; and then its cybersecurity budget, as a part of the IT budget, is probably 5%. So, you've got 5% of 7% of $30 billion of $100 billion. If you do that math, it just doesn't make sense to me. We've been talking about this for a long time; companies need to spend a lot more on this, and they need to take it much more seriously.
And as Aaron Levie, CEO of Box, said: if you want a job for the next five years, work in IT, but if you want a job for life, work in cybersecurity. I think this is a moment in which, as you say, geopolitical actors are taking this much more seriously. I think the notion of playing offense, rather than just defense, is the way forward. I think the Western countries have been slow to realize the scale of the game. And you can draw the circle back to where you started in your question, Steve: this is geopolitical, but it's personal as well. It's what's happening at an individual level, in the way that the systems, the algorithms, are fracturing us and splitting us apart. A few years ago, I was at a big conference, and I talked to a very senior person in the three-letter-acronym business. I won't say who, but he said to me that the model of these cyber-attacks, the model of these bad actors if you like, is simply to drip water into cracks. That was the phrase he used. If you think about the metaphor of the castle on the hill, there are bad actors who are dripping water into the cracks of our castle, and that's how castles fall down. You don't need to put a nuclear bomb into that castle. You don't need to fly an F-16 over the castle and drop a bomb. You just drip water into that castle, and over time it'll crack and fall. And that's kind of what's happening at the moment.
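For listeners who want to follow Ben's back-of-the-envelope budget arithmetic from a moment ago, it can be checked in a few lines. The figures below are the illustrative ones from the conversation, not real company data:

```python
# Ben's illustrative cybersecurity spending math: 5% of 7% of $30B revenue,
# set against a $100B market capitalization.
market_cap = 100e9               # $100 billion market cap
revenue = 30e9                   # ~$30 billion in revenue
it_budget = 0.07 * revenue       # IT spend at ~7% of revenue
cyber_budget = 0.05 * it_budget  # cybersecurity at ~5% of the IT budget

print(f"IT budget:    ${it_budget / 1e9:.1f}B")
print(f"Cyber budget: ${cyber_budget / 1e6:.0f}M")
print(f"Cyber spend as a share of market cap: {cyber_budget / market_cap:.3%}")
```

On those numbers, roughly a tenth of one percent of the company's market value is spent defending it each year, which is the mismatch Ben is pointing at.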
Karen Collyer
So, extending the metaphor of offense and defense: at the beginning of the book, you guys made some pretty concrete suggestions as to what people can do to, in your words, tame the beast. Do you see business leaders and policymakers taking the necessary steps to implement enough change, or the right changes, to quiet the monster?
Ben Pring
I think that's what's happening. I think that's the change in the zeitgeist, the change in the weather, if you like. Business leaders; and this is why this is ultimately a business book; business leaders are beginning to take this more seriously, because they understand the power of technology better. We've lived through a click or two of digital transformation now. Commercially, we can see that this is the future. We can see, within the pandemic and the K-shaped recovery, that the companies that had a pretty robust digital channel to market have done well, and those that didn't have fallen further behind. So, as you said, Steve, the G-1000, G-500 board-level agenda is pretty clear now: this is the way, we have to take this more seriously. And then within that, there are all the more tactical issues around privacy, security, data ethics, talent, etc. All the things, again, that we all talk about in our day jobs a lot. And within that, the notion of cybersecurity is obviously extremely high on every CEO's, every board's agenda now. Cognizant has had well-publicized issues in this space. So, we're taking it more seriously, our clients are taking it more seriously, everybody's taking this more seriously. And then I think in terms of the use of data: data is at the heart of personalization, and every business is trying to create hyper-personalized solutions, services, and experiences now. And that's a great thing, a wonderful thing; when you as an individual customer experience that, it's beautiful. And of course you're going to be a satisfied customer, a repeat buyer. But at the same time, the flip side of that data personalization is the surveillance we were just talking about; the Zuboff model. So, a tech leader or a marketing leader really has to think through, in a pretty sophisticated way, the issues that we lay out in the book, in a way that perhaps five years ago, three years ago, they weren't thinking through.
What is our stance towards personalization? What data should we use? What data shouldn't we use? How should we publicize, or make transparent, the algorithm that we're using to make decisions? These are now the underlying mission-critical decisions that business leaders need to grapple with. And again, that's why we think the timing of the book is right. Whilst for some it may seem a bit of a head-scratcher (why are you drawing attention to this?), we're drawing attention to it because, in the conversations we have with our clients and at conferences out on the road, these are the issues that have been a little unarticulated, a little bit on the edge of the conversation, but are now bang in the middle of a conversation that anybody in any aspect of business is really grappling with.
Karen Collyer
So, business leaders are starting to move on it. What do you think about governments? Are they lagging on initiatives? I'm Canadian, and I know we just passed some legislation to tax digital platforms in a way that we haven't before, and Australia passed legislation to protect news content. I think President Biden, within the last week or so, issued an executive order that helps restore net neutrality and provides better scrutiny of big tech. Does government get it? Or are they still lagging?
Ben Pring
Well, I think, again, the zeitgeist is changing. We're right in the midst of that change, and you're right, I think it's long overdue. Again, we started writing this book a few years ago, and those sorts of moves that you've seen Biden and the Australian government and others making in recent months are overdue, but they are beginning to happen. Biden has obviously made some important executive hiring decisions in the last six months: Tim Wu from Columbia, and then his protégé Lina Khan coming into the FTC. I think that's a great thing. Again, some of my more libertarian friends, my more Ayn Rand-reading friends, think this is the end of tech, that the government's showing up and we're going to slow down innovation and all this sort of stuff. But to me, it is illogical to imagine that the way you manage something in its infancy is the way you manage it when it's 35 years old, and it's no longer that little seed on the edge of the forest but the biggest oak in the forest. So, Section 230 made sense 35 years ago. It doesn't make sense today. I think the government does have a huge role to play in this. And again, the way we frame our book, our observation has been that a lot of individuals have been waiting for the government to act, and a lot of governments around the world have been waiting for individuals to act. So, there's been a little bit of standing on the side of the pool, waiting for the other person to dive in. We tried to frame recommendations, advice, guidance, and thoughts for individuals, because ultimately we all, as individuals, have agency in how we portray ourselves online, how we act online. One of the thoughts in the book, even though we're in a modern technological age, is the oldest philosophical thought known to mankind: the Golden Rule. "Do unto others as you'd have them do unto you." That's found in every religion, in every era of human history.
We seem to have forgotten that a little bit online, in cyberspace; we need to get back to it. So, there's a series of thoughts for individuals, but then there is a series of specific legislative suggestions. One is that we recommended the creation in the US of what we call a Federal Tech Administration, an FTA, because if you look at the charters of the FTC and the FCC, they don't really address the issues that we've been talking about for the last half hour or so. You could say that Lina Khan's appointment begins to suggest that the FTC, and the politicians who make those appointments, do begin to understand that the FTC's charter does need to change, and she's going to be the person to change that. But there are a lot of things that the government has really been very hands-off on, certainly here in the US, in the last few years. That famous moment when Mark Zuckerberg is on the Hill and says, "Senator, we sell ads." You can't really have an umpire or referee of a game if he doesn't know how the game is being played. So, governments, politicians, legislators, I think, are slowly waking up to this. To frame it within a metaphor, just to boil this all down and make it easy: you might say that the last 35 years have really been about laying the tarmacadam of the information superhighway. And now we're all going down that road, Al Gore's Information Superhighway, the internet, the cloud; we're going down this thing at 200 miles an hour. There are no stop signs, no yield signs, no stoplights, no rotaries or roundabouts, no road markings on that information superhighway. I think the story of the next 10 years, probably the next 20, is to put that architecture of safety onto the information superhighway, so we can go down it at 300 or 400 miles an hour without killing ourselves or other people, or blowing the whole thing up in the process.
That's in a way the metaphor I have in my mind of where we've been in the last couple of decades and where we're going to go in the next couple of decades.
Steve Hall
Yeah, there's so much to unpack on that one as well, Ben. If you think about AI, and AI superpowers, and some of the books that are out there on the use of this technology, I agree with you that we can't put the genie back in the bottle, that we have to rethink it. But also, if I put my capitalist hat on, there's such a drive now to monetize data, or to use data for business decisions, that it really gets into this (I won't even say it's a grey area, because everything we do is a grey area) very difficult situation where you have to look at what your competitors are doing to maintain some sort of competitive differentiator, and data tends to be that competitive differentiator. Look at GDPR's effectiveness in Europe, for example. When GDPR first came out, it did a nice job of stopping everything. Now you get the GDPR messages on every website, and most of the time you click consent or accept and your data is available for everything, because they've made it difficult to protect it. So, we are going to have some challenges there. And I love the analogy you guys use in the book about taming fire. I think that was a great analogy towards the end: we didn't stop using fire. We built fireplaces, we invented fire extinguishers, we built fire departments, we established fire codes; we really tamed it. And I think what you're partially saying is we can't put this back. Data is driving it, Code Halos is real, we're all becoming part of the algorithms. I think your cyborg analogy was also really good. Being a technology geek, it was so refreshing to read how you put it. But you know, we cannot unbreak the egg. So, what do you think the future really holds? And how should we think about, not only the next big conflict, but all of this going forward?
Ben Pring
That's exactly right, Steve. Again, we don't want to put the genie back in the bottle, because we know how powerful and positive that power can be. We're experiencing it now; these platforms that we're using for podcasts and entertainment are wonderful, they're amazing. We're not modern Puritans who want to shut this stuff down. But again, in that metaphor of fire, or another metaphor we use, nuclear power: if you think about the emergence of nuclear power in the 20th century, it made people sick. Marie Curie herself died of radiation exposure, and many people around her died of radioactivity. We had to figure out how to harness that, how to tame that monster. And I think we will. But if we just sit back in a complacent way and think, "oh, you know, generations to come will figure this out," I don't think that's a smart thing to do. It's going to take a lot of agitation and focus to create that fireplace around the fire. Coming back to nuclear power, and we talk about this in the book, I think there's a good parallel with the establishment of what's called the UK Atomic Energy Authority, which is the governing body for how nuclear power is used in the UK. There are parallels, versions of this, in other G7 countries. In that sort of framework of management around the core raw technology, it's not just technologists; it's not just nuclear scientists who are in charge of managing this stuff. It's lawyers, it's commercial people, it's lay people; it's people without any commercial or scientific vested interest in it. I think in the story of the last 20 to 30 years of tech, we've let the geeks get on with it themselves, you know. Certainly, the non-techy people, the non-geeky people, have looked at Silicon Valley and just been so in awe of it. And that's had some very positive consequences, but we can see it's had some less positive consequences too.
So, we think that technology writ large, and I think this is true writ small in a business, and writ very small at an individual, personal level, is too important to be left to technologists alone. We think the government has a role here, not to stop this, but to harness it. To create brakes, if you like, on this information superhighway, because the engine is so powerful it's taking us along too fast. We've got lots of thoughts in the book around data portability and ownership; around algorithm audit legislation; around political advertising; around age limits on social media; around data sovereignty. Coming back to that notion we talked about earlier of cyber insecurity and geopolitics, it's amazing to me to think that if BMW wants to ship a car on an ocean liner from Germany into the US, they have to fill out hundreds and hundreds of pages of customs paperwork. But if somebody in Germany, or Russia, or Iran wants to ship a piece of code over the Internet into the US, they can do that without any paperwork at all, in a nanosecond. I mean, is that good? That's crazy in a way, isn't it? I think the notion of data sovereignty, and we suggest the Twitter verified blue-checkmark model, can be scaled into all sorts of data across regional boundaries. Again, we've got to grapple with issues like this, and that's what the book is talking about.
Karen Collyer
I think that could be a whole other podcast topic, so I think we'll leave it here. Steve, do you have any closing remarks?
Steve Hall
I've just got a huge smile right now, Ben, because what a fascinating conversation! Each time you say something I'm just dying to jump back in. As Karen said, I think we could have another two hours on this, so thank you so much.
Ben Pring
Thank you guys, it's been great talking to you.
Karen Collyer
All right, so just to close it up: I cannot recommend this book enough. It's published by Wiley. You can find it on Amazon, and as you heard today, it's totally worth the read. I've asked the ISG podcast support team to post a link to Monster in the show notes for this episode. If you liked what you heard today, you can hear more from the Imagine Your Future team by accessing our website at www.isg-one.com/podcasts, or by subscribing wherever you get your podcast content. As always, thank you to Ben, and thank you to you for lending us your ears. We can't wait until we meet again.
About the hosts
Steve Hall
Steve Hall is responsible for the firm’s Europe, Middle East & Africa region, as well as its global Digital Advisory Services business. During his time with ISG, Mr. Hall has led some of the company’s largest and most complex engagements with clients as diverse as United Airlines, Symantec, BP, World Bank, CEMEX and Motorola. He is a seasoned professional who brings considerable experience in emerging technologies to ISG clients. Prior to his position at ISG, Mr. Hall held senior roles at a number of renowned IT services companies, including Unisys and MCI. He also led large-scale eBusiness initiatives for technology solutions providers C-Bridge and CBSI and gained deep outsourcing and offshore software development experience as a delivery executive with Covansys. Mr. Hall co-authored Managing Global Development Risk: A Guide to Managing Global Software Development. He earned his degree in Computer Science from Regis University.
Karen Collyer
Karen Collyer is a director in ISG’s Digital Practice and works with our clients to optimize their IT operations and transform their organizations to align to business outcomes. She specializes in technology enablement/operational excellence programs that range from app and ERP development to website redesign to UX alignment. The firm also looks to Karen to work with our clients to manage large, multi-deliverable initiatives as well as the day-to-day relationships with a number of our high-profile clients.