Aug. 29, 2022

Cathryn Ross on the Regulatory Horizons Council and Re-Imagining Regulation

Cathryn Ross is Strategy and Regulatory Affairs Director at Thames Water. She is responsible for shaping and embedding a strategy to ensure that Thames Water delivers for customers, communities and the environment. She is an experienced regulatory and competition economist and has worked across a number of different sectors advising on economic, regulatory and competition issues.

Liked the episode? Well, there's plenty more where that came from! Visit techpolicyinstitute.org to explore all of our latest research! 

Transcript

Bob Hahn:

Hello, and welcome to the Technology Policy Institute’s Two Think Minimum. I'm your host, Bob Hahn. Today is July 18th, 2022, and I will be speaking with Cathryn Ross, who is strategy and external affairs director at Thames Water in the United Kingdom. Prior to that, Cathryn was chief executive of Ofwat (the Water Services Regulation Authority), the independent economic regulator for water in England and Wales. And Cathryn has recently stepped down as the first chair of the Regulatory Horizons Council. I invited her here today to talk about a report the council issued just last month on regulation and innovation, which is on my top 10 list. And I hope we can persuade you today--it should be on yours. Cathryn, welcome to Two Think Minimum.

Cathryn Ross:

Thank you for having me, Bob, looking forward to talking to you.

Bob Hahn: (:45)

So you and I have talked about reimagining regulation before in some of our weird fantasies, but today I want to focus on the topic of regulation and innovation. So can you tell our listeners why you think regulation and innovation is important and might be worth a podcast?

Cathryn Ross:

<Laugh> Well, let's start from the premise that it is important and it's worth a podcast. No, I think regulation and innovation is a really fruitful kind of intersection between the whole question about technological progress and what is needed to get new technology from concept through to startup, through to scale, and delivering benefits in the economy and in society, and a regulatory landscape that in its essence is designed to prevent harm in some form and is not there actually to sort of promulgate innovation, but does have a massive impact on the extent to which good ideas can actually be taken up and deployed at scale and deliver benefits. So in fact, as the Regulatory Horizons Council, we spent quite a long time talking to innovators, talking to disruptors, and indeed talking to their investors. And it's amazing the number of examples that you hit of where somebody's idea has really run into the sand the moment it starts to get to sufficient scale, both to become useful to society, but also then to hit the regulatory landscape. So it's a really, really important intersection, this regulation and innovation question.

Bob Hahn: (2:08)

So before we get started, can you tell our listeners what the Regulatory Horizons Council is (which I may sometimes refer to as the RHC, but it won't be on the test) and what your role was?

Cathryn Ross:

Yeah, absolutely. So I was the first chair of the Regulatory Horizons Council, and the Regulatory Horizons Council is an independent advisory council, basically owned by the UK government, if I can put it that way. We were created on the back of a white paper from about three years ago now called Regulation for the Fourth Industrial Revolution. It was basically a really good piece of work, this white paper, and it identified all the different barriers to the UK taking advantage of technological innovation. And one of the barriers that it came up with was where technological innovation meets regulation, and that regulation can sometimes kill the technological innovation before it even gets to startup, and certainly when it goes from startup to scale. So that's the idea of the Regulatory Horizons Council: basically, our mission statement is to recommend reforms to regulation that are designed to ensure that the UK gets best value out of technological innovation.

Cathryn Ross:

And that value can be all sorts of different types of value. So it might be financial value and value in terms of global competitiveness, but it could also be value in terms of improved sustainability or social inclusion or health and wellbeing or things like that. So basically our job is to try and sort of disrupt the regulatory status quo and find areas where regulation could be better, with a view to enabling the UK to get better value out of technological innovation. And I think it's fair to say, actually, when we were set up about two and a half years ago, there were a lot of people who thought that that meant we would go around wanting to see deregulation. And obviously there are examples where we do want to see deregulation and where we think deregulation would be beneficial for technological innovation. But actually one of the things we found is that good regulation--regulation at the right time, in the right place, in a truly proportionate way, that's clear and predictable--can be a really important enabler of technological innovation. So a lot of the time we're not just looking to deregulate, we're also looking to re-regulate or make sure that regulation is fit for purpose and proportionate and actually protects the public. So that's what we've really been doing for the past two and a half years.

Bob Hahn: (4:10)

So one of the phrases we use in the U.S., and I think they use in the U.K. as well and elsewhere--the OECD and countries around the world--is this notion of smarter regulation. How does your report, or your team's report, or the council's report, fit into that?

Cathryn Ross:

Yeah, I mean, very much in line with that. One of the things that we have found quite persuasive, actually, when we've looked at the regulatory landscape for examples of best practice, is this idea of adaptive regulation, or possibly even anticipatory regulation. And it's really all about regulation that is forward-looking rather than backward-looking, and that is open to learning and adapting as it goes. So, you know, in terms of best practice in a regulatory environment that is better for technological innovation rather than worse for it, something with that kind of adaptive approach and that inbuilt flexibility and that openness to learning is very much what we would like to see. I think that very much aligns with this OECD notion of smarter regulation.

Bob Hahn: (5:05)

So your report has in the title the words "innovation-friendly regulation," and yet in your opening remarks, or responses to my queries, you talked about innovation that delivers value. I'm guessing that you're probably not in favor of innovation for innovation's sake. How do you think about good innovation versus bad innovation, or what it is you're really trying to do?

Cathryn Ross:

Yeah, I think Bob, you hit on an important point there, and I should confess this report went through a number of phases in its life cycle. In its initial conception, we referred to it as pro-innovation regulatory principles, because that's what we were asked to do: come up with some regulatory principles that would be pro-innovation. And actually, when we sat down and examined our collective conscience, we didn't really want to be pro-innovation for the sake of being pro-innovation. We want to be--and this is where the phrase innovation friendly came from--you know, we want to be proportionate. We want to enable innovation to succeed where it can deliver value for society. But you know, we don't see our job as being to be pro-innovation in and of itself. And actually, when we sat back and thought about regulation, and I think we said this in the report itself, we became fairly quickly convinced that it is not the job of regulation to be pro-innovation. That is not why we regulate. We regulate to prevent harm in some way. And it's very important that while regulation is being conducted, being designed and being implemented with a view to preventing harm, it does not unduly restrict or harm or hinder technological innovation. But actually being pro-innovation didn't really feel like what we should be about. So yeah, it's an important point. It's a very nuanced point, Bob, well done for picking up on that.

Bob Hahn: (6:40)

So if someone is suffering from insomnia or is interested in learning more about regulation, can they find this report or appendices online?

Cathryn Ross:

Yeah, absolutely. So you can have as much or as little of this report as you like. There is quite a pithy executive summary that runs to about eight pages, something like that. There's then the main body of the report itself, which I think is about 70 pages, something like that. And then if you are really, really keen on learning more about regulation and its impact on innovation, we've got a whole series of case studies, drawn mostly from the UK but not exclusively, where you can drill down. And there's another, I don't know, probably another 80, maybe even nearly a hundred pages of those case studies that you can go to to find examples of the kind of thing that we allude to in the report. If you are looking for it, it is on the www.gov.uk website. And if you search for “closing the gap,” you will easily find the report in all of its forms, the short form, but also all the case studies as well.

Bob Hahn: (7:36)

That's great. Thanks. So you have a bunch of principles in there and we're going to go over those, and I promise our listeners they won't be on the test. But before we get to principles, I want to talk about cases, because cases are more interesting, and you know, those people who have suffered through law school have had to learn a lot about the pros and cons of case study approaches. Can you give me an idea of what you would consider a good, innovation-friendly regulation, or a regulation that promotes innovation? And conversely, can you give me an example of something that hasn't been so innovation friendly? There are several in the report.

Cathryn Ross:

Yeah. I mean, I'll start with the good stuff, because a lot of these things are about good ideas done well and done in an adaptive way. So let's start with a really well-known example, and that's the example of regulatory sandboxes. So in the UK we have a regulator called the Financial Conduct Authority, and the FCA basically regulates the conduct of financial services firms. Financial services is a massively regulated sector, massively regulated. And rightly so, because people quite often, you know, get involved in that sector sporadically--you maybe take out one or two pension plans in your lifetime, you're not doing it every week. The consequences of getting it wrong are really big. They're hard to reverse. And actually the level of understanding and level of information you need to make a decent choice is quite complex. So there's a reason this stuff is highly regulated.

Cathryn Ross:

However, the fact that it's highly regulated means that in order to survive in that sort of environment, there's a massive regulatory overhead that any firm incurs, and people were worried that this was going to kill off smaller, more innovative business models, because actually you don't get a chance to grow to the scale at which you can absorb that kind of regulatory overhead--you're basically squished at birth because the regulatory overhead just comes down on top of you, and that's it. So what the FCA did is they set up this idea of the regulatory sandbox, where they basically said that if you are a smaller company and/or you have an innovative business model, we will sort of create a safe space in which you can basically play until you get to a certain size, and we, as the regulator, will watch what you do and how you do it.

Cathryn Ross:

But we won't burden you with a lot of regulatory intervention when you are at that small scale. So it gives a bit of breathing space, and it also gives innovative business models the chance to test a market, which I think is quite important as well. Now that worked quite well, but--and this comes onto the bad version--one of the things the FCA learned when it did those sandboxes was that sandboxes worked really, really well, but what they observed was that at the point a company grew beyond the small scale of the sandbox, a lot of these companies really struggled, because there was a big difference between being small enough to be in the sandbox and big enough and established enough to cope with the full might of FCA regulation. So that's something that wasn't working so well. But then, you know, being an adaptive regulator, they've watched that, they've talked to the companies that they regulate, they've learned from that.

Cathryn Ross:

And so they're adapting their approach. And so they're thinking about things that they're calling scale boxes, or I think some people have referred to them as sort of regulatory nurseries, which might sound slightly patronizing, but you sort of get the idea: the regulator doesn't just take you out of the sandbox and let you go. They kind of hold your hand and they support you as you grow. And they support you in understanding what the regulation means for you and how to sort of work with that. So it's not quite so binary as either you're in the sandbox and you're not subject to regulation, or you're not in the sandbox and you're subject to a hundred percent of regulation. So I would use that as an example of something that actually worked really, really well up to a point, then didn't work well, and has then been adapted. And I think now it is potentially going to work better as a result of the adaptation.

Bob Hahn: (11:15)

So when I think about this in the U.S. context, you know, I think of a bunch of big guys or big firms or whatever you want to call them. And they kind of like the status quo, and they may like having these regulatory barriers to entry, or costs that are imposed on everyone, because the new guy on the block, if you will, may not be able to thrive in this environment--which is precisely why, in your example, I think the FCA set up the sandbox. But how do regulators insulate themselves from these kinds of political pressures to allow the kind of entrepreneurship or innovative ideas that you're talking about? Or is that just not a problem in the U.K.?

Cathryn Ross:

I think it's a huge problem, although I think it's a slightly different problem in the U.K. than maybe it is in the U.S. Because the way that you describe that, Bob--I'm conscious that you refer to them as political pressures. And I think in the U.S., because of the nature of regulators and the political appointments to the top of regulators, and also the sheer might of lobbying at that sort of political level, there probably is a political pressure dimension to it. I don't think that's true in the U.K., but I do think that the phenomenon that you observe, which is that big established companies tend to like the status quo in terms of regulation, is absolutely a problem in the U.K. It's absolutely a problem. I just don't think it's mediated via vested interests engaging in political lobbying.

Cathryn Ross:

Essentially what happens is you get companies who are, you know, big. They have regulatory affairs teams, like the kind of team that I work for. They have public policy teams. They have people who know how to talk to government, people who know how to talk to regulators, whose job it is to talk to government and to regulators. And because they know how to have those conversations and they devote the resources to those conversations, they become influential. And I don't think there's anything particularly nefarious necessarily about that, but it is an inherent bias in the conversations that the policy makers and regulators have. And it's why one of the most important takeaways from this report is the importance for regulators and policy makers of getting out and talking to people beyond the usual suspects. You know, if you keep talking to the people who want to talk to you, you will only talk to the people who have a vested interest in the regulatory status quo.

Cathryn Ross:

So you've got to go and seek out the disruptors. You've got to go and seek out the innovators, the people who don't even know that they want to talk to you. They may not even know that you are relevant yet, but you kind of need to look at the world from their point of view and make sure that you are conducting your regulation in a way that isn't biased towards the interests of the incumbents and of those who favor the status quo. But it's a really difficult thing to do, because it requires you as a regulator, and as a policy maker too, to sort of have a different kind of conversation with a different kind of person, and that's difficult to do, right? It requires a mindset shift, but it also requires a whole shift in the modality of public engagement. It's a big deal.

Bob Hahn: (13:56)

So I have a question about that. You were one of the pioneers in promoting customer engagement when you were at Ofwat, in a slightly different context--just reaching out to the customers to see what it is that they wanted and ensuring that they get good value for money. What's the incentive of these different regulators to go out and talk to these new people on the block? I refer to them as “guys” sometimes, but, you know, they could be men, women, or whatever. What's the incentive to do this, and do they have the resources to do it? They have day jobs.

Cathryn Ross:

It's a challenge. I mean, the reason that we've identified this as one of the gaps, if you like, between best practice and what you see a lot of the time in reality is precisely for the reason you say, Bob. You know, I spent 20 years working with regulators, and as you say, they have a lot to do. And most of what people are clamoring for you to do is to fix a problem that might have manifested itself--I don't know--two years ago, three years ago, the root cause of which might have been in place five years ago. So there's this constant pressure, if you like, to sort of regulate through the rear view mirror the whole time. And it's really hard to think forwards, because by definition the problems of the future haven't happened yet, so nobody's clamoring for you to solve them.

Cathryn Ross:

And if you then combine that with the fact that you might not even know who to speak to, it becomes doubly difficult. But it's really important. It's really important because otherwise, you know, all you'll be doing is fixing yesterday's issues, and you won't actually be moving your sector on, or moving the economy on, or moving society on, in a way that's going to deliver best value for the future. It is really, really important. And if there is one thing that this report might do, it's just, you know, giving regulators--when they are under huge pressure to devote all of their time and all of their resources to fixing the problems of the past--something to wave and say, no, no, no, no, look, it's really important that we do not devote all of our resources to this backward-looking stuff. We need to carve out and dedicate some time to thinking about the future and talking to the people who are actually going to inhabit that future. If that's all we achieve, I should be really happy actually.

Bob Hahn: (15:53)

Okay. So now I'm going to get to the principles, at least a little bit, just in case we put 'em on the test. You talk about, or the report talks about, fostering a culture of openness and a growth mindset. So I think openness may relate to these examples about how you reach out to people to find out what's going on in the real world, to do prudent regulation, and to test, adapt and learn. What is this growth mindset you're talking about? Is this economic growth or is it something different?

Cathryn Ross:

No, no, this is psychological growth, and it's something I feel very strongly about. And I know lots of other people do as well. I mean, the idea of a growth mindset came out of psychology, and I think it was originally given that term by a U.S. psychologist called Carol Dweck. A lot of her studies were about educational psychology and child development. And the point, basically, in a nutshell--without doing her work of many decades a complete disservice--is that it is better to try to do a difficult task and to fail at that task, because the effort of trying to succeed at that task creates rewiring in the brain and grows neural capacity and grows your ability to do more and think better thoughts, as opposed to picking the task that you know how to do and scoring 10 out of 10 on the test. Picking the task that you know how to do and scoring 10 out of 10 gives you a dopamine hit, but doesn't develop your brain.

Cathryn Ross:

Whereas if you pick the tough task, then even if you don't actually score 10 out of 10 on the test and get the dopamine hit, your brain has developed. And it's the same point for regulators, right? So regulators are--you know, I love regulators for this--regulators are a source of incredible expertise, and particularly where you get sector regulators, they know a lot about the sector that they regulate. They're really knowledgeable, they're really expert, which is a good thing. But the downside of that can be that regulators feel that they need to know what the right answer is, and that they can know what the right answer is, and that their job is to then enforce or drive through the right answer in respect of the sector that they regulate, as opposed to creating a process that enables a better way to be developed--perhaps in co-creation with the sector, perhaps in co-creation with customers, perhaps learning from other sectors--which actually might take you to a better solution than if you just sat there and said, no, no, I'm the expert, and I know the sector, and I know what the right answer is, and this is what it is, and you need to go off and do it. So it's an approach really that values testing and trialing things and experimentation and learning and adaptation, which is what takes you right the way back to this idea of adaptive regulation or smarter regulation. I think this growth mindset is absolutely critical if we are to do that sort of smarter regulation that the OECD talks about.

Bob Hahn: (18:25)

Okay. So you and I are economists by training. I'm going to push back a little--no, pushback isn't the right word--but you know, economists generally believe that people respond to incentives. Okay, what incentives, or do we have appropriate incentives, either in the regulatory community or the regulated community, to do some of the kinds of things you're talking about, for example, with respect to this growth mindset and dopamine? Which all sounds great, but at the end of the day, what I'm interested in is regulation that promotes--well, let me put the question to you. Suppose you're sitting in your thesis defense at Oxford and some professor says: what do you think the purpose or purposes of regulation should be? And you can't give me more than three objectives or three purposes, and preferably fewer is better. How should we think about this problem?

Cathryn Ross:

Okay. Well, if you're going to ask me that, the first thing I'd say is that I think the purpose of regulation is ultimately to make the world a better place. I mean, you know, I think that's really why regulators get out of bed in the morning, but they have a particular toolkit that they can use to do that. And that is about using incentives, which exist in many different varieties, to change behavior. So that's what this is about. And it's that behavior change piece which is why the mindset thing is so important. So, you know, we can talk about this at an institutional level all you like, but it's human beings that make decisions, and it's human behavior that ultimately we're trying to influence, either individually or, often, if we're talking about institutions, collectively. Just to give you an example of the kind of thing that I'm talking about--and it's a small one, but it's significant.

Cathryn Ross:

So maybe I would give this example, but I swear I had nothing to do with it. Ofwat, the water regulator that I used to work for--this happened after I left--has put in place an innovation fund. And basically what this innovation fund does is it gives companies like the one that I work for, so regulated companies, the ability to bid for a small amount of money to use to fund a project that is innovative. I would just say two things about that. The first is that the real value in that innovation fund isn't actually really in the projects that it creates. It's in the mindset shift that it engenders in companies like mine, because it's very easy in a regulated water business to be quite risk averse, head down, focusing on the day job, putting one foot in front of the other.

Cathryn Ross:

You've got enough challenges to deal with, and it's quite hard sometimes to raise your eyes above the horizon and learn from others and think beyond what you can do today to what might be possible tomorrow. And just having a little pot of money on the table that you can bid for galvanizes the organization. Well, what are we going to do? You know, what are our ideas? Where might we look? And suddenly you've pulled that different thinking right into the organization, and you're changing the way that people are talking in the organization and how they're thinking and how they're behaving. The other example, which is the same thing--it's the Ofwat innovation fund--is that one of the things Ofwat looks for when it assesses these bids for its innovation fund is companies who build into their bid how they are going to share the learning with other companies. So again, it's just a really good example of how a small, tiny little regulatory intervention can create conversation and create connections and change the way that people think. And it's really powerful.

Bob Hahn:

So I buy that, you know, that fits in with behavioral economics. It fits in with political science. You're certain--

Cathryn Ross:

Am I passing my viva?

Bob Hahn: (21:40)

<Laugh> I don't know about that. But it's making the issue more salient, you know, and giving people a nudge to think about it. You know, even if it's not the money, as you say, it's a combination of things that's motivating them to be creative. Let me ask you about another phrase that was used in the report, which talked about co-creating regulation. And in the U.S., we have a pretty strong dividing line, most of the time, between the regulators, who write the regulations or the guidance and what have you, and the regulated community. Are you envisioning something different here, or is the fact-finding process different? Or what do you have in mind here?

Cathryn Ross:

The kind of thing I've got in mind, in essence, is somebody--and I think in reality, mostly it's going to be the regulator--saying something like, hey, you know, we've got this problem that we need to solve, and I don't have the solution to this problem, but I would like to create a conversation, I'd like to create a process, that brings together a number of different people from the regulator, maybe from government, maybe from NGOs, maybe from regulated companies, whatever, to bring the best brains to bear to try and solve this problem. So together we are going to work this out. So it's a very different conversation. It's not the regulator in the sort of expert box saying, I have a problem, and I also have a solution, and it's the right solution because I've thought about it and I know what I'm talking about, and here you are, you're going to do this solution that I'm telling you to do.

Cathryn Ross:

It's the regulator saying, no, no, we've got a problem here that we need to solve, right? Let's talk together about what the solution set might look like, and let's iterate that through to a viable set of interventions or behavior changes that we can make. To give you an example of something that I did--this is going back to my time at Ofwat, but I felt it was one of the most important things I did when I was there. When I was there, we acquired a new duty. It's kind of a big deal. You know, if you are an economic regulator, you are there to deliver your statutory duties using your statutory functions. So if you get a new duty, it kind of makes you pause and think, what is this new thing I'm supposed to be delivering? And we acquired a new duty basically to promote resilience in the companies that we regulated.

Cathryn Ross:

And you kind of look at it and you think, what does that mean? I mean, resilience, yes, I sort of get it in terms of the ability to anticipate and withstand and recover from shocks. But, you know, there's a whole load of shocks out there, and what does this mean? And we set up a task and finish group that was actually led by a chap who was the chief executive of the water efficiency group. We had water companies on there. I think we had some green NGOs on there. It was a very open process. And this group walked around this concept of resilience from all its different angles. And basically, together, we came up with the definition of what we thought resilience meant, and the group gave us, the regulator, recommendations as to what we could do about it. And it was a much more multi-dimensional, much more well-rounded conversation than I think we would ever have had if we'd sat on our backsides, you know, in our own office, and thought about what resilience meant and come up with a consultation document and consulted the usual suspects. It was a very rich conversation.

Bob Hahn: (24:31)

So that sounds like a great idea to me. And it sounds very similar in spirit to what your council did on innovation, just for a slightly different issue. Let me ask you about something that I found intriguing in your report that isn't always in smarter regulation lists, that I think is important, and I wanted you to opine on it. And it's the importance of ethics, and being clear about the fact that regulators are not just putting things into some sort of machine and coming out with an answer--they have to make lots of judgment calls. What do you see as the role of ethics and having an ethical framework, either for the regulators themselves or the regulated community, and what should we be doing better here?

Cathryn Ross:

Yeah. This was another one of these points that we felt very passionate about actually. And the more we thought about it, the more important it felt to us as a council to put this in. And it came out of a very simple thought, which we sort of set out very explicitly in the report, which is that regulation involves a lot of value judgements. And those value judgements are not made by algorithms. They're made by human beings. And those human beings bring their own sets of values to those judgements. And those values need to be transparent. They need to be explicit, and they need to be open to challenge, actually, because they may not be appropriate--that's a question--but they may certainly be outdated. So for example, you know, you might have a particular framework, you might value something over something else because of some preconception that dates back to something that happened 10 years ago when the whole world's moved on.

Cathryn Ross:

You know, if that value framework is not explicit, it isn't capable of challenge by others, and it can become almost a sort of systemic bias in regulatory decision making. So it did feel very important to us that those ethical frameworks are debated. They are contestable and they should be contested. And in order to be contested, they need to be visible. And a slightly more pedestrian point, but also important: we talk a lot in the report about the value that regulation can bring to technological innovation. So for example, we did a deep dive report in our first tranche looking at fusion energy--really interesting. One of the reasons it was really interesting was because the investors in fusion energy technology were desperate for clarity about how the thing was going to be regulated, because there was no world in which fusion energy was not going to be regulated. There was a world in which you knew how it was going to be regulated, and a world in which you didn't know how it was going to be regulated.

Cathryn Ross:

And they would rather be in the world where you did know how it was regulated. You know, that's the sort of classic example, but it demonstrated to us the importance of regulation in providing a predictable environment--you know, an environment where an investor or, you know, somebody who works in industry can know with a reasonable degree of clarity: if I do this kind of thing, I'm okay; if I do that kind of thing, I'm not okay. And again, if regulators are clear about their ethical frameworks, if they're clear about the value frameworks they're bringing to their judgments, that just enhances the predictability of how they act. And it gives people on the other side--you know, the innovators and the venture capitalists and all the rest of it--something to hang their hat on. It gives them a direction where they know they're more likely to be okay if they do this and less likely to be okay if they do that. That in itself is just really helpful.

Bob Hahn: (27:42)

So I'm not sure how this ethical framework should fit into politics. When I observe the U.S. process, I observe someone like you in the olden days, or me now and in the olden days, doing a report saying, here are the pros and cons of doing things. If you tote them up like an economist might: here are some of the benefits and costs, here might be the impacts on innovation. You, Mr. or Ms. Politician, need to make a decision that will involve value judgements. And sometimes that's the head of an agency, like the head of the energy department or the head of the Environmental Protection Agency in the United States, or something else if we're dealing with different kinds of innovation or competition issues. But I don't know that it would necessarily be helpful to the regulator or to the regulated community if they were required to specify an ethical framework. In reality, they're making hard judgment calls based on many inputs, depending on the level of the decision. I mean, certainly they have to behave ethically--they, you know, have to abide by conflict of interest rules or whatever. So are there examples out there where you can point to where you think this ethical framework has actually done something constructive? Or is this sort of an idea that you're asking, you know, me and others to develop in future research?

Cathryn Ross:

Yeah, I suppose there's a couple of things I'd say. I mean, the first is there are lots of different types of regulators out there. So for example, one regulator we spoke to as a council, which we found really interesting in the U.K., is the Human Fertilisation and Embryology Authority. And they regulate the use of human embryos, including in fertility treatments, but also in research. And there are massive ethical considerations involved in the decisions that they make. And I'm probably being a little unfair to them, and they probably wouldn't describe it like this, but the thing that I found really interesting in talking to the chap who was their chief exec at the time was how thoughtfully they gave life to that sort of ethical consideration. So they were very conscious, in how they made their decisions and how they did their work, of the need to take into account the views of different people in society.

Cathryn Ross:

For example, one of the things they do is they have public meetings. They have lay members who are actually on the authority, but they also have a lay membership group that they speak to about what they do. So I wouldn't necessarily say that they had set out an immutable ethical framework that they used in their decision making, but they did recognize that in their decision making they are making big ethical judgments, and the process by which they make them is very important. And ultimately--again, I don't think they would describe it like this, and I'm being a bit crude--but one of the things I took away was that the test they are applying is: does society believe this is acceptable?

Bob Hahn:

Right? So sort of in your answer to my ill-formed question, what I hear is that process matters for things that are going to potentially have large societal impacts and impact different groups differently. I totally agree with that.

Cathryn Ross:

Definitely. And I'll give you another example, actually--again, I'm going back to my Ofwat days. One of the things that water companies do in the UK, because we charge by volume of consumption and there are some people who find it difficult to pay their bills, is we have social tariffs to assist those people who are in financial difficulty in paying their bills. When I was at Ofwat, we were under quite a lot of pressure to--somewhere on a spectrum from ask to tell--get water companies to do more on social tariffs, basically to increase the amount of subsidy they provided to those people who were simply not able to pay their water bill. And I found that incredibly uncomfortable, because that is basically a regulator, who is essentially a technocrat, not an elected official, making a judgment call about a social transfer from one group in society to another group in society. I did not feel comfortable making that; it did not feel like something that should be done by an unelected technocrat.

Cathryn Ross:

That's the kind of thing that we elect governments to do, with taxation and spending and things like that. So we said--and we said this explicitly, which is an example of being explicit about your values and making a decision--that we would encourage companies to make those kinds of transfers, to create these kinds of social tariffs, in two cases. One is where it was actually value for money to do so. So in other words, it's better to have people paying something than paying nothing, so it's actually cost beneficial, so customers are better off, so why wouldn't you? And the second is where your customer base is actually supporting the use of their money to subsidize people less fortunate. And if those two things are happening, yes, you do a social tariff. If those two things are not happening, do not do a social tariff. But we were very explicit about that. We were very explicit about the fact that we didn't feel we were able to make that kind of value judgment--that these people in society are worth subsidizing by those people in society. Do you see what I'm saying? It's an example.

Bob Hahn:

I hear what you're saying, but in a U.S. context, I think about politicians, legislators, making that decision. Frequently they'll say we want to subsidize a group that's less well-off for any number of reasons, but let's just say income--they're lower income. And I think the way I would sort of divide the baby on that one is to say, let's get a mandate from the legislature, if that's what they want to do. And then we should have the technocrats, as you call them, figure out the least painful or best way to do it.

Cathryn Ross:

Yeah. Yep. I would very much support that, but that isn't always how the world works. There are lots of value judgements that you make as a regulator that you would never be able to bat up to the politicians. To give you another example: when you set a price control, when a regulator basically limits the prices that a monopolist can charge its customers, there are a thousand value judgments that sit under the heading of cost allocation. You know, a lot of these businesses that are regulated--the reason they're monopolies is because they're relatively high fixed cost businesses with, you know, low marginal costs. So, you know, they've got economies of scale. Well, how you allocate those costs between different types of charges is a massive judgment call. And again, I would encourage regulators who have to make those judgment calls to be explicit about where they are actually allocating those costs, and for what reason. And you can allocate them on the basis of trying to load costs onto prices where you want to send signals for efficient usage at the margin, or you can, you know, load those costs in a way that sort of minimizes the welfare loss, or you can try and recover them in the most progressive way rather than regressive. You know, there are value judgements there, and I think you need to be explicit about those.

Bob Hahn: (34:30)

To the extent that politicians will let us, I totally agree with you. <Laugh> On the other hand, they don't necessarily like to see these explicit trade-offs. Let me turn to something else you talk about, or the council talks about, in your great paper or report, and that's the idea of responsible innovation. And I want to bring up a crazy example and quotation that I'm sure you've heard before, from Mark Zuckerberg, where I think Facebook, or he, said something like "move fast and break things." And I think of Uber and Travis Kalanick in the old days, where they did move fast and break a few things and brought an incredible new technology, ride sharing, to many people. What do you mean by responsible innovation? And in the report the council talks about responsible innovation not just for the regulator, but for the entrepreneur. My view about this is mixed, to say the least. It's not necessarily a bad thing that we have ride sharing in many societies now, but there were some cowboys and probably cowgirls who moved fast and broke a bunch of things. So that's not a--you know, you could say, what's the question? That's sort of a comment. I'll let you figure out what the question is there. But how should we think about this responsible innovation concept?

Cathryn Ross:

Yeah, I guess there are various things I could say on this. I mean, the first thing, which I think is important and we felt was important when we wrote the report, is that although our job is to make recommendations about regulatory reform, and therefore our primary audience is regulators and policy makers, that's not all of the puzzle here. There are other bits in the puzzle, and we didn't really want to let the report go without actually being explicit that there are things that businesses can do, that innovators can do, that investors can do, that would in and of themselves improve the regulatory environment and therefore better enable them to thrive and their innovations to deliver the benefits. So it's no use just continually pointing the finger at the regulators and policy makers going, you must, you must, you must, when actually there's some stuff that you might be able to fix in your own house.

Cathryn Ross:

So that felt philosophically quite an important, you know, addition to the report. And I think when we talked about responsible innovation, we referred to a few of the responsible innovation standards--there's a British Standards Institution one, for example--that give more guidance on practically what this might look like. The basic thrust of it is for innovators to think about how their innovation is created, and to sort of create their innovation in a way that maximizes its potential to do good and minimizes its potential to do bad. Now clearly, to the extent that maximizing potential to do good and minimizing potential to do bad means you are not innovating, that is a non-solution, because if there is no innovation, there is no potential to do good. So, you know, there's a natural limit on that, because you need to make sure the innovation gets off the ground. But it might involve things like thinking about, you know, safeguards that you can put around your innovation to make sure that it's used in a sensible way rather than a not sensible way.

Cathryn Ross:

It might involve thinking about who would be most likely to benefit from your innovation and perhaps including them in the process of developing it, so that it's done with a view to sort of maximizing those benefits. And then the thought that we had was that if the innovator, or indeed the investor in the innovator, can then, at the moment that they become big enough to be regulated, go to the regulator and say, look, here's a whole host of reasons why you don't really need to worry about us, because we've built good stuff into the design of our product or the design of our service, because we've been really thoughtful about this from the get go--that should then provide the regulator with a reason to go, okay, maybe I don't need to be worried about this, actually, maybe this is mostly okay, and we can have a more constructive conversation, rather than feeling like they have to come in and regulate in a sort of harsher way. It would just make for a more constructive conversation. So it's partly about thinking about how your product and service is used, and it's partly about thinking about how you developed your product and service in order to maximize its benefits.

Bob Hahn: (38:05)

So one of the things I really liked about your report, which was a little different from things that I had read before--and I've read a lot of these and taught a lot of these--was your focus on trust. And your example just a minute ago was sort of related to that: you know, find out how this technology is going to affect people and make sure it does good things for the community that you're trying to sell it to. Can you talk about the need for, or the importance of, trust in the regulatory process, and how we could do better? And what some examples on the leading edge might be?

Cathryn Ross:

This is a huge topic. It's just a huge topic.

Bob Hahn:

Okay. We've got three minutes.

Cathryn Ross:

Okay, the three-minute version: I don't think regulation works without trust, right? Even though the reason for regulation is that somebody somewhere--let's assume government--believes that some bad stuff will happen without a regulator, and that the regulator believes that some bad stuff will happen if it doesn't do things, the whole process of regulation just rests on trust. Because, you know, the regulator doesn't have information about the products and services that they're regulating or about the market that they're regulating. They have to trust others to provide them with that, and to be honest with them, and to provide information in a timely way. When you, as a regulator, get feedback from the industry that you regulate or the market that you regulate about what effects your interventions are having, you know, you need to trust that you are getting a reasonable view of what's actually going on in the market.

Cathryn Ross:

You know, I mean, regulation exists in large part because, you know, there's no such thing as a complete contract. And where there's no such thing as a complete contract, you're basically talking about discretion and human interaction, and that whole kind of framework falls apart if you don't have trust. So it's absolutely critical. And the thing I keep going back to with all of this--and it is why I keep coming back to this sort of inherent predisposition of the regulatory process to favor the status quo, which is why regulators really need to bust out of that, and do so consciously--it goes back to this thing about trust. I really fervently believe that trust is a human being to human being thing. It is not an institution to institution thing. And so the interaction between human beings in regulatory processes is absolutely critical.

Cathryn Ross:

You know, it's not about whether I, as the competition regulator, trust you as the party trying to get your merger through the competition authorities--it's about whether Cathryn trusts Bob. And because of the importance of those human interactions, I think this is part of what creates this sort of regulatory bias, if you like, towards the bigger companies and the better resourced companies. Because it tends to be the better resourced companies, the more established companies, the kind of companies that have government affairs teams, the kind of companies that have regulatory affairs teams, who can invest the time and the effort in turning up for face to face sessions and having those human-being-to-human-being interactions. Which is why I think it's absolutely critical that regulators devote time and effort to going out as human beings to see other human beings who are not the human beings who want to see them--you know, the human beings who are doing the innovation, who are doing the disruption, who are doing things differently. I think that's absolutely critical, because they have to build trust. They have to create the conversation with that different set of people, rather than with the people who like to talk to them about the stuff that they already know.

Bob Hahn:

You're very gracious and did that in three minutes, and we're approaching the witching hour. And I have many questions that I want to ask you, but I will limit them and just ask a couple. One is my hobby horse. There are a ton of great ideas in this report, and I was very stimulated by it. But my question is, as you try these sandboxes and pilots and whatever, do you think it's a good or bad idea, if we could get legislators or politicians to allocate a certain fraction of the budget to actually evaluating--in some sense, either a qualitative case study sense or a more serious, you know, randomized trial, or what have you--to get them to say up front, I'm going to give you a percentage of your budget that's devoted to evaluation, so we can actually learn what works and what doesn't?

Cathryn Ross:

Yeah. I mean, this is not the view of the Regulatory Horizons Council--we haven't debated it--but my personal view is that that would be really helpful, because it's always the thing that doesn't get done. You know, if you can devote a little bit of time and effort to creating the experiment, be it a sandbox or a scale box or whatever, that's a major victory, but actually devoting a little bit more time and effort to doing the evaluations is even harder. And to be honest, if we could get a way of doing those evaluations more systematically and possibly more consistently, and getting the "so what" shared more widely, you know, I think we'd all sort of move up that virtuous circle of, you know, testing and trialing and learning a lot quicker. So I think there is a lot that could be done to sort of systematize that evaluation process.

Cathryn Ross:

I mean, the one thing I would say, though--there's been something in the UK which I think has been quite successful, and it's partly on this point and it's partly also on your point about nudging regulators toward better behavior. And that's been something called the Regulators' Pioneer Fund, which was set up by the department that sort of looks after a lot of the economic regulators, so that regulators can basically claim an amount from this fund to do something different. And part of what they do is they then get together as the Regulators' Innovation Network to share the learning from the thing that they've done that has been a bit different. And that's been quite a useful catalyst for some of this. It's not quite the sort of systematic, you know, dedicated funding and coherent program that you are talking about, but it has helped with that.

Bob Hahn:

One final question, then you're off the hook. What would you like to see regulators doing more of in a decade that they're not doing now?

Cathryn Ross:

I think first and foremost, getting out and about more and talking to people who are not the people that want to talk to them. I think that's absolutely critical. And even if they just did that, you know, you just learn so much, you just see the world through a different lens. I think that's really, really important. And then the second, I think, would be building learning in a more conscious way into what they do. You know, I don't know how it works in the U.S., but every year in the U.K., most regulators--I think all the ones I can think of--publish an annual plan. And it says, this is what we are going to do in the next 12 months. I'm not saying never, but very rarely does that include: we are going to try and learn about some stuff that we don't know, or we're going to set something up with a view to learning something--I have no idea what the answer is, but we're going to know something we didn't know to start with--or we are going to work together with another regulator because they've done something and we'd like to learn more about it.

Cathryn Ross:

There isn't enough of that. And again, it's to your point earlier, Bob, about, you know, scarcity of budget, scarcity of resource, you know, it's stuff that is really important but is quite often a casualty. I'm not saying people don't do any of it. They do, but if we could really do more of that again, it would just get us on that virtuous circle a bit quicker.

Bob Hahn:

We have been talking with Cathryn Ross about regulation and innovation. Cathryn, thanks for joining Two Think Minimum.

Cathryn Ross:

Thank you, Bob.