Peter Gluckman on the worldwide response to COVID-19

Science for Policy podcast episode

About this episode

Broadcast on

1 September 2020

Guests

Sir Peter Gluckman

Show notes

What has COVID-19 taught us about science advice? How have different countries responded to evolving evidence during the pandemic? Have some science advice models performed better than others in terms of public health outcomes? Can science advice really help much when evidence is partial or controversial, and decisions are needed at high speed?

Sir Peter Gluckman discusses these questions with Toby Wardman of SAPEA. We also discuss where to draw the line between evidence and democratic decision-making; whether scientists should air their disagreements in public or keep them behind closed doors; scientific hubris vs humility; and the emerging phenomenon of the celebrity science advisor.

Transcript

The transcript below was generated automatically and may contain inaccuracies.

Toby: Hello, welcome to the Science for Policy podcast. My name's Toby and I'm very happy to welcome and introduce my guest for today, Sir Peter Gluckman. Sir Peter is the Chair of the International Network for Government Science Advice, or INGSA, and in that capacity he's now leading a worldwide evidence gathering exercise to assess the impact of science advice during the COVID-19 pandemic. He's also President-elect of the International Science Council, served as the New Zealand Government Chief Scientific Advisor until 2018, and is the founding director of the very excitingly named Centre for Informed Futures at the University of Auckland. Peter, hello and welcome and thanks for being here.

Peter Gluckman: Good evening, Toby.

Toby: Through the miracle of technology, I can say good morning to you. We're talking via video link from time zones on opposite sides of the globe, which we've had some trouble getting set up, but we're definitely there now.

Peter Gluckman: Well, yeah, yeah, it often goes that way. But we're here now.

Toby: So the Centre for Informed Futures, it sounds very grand. What's it all about?

Peter Gluckman: Well, when I was Chief Science Advisor to three governments, three prime ministers in New Zealand, it became clear to me that with a short electoral cycle (we have an election only every three years), there wasn't a lot of really strategic long-term thinking, except in one or two areas of government. But I also think, more generally, the world faces a number of challenges in parallel. Obviously climate change and environmental degradation, loss of biodiversity, but also massive demographic change, massive geostrategic change, significant economic pressures, and so forth. And so there's a need for thinking which is more than just narrow traditional foresight or horizon scanning, but actually thinking about how all these things interconnect, and looking at how decisions are made not only by government but by society as a whole. So our focus is on the middle to long term. The focus is also on issues of how a modern democracy should work at all levels to make those complex decisions that a society is going to need to succeed going ahead. We've got the technological challenges, we've got the disruptions to social cohesion. As I said, we've got environmental change, demographic change, geostrategic change; these things are all coming at once and at speed. And it needs reflective thinking from people who are outside government, but in ways that can both inform government and inform society.

Toby: And can those things usefully be studied together as a single subject in a single research centre? Because they sound to me like very different challenges.

Peter Gluckman: Absolutely. The whole point of the Centre for Informed Futures is to be a transdisciplinary think tank, following the principles, if you like, of post-normal science: engaging with stakeholders and always posing the question from multiple angles up front. Obviously, within the centre itself we have a range of people from a range of backgrounds, all of whom are experienced in transdisciplinary thinking. But we also have affiliate members across New Zealand, in Europe and indeed around the world.

Toby: Why are we not better at this already? I mean, you framed it as a kind of governance issue, or at least partly one, that there's this incentive among governments towards short-term thinking. Do you think that's it? Or do you think it's more about human nature, or what?

Peter Gluckman: Well, I think human nature does have a short time preference in general. It's easier to look at the things that are going to happen tomorrow than to make investments or pay a price for something that will happen in the future. I mean, we've seen that even in Covid, where the alerts for pandemics have been there across many countries for many years, but the level of pandemic planning and preparation has been quite variable. It can always be put off for tomorrow. And I think that we are facing a set of parallel transformations, as I've suggested, which cannot be put off. We need to be thinking about these issues now, before they disrupt social cohesion, before they undermine societal resilience, before they turn to conflict, before we lose the opportunity to protect the planet and so forth. There are lots of things here that need a proactive response. And yes, government is not the only player, but when I was chief science advisor, I could see the need for new ways of thinking. Similarly, I think the academic community has been buried within its silos: biology's here and the humanities are there and sociology's here and anthropology's there. And yes, they have come together in isolated ways in interdisciplinary research. But universities are not well set up for transdisciplinary research. The OECD issued a report about two months ago urging a fundamental change in our thinking about these major problems, the ones some people call the wicked problems, the problems that society faces that need a transdisciplinary approach, and universities need to change their way of researching and teaching to achieve that.

Toby: How about the business of science advice? Does that need to change too?

Peter Gluckman: Well, I think that's the core issue for science advice. All science advice, outside the very technical issues that governments have always had scientists to support, has really been in the space of post-normal science, where you've got complex science interfacing with values, which by definition are in dispute. I mean, I'm not saying science doesn't have values. Of course it has its own set of values. But here I'm talking about the public values that are normally in dispute in any democracy. And so it's always been the case that good science advice needs to be based on looking at the problem not through one lens, let's say climate science in the case of a climate change issue, but through the lens of economics, the lens of sociology, the lens of human behaviour, etc, etc. And so we worked hard, when I was chief science advisor, to build up an approach which was transdisciplinary, to the extent we could in the environment we were working in. But we could do better. And we have done better.

Toby: Do you think that applies also to our recent experiences with the COVID-19 pandemic, which has been a real test for science advice? Have the different disciplines been brought together here in the right way?

Peter Gluckman: One of the questions we're asking, in the work that INGSA is currently doing, is how the different disciplines of the social sciences, the natural sciences, the data sciences and the normative humanities were brought together in offering advice. Now, in some situations that's not what's needed. But when the government is looking at its overall response, all those elements are needed. There's an ethical domain, there's clearly an economic domain, there's a moral domain, as well as the modelling that's been used, and overused in my judgment, and as well as the science of the virus and the science of clinical epidemiology. All of those needed to be brought together. Now, how well they've been brought together is very variable. And I think there are countries which, on paper, have done it, but the responses and the outcomes suggest it hasn't gone as well as it might. And one of the questions we're asking in the global work we're doing is why the systems that were presumed to be very good have, at least in terms of what's happened in the pandemic, failed. Now, is that a failure of science advice? Is that a failure of the policy community and of the political community? Because at the end of the day, scientists are not technocrats. They can only advise on the scientific components of what they're looking at. It is ultimately for the policy community and the political community to decide what to do with that knowledge. Now, in emergencies, one hopes that some of the issues that otherwise would be there in the political framing are removed, but it doesn't appear that's been the case in many countries. And you know, it is obvious that some countries have largely ignored the scientific advice, and in other countries there are concerns that the science advice wasn't pluralistic enough to achieve an appropriate consensus. And all of that will be hard to sort out in the immediacy; we need to be able to stand back from it, without people trying to protect their own role in these things, and make a deep study. So what INGSA is doing is this: we have reporters in about 120 countries, reporters being academics or policy community people, not journalists, who have been feeding into a database that we're maintaining in Auckland, the INGSA policy tracker. It's focused not so much on the policy choices made, but on what evidence was provided, how it was provided, who provided it, what disciplines were involved, what the mechanisms of provision were, whether they were formal or informal or ad hoc, and how it was transmitted into the government processes. And clearly we're relating that also to the decisions that have been made. And it's very interesting to explore why some decisions were made at some time, to what extent they were based on evidence, how the evidence was used, and how it was incorporated into the final policies made.

Toby: Yeah, so this is an interesting topic and there's lots of strands to try and tease apart in what you've just said. I'd like to try and separate a few of them. But before we do, let me just check that I have the right basic idea, because I think there's two different things you could be saying here. The first one is, well, we were all dealing with the same challenge fundamentally around the world. And despite that, there have been clear differences in the way different countries have responded to that challenge, different policies, different approaches, different timings. You can allow for all kinds of other factors such as political perspectives, demography, even different geography. So for instance, closing your external borders is much easier to do when you're Australia than Germany, say. But the thesis is that even once you allow for those things, there's still a correlation between the approach that different countries have taken and the way that those countries have used science and science advice. OK, so that's one idea. But I wonder whether you also wanted to go a bit further than that and talk about outcomes as well as approaches. So we have to be careful about comparing outcomes for all kinds of reasons, not least of which is that we're not out of the woods yet. But still, looking at the state of things right now, there do seem to be some clear winners and some clear losers. Some countries have ridden out the storm relatively well so far, and others maybe not so much. So would you say there's a correlation between those different success levels, if you like, and different approaches to scientific advice?

Peter Gluckman: I don't know the answer to that, Toby. I mean, I think that's one of the questions to ask. But clearly, science advice is ongoing. And I think what we're learning is that the countries that have done best, and many of them are quite small countries which don't necessarily have sophisticated systems, are those where quick decisions were made, and early decisions were made. And the strategies of countries that got in there quick have, as a consequence, been different to the strategies of countries that came to have large community spread and were then trying to get into suppression mode, because your choices are very different once you have a large amount of community spread. While you have limited community spread, you can choose to eliminate or near-eliminate the virus. Once you have a lot of community spread, you're really trying to manage it so you don't overload your healthcare system. And so the mere decision of when you go creates a path dependency for what your options are after that. Now, underneath all of that, there's a whole lot of science advice at different levels, in different contexts, and I'm interested, or INGSA is interested, in trying to understand it. Because if you think about it in simplistic terms, countries either had a pre-existing science advice system or they didn't; and if they didn't, did they then create an ad hoc one? Or did they just continue to use the officials within government, or reach out to their friends in academia? The conclusion I think I'm coming to is that you need a mechanism, because you have to have a mechanism that, if you like, defines the access of scientists into the system. But you also need people who are skilled in that mechanism, with the right skill set and personality, both to synthesize the evidence and, which may not be the same people, to broker that evidence, to communicate it from the integration of the knowledge into a form that the policymaker or politician can understand and use. And so I think there's a lot to learn here. As you said, this is a unique experiment, because here we have a global impact on 194 countries at roughly the same time, with the same levels of decision-making having to be made in different countries. And we're doing a lot of work not just in developed countries. We're very, very interested in what's going on in low-income and middle-income countries as well, some of whom have made some very good calls over managing the virus. And it's interesting to see how they've done that. Vietnam is a classic example at one level, of a large country. But look at a number of the small island states, countries like Jamaica, who've done very well: with a relatively limited academic and scientific infrastructure, they've still been able to come to choices which, if the measure of success is how the pandemic evolved, have served them very well.

Toby: Okay, so let's say you highlighted two big topics there. One is the infrastructure for delivering science advice. And the other is the skill set that the people inside that infrastructure have in terms of being able to interface with policymaking effectively. So let's take those issues one by one then. I was reading an essay you wrote in May, which is on the INGSA website, and I'll put a link in the show notes for listeners. In that essay, you wrote the following, which surprised me a little, "in some countries, science advisory ecosystems were well developed, and in others, they are essentially non-existent. There is little in the pandemic response to suggest that one model is superior to another". And you alluded to that a moment or two ago as well. So there's really been no benefit to a country of having a mature science advice system in terms of how well it's performed during a pandemic. Really?

Peter Gluckman: No, I think that may be bad wording by me. There are two components to that statement. The first is, I mean, if we take the United States and the United Kingdom, both of which would consider that they have very well developed science advisory systems. I mean, the UK one is often put on a pinnacle as one of the most sophisticated, if not the most sophisticated. But based on looking at what happened to disease patterns in Britain, the system has not done a good job. Now, maybe the science advisors did a good job and they were just rejected by the policy community. But that in itself tells me that there's something wrong, either in the skill set or the process. So there's that. The second point was more that we have a variety of different models of science advice around the world. And, you know, we have models based on academies, which is very common in Europe. We have models based on science advisors, which are more common in the British Commonwealth heritage. We have national commissions in some countries. There's a variety. We have committees, pandemic committees. We have lots of different models. What I was trying to allude to is that, as yet (and it may yet turn out that we find a pattern), I can't say that this particular model of science advising as a structure is more effective than another. Mark Walport, a former chief science advisor in Great Britain who was very influential in my thinking, used to say the biggest role of a chief science advisor was in emergencies. Now, in New Zealand, I can say that's proven to be the case, yes; but I would have to say the jury would not come to that conclusion about the United Kingdom at this moment. But why is that the case? And that may not be the current incumbent's fault. It may represent a whole lot of other dynamics about which I have some suspicions, but it's not appropriate, before we do more analysis, to actually work that through. So look, I think you do still need to have a system. And the system is not unitary. I mean, if you think about a mature science advisory system, it will have people who have access to the executive of government, people who have access to the legislators. It will have academies. It will have think tanks. It will potentially have universities with research centres and so forth. You'll have government research institutions. All of those are potentially part of an ecosystem of people who have some role in either evidence synthesis or evidence brokerage. I come back to the point: there's two separate things going on here. One is integrating the knowledge. One is transmitting that knowledge both to the government, and that can mean the policy maker and the political side of government, but also to the public. And I think one of the things here is to make sure that transparency exists. And I think overall, transparency has not been as high as it might have been for the information that policy makers are receiving from the science advisory mechanisms in this kind of situation. Now, part of it will be that it's argued it's too technical or it might cause panic, but that again comes down to the skill of the broker to be able to communicate that to the public, because the politician in the end is a reflection of the public.

Toby: So I hear what you're saying about the need for the broker. But there's this caricature, maybe, of the perfect scientist, or even the perfect science advisor, who stands entirely outside the system, is completely independent of political bias, the old cliché of speaking truth to power, speaking without concern for political convenience and so on. Whereas when you talk about someone who has this additional skill of brokerage, you want someone who also perhaps understands the world of politics well, who can speak the right language and can tailor their advice to what's needed and communicate it clearly, and can also communicate it to the public in a way that...

Peter Gluckman: But not corrupt the science.

Toby: Sure, but it's a lot to ask.

Peter Gluckman: It means still speaking truth to power. And the analogy I always use is: imagine a Chinese person speaking to a German, and neither understands the other's language, culture or body language. If you don't have an interpreter who understands all those dimensions interacting between the two of them, there'll be miscommunication. And the point I'm trying to make about the skills of brokerage is that it's not about putting on a biased political hat. It's not about being arrogant, the scientist who knows everything. It's about reflecting and making sure that the politician or the policymaker understands what the science is saying, and the scientist understands what the politician needs to know. That is a skill set in its own right, which Roger Pielke first raised in his book, The Honest Broker, which I've written a lot more about and which I think is forgotten. Not every great scientist, no matter how good they are at evidence synthesis or at knowing their field, is necessarily a great person when placed in front of a politician or with the public. And they don't have to be. And you don't expect a science advisor, or a person acting in that role, to know everything about every domain of science. That's not their role. They know how to communicate. They know how to make sure there's alignment between the question and the answer. And they know how to say to the politician: this is what the science is telling you. These are the options that therefore emerge for you. These are the consequences of these options that I can see. But I'm not you. I cannot make the decisions on behalf of your citizens. You have to take into account all the normative considerations that you must, the fiscal considerations, the public responses, etc, etc. Now again, I'm not pretending that science is value-free. Don't get me wrong. I'm a great believer in the work of Heather Douglas, and particularly her work on inductive risk. You know, if you think about science, it's got two large areas where values come into play. One is in the choice of the question to study and the choice of the methodology for how to study it. The collection of the data and the analysis of the data, we hope, is value-free, in the sense that we're not putting biases on it. But the biggest values judgment, and the one that every person in science advice has to think about, is the inductive risk. That is the difference between what you know and what you conclude. Because if you go too early and you claim a certainty and you're wrong, there can be consequences. If you go too late, there can be consequences. When I was a science advisor, the thing at the top of my mind was always the inductive risk question: whether you have a sufficiency of evidence of sufficient quality on which to reach a conclusion that you can recommend to the government. The other point to make, which I think got lost quite a lot in the pandemic from the modelers, is the issue of uncertainty. All science has levels of uncertainty around it. Now, it could be epistemic uncertainty, it could be methodological uncertainty; there are many reasons for uncertainty. But we've tended, particularly in this pandemic, to use models, which themselves are models of models of reality, because we don't know everything about the virus, or all the people that are going to be infected by the virus, or the environment in which the people and the virus are both coexisting.
We've tended to work without giving estimates of probability or estimates of certainty around them. And I think that's very harmful. It's not a matter of saying science cannot help. It's a matter of science being honest about what we know and what we don't know. And I've always said the most important skill a science advisor can have is to say: we don't know the answer, or we have this level of uncertainty, or, even more importantly, science cannot answer that question. I mean, I think the climate change community learned the importance of being honest about uncertainty and probability. And they've done a much better job than what we've done in this pandemic.

Toby: Isn't there an important difference, though, between the conversation about climate change and the uncertainty that surrounds it, and the pandemic response, which is also characterized by a high degree of uncertainty because of evidence gaps or what have you? The obvious difference is one of speed. I mean, normally when we talk about science advice processes, those processes have built in a certain capacity for reflection and dialogue between politicians and scientists, and society as well, actually. And that gives both sides the chance to understand and get clear on the uncertainty. Even setting aside big, complex systemic issues like climate change, even in the clear-cut cases, it still happens slowly. You don't discover on Monday that a particular pesticide threatens insect life and then expect to rewrite your laws on Tuesday. You have time to think about it, to weigh up different factors and really tease out those uncertainties.

Peter Gluckman: Yes, but in any situation, science still has uncertainties, and we need to be far better about communicating them. For no other reason than that people trust you when you admit to the limits on your knowledge. And trust is so important in this situation, because we know a lot about the fact that undermining trust leads to a failure to comply with what is best to do in this situation, and so forth. Trust in government is critically important. Trust in knowledge, reliable knowledge, is important. And I think we've learned that when you communicate with dogmatism and you express hubris, you're less likely to be an effective communicator. When you communicate with humility and acknowledge the limits of science, the science you communicate is more accepted. And even in emergencies, to use an example: Prime Minister, given what we think is going on, there'll be a minimum of at least so many people likely to be affected, and a maximum of so many; we don't know what it will be in the middle. Prime Ministers can work with that. You don't need to go in there and say: Prime Minister, we think that next week there'll be 357 people admitted to hospital. That just opens you up to a whole lot of other things. And I think that models have been very important, in this as in climate change, in showing what might happen. But they should be used not to put models on a pedestal, but rather to indicate to government the urgency of action. Because it's been the urgency of action that's achieved the outcomes that appear to be better.

Toby: So following on from that, can I ask you about the relatively new phenomenon of the celebrity science advisor? I'm using the phrase slightly mischievously, but you know what I mean. Many governments have been keen to advertise their science advice credentials recently, for very obvious reasons. Advisors and advisory committees have really become household names in some parts of the world, which was unheard of a year ago, at least where I'm sitting. Has that got any implications, do you think, for how science advice is done? Do you think it has any bearing on this issue of scientific hubris or humility?

Peter Gluckman: It's a really interesting question, and I think it's a very good question, Toby. Let me preface it by going back to my nine years as science advisor. In that time I only ever stood alongside a politician twice, because I don't think it's the role of the science advisor to be seen there, sort of holding the hand of the politician on matters that are ultimately political decisions. On the other hand, neither of those situations was an emergency. They were urgent in a political sense, perhaps not urgent in another sense. This is a different situation, where the expertise of, generally, a chief medical officer or an epidemiologist or a science advisor is of importance to the media and therefore to the public. And in this situation, where decisions are happening on an almost daily basis in some countries, I think it is important that the media has access to the expert, the person giving the advice. How you manage that, though, so it doesn't become politicized or turn into, as you said, the celebrity who doesn't know how to back out from being a celebrity and may compromise the way they act, is something that is concerning. But I'd rather not make any definitive comments, just because I think it is a new phenomenon and it's one that we need to watch over time as we move through the pandemic and hopefully beyond it. What are the consequences of it? At one level, I quite like the idea that scientists can be seen as heroes. I like the idea that the scientists being held up are doing it for the right reasons, in the sense that they are communicating to the public. And I like all of that. I just want to be sure that they're still speaking truth to power.

Toby: All right, good. Well, I'll come back and ask you in 18 months. We'll have a conversation again and see what you think.

Peter Gluckman: Absolutely.

Toby: Then the other side of this great publicity boost is the question of public disagreements. So I'm not talking now about fights between scientists and politicians, that's one thing, but you can also have fights between scientists and other scientists. I'm sure you know one very vivid example is in the UK where a group of scientists who were uncomfortable with the government's approach set up what was essentially a rival advisory committee and got a lot of media attention. If there are those disputes, which seems almost inevitable in an area where there's a lot of uncertainty, where political tensions are running high and so on, where should the disputes play out?

Peter Gluckman: Well, I mean, we've got to establish one thing before we start. The nature of science is structured skepticism. And without getting too deep into the philosophy of science, we know that inherent in the scientific method is critique and criticism and frank discussion of other people's work. That's how science moves forward. So, from that point of view, scientists need to have mechanisms for commenting on science. That should certainly not be on an ad hominem basis, but on the basis of discussing the science. At times we've seen it degenerate into ad hominem debate, and that's not what the scientific culture is about. Beyond that, however, I think we've got to sort out what's really been going on. Clearly, technical debates are best held within technical committees and brought to a consensus that reaches the government. You can't expect governments to be scientific referees; that's the first rule of science advice, in a sense. You can't expect a politician to be able to peer review science and be a referee between different forms of science. The science advisory mechanism, whatever it is, should be trying to bring it together. Now, was the situation that you alluded to one of a perceived failure of the scientific advisory mechanism, or was it because there wasn't enough transparency as to what debates had taken place in Britain, in those committees, that led to David King setting up Independent SAGE? I don't have enough knowledge at this stage to form an opinion. We have had similar experiences in New Zealand, not in the pandemic, but in the past, and they tend to be more technical. What I've said to the scientists is: have all your technical debates in private. You must have them; that is part of the scientific method. But when you come to speaking to the public, you should try and speak consensually. What do you agree on? And even if that includes "we still have disagreements on the following", at least do so in a consensual way. And so we need to look at other issues too. And I don't want this to be picking on any individual, because I'm talking generically. When a scientist is getting up there in public, at a different time to where there is a mechanism in play, why are they doing it? Are they doing it because they want funding? Are they doing it because they were annoyed that they'd been excluded from the official mechanism, etc, etc? There are lots of motivations. Scientists are human. I think in the particular situation you allude to, quite clearly a significant part of the community was concerned about the general direction the government was taking. Whether that was the only way to do it, whether it was to do with the personalities involved, those are things for others to judge.

Toby: I mean, it seems like that's a strong argument for making a mechanism as broad a church as it can be, at least at the bottom. That way, by the time you get to the top, when you're communicating directly with a politician, you're able to present an outcome either where the disagreements have been settled or where they can be clearly characterized and explained. That way, where there is disagreement, it can take place within the mechanism and people don't feel the need to go outside it.

Peter Gluckman: I think in general that's true. Obviously, it depends on what the issue is, how technical it is or how non-technical it is, how diverse you need to be.
And also how quickly it has to be dealt with; those issues come into play too. And I mean, it is complicated. And diversity may only be a few people; it can still lead to a diverse set of views. I mean, two scientists, five opinions. So you have to… and this comes from the maturity of the people managing the system, the wisdom. I know it's a strange word, wisdom, but I think wisdom comes into play in how you integrate different bodies of knowledge, different views on the reliability of knowledge. And that's again the skill of the evidence synthesis component of what we've been talking about. Because actually at the end, I think, you know, in general, you come back to the point that politicians and policymakers are not scientists. The more consensually you can bring the views together, even with uncertainties or even with exclusions, the more likely it is that government will act on that advice.

Toby: Just to finish on this topic, because we are running out of time, unfortunately. Purely out of interest, have you been able to gather information about informal sources of science advice, outside the formal mechanisms?

Peter Gluckman: Yes.

Toby: Do you have anything to share about that?

Peter Gluckman: Well, I think there are a number of countries in which clearly people outside the formal system have had a lot of impact on the decisions that have been made. Now, those people have had that impact in one of two ways. Either they're very noisy in the public domain, in the media, or they're the friends of the politician or the policymaker. And again, there may have been some very good advice given through that route. But particularly with the friend leaping in and giving advice, the person who can ring the prime minister, or whom the prime minister reaches out to, you avoid all the checks and balances of a system. Now, I'm not saying it's led to any bad outcomes; I'm not suggesting that at all. What I'm saying is that in general, that worries me, just as it worries me if it's the noisy player outside the system and the considerations that person brought to bear had not been included in some way within the process, whatever that process is.

Toby: Okay, interesting. Looking back 10 years from now, do you think we will see this as a watershed moment for science advice? Is everything different from here on out?

Peter Gluckman: Well, I hope it will be. But I think we're so early in the story that it's hard to say. And it will need some very deep evaluation of what worked and what's not worked. And that can't really happen until the inevitable post mortems in different countries occur, and people like INGSA do deeper dives into the actual case histories, if you like; we've got a hundred-odd case histories to study. But I think it's done two things. It will certainly say that every country needs to have a process, some form of process pre-established for dealing with certain kinds of events. And that may encourage countries to think about science advisory mechanisms that can deal with non-acute issues as well. That would be a really interesting outcome. And whether it's done through the European models, or the anglophone models, or any other model, the model is not as important as the function. Is there a function of evidence synthesis? Is there a function of evidence brokerage? The second thing, though, which we've not discussed, and we could spend another two hours discussing, is why the various forms of preparatory evaluation were not taken into good account. In other words, there are countries with risk registers, there are countries that claim to have pandemic response plans. There was a pandemic response index produced by economists which showed, if you look at the OECD countries, almost an inverse relationship between the alleged responsivity of a country and what actually happened, though that's biased by the UK, USA and a couple of other countries. But beyond that, why are we not doing well? I mean, the risks of a zoonotic pandemic were well known in the scientific community, in the public health community, and I think in most governments.

Toby: They've been known more widely as well. I mean, as long as I've been working in science communication, there have been people who've popped up on public platforms and said, you know, we're overdue a pandemic, mark my words, it'll be along in a minute now.

Peter Gluckman: Absolutely. Yet when you look at it, very few countries were well prepared. I mean, one of the things that most people are looking at is the countries that were exposed to SARS, which tended to be quicker in making their responses: Singapore, Hong Kong, Taiwan, South Korea. Now, some of them did very well with the rapid response. They've had some subsequent problems, but that's another story. But at least their initial responses were rapid and they were well prepared. They knew what to get on and do, which says that they had learned the lesson from SARS. Other countries which had detailed pandemic plans, as it turned out, were not as prepared as people thought they were. And then there's another dimension, which is again for another conversation. What about the role of the international community, the role of WHO? Why were they so slow to declare a pandemic? Why would they not recommend the closing of borders, when it turns out the closure of borders was actually one of the most effective measures that a number of countries could take? There are questions to ask there. You know, given that so many countries of the world are low- and middle-income countries with less than complete science mechanisms, we've got to think about the role of the international agencies as well in this. They were slow to come to the party in many ways. And I think there are other questions to ask here. Why did it take so long for masks to be seen to be part of the control mechanism? Things like that that we will have to ask questions about in time.

Toby: Do you think COVID-19 was a dress rehearsal?

Peter Gluckman: I could answer that with a yes, or I could answer it with a no. Take your pick. Because it is a virus with a long incubation period, it's more likely to be one of lower virulence than viruses with short incubation periods. So in that sense, we were lucky. In the sense that it has a long incubation period, we're very unlucky, because that makes control mechanisms much harder. On top of that, because the virus is not behaving like other coronaviruses do, it's also conflated and confused matters. So if we take the virus and think about the medical treatment of it, it was treated in the early days as if it was largely a respiratory virus. And it took several months, largely as a result of those countries which sadly had terrible loads of very sick people, to understand that it was a systemic virus with an inflammatory component, which affects the immune system in complex ways, which changes the therapeutic approaches and the therapeutic needs, and which is now probably confounding some of our understanding of what the antibody tests mean, etc, etc. And so there's a lot that is both good and lucky about the situation: sadly, it's killed a lot of people, but not as many people as, say, Ebola would have if it had got out there and loose. But on the other hand, unlucky, in that it has killed so many people and has infected so many people, and has been so fundamentally disruptive to social lives, to economic lives, and all that follows with it. I mean, the emotional burden for many people in many countries, the mental health burden, not just now but one, two, three years from now, is a hell of a burden on humankind that this virus has induced. So I don't think we can talk about lucky or unlucky. I think what we have to say is: it's a shitty thing that the virus came along. We are still struggling to handle it in many, many countries. There are still a hell of a lot of unknowns. We're still a significant way from an effective treatment, or from an effective vaccine that is available to everybody who needs it, or in reality to anybody. And so this is an enormous burden not just on advanced countries, but on all countries, which is going to be with us for a large part of a decade, in the sense that even if the vaccinology goes very well, the cost to society, the cost to people's lives, the cost to our mental health, the cost to many people's dreams, is enormous; so much has been upended. But I think we need to get beyond counting the number of people who have died, acknowledging that that has been an awful cost to pay, and recognise that the living also have a lot of costs to bear moving ahead.

Toby: On that sobering note, Peter Gluckman, thank you very much indeed for talking to me. And I look forward to speaking to you again, when your study is complete, and you've got some results to share with us.

Peter Gluckman: Happy to talk, Toby. Now I can have time for a whiskey.
