Impact News
We know there is a lot going on out there and it's hard to keep track!
Here we share news, webinars, training, and anything else impact-related that we think might be useful.
Anyone who has attended our Communicating your Impact module will know just how deep this subject actually goes, and that doing it well requires a bit of careful planning. Our science and research training means we are great at thinking about an audience of our peers, but forging effective pathways to impact means forging connections with a much more diverse range of audiences who have very different needs. iPEN's recent webinar featuring Hannah McKercher and George Slim highlighted some great examples of this when your audience is policy makers. For example, the templates Hannah has developed have been tailored specifically as 'A3s', which are routinely used around government to summarise and share information with people who have very little time to read further. Recently some great new resources have been launched, which make a great addition to the toolkit. Scroll on to find out more!

The SCICOMM LAUNDROMAT - helping you 'take a load off'

The Scicomm Laundromat is a great new resource that spun out of PhD research supported by Te Pūnaha Matatini, one of the Centres of Research Excellence (CoREs). Te Pūnaha Matatini applies complexity science and inter- and transdisciplinary approaches to try and address the challenging issues we are currently facing. The website has lots of practical advice, examples, and resources that we encourage you to check out. There are really clear step-by-step instructions taking you through the 'cycle' and taking all the guesswork out of how to use the resources, including instructional videos on how to construct the washing machine! Anyone who's been involved with a co-design process will recognise how this kind of approach has been used. One thing the resources don't refer to is programme logic as a tool to help you think through some of the steps. Don't forget to use this too, as it will help 'turbocharge' your scicomm wash cycle thinking.

Victoria University has created this complete FREE self-paced online science communications course, funded by the Prime Minister's Science Communications Prize.
" You will hear from experienced communicators such as Siouxsie Wiles, Shaun Hendy and Rebecca Priestley. They will share their personal science communication journeys and give advice on how to enter the field. You will also learn about the current issues facing Aotearoa’s dynamic science communication community, through videos on topics such as communicating with Māori audiences, the ethics of science communication, and the importance of engaging media and policymakers in science issues." They estimate it will take you 6 - 10 hours to work through the course. Click here to find out more and to register.
On Valentine's Day, iPEN invited a range of representatives working across the Research, Science and Innovation sector to talk about supporting impact from research, for the first of what will hopefully be some semi-regular 'catch-ups'. COVID, working from home, and our group's work plan had meant that our focus had been fairly inward, especially as we worked on analysing the systemic barriers and enablers of impact. We used our report to kick-start the discussion, asking those who had gathered in person or online whether our findings resonated with them, and whether there were opportunities to collaborate where we had a shared interest in supporting the sector's impact.

What did we find out?

Our hui participants said that our findings resonated with their own experiences of what helps or hinders research impact. We talked a lot about how both discourse and experience reinforce that supporting impact is a 'team sport' (a recurring theme also identified in the recent SfTI report - click here for our post on this last month), and about the efficiency and effectiveness of taking a consistent approach across the sector wherever possible. Having an opportunity to come together and discuss what we are all doing was much appreciated, and together we identified a range of tangible opportunities and actions we could progress to work collaboratively and in a more joined-up manner.

What now?

iPEN is already starting to progress some of these actions, and we hope to share more of the fruits of this collective mahi in the months to come. This will include future invitations to discuss with others who missed out this time around. Click here if you'd like to read the full summary of our discussion (only 3 pages), as well as the agenda and list of who attended. If you're interested to find out more about what we discussed, feel free to get in touch too.

Recently the Science for Technological Innovation (SfTI) National Science Challenge published a report summarising findings from one of their longitudinal pieces of research – exploring how to build the capacity of New Zealand’s innovation system. This informative report does a really good job of summarising some of the key trends that have occurred overseas, outlining four key trends before detailing their findings.

What jumped out at us?

The whole report is worth a read if you have time, but what jumped out for us in particular was their reflection that ‘traditional science’ is no longer sufficient to address today's challenges, and that the shift to ‘mode 2’, ‘post-normal’ science requires a different combination of people, skills, and approaches. “Despite their significant differences, addressing missions and global challenges require individuals, teams and organizations to operate across multiple modes.” (p. 17, emphasis original). They note (trend 3) that New Zealand’s unique bicultural context has already set us on this journey, but that this comes with many challenges and constraints, and they summarise many of the key issues, from the effectiveness of diversity policies, to the challenges associated with protecting indigenous knowledge and data sovereignty, to the challenges associated with genuinely supporting Māori researchers.
We found their expanded typology of the roles a researcher or scientist can play particularly useful, as it clearly and succinctly describes some of the key mindsets and behaviours different ‘types’ of researcher use, and what this means for where and how they might engage with industry (see below, on page 25). The report also talks about how individuals (and organisations) might need to shift between modes.

Impact (including innovation) is a team sport

This report coincided with our own systems analysis of the barriers and enablers of impact. We were happy to read that our own findings are consistent and complementary. We also reflected on the different ‘tone’ of the report findings, which we would attribute to their application of systems thinking to this work. The importance of considering things like values and norms is a clear example of how considering mindsets is a useful way to unpack why we see what we see.

One area where this report complements iPEN’s own systems analysis (and our experience in supporting researchers) is that impact (including innovation) can only be achieved through the efforts of many. The report identified the importance of integrative capacity and the critical role ‘intermediaries’ play. These intermediaries can be people, groups, or organisations; they may hold this role either explicitly or implicitly, and they play a central part in ‘brokering’ and boundary spanning. We thought it was especially helpful how the barriers to engaging effectively were linked to tensions and differences in things like norms and values between research and industry. In our own training we encourage researchers to try and ‘empathise’ with their collaborators, which is all about trying to better understand their values, language, and pain points. It was refreshing to see the report highlight that researchers aren’t binary and that instead many hold multiple personalities / identities. Recognising that researchers have complex interests and drivers means better teams can be assembled, where people's skills and talents can be leveraged in ways that complement others.

“Intermediaries at all levels are crucial to a well-functioning science-based open innovation system”

Our own analysis highlighted the absolutely fundamental role trust and partnerships play. SfTI’s findings reinforce and complement this, and express how this is achieved through the lens of the roles actors play in the system.
We loved reading the report and we’re now looking at ways we can forge our own opportunities to integrate with SfTI and other NSCs to share learnings and insights like these.

Only got 5 minutes?

We suggest reading pages 8–9 as a great starting place, where the key questions and summary of actions are outlined. The four global research trends are summarised on pages 17–27. The full observations, findings and insights are on page 28 onwards.

The fundamentals of achieving impact from research is getting the knowledge and information that is (usually) contained in academic research publications into the hands and heads of the next and end users - the people who can start applying that knowledge. This means a different audience, and with that a need to change how you’re communicating, so you’re speaking their language. As researchers, a key audience for us is policy makers. They might be involved in operational policy (setting rules and guidelines for how things get done), or strategic policy, which is much more about setting the overall rules of the game (like legislation). In either case, someone who works in policy needs the ‘so what’ bits of your research provided to them in as short but comprehensive a way as possible.
What’s great about this resource is that Hannah has developed two short explanatory videos and included some really helpful completed examples, where you can see how the tips play out when summarising a real piece of research. All the templates and videos can be found here. We suggest watching the second video first. In this video they talk about a number of the kinds of challenges and issues we also cover in our own training, and it is a helpful reminder that these are the kinds of challenges anyone working in research experiences. Although you can find the full guide here, we suggest you download and review the DELTA example first. In this example you can see exactly how they’ve completed the policy brief, with two additional pages of explanation. The CellAg example then shows another completed brief. What is really helpful with both these examples is that they’ve been drafted as the classic ‘one-pager’. Anyone who’s worked in or with government knows that if you can’t summarise it on an A3 page, you’re probably going to lose your audience, so these are a particularly helpful resource for the New Zealand context.

Nearly four years ago iPEN changed from being a group of colleagues talking about how to help their organisations deliver research impact, to the more formal programme of work you can see today. However, before embarking on developing the shared tools, resources and training that are now iPEN's 'core' focus, we were advised to first understand what the 'current state' was across our organisations. This involved our expert adviser conducting a dizzying number of interviews (over 100!) with staff across all seven of our organisations, and we also circulated a survey which provided us with even more rich information from the hundreds of responses we received. We discovered that many of our scientists knew what research impact is, and were highly motivated to deliver it - many commented that was why they became scientists in the first place - what they struggled with was understanding HOW to do it. However, it also became clear from the feedback that there were some other things they felt were getting in the way of impact, and the roots of these ran deeper than just a lack of skills or resources. Fast forward two years - with our training programme up and running - we decided to return to this information to see if we could learn more about these challenges, and whether there were things iPEN could do to help.

We did a systems analysis

This time we took a 'systems view', as we are fortunate to have a couple of team members with expertise in systems thinking who observed that the kinds of challenges that had been described had much more complex 'roots'. So our work began in earnest, speaking with researchers and scientists who were recognised as having had impact about what had helped and hindered them. Along the way we workshopped what was emerging with wider colleagues across the RSI system, to check whether what we had identified resonated with them and whether our analysis and opportunities for action made sense. Completely coincidentally, Te Ara Paerangi was launched, so we have also been sharing our findings and thinking with MBIE, including making a submission and inviting them to join workshops we ran during 2022.

So what did we learn?

First, we learned that doing a systems analysis is indeed a messy and intellectually challenging process.
It was tough to realise that 'solutions' are a misnomer in systems (indeed, looking for them is a symptom of 'simplistic linear thinking'), and that we needed to think about where change might be useful, using our deeper understanding of how things interact to guide this intellectual detective work. Notwithstanding the challenge and the 'mess', some themes came through in our analysis loud and clear as either supporting or slowing down our delivery of impact.

The importance of relationships was identified time and time again. These trusted relationships, built between individuals over time, act very much like mortar between bricks - helping to stick bits of science together and giving these 'pipelines' of work strength and resilience. We heard that research impact requires a whole range of other activities that sit outside/beyond the margins of what is traditionally considered doing science, but these invisible or unseen activities are in fact critical to the process of delivering impact. We've called this the 'impact creation cycle' and mapped it against the iceberg concept to explain how the current system operates. Perverse or competing incentives in the system's design were also talked about, and these are experienced at a global level. For example, given limited time, scientists described the 'forced choice' to focus on publications over other 'impactful' activities, as publications were part of their contracts and support career progression (a challenge now routinely identified globally). In total we identified seven interconnecting themes, which expand to even more when considering what achieving impact might look like from a Te Ao Māori perspective. The full summary of our findings can be found by clicking here, as well as the companion report identifying where opportunities for action lie. While we've now learned that 'solutions' are not what we should be looking for, we found the application of 'leverage points' from the systems literature a powerful way to really understand where change could be more or less effective and WHY.

So what now?

iPEN's mahi is focused on acting, and on supporting the HOW of science. So what actions are we taking now?
We are in the early stages of figuring out what comes next, but sharing our findings and the opportunities for action is one important first step. We are also taking the conversation wider, with a hui organised with a broader collection of colleagues across the system to share and discuss this work. As we say in our 'Making Sense of Impact' webinar, impact is a collective endeavour, so we'll keep you posted on what collective efforts might eventuate as the conversation grows.
We know that this time of year is pretty busy for our CRI staff, so we’ve tried to pick the eyes out of some of the insights we heard that are relevant to our context. If you’re keen to do your own deep-dive into the content, you can check out the programme for this year here, and find out more about the speakers here. If you want to sign up for your own access, become a Research Impact Academy member here. Here is our summary of each of the four speakers we have included. Scroll down further for full details:
Finding great collaborators who can help visually communicate messages has been important. At the start of the pandemic, when she saw a key message about flattening the curve that she felt could be more effectively communicated, she simply emailed a cartoonist she thought would do a great job, and ‘the rest is history’. Her partnership with Toby Morris through the pandemic, with their work released under Creative Commons, has seen their cartoons picked up worldwide, even making their way into WHO advisory material. Another person built a website that could automatically translate their cartoons into any language, which saw their science communication reach even further. Siouxsie spoke honestly about the fact that scientists aren’t always the right messenger, and how important trust is when it comes to people really listening to what the evidence is telling us. Improving our science communication directly can certainly help build this trust, but sometimes the right approach is instead to work with others who can relay the messages. “Science is no good when it is separated from people”, and she noted that because scientists are part of society, it's not just important that we learn to talk, it's also important that we learn to listen, so we can focus on the problems people would like us to study.
“Many researchers retire without translating. You can have multiple papers in Nature or Science, and of course basic research is important but that translation is more important. At the end when I am in my deathbed I want to go away leaving a legacy.”

Chamindie spoke of a pragmatic approach to how she has pursued her research, which has been driven by a deep commitment to democratising access to health care. She noted, with some irony, that her research, which looks at biomarkers in saliva and other body fluids, has progressed with support from industry – which she has quite deliberately focused on and nurtured – rather than public funding. Her first public health grants have only come since the pandemic struck. She also offered some really interesting reflections on the care that needs to be taken if any research is progressed down the commercialisation route. She described how many institutional models use the old-fashioned approach to adoption, where the baby is handed over with no further involvement from the mother or parents. She thinks this approach has real flaws, and it means universities or institutes can actually lose out on the return on their investment when they take the researchers, who are usually the best placed to fully understand the potential of their discovery, out of the picture.
He and his colleagues applied an approach we at iPEN love to champion – using the KISS principle to keep it simple – and explored whether they could ‘unpack’ how you do impactful research in order to develop simple tools for researchers. Through their research they applied the Kipling method (the six questions: who, what, where, when, why and how) to see if they could tease out what led to impact. They came up with what is effectively a research impact diagnostic tool by developing an initial set of criteria and testing the process against research with recognised impact (initially with Emerald Publishing Real Impact Award winners, and then impact case studies from the REF - Research Excellence Framework). You can complete the diagnosis for free here (just scroll down to the bottom of the page, or click on this link). He noted that people are often a bit disappointed with their results, because we often lack the training to be properly impactful. But he emphasised that the tool has been developed specifically to then help direct you to behaviours and actions that support more impactful research – what he called the ‘missing research curriculum’. Soon they will publish a ‘Research Impact Playbook’ which will help fill this missing curriculum, through the provision of tools and other approaches to help researchers strengthen the behaviours and actions they’ve found underpin impact. You can subscribe for updates on their website.
He described how these kinds of assessment processes codify explicit knowledge – the kind of knowledge that can be easily converted into numbers, words and concrete artifacts – which lends itself to being used as evidence of impact.
He noted that any assessment framework structured around demonstrating proof of impact in this way privileges explicit knowledge over tacit knowledge – the kind of knowledge that is all about practice and ‘know how’. He said that instead of assessment frameworks focusing on the easy-to-measure things at the tip of the iceberg, we need to focus on the underlying knowledge and conditions that are required for impact. This is much harder to see and to measure – but critical to understand if we are serious about impact.

By Sudesh Sharma and Suzanne Manning - ESR Social Systems Team

Crown research institutes (CRIs) are government-owned companies that do scientific research for the benefit of Aotearoa New Zealand. There are seven CRIs in Aotearoa, which were set up to lead research and science that address New Zealand’s pressing issues and achieve economic, social and environmental impact. For example, ESR is the CRI that plays a critical national role in health, food safety, groundwater, radiation and forensic sciences, and is a key contributor to public health, environment and biosecurity outcomes. While each CRI is focused on its core areas of science and research, the end goal for all of them is to create positive impact for society. A recent review called CRIs an indispensable part of the science system, but said there were concerns that there is no mechanism to evaluate the research impact of CRIs. How can CRIs, the government and the public know whether CRIs are being successful if there is no rigorous process to assess the societal benefits?

The Ministry of Business, Innovation and Employment (MBIE), the steward and major funder of CRIs, has recently shifted its focus to investing in research that has potential for social, economic and environmental impact. MBIE wants government-funded science to have a clear and strong “line of sight to impact” and has been encouraging CRIs to use a logic model framework (Figure 1) to show this “line of sight”. Using consistent language in logic models helps to compare and evaluate across CRIs, and to focus on impact.

Figure 1. MBIE's definitions of key elements of a logic model framework for research impact.

Of course, there are challenges to creating “line of sight to impact”. One challenge is the limited concrete mechanisms for measuring impact. Most CRI monitoring and evaluation indicators have been output-level measures (e.g. knowledge and skills) and sometimes outcome-level measures (e.g. policy influence), but it is difficult to measure the impact of the science out in the community. It does not help that the community and societal impact of research tends to be incremental and long term, in direct contrast to funding mechanisms that have short time frames and pre-set deliverables. These funding mechanisms have arisen from neoliberal approaches that promote competition between CRIs, a “contract for services” approach, and a requirement for CRIs to generate commercial benefits from science. A focus on impact would require change at both the MBIE/policy level – determining science priorities, creating longer time frames, flexible resourcing for developing and maintaining collaborative relationships, willingness to allow a research agenda to develop in consultation with communities – and at the CRI/science level – explicit mechanisms for measuring impact, a focus on authentic and reciprocal relationships, and commitment to allow a research agenda to develop in consultation with communities.
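As a purely illustrative aside from us at the Impact News desk (not part of the authors' piece), the "line of sight" chain that a logic model describes can be sketched in a few lines of code. The minimal Python sketch below is our own; the level names follow the generic inputs–activities–outputs–outcomes–impacts chain, and the example programme and entries are entirely hypothetical rather than MBIE's actual definitions.

```python
from dataclasses import dataclass, field

# Purely illustrative sketch (ours, not MBIE's): the level names follow the
# generic logic model chain, and the example entries below are invented.

@dataclass
class LogicModel:
    programme: str
    inputs: list[str] = field(default_factory=list)      # resources invested
    activities: list[str] = field(default_factory=list)  # what the research team does
    outputs: list[str] = field(default_factory=list)     # direct, countable products
    outcomes: list[str] = field(default_factory=list)    # changes in knowledge, policy, practice
    impacts: list[str] = field(default_factory=list)     # longer-term benefit for society

    def line_of_sight(self) -> str:
        """Render the chain from inputs through to impact in one consistent format."""
        levels = [
            ("Inputs", self.inputs),
            ("Activities", self.activities),
            ("Outputs", self.outputs),
            ("Outcomes", self.outcomes),
            ("Impacts", self.impacts),
        ]
        return "\n".join(f"{name}: {'; '.join(items) if items else 'TBC'}" for name, items in levels)

# A hypothetical programme, described against the same levels every time.
example = LogicModel(
    programme="Hypothetical groundwater quality programme",
    inputs=["MBIE funding", "regional monitoring network"],
    activities=["sampling and analysis", "co-design hui with councils"],
    outputs=["peer-reviewed papers", "open dataset", "one-page policy brief"],
    outcomes=["council adopts revised monitoring guidance"],
    impacts=["safer drinking water for communities"],
)
print(example.line_of_sight())
```

The code itself is beside the point; what it illustrates is that a shared, structured vocabulary makes comparison across programmes possible and makes gaps in the chain (the 'TBC' levels) immediately visible.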
Another challenge is the traditional approach of Western science that divides research projects and funding into separate disciplines. Yet it is increasingly being recognised that transdisciplinary and collaborative science research projects have the potential to yield greater impact, by addressing the bigger, more complex problems using a variety of different disciplines. This requires collaborative working not only within a like-minded research team, but between researchers with very different perspectives, and even completely different worldviews, as when partnering with kaupapa Māori researchers. Most importantly, social science could be a bridge among multiple knowledge systems in an Aotearoa context. There has also traditionally been a difference in status between science based on ‘things’ and science based on ‘people’. Yet this is a false distinction, as there are important roles for social science (science with people), especially when considering the societal impact of research and innovation. Social science contributes not only to assessing the impact of science on communities, but also to ensuring beneficial impact of science for communities, through input to the research agenda, collaboration while undertaking research, and feedback and sense-making of results.

A question to be asked is: given it is relatively well established that complex problems require holistic, transdisciplinary approaches, why have we not set up our science system to operate in this way? One explanation can be found in the system dynamics archetype of “Shifting the Burden”, as shown in Figure 2.

Figure 2. Shifting the Burden archetype. Solid lines with + signs indicate a causal effect in an increasing direction. Dotted lines with - signs indicate a causal effect in a decreasing direction. Interruptions to the lines indicate a delayed effect.

This causal loop diagram explains what happens when there are two competing options for fixing a problem: either apply a short-term fix that addresses the symptoms only, or identify and apply a longer-term fix to the underlying issue itself. In the diagram, the top balancing loop is the short-term fix, where mono-disciplinary ‘hard’ science solutions are sought to remove the symptoms, resulting in fewer symptoms, but only temporarily.
The bottom balancing loop addresses the problem in all its complexity, using transdisciplinary approaches and being informed by mātauranga Māori. However, it is the more difficult option and takes a longer time before the impact is seen. More resources get put into the piecemeal science approach because that is what gets the ‘quick wins’, which consequently undermines efforts to address the roots of the complex problem. As a result, a reinforcing addiction loop is formed, because the burden of the fundamental fix is too hard and business as usual continues.

For example, one of the key performance indicators for CRIs is peer-reviewed publications. They are considered evidence of success, and good publication records support scientists to get more research funding. However, publications are simply outputs, not impacts. There is no equivalent recognition given for the relationship-building and transdisciplinary collaboration that can generate genuine societal impact. The Shifting the Burden archetype could also be called the “frustrating” archetype for those who are working in the science impact space. The way out of this trap is courageous and bold systems leadership across the system, where a plan for science, research and technology is developed that has an intergenerational view of impact, takes a transdisciplinary approach, and is underpinned by mātauranga Māori. Why do this? Because arguably the current approaches are not solving our complex problems. And if nothing changes – then nothing will change.

A recent article in Nature by Melissa Flagg outlines the tensions researchers face when launching their proposals into competitive funding environments where novelty, rather than usefulness, is a key quality criterion. However, securing funds to embed evidence-based solutions is often a challenge. The potential pathway to outcomes and impacts is disrupted by the inevitable need to propose yet more novel research rather than progressing toward impact, in order to secure much-needed funding via competitive processes. Novelty is often bundled with concepts of risk, pushing researchers into realms where current knowledge and accepted principles are applied in ‘unproved or speculative’ ways, new methods or ideas are proposed, and incremental or ground-breaking advances are made. The issue is, does rewarding considerable novelty and risk necessarily ensure the research will deliver usable and useful findings? How much of this type of research can New Zealand afford, compared with nations that deploy substantial funding pools to support basic, untargeted research?

Applied research that has a clear outcome in sight for a broad range of users within a realistic timeframe can deliver real change and benefits for Aotearoa New Zealand’s environment, society and economy. But it may not have a novel and highly risky launching pad. Instead, it might require more resources to support implementation, adoption and adaptation to make a difference. It is a challenge Melissa notes when she asks, ‘Who funds implementation?’
Yet there is still a potentially huge number of beneficiaries for this highly valuable research who may not have the scale, level of co-ordination or voice to directly resource or leverage funds to support this research … yet!
Melissa also suggests that if funders ‘truly cared about broader impact, it would be tracked, measured and used in reward systems’. She calls for science ‘quality’ metrics to be expanded to ‘recognize making real-world change’. And she suggests researchers, institutions and funders put more effort into partnering with others who can help take ‘shiny new ideas’ that otherwise would ‘never go beyond sparkle’ towards solutions. Given the considerable need to deliver outcomes from science to safeguard our planet at this concerning time in our history, Melissa calls for funders to think again about the pressure they create for researchers to ‘simply go on to the next proposal – the next big, new idea, constantly chasing novelty, the bleeding edge of science’. According to Melissa, this is a waste. She calls for science policy analysts and funders to make generating practical solutions ‘enticing to researchers’ and not just something that might be regarded as ‘volunteer work – not part of a scientist’s job’. It’s exactly this sort of science that communities pay their taxes to support. There’s no suggestion that curiosity-driven science doesn’t have its place, but it’s time to consider whether discoveries without impact are unnecessarily using up precious time and resources.

That's right - even beer (or in this case a calendar for the brewing of beer) can be a great candidate to highlight some of the basic principles of data visualisation.
In our Communicating Your Impact workshop we link to a few of our favourite experts, one of whom is Stephanie Evergreen. In this blog she illustrates how concepts like layout, colour choice, and the use of graphics/images can transform something really ugly and hard to read into something that is informative at a glance. Why do we care? One of the many ways that we create pathways to impact is through communication. Hard-to-read graphs and graphics are like loud noise at a cafe: distracting, and sometimes they just make us leave. Click on the link above to learn more. Stephanie is also really active on LinkedIn and regularly shares little pearls of wisdom, so if you're on LinkedIn follow her there too.

The Healthier Lives NSC has been tasked with researching how to significantly reduce the death and disease burden, and achieve equitable health outcomes, for four of our biggest 'killer' non-communicable diseases (cancer, cardiovascular disease, diabetes, and obesity). Like many of us, the team at Healthier Lives were feeling frustrated at how seemingly little evidence is making its way into policy and practice - the pathways to having an actual impact from research. As a result, they organised a hui consisting of a series of webinars (which you can watch here) and then an online workshop to "explore ways to strengthen" these pathways so they could be embedded into our new health system. In the workshop, attendees were asked to consider a series of questions about how to strengthen these pathways.
What came out of the workshops?

Workshop participants agreed that "Aotearoa New Zealand needs more transparent and system-embedded processes" if these pathways from evidence to impact are to be achieved. The questions prompted the identification of SIX KEY ELEMENTS and FIVE GUIDING PRINCIPLES that are needed if research evidence is going to be effectively embedded and actioned via policy and practice change. The summary of findings from this workshop has been documented here: https://healthierlives.co.nz/webinar-pathways-between-research-policy-and-practice/, and is summarised in a diagram on page 3 of that report. Only through the implementation of these elements and principles will we be able to "build the necessary processes and foster the collaboration required to ensure the best research is utilised to its fullest potential for the health of Aotearoa New Zealand".

So what jumped out at us?

Reading through the summary of findings had us nodding our heads A LOT. iPEN has been doing its own work on trying to understand the systemic barriers and enablers to delivering impact from research, and it was reassuring to read many of the same themes emerging. The diagram notes the critical importance of the 'who and how' (in the guiding principles, and in multi-directional access to expertise), which also links nicely to our blog post about Reed and Rudman's recent article, Rethinking research impact: voice, context, and power at the interface of science, policy, and practice. They note (when summarising why there is a need for strengthening pathways) that many of the historical mechanisms for bringing wider voices in - mechanisms that often created or facilitated these pathways - have been largely disbanded. Further, they note that although ad hoc groups might be useful for responding to discrete events (like COVID), they are probably insufficient to address the more 'wicked' diseases this NSC is focusing on, as their causes are multiple and systemic. (Sidenote - those of us who are systems thinkers may be having a 'duh' moment right now. Boundaries, perspectives, and relationships are our starting points...)

Measuring what matters (what's valued) - even if it's hard. Equity is a core focus for our health system, and ensuring this is actually being realised means figuring out ways to evidence it, so change can be monitored and used to inform decision-making at all levels (local, regional, and national). "While many [principles like tikanga] are widely accepted in rhetoric, they must be reiterated as they require commitment and action if we are to transformatively change the system."

Transparency means communicating information in ways that are understandable to others, and this is fundamental to building trust and confidence in our findings.
In our iPEN training we acknowledge that when we spend so much of our time talking to our professional colleagues, we can forget that not everyone speaks our science language. This can be a challenge, as we have been trained to communicate in this way (i.e., with our academic peers, through publications, using the research equivalent of legalese). Much like how learning another language is best done via the immersion method, figuring out how to communicate well with your stakeholders means getting to know them, and then using language they understand.

How does all this link to our training and the advice we give? Figure out who your stakeholders and partners are (or are likely to be), and involve them in your projects from the START. Try to recognise, understand, and manage power dynamics as part of this process, and know that relationships take time to build and maintain (and that this happens before, during, and after the projects themselves). Understand their context and use language and words they understand. If you can hit all these marks, you'll be well on the way to strengthening your own pathways to impact.
iPEN is a collaboration across all seven Crown Research Institutes in New Zealand. We're a collection of colleagues all working towards supporting greater impact from our science and research.