Analysis & Policy Observatory (APO)


Name (Individual/Organisation)

Analysis & Policy Observatory (APO)

Responses

Q5. Please provide suggestions on how the ARC, researchers and universities can better preserve and strengthen the social licence for public funding of research?

The social licence for public funding of research could be strengthened by making the research more visible and accessible to, and usable by, the public.

The ARC should encourage applicants to incorporate methods and costs for maximising engagement with their research in their proposals.

The Analysis & Policy Observatory (APO) makes research more visible and accessible to, and usable by, the public. APO sources and catalogues material published by organisations (including research institutes, centres, and universities) and makes it discoverable and accessible in its open access repository (apo.org.au), as well as across the internet (APO is indexed by all major search engines) and university libraries (via Informit).

Material published by organisations (also referred to as grey literature) is the most accessible and usable research output for a non-academic audience, thereby maximising engagement and impact. Creating a non-academic research output (whether a report, infographic, set of guidelines, video, or podcast) is the main method used by academic researchers seeking to maximise engagement with, and the impact of, their work.

APO also disseminates this material to its 15,000 newsletter subscribers, of whom approximately half work across all levels of government and a quarter work in not-for-profits. Our audience comprises policymakers, practitioners, community members, advocates, activists, analysts, researchers, and the general public.
Research outputs on APO are shared by members of the general public via social media. With support, APO could be used by more academic and ARC-funded researchers to share their work, and our audience has the potential to grow significantly.

A badge or flag indicating ‘Funded by the ARC’ could be created and added to research outputs. A similar approach was taken with Collections (apo.org.au/collections) that were funded directly by LIEF grants: the ARC logo was displayed in the ‘Sponsors’ section of the Collection landing page.

APO is already used by research institutes, education providers, and government to engage an audience in an evidence base. The ARC Centre of Excellence for Automated Decision-Making and Society has funded APO to establish a repository (and archive) of its research outputs and to promote these and its events to our audience (apo.org.au/collection/316968/automated-decision-making-society). Its outputs are already attracting a high level of engagement on APO and social media channels.

Q8. With respect to ERA and EI:

(a) Do you believe there is a need for a highly rigorous, retrospective excellence and impact assessment exercise, particularly in the absence of a link to funding?

(b) What other evaluation measures or approaches (e.g. data driven approaches) could be deployed to inform research standards and future academic capability that are relevant to all disciplines, without increasing the administrative burden?

(c) Should the ARC Act be amended to reference a research quality, engagement and impact assessment function, however conducted?

(d) If so, should that reference include the function of developing new methods in research assessment and keeping up with best practice and global insights?

a. Do you believe there is a need for a highly rigorous, retrospective excellence and impact assessment exercise, particularly in the absence of a link to funding?

Yes, there needs to be a process; as the saying goes, ‘what gets measured, gets managed’, or less commonly, ‘you treasure what you measure’. Similarly, a link to funding in some way, whether direct or indirect, should be the longer-term goal.

But it could be done differently. There is an opportunity to streamline quality and impact measurement and build it into ongoing data collection, rather than running a large retrospective exercise every several years. If data collection and assessment become part of day-to-day research management, they will become more useful and easier to respond to at a university level.

Some of this data is already collected, for example in the end-of-project report for ARC LIEF grants. Recipients are asked to report on how many people used the facilities, among many other measures, at the very end of the project. As a result, these measures receive little attention from chief investigators during the project, and it is unclear what the ARC does with the information.

Rather than doing a separate exercise, effort could be diverted to evaluating the funding that has been distributed to support better understanding – for both the ARC and universities – of how quality outcomes and impact have been achieved.

b. What other evaluation measures or approaches (e.g. data driven approaches) could be deployed to inform research standards and future academic capability that are relevant to all disciplines, without increasing the administrative burden?

In order to maximise the return on the significant investment in research, the ARC needs to actively facilitate not only the dissemination of research, but also the translation and implementation of new knowledge (as identified in the ARC Strategy 2022–2025).

A helpful framework for understanding the application of research is to think of it in stages: generation, dissemination, translation, and implementation. A useful data-driven approach would develop measures at each stage of this process. The consultation paper acknowledges the time lag from research to application. However, it is often possible to measure the steps taken to get there, in terms of dissemination and engagement, much closer to the time when the research was generated. Indeed, the most effective engagement begins at research inception, not at dissemination. Measuring dissemination and engagement in this way would also support a more prospective paradigm.

By actively facilitating all stages of the research application process through the provision of infrastructure, the ARC would also have greater means to measure and evaluate it.

Dissemination could be improved by having an accessible centralised repository of research outputs that have use beyond academia. Academic research outputs require translation into something accessible and meaningful to people who can use them.

As mentioned in Question 5, this infrastructure already exists. The Analysis & Policy Observatory (APO) is a digital information management system and repository of material (including text, audio, and visual) published by organisations. Academic researchers translate their work into accessible research outputs and add them to APO to reach our audience of 800,000 website users and 15,000 newsletter subscribers.

We also track the level of engagement with this work through page view, download, and share (via social media and email) metrics. These are the most reliable alternative metrics available to academic researchers. While page views indicate how ‘click-worthy’ a report is, downloads are a more accurate indicator of engagement. In the last year, APO’s repository had more than 3 million page views and 500,000 downloads.

To further support research translation, the APO platform could be extended to support engagement between academics and government, the community sector, and industry. Greater platform capability would provide more opportunities for measurement. For example, APO already has profile pages for all authors and publishing organisations; these could evolve to provide opportunities for interaction and collaboration with decision-makers and the public, which could also be tracked and measured.

APO’s unique database of research outputs, policy documents, and practitioner guidelines also provides an opportunity for data mining and natural language processing to identify links between these documents and research.

The approach of measuring dissemination, translation, and implementation is relevant to all disciplines, but it would likely be measured in different ways depending on the underlying purpose of the research: to create academic knowledge, knowledge for public or practical use, innovation, or a product. Engagement and impact look markedly different depending on the aim and intended audience. The metrics could be adjusted for each purpose and considered within context. APO can support the measurement of engagement with knowledge for public or practical use and, to a lesser extent, innovation.

If the ARC does go down the path of using data-driven approaches to measure the engagement and impact of research, it is crucial that universities and academics are provided with the means to do this. Specifically, ARC grant recipients should be able to use a portion of their funding to maximise and measure engagement with, and the impact of, their work.

The consultation paper states, “It’s yet to be determined whether a metrics based approach can resolve the latter issue of privileging style of case studies”. Research impact can be quite nebulous and manifests differently in different contexts; it would therefore be irresponsible to rely on a single method of measurement. A mixed-methods approach that considers both qualitative and quantitative evidence, with some defined and standardised quantitative measures, would be more useful in understanding how impact was achieved.

c. Should the ARC Act be amended to reference a research quality, engagement and impact assessment function, however conducted?

Yes, with broad scope, so that how this is done can evolve and innovate.

d. If so, should that reference include the function of developing new methods in research assessment and keeping up with best practice and global insights?

Yes, but best practice and global insights need to be rigorously investigated to determine whether they are worth adopting, rather than being taken at face value and adopted as the latest ‘trend’ only for it to become irrelevant or useless.

Q10. Having regard to the Review’s Terms of Reference, the ARC Act itself, the function, structure and operation of the ARC, and the current and potential role of the ARC in fostering excellent Australian research of global significance, do you have any other comments or suggestions?

Feedback in relation to ARC Linked Infrastructure, Equipment and Facilities grants:

APO has received LIEF grants in the past, which have supported its longevity (it was established in 2002 at Swinburne University). APO is on the path to self-sustainability, but there is no support for this transition, and we may run out of funding before we reach this goal. Moreover, APO does not qualify for the LIEF program, as our primary purpose is to support the access and use of research outputs and research within government and not-for-profits. Although we have the capability, a smaller proportion of our activity supports the conduct of academic research.

To facilitate the ARC’s goal of enabling world-leading research, translation and impact, there is an argument for extending the scope of LIEF funding beyond the conduct of research to support the dissemination, translation, application, and impact of research.

It is also disappointing to see new LIEF grants being awarded to academics to establish new policy repositories or observatories when the infrastructure already exists. There needs to be a more holistic approach to the funding of research infrastructure, which would complement the Australian Research Data Commons’ approach to establishing a research and data ‘commons’.

Submission received

14 December 2022

Publishing statement

Yes, I would like my submission to be published and my name and/or the name of the organisation to be published alongside the submission. Your submission will need to meet government accessibility requirements.