Anonymous #16

Name (Individual/Organisation)

Anonymous #16

Responses

Q1. How could the purpose in the ARC Act be revised to reflect the current and future role of the ARC?

For example, should the ARC Act be amended to specify in legislation:
(a) the scope of research funding supported by the ARC
(b) the balance of Discovery and Linkage research programs
(c) the role of the ARC in actively shaping the research landscape in Australia
(d) any other functions?

If so, what scope, functions and role?

If not, please suggest alternative ways to clarify and define these functions.

The ARC Act should allow agility to respond to new issues and problems, and to changes in the broader research ecosystem, so including specifics such as the balance of Discovery and Linkage programs would not be appropriate (1a-b). There should be more attention to equity, not only with reference to gender but also to career stage and fields of research, particularly the balance between STEMM and HASS fields (1d). It would be useful to explicitly note the role of the ARC in actively shaping the research landscape in Australia, but such a discussion should be accompanied by concrete specifics about how this does or should occur (1c). It is important to amend the ARC Act to reference a research quality assessment function and a research impact assessment function, given these have become important parts of the ARC's role that warrant explicit recognition.

Q3. How could the Act be improved to ensure academic and research expertise is obtained and maintained to support the ARC?

How could this be done without the Act becoming overly prescriptive?

I do not think this issue should be addressed via the Act but rather through improved processes within the ARC.

Q4. Should the ARC Act be amended to consolidate the pre-eminence or importance of peer review?

Please provide any specific suggestions you may have for amendment of the Act, and/or for non-legislative measures.

The ARC Act should be amended to stress the importance and pre-eminence of peer review processes in grant awards. Any ministerial deviation from advice received from the ARC following peer review should require a detailed, publicly available rationale explaining how the individual proposal exhibited significant procedural concerns or similar, or contained content deemed clearly in conflict with explicit, agreed aspects of the national interest (such as posing a threat to national security). Ideally this rationale should be subject to review by, and advice from, the ARC before the decision is finalised.

Q5. Please provide suggestions on how the ARC, researchers and universities can better preserve and strengthen the social licence for public funding of research?

A National Interest Test (NIT) style of statement could remain a useful way to summarise potential benefits (construed broadly to include scientific, economic, social, cultural, environmental, and so on) for public consumption. However, it should be framed in terms of the overall goals of the program in question, which in the case of the Discovery Program include contributions to knowledge. The ARC and the government should take seriously the importance of contributions to knowledge in any outward-facing information provided to the public, and should not appear dismissive of these types of research projects, including during ministerial review processes and in subsequent publicity following funding announcements. There is clear benefit to be gained from deepening public understanding of research, and the ARC could further this with revised guidelines for the NIT or similar outward-facing communication strategies.

One solution to the current problems with writing NIT statements, and with asking researchers to rewrite them, would be to provide tick boxes capturing all of the potential types of benefit, including contribution to knowledge, in line with the ANZSRC Type of Activity (ToA) and Socio-Economic Objective (SEO) classifications; this would signal that all of these are considered significant benefits. This could be followed by a revised format for the NIT that draws on available scholarship and evidence from public understanding of science and related fields, which have developed evidence-based standards and advice about how science (broadly construed and not limited to the natural sciences) can best be communicated to various publics.

These types of efforts should be accompanied by more refined communication from the ARC and the minister about the value of funded projects, particularly those that on the face of it might not appear to non-experts to be relevant or good value for money. This includes promoting the need for Australia to have international prominence in many fields of research as part of our national reputation, and the close link of that reputation (especially in non-applied fields and many HASS fields) to important economic drivers such as the strength and quality of our tertiary education sector, international student markets, and our ability to collaborate with high-quality international partners. These sorts of values are not difficult to communicate and can be directly linked to governmental/ARC investments even in non-applied or fundamental/basic research projects.

Q6. What elements of ARC processes or practices create administrative burdens and/or duplication of effort for researchers, research offices and research partners?

The Review Panel should consider proposing pilot efforts that rely on much more minimal information provision, similar to what was recently implemented in the Industry Laureate round (though perhaps with greater length for the project narratives themselves). Many funding bodies now use an initial expression of interest process and then shortlist before accepting full applications; this strategy is worth considering as it would allow a quicker turnaround and reduce investment by academics and universities in lengthy applications that are unlikely to be competitive. This type of initiative would clearly reduce administrative burden (and also relates to Q7 below). However, it would need to be accompanied by robust assessment processes appropriate for these sorts of expressions of interest; for instance, the relatively small number of assessments provided for each bid in the initial CoE rounds is insufficient, and makes this initial process potentially inequitable and inappropriate given the amount of time, labour and funds spent preparing these expressions of interest.

It is clear that delays to funding announcements, changes to grant rules and deadlines without notice, and related recent concerns must be prevented in the future if the university sector is to keep faith in the processes. This has been addressed in part by the Minister's recent commitment to grant rounds being published and delivered on predetermined timeframes, but it should also be emphasised in the outcomes of this review process.

Australia's Science and Research Priorities must be reviewed to make them up to date but agile, and to allow them to better capture the range of research priorities that are in fact aligned with current needs, including those associated with fundamental research and priorities beyond the natural sciences, such as cultural and social priorities.

A more robust complaints handling and appeals mechanism should be established, particularly for requests to set aside assessments which are outliers or appear to come from wholly unqualified reviewers. Based on feedback from several research offices and personal experience, the typical ARC response to date has been that the College of Experts will not take such an assessment seriously but that the assessment in question will not be excluded. This type of response makes the underlying processes non-transparent, whereas excluding such assessments from the processes associated with banding and consideration for funding, where appropriate, would be considerably more ethical and respectful of researchers' efforts, and would foster the ARC's reputation for research integrity.

Q7. What improvements could be made:

(a) to ARC processes to promote excellence, improve agility, and better facilitate globally collaborative research and partnerships while maintaining rigour, excellence and peer review at an international standard?

(b) to the ARC Act to give effect to these process improvements, or do you suggest other means?

Please include examples of success or best practice from other countries or communities if you have direct experience of these.

Shorter-term funding schemes would be desirable in many fields, especially for early career researchers. Targeted research calls would also be appropriate if guidelines are clear and transparent (some of the SRI rounds have not been particularly robust in terms of review and award processes). An expression of interest stage would be desirable for fellowship schemes and perhaps even for larger Discovery Projects. Expressions of interest are used by many European/UK funders and appear to work well inasmuch as they save academics time on applications that are simply not viable. However, as noted in the response to question 6, care would need to be taken to ensure that assessment of EoIs is equitable and reliable.

To encourage international collaborations, reintroducing the international partner awards within Discovery Projects would be desirable: although these costs can be covered within the standard Discovery budget, having this as a separate item would emphasise that these types of collaborations are a key goal.

Interdisciplinary and cross-institutional collaborations could be encouraged through 'sandpit' schemes, which have been used successfully in the United States (e.g., by the NIH and NSF) and elsewhere to fund opportunities for academics to devise novel projects on themes of shared interest that are relevant to pressing concerns (e.g., COVID-19, climate change): these schemes tend to promote innovative projects while avoiding the long timelines often associated with larger-scale research. Best practices should be explored using the available scholarly literature on alternative assessment schemes and the grey literature evaluating existing schemes in other locales.

All of these types of changes could be made internally within the ARC, perhaps by trialling them in pilot schemes, without requiring changes to the Act; this would be preferable in order to retain agility and responsiveness as the research ecosystem continues to evolve.

Q8. With respect to ERA and EI:

(a) Do you believe there is a need for a highly rigorous, retrospective excellence and impact assessment exercise, particularly in the absence of a link to funding?

(b) What other evaluation measures or approaches (e.g. data driven approaches) could be deployed to inform research standards and future academic capability that are relevant to all disciplines, without increasing the administrative burden?

(c) Should the ARC Act be amended to reference a research quality, engagement and impact assessment function, however conducted?

(d) If so, should that reference include the function of developing new methods in research assessment and keeping up with best practice and global insights?

a. ERA: There is a clear need for a highly rigorous research excellence assessment scheme even with no link to funding, as this type of measure helps to foster public confidence, contributes to national and international research reputation, and allows universities to engage in evidence-driven research planning. However, ERA preparation and submission has now become a significant burden for universities and requires reform if it is to achieve the goal of rigorously assessing excellence in research across all fields of research (FoRs); this will require a fine-grained analysis of the most appropriate processes and metrics for the diverse FoRs in the new system, particularly the peer-reviewed disciplines.

EI: There also is a clear need to recognise and value excellence in research impact, even without direct links to funding. As with research excellence, this type of measure helps to foster public but also industry and stakeholder confidence, contributes to national and international research reputation, and allows universities to engage in evidence-driven planning about research impact.

b. ERA: I take 'data-driven' approaches to refer to the use of automated and largely quantitative metrics, as ERA has in fact relied on both qualitative and quantitative data in its past articulations. Automated, quantitative data-driven approaches might permit appropriate analyses in some of the citation-based FoRs (as shown in some of the recent modelling attempts), but they will not capture the most relevant information in most of what are currently the peer-reviewed FoRs, and even in some citation-based FoRs; they would be particularly problematic for NTROs, FoRs containing large numbers of non-journal-article outputs, FoRs with weaker or inconsistent indexing practices, and Indigenous FoRs. The new FoR system contains multiple 4-digit FoRs under which outputs could sit, and hence the 'human' component of coding outputs to FoRs will remain essential to any fine-grained assessment process, particularly for outputs that are not journal articles (and even for many types of journal articles that are interdisciplinary). The ARC should explore best practices in this domain while always keeping in view the diversity of the FoRs and their outputs in the current system, and the need to represent all FoRs (rather than, for instance, not assessing NTROs).

There are a few possibilities that might allow rigorous and appropriate assessment while reducing administrative burden compared to the current ERA processes. One is to follow a system similar to the UK REF, which, although flawed in some ways, did minimise administrative burden by focusing on a discrete subset of outputs through selection of the top 4 outputs by each researcher in the given period, allowing excellence to be highlighted and then evaluated. An alternative would be to have data collection occur annually (similar to the old HERDC system) with more streamlined review processes (say, in peer-reviewed FoRs, review of a smaller percentage of the total unit of assessment, perhaps more highly structured in terms of how the assessment occurs). Either of these options could be combined with the proposal to have assessment occur not at the 4-digit level but only at the 2-digit level for smaller FoRs which have unified disciplinary structures, and perhaps in bundles underneath some of the more heterogeneous and voluminous 2-digit FoRs.

EI: Best practices in research impact assessment rely on the use of case studies or similar, and on longer timelines than those currently used in the EI scheme as proposed (the voluminous literature on this topic should be consulted and referenced in the review recommendations). An alternative approach might be to make impact assessment non-obligatory but to give recognition to submitted case studies that fulfil the desired attributes at the highest level, similar to teaching awards. Although it is highly desirable that research planning consider impact, not all research or researchers will have appropriate case studies during any one assessment period. Changing to a system which recognises the highest quality case studies as models would also permit judgments about excellence in impact to be made not institution-by-institution but across the sector, which more accurately reflects the nature of the cross-institutional (and interdisciplinary) teams that typically participate in impactful projects. It could also permit more refined evaluation of the outcomes of ARC-funded research, far beyond what is currently available in final reports and published outputs.

c-d. The ARC Act should be amended to reference a research quality, engagement and impact assessment function, as these components are key to what the ARC does, not only in the formal assessment processes but also indirectly through its award of grant funding. It could also reference the importance of developing new methods in research assessment and keeping up with best practice and global insights, although any use of new methods should require that the resulting benefits be weighed against administrative burdens and the loss of comparability with previous assessment practices.

Q9. With respect to the ARC’s capability to evaluate research excellence and impact:

(a) How can the ARC best use its expertise and capability in evaluating the outcomes and benefits of research to demonstrate the ongoing value and excellence of Australian research in different disciplines and/or in response to perceived problems?

(b) What elements would be important so that such a capability could inform potential collaborators and end-users, share best practice, and identify national gaps and opportunities?

(c) Would a data-driven methodology assist in fulfilling this purpose?

No response to this beyond what is noted in response to question 8 above.

Q10. Having regard to the Review’s Terms of Reference, the ARC Act itself, the function, structure and operation of the ARC, and the current and potential role of the ARC in fostering excellent Australian research of global significance, do you have any other comments or suggestions?

Greater parallelism of the systems for preparing grant applications across the ARC and the NHMRC (and even smaller national funders such as the RDCs) would be highly desirable; a considerable amount of labour is wasted in retooling information for submission to these different schemes.

The ARC should more actively self-evaluate (against clearer metrics for 'success'). It should also participate more actively in, and contribute to, the international research communities which explore the conduct of research assessment, impact, and funding; this might require the employment of specialised staff whose expertise in these topics comes not only from experience in the tertiary sector or similar, but also from rigorous scholarly evidence derived from emerging fields focused on university-industry collaborations, metascience, science communication and public understanding of science, innovation studies, and so on.

Submission received

05 December 2022

Publishing statement

Yes, I would like my submission to be published but my and/or the organisation's details kept anonymous. Your submission will need to meet government accessibility requirements.