
THE ADDED VALUE OF COLLECTING UNIQUE IDENTIFIERS


The use of persistent unique identifiers, which can distinguish different scientists or companies with the same name and create a lifelong trace of their work, will make it possible to:

• Monitor the number of researchers supported through the programme and automatically access the publicly available information on their affiliation, mobility, career evolution, scientific production, IPR applications, etc. by linking the identifier to external databases.

• Monitor the evolution of companies supported through the programme and automatically access their scientific or innovation outputs, turnover, investment, etc. by linking the identifier to external databases.

• Build control groups to allow for counterfactual evaluation designs (propensity score matching, regression discontinuity design or difference-in-differences methods), e.g. tracing the differences between researchers and companies benefitting from the programme and those not benefitting from it; a minimal illustration follows this list.
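To make the data-linking idea concrete, here is a minimal sketch in Python of joining a table of programme beneficiaries to an external bibliometric extract on a persistent, ORCID-style identifier and counting post-award output. All table names, columns and values are invented for illustration; real monitoring would query the actual external databases.

```python
import pandas as pd

# Hypothetical programme records, keyed on a persistent identifier.
# The identifier, not the name, is what disambiguates researchers.
beneficiaries = pd.DataFrame({
    "orcid": ["0000-0001-0000-0001", "0000-0002-0000-0002"],
    "grant_year": [2021, 2022],
})

# Hypothetical extract from an external publication database.
publications = pd.DataFrame({
    "orcid": ["0000-0001-0000-0001", "0000-0001-0000-0001",
              "0000-0003-0000-0003"],
    "pub_year": [2022, 2023, 2023],
})

# Linking on the identifier yields a longitudinal trace of each
# supported researcher's output; the same join would work for
# mobility, career or IPR records.
linked = beneficiaries.merge(publications, on="orcid", how="left")
post_award = linked[linked["pub_year"] >= linked["grant_year"]]
print(post_award.groupby("orcid")["pub_year"].count())
```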

The indicator framework is overall expected to provide a solid basis for accountability, so that evaluations can focus on diving deeper into learning and on identifying the necessary policy adjustments for the future.

The indicators collected will be one of many elements feeding into the interim and ex-post evaluations of Horizon Europe, together with other information sources and qualitative and quantitative indicators.

Because of the time lags and uncertainty inherent in R&I investments, the interim evaluation will typically provide first evidence on the relevance and coherence of the programme and the efficiency of the processes in place, helping to identify potential pitfalls or drivers early on. It will also include a longer-term assessment of past programmes to shed light on longer-term impacts.

5. CONCLUSIONS ON THE EXPECTED RESULTS OF THE REVAMPED INDICATOR SYSTEM

The Key Impact Pathways underpinning Horizon Europe’s monitoring system represent a novel, ambitious yet pragmatic approach to devising indicator frameworks for R&I programmes. It results from the identified need to confront the complexity of R&I investments in monitoring and evaluation practices in order to deliver relevant and timely messages to policy makers. Based on a set of core principles (PATHS: proximity, attribution, traceability, holism and stability), this framework will ensure information is collected on a set of key dimensions on which impact is desired. Overall, the Key Impact Pathways are expected to support a better capture and communication of the progress of Horizon Europe towards its objectives, including beyond its lifetime. The simplicity and storytelling nature of the Key Impact Pathways should bring a more immediate and continuous visibility of the European added value of R&I investments for science, the economy and society, and allow them to reach a wider audience beyond the R&I community.

To make best use of the potential of the Key Impact Pathways, data collection needs to match their ambition and pragmatism. The underlying richness and soundness of the analysis this will enable may well set a new trend for monitoring the impacts of R&I investments in the future.

Policy makers will be able to better identify and recognise the multiple impacts of R&I investments, going beyond the mere identification of participation patterns or raw scientific and innovation production.

A stronger focus on microdata collection and data linking will allow for an easier identification of concrete storylines at the level of individual researchers, projects or project portfolios, including on the drivers and barriers to impacts. This will be a key element in improving the quality of programme evaluations, and their usefulness for policy learning and policy design – thereby paving the pathway to impact.


AUTHORS

NELLY BRUNO

European Commission - DG Research and Innovation, 1049 Brussels, Belgium

E: nelly.bruno@ec.europa.eu

MARTINA KADUNC

European Commission - DG Research and Innovation, 1049 Brussels, Belgium

E: martina.kadunc@ec.europa.eu

KEYWORDS:

Research policy, innovation policy, impact, evaluation, monitoring, Framework Programme, indicators, Europe, Horizon Europe, pathways


In several countries, technology and innovation advisory services are provided by, among others, publicly funded innovation intermediaries, whose aim is to support innovation in SMEs by providing them with a variety of services. Precisely because the advisory services offered by intermediaries could improve SMEs’ choice and use of knowledge-intensive services, we expect this combination of interventions to be more effective than the individual instruments.

This study presents an exploratory empirical analysis focused on two interconnected regional innovation policy interventions implemented in Tuscany (Italy). One was the provision of innovation vouchers that SMEs could use to buy knowledge-intensive services from accredited providers, while the other was the creation of intermediaries that could help SMEs to access such services. Since firms could benefit from either one of the two interventions or from both, we use a dataset derived from administrative sources to assess whether the policy mix that includes both interventions was more effective than the voucher alone or the technology and innovation advisory service alone. We adopt a propensity score matching approach applied to the case of multiple treatments, as proposed by Lechner (2002a, 2002b). In particular, we compare three different treatments: (i) the use of innovation vouchers for the purchase of knowledge-intensive services; (ii) the reliance on an intermediary’s technology and innovation advisory service; (iii) the combination of the two treatments, i.e. the use of innovation vouchers for the purchase of knowledge-intensive services with guidance from the intermediary.
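As a rough illustration of propensity score matching with multiple treatments in the spirit of Lechner’s pairwise comparisons, the Python sketch below fits a multinomial model of treatment assignment and matches firms receiving the policy mix to voucher-only firms with similar scores. The covariates, outcome and treatment labels are simulated placeholders, not the study’s administrative data, and the sketch is not the authors’ implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Simulated firm covariates (e.g. size, age, prior R&D) and treatment
# labels: 0 = voucher only, 1 = advisory service only, 2 = both.
X = rng.normal(size=(300, 3))
t = rng.integers(0, 3, size=300)
y = X[:, 0] + 0.5 * (t == 2) + rng.normal(size=300)  # toy outcome

# Step 1: multinomial model of treatment assignment given covariates,
# giving each firm an estimated probability of each treatment.
scores = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)

# Step 2: pairwise comparison, e.g. policy mix (2) vs voucher only (0):
# match each mix-treated firm to the voucher-only firm with the closest
# estimated probability of receiving the mix.
treated = np.where(t == 2)[0]
controls = np.where(t == 0)[0]
gaps = np.abs(scores[treated, 2][:, None] - scores[controls, 2][None, :])
matches = controls[gaps.argmin(axis=1)]

# Step 3: average outcome contrast over the matched pairs.
print(f"matched contrast, mix vs voucher: {(y[treated] - y[matches]).mean():.3f}")
```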

While policy mixes have been advocated as a response to complex problems (Flanagan et al., 2011; Cunningham et al., 2016), very little empirical evidence is available about the comparative effectiveness of policy mixes with respect to that of the single policies in the mix (Martin, 2016), and no other studies consider the particular combination of innovation vouchers and advisory services. This exploratory study captures an aspect that lies at the core of the policy mix literature, namely that the mix cannot be considered as the simple sum of the single instruments included in it (Magro and Wilson, 2013), but can instead facilitate the emergence of synergies and complementarities among them.

INTRODUCTION

The provision of public funds to private firms for the purchase of services, particularly knowledge-intensive ones, has so far received little attention in the evaluation literature (Bakhshi et al., 2015; Bruhn et al., 2018 are notable exceptions). These interventions often target small and medium-sized enterprises (SMEs), providing them with a small amount of public funds that reduces their cost of purchasing services (Storey, 2003). Public funding can take the form of a direct subsidy or a voucher, which firms must use to purchase services from accredited service providers, or sometimes from any provider freely chosen by the beneficiary firm (OECD, 2000; Storey, 2003; IEG, 2013).

These interventions aim to help SMEs to access a variety of knowledge and competencies required for innovation which are not available within the firm (Vossen, 1998; Storey, 2003). The implicit assumption is that SMEs primarily suffer from constraints on their financial resources, rather than on their capabilities. After receiving the subsidy, SMEs should be able to identify the services they need, as well as the suppliers that can best provide them. However, it is well known that SMEs may not only lack the financial resources to invest in innovation, but also the capabilities to identify the competences and services they need, or the right suppliers that can provide them (Fontana et al., 2006; Ortega-Argilés et al., 2009). Subsidies for the purchase of knowledge-intensive services address the former problem, but not the latter.

As discussed by Shapira and Youtie (2016), to help SMEs increase their awareness of their needs and of how to address them, they could be provided with complementary services, such as technology and innovation advisory services. We argue that such services could be usefully combined with innovation vouchers to increase the performance of SMEs.

Technology and innovation advisory services are usually delivered by one or more experts, who carry out a thorough assessment of the firm’s current knowledge and technology and an exploration of potential developments. This allows the people involved to undertake a highly customized process of mutual learning, which increases the firm’s knowledge of its own innovation needs. Following the assessment, experts can direct the firm to other external service providers that will be able to deliver the specialized knowledge-intensive services it needs.
