Definition of Evaluation by Different Authors

In this case, a specific definition may be required: for example, the Research Excellence Framework (REF) Assessment framework and guidance on submissions (REF2014 2011b) defines impact as 'an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia'. SIAMPI is based on the widely held assumption that interactions between researchers and stakeholders are an important pre-requisite to achieving impact (Donovan 2011; Hughes and Martin 2012; Spaapen et al.). The Payback Framework has been adopted internationally, largely within the health sector, by organizations such as the Canadian Institute of Health Research, the Dutch Public Health Authority, the Australian National Health and Medical Research Council, and the Welfare Bureau in Hong Kong (Bernstein et al.). At least, this is the function which it should perform for society. For example, the development of a spin-out can take place in a very short period, whereas it took around 30 years from the discovery of DNA before technology was developed to enable DNA fingerprinting. An evaluation essay or report is a type of argument that provides evidence to justify a writer's opinions about a subject. To be considered for inclusion within the REF, impact must be underpinned by research that took place between 1 January 1993 and 31 December 2013, with impact occurring during an assessment window from 1 January 2008 to 31 July 2013. Describe and use several methods for finding previous research on a particular research idea or question. For example, some of the key learnings from the evaluation of products and personnel often apply to the evaluation of programs and policies, and vice versa. Any tool for impact evaluation needs to be flexible, such that it enables access to impact data for a variety of purposes (Scoble et al.). The university imparts information, but it imparts it imaginatively. The ability to write a persuasive, well-evidenced case study may influence the assessment of impact. The process of evaluation involves figuring out how well the goals have been accomplished. Two areas of research impact, the health and biomedical sciences and the social sciences, have received particular attention in the literature by comparison with, for example, the arts. Narratives can be used to describe impact; they enable a story to be told, place the impact in context, and can make good use of qualitative information. Gathering evidence of the links between research and impact is not only a challenge where that evidence is lacking. This is recognized as being particularly problematic within the social sciences, where informing policy is a likely impact of research. One might consider that by funding excellent research, impacts (including those that are unforeseen) will follow, and traditionally the assessment of university research focused on academic quality and productivity. The Oxford English Dictionary defines impact as a 'marked effect or influence'; this is clearly a very broad definition. As a result, numerous and widely varying models and frameworks for assessing impact exist.
Figure 1, replicated from Hughes and Martin (2012), illustrates how the ease with which impact can be attributed decreases with time, whereas the impact, or effect of complementary assets, increases. This highlights the problem that the full impact of a piece of research may take a considerable amount of time to develop, and because of this time lag and the increasing complexity of the networks involved in translating the research and its interim impacts, it becomes more difficult to attribute and link an impact back to a contributing piece of research. Other approaches to impact evaluation, such as contribution analysis, process tracing, qualitative comparative analysis, and theory-based evaluation designs (e.g. Stern et al. 2012), do not necessarily employ explicit counterfactual logic for causal inference and do not introduce observation-based definitions. Capturing knowledge exchange events would greatly assist the linking of research with impact; a sketch of how such events might be logged is given below. The term originates from the Latin 'valere', meaning 'be strong, be well; be of value, or be worth'. In the Brunel model, depth refers to the degree to which the research has influenced or caused change, whereas spread refers to the extent to which the change has occurred and influenced end users. Where narratives are used in conjunction with metrics, a complete picture of impact can be developed, again from a particular perspective but with the evidence available to corroborate the claims made. However, there has been recognition that this time window may be insufficient in some instances, with architecture being granted an additional 5-year period (REF2014 2012); why only architecture has been granted this dispensation is not clear, when similar cases could be made for medicine, physics, or even English literature. Reviewing the research literature means finding, reading, and summarizing the published research relevant to your question. The first attempt globally to comprehensively capture the socio-economic impact of research across all disciplines was undertaken for the Australian Research Quality Framework (RQF), using a case study approach. The Goldsmith report concluded that general categories of evidence would be more useful, such that indicators could encompass dissemination and circulation, re-use and influence, collaboration and boundary work, and innovation and invention. Perhaps the use of social return on investment (SROI) indicates the desire of some organizations to be able to demonstrate the monetary value of investment and impact. In the UK, evaluation of academic and broader socio-economic impact takes place separately. There is a distinction between academic impact, understood as the intellectual contribution to one's field of study within academia, and external socio-economic impact beyond academia. One purpose of impact evaluation is to understand the socio-economic value of research and subsequently inform funding decisions. Even where we can evidence changes and benefits linked to our research, understanding the causal relationship may be difficult.
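To make the earlier point about capturing knowledge exchange events concrete, the following is a minimal, purely illustrative Python sketch of how such an event might be logged as a structured record linked to research outputs; the field names and identifiers are assumptions made for this sketch and are not drawn from the REF, SIAMPI, or any other named system.

from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class KnowledgeExchangeEvent:
    # Hypothetical record for one knowledge-exchange event; field names are
    # illustrative assumptions, not taken from any existing system.
    event_id: str
    event_type: str                      # e.g. "policy briefing", "industry workshop"
    occurred_on: date
    research_output_ids: List[str] = field(default_factory=list)  # DOIs or grant codes
    stakeholders: List[str] = field(default_factory=list)
    notes: str = ""                      # qualitative context for later narratives

event = KnowledgeExchangeEvent(
    event_id="KE-2013-042",
    event_type="policy briefing",
    occurred_on=date(2013, 5, 14),
    research_output_ids=["10.1000/example.doi"],
    stakeholders=["national health agency"],
    notes="Findings summarised for policy leads; follow-up meeting requested.",
)

Recording interactions in this structured way, as they occur, would allow later impacts to be traced back through the chain of events to the underpinning research.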
Collating the evidence and indicators of impact is a significant task that is being undertaken within universities and institutions globally. The University and College Union (2011) organized a petition calling on the UK funding councils to withdraw the inclusion of impact assessment from the REF proposals once plans for the new assessment of university research were released. HEFCE indicated that impact should merit a 25% weighting within the REF (REF2014 2011b); however, this has been reduced to 20% for the 2014 REF, perhaps as a result of feedback and lobbying, for example from the Russell Group and Million+ group of universities, who called for impact to count for 15% (Russell Group 2009; Jump 2011), and following guidance from the expert panels undertaking the pilot exercise, who suggested that during the 2014 REF impact assessment would be in a developmental phase and that a lower weighting would be appropriate, with the expectation that this would be increased in subsequent assessments (REF2014 2010). Reviews and guidance on developing and evidencing impact in particular disciplines include the London School of Economics (LSE) Public Policy Group's impact handbook (LSE n.d.), a review of the social and economic impacts arising from the arts produced by Reeves (2002), and a review by Kuruvilla et al. (2006) on the impact arising from health research. The process of evaluation is dynamic and ongoing. While aspects of impact can be adequately interpreted using metrics, narratives, and other evidence, the mixed-method case study approach is an excellent means of pulling all available information, data, and evidence together, allowing a comprehensive summary of the impact within context. One notable definition is provided by Scriven (1991), and later adopted by the American Evaluation Association, which describes evaluation as the systematic process of determining the merit, worth, or value of something. These metrics may be used in the UK to understand the benefits of research within academia and are often incorporated into the broader perspective of impact seen internationally, for example within Excellence in Research for Australia and using Star Metrics in the USA, in which quantitative measures are used to assess impact, for example publications, citations, and research income. Any information on the context of the data will be valuable to understanding the degree to which impact has taken place. Perhaps the most extended definition of evaluation has been supplied by C. E. Beeby (1977). Providing advice and guidance within specific disciplines is undoubtedly helpful. These traditional bibliometric techniques can be regarded as giving only a partial picture of full impact (Bornmann and Marx 2013), with no link to causality. This article aims to explore what is understood by the term 'research impact' and to provide a comprehensive assimilation of available literature and information, drawing on global experiences to understand the potential for methods and frameworks of impact assessment to be implemented for UK impact assessment. The fast-moving developments in the field of altmetrics (or alternative metrics) are providing a richer understanding of how research is being used, viewed, and moved. In endeavouring to assess or evaluate impact, a number of difficulties emerge, and these may be specific to certain types of impact.
It is now possible to use data-mining tools to extract specific data from narratives or unstructured data (Mugabushaka and Papazoglou 2012); a simple illustration is sketched below. In this article, we draw on a broad range of examples with a focus on methods of evaluation for research impact within Higher Education Institutions (HEIs). Despite many attempts to replace it, no alternative definition has been widely adopted. We take a more focused look at the impact component of the UK Research Excellence Framework taking place in 2014, some of the challenges of evaluating impact, the role that systems might play in the future in capturing the links between research and impact, and the requirements we have for such systems, including the types of information that they need to capture and link. What are the methodologies and frameworks that have been employed globally to assess research impact, and how do these compare? A collation of several indicators of impact may be enough to convince that an impact has taken place. The traditional form of evaluation of university research in the UK was based on measuring academic impact and quality through a process of peer review (Grant 2006). Attempts have been made to categorize impact evidence and data; for example, the aim of the MICE Project was to develop a set of impact indicators to enable impact to be fed into a metrics-based system. Despite the concerns raised, the broader socio-economic impacts of research will be included and count for 20% of the overall research assessment as part of the REF in 2014. Capturing data, interactions, and indicators as they emerge increases the chance of capturing all relevant information, and tools to enable researchers to capture much of this would be valuable. Throughout history, the activities of a university have been to provide both education and research, but the fundamental purpose of a university was perhaps described in the writings of mathematician and philosopher Alfred North Whitehead (1929).
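As a toy illustration of the kind of data-mining of narrative or unstructured text referred to above, the short Python sketch below flags sentences in an impact narrative that mention candidate indicator keywords; the categories and keyword lists are invented for this example and do not represent the tools described by Mugabushaka and Papazoglou (2012) or the MICE indicator set.

import re
from collections import defaultdict

# Invented example categories and keywords for the sketch only.
INDICATOR_KEYWORDS = {
    "policy": ["policy", "guideline", "regulation"],
    "economic": ["spin-out", "licensing", "revenue", "jobs"],
    "health": ["patients", "clinical", "treatment"],
}

def flag_indicators(narrative):
    # Return sentences grouped by the indicator category whose keywords they mention.
    hits = defaultdict(list)
    for sentence in re.split(r"(?<=[.!?])\s+", narrative):
        lowered = sentence.lower()
        for category, keywords in INDICATOR_KEYWORDS.items():
            if any(word in lowered for word in keywords):
                hits[category].append(sentence.strip())
    return dict(hits)

sample = ("The findings informed a national guideline on screening. "
          "A spin-out company was later formed to commercialise the assay.")
print(flag_indicators(sample))
# {'policy': ['The findings informed a national guideline on screening.'],
#  'economic': ['A spin-out company was later formed to commercialise the assay.']}

In practice far richer natural-language-processing methods would be needed, but even simple flagging of this kind can help surface candidate evidence buried in free-text case studies.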
The most appropriate type of evaluation will vary according to the stakeholder whom we wish to inform. The exploitation of research to provide impact occurs through a complex variety of processes, individuals, and organizations; therefore, attributing the contribution made by a specific individual, piece of research, funding, strategy, or organization to an impact is not straightforward. In undertaking excellent research, we anticipate that great things will come, and as such, one of the fundamental reasons for undertaking research is that we will generate and transform knowledge that will benefit society as a whole. A very different approach, known as Social Impact Assessment Methods for research and funding instruments through the study of Productive Interactions (SIAMPI), was developed from the Dutch project Evaluating Research in Context and has a central theme of capturing productive interactions between researchers and stakeholders by analysing the networks that evolve during research programmes (Spaapen and Drooge 2011; Spaapen et al.). This distinction is not so clear in impact assessments outside of the UK, where academic outputs and socio-economic impacts are often viewed as one, to give an overall assessment of value and change created through research. 'Evaluation is a process of judging the value of something by certain appraisal.' Some of the characteristics of evaluation in education are that it is a continuous process, comprehensive, child-centered, a cooperative process, and a common practice across teaching methods, covering multiple aspects of learning. Over the past year, a number of new posts have been created within universities for tasks such as writing impact case studies, and a number of companies now offer this as a contract service. Every piece of research results in a unique tapestry of impact, and despite the MICE taxonomy having more than 100 indicators, it was found that these did not suffice. Evaluation is a procedure that reviews a program critically. In designing systems and tools for collating data related to impact, it is important to consider who will populate the database and to ensure that the time and capability required to capture the information are taken into account; a sketch of how such collation might look is given below. Frameworks for assessing impact have been designed and are employed at an organizational level, addressing the specific requirements of the organization and its stakeholders. It is perhaps worth noting that the expert panels who assessed the pilot exercise for the REF commented that the evidence provided by research institutes to demonstrate impact was a unique collection. Evaluation means different things to different people, and it is primarily a function of the application, as will be seen in the following. In the UK, more sophisticated assessments of impact incorporating wider socio-economic benefits were first investigated within the fields of Biomedical and Health Sciences (Grant 2006), an area of research that wanted to be able to justify the significant investment it received.
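As a final, hypothetical sketch of the collation task described above, the Python snippet below groups heterogeneous evidence records (metrics, narratives, interaction logs) by the impact case study they support; the record shapes and identifiers are assumptions made for illustration only, not a description of any existing impact-tracking system.

from collections import defaultdict

# Invented example records; in a real system these would be entered by
# researchers or support staff as evidence accumulates.
evidence_records = [
    {"case_study": "CS-01", "kind": "metric", "detail": "cited in national clinical guidance (2012)"},
    {"case_study": "CS-01", "kind": "narrative", "detail": "clinician testimonial describing changed practice"},
    {"case_study": "CS-02", "kind": "interaction", "detail": "workshop held with a regional arts body"},
]

def collate_by_case_study(records):
    # Group evidence items by case study so reviewers see the full mixed-method picture.
    bundles = defaultdict(list)
    for record in records:
        bundles[record["case_study"]].append((record["kind"], record["detail"]))
    return dict(bundles)

for case_id, items in collate_by_case_study(evidence_records).items():
    print(case_id, items)

Whoever populates such a store, whether researchers themselves or dedicated support staff, determines how complete the resulting picture of impact can be.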