There is no unifying consensus on how to operationalize measurement according to these different components.
However, the Oslo Manual, published by the OECD, attempts to create a universal standard for innovation measurement in the private sector by providing a common definition of innovation with common metrics. The latest edition even touches upon measurement in the public sector. How is measurement in the context of the public sector any different? Within the public sector, innovations must have the end goal of delivering better outputs for the public good, which inevitably shapes the measurement framework.
Public sector measurement must be concerned with identifying how innovation helps to solve problems and improve outcomes for society.
Because public sector innovation is less linear and operates at various administrative levels of government, measurement can get complicated. Within the emerging field of public sector innovation, there is not a long history of measuring innovation. A few large-scale, quantitative surveys of public sector organizations have sought to create measurement frameworks building upon the Oslo Manual, such as the Nesta Innovation Index, the Australian Public Sector Innovation Indicators framework, and the European Public Sector Innovation Scorecard. However, these existing frameworks are too complex to replicate quickly and easily, as they rely on both quantitative and qualitative surveys with large samples conducted over an extended period of time.
Especially in the public sector, there is a need for further development of measurement systems. What is needed are simple, adoptable frameworks that can measure innovation at any level of government, from team to unit to department or agency. So what specific indicators should be measured? If your team is seeking to quickly assess its innovation performance, start by looking at the key capabilities that enable your unit to be innovation-ready.
Several emergent themes of enablers are commonly assessed across different frameworks. To kick-start your innovation measurement practice, start by carving out the time and deciding which innovation processes, practices, or behaviours you would like to assess.
Invite a team of trusted and curious colleagues to join you. Together, identify which key innovation capabilities and corresponding indicators are relevant for you and your team to assess, and rate each one on a simple scale. This ranking can highlight more clearly areas of strength, weakness, and potential improvement. To guide you further, you might start by asking and discussing five basic questions as they relate to commonly found enablers of innovation.
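The ranking exercise described above can be sketched in a few lines of Python. The enabler names, the team's ratings, and the 1-to-5 scale are all hypothetical choices for illustration, not a standard taxonomy.

```python
# Illustrative team self-assessment of innovation enablers.
# Each colleague rates each enabler from 1 (weak) to 5 (strong);
# names and numbers below are invented for the example.

from statistics import mean

ratings = {
    "leadership support": [4, 3, 5],
    "time and resources": [2, 2, 3],
    "skills and learning": [3, 4, 4],
    "openness to risk":   [1, 2, 2],
}

# Average each enabler and sort ascending, so the weakest areas
# (the clearest candidates for improvement) appear first.
scores = sorted((mean(v), k) for k, v in ratings.items())
for avg, enabler in scores:
    print(f"{enabler}: {avg:.1f}")
```

Even a crude average like this gives the team a shared starting point for discussing where capability-building effort should go first.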
These questions can and should spark thinking and conversation on how to measure innovation performance in your unit, team, department, or agency.

Comparability of the BRDIS and CIS data also depends on surveying similar populations of firms and on the techniques used to derive estimates from the data. It is important to recognize that concepts such as invention, innovation, and technology diffusion lie on a continuum, and there is still debate regarding their respective places on that continuum. As NCSES develops surveys, new datasets, and new indicators of innovation activities, it will be important to establish rigorous standards for defining these terms.
Such standards will have implications for innovation surveys internationally and for the comparability of the data those surveys produce. In addition to the lack of standard definitions, other factors can limit the comparability of U.S. innovation data with data from other countries. See Table for a list of websites for innovation surveys from regions around the world.
Complete cognitive testing of the questions used on innovation surveys is ongoing in the United States and Europe.
Nevertheless, the data are useful for preliminary snapshots. The project is examining how businesses define and measure innovation.
This effort is motivated by the sizable gap between estimates in the United States and Europe of the incidence of innovation among firms, with BRDIS data being used as the baseline for U.S. estimates. Successive BRDIS instruments included some systematic variations aimed at determining whether the way innovation questions were asked substantially influenced the incidence answers provided. During a discussion with panel members, NCSES staff indicated that the conclusion to date is that there is no major difference in the statistics due to the phrasing of the questions. The weak result on innovation changed little when the questions were moved to the front of the BRDIS (see Table).
The surveys administered by these agencies contain a wide range of useful global, national, state, and local indicators. However, most rely on policy indicators or indicators of inputs to innovation activity; they contain very few measures of innovation outputs. How can innovation and its impact on a business be measured? These are among the basic questions the cognitive testing project is considering.
This gap illustrates the importance of BRDIS, and it also suggests that these agencies, to the extent possible, should be working together (see Recommendation in Chapter 8). The panel strongly supports the increased use of innovation surveys as a way to measure the output of the innovation system. Later in this chapter, new, nontraditional methodologies for measuring innovation are discussed. In this section, however, the panel underscores the inherent limitations of any survey of innovation inputs and outputs.
First, surveys are time-consuming and expensive for the companies asked to complete them, as well as for those that must compile, validate, and publish the results. The quality of the survey depends not only on the number of responses, but also on the amount of time and level of personal attention paid to them by respondents, which surely vary across companies.
Surveys of individuals, such as those relating to the career paths of scientists and engineers discussed later in this report, are becoming increasingly problematic because of declining response rates. A similar decline in survey response can be observed in all wealthy countries. A second limitation of innovation surveys is that the nature of the economy is rapidly changing, largely as a result of innovation itself. Yet because of the many lags built into the survey process—from survey design, to survey deployment, to compilation and validation of results, sometimes across multiple agencies—results may be stale or of limited use once they are published.
Timing also is important because rates of dissemination of new knowledge and products are important variables of interest. Third, in the current budgetary environment, it will be difficult if not impossible for NCSES to mount new surveys in the future without cutting back on those it already undertakes. Fourth, the traditional classification of economic transactions into goods and services can miss important innovations that do not fall easily into either of these categories.
A good example is the rise of data as a separate economic category or product. Innovations in constructing and analyzing databases are becoming increasingly important in an ever-expanding range of industries—not only Internet search, but also retailing and manufacturing and now even health care. These innovations are important, but their economic value is difficult to measure and categorize. Fifth, a more general challenge is that some innovations—the precise share is unknown—do not result in increased output, at least as conventionally measured.
For example, the activity of someone in Germany downloading a free application written in the United States will not show up in either country's output statistics. Similarly, an increase in Internet search activity will not show up as an increase in consumer expenditures (Mandel). Various, but not all, financial innovations had perverse effects that contributed to the financial crisis that led to the Great Recession (see, e.g.).
Likewise, while innovations in communication that have facilitated the growth of trade can make some countries better off, some countries, or portions of their populations (unskilled workers in particular), may be made worse off in a globalized economy (see, e.g.). However, experience suggests that the subjective nature of survey responses poses some difficulties for interpretation. In particular, the percentage of self-reported innovative firms varies across countries in unexpected ways.
By contrast, the percentage of German firms reporting themselves as innovative in the same period was far higher. Even within the European Union (EU), however, unexpected differences are seen, with Germany reporting far higher rates of innovation than the Netherlands. These challenges are described here to give a sense of how changes in the economy lead to difficulties in measuring innovation and its impact. The panel does have some thoughts about general processes NCSES can follow to stay abreast of these changes and meet these challenges, which are outlined later in this and subsequent chapters of the report.
This section examines improvements that could be made to BRDIS in five areas: (1) international comparability; (2) deeper information on innovations; (3) extensions to cover organizational and marketing innovations, unmarketed innovations, and a broader array of inputs to innovation; (4) improvements to the presentation of information; and (5) better linkages between BRDIS data and other datasets.
The report also documents declining response rates for a variety of surveys, including panel and longitudinal surveys. Recently, however—and in this report—the term has come to refer to data that are gathered from sources other than surveys, such as administrative records, websites, genomics, and geographic sensors, to name a few. Statistically, the term and its characteristics are still being codified. The statistics for German firms are found in Eurostat.

One impediment to understanding and assessing innovation in the United States is the lack of comparability between U.S. STI indicators and those developed by other countries. Comparability of BRDIS and CIS data requires surveying similar populations of firms in comparable size categories and using similar techniques to derive estimates of innovation from the raw data collected. Also needed for comparability are statistics covering the same set of industries typically used in statistics for other countries.
These data could be used to compile a simple indicator of the share of product-process innovative firms, defined as firms that have implemented a product or process innovation. Even with more comparable statistics on innovation, it still will not be clear to users that firms are representing the same or similar things when they report product or process innovations. The BRDIS questions do not give enough information to provide a full understanding of what the resulting data and statistics mean. Users would have more confidence in and understanding of the BRDIS innovation measures if they knew that knowledge input measures were correlated with actual performance.
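As a minimal sketch, the share indicator described above could be computed from survey microdata as follows. The firm records and field names are invented for illustration and do not reflect the actual BRDIS or CIS schemas.

```python
# Minimal sketch: share of "product-process innovative" firms,
# i.e. firms that implemented a product OR a process innovation.
# All records and field names below are hypothetical.

firms = [
    {"id": 1, "product_innov": True,  "process_innov": False},
    {"id": 2, "product_innov": False, "process_innov": True},
    {"id": 3, "product_innov": False, "process_innov": False},
    {"id": 4, "product_innov": True,  "process_innov": True},
]

# A firm counts as innovative if it implemented either type.
innovative = [f for f in firms if f["product_innov"] or f["process_innov"]]
share = len(innovative) / len(firms)
print(f"Share of product-process innovative firms: {share:.0%}")  # 75%
```

A real implementation would additionally apply survey weights and restrict the population to the same firm-size categories and industries used by the comparison country, for the comparability reasons discussed above.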
Without greater detail on specific innovations, moreover, the surveys paint an exceedingly broad picture of innovation. Indicators should provide not only a level but also insight into how that level was achieved.
Microdata from innovation surveys linked with other data on a firm might help achieve this goal, but this approach has not yet been exploited. Most innovative firms introduce both product and process innovations, as well as organizational or marketing innovations (discussed below), and the impacts of the innovations are likely to depend on many other business decisions (see OECD; Stone et al.). A recent study conducted in Australia (Arundel et al.) provides valuable information on how respondents perceive innovations and on what innovation indicators represent.
They also indicate which types of innovations are deemed most important for business performance; in many cases, examples of new organizational methods and new marketing concepts and strategies were cited. It would be useful to have firms of different sizes, different sectors, and different geographic locations represented in such a study. There are three important ways in which BRDIS could be extended to make it more useful for policy purposes.
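The microdata-linkage idea raised above, connecting innovation survey responses with other firm-level data, might look like the following in outline. The identifiers, field names, and figures are all hypothetical; real linkage would use confidential establishment identifiers under the relevant statistical agency's disclosure rules.

```python
# Illustrative sketch of linking innovation-survey microdata with
# another firm-level dataset (here, R&D spending) by a shared firm ID.
# All identifiers, fields, and values are hypothetical.

survey = [
    {"firm_id": "A", "product_innov": True},
    {"firm_id": "B", "product_innov": False},
]
rd_spending = {"A": 12.5, "B": 3.0}  # hypothetical R&D spend, $ millions

# Join on firm_id so innovation outcomes can be related to inputs.
linked = [
    {**rec, "rd_musd": rd_spending.get(rec["firm_id"])}
    for rec in survey
]
for rec in linked:
    print(rec["firm_id"], rec["product_innov"], rec["rd_musd"])
```

Once linked, analysts could ask whether reported innovation correlates with input measures such as R&D intensity, which is exactly the kind of validation the panel suggests would increase users' confidence in the innovation statistics.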
First, the innovation questions could be broadened to include organizational and marketing innovations. The communications sector, broadly defined to include communications-related hardware and software as well as telecommunications providers, has clearly been one of the most innovative sectors of the economy in recent years. Within this sector, smartphones are a prominent example of product innovation, while increases in mobile broadband speed exemplify process innovation.
Other recent innovations in the communications sector, however, do not fit so neatly into the product and process categories.