Scientification, immune responses, and reflection: The changing relationship between management studies and consulting

  1. The term "evidence-based" is used or known:

  2. Systematic reviews are produced and made accessible:

    • plenty of traditional reviews and meta-analyses

    • very few systematic reviews in I–O psychology

  3. Articles reporting primary research and traditional literature reviews are accessible to practitioners:

    • difficulty for many I–O psychology practitioners unless still attached to a university

    • abstracts available, but purchasing articles can be costly

    • not all I–O psychology practitioners, depending somewhat on where they trained, are highly skilled in reading and digesting (sometimes rather indigestible) journal articles

  4. "Cutting-edge" practices, panaceas, and fashionable "new" ideas are treated with healthy skepticism:

    • our training as psychologists inclines us to be quite skeptical, or at least to ask about evidence and empirical support

    • yet we’re drawn to what might be called management fads and fashions (inclined to pick up and run with the Next Big Thing even where evidence does not yet exist or is questionable)

      • clients often demand the latest thing

      • If we don’t bring it to them, someone else will

      • try to rework the fad into something closer to our own practice and to established and evidence-based techniques



  5. There is a demand for evidence-based practice from clients and customers:

    • Major clients are those working at mid/senior levels in HRM, a field that has not yet embraced evidence-based practice

    • managers do not actively seek to purchase ineffective I–O psychology products or services, but they are under pressure from short-term goals → they depend on a general impression that particular products or techniques "work" rather than on whether they will work in their specific context

    • benchmarking or mimicry = adopting the same I–O psychology practices already used by their more successful competitors

    • clients have often already decided what they want and are asking the I–O psychologist, as a technical specialist, to deliver it → our clients are not demanding an evidence-based approach but want to adopt a practice they believe to be effective

  6. Practice decisions are integrative and draw on the four sources of information and evidence described above:

    • most difficult characteristic of evidence-based practice to assess without access to numerous observations of what practicing I–O psychologists actually do

    • we play an advisory role, providing information and interpretations to the decision makers

    • taking each source of information in turn, I–O psychologists do draw to some extent on evaluated external evidence when making decisions (even though few systematic reviews are available and access to primary research can be difficult)

    • the perspectives of those who may be affected by the decision are likely to be taken into account at least to some extent, both because of the APA's Ethics Code and because of the broader awareness we should have as psychologists of our responsibilities to organizational and individual clients → we're likely to look for and use evidence from the local context + attempt some initial assessment of the problem or situation + seek out organizational data that might help with problem diagnosis

    • use of practitioner experience and judgment seems highly likely, particularly if the problem or technique is one we have encountered frequently before.

  7. Initial training and continuing professional development (CPD) in evidence-based approaches:

    • Training in I–O psychology tends to be of the fairly traditional academic variety (a passive way to learn and retain information; in the US and Britain no internships are required)



        • I–O psychology is not strongly evidence-based in the sense that the term is used in other professions, but as a profession we are extremely well positioned to adopt many of these characteristics, should we wish to do so

        • I–O psychologists in many instances are not the key decision makers but, rather, sources of information and advice

        • there are many barriers to the adoption of evidence-based practice, some within and others outside our control

        • two important means for bridging the research–practice gap: practice-oriented evidence and systematic reviews

Key Strategies for Promoting Evidence-Based Practice

  • Practitioners and scholars in I–O psychology are largely distinct communities of interest, knowledge, and social ties → design ways of communicating and sharing ideas that serve the interests of both

    • devices for translating information and knowledge back and forth, promoting better-quality communication and learning

    • in the SIOP practitioner survey described above, a frequent practitioner request to SIOP was for clarification of standards for I–O practice and better ways of differentiating I–O psychologists from other practitioners in the marketplace (such clarification and professional differentiation can come from creating the evidence-oriented products and associated processes proposed here)

(Although evidence-based practice involves the better use and integration of evidence and information from all four sources described above, we focus here on improving the use of critically evaluated research evidence)

  1. Practice-Oriented Evidence

  • Most published research consists of theory-oriented investigations authored by academy-based I–O psychologists answering questions of interest to other academics

  • There has been a decline in research authored by practitioners across I–O journals → academics are the ones asking the research questions and interpreting the answers

    • shift in journal focus toward more academic topics and greater rigor

    • greater corporate concern for protecting intellectual property

    • ramped-up global competition and its accompanying time and resource crunch → limited practitioner opportunity for research, let alone publication

        • I–O psychology’s academics and practitioners are not mingling with each other in our journals


  • Gap at least partly attributable to lower participation by practitioners in research → omission from current research of the kinds of complex problems in complex settings that practitioners face in their work → engaged scholarship and scholar–practitioner collaboration: academics and practitioners work together to formulate research questions, investigate them, and draw conclusions

  • gap between research and practice entails problems in knowledge transfer

    • can be because of communication issues on both sides

    • barriers to transfer may also reside in the nature of the knowledge itself

    • meta-analysis and literature reviews in I–O psychology have led to the formulation of general knowledge principles based on scientific evidence

(The Handbook of Principles of Organizational Behavior: Indispensable Knowledge for Evidence-Based Management)

  • Handbook

    • over 30 chapters summarizing several hundred evidence-based principles

    • derived from I–O psychology research and all intended to guide the practice of current and future managers and other practitioners

    • contains many examples and is written in plain English

    • "researcher-oriented evidence" = evidence from rigorous tests of theory, replicated and found to be relatively generalizable over time and context

      • +: achieve our ideal as scientists (to understand the world well and disseminate this knowledge)

      • -: not always obvious to practitioners how exactly to apply the principles identified in such research

Example: the finding that General Mental Ability (GMA) is positively related to individual performance → if an organization seeks to improve the quality of its workforce and the performance of individual members, it should select on intelligence

  • this principle is widely promoted by I–O psychology practitioners yet soundly rejected by even experienced HR managers (practitioners think of the educated idiot who is book-smart and tests fantastically well but isn't good at work / managers fear being labeled elitist and perhaps wonder whether they would have gotten their own jobs if their company used IQ tests)

  • concern over adverse impact



        • research–practice gap is best bridged by producing practice-oriented scientific knowledge via research approaches engaging academics and practitioners collaboratively: combining the knowledge of practitioners and the knowledge of academics at all stages of the research process

MEDICINE: "disease-oriented evidence" = evidence about the causes of disease, its pathology, and ways of altering the condition (drugs, surgery)

→ I-O: phenomena-oriented evidence, such as the origins of job stress or job satisfaction



MEDICINE: “patient-oriented evidence”: gathered from studies of real patients about issues such as mortality, morbidity, and quality of life

→ I-O: studies contrasting two interventions to reduce job stress that assess the types of individuals, work settings, and job strains each best ameliorates



        • there's a growing trend in the practice of medicine to value patient-oriented evidence more highly than disease-oriented evidence (DOE), but because practice-oriented evidence does not yet exist to inform every clinical need, practitioners must use other ways of making decisions too (e.g., relying on their knowledge of basic physiological processes)

Example: reports by Robert Pritchard and his team, developing and investigating the use of the Productivity Measurement and Enhancement System (ProMES) for job analysis and strategic planning

  • differences identified between studies in how thoroughly the ProMES system was applied → several implementation-related factors (the extent to which users adhered to the ProMES process, the quality of the feedback provided) affected the overall productivity gains associated with ProMES

  • Pritchard addressed the circumstances under which there are differences in implementation or compliance with standard practices and the sensitivity of outcomes to these variations

Example: people know that GMA is predictive of individual performance, but organizations are reluctant to accept or act on this knowledge

Practice-oriented evidence could be developed from investigations into the conditions that make GMA more readily useful as a selection criterion



Practice-oriented research could look into whether the performance criteria in use affect the value and usefulness practitioners attach to indicators of GMA

Academically oriented evidence indicates that GMA is likely to predict performance in either case. Practitioners may only find GMA useful where mental ability is an organizationally valued contributor to performance

  2. Systematic Reviews

  • reviews provide one of the four sources of information required when making evidence-based decisions

= literature reviews that adhere closely to a set of scientific methods that explicitly aim to limit systematic error (bias), mainly by attempting to identify, appraise and synthesize all relevant studies (of whatever design) in order to answer a particular question (or set of questions). In carrying out this task they set out their methods in advance, and in detail, as one would for any piece of social research. In this respect ... they are quite unlike most ‘‘traditional’’ narrative reviews.

  • a way of analyzing existing research using explicit and replicable methods → able to draw conclusions about what is and is not known in relation to the review question

  • meta-analysis = type of systematic review but one that uses only quantitative data and statistical synthesis and focuses on a question repeatedly addressed in the same way by researchers rather than a practice question or problem

    • similar: study of studies

    • both are conducted because single empirical studies, however useful and sometimes informative, should not be overemphasized: their biases and limitations cannot be fully accounted for

    • Looking at all relevant studies (systematically gathered) → more reliable evidence

  • Evidence-based practice: neither traditional literature reviews nor meta-analyses are very useful

    • Traditional literature reviews: biased

Example: reviewers do not make clear how they selected the studies they included, do not critically appraise them in an explicit or systematic way, and do not usually pull together or synthesize findings across studies

    • Traditional literature reviews: no focus on a specific research or practice question or problem

        • this focus differentiates a systematic review from the quantitative meta-analysis used traditionally in I–O psychology
          (evidence-based decision making: focused and tailored reviews of evidence where both a practice question or problem and the conditions to which the evidence might be applied are taken into account)

Example (Table 1): case of high absenteeism

  • systematic review: attempt to find evidence about the relative effectiveness of different forms of absence management interventions given the current and desired absence rates and taking into account as much as possible aspects of the context such as the type of employees involved, the sector, and the existing work arrangements and absence policies




  • systematic reviews can take forms akin to phenomena-oriented evidence or practice-oriented evidence, depending on the review questions and their intended use as well as the kinds of research available

  • systematic review can be useful for purely academic research purposes too


Example: interested in collecting all available evidence about absence-management interventions to provide a more general overview of what is known, and not known, about the efficacy of such interventions

  • a systematic review may differ from a meta-analysis: it also considers qualitative information and descriptions, not just effect sizes alone

  • all systematic reviews follow a process of clearly specified stages:

    • 1: Identify and clearly define the question the review will address

      • Specific question → shows clearly what types of data would be relevant

      • Aspects of context (population, sector, organizational type) specified

      • Aspects of interventions (what would be a relevant intervention) specified

      • Aspects of mechanisms linking intervention to outcome (processes, mediators, moderators) specified

      • Aspects of the outcomes (which data are the outcomes of interest) specified

    • 2: Determine the types of studies and data that will answer the question

      • Criteria to help decide which studies will be selected / excluded are identified

Example: a review that addresses a causal question might exclude studies with cross-sectional designs.


    • 3: Search the literature to locate relevant studies

      • Specifying databases to be searched, keywords, …

    • 4: Sift through all the retrieved studies to identify those that meet the inclusion criteria (and need to be examined further) and those that do not and should be excluded

      • Two review team members examine each study + check it against the inclusion and exclusion criteria

      • No agreement → third reviewer

      • Often only a fraction of the pool of studies can be included

    • 5: Critically appraise the studies by assessing study quality, determined in relation to the review question

      • Research quality can only be judged in relation to the question

      • Assessing quality → review conclusions can clearly state how many studies were of high/low quality

    • 6: Synthesize the findings from the studies

      • Key part of systematic review: pulling together findings from across the studies to represent what is known and what is not

      • Review findings are often described in terms of the overall number of studies found/ the quality profile of this group of studies/ the number of studies that obtained particular results

    • 7: Disseminate the findings

      • Full report can be quite long → also produce a shorter journal-article-length version

      • dissemination is planned at the outset of a systematic review, given that the aim is often to inform practice

  • systematic review is rare in I–O psychology, but psychologists are familiar with its general approach: the underlying logic is similar to psychological research methods / meta-analysis

    • meta-analysis: exclusive use of quantitative data + statistical synthesis
      (systematic review: different types of data and forms of synthesis)

    • meta-analysis can only address questions that have been asked many times before in almost the same way by other researchers

Example of a structured abstract from a systematic review relevant for I–O psychology: Table 3





    • clearly states the review objectives / search strategy / criteria for inclusion / method of analysis / number of studies found / findings of each

  • differences between I-O psychology reviews & systematic reviews

    • the exact methods used by the reviewers to find, select, and exclude studies are open to examination and scrutiny → readers are able to reach their own judgements about the efficacy and appropriateness of the method + how much confidence they place in the research

    • we have little access to systematic reviews → we develop an implicit though somewhat inaccurate sense of the quantity and quality of research about a particular technique or intervention

Example: we just assume the research "is out there somewhere" or have vague recollections about the research and its results

      • a systematic review often reveals far fewer studies to be relevant to a question than assumed

    • (unlike traditional review) processes through which reviewers reach their conclusions are explicit and transparent

Example: statistical/narrative synthesis, the basis on which data were gathered

    • (unlike traditional review) systematic reviews allow us to identify the quantity & quality of studies and many aspects of study heterogeneity (method, population, design, findings)

      • essential information for drawing conclusions about what is and is not known in relation to the review question, the basis of these claims, and the confidence with which they can be made

  • In addition to full-blown systematic reviews of the sort described in Table 2, there are other quicker and more focused ways of doing systematic reviews that share many of their qualities

    • Not as thorough/informative as full systematic reviews

    • Provide important information and practice insights
      = rapid evidence assessments, best-evidence topics = made quick by restricting parameters / using fewer search terms, …




        • Systematic reviews will become an essential part of I–O psychology

          • allow practitioners to provide more evidence-based advice and share its basis

          • allow researchers in a more structured way to identify important gaps in knowledge

          • systematic reviews can highlight where conducting more research on the same question using a particular method is unlikely to yield any new information
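
The dual-screening rule in stage 4 above (two reviewers judge each study against the inclusion criteria; a third settles disagreements) can be sketched as a small decision procedure. This is a hypothetical illustration, not part of the source text: the reviewer functions, toy study records, and inclusion criteria are all invented.

```python
# Sketch of the dual-reviewer screening stage of a systematic review:
# two reviewers independently judge each study against the inclusion
# criteria, and a third reviewer settles any disagreement.

def screen_studies(studies, reviewer_a, reviewer_b, reviewer_c):
    """Split studies into (included, excluded).

    Each reviewer is a function: study -> bool (True = meets the criteria).
    """
    included, excluded = [], []
    for study in studies:
        a, b = reviewer_a(study), reviewer_b(study)
        verdict = a if a == b else reviewer_c(study)  # disagreement -> 3rd reviewer
        (included if verdict else excluded).append(study)
    return included, excluded


# Toy records for an imagined absence-management review: include only
# longitudinal intervention studies (hypothetical criteria).
studies = [
    {"id": 1, "design": "longitudinal", "intervention": True},
    {"id": 2, "design": "cross-sectional", "intervention": True},
    {"id": 3, "design": "longitudinal", "intervention": True},
]

reviewer_a = lambda s: s["design"] == "longitudinal" and s["intervention"]
reviewer_b = lambda s: reviewer_a(s) and s["id"] != 3  # stricter: also rejects study 3
reviewer_c = reviewer_a                                # tie-breaker

included, excluded = screen_studies(studies, reviewer_a, reviewer_b, reviewer_c)
print([s["id"] for s in included])  # -> [1, 3]
print([s["id"] for s in excluded])  # -> [2]
```

The same structure extends naturally to stage 5: replace the boolean verdict with a quality rating recorded alongside each included study.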

Barriers to evidence-based practice in IOP

  • apparent lack of demand from our clients for evidence-based IOP

    • faster adoption of cutting-edge practice → decisions aren't made in an evidence-based way

    • individual managers are rewarded for short-term goals instead of for doing what works in the long term

        • can be overcome by working with organizations to demonstrate how approaching problems in an evidence-based way leads to effective, sustainable solutions

        • emphasize that evidence-based practice constitutes a family of approaches to making decisions (intended to improve the process and outcome of decision making, not to solve every problem)

  • predominance of masters-level practitioners who learned to practice IOP in unsupervised ways

    • limited understanding of research

    • limited capacity to access new evidence

    • lack of skills to conduct their own systematic reviews or primary research

        • Continuing Professional Development (CPD) enables evidence-based practice

        • help IOP practitioners to access research evidence

  • reluctance to acknowledge the limitations of our knowledge where evidence is mixed or where there are grounds for uncertainty

    • difficult to find a balance between promoting a profession and acknowledging its limitations → clients may find this unnerving / a sign of incompetence

  • skepticism is a key ingredient of evidence-based practice but has limited popularity in IOP

        • remind ourselves that skepticism is fundamental and clearly differentiates us from other organizational practitioners/consultants

  • politics of evidence in organizations

    • power & politics are fundamental to decision making + surround the identification/use of evidence in organizations

Example: senior managers feel they have the right to decide based on their experience

    • evidence-based practice may be difficult in organizations with highly political cultures

        • evidence-based practice offers the possibility of making clearer distinctions between politics, values, beliefs, interests, and other forms of information (research evidence)

(The more decision makers are held accountable for their decisions, the more likely they are to welcome such distinctions)

Prospects for evidence-based IOP



  • IOP cannot claim to be fully evidence based yet

  • I–O psychologists are uniquely qualified to undertake the key activities required by evidence-based practice (evaluating external research evidence, collecting/using internal evidence for organizational diagnosis)

  • Evidence-based practice: IOP practitioners (trainers, consultants, in-house management, …) are positioned to enable an array of approaches

    • (using evidence-based processes for making decisions, giving feedback, as well as incorporating evidence-based content, that is, research findings, into their decisions and practices)

    • Helping organizations and those who manage them become more evidence based

      • We can provide systematic review services (or quicker/briefer versions)

        • Useful when an organization is thinking of investing lots of money in an intervention/program

      • We have the skills to help organizations make sense of existing data or collect new data that may diagnose problems or show why/how something is working

    • Systematic reviews need to be part of the professional training of IOP: they help gather and use information

  • IOP practitioners have the background to work as facilitators/coaches for managers/management teams seeking to engage in evidence-based management, as well as to help organizations collect the internal/external information they need

    • help collect information about the perspectives of those who may be affected by a decision and help make explicit managers’ own expertise and experience and how it is shaping a decision

        • support organizations to make decisions in a conscientious, explicit, and judicious way—in short, to help organizations to practice EBMgt

