Caring Research and Practice

Photo by Aleksey Kuprikov from Pexels

Caring, caring for, and caring about involve thought, our ideas about caring, and the ‘doing’ of caring: in short, both the theory and practice of caring. As such, there is always the possibility of a gap between theory and practice, and between those who produce the theory and those who do the caring. In this post we explore a common problem in our social work academy, a problem encapsulated by Pierre Bourdieu: the scholastic fallacy. The scholastic fallacy occurs when social work researchers and academics place less emphasis upon the realities of practice and more upon the academic ‘sense of the game’ (e.g., peer review, promotion, socialization into the academy through doctoral education, professional meetings, method and external funding one-upmanship, and the designation of ‘top’ schools of social work reserved for those situated within “Research 1” institutions, those classified at the highest level of research activity). When the protection of academic research privilege matters more than relevance to practice, the scholastic fallacy exists.

An element of the scholastic fallacy is Bourdieu’s illusio, that is, a “particular form of belief…which is demanded by scholastic fields and which presuppose suspension of the objectives of ordinary existence, in favour of new stakes, posited and produced by the game itself” (Bourdieu, 2000, p. 101). In social work the stakes are determined within a highly contested field (Abbott, 2005), where capital circulates mostly in the form of methods (i.e., research technique) and where competition within the field is determined by degrees of mastery of those stakes in relation to funding streams; in social work, those who master the stakes gain access to funding, but they do not necessarily gain access to symbolic capital within the academic field. And because particular research interests, agendas, and projects are mostly determined by those outside the field of social work (i.e., state actors and agencies, funders, public policy), there is instability and uncertainty within the field, which is addressed repeatedly by defensive appeals to scientificity (e.g., is social work a science? what kind of science is social work?) (Brekke, 2014; Guerrero, 2014; Gitterman, 2014; Morris, 2008; Sheldon, 2001; Thyer & Myers, 2011; Trevithick, 2008; Witkin, 1988, 1995; Zimmerman, 1989). To fill the void of symbolic capital produced by the social work illusio (i.e., the dominance of methods), schools of social work, especially the top-ranked ones, actively recruit faculty with larger volumes of symbolic capital. At these moments, in particular, the investment in the game is both a condition for and a product of the game:

So as to avoid excluding themselves from the game and the profits that can be derived from it, whether we are talking about the simple pleasure of playing, or of all the material and symbolic advantages associated with the possession of symbolic capital, all those who have the privilege of investing in the game (instead of being reduced to the indifference and apathy of apoliticism) accept the tacit contract,  implied in the fact of participating in the game, of recognizing thereby that it is indeed worth playing. This contract unites them to all the other participants by a sort of initial collusion, one far more powerful than all open or secret agreements (Bourdieu, 1991, p. 180).

Playing the game matters a great deal when it comes to caring. In caring practices, the singularity of suffering is felt, described, and understood. The cared for and the care provider are locked into a caring practice specific to that unique situation: an N of 1. In positivist science and research, however, singularity is altogether erased. In fact, it could be argued that the ‘sense of the game’ in much of positivist social work research requires the erasure of singularity. George Steinmetz (2004), reflecting on the fate of the case study and the study of singularity in positivist sociology and political science, writes, “In the more positivistically dominated social science fields such as sociology and political science, however, the case study has been demoted to subaltern status, with less scientific capital than comparative or large-sample quantitative studies.” In practice, singularity is taken up by the case study (Easton, 2010; Flyvbjerg, 2001, 2006, 2012), even when practitioners (and researchers) are unaware of its implicit, everyday use and its significance in the production and conveyance of expert knowledge and practice (Benner, 2000a, 2000b, 2004; Dreyfus, 2008). In the practice of law, medicine, and management, the case study remains a major means of producing, teaching, and disseminating knowledge. In social work, oddly, the case study appears only in practitioner journals (e.g., Clinical Social Work Journal), and the case method is almost never taught as a valid research method in doctoral programs.

There is no doubt that the distance between research and practice can be partially accounted for by the commitments that researchers make to the prevailing research agendas and orthodoxies (doxa) of neoliberal states and bureaucracies (Garrett, 2010; Hasenfeld & Garrow, 2012; Holmwood, 2013; Houston, 2014; Woolford & Nelund, 2013). And as the neoliberal state, especially in the United States, has increasingly reached into the daily lives of its citizens while simultaneously withdrawing (Bourdieu, 1999), it has fundamentally shifted the burden and ethics of caring onto the individual (Garrett, 2008, 2014; Held, 2006; Holmwood, 2014; King, 2014; Klodawsky et al., 2006; McDowell, 2004; Rogowski, 2010, 2012; Sayer, 2011; Webb, 2001), onto private sector actors, and toward models of practice rooted in rational action, cognitivism, and behaviorism. These models of practice, moreover, call for narrow methodological commitments, especially to methodological individualism (Brodie, 2007; Brown, 2006; Dixon, 2012; Fine, 2006): if we could just find ways of identifying the factors and behaviors that cause single fathers or mothers to behave in the ways they do, then surely the state could act to change those behaviors. These moves, in turn, have allowed for a profound shift from the ethics of caring to regimes of managerialism (Froggett, 2002; Meagher & Parton, 2004; Webb, 2001; White, 2009). [1] Meagher and Parton (2004) write of this move in social work: with “…refined systems of audit and new operational and administrative procedures, social work practice has become more legalised, and aspirations to ‘evidence-based practice’ have become pervasive. These developments, which emphasise practitioner accountability to stakeholders other than clients, seem to dominate the nature of what it is to be a social work practitioner, particularly in the public sector” (p. 10).

Another ‘sense of the game’ has been produced through neoliberal performance measures. Academe embraces and enforces distinctions between high impact and low impact, meritorious and non-meritorious. The logic of the game runs like this: if you publish in the so-called ‘high impact’ journals, symbolic capital accumulates and careers advance along with institutional rankings. Seglen (1997) writes of the difficulty of measuring impact in the natural sciences: “Journal impact factors depend on the research field; citation habits and citation dynamics can be so different in different research fields as to make evaluative comparisons on the basis of citation rate or journal impact difficult or impossible. For example, biochemistry and molecular biology articles were cited about five times as often as pharmacy articles” (p. 501). (Note that the applied field, pharmacy, is the one assigned less impact.) Holmwood writes of this move in higher education more generally:

 In some contexts, they are an instrument of performance management of individuals and departments, as university governance moves from a collegial to a managerial system. At the same time, however, metric data are used to assess the performance of managers, themselves, and the ‘functions’ under their remit, and also their institutions in national and international rankings. Similarly, editors of journals are concerned about the impact factor of their journal and its standing against comparable journals, which, in turn, re-enters local decision-making in recommendations about individual performance (for example, concerning the standing of their publications) and in terms of journals that should be targeted by a department to maximize its standing in research evaluation. In many places, these judgements affect pay and working conditions as contracts separating teaching from research, or varying the ratio between them, become more frequent (Holmwood, 2013).
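To make concrete what the ‘impact’ in impact factor actually measures, here is a minimal sketch, in Python, of the conventional two-year journal impact factor calculation. The journals and citation counts are invented purely for illustration, loosely mirroring the fivefold biochemistry-versus-pharmacy difference Seglen reports; nothing in the calculation registers relevance to clients or practice.

```python
# Minimal sketch of the conventional two-year journal impact factor (JIF):
# citations received in year Y to items published in years Y-1 and Y-2,
# divided by the number of citable items published in those two years.
# The journals and numbers below are hypothetical, for illustration only.

def impact_factor(citations_to_prior_two_years: int,
                  citable_items_prior_two_years: int) -> float:
    """Two-year journal impact factor."""
    return citations_to_prior_two_years / citable_items_prior_two_years

# Hypothetical journals in two fields with different citation cultures.
journals = {
    "Hypothetical Biochemistry Journal": {"citations": 2500, "items": 500},  # ~5 cites/article
    "Hypothetical Pharmacy Journal": {"citations": 500, "items": 500},       # ~1 cite/article
}

for name, counts in journals.items():
    jif = impact_factor(counts["citations"], counts["items"])
    print(f"{name}: JIF = {jif:.1f}")

# The biochemistry journal's JIF comes out five times higher, echoing
# Seglen's observation: the metric reflects a field's citation culture,
# not the relevance of either journal's articles to practice.
```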

The top 20 schools of social work have all developed and supported hierarchies among faculty: clinical track, research track, teaching track, non-tenure and tenure tracks are now all too common. The pay and workload hierarchy is most troubling because it systemically reproduces the gap between caring research and caring practice. Furthermore, research societies and journals split off (socially and psychologically) from the everyday struggles of practice, from clients and practitioners, [2] and from the publications and organizations involved in the dissemination of practitioner knowledge. In the U.S., for example, the Society for Social Work and Research, an organization emerging from and dominated by academic positivism, produces a journal and holds annual meetings where faculty researchers publish and present findings. And the National Association of Social Workers, the practitioner organization, holds annual meetings attended mainly by practitioners or by academics with closer relationships to worlds of practice. These trends operate to obscure the importance of relevance. The researcher’s “impact” [3] refers not to clients, patients, or the subjects of empirical research but to the ‘sense of the game’ (Judge et al., 2007): the mastery of research techniques, membership in research societies, and the game of citation. In short, we are not measuring the impact of the research on our clients or on human suffering. The impact factor is not a relevance factor.

Scholastic fallacies are present when social work researchers spend an incredible amount of time criticizing their methods instead of the structure of scientific practice that produces those methods. As they debate the advancement of their statistical methods, the limitations of their sampling techniques, and their desire to attain the gold standard, the randomized controlled trial (RCT), they are systematically hiding behind their academic privilege by single-handedly identifying the research object and methods, securing funding, and ultimately (and desirably) conducting the RCT. RCTs are extremely expensive when done correctly, so who decides how research dollars are spent and on what problems? The answer: the peer review system of the granting agency, which is also premised on the ‘sense of the game.’ Moreover, as the gold standard for research (Cartwright, 2009) has moved toward the RCT, the frontline worker’s standard for practice has shifted to the so-called evidence base, the standardized manual, ideally produced by the RCT (Cartwright, 2009; Dore, 2006; Nairn, 2012; Gray, 2006; Gray et al., 2009); like their clients, frontline social workers are now trained using manuals, and effective teaching and expertise are demonstrated through testing and mastery of the manual. Cartwright and Munro (2010) persuasively argue that the RCT has internal validity but lacks external validity. They write,

A properly conducted RCT provides evidence that the intervention works somewhere (i.e., in the trial). The decision maker, however, needs to estimate ‘will it work for us?’ In health and social care the underlying social and physical structures in which an intervention is devised cannot automatically be assumed to be comparable to target localities in causally relevant aspects (assuming we knew what these were [our emphasis]). Differences in institutional, psychological and physical factors yield different causal and probabilistic relations (p. 265).

The scholastic fallacy means, moreover, that a certain kind of privilege is afforded to academic social workers. If practitioners and clients seek transformative caring practices, practices that end racism and structural violence, for example, they should locate some portion of the systemic in “systemic racism” within the system that produces the scholastic fallacy: a system segregating an elite class of research academics from teaching faculty, practitioners, and clients. By elite we mean those who primarily protect their research status in academe instead of exposing the serious problems in academic hierarchical systems that segregate knowledge producers from knowledge users. These false binaries have helped to reproduce a growing gap between theory and practice.

As the distance between the places where practitioners practice and the places where practice knowledge is produced has increased, so has the gap between what matters most to people and what matters to those who study people (Longhofer & Floersch, 2014; Sayer, 2011). This gap between research and practice has been described by many in the applied fields not only as a crisis of knowledge production but also as a crisis of values: a gap between what researchers do and what matters most to people (Davis, 2013; Fairclough, 2013; Flyvbjerg, 2001, 2012; Longhofer & Floersch, 2012, 2013; Putnam, 2002; Putnam & Walsh, 2012; Sayer, 2009, 2011; Van de Ven & Johnson, 2006). The more the scholastic field is committed to the rules of the game, the illusio, the greater the distance produced between the things that matter to clients and practitioners (i.e., normative claims and values) and the things that count as significant within the illusio (i.e., a “suspension of the objectives of ordinary existence in favor of the new stakes”). And as we move toward what John Holmwood describes as “death by metrics,” social work researchers have purchased the broader and more encompassing academic field’s reduction of the normative to the positive (i.e., the ought to the is); with this reduction, researchers establish a growing distance between what matters most to those we practice with and what is metricized within the illusio (see Andrew Sayer’s recent book, Why Things Matter to People: Social Science, Values and Ethical Life).

A truly engaged and critical social work research would accomplish what Andrew Van de Ven has described as engaged scholarship. For Van de Ven, engaged scholarship uses what he calls an arbitrage strategy “for surpassing the dual hurdles of relevance and rigor” in understanding complex problems. Arbitrage exploits the differences in the forms of knowledge that scholars and practitioners from diverse areas deploy in particular practice settings. He argues that “quality as well as the impact of research improves substantially when researchers do four things: (1) confront questions and anomalies existing in reality, (2) organize the research project as a collaborative learning community of scholars and practitioners with diverse perspectives, (3) conduct research that systematically examines not only alternative models and theories but alternative practical formulations of the question of interest, and (4) frame the research and its findings to contribute knowledge to academic disciplines and to one or more domains of practice” (Van de Ven, 2006, p. 215). Van de Ven thus offers a model for bringing caring researchers together with caring practitioners.

 [1] Gambrill (1999) offers her own dichotomous account and defense of evidence-based practice by arguing that the alternative is “authority-based practice.”

[2] In the United States, the Society for Social Work and Research (SSWR) attracts exclusively those with particular and strong commitments to positivism, while the NASW (the National Association of Social Workers) attracts mainly practitioners. And while the SSWR mission statement makes a passing reference to values (i.e., social justice), it is not clear how the mission articulates with the vision of the society.

[3] Lawrence writes of the impact factor in biology: “It is fun to imagine song writers being assessed in the way that scientists are today. Bureaucrats employed by DAFTA (Ditty, Aria, Fugue and Toccata Assessment) would count the number of songs produced and rank them by which radio stations they were played on during the first two weeks after release. The song writers would soon find that producing junky Christmas tunes and cosying up to DJs from top radio stations advanced their careers more than composing proper music. It is not so funny that, in the real world of science, dodgy evaluation criteria such as impact factors and citations are dominating minds, distorting behaviour and determining careers. Modern science, particularly biomedicine, is being damaged by attempts to measure the quantity and quality of research. Scientists are ranked according to these measures, a ranking that impacts on funding of grants, competition for posts and promotion. The measures seemed, at first rather harmless, but, like cuckoos in a nest, they have grown into monsters that threaten science itself. Already, they have produced an “audit society” in which scientists aim, and indeed are forced, to put meeting the measures above trying to understand nature and disease” (2007, p. R583).

References

Abbott, A. (2005). Linked ecologies: States and universities as environments for professions. Sociological Theory, 23(3), 245-274.

Archer, M. S. (2010). Routine, reflexivity, and realism. Sociological Theory, 28(3), 272-303.

Benner, P. (2000a). The wisdom of our practice.  The American Journal of Nursing, 100(10), 99-105.

Benner, P. (2000b). The roles of embodiment, emotion and lifeworld for rationality and agency in nursing practice. Nursing Philosophy, 1(1), 5-19.

Benner, P. (2004). Using the Dreyfus model of skill acquisition to describe and interpret skill acquisition and clinical judgment in nursing practice and education. Bulletin of science, technology & society, 24(3), 188-199.

Berman, E. P. (2011). Creating the market university: How academic science became an economic engine. Princeton: Princeton University Press.

Bourdieu, P. (1991). Language and symbolic power. Cambridge: Polity Press.

Bourdieu, P. (1999). The abdication of the state. In The weight of the world: Social suffering in contemporary society (pp. 181-188). Cambridge, UK: Blackwell.

Bourdieu, P. (2000). Pascalian meditations. Stanford: Stanford University Press.

Brekke, J. S. (2014). A science of social work, and social work as an integrative scientific discipline: Have we gone too far, or not far enough? Research on Social Work Practice, 24(5), 517-523.

Brodie, J. M. (2007). Reforming social justice in neoliberal times. Studies in social justice, 1(2), 93-107.

Brown, W. (2006). American nightmare: Neoliberalism, neoconservatism, and de-democratization. Political Theory, 34(6), 690-714.

Cartwright, N., & Munro, E. (2010). The limitations of randomized controlled trials in predicting effectiveness. Journal of evaluation in clinical practice, 16(2), 260-266.

Cartwright, N. (2007). Are RCTs the gold standard?. Biosocieties, 2(1), 11-20.

Cartwright, N. (2009). Evidence-based policy: what’s to be done about relevance?. Philosophical Studies, 143(1), 127-136.

Chu, W. C., & Tsui, M. S. (2008). The nature of practice wisdom in social work revisited. International Social Work, 51(1), 47-54.

Davis, J. E. (2013). Social science, objectivity, and moral life. Society, 50(6), 554-559.

Dixon, J. (2012). On being Poor‐by‐Choice: A Philosophical Critique of the Neoliberal Poverty Perspective. Poverty & Public Policy, 4(2), 1-19.

Dore, I. (2006). Evidence focused social care: on target or off-side?. Social Work & Society, 4(2), 232-255.

Dreyfus, H. L. (2008). On the internet. London: Routledge.

Easton, G. (2010). Critical realism in case study research. Industrial Marketing Management, 39(1), 118-128.

Fairclough, N. (2013). Critical discourse analysis and critical policy studies. Critical Policy Studies, 7(2), 177-197.

Fine, B. (2006). Debating critical realism in economics. Capital & Class, 30(2), 121-129.

Flyvbjerg, B. (2001). Making social science matter: Why social inquiry fails and how it can succeed again. Cambridge, UK: Cambridge University Press.

Flyvbjerg, B. (2006). Five misunderstandings about case-study research. Qualitative inquiry, 12(2), 219-245.

Flyvbjerg, B. (2012). Five misunderstandings about case-study research, corrected. In Qualitative research: The essential guide to theory and practice (pp. 165-166). London and New York: Routledge.

Froggett, L. (2002). Love, hate and welfare: Psychosocial approaches to policy and practice. Bristol: The Policy Press.

Gambrill, E. (1999). Evidence-based practice: An alternative to authority-based practice. Families in Society: The Journal of Contemporary Social Services, 80(4), 341-350.

Garrett, P. M. (2010). Examining the ‘conservative revolution’: Neoliberalism and social work education. Social Work Education, 29(4), 340-355.

Garrett, P. M. (2008). How to be modern: New Labour’s neoliberal modernity and the Change for Children programme. British Journal of Social Work, 38(2), 270-289.

Garrett, P. M. (2014). Confronting the ‘work society’: New conceptual tools for social work. British Journal of Social Work, 44(7), 1682-1699.

Gitterman, A. (2014). Social Work: A Profession in Search of Its Identity. Journal of Social Work Education, 50(4), 599-607.

Gray, M., & McDonald, C. (2006). Pursuing good practice? The limits of evidence-based practice. Journal of Social Work, 6(1), 7-20.

Gray, M., Plath, D., & Webb, S. A. (2009). Evidence-based social work: A critical stance. London: Routledge.

Guerrero, E. G. (2014). What Does It Take for Social Work to Evolve to Science Status? Discussing Definition, Structure, and Contextual Challenges and Opportunities. Research on Social Work Practice, 24(5), 601-606.

Hasenfeld, Y., & Garrow, E. E. (2012). Nonprofit human-service organizations, social rights, and advocacy in a neoliberal welfare state. Social Service Review, 86(2), 295-322.

Held, V. (2006). The ethics of care: Personal, political, and global. Oxford: Oxford University Press.

Holmwood, J. (2014). From social rights to the market: neoliberalism and the knowledge economy. International Journal of Lifelong Education, 33(1), 62-76.

Holmwood, J.  (2013).  Death by metrics.  Global Dialogue: Newsletter for the International Sociological Association.  http://isa-global-dialogue.net/death-by-metrics/

Houston, S. (2002). Reflecting on habitus, field and capital towards a culturally sensitive social work. Journal of Social Work, 2(2), 149-167.

Houston, S. (2005). Philosophy, theory and method in social work challenging empiricism’s claim on evidence-based practice. Journal of Social Work, 5(1), 7-20.

Houston, S. (2014). Social work and the sociological imagination. Children and Families, 61.

Humphries, B. (2003). What else counts as evidence in evidence-based social work?. Social Work Education, 22(1), 81-91.

Judge, T. A., Cable, D. M., Colbert, A. E., & Rynes, S. L. (2007). What causes a management article to be cited—article, author, or journal?. Academy of Management Journal, 50(3), 491-506.

Klodawsky, F., Aubry, T., & Farrell, S. (2006). Care and the lives of homeless youth in neoliberal times in Canada. Gender, Place and Culture, 13(4), 419-436.

Lahire, B. (2011). The plural actor.  Cambridge, UK: Polity Press.

Lawrence, P. A. (2007). The mismeasurement of science. Current Biology, 17(15), R583-R585. doi:10.1016/j.cub.2007.06.014

Longhofer, J. L., & Floersch, J. (2004). Book review: The phenomenological practice gap: Practice guidelines, evaluation, and clinical judgment. Qualitative Social Work, 3(4), 483-486.

Longhofer, J., & Floersch, J. (2012). The Coming Crisis in Social Work: Some Thoughts on Social Work and Science. Research on Social Work Practice, 22(5), 499-519.

Longhofer, J., & Floersch, J. (2014). Values in a science of social work: Values-informed research and research-informed values. Research on Social Work Practice, 24(5) 527–534.

Magill, M. (2006). The future of evidence in evidence-based practice: Who will answer the call for clinical relevance? Journal of Social Work, 6(2), 101-115.

Martinez-Brawley, E. E. (2001). Searching again and again. Inclusion, heterogeneity and social work research. British Journal of Social Work, 31(2), 271-285.

Meagher, G. & Parton, N.  (2004). Modernising social work and the ethics of care. Social Work & Society, 2(1), 10-27.

McDowell, L. (2004). Work, workfare, work/life balance and an ethic of care. Progress in Human Geography, 28(2), 145-163.

Morris, P. M. (2008). Reinterpreting Abraham Flexner’s speech, “is social work a profession?”: Its meaning and influence on the field’s early professional development. Social Service Review, 82(1), 29-60.

Nairn, S. (2012). A critical realist approach to knowledge: implications for evidence‐based practice in and beyond nursing. Nursing inquiry, 19(1), 6-17.

Osmond, J. (2006). A quest for form: the tacit dimension of social work practice. European Journal of Social Work, 9(2), 159-181.

Putnam, H. (2002). The collapse of the fact/value dichotomy and other essays. Cambridge, MA: Harvard University Press.

Putnam, H., & Walsh, V. (Eds.). (2012). The end of value-free economics. London: Routledge.

Rogowski, S.  (2010). Social work: The rise and fall of a profession?. London: Policy Press.

Rogowski, S. (2012). Social work with children and families: Challenges and possibilities in the neo-liberal world. British Journal of Social Work, 42(5), 921-940.

Rosen, A., & Zeira, A. (2000). Unraveling “tacit knowledge”: What social workers do and why they do it. Social Service Review, 74(1), 103-123.

Sayer, A. (2009). Who’s afraid of critical social science?. Current Sociology, 57(6), 767-786.

Sayer, A. (2011). Why things matter to people: Social science, values and ethical life.  Cambridge, UK: Cambridge University Press.

Seglen, P. O. (1997). Why the impact factor of journals should not be used for evaluating research. British Medical Journal, 314(7079), 498-502.

Sheldon, B. (2001). The validity of evidence-based practice in social work: A reply to Stephen Webb. The British Journal of Social Work, 801-809.

Sheppard, M. (1995). Social work, social science and practice wisdom. British Journal of Social Work, 25(3), 265-293.

Sheppard, M. (2012). Social work and social exclusion: the idea of practice. Ashgate Publishing, Ltd.

Smith, D. (1987). The limits of positivism in social work research. British Journal of Social Work, 17(4), 401-416.

Smith, D. (2002). The limits of positivism revisited. Social work & social sciences review, 10(1), 27-37.

Taylor, C., & White, S. (2006). Knowledge and reasoning in social work: Educating for humane judgement. British Journal of Social Work, 36(6), 937-954.

Taylor, C., & White, S. (2001). Knowledge, truth and reflexivity: The problem of judgement in social work. Journal of Social Work, 1(1), 37-59.

Thyer, B. A., & Myers, L. L. (2011). The quest for evidence-based practice: A view from the United States. Journal of Social Work, 11(1), 8-25.

Tsang, E. W. (2014). Case studies and generalization in information systems research: A critical realist perspective. The Journal of Strategic Information Systems, 23(2), 174-186.

Turnbull, N., & Antalffy, N. (2009). Bourdieu's distinction between philosophical and sociological approaches to Science Studies. The Sociological Review, 57(4), 547-566.

Trevithick, P. (2008). Revisiting the knowledge base of social work: A framework for practice. British Journal of Social Work, 38(6), 1212-1237.

Van de Ven, A. H., & Johnson, P. E. (2006). Knowledge for theory and practice. Academy of management review, 31(4), 802-821.

Webb, S. A. (2001). Some considerations on the validity of evidence-based practice in social work. British Journal of social work, 31(1), 57-79.

Witkin, S. L., & Gottschalk, S. (1988). Alternative criteria for theory evaluation. The Social Service Review, 211-224.

Witkin, S. L. (1995). Whither social work research? An essay review. Social Work, 424-428.

Woolford, A., & Nelund, A. (2013). The responsibilities of the poor: Performing neoliberal citizenship within the bureaucratic field. Social Service Review, 87(2), 292-318.

Zimmerman, J. H. (1989). Determinism, science, and social work. The Social Service Review, 52-62.
