
Algorithms and robots are missing relational expertise

by Anthony Weaver

There has been much talk in recent years of how rapidly evolving technologies will take over the high-skilled work of professionals – radiologists, teachers, and computer programmers. With new applications such as ChatGPT coming into play, the discussion has focused on white-collar roles typically deemed ‘safe’ from technological displacement.

However, a new article argues that much professional work depends on relational expertise – expertise that is generated, applied, and recognized in collaboration with other professions, organizational arrangements, clients and patients, as well as tools and technologies – and that, as such, this work cannot be done by algorithms.

Pauli Pakarinen, a Postdoctoral Researcher at Stanford University, and Ruthanne Huising, Professor at emlyon business school, draw on detailed studies of how members of professions work to show that AI technologies face three key challenges relative to workers’ relational expertise.

Firstly, expertise and ideas are usually generated through ties and links among actors, making them unavailable for capture, abstraction, reproduction, or replication. Though data, rules, and professional guidance can be fed into new technologies, relational expertise cannot be captured to train them.

Secondly, applying expertise to a client’s problem or a patient’s symptoms relies on interaction with the client or patient. In this interaction, additional information is gathered to improve diagnosis and to develop solutions that are appropriate and feasible for the client or patient. This is important and sophisticated translation work that depends on human interaction.

And thirdly, members of professions are individually and collectively accountable for their decisions, advice, and treatment, even when these are based on automated outputs. The decisions and recommendations of AI technologies are often neither explainable nor verifiable, so members of professions are required to intervene and determine the appropriate course of action.

“AI technologies are the latest in a long line of threats that professions have faced. Although many job loss analyses predict that professions are under threat, these analyses treat expertise as a substance that can easily be transferred from humans to machines. Expertise is not a substance like this. It is not a mental or cognitive capacity that can be replicated abstractly. Members of professions learn abstract knowledge and specific skills and techniques; however, these are heavily supplemented by expertise that is developed about and within the settings in which they work, by expertise in applying their abstract knowledge and skills, and by the normative and legal conventions that recognize this expertise.

The prediction that machine learning algorithms are going to take over and eliminate the work of professions is, like the technologies themselves, based on abstract and naïve notions of what these people do all day and how they do it,” says Ruthanne Huising.

“Relational expertise is not extractable or codifiable – therefore job roles that depend on it will not be replicated by new technologies.”

They add that it is very likely that new work roles related to new technologies will emerge in each profession, and thus work in these professions is likely to change and shift rather than be replaced.

The researchers advise workers affected by AI technologies to be aware of, and to emphasize, how they create expertise through interaction with each other, the organizations they work in, those they advise and treat, and the other tools and technologies they work with. Further, such workers need to be actively involved in the adoption and implementation of new technologies related to their work.

