
Researchers take a stand on algorithm design for job centers: Landing a job isn't always the right goal

Date:
October 29, 2020
Source:
University of Copenhagen
Summary:
Algorithms that assess the risk of citizens becoming unemployed are currently being tested in a number of Danish municipalities. But according to a new study, gaining employment is not the only relevant goal for those out of work -- nor should it be for an algorithm.

Imagine that you are a job consultant. You are sitting across from your client, an unemployed individual.

After you locate them in the system, the following text pops up on the computer screen: 'increased risk of long-term unemployment'.

Such assessments are made by an algorithm that, via data on the citizen's gender, age, residence, education, income, ethnicity, history of illness, etc., spits out an estimate of how long the person -- compared to other people from similar backgrounds -- is expected to remain in the system and receive benefits.
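
The agency's model is not described in technical detail here, so purely as a hypothetical illustration of the kind of comparison described above, the short Python sketch below estimates a new client's expected unemployment duration from the outcomes of past cases with a similar background and flags 'increased risk' above a threshold. The field names, the similarity rule and the six-month cutoff are all invented for illustration.

    # Hypothetical sketch only -- not the Danish agency's actual model.
    from dataclasses import dataclass
    from statistics import mean

    @dataclass
    class Client:
        age: int
        education_years: int

    @dataclass
    class Case(Client):
        months_unemployed: int  # observed outcome of a closed, historical case

    def is_similar(client: Client, case: Case, age_tolerance: int = 5) -> bool:
        """Crude notion of 'similar background': close in age, same education length."""
        return (abs(client.age - case.age) <= age_tolerance
                and client.education_years == case.education_years)

    def predicted_months(client: Client, history: list[Case]) -> float:
        """Average observed unemployment duration among similar historical cases."""
        matches = [c.months_unemployed for c in history if is_similar(client, c)]
        return mean(matches) if matches else float("nan")

    # Example: flag the client if the estimate exceeds six months.
    history = [Case(44, 12, 9), Case(46, 12, 11), Case(30, 16, 2)]
    client = Client(age=45, education_years=12)
    estimate = predicted_months(client, history)
    if estimate >= 6:
        print(f"increased risk of long-term unemployment (estimate: {estimate:.0f} months)")

In the systems discussed in the article, the feature set is far broader -- gender, residence, income, ethnicity, illness history -- which is precisely what raises the ethical questions the researchers examine.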

But is it reasonable to characterize individual citizens on the basis of what those with similar backgrounds have managed in their job searches? According to a new study from the University of Copenhagen, no.

"You have to understand that people are human. We get older, become ill and experience tragedies and triumphs. So instead of trying to predict risks for individuals, we ought to look at implementing improved and more transparent courses in the job center arena," says Naja Holten Møller, an assistant professor at the Department of Computer Science, and one of the researchers behind the study.

Together with two colleagues from the same department, Professor Thomas Hildebrandt and Professor Irina Shklovski, Møller has explored possible alternatives to using algorithms that predict job readiness for unemployed individuals as well as the ethical aspects that may arise.

"We studied how to develop algorithms in an ethical and responsible manner, where the goals determined for the algorithm make sense to job consultants as well. Here, it is crucial to find a balance, where the unemployed individual's current situation is assessed by a job consultant, while at the same time, one learns from similar trajectories using an algorithm," says Naja Holten Møller.

Job consultants need to help create the algorithm

The use of job search algorithms is not a well-thought-out scenario. Nevertheless, the Danish Agency for Labour Market and Recruitment has already rolled out this type of algorithm to predict citizens' risk of long-term unemployment -- despite criticism from several data law experts.

"Algorithms used in the public sphere must not harm citizens, obviously. By challenging the scenario and the very assumption that the goal of an unemployed person at a job centre is always to land a job, we are better equipped to understand ethical challenges. Unemployment can have many causes. Thus, the study shows that a quick clarification of time frames for the most vulnerable citizens may be a better goal. By doing so, we can avoid the deployment of algorithms that do great harm," explains Naja Holten Møller.

The job consultants surveyed in the study expressed concern about how the algorithm's assessment would affect their own judgment, specifically in relation to vulnerable citizens.

"A framework must be established in which job consultants can have a real influence on the underlying targets that guide the algorithm. Accomplishing this is difficult and will take time, but is crucial for the outcome. At the same time, it should be kept in mind that algorithms which help make decisions can greatly alter the work of job consultants. Thus, an ethical approach involves considering their advice," explains Naja Holten Møller.

We must consider the ethical aspects

While algorithms can be useful for providing an idea of, for example, how long an individual citizen might expect to be unemployed, this does not mean that it is ethically justifiable to use such predictions in job centers, points out Naja Holten Møller.

"There is a dream that the algorithm can identify patterns that others are oblivious to. Perhaps it can seem that, for those who have experienced a personal tragedy, a particular path through the system is best. For example, the algorithm could determine that because you've been unemployed due to illness or a divorce, your ability to avoid long-term unemployment depends on such and such," she says, concluding:

"But what will we do with this information, and can it be deployed in a sensible way to make better decisions? Job consultants are often able to assess for themselves whether a person is likely to be unemployed for an extended period of time. These assessments are shaped by in-person meetings, professionalism and experience -- and it is here, within these meetings, that an ethical development of new systems for the public can best be spawned."


Story Source:

Materials provided by University of Copenhagen. Note: Content may be edited for style and length.


Journal Reference:

  1. Naja Holten Møller, Irina Shklovski, Thomas T. Hildebrandt. Shifting Concepts of Value. Association for Computing Machinery (ACM), 2020. DOI: 10.1145/3419249.3420149

Cite This Page:

University of Copenhagen. "Researchers take a stand on algorithm design for job centers: Landing a job isn't always the right goal." ScienceDaily. ScienceDaily, 29 October 2020. <www.sciencedaily.com/releases/2020/10/201029105001.htm>.
