The American consulting firm Accenture is tying the promotion of executives to the use of the company’s own AI tools. This emerges from an internal company email to associate directors and senior managers, published by the Financial Times. “Using our core tools will make a visible contribution to talent discussions,” it states. These tools include the AI Refinery platform, developed jointly by Accenture and Nvidia. According to the Financial Times report, Accenture has been evaluating weekly log-ins for selected senior-level employees since this month. This is necessary, Accenture told our editorial team, in order to be “a completely customer-focused and AI-supported workplace and employer.”
Even though this policy does not apply to the 11,700 employees in Germany, as Accenture confirmed to our editorial team, the question arises: what does German labor law say, should this practice expand? We put that question to Dr. Alexander Insam, labor lawyer and partner at Görg.
For external (and internal) AI, the employer’s right to issue instructions applies
First of all: how can the use of AI be mandated in Germany? According to the employment lawyer, an AI that is fed with internal company data is nothing other than software subject to co-determination under Section 87 Paragraph 1 No. 6 BetrVG. “First of all, a common understanding about its use must be reached in a works agreement or in a framework agreement,” says Alexander Insam. He also points to the EU AI Regulation, passed in May 2024. Under Article 26 Paragraph 7 of the AI Regulation, for example, special transparency and information obligations apply within the company for so-called high-risk AI systems.
In principle, an employer can oblige its employees to use software and AI through the right to issue instructions. Whether the works council must agree depends on whether the AI software is external or internal. The Hamburg Labor Court ruled on January 16, 2024 that an employer may allow its employees to use tools such as ChatGPT, and set rules for their use, without involving the works council.
If, as at Accenture, the AI is internal, Insam explains, it must become part of the company’s systems, much like a word-processing program. Its introduction requires involvement of the works council. After that, the right to issue instructions applies as usual.
Promotion based on software usage
What about linking usage to promotion decisions? If an employer introduces an internal tool, it can set tool usage as a criterion for a promotion or salary increase, says Insam. “Even with a performance-related salary increase, I can clearly write the use of AI into the rules of the game.” Things become more difficult if performance measurement is tied solely to the quality of the work result. The question then arises whether the path to that result can be prescribed at all. “If the agreed result is delivered without the use of AI, the employer is generally not allowed to refuse a salary increase or promotion.”

According to Insam, the challenge lies in defining exactly what role the AI plays in the work context. “If the AI is a work tool, the employer can instruct that it be used. The employer can issue work instructions and thus make the AI part of the work process.” That means usage can also be included in the performance assessment. A promotion can likewise be linked to a performance appraisal; what matters is that this criterion is made transparent.
Data protection-compliant monitoring
Even if the criteria of the AI Regulation are met and a works agreement is in place, the question of data protection remains. Transparency is key here too, says Insam. “The use of software can be monitored with so-called login data. These days this data is often generated automatically.” The employer must disclose how and for what purpose it uses this data. As with all personal data, the employer’s interest in processing it must outweigh the employees’ interests.
“If the employer has already cleared the hurdles of EU regulation and works constitution law, it will not fail here,” says Insam. When it comes to assessing performance, the question is whether simply logging in and out can say anything about actual performance. “In any case, the assessment of whether work performance is good cannot be made on this basis alone.”
Human versus AI work performance
In Alexander Insam’s experience, the problem raised by the emergence of artificial intelligence tends to be the opposite: “Many employers we advise do not want to pay for what the AI has actually done, but only for the human transfer work. Defining this precisely will be very exciting in the future.” According to the Financial Times, the decision at Accenture has not been well received by all employees. Some criticized the efficiency of the AI tools in use and announced that they would leave Accenture if the policy were expanded.
Angela Heider-Willms is responsible for reporting on transformation, change management and leadership. She also covers the topic of diversity.