Roughly two years ago, a post on the workplace question-and-answer site Workplace StackExchange generated a lively debate. The post, which has since been viewed nearly half a million times, came from a user named Etherable, who had designed an algorithm that compressed the substance of a 40-hour workweek into a two-hour project. Etherable was using the remaining time to tend to personal issues and spend more time with their son, and asked: “Is it unethical for me to not tell my employer I’ve automated my job?”

The question seems straightforward: a simple yes or no. But the responses were often qualified. Some thought the lack of disclosure was unethical but warranted, given the worry that the poster would lose their job. Others thought the answer hinged on whether the company was paying the poster for hours or for results. And at least a few offered pats on the back: one of technology’s great promises was that it would free people from rote tasks and give them back time for more meaningful pursuits. This was simply that promise fulfilled.

Future of work issues, now

While the employment contract or company policies should be the deciding factor in the situation described, the post also illustrates some of the more complex ethical issues emerging as technology advances, says John Hooker, professor of business ethics and social responsibility at Carnegie Mellon University and author of Taking Ethics Seriously: Why Ethics Is an Essential Tool for the Modern Workplace.

“Rather than a transfer of tasks from humans to automation, we’re going to see a fusion of intelligent agents, computers, [and] algorithms working together with humans,” he says. That fusion is going to shine a light on some ethical and other problem areas companies and workers already struggle to navigate.

Part of the question relates to metrics, says “computer psychologist” Tim Lynch, president of gaming computer company Psychsoftpc. If the individual is being paid by output rather than hours, then they are fulfilling their agreement.

However, other factors complicate matters. Depending on the individual’s employment status and agreement, the company may own the algorithm because it owns the employee’s work product, Lynch says. Also, Etherable built “bugs” into the algorithm to make the work look imperfect, as it would if a human had done it. That suggests an effort to mislead the employer. And if the employer is relying on algorithms it is unaware of and has never vetted, the employee may be leaving the company open to security breaches or other liability.

“If you don’t know that Susie over there in the corner wrote this program that’s smashing out 10 million emails a day and has got some bug in it that’s sending the wrong thing to the wrong people, then it’s going to be pretty hard to track that thing and stop it,” says Ryan Duguid, chief evangelist for process automation platform Nintex.