Scrapping income tax as the primary revenue source for national exchequers and giving computers legal personhood are just two of the revolutionary suggestions made in a new report from the International Bar Association (IBA), its contribution to the International Labour Organisation’s (ILO) global debate on the future of work.
The IBA Report on the Future of Work is the culmination of a two-year project between the IBA and the ILO. The report was launched at the just-completed IBA annual conference in Seoul, South Korea.
The report follows a worldwide debate about the future of work instigated by the ILO in 2017.
“The world of work is constantly evolving and, as legal professionals, it is our duty to stay informed on these changes,” says Salvador del Rey Guanter, Working Group Chair of the IBA-ILO Project on the Future of Work.
“This report provides a fascinating insight into the future of our sector and invaluable advice on how to adapt our working practices to meet the challenges we will face.”
The International Federation of Robotics (IFR) predicts that more than three million industrial robots will be in use in factories around the world by 2020. Furthermore, it is estimated that at least one in three jobs is vulnerable to artificial intelligence (AI) and robotics.
The IBA says national exchequers currently rely on income tax and value added tax (VAT) as their primary revenue sources. This will need to change if the future brings mass unemployment and a rise in alternative income generation, such as gig working. As there is no obvious alternative income source for national exchequers, the report considers that new types of taxation, such as a digital or robotics tax, may need to be implemented to protect the global economy and promote growth.
Giving computers some form of legal personhood is another suggestion, made in the litigation section of The IBA Report on the Future of Work. Autopilot technology, CV-scanning tools and the recent development of driverless cars illustrate how prevalent AI is becoming in our society.
As the use of AI becomes more widespread, the likelihood of such programs causing damage to people or property is going to increase. Machines currently have no legal status, and as such they cannot be held accountable for their actions in the way that people or corporations can.
One solution could be to introduce strict product liability laws for AI, similar to the current regulations governing autopilot technology. However, as the cost of compensation would fall on the manufacturer of the product, this approach could slow technological progress by making companies reluctant to adopt AI systems.
Giving computers legal personhood, and thus the ability to be held accountable, could be a way of encouraging innovation by reducing the ‘fear factor’ involved in AI liability.