A separate US study built a platform intended to describe pictures accurately, having first examined huge quantities of images from social media.
It was shown a picture of a man in a kitchen, yet still labelled him as a woman in a kitchen.
Maxine Mackintosh, a leading expert in health data, said the problem was mainly the fault of the skewed data on which such systems are trained.
“These big data are really a social mirror – they reflect the biases and inequalities we have in society,” she told the BBC.
“If you want to take steps towards changing that you can’t just use historical information.”
In May last year, a report claimed that a computer program used by a US court for risk assessment was biased against black prisoners.
The Correctional Offender Management Profiling for Alternative Sanctions tool was much more prone to mistakenly labelling black defendants as likely to reoffend, according to an investigation by ProPublica.
The warning came in the same week that the Ministry of Defence said the UK would not support a change to international law to ban pre-emptive “killer robots”, able to identify, target and kill without human control.