A Microsoft AI engineer has alleged that the company’s AI image generator, Copilot Designer, lacks basic safeguards and can create violent and sexualized images.
As the use of Artificial Intelligence (AI) tools has grown rapidly, their dangers have also come to light. Recently, Google’s Gemini AI made objectionable remarks about Prime Minister Narendra Modi, after which the company had to apologize. Now another case, this time involving tech giant Microsoft, has emerged. An AI engineer at the company has alleged that Microsoft’s AI image generator, Copilot Designer, lacks basic safeguards and can create violent and sexualized images.
Engineer Shane Jones has written a letter to the US Federal Trade Commission (FTC). He says he repeatedly warned Microsoft management about these shortcomings, but no action was taken.
According to a report in The Guardian, a Microsoft spokesperson has denied that the company ignored safety concerns, saying it has sufficient processes in place to deal with problems related to generative AI.
What does Copilot Designer do?
Copilot Designer is a tool created by Microsoft that generates images from text prompts. It is built on OpenAI’s DALL-E 3 image generation model and was introduced last year.
What are the shortcomings of Copilot Designer?
In his letter, Jones alleges that Copilot Designer has systemic problems. He says the tool should not be available for public use until the company fixes its output, claiming it can generate images that sexually objectify women.