OpenAI wants to secure ChatGPT against disinformation

AI developer OpenAI plans to introduce tools to identify disinformation ahead of this year’s major elections. “We want to ensure that our technology is not used in ways that could undermine the democratic process,” the company said. Developers are currently working on tools that will let users reliably determine whether a text was generated by ChatGPT and whether an image was created by artificial intelligence. “We are still working to understand how effective our tools for personalized persuasion might be,” the statement said.

2024 is a so-called super election year: important elections are taking place in several major countries, including the USA, India and Great Britain. According to a World Economic Forum (WEF) study published last week, AI-driven disinformation aimed at influencing elections is one of the world’s greatest short-term risks.

Protective measures also apply to the Dall-E 3 image generator

According to the company, ChatGPT directs users with questions about the US election to official websites. “The lessons we learn from this work will form the basis of our approach to other countries and regions,” the company said. In addition, its image generator Dall-E 3 contains “guardrails” intended to prevent users from generating images of real people, such as presidential candidates.

Other US companies such as Google and Facebook parent company Meta also announced last year that they intend to limit AI-driven attempts to influence elections on their platforms.

ChatGPT has been available for over a year. The rapid development of artificial intelligence fascinates experts and the public, but it also raises serious concerns about the labor market, data protection and possible disinformation. The EU wants to regulate the use of AI with the so-called AI Act.
