ChatGPT Shows Geographic Biases in Environmental Justice Data

A recent study has shed light on potential limitations within the artificial intelligence (AI) chatbot ChatGPT, particularly in providing regionally tailored information on environmental justice issues. Virginia Tech, a prominent U.S. university, has released a comprehensive report unveiling potential biases in ChatGPT, indicating variations in its responses related to environmental justice across different counties.

The report from Virginia Tech researchers suggests that ChatGPT faces challenges in delivering location-specific information on environmental justice issues. Notably, the study highlights a discernible pattern where such information is more readily available for larger, densely populated states. States with substantial urban populations, such as Delaware or California, exhibited information gaps in less than 1 percent of their counties.

Conversely, rural states with smaller populations, such as Idaho and New Hampshire, faced significant disparities in access. The report emphasizes that over 90 percent of the population in these states lived in counties lacking local-specific information on environmental justice issues.

Kim, a lecturer in Virginia Tech’s Department of Geography, stressed the need for further research as these geographic biases come to light. “While more study is needed, our findings reveal that geographic biases currently exist in the ChatGPT model,” Kim said.

A United States map showing areas where residents can view (blue) or cannot view (red) local-specific information on environmental justice issues. Source: Virginia Tech

The research paper includes a visual representation, featuring a map illustrating the extent of the U.S. population without access to location-specific information concerning environmental justice issues.

This revelation follows recent concerns about potential political biases exhibited by ChatGPT. Earlier this year, researchers from the United Kingdom and Brazil published a study asserting that large language models, including ChatGPT, generate text with errors and biases that could potentially mislead readers.

These developments underscore the importance of ongoing research and scrutiny to enhance the fairness and accuracy of AI language models like ChatGPT.

Wasif Shakir
