Many Americans think generative AI programs should credit the sources they rely on

Overall, 54% of Americans say that AI programs that generate text and images, such as ChatGPT and DALL-E, should credit the sources they rely on to produce their answers. A much smaller share (14%) say the programs do not need to credit their sources, according to a new Pew Research Center survey. About a third say they are not sure about this question.

The Pew Research Center published this analysis as part of its ongoing work to understand attitudes about artificial intelligence. This analysis is based on a survey of 10,133 U.S. adults conducted February 7-11, 2024.

All those who participated in the survey are members of the Pew Research Center's American Trends Panel (ATP), an online survey panel recruited through national random sampling of residential addresses. This way, nearly all U.S. adults have a chance of being selected. The survey is weighted to be representative of the U.S. adult population by gender, race, ethnicity, party affiliation, education and other categories. Read more about the ATP's methodology.

Here are the questions used for this analysis, along with responses, and its methodology.

A separate Pew Research Center analysis finds growing public engagement with ChatGPT, one of the best-known examples of generative AI, especially among young people.

Generative AI programs work by reviewing large amounts of information, such as the works of an artist or a news organization. This allows them to generate responses when users ask questions.

This process has spurred lawsuits from authors, artists and news organizations, who argue that it is an unauthorized use of copyrighted material. But some technology companies argue that it is fair use under copyright law and that the programs provide a clear public benefit.

Our survey finds that the public consistently says AI programs should credit their sources across seven examples of content these programs could generate.

A bar chart showing that Americans consistently say generative AI programs should credit their sources.

For example, 75% say AI programs should credit the sources they rely on if they provide information that matches what a journalist wrote almost word for word. Only 6% say the programs do not need to credit their sources in this scenario, while 19% say they are not sure.

Majorities of American adults (67% each) also see a need to credit sources if AI programs generate images that imitate the style of a living artist or text that imitates the style of a living author.

Whether an author is alive or dead has little impact on public attitudes: 65% say credit is needed if AI programs imitate the writing style of a famous author who died many years ago.

Similarly, about six in ten say generative AI programs should credit the sources they rely on when writing a film script in the style of a popular movie. Hollywood screenwriters recently secured limits on the use of AI in screenwriting as part of a broader labor agreement.

The view that credit is needed also extends to more general types of information. For example, 60% of Americans say AI programs should give credit to the sources they use if they summarize information about the U.S. population. And 61% say credit is needed if these programs provide information that was reported by many different news organizations.

How often do Americans think they interact with AI?

A bar graph showing that adults with higher levels of education report more frequent interaction with AI.

Over the years, the Center's surveys have explored the public's opinions on multiple aspects of artificial intelligence, including general awareness of and engagement with these technologies.

Our new survey finds that 22% of Americans say they interact with artificial intelligence almost constantly or multiple times a day. Another 27% say they interact with AI about once a day or several times a week. Half of Americans believe they interact with AI less frequently.

Adults with higher levels of education are more likely than those with less education to say they interact with AI frequently. For example, 63% of adults with a postgraduate degree and 57% of college graduates say they interact with AI at least several times a week. That compares with 50% of those with some college education and 36% of those with a high school diploma or less education.

Younger Americans are also more likely than older adults to say they frequently interact with AI. Majorities of adults ages 18 to 29 (56%) and 30 to 49 (54%) say they interact with AI at least several times a week. Smaller shares of those ages 50 to 64 (46%) and 65 and older (37%) say the same.

While AI now powers many widely used features, such as personalized online shopping recommendations, its presence may not always be visible to all Americans. For example, only 30% of U.S. adults correctly identified the presence of AI in six examples in a recent survey of AI awareness.

Note: Here are the questions used for this analysis, along with responses, and its methodology.

Alex Tyson is associate director of research at the Pew Research Center.

Brian Kennedy is a senior fellow focused on scientific and social research at the Pew Research Center.
