AI deforests the knowledge ecosystem

Submitted by
Style Pass
2023-03-17 04:30:10

Big Tech’s dash to incorporate ChatGPT-like interfaces into their search engines threatens the ecosystem of human knowledge with extinction. Knowledge development is a social activity. It starts with scientists publishing papers and books that build on earlier ones, and with practitioners, journalists, and other writers disseminating these findings and their opinions in more accessible forms. It continues through specialized websites, blogs, Wikipedia, and discussion and Q&A forums. It further builds upon our interactions with these media through website visits, upvotes, likes, comments, links, and citations. All these elements combined have yielded a rich global knowledge ecosystem that feeds on our interactions to promote the continuous development of useful and engaging content.

ChatGPT and its siblings work by ingesting huge quantities of existing text and using it to train a neural structure that models the text’s language — a so-called large language model. They then use that model to provide eloquent answers to arbitrary questions. The answers are often surprisingly useful, saving us from the painstaking work of searching, analyzing, and synthesizing an answer from diverse web sources. The problem is that this process robs the knowledge ecosystem of the interactions we would otherwise have had within it. Each AI engine reads the content only once, to build its model. From then on, all our interactions happen between us, the AI engine, and the model, rather than with the original content.
