This is an interactive reading experience by Joyce Ho, created in
Spring 2023 at RISD, that allows readers to take the point of view
of a human journalist and weigh the newsroom's code of ethics
against AI's algorithms.
This project is advised by Professor Marisa Mazria Katz.
NYT Upshot by Aatish Bhatia
"Watch an A.I. Learn to Write by Reading Nothing but Jane Austen"
Get an understanding of how generative A.I. generates its responses based on algorithmic calculations. In this article, The New York Times trained a mini A.I. to learn human language by modeling only the complete works of Jane Austen and other long texts.
The article explores the process of training a language model called BabyGPT on a small amount of text from Jane Austen's works. Over time, BabyGPT learns to generate increasingly coherent sentences, demonstrating the learning capabilities of language models. The rapid progress and complexity of language models like ChatGPT raise concerns about their unpredictability and potential for incorrect reasoning, underscoring the need for careful development and oversight of these powerful AI systems.
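BabyGPT in the article is a small transformer trained character by character. As a loose illustration of the same core idea — a model that learns, from a single corpus, which character tends to follow which context, then generates text one character at a time — here is a minimal sketch using a much simpler character-level Markov chain. All names here are illustrative, and this is not the NYT's actual code or architecture.

```python
import random
from collections import defaultdict

def train_char_model(text, order=3):
    """Record which character follows each `order`-length context in the text."""
    model = defaultdict(list)
    for i in range(len(text) - order):
        context = text[i:i + order]
        model[context].append(text[i + order])
    return model

def generate(model, seed, order=3, length=60):
    """Sample characters one at a time, conditioned on the last `order` characters."""
    out = seed
    for _ in range(length):
        followers = model.get(out[-order:])
        if not followers:  # context never seen in training text
            break
        out += random.choice(followers)
    return out

# A single sentence of Austen stands in for the full training corpus.
corpus = ("It is a truth universally acknowledged, that a single man "
          "in possession of a good fortune, must be in want of a wife.")
model = train_char_model(corpus, order=3)
print(generate(model, "It "))
```

With so little text the output quickly echoes the source; the article's point is that the same learning signal, scaled up to far more text and a far richer model, yields increasingly coherent prose.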
"Pause Giant AI Experiments: An Open Letter"
We call on all AI labs to immediately pause for at least 6 months
the training of AI systems more powerful than GPT-4.
2023
The open letter highlights the risks posed by AI systems with human-competitive intelligence and calls for a pause in training more powerful AI systems. It emphasizes the need for shared safety protocols, rigorous oversight by independent experts, and accelerated development of robust AI governance systems to ensure safety, transparency, and societal benefits. The letter urges a careful and well-planned approach to AI development, suggesting a "pause" similar to other technologies that have posed potential risks to society.
01 Propaganda and misinformation: It raises concerns
about the possibility of machines flooding information channels
with propaganda and untruths, which can have detrimental effects
on society.
02 Job automation: The letter questions whether
automation should lead to the replacement of all jobs, including
fulfilling ones, potentially causing significant societal and
economic disruptions.
03 Nonhuman minds outnumbering and outsmarting humans:
There is a concern that the development of nonhuman minds
through advanced AI systems may eventually surpass human
capabilities, potentially leading to obsolescence and loss of
control.
04 Loss of control: The open letter warns about the risks
of losing control over the development and deployment of
powerful AI systems, especially when the systems become
increasingly complex and difficult to understand or
predict.
Certainly! The first five people listed include:
Yoshua Bengio, Founder and Scientific Director at Mila,
Turing Prize winner and professor at University of Montreal
Stuart Russell, Berkeley, Professor of Computer Science,
director of the Center for Intelligent Systems, and co-author of
the standard textbook "Artificial Intelligence: A Modern
Approach"
Bart Selman, Cornell, Professor of Computer Science, past
president of AAAI
Elon Musk, CEO of SpaceX, Tesla and Twitter
Steve Wozniak, Co-founder,
Apple
"Restrict AI Illustration from Publishing: An Open Letter"
2023
The open letter discusses the impact of generative-image AI technology on the field of journalism and illustration. It highlights the risk of human illustrators being replaced by AI-generated illustrations, which are faster and cheaper to produce. The open letter argues that this technology not only threatens the livelihood of artists but also results in the loss of originality and human insight in art. It also raises concerns about copyright infringement and the economic implications of favoring AI over human artists. The open letter calls for a pledge to support human-made art and resist the use of generative-AI images in journalism.
"The team behind the Manyfesto came together around a shared desire to move beyond Western-centric biases in isolation driving global technology as we move forward into the algorithmic era."
The open letter is a call for a decolonial approach to AI technologies, aiming to challenge the dominance of Western perspectives and biases in the language, development, and governance of AI. It emphasizes the need to uncover, question, and reinvent assumptions, and to recognize and address power asymmetries and historical injustices. The goal is to create a space for marginalized voices and cultures to shape AI on their own terms and foster a decolonial imagination.
International Journalism Festival on Friday Apr 21, 2023
14:00 - 14:50
Sala dei Notari, Palazzo dei Priori
"AI is now part of news gathering, production and delivery around
the world, from small specialist newsrooms to global organisations.
But how is it changing the journalism created and what impact will
it have on the industry overall? What are the ethical, economic and
editorial issues at stake as machine learning, text/video
generation, automation and personalisation become drivers of news
creation and consumption?"
Gina Chua
executive editor at Semafor
00:02:40
"Just to start and set the table, right, large language models are
fascinating and they do huge amounts of things. And I think there
is an incredible amount of misunderstanding about what they can
and cannot do. They are language models, not fact models, not
verification models. They don't do any of that stuff well. But
what they do do is they do language incredibly well."
00:04:23
"The key point in that is that I wasn't asking it for facts. I was
saying "take this single corpus of information, single story and
rewrite it in different ways" and it did not introduce errors
because it was working off a single thing."
00:05:45
"We can think about how to regenerate news and reach people in
new forms that I've only dreamed of. Can we create narrative
journalism where you can actually say 'what happened today' and
have the chatbot go 'there was a fire' 'I don't care about the
fire, what's the next story' or you can say 'oh, there was a fire,
where was it? Did anybody die? What was the cause?'…and you can
start thinking about news in very very different ways. Rebuild the
article. Rebuild the news experience beyond using AI to help us do
what we already do."
Lisa Gibbs
Director of News Partnerships, Associated Press
00:07:30
"ChatGPT, unlike other AI and automation technologies,
introduces risks to news intellectual property in ways that have
not existed before, and so when you ask what we are doing now –
that is different than what we were doing before. Now the lawyers
are involved."
00:09:00
"The other thing I'm concerned about is that the rush to sound
smart and try things with GPT is leading newsrooms and journalists
to rush into doing things that they don't have an understanding
of. But more importantly, their newsrooms do not have the
infrastructure to support."
00:10:08
"AP (Associated Press) surveyed 200 mostly smaller and local
newsrooms in the United States a couple of years ago about their
understanding of A.I. and what their technology pain points and
problems were. (...) One of the things we found when we did the
survey of 200 organizations in the US is the lack of technology
knowledge capacity: the majority of them are not even using tools
like audio and video transcription. They are not even automating
basic data into stories or summaries to produce for their
audiences."
Associated Press's AI and Journalism Webinar
mentioned by Lisa Gibbs
Chris Moran
Head of Editorial Innovation at The Guardian
00:25:26
"I found myself asking less about what the technology
specifically can do and more what is journalism for.
What do we do as journalists? Which of those tasks are mundane and
we should try and get rid of to free people up for more
interesting things, and which are the kind of things we don't want
this kind of technology to touch in such a specific way."
Charlie Beckett
Director of Polis, London School of Economics
Panelist at the talk
"New Powers, New Responsibilities" published together with Polis
and the Google News Initiative.
It seems like the best way to implement AI is to have it build
from a given text or content, instead of counting on AI to
generate raw content on its own. AI can be a "layer of
polish."
You can't attend because you had to leave Italy early for family
business.
The elephant in the room: could AI give technology giants
more control over the news?
11:00 - 11:50
Saturday Apr 22, 2023
Auditorium San Francesco al Prato
Panelists include Charlie Beckett, Emily Bell, Nicholas
Diakopoulos, Uli Koppen, and Felix Simon
Uli Koppen, Head of AI + Automation Lab, Co-Lead of BR Data •
Bayerischer Rundfunk • Munich, Germany
She is currently collaborating with diverse teams of
journalists, coders, and designers who specialize in
investigative data stories, interactive storytelling, and the
exploration of innovative research techniques like bots and
machine learning.
information from
Online News Association
I'm happy to help!
00:12:02
Uli Koppen responding to Emily's question: "I thought you said
something very interesting about the dark side of AI as well.
What we don't talk about is how we are using these tools and how
it has affected journalists in the newsroom."
Koppen: It has those two sides, as you said. We should
discuss, as a society, how we want to use algorithms. How do
people now embed them into our systems? Because I think there's
not much to talk about if there's a good use for it. It's not
that the tool is bad or good. It's a tool. We can use it and we
have to decide how we are using it.
00:29:40
It's a great moment to lobby for infrastructure, you know, this
thing journalists don't think is very sexy. So talk to your CEOs
to lobby for APIs and metadata. These are often legacy systems
that no one wants to touch because they're so difficult, but we
have to make those long-term decisions. We have to make conscious
decisions about that. The hype around AI, as I said, is a very
good moment to lobby for that.
00:35:51
Koppen responded to Nicholas Diakopoulos's comment that "We've
been developing techniques for algorithmic accountability, now
AI accountability reporting. We've been developing these
techniques for ten years, and you know
The Markup
is doing it, and there are others who are doing it. We probably
need ten times as much investigation on technology and
society."
Koppen: It's a horizontal layer like climate reporting.
It's not just one silo team working around AI. It helps if you
have specialists, because you need people who have statistical
knowledge for machine learning. So we really appreciate the
specialist team we have. But we're also working on integrating
it into the economy, sports, more or less every beat we
have.
New York Times Ezra Klein Show
Friday, April 7th, 2023
Why A.I. Might Not Take Your Job or Supercharge the Economy
Ezra Klein answers listener questions about how A.I. might change our lives — or not.
In summary, the author reflects on their evolving perspective regarding A.I., considering the potential benefits of a slowdown in development versus the challenges of making it a viable political position. They emphasize the need for a positive view and a clear agenda during any pause, rather than simply delaying progress without a purpose. The author expresses concerns about the dominance of a few tech companies in shaping A.I. development and calls for a public vision for A.I. to ensure its societal implications are not solely determined by market forces. They highlight the importance of interpretability in A.I. systems, enabling a deeper understanding of their decision-making processes and the ability to explain their outputs. However, they acknowledge that current efforts to improve interpretability lag behind the advancements in learning systems.
01
"I tend to be much less confident that A.I. is going to replace a
lot of jobs in the near term than other people seem to be in part
because I don’t think the question is whether a job can be
replaced. There’s also a question of whether we let it be
replaced."
02
"If to have your job as a contract lawyer, or a copy editor, or a
marketer, or a journalist automated away is to become useless in the
eyes of society, then, yeah, that’s not going to be a reassessment
of values. That’s going to be a punishment we inflict on people so
the owners of A.I. capital can make more money."
Talk to Malika about this article you've been working on about AI and journalism.
Generative AI in the newsroom
11:00 - 11:50
Sala delle Colonne, Palazzo Graziani
workshop lecture by Nicholas Diakopoulos, director of the
Computational Journalism Lab at Northwestern University. "In this
workshop I'll demystify these technologies and explore ways in which
they could be productive in various newsroom tasks. Lots of examples
will bring the possibilities to life, showing opportunities to use
such AI models for rewriting text, summarizing or classifying
documents, generating ideas, extracting data, and more."
If A.I. risks causing misinformation when treated as a fact model, how much can we trust A.I. for its image production? How transparent is this algorithmic calculation for us to prevent A.I. from plagiarizing visual sources? Quoting from CAIR's Open Letter: "Media publishing takes intellectual property rights very seriously. Its business would not exist without upholding the laws and values that protect such rights."
“Is [AI] resource or culture?” Gina answered “It’s both but it’s
imagination.”
The prosaic versus the creative side of things that we need to see
in AI brings a myriad of possibilities into narrative journalism.
It can help us imagine ways to tell stories that break
conventional ways of writing. AI can help us improve our output;
it does an amazing job at proofreading. As Gina also
mentioned, AI is a tool that will level us up and create a
different type of competition.
International Journalism Festival on Friday Apr 21, 2023
14:00 - 14:50
“Journalists have been losing jobs at the rate of thousands, and it is
not about AI. This is not the highest challenge."
Charlie Beckett's "New Powers, New Responsibilities" published
together with Polis and the Google News Initiative. 2019
pg. 54
Out of the 71 news organizations from 32 different countries,
60% are concerned about AI's impact on their journalism practice.