By Sara Goudarzi | December 29, 2023
One would be hard pressed to have gone through the past year without hearing about artificial intelligence, be it in casual conversation, in the news, or at the workplace. From concern to excitement, AI seems to have already penetrated many aspects of our lives even if the details of how it works are murky at times. Critics of these applications are concerned about their disruptive qualities, while developers are promising the world. A better world, that is.
In 2023, AI took up a generous amount of real estate in the digital pages of the Bulletin, where authors and editors worked to break through the noise, explain how these applications work, and examine what the risks and possibilities of the tech really were.
Disinformation also made headlines as it continued to spread on social networks, with X, formerly known as Twitter, taking the crown as the biggest offender. The effects of disinformation, which plagued public health and elections over the last several years, have been especially pronounced in the Israel-Hamas war that began in October. False videos, images, and narratives are influencing people’s opinions and perhaps even shaping the war.
Speaking of wars, drones are shaping and reshaping them. The proliferation of off-the-shelf inexpensive drones is leveling the playing field in conflicts to some degree. But the types of drones, how they are used, and under what conditions they’re deployed are important factors in how they’re changing the battleground.
The war in Ukraine shows the game-changing effects of drones depend on the game
By Dominika Kunertova
The Russia-Ukraine war and the recent Hamas attack that penetrated Israel’s famous Iron Dome are both examples of how drones have changed, and continue to alter, warfare. Increasingly, new types of unmanned aerial vehicles are called game-changing. But using Russia’s invasion of Ukraine as a case study, emerging technology researcher Dominika Kunertova argues that not all drones are created equal in their uses and abilities, and that their game-changing potential depends on the specific battle in which they’re deployed.
Popping the chatbot hype balloon
By Sara Goudarzi
Chatbots seem like a technological marvel. Ask them a question and you have an answer within seconds. Need a cover letter written for a job application? Boom! ChatGPT can draft you one in no time. It’s like magic. But is it? In this piece, I explored how these systems work; if they are really magic (they aren’t) or can think on their own (they can’t); the threats they pose; and the hundreds of thousands of humans behind the scenes who help make them run smoothly.
How creatives can stop AI from stealing their work
By Nick Vincent
The Writers Guild of America (WGA) strike against the Alliance of Motion Picture and Television Producers began in May and lasted 148 days. The 11,000 writers of the guild were asking that chatbots not be used to write source material. For more dispersed creatives, such as freelance illustrators and writers, unions might not exist or be as robust as the WGA or the Authors Guild. But there are ways creatives can fight back when developers use their text and images to train AI models—without permission or compensation. In this piece, computing science researcher Nick Vincent goes through some of the ways creatives can fight the unauthorized use of their work.
Narrative warfare: How disinformation shapes the Israeli-Hamas conflict—and millions of minds
By Yusuf Can
After the October 17 explosion that took place in the parking lot of al-Ahli Arab Hospital in Gaza, social media channels were flooded with unverified theories and images as evidence of which side of the Israeli-Hamas conflict was to blame for the strike. The effect? Many made up their minds before experts had a chance to conduct a thorough analysis. “Disinformation doesn’t simply get people to believe a false thing is true; it also convinces them to think a true thing is false,” Wilson Center program coordinator Yusuf Can writes. The disinformation problem, which goes beyond public belief, can lead to dire humanitarian consequences, argues Can, who uses this conflict in Gaza as a case study of what disinformation can do to shape opinions and wars.