Pope on AI: Welcome its benefits to humanity, but mitigate its risks

Light of Truth

Slightly more than a week after Pope Francis addressed the G7 Session in Bari, Italy, on artificial intelligence, the Holy Father is reaffirming that the powerful technological advancement must be used ethically, to serve humanity, and that its inherent risks must be mitigated. The Holy Father’s latest words on AI came during his audience on 22 June in the Vatican with participants in the international convention on ‘Generative Artificial Intelligence and Technocratic Paradigm,’ organized by the Vatican’s Centesimus Annus Pro Pontifice.
In his remarks, the Pope thanked those present for their commitment to exploring how AI can help promote human dignity and be put at the service of the disadvantaged.
“I appreciate,” he said, “that the Centesimus Annus has given ample space to this subject, involving scholars and experts from different countries and disciplines, analysing the opportunities and risks related to the development and use of AI.” The Pope likewise warned against the tool acting autonomously, stressing AI “is, and must remain a tool” in human hands. Moreover, the Holy Father warned against artificial intelligence perpetuating a ‘throwaway culture,’ favouring inequality, and making decisions outside of its purview.
Encouraging the participants to continue examining the true purpose of AI, he asked: “Does it serve to satisfy the needs of humanity, to improve the well-being and integral development of people?” Or does it, rather, “serve to enrich and increase the already high power of the few technological giants despite the dangers to humanity?”
This, he said, is the basic question. Since the future of humanity will be played out on the front of technological innovation, he stated, “We must not miss the opportunity to think and act in a new way, with mind, heart and hands,” in order to “direct innovation toward a configuration centered on the primacy of human dignity.” This, he underscored, is not up for discussion.
