At the intersection of technology and social justice

The RJxTP Program offers courses that promote equity in an increasingly data-driven world.

November 24, 2025

By Sarah C. Baldwin

Artificial intelligence is transforming our world at a dizzying pace, including in education and workforce development. According to the World Economic Forum, people are now more than twice as likely to acquire AI skills as they were in 2018. On LinkedIn, the number of AI literacy skills added by members increased by 177% in 2023.

But technology does not benefit all people equally. Established in 2022, the Racial Justice x Technology Policy (RJxTP) program in Heller’s Institute for Economic and Racial Equity aims to foster an equitable technology ecosystem by equipping scholars and professionals with the knowledge and skills they need to address and mitigate the negative effects of technology, such as algorithmic bias in AI.

Microcredentials for industry and learners

To meet the needs of both industry and today’s learners, RJxTP is developing a slate of eight-week online certificate courses, or microcredentials. The first two, also among the first such courses offered at Brandeis, launched in January and April 2025.

“Our overarching goal is for learners to understand AI and algorithmic bias through the lens of equity,” says Ezra Tefera, MS GHPM’22, director of RJxTP. “We decided to start with cybersecurity and health care because of their immediate implications for personal privacy, public health and social justice issues.”

Taught by Erich Schumann, adjunct professor at Brandeis’ School of Business and Economics and an expert in fraud and cyber-risk management, Introduction to Cybersecurity provided students (most of them from the financial sector) with a broad understanding of digital threats and the skills to mitigate them. The course also explored how emerging technologies like AI can reinforce systemic disparities, using cases such as mortgage-lending algorithms to illustrate real-world risks.

Artificial Intelligence and Healthcare, taught by health law expert Natasha Williams, PhD’03, focused on the impact of algorithmic bias on health disparities and on ways AI can be used to reduce them. One case study showed that while AI uses electronic medical-records data to drive triaging in hospitals, members of vulnerable communities are often not even represented in the data due to lack of access to health care for systemic reasons, such as poverty.

Ethical and responsible application of AI

The response to the courses was overwhelmingly positive: the retention rate was 100%, and most students said they would recommend their course to a colleague and expressed interest in pursuing more microcredentials in the same hybrid format.

Tefera notes that those who took the cybersecurity course included precollege and college learners, early- and mid-career professionals, and executives, while the majority in the AI and health care class were students in Heller’s master’s program in Global Health Policy and Management. In addition to reading case studies, students in the latter group were required to write policy recommendations.

“These individuals are going to be developing public policies and institutional guidelines, among other strategies,” he explains. “So not only will they benefit on a personal basis as they make decisions in their day-to-day lives, but on a larger scale, they will craft better policy briefs and come up with mitigation strategies.”

Students who enrolled in the courses hailed from the U.S. as well as from Africa, Latin America and the Caribbean, but despite their diverse backgrounds and industries, “the common denominator for all of them is that they’re driven by the motivation to understand and apply AI ethically and responsibly, whether it’s in their studies, jobs or daily lives,” says Tefera.

“We must recognize AI’s immense potential, as well as its dangers, as it shapes the world moving forward,” he adds. “Our goal should always be to ensure that these technologies do not amplify systemic disparities.”