Guest Commentary

How SEPTA Can Get a Win With AI

With cuts to SEPTA on the horizon, keeping buses running on time is more important than ever. AI-powered bus lane cameras can help with that, a Drexel AI expert notes, if they are used carefully.

The Southeastern Pennsylvania Transportation Authority piloted a new enforcement tool in Philadelphia in 2023: AI-powered cameras mounted on seven of its buses. The results were immediate and dramatic: In just 70 days, the cameras flagged over 36,000 cars blocking bus lanes.

The pilot gave SEPTA valuable data on how often bus lanes are obstructed, along with insight into how technology can help address the problem.

In May 2025, SEPTA and the Philadelphia Parking Authority officially launched the program citywide. More than 150 buses and 38 trolleys across the city are now fitted with AI-powered cameras that use computer vision to scan license plates for possible violations, such as vehicles blocking bus lanes. If the system flags a possible infraction, a human reviewer confirms it before a fine is issued: $76 in Center City, $51 elsewhere.

This rollout comes as SEPTA faces a $213 million budget shortfall, with imminent service cuts and fare hikes.

I’m a professor of information systems and the academic director of LeBow College of Business’s Center for Applied AI and Business Analytics at Drexel University. The center’s research focuses on how organizations use AI, and what that means for trust, fairness and accountability.

In a recent survey the Center conducted with 454 business leaders from industries including technology, finance, healthcare, manufacturing and government, we found that the use of AI is often rolled out faster than the governance needed to make sure it works fairly and transparently.

That gap between efficiency and oversight is especially common in public sector organizations, according to our survey.

That’s why I believe it’s important for SEPTA to manage its AI enforcement system carefully to earn public trust, while minimizing risks.

Fairness and transparency

When cars block a bus lane, they clog traffic. The resulting delays can mess up a person’s day, causing missed connections or making riders late for work. That can leave riders with the feeling they can’t rely on the transit system.

So, if AI enforcement helps keep those lanes clear, it’s a win. Buses move faster, and commutes are quicker.

But here’s the issue: Good intentions aren’t enough if the system feels unfair or untrustworthy. Our survey also found that more than 70 percent of organizations don’t fully trust their own data. In the context of public enforcement, whether by transit agencies or parking authorities, that’s a warning sign.

Without trustworthy data, AI-powered ticketing can turn efficiency into costly mistakes, such as wrongly issued citations that must be refunded, lost staff time correcting errors, and even legal challenges. Public confidence matters here because people are most likely to follow the rules and accept penalties when they see the process as accurate and transparent.

Furthermore, this finding from our survey really caught my attention: Only 28 percent of organizations report having a well-established AI governance model in place. Governance models are the guardrails that keep AI systems trustworthy and aligned with human values.

That’s troubling enough when private companies are using AI. But when a public agency like SEPTA looks at a driver’s license plate and sends the driver a ticket, the stakes are higher. Public enforcement carries legal authority and demands a higher level of fairness and transparency.

The AI label — or framing — effect

One may ask, “Isn’t this ticketing system just like red-light or speed cameras?”

Technically, yes. The system detects rule-breaking, and a human reviews the evidence before a citation is issued.

But simply labeling the technology as AI can transform how it’s perceived. This is known as the framing effect.

Just calling something AI-driven can make people trust it less. Research has shown that, whether a system is grading papers or hiring workers, the exact same process draws more skepticism when AI is mentioned than when it isn’t. People hear “AI” and assume the machine is making judgment calls, so they start looking for flaws. Even when people believe the AI is accurate, the trust gap never fully closes.

That perception means public agencies need to align AI-based enforcement with transparency, visible safeguards and easy ways to challenge mistakes. These measures increase trust in AI-based enforcement.

We’ve seen what can go wrong, and how quickly trust can erode, when an AI-based enforcement system malfunctions. In late 2024, AI cameras on Metropolitan Transportation Authority buses in New York City wrongly issued thousands of parking tickets, including nearly 900 cases where the drivers had actually followed the rules and parked legally.

Even if such errors are rare, they can damage public confidence in the system.

Build trust into the system

The Organization for Economic Cooperation and Development, the international body setting AI policy standards across dozens of countries, has found that people are most likely to accept AI-driven decisions when they understand how those decisions are made and have a clear, accessible way to challenge mistakes.

In short, AI enforcement tools should work for people, not just on them. For SEPTA, that could mean the following:

    • Publish clear bus lane rules and any exceptions, so people know what’s allowed.
    • Explain safeguards, like the fact that every bus camera violation is reviewed by Philadelphia Parking Authority staff before a ticket is issued.
    • Offer a straightforward appeals process, including management review of contested citations.
    • Share enforcement data, such as how many violations and appeals are processed.

These steps signal that the system is fair and accountable, helping shift it from feeling like a ticketing machine into a public service that people can trust.


Murugan Anandarajan is Professor of Decision Sciences and Management Information Systems at Drexel University.


Photo: Han Zheng, CC BY-SA 2.0, via Wikimedia Commons
