This Jax startup is betting on transparency as the next battleground in AI (Courtesy of the Jacksonville Business Journal) — In Jacksonville’s fast-growing tech scene, a new question is taking center stage: not whether artificial intelligence works, but whether businesses can trust it.

Three years after ChatGPT’s debut, AI has become woven into daily operations at hospitals, banks and government agencies across the First Coast. But as adoption accelerates, the pressure to prove transparency — what data models see, how decisions are made and who’s responsible when something goes wrong — is rising just as quickly.

That spotlight is only getting brighter. California recently passed the nation’s first law requiring AI developers to maintain documented data histories and clear audit trails, a framework expected to influence compliance standards nationwide when it takes effect in 2026.

Amid that shift, a Jacksonville-area startup sees an opening. Orange Park–based Foundry360, founded by Jason Gelsomino, has launched a platform built to sanitize sensitive information before it reaches external AI models and automatically generate audit logs — the very capabilities regulators and risk-averse industries are now demanding. It’s part of a broader trend on the First Coast, where local innovators are racing to turn AI’s looming regulatory era into a competitive advantage.

Locally, Mayo Clinic recently unveiled its virtual AI assistant, built entirely in-house, to help nurses summarize patient information within the electronic record and search resources in that context.

Hospitals are home to some of the world's most valuable data, making the protection of that sensitive information exceedingly important. Matt Berseth, the co-founder of NLP Logix, Jacksonville's long-running AI anchor, likened it to the question of who's responsible when a human operates a self-driving car.

“Tesla or whatever technology is doing the driving, but a human is in the loop,” Berseth told the Business Journal. “(A human is) in the car, they’re in the driver’s seat and ultimately they are responsible for that vehicle in the full-driving scenario.”

Foundry360’s tool, called Node2AI, aims to address that vulnerability. In a hypothetical but realistic workflow, an employee pastes a patient summary or internal financial data into ChatGPT. Instead of that identifying information going straight into the public AI model, the tool intercepts the prompt, strips out any identifiers and tokenizes the request, and the response reenters through the firewall into the customer’s environment, Gelsomino explained.

“Let’s use Social Security, birthday, doctor’s appointment — three identifiers. The human brain isn’t going to be able to put those three together. If you put those three things on a piece of paper, the chances of you figuring out that that person is John Smith are slim to none,” he said. “But if you load this up into an AI tool, it has the potential to identify that individual based on other identifiers that it has access to.”
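Foundry360 hasn’t published Node2AI’s internals, but the intercept, de-identify, tokenize and restore pattern Gelsomino describes can be sketched generically. The snippet below is an illustration under that assumption, not the company’s code; the identifier patterns and function names are hypothetical, and a real system would use far more robust detection than two regular expressions.

```python
import re
import uuid

# Hypothetical identifier patterns for illustration only; production systems
# would rely on stronger detection (for example, named-entity recognition).
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "date": re.compile(r"\b\d{2}/\d{2}/\d{4}\b"),
}


def sanitize_prompt(prompt: str) -> tuple[str, dict]:
    """Swap identifiers for opaque tokens before the prompt leaves the firewall."""
    token_map = {}
    for label, pattern in PATTERNS.items():
        for match in set(pattern.findall(prompt)):
            token = f"<{label}:{uuid.uuid4().hex[:8]}>"
            token_map[token] = match  # the real value never leaves the environment
            prompt = prompt.replace(match, token)
    return prompt, token_map


def restore_response(text: str, token_map: dict) -> str:
    """Map tokens back to the original values once the reply is inside the firewall."""
    for token, original in token_map.items():
        text = text.replace(token, original)
    return text


if __name__ == "__main__":
    raw = "Patient SSN 123-45-6789, follow-up appointment 04/12/2025."
    clean, tokens = sanitize_prompt(raw)
    print(clean)                             # what the external model would see
    print(restore_response(clean, tokens))   # what the employee sees afterward
```

The external model only ever sees the opaque tokens; the mapping back to real values stays inside the customer’s environment, which is the point Gelsomino makes about keeping identifiers from being combined outside the firewall.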

It’s built specifically for companies in the health care, finance and government sectors, where protecting sensitive information is exceedingly important. On top of the sanitization, Node2AI creates an audit log, helping companies document and verify their AI transactions.
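The audit-log side of the pattern is just as simple to illustrate. Again, this is a hypothetical sketch rather than Node2AI’s implementation: it appends one JSON line per outbound request and stores a hash of the sanitized prompt, so the log itself holds no sensitive text.

```python
import hashlib
import json
import time


def log_ai_transaction(user: str, model: str, sanitized_prompt: str,
                       path: str = "ai_audit.log") -> dict:
    """Append one JSON line per outbound AI request to an audit file."""
    entry = {
        "timestamp": time.time(),
        "user": user,
        "model": model,
        # Store a hash, not the prompt, so the log contains no sensitive text.
        "prompt_sha256": hashlib.sha256(sanitized_prompt.encode()).hexdigest(),
    }
    with open(path, "a", encoding="utf-8") as log:
        log.write(json.dumps(entry) + "\n")
    return entry
```

A record like this is the kind of documented trail the new California framework is expected to ask developers and deployers to produce.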

“It’s not an AI platform, it’s an infrastructure layer,” said Gelsomino. “It sits between customers’ infrastructure and the providers.”

While policymakers in Sacramento and Washington debate AI guardrails, Foundry360 is getting in front of the trend, deploying tools built to meet and exceed them.

It’s technology built on a valuable premise, in the words of Berseth, who co-founded NLP Logix in 2011, long before AI took center stage.

Most companies are three years into their journey of using AI on a daily basis, Berseth told the Business Journal — in other words, not a particularly long time.

Because adoption is accelerating faster than many organizations can manage, governance and compliance aren’t always first on the agenda, creating blind spots.

As Berseth explained, companies need to sit down and understand how their employees are using AI and ensure they train on proper usage. Without proper safeguards, it’s not hard for patient records or financial identifiers to slip into external models.

“AI is going to be part of the future of all work and what the markets think about it is going to be short term,” he said. “Long term, it’s here to stay, and I think you need to have a solid strategy around it.”

But this same risk is also an area of opportunity, opening the door for companies like Foundry360 to offer new solutions — a chance to turn regulation into a competitive edge.