How EVOS AI Came to Be

It started as an internal tool to protect our IP. It became a new way for businesses to run AI inside their own infrastructure.

Jack Holmes

Here’s how EVOS AI actually started

Not as a product. Not as a pitch deck. But as a necessity.

At EVOS, we’ve always been an engineering-led company. Building EV charging hardware and energy platforms, and managing complex supply chains, meant we were constantly dealing with sensitive data — from product design to pricing models to supplier negotiations.

When AI started becoming useful, we were early adopters. Like most companies, we experimented with the usual tools and APIs. And like most companies, we quickly realised something didn’t sit right.

We were sending valuable information — our IP — outside the business.

That included:

  • Product designs

  • How we negotiate with suppliers

  • How we structure deals and pricing

  • How we plan manufacturing and operations

  • HR documents

For us, that wasn’t acceptable.

So we did what engineers do: we built our own. We called it Dingo.

Instead of relying on external providers, we started deploying our own AI infrastructure.

We ran open-source models.
We hosted everything ourselves.
We built internal tools around it.

At first, it was just about control and learning. Keeping our data inside our environment.

But then something interesting happened.

The AI got better — not just because of the models, but because it was connected to how our business actually worked.

We didn’t use AI as a tool on the side. We embedded it into the core of EVOS.

We used it to:

  • Plan manufacturing and forecast demand

  • Optimise energy usage across our platform

  • Support sales and marketing decisions

  • Improve internal workflows and reporting

Because it was running inside our environment, it had context. It understood our systems, our data, and our way of operating.

That’s when it stopped feeling like “AI” — and started feeling like infrastructure.

As we shared what we were doing, something unexpected happened.

Friends, partners, and other business owners saw our setup and started asking the same question:

“Can you set this up for us?”

They had similar concerns:

  • Protecting customer data

  • Keeping IP inside the business

  • Avoiding dependency on external providers

  • Getting more control over how AI was used

At first, we helped a few companies informally. Our engineering team would deploy private AI environments, configure models, and connect them to internal systems. We did this for the love, not for the money. Plus, we just enjoy building things. Our CTO loves nothing more than a road trip to Umart.

But the pattern became clear.

This wasn’t just our problem. It was a market shift. Most businesses we spoke to had the same fears about AI.

After enough requests, we made a decision.

Instead of building one-off setups, we would productise what we had built.

Take everything we learned at EVOS:

  • Hardware

  • Model deployment

  • Platform layer

  • Ongoing optimisation

And turn it into something simple:

A turnkey private AI environment, deployed inside a business, delivered as a service. We were already providing EV charging as a service to businesses, so why not AI?

Something businesses could run inside their own environment. Something they could control. Something that could scale with them as AI became more central to how they operate.

EVOS AI wasn’t created because we wanted to start another product.

It was created because we needed to solve a real problem inside our own business.

Protecting IP.
Controlling costs.
Owning how AI is used.

What started as an internal system is now becoming a new way for businesses to think about AI.

If you’re thinking about AI in your business, start with this question:

Do you want to use AI…
Or do you want to own it?

You’re using AI. You just don’t own it.

Let's change that =>