Modernising a legacy computer vision solution for defence
1 March 2026 • Defence sector client • Defence / Computer Vision
Legacy ML systems accumulate technical debt that slows feature development, widens the attack surface, and makes every deployment a risk. When the system runs on-premise at customer sites in a defence context, the cost of getting it wrong is high. Modernising is not just about adopting new tools: it is about establishing the engineering practices that let a team move fast without breaking things.
The challenge
Our client operated a computer vision and ML solution that had been running in production for many years in defence applications. The system delivered real value and was proven in the field, but years of incremental development had left it with deep technical debt. Adding new features was slow and risky, and the team spent more time working around the system than improving it.
The core problems:
- A monolithic Python 3.7 codebase with Conda for dependency management and a legacy database. Python 3.7 had reached end-of-life, meaning no security patches and no access to modern language features or performance improvements
- The tightly coupled architecture meant any change could break unrelated parts of the system. This made new feature development slow and unpredictable
- Development could only happen on a specific shared server. The server was often in a poor state: broken environments, conflicting dependencies, stalled processes. Engineers were regularly blocked before they could write a line of code
- A basic CI/CD pipeline existed but had no automated testing, no code quality gates, and no vulnerability scanning. Issues were caught late, often after deployment
- No established DevOps or MLOps practices: no structured code review, no automated quality enforcement, no reproducible builds
- The system was deployed on-premise at customer sites where security, performance, and uptime are critical
Our approach
We took the solution from legacy technical debt to modern DevOps and MLOps best practices, working incrementally to avoid disrupting ongoing delivery.
Stack modernisation
Python was upgraded from 3.7 to 3.12, restoring active security support and unlocking modern language features and significant performance improvements. Conda was replaced with uv for fast, deterministic dependency resolution and reproducible builds. The legacy database was migrated to PostgreSQL. Across the board, outdated tooling was replaced with modern, actively maintained alternatives chosen for long-term supportability.
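With uv, the dependency set lives in a standard pyproject.toml and resolves to a lockfile (uv.lock) that makes builds reproducible across machines. A minimal sketch of what such a file can look like - the project name and dependencies here are placeholders, not the client's actual configuration:

```toml
# Illustrative pyproject.toml; names and versions are placeholders.
[project]
name = "cv-solution"
requires-python = ">=3.12"
dependencies = [
    "numpy>=1.26",
    "psycopg[binary]>=3.1",  # PostgreSQL driver
]

[dependency-groups]
dev = [
    "pytest>=8.0",
    "ruff>=0.4",
]
```

Running `uv sync` resolves this to a pinned lockfile, so every developer and every CI job installs exactly the same dependency tree.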
Modular architecture
The monolith was broken into modular components with clear interfaces. Each module - covering areas such as data ingestion, model inference, and API layers - can be developed, tested, and deployed independently. This reduced the blast radius of changes: an update to one module no longer risks breaking the rest of the system. Critically, it means new features can be built and shipped without navigating a tangled dependency graph. Components can be swapped out or upgraded individually as requirements and tooling evolve.
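The pattern behind "clear interfaces" can be sketched with Python's `typing.Protocol`: callers depend on an interface, not a concrete backend, so implementations can be swapped without touching calling code. The class and method names below are illustrative, not the client's actual modules:

```python
from typing import Protocol


class InferenceEngine(Protocol):
    """Interface every inference backend must satisfy."""

    def predict(self, frame: bytes) -> list[str]: ...


class LegacyEngine:
    """Stand-in for the old, tightly coupled inference path."""

    def predict(self, frame: bytes) -> list[str]:
        return ["vehicle"]


class ModernEngine:
    """Stand-in for an upgraded backend with the same interface."""

    def predict(self, frame: bytes) -> list[str]:
        return ["vehicle", "person"]


def run_detection(engine: InferenceEngine, frame: bytes) -> list[str]:
    # The pipeline depends only on the InferenceEngine interface,
    # so backends can be upgraded or replaced independently.
    return engine.predict(frame)
```

Because `run_detection` never imports a concrete engine, replacing `LegacyEngine` with `ModernEngine` is a one-line change at the composition point rather than a cross-cutting refactor.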
Development environment
We simplified the development setup so engineers can work locally for day-to-day development and use shared servers only for larger jobs and resource-intensive processes. No more dependency on a single shared server being in a working state. The setup is reproducible and deterministic: a new developer can be productive within minutes instead of losing hours to environment debugging. This alone removed one of the biggest day-to-day blockers to feature development.
CI/CD and DevOps practices
End-to-end GitLab CI pipelines were built covering linting, automated testing, SonarQube code analysis, security scanning, and deployment. Every merge request runs the full quality suite before code is reviewed. SonarQube enforces quality gates on code coverage, duplication, and known vulnerabilities, ensuring that security and code quality are checked automatically on every change rather than relying on manual review alone. Structured code review became standard practice, and the pipeline gives the team confidence that what ships to customer sites is tested, secure, and reliable.
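A pipeline of this shape can be expressed in a `.gitlab-ci.yml` along the following lines. The stage names, job names, and images below are illustrative placeholders, and the SonarQube job assumes the `sonar-scanner` CLI image; the client's actual pipeline will differ in detail:

```yaml
# Illustrative .gitlab-ci.yml sketch; jobs, images, and tools are
# placeholders, not the client's actual pipeline.
stages: [lint, test, analyse, scan]

lint:
  stage: lint
  image: python:3.12
  script:
    - pip install ruff
    - ruff check .

test:
  stage: test
  image: python:3.12
  script:
    - pip install uv
    - uv sync
    - uv run pytest --cov --cov-report=xml
  artifacts:
    paths: [coverage.xml]

sonarqube:
  stage: analyse
  image: sonarsource/sonar-scanner-cli:latest
  script:
    - sonar-scanner  # enforces gates on coverage, duplication, vulnerabilities

dependency_scan:
  stage: scan
  image: python:3.12
  script:
    - pip install pip-audit
    - pip-audit  # flags dependencies with known CVEs
```

Because every merge request runs all stages before review, a failing quality gate blocks the merge automatically rather than relying on a reviewer to spot the problem.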
Results
The modernisation transformed the team's ability to develop, maintain, and deliver with confidence.
- >90% more pull requests: faster feature development
- 2x code coverage: security and quality gates
- Python upgraded from 3.7 to 3.12: secure, high-performance stack
- Local + server development: reproducible setup, no blockers
The solution is now built on the DevOps and MLOps practices that enable sustainable, high-velocity development. Components can be updated independently without risking the broader system. Automated quality and security gates catch issues before they reach production. New features that previously took weeks of careful, risky changes can now be developed, tested, and shipped with confidence to on-premise customer sites where performance and reliability are non-negotiable.
Client feedback
“We used to dread merges. Now we ship features without stepping on each other.”
“New developers are productive on day one instead of spending weeks fighting the environment.”
Working on a similar challenge?
We build AI systems for defence and critical infrastructure clients across Northern Europe. Let's talk about what's possible for your environment.