How is Glasswall using AI to accelerate security innovation?
At Glasswall, we view GenAI and Machine Learning as powerful tools for accelerating security innovation and effectiveness. Over the past three years, we have carefully designed and implemented a strategy that has taken us from early-stage experimentation to operational integration across the business.
This approach is reflected in our product design and development roadmap, including an exciting announcement we’ll soon make about extending Machine Learning within our core file security technology. But more on that later . . .
From strategy to structured execution
As part of our ongoing AI adoption process, we recently organised a company-wide AI hackathon to explore new ideas in a practical, hands-on format. The overall objective was simple: to apply AI to additional business and customer challenges and evaluate its potential value.
Having established clear evaluation criteria in advance, such as impact on development velocity, code quality, maintainability and long-term supportability, the hackathon teams were given free rein to explore how AI tools could deliver additional benefits - in a safe, ringfenced environment. Where appropriate, they were also asked to apply software development guardrails and standards, such as validating outputs and ensuring accountability for the code they produced.
But crucially, this wasn’t just a technical exercise; it was extended to other core operational teams, including Product, Pre-sales and Marketing. “At its core, Glasswall is a world-class software development business where AI is already playing an important role, but we also understand that AI can offer significant efficiency and performance benefits across the entire organisation,” said Paul Farrington, Chief Product and Marketing Officer.
“Our AI strategy extends well beyond product development. It’s not just a tool for technologists — it’s a capability for every team. The hackathon was about applying AI in ways that directly support our broader business objectives. Glasswall operates in highly constrained, often disconnected environments, so AI can’t be a black box or a cloud dependency. It must be traceable, deterministic, and aligned with a zero-trust philosophy. That’s non-negotiable.”
What the hackathon delivered
So, what did the day deliver? In short, the results were impressive, with our 13 cross-functional teams tackling 50 challenges, covering everything from product development and workflow optimisation to internal tooling.
Practical outputs emerged from both technical and non-technical teams and included enhancements to development workflows, internal analysis, commercial enablement processes and financial monitoring and reporting.
By the end of the day, these combined efforts delivered 42 structured solutions, over 280,000 lines of C# and Python code, and 500 test files. This resulted in the creation of 3.5GB of structured repositories, supporting artefacts and defined project structures.
Code, documentation and coverage testing were developed concurrently, reflecting production expectations rather than isolated experimentation. AI tools were applied directly within existing development environments and toolchains, not as a standalone novelty layer.
Each team presented its solutions, with particular emphasis on the impact on development and long-term maintainability. In addition to recognising the highest-scoring submissions, we also introduced two fun awards: one for the most unexpected hallucination and another for the most determined battle with an AI tool.
“The hackathon was a huge success,” said Marc Robinson, Chief Technology Officer. “It demonstrated that disciplined AI integration can increase throughput while preserving code quality, security and maintainability. It also reinforced our view that process acceleration and governance are not mutually exclusive when AI is embedded within established guardrails.”
“Our focus is shifting from AI-assisted to AI-driven outcomes — where the technology delivers measurable value, not just potential,” said Doviana Tollaku, Operations Manager and AI Ambassador. “These hackathons help us identify which ideas are ready to move beyond experimentation and become operational capabilities across the business.”
Farrington concluded: “This direction underpins an announcement we’ll be making shortly about the next phase of machine learning in our file security platform. Rather than treating it as an overlay, we’re embedding machine learning directly into how file-based threats are analysed and understood. It builds on the deep Content Disarm and Reconstruction foundations that define our approach to Zero Trust file protection, supported by rigorously trained proprietary models designed to predict threats within files.”
Stay tuned for more news in the coming weeks.