
Engineering an AI-Native Code Intelligence for a Global Cybersecurity Leader

Modernizing the core of infrastructure vulnerability management: Altimi decoded a proprietary scanning language to transform legacy technical debt into a machine-interpretable knowledge base.

Challenge

De-risking a Fragile Ecosystem

The scanner relies on thousands of specialized scripts written in a proprietary language. Over decades, this created an environment where:

Cascading Regressions: Massive script dependencies made refactoring risky; one change could crash entire scan chains.

Performance Decay: Redundant network calls and inefficient logic caused scan slowdowns and false positives.

Logic Bottlenecks: Manual code reviews missed hidden bugs like variable shadowing and infinite loops.

The AI Barrier: Standard LLMs struggled with the niche language syntax, preventing accurate automated modernization.

Solution

The Static Analysis Engine

We implemented a multi-layered analysis pipeline using Golang, SCIP, and Tree-sitter to bridge the gap between legacy code and AI.

Lexer & Parser

A custom engine to process raw source code into structured tokens, handling historical idioms.
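As an illustration of this tokenization stage, the sketch below shows in Go how a lexer can group identifiers, numbers, and symbols from a NASL-like statement. The token kinds and the `Tokenize` function are simplified stand-ins for illustration, not the production grammar.

```go
package main

import (
	"fmt"
	"unicode"
)

// TokenKind and the token set below are illustrative, not the real grammar.
type TokenKind int

const (
	IDENT TokenKind = iota
	NUMBER
	SYMBOL
)

type Token struct {
	Kind TokenKind
	Text string
}

// Tokenize splits a tiny NASL-like statement into tokens,
// skipping whitespace and grouping identifiers and numbers.
func Tokenize(src string) []Token {
	var toks []Token
	runes := []rune(src)
	for i := 0; i < len(runes); {
		r := runes[i]
		switch {
		case unicode.IsSpace(r):
			i++
		case unicode.IsLetter(r) || r == '_':
			j := i
			for j < len(runes) && (unicode.IsLetter(runes[j]) || unicode.IsDigit(runes[j]) || runes[j] == '_') {
				j++
			}
			toks = append(toks, Token{IDENT, string(runes[i:j])})
			i = j
		case unicode.IsDigit(r):
			j := i
			for j < len(runes) && unicode.IsDigit(runes[j]) {
				j++
			}
			toks = append(toks, Token{NUMBER, string(runes[i:j])})
			i = j
		default:
			toks = append(toks, Token{SYMBOL, string(r)})
			i++
		}
	}
	return toks
}

func main() {
	for _, t := range Tokenize("if (port) exit(0);") {
		fmt.Printf("%d %q\n", t.Kind, t.Text)
	}
}
```

A real lexer additionally handles strings, comments, and legacy idioms, but the token stream it emits feeds the parser in exactly this shape.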

AST & CFG Generation

Mapping logic paths to detect dead code and potential resource leaks automatically.
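The dead-code detection this enables reduces to a reachability walk over the control-flow graph: any block with no path from the entry is unreachable. A minimal Go sketch, where the `Block` layout and block names are illustrative rather than the engine's actual data model:

```go
package main

import "fmt"

// Block is a CFG node; Succs holds indices of successor blocks.
type Block struct {
	Name  string
	Succs []int
}

// Unreachable returns the names of blocks not reachable from the
// entry block (index 0), i.e. dead code in the control-flow graph.
func Unreachable(cfg []Block) []string {
	seen := make([]bool, len(cfg))
	stack := []int{0}
	for len(stack) > 0 {
		n := stack[len(stack)-1]
		stack = stack[:len(stack)-1]
		if seen[n] {
			continue
		}
		seen[n] = true
		stack = append(stack, cfg[n].Succs...)
	}
	var dead []string
	for i, b := range cfg {
		if !seen[i] {
			dead = append(dead, b.Name)
		}
	}
	return dead
}

func main() {
	// "afterExit" has no incoming edge: code placed after an
	// unconditional exit() call never executes.
	cfg := []Block{
		{"entry", []int{1}},
		{"check", []int{2}},
		{"exit", nil},
		{"afterExit", []int{2}},
	}
	fmt.Println(Unreachable(cfg)) // [afterExit]
}
```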

Graph Dependency Maps

Visualizing all connections between scripts to ensure predictable impact analysis.
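Impact analysis over such a map comes down to a reverse-dependency traversal: invert the include edges, then walk outward from the changed file. The sketch below uses a simple in-memory map with invented script names; the production system persists the graph in a database, but the core walk looks like this:

```go
package main

import (
	"fmt"
	"sort"
)

// Impacted returns every script that transitively includes `changed`,
// using a reverse-dependency walk over deps (script -> files it includes).
func Impacted(deps map[string][]string, changed string) []string {
	// Invert the edges: included file -> scripts that include it.
	rev := map[string][]string{}
	for script, includes := range deps {
		for _, inc := range includes {
			rev[inc] = append(rev[inc], script)
		}
	}
	seen := map[string]bool{}
	queue := []string{changed}
	for len(queue) > 0 {
		cur := queue[0]
		queue = queue[1:]
		for _, parent := range rev[cur] {
			if !seen[parent] {
				seen[parent] = true
				queue = append(queue, parent)
			}
		}
	}
	out := make([]string, 0, len(seen))
	for s := range seen {
		out = append(out, s)
	}
	sort.Strings(out)
	return out
}

func main() {
	deps := map[string][]string{
		"http_login.nasl": {"http_func.inc"},
		"cgi_scan.nasl":   {"http_func.inc", "misc_func.inc"},
		"ssh_check.nasl":  {"ssh_func.inc"},
	}
	fmt.Println(Impacted(deps, "http_func.inc"))
	// [cgi_scan.nasl http_login.nasl]
}
```

Running this before a refactor answers the critical question up front: which scan chains can this one change break?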

Real-time LSP Engine

Immediate developer feedback directly in the IDE, shifting validation to the workstation.
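The feedback itself travels as standard Language Server Protocol messages: after analyzing a script, the server pushes a `textDocument/publishDiagnostics` notification to the editor. A minimal sketch of that payload in Go, with invented file URIs and messages:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Position, Range, and Diagnostic follow the LSP wire format.
type Position struct {
	Line      int `json:"line"`
	Character int `json:"character"`
}

type Range struct {
	Start Position `json:"start"`
	End   Position `json:"end"`
}

type Diagnostic struct {
	Range    Range  `json:"range"`
	Severity int    `json:"severity"` // 1 = Error, 2 = Warning
	Source   string `json:"source"`
	Message  string `json:"message"`
}

type PublishParams struct {
	URI         string       `json:"uri"`
	Diagnostics []Diagnostic `json:"diagnostics"`
}

// DiagnosticsNotification builds the JSON-RPC notification the server
// pushes to the editor after analyzing a saved script.
func DiagnosticsNotification(uri string, diags []Diagnostic) ([]byte, error) {
	return json.Marshal(map[string]any{
		"jsonrpc": "2.0",
		"method":  "textDocument/publishDiagnostics",
		"params":  PublishParams{URI: uri, Diagnostics: diags},
	})
}

func main() {
	msg, _ := DiagnosticsNotification("file:///plugins/http_login.nasl", []Diagnostic{{
		Range:    Range{Position{41, 2}, Position{41, 14}},
		Severity: 2,
		Source:   "nasl-analyzer",
		Message:  "variable 'port' shadows an outer declaration",
	}})
	fmt.Println(string(msg))
}
```

Because the protocol is editor-agnostic, the same server works in any LSP-capable IDE without per-editor plugins.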

Results

Key Outcomes & Benefits

Measurable Business Value

Systematic Bug Detection: Immediate identification of hundreds of hidden logic errors and performance leaks.

Accelerated Delivery: AI-assisted development significantly cut implementation time compared to traditional methods.

Precision AI Refactoring: LLMs now operate on structured AST data rather than raw text, eliminating hallucinated edits during automated updates.

Optimized Performance: Elimination of redundant calls resulted in faster scans for end customers.

Summary

This project proved that even in a niche legacy environment burdened by years of technical debt, precise, tool-based automation can deliver breakthrough results. Building a dedicated static analysis system for the proprietary language radically reduced maintenance costs and restored full control over the script ecosystem. The tool has become the foundation for further transformation, enabling safe migration and stable development with minimal risk of regression.

FAQ

From legacy chaos to structured intelligence in NASL systems

What system-level limitations of NASL caused the biggest technical debt?

The main drivers were the language’s non-standard semantics, lack of consistent coding standards, and tightly coupled script dependencies that made safe refactoring extremely difficult.

Why was static analysis more effective than a manual approach?

Because it enabled deterministic analysis of the full dependency graph and control-flow structure, removing uncertainty and human error from manual code reviews.

How does the system detect logic and performance issues in the code?

Through a combination of AST, CFG, and rule-based analysis that identifies anti-patterns, dead code, inefficient execution paths, and incorrect logical conditions.
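One such rule, variable shadowing, can be sketched as a scope-stack walk over declarations extracted from the AST. The `Decl` type and its depth encoding below are simplified assumptions for illustration, not the analyzer's real data model:

```go
package main

import "fmt"

// Decl is a variable declaration with the lexical depth it appears at.
type Decl struct {
	Name  string
	Depth int
}

// ShadowWarnings walks declarations in source order with a scope stack
// and reports any name redeclared inside a nested scope.
func ShadowWarnings(decls []Decl) []string {
	var warns []string
	scopes := []map[string]bool{{}}
	for _, d := range decls {
		// Grow or shrink the stack until it matches the declaration depth.
		for len(scopes)-1 < d.Depth {
			scopes = append(scopes, map[string]bool{})
		}
		for len(scopes)-1 > d.Depth {
			scopes = scopes[:len(scopes)-1]
		}
		for _, outer := range scopes[:len(scopes)-1] {
			if outer[d.Name] {
				warns = append(warns, fmt.Sprintf("'%s' shadows an outer declaration", d.Name))
			}
		}
		scopes[len(scopes)-1][d.Name] = true
	}
	return warns
}

func main() {
	// local_var port; { local_var port; }  -- the inner one shadows.
	decls := []Decl{{"port", 0}, {"soc", 0}, {"port", 1}}
	fmt.Println(ShadowWarnings(decls))
}
```

Checks for dead code, infinite loops, and redundant calls follow the same pattern: a deterministic rule applied to the AST or CFG, with no reviewer attention required.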

How does the solution handle thousands of interdependent scripts at scale?

By building a dependency graph in Neo4j and using incremental analysis based on SCIP, enabling efficient processing of large, interconnected codebases.
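The incremental part boils down to fingerprinting each script and propagating "dirtiness" through reverse dependencies, so only changed files and their dependents are reanalyzed. The sketch below uses content hashes as a stand-in for index comparisons, with invented file names:

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"sort"
)

// hash fingerprints a script's source; a changed fingerprint
// marks the script (and its dependents) for reanalysis.
func hash(src string) string {
	sum := sha256.Sum256([]byte(src))
	return hex.EncodeToString(sum[:])
}

// Dirty compares stored fingerprints with current sources and returns
// the scripts needing reanalysis, expanded through reverse dependencies.
func Dirty(prev map[string]string, cur map[string]string, revDeps map[string][]string) []string {
	seen := map[string]bool{}
	var queue []string
	for name, src := range cur {
		if prev[name] != hash(src) {
			queue = append(queue, name)
		}
	}
	for len(queue) > 0 {
		n := queue[0]
		queue = queue[1:]
		if seen[n] {
			continue
		}
		seen[n] = true
		queue = append(queue, revDeps[n]...)
	}
	out := make([]string, 0, len(seen))
	for n := range seen {
		out = append(out, n)
	}
	sort.Strings(out)
	return out
}

func main() {
	cur := map[string]string{"http_func.inc": "v2", "http_login.nasl": "v1"}
	prev := map[string]string{"http_func.inc": hash("v1"), "http_login.nasl": hash("v1")}
	rev := map[string][]string{"http_func.inc": {"http_login.nasl"}}
	fmt.Println(Dirty(prev, cur, rev))
	// [http_func.inc http_login.nasl]
}
```

On a codebase of thousands of scripts, this turns a full-corpus reanalysis into a walk over the handful of files a commit actually touches.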

Why did LLM integration become effective only after introducing AST/CFG structure?

Because language models gain precise semantic context only when working on structured representations of code, rather than raw, unstructured text.
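The difference structure makes can be sketched by handing the model a serialized AST subtree instead of raw source. The `Node` shape and field names below are illustrative, not the system's actual schema:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Node is a simplified AST node serialized as structured context
// for the model, in place of raw source text.
type Node struct {
	Kind     string `json:"kind"`
	Name     string `json:"name,omitempty"`
	Children []Node `json:"children,omitempty"`
}

// PromptContext renders the AST as indented JSON, giving the model
// explicit structure (node kinds, names, nesting) to work against.
func PromptContext(root Node) (string, error) {
	b, err := json.MarshalIndent(root, "", "  ")
	return string(b), err
}

func main() {
	// port = get_kb_item(...);  represented as an assignment subtree.
	root := Node{Kind: "Assign", Children: []Node{
		{Kind: "Ident", Name: "port"},
		{Kind: "Call", Name: "get_kb_item"},
	}}
	ctx, _ := PromptContext(root)
	fmt.Println(ctx)
}
```

Given node kinds and scopes explicitly, the model no longer has to guess at the niche language's syntax, which is what made its raw-text output unreliable.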

Transform your operations with our AI-driven tools

Learn More