Bridge the gap between tech and business strategy

The board talks strategy. IT talks code.

  • __% of organizations don't have a formal process to translate IT spend into KPIs the board can understand. (Gartner)

  • $__B is lost annually to digital downtime, and the risk is rising as companies rush to adopt gen AI with unproven tech. (McKinsey)

  • __% of board members lack regular interaction with CISOs—held back by the difficulty of translating technical jargon into business terms. (Harvard Business Review)

  • __% of CROs state that the use of AI poses a reputational risk to their organization. (World Economic Forum)

Real control requires a shared understanding

Without a clear, shared view of your IT landscape, priorities become reactive, investments lose focus, and risks accumulate.
In a world driven by tech and constant change, you can't afford to fly blind.

It's time to shift up.


As companies strive to innovate and modernize, it becomes critical for IT and business to speak the same language, especially in an age where everything moves faster and managing software landscapes grows more complex.

Bring clarity to the chaos

Take control of your security posture

Clear metrics. Better decisions. Stronger defense.

Know how secure your software portfolio really is. Track your security posture against industry benchmarks and across different criticality levels, and assess how effectively your teams respond to findings.

Let's talk

Maximize productivity across your organization

Deeper insight. Smarter focus. Lower costs.

See where your time is going and monitor whether it’s paying off. The dashboard highlights systems that consume excessive capacity, signals when technical debt is slowing you down, and helps prioritize refactoring efforts where they matter most.

Let's talk

Connect IT execution to business strategy

Clear summaries. Less noise. Confident decisions.

Analyze what matters, without needing a software engineering degree. Reporting on security, productivity, strategic progress, and IT spend is easier than ever. The Management Dashboard translates fact-based technical findings into clear business KPIs automatically, so everyone is on the same page and can move forward together.


Let's talk
  • “With the help of Software Improvement Group, and their platform Sigrid, we can invest more effectively in code quality improvement and development.”

    Petra Hendriksen

    Head of Mission Control at Alliander
  • “Software Improvement Group helped us keep control of technical debt and identify where we needed to focus.”

    Harald Thoonen

    Solution Architect at Rabobank
  • “Tooling like Sigrid provides transparency, allowing us to manage our software proactively and maintain high standards. This is crucial for securely sharing personal data in our digital processes and staying ahead of potential security risks.”

    Kelly Bonneure

    Program Coordinator – MAGDA, Digitaal Vlaanderen
  • “Software Improvement Group helps us access and interpret the data so that we can improve things better and more quickly.”

    Dieneke Schouten

    Operational Director Public Sector Solutions at Centric
  • “It’s more visible what you’re talking about — more direct, not a fussy discussion. We can see what’s going wrong, what’s going good, and what needs more attention.”

    Hans Schreuder

    Managing Director of the Board, SBIR

Turn technical insights into strategic action

Relevant resources

AI for executives: 4 actions the board should take to become AI ready

Read more to learn about AI for executives and the 4 actions the board should take to ensure that your organization is AI-ready.

Technical debt and its impact on IT budgets

The invisible cost of technical debt kills your IT budget without you even knowing. Discover what technical debt is and how to combat it today.

The hidden software security risks business leaders should be aware of

Poor code and open-source vulnerabilities are often overlooked—until they trigger major business disruption. Learn how to find and fix them early.

Frequently asked questions

What kind of insights can executives expect from the Management Dashboard?

Executives can expect insights on security posture, productivity metrics, IT spend, technical debt, development activities, and progress towards defined objectives. These insights are translated into business KPIs that help in making informed, strategic decisions.

What does 'shift up' mean in the context of software governance?

'Shift up' refers to elevating the focus from code-level details to a broader, enterprise-wide perspective. It emphasizes the importance of linking technical execution to strategic business outcomes, ensuring IT decisions are made in alignment with overall business goals.

How do you use the security tab in the dashboard?

The security tab in the Management Dashboard helps organizations evaluate and manage their security processes effectively. It answers three key questions:

  • Are you in control? The top of the security tab covers your overall security compared to the SIG benchmark. It quantifies the impact of your security measures, indicating whether your security posture makes you more or less likely to encounter data breaches.
  • Are you doing the right things? A mature security process involves actively discovering, triaging, and resolving security findings. The security tab displays charts that provide process information on how quickly security findings are being addressed. Effective security management leads to a process that is both consistent and predictable.
  • Are you moving in the right direction? This chart tracks your organization’s progress towards your security objectives. Defining and tracking these objectives gives clear targets to your teams, specifying what is expected from them and ensuring continuous improvement in security measures.
How does SIG's security assessment work?

SIG uses a Static Application Security Testing (SAST) model that ranks software systems from 1 to 5 stars. We evaluate system properties through thorough analysis of the source code, infrastructure, and other artifacts. The scores for the various system characteristics are then mapped to the OWASP Top 10, which identifies the ten most critical web application security risks.

For more information, see our documentation.

What does SIG's security star rating mean?

Our 5-star rating reflects compliance with security best practices:

  • 1 star: Severely low degree of security controls
  • 2 stars: Very low degree of security controls
  • 3 stars: Low degree of security controls
  • 4 stars: Moderate degree of security controls
  • 5 stars: High degree of security controls

It's important to note that even a 5-star rating doesn't guarantee complete security, but it indicates that security considerations have been factored into design and implementation.

How can my teams stay on top of new security findings as quickly as possible?


To ensure teams stay on top of new security findings, Sigrid integrates with communication tools like Slack. Teams receive automatic notifications of new findings in real time, fostering a culture of security awareness and enabling prompt action as issues arise.
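As a rough illustration of what such a notification can look like under the hood, the sketch below posts a message to a generic Slack incoming webhook. The message format, function names, and field values here are our own illustrative assumptions, not Sigrid's actual integration.

```python
import json
from urllib import request


def build_finding_message(system: str, severity: str, summary: str) -> bytes:
    """Build a Slack incoming-webhook payload announcing a new finding.

    Illustrative only: the wording and fields are hypothetical, not
    Sigrid's actual notification format.
    """
    text = f"New {severity} security finding in *{system}*: {summary}"
    return json.dumps({"text": text}).encode("utf-8")


def post_to_slack(webhook_url: str, payload: bytes) -> None:
    """POST the JSON payload to a Slack incoming webhook."""
    req = request.Request(
        webhook_url,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:  # Slack replies with plain "ok"
        resp.read()
```

Slack's incoming-webhook API accepts a simple JSON body with a `text` field, which keeps this kind of notification pipeline small and easy to maintain.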

For more information, see our GitLab page.

What is Sigrid’s method for evaluating high-quality software?

Determining an objective standard for code quality is challenging. High-quality software operates as anticipated, handles unexpected scenarios, is protected against misuse, and adheres to coding quality standards.

All these factors share a common requirement: maintainability. This is a responsibility for developers and associated roles: code should be efficient, effective, functional, and easy to read, especially for the developers who come after. The technical-debt metaphor makes the accumulating effect easy to visualize: over time, the technical flaws within the code incur interest on the initial debt. The metaphor has limits, since not all debt must be repaid (that is a business choice), but it is widely recognized (the alternative term “cruft” never fully entered mainstream use). Technical debt might simply represent unrealized potential, but we often observe that at a certain point teams become caught in a negative cycle, and subpar code quality becomes apparent to developers, clients, and the business. Poor code quality is detrimental to business outcomes; while the reverse does not always hold, understanding code maintainability is clearly essential for delivering business value reliably and efficiently.

For more information, see our documentation.

How do you use the quality tab in the dashboard?

The quality tab in the Management Dashboard helps organizations assess and improve software quality and maintainability. It answers three key questions:

  • Are you in control? The top of the quality tab depicts your maintainability and architecture compared to the SIG benchmark. It quantifies the impact of technical debt on your development speed, helping you understand how maintainability issues affect overall productivity.
  • Are you doing the right things? These charts look at development activity per month. The majority of capacity is expected to go to business-critical systems. If teams spend most of their time on legacy systems month after month, it can indicate that technical debt in those systems has become unmanageable; the chart can then act as a trigger for discussions with the team about addressing and managing that debt.
  • Are you moving in the right direction? This chart tracks your organization’s progress towards your quality objectives. Defining and tracking these objectives gives clear targets to your teams, specifying what is expected from them and ensuring continuous improvement in software quality.
How do you establish a precise benchmark for code quality?

Evaluating code quality typically requires contextual awareness, since each piece of software exists in a very different context. At SIG, we employ technology-agnostic source code analysis to assess quality and compare it to a benchmark. A benchmark matters because it provides an impartial standard to gauge performance against; it is grounded in the current state of the software development market, allowing you to measure your source code against that of others.

To compare across programming technologies, the metrics are abstractions that exist universally, such as the volume of code and the complexity of decision-making pathways. System size can therefore be standardized to person-months or person-years: the amount of developer effort completed over a specific time frame. These figures are likewise derived from benchmarks.
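The idea of normalizing code volume to effort can be sketched in a few lines. The productivity figure below is a hypothetical placeholder purely for illustration; SIG derives its actual per-technology figures from benchmark data.

```python
def person_months(lines_of_code: int, loc_per_person_month: int = 1_000) -> float:
    """Convert code volume to an effort estimate in person-months.

    `loc_per_person_month` is a hypothetical productivity rate used only
    to illustrate the normalization, not a SIG benchmark value.
    """
    return lines_of_code / loc_per_person_month


# A 250,000-line system at the placeholder rate:
estimate = person_months(250_000)  # 250.0 person-months
```

The benefit of this normalization is that systems written in very different technologies can be compared on a single effort scale.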

Sigrid assesses the analysis findings for your system against a benchmark of over 30,000 industry systems. This benchmark set is chosen and rebalanced annually to stay aligned with current software development trends. “Balanced” here means a representative distribution of the system population, covering everything from legacy technologies to contemporary JavaScript frameworks, weighted toward the most commonly used programming languages to best reflect the existing landscape. The metrics in the benchmark approximate a normal distribution, which verifies that the set is a fair representation and enables statistical analysis of the population of software systems.

For more information, see our documentation

What is the Sigrid star rating?

The evaluation score for code quality, when compared to this benchmark, is represented as a star rating ranging from 1 to 5 stars. It adheres to a distribution of 5%-30%-30%-30%-5%. Technically, the scoring metrics span from 0.5 to 5.5 stars. This is a convention meant to prevent a “0” rating, as zero lacks significance on a code quality scale. The central 30% lies between 2.5 and 3.5, with every score in this range designated as 3 stars, symbolizing the market average.

While 50% of systems necessarily score below the average (3.0), only 35% fall beneath the 3-star marker (below 2.5), and another 35% exceed it (above 3.5). To avoid implying extreme accuracy, it helps to read star ratings as ranges; a score of 3.4 stars, for example, means “within the expected range of the market average, toward the higher side.” Note that scores are rounded down, with a maximum precision of 2 decimal places, so a score of 1.49 stars is rounded down to 1 star.
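Taken together, the published ranges imply a simple mapping from a 0.5–5.5 score to a star count. The sketch below is our own illustrative reading of those rules, not SIG's actual scoring code:

```python
import math


def star_rating(score: float) -> int:
    """Map a Sigrid-style quality score (0.5-5.5) to 1-5 stars.

    Each star covers a one-point range centred on its integer, e.g.
    scores from 2.5 up to 3.5 map to 3 stars. Rounding is downward,
    so 1.49 yields 1 star, not 2.
    """
    if not 0.5 <= score <= 5.5:
        raise ValueError("score must be between 0.5 and 5.5")
    return min(5, math.floor(score + 0.5))
```

Under this mapping, a score of 3.4 lands in the central band (3 stars, the market average), consistent with the ranges described above.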

For more information, see our documentation

How does SIG's maintainability rating influence development capacity?

Our findings indicate that low maintainability in code (for instance, 2-star systems) can cause a 40% reduction in capacity for routine maintenance. Conversely, high maintainability (4-star systems) can increase capacity for innovation and improvement by up to 30%.

What are objectives in Sigrid®?

Objectives in Sigrid® are targets that can be set to compare against system status and quality trends. They are non-functional requirements that indicate where you want your systems to be in terms of various quality characteristics. Examples include desired maintainability, new code quality, minimum test code ratio, or maximum number of medium-risk vulnerabilities in libraries.

For more information, see our Portfolio objectives page

How do I set objectives in Sigrid®?

To set objectives in Sigrid®, you can select the Objectives tab from the menu bar. You can define portfolio objectives by clicking the “Add Portfolio Objective” button, which will guide you through configuring the objective using dropdown menus for capability, type, and value. You can then apply the objective to a group of systems based on shared metadata such as technology category, business criticality, lifecycle phase, or deployment type.

For more information, see our Portfolio objectives page

Experience Sigrid live

Request your demo of the Sigrid® | Software Assurance Platform.

Register for access to Summer Sessions
