Measuring Knowledge Sharing and Innovation via ONA

TLDR

Organizational Network Analysis (ONA) operationalizes knowledge-sharing analytics by measuring how expertise and information flow through actual collaboration relationships. Treat ONA as a measurement system with a decision purpose and a short KPI set.

  • Define knowledge sharing as discovery, diffusion, and reuse, then map each to a network metric.
  • Use a consistent network model: people as nodes, interaction metadata as edges, stable time windows, and segmentation by the boundaries you manage.
  • Track three executive KPIs: cross-boundary connectivity, broker concentration, and discovery friction.
  • Connect network shifts to innovation outcomes you already govern (cycle time, reuse, cross-team throughput), and read results as trends against a baseline.

Why ONA is the right measurement layer

Surveys and documentation counts do not show whether knowledge reaches the teams that need it, when they need it. ONA fills that gap by modeling collaboration as a network and quantifying cross-team information flow.

Innovation measurement also requires discipline. The OECD Oslo Manual 2018 treats innovation as a measurable activity with consistent definitions and reporting expectations. Use ONA as a leading indicator system explicitly linked to measurable results, not as a visualization exercise.

Define “knowledge sharing analytics” in measurable terms

Define knowledge sharing analytics as indicators that answer three operational questions:

  1. Discovery: Can employees locate relevant expertise and prior work without excessive routing?
  2. Diffusion: Does information cross the boundaries that matter to delivery and customer outcomes?
  3. Reuse: Does shared knowledge reduce duplicated effort and shorten execution cycles?

ONA supports all three because it treats knowledge transfer as a flow through relationships. Knowledge sharing improves when cross-boundary links exist, those links are resilient (not dependent on a small set of people), and routing paths between key groups remain short.

Innovation depends on recombining knowledge across different parts of the organization. Research on the “strength of weak ties” is often referenced because it explains why boundary-spanning relationships accelerate the diffusion of information and opportunities.

Data signals that support passive ONA

A practical ONA program uses collaboration metadata that already exists in enterprise systems. The most common signals are:

  • Message interactions in chat and email (who interacts with whom, frequency, directionality).
  • Calendar interactions (meeting participation patterns, cross-team meeting reach).
  • Collaboration artifacts (shared documents, repository activity, ticket workflows), depending on your operating model.

For knowledge sharing analytics, prioritize completeness of coverage over novelty. If a large portion of cross-team work occurs in a tool that is not included, the network will underestimate boundary connections and misstate broker concentration. Apply basic data hygiene: filter automated or system-generated traffic, distinguish internal from external participants, and keep edge construction rules stable so trends are interpretable across periods.
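
As an illustration, a minimal sketch of this hygiene step in Python with pandas; the column names, bot-account list, and rule-version label are hypothetical, not a prescribed schema:

```python
import pandas as pd

# Hypothetical interaction log: one row per collaboration event.
events = pd.DataFrame({
    "sender":    ["ana@corp.com", "jira-bot@corp.com", "ben@corp.com"],
    "recipient": ["ben@corp.com", "ana@corp.com", "dia@vendor.io"],
    "tool":      ["chat", "chat", "email"],
})

# 1. Filter automated or system-generated traffic.
BOT_ACCOUNTS = {"jira-bot@corp.com", "ci-notifications@corp.com"}  # example list
events = events[~events["sender"].isin(BOT_ACCOUNTS)]

# 2. Distinguish internal from external participants.
events = events.assign(internal=events["recipient"].str.endswith("@corp.com"))

# 3. Keep the edge-construction rule stable across periods: version it
#    alongside the output so trends stay interpretable.
EDGE_RULE_VERSION = "2024-q3-v1"
print(events)
```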

The minimum viable ONA model

A usable ONA program starts with four stable modeling decisions. These decisions determine whether the analysis produces comparable signals over time or devolves into one-off diagnostics that cannot support governance or executive decisions. Consistency across these choices is more important than precision at the outset.

1. Nodes

Model individuals as nodes, then roll results up to teams, functions, or locations for decisions about operating model and work design. Individual-level modeling is required to capture emergent collaboration patterns that are invisible at aggregate levels. Aggregation should occur only after network measures are computed to preserve structural accuracy.
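
A compute-then-aggregate sketch with networkx and pandas; the toy graph and its team attribute are hypothetical stand-ins for a real interaction-derived network:

```python
import networkx as nx
import pandas as pd

# Toy person-level graph; in practice, build it from interaction metadata.
G = nx.Graph()
G.add_nodes_from([("ana", {"team": "product"}), ("ben", {"team": "product"}),
                  ("cam", {"team": "platform"}), ("dia", {"team": "platform"})])
G.add_edges_from([("ana", "ben"), ("ben", "cam"), ("cam", "dia")])

# Compute measures at the individual level first...
betweenness = nx.betweenness_centrality(G)

# ...and only then roll up to the level used for decisions.
df = pd.DataFrame(
    [{"person": p, "team": G.nodes[p]["team"], "betweenness": b}
     for p, b in betweenness.items()]
)
print(df.groupby("team")["betweenness"].mean())
```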

2. Edges

Define connections using collaboration metadata and state the rule precisely:

  • Which tools are in scope
  • What minimum interaction creates an edge
  • Whether edges are weighted
  • Whether edges are directional

Edge definitions determine what the organization considers meaningful knowledge exchange versus background activity. Inconsistent or loosely defined edge rules introduce noise that distorts comparisons across time periods and segments. Choose one edge rule set per business question and keep it constant for a reporting cycle.
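
For instance, an edge rule stated as code rather than prose might look like the following sketch; the tool list and threshold are illustrative values, not recommendations:

```python
import networkx as nx
import pandas as pd

# Hypothetical event table (see the hygiene sketch above for its construction).
events = pd.DataFrame({
    "sender":    ["ana", "ana", "ana", "ben", "cam"],
    "recipient": ["ben", "ben", "ben", "cam", "ana"],
    "tool":      ["chat", "chat", "email", "chat", "calendar"],
})

# State the edge rule explicitly, once per business question.
IN_SCOPE_TOOLS = {"chat", "email"}  # which tools are in scope
MIN_INTERACTIONS = 2                # minimum interactions that create an edge
DIRECTED = False                    # whether edges are directional

scoped = events[events["tool"].isin(IN_SCOPE_TOOLS)].copy()
if not DIRECTED:
    # Normalize pair order so (a, b) and (b, a) count as the same edge.
    swap = scoped["sender"] > scoped["recipient"]
    scoped.loc[swap, ["sender", "recipient"]] = (
        scoped.loc[swap, ["recipient", "sender"]].values
    )

pairs = scoped.groupby(["sender", "recipient"]).size().reset_index(name="weight")
pairs = pairs[pairs["weight"] >= MIN_INTERACTIONS]  # weighted edges

G = nx.DiGraph() if DIRECTED else nx.Graph()
G.add_weighted_edges_from(pairs.itertuples(index=False))
print(G.edges(data=True))  # only ana-ben (3 interactions) survives the threshold
```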

3. Time windows

Use windows that match how work is managed:

  • 30 days for near-term feedback
  • 90 days for quarterly stability
  • 12 months for structural change detection

Time window selection directly affects sensitivity versus stability in network measures. Short windows surface rapid shifts but exaggerate volatility, while longer windows smooth noise and reveal persistent structural patterns. Mixing window lengths within the same analysis undermines interpretability.
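
A windowing sketch under the same assumptions as the earlier event table (column names hypothetical):

```python
import pandas as pd

events = pd.DataFrame({
    "sender":    ["ana", "ben", "ana"],
    "recipient": ["ben", "cam", "cam"],
    "ts": pd.to_datetime(["2024-03-02", "2024-05-15", "2024-06-20"]),
})

WINDOW_DAYS = 90  # pick one horizon per analysis and hold it constant

end = events["ts"].max()
start = end - pd.Timedelta(days=WINDOW_DAYS)
window = events[(events["ts"] > start) & (events["ts"] <= end)]
# Build the network from `window` only; reuse the same WINDOW_DAYS next
# period so month-over-month or quarter-over-quarter trends stay comparable.
print(window)
```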

4. Segmentation

Segment by the boundaries you manage: function, product line, geography, level, tenure. Segmentation aligns ONA results with real decision rights and accountability structures. Without segmentation, organization-wide averages obscure where knowledge flow is constrained or overly dependent on a small subset of actors.

Metrics that matter for knowledge sharing and innovation

You do not need dozens of network measures. Use metrics with direct management meaning, then convert them into a small KPI set.

1. Cross-boundary connectivity

Measure density across key boundaries (function-to-function, region-to-region, product-to-product). Cross-boundary density is where silos show up.

Decision use: low cross-boundary connectivity points to an operating model problem (ownership, interfaces, integration roles), not a communications problem.
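
A minimal sketch of the function-to-function case with networkx; the attribute name and groups are illustrative:

```python
import networkx as nx

G = nx.Graph()
G.add_nodes_from([
    ("ana", {"function": "sales"}), ("ben", {"function": "sales"}),
    ("cam", {"function": "eng"}),   ("dia", {"function": "eng"}),
])
G.add_edges_from([("ana", "ben"), ("ben", "cam")])

def cross_boundary_density(G, attr, a, b):
    """Observed cross-group edges divided by the number of possible ones."""
    group_a = [n for n, d in G.nodes(data=True) if d[attr] == a]
    group_b = [n for n, d in G.nodes(data=True) if d[attr] == b]
    possible = len(group_a) * len(group_b)
    observed = sum(1 for u in group_a for v in group_b if G.has_edge(u, v))
    return observed / possible if possible else 0.0

print(cross_boundary_density(G, "function", "sales", "eng"))  # 1 of 4 -> 0.25
```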

Sample Worklytics report: consistency of employee connectivity

2. Broker concentration

Use betweenness-type measures to identify whether cross-boundary flow depends on a small number of brokers. High concentration indicates a single point of failure and a predictable burnout risk.

Decision use: reduce dependency by creating redundancy, formalizing communities of practice, and ensuring critical routes have more than one viable path.
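
One simple way to express broker concentration is the share of total betweenness held by the top k people, sketched here with networkx on a toy graph:

```python
import networkx as nx

# Toy graph where "cam" bridges the two ends of the network.
G = nx.Graph([("ana", "ben"), ("ben", "cam"), ("cam", "dia"),
              ("dia", "eve"), ("ana", "cam")])

betweenness = nx.betweenness_centrality(G)

def broker_concentration(scores, top_k=1):
    """Share of total betweenness held by the top_k brokers."""
    total = sum(scores.values())
    if total == 0:
        return 0.0
    top = sorted(scores.values(), reverse=True)[:top_k]
    return sum(top) / total

print(broker_concentration(betweenness, top_k=1))
```

In practice you would normalize for network size and read the index as a trend against your baseline rather than as an absolute value.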

3. Discovery friction

Use path length (or related reachability measures) between priority groups to quantify routing distance to expertise.

Decision use: rising discovery friction indicates degraded discoverability. Interventions include clarifying ownership, simplifying escalation paths, and improving internal findability.
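
A sketch of discovery friction as the mean shortest-path length between two priority groups; the group memberships are illustrative:

```python
import networkx as nx

G = nx.Graph([("ana", "ben"), ("ben", "cam"), ("cam", "dia")])
support = ["ana"]   # e.g., a frontline team
experts = ["dia"]   # e.g., a specialist group

def discovery_friction(G, sources, targets):
    """Mean hop count between reachable pairs (higher = more friction)."""
    hops = [
        nx.shortest_path_length(G, s, t)
        for s in sources for t in targets
        if nx.has_path(G, s, t)
    ]
    return sum(hops) / len(hops) if hops else float("inf")

print(discovery_friction(G, support, experts))  # 3 hops in this toy graph
```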

4. Convert metrics into executive KPIs

For leadership reporting, keep to three core KPIs. These indicators translate network science into signals that align with operating model decisions and accountability structures. Limiting the KPI set keeps the focus on levers leadership can directly influence rather than on descriptive analytics.

  • Boundary connectivity score: cross-boundary density normalized for headcount and structure. This KPI indicates whether knowledge and information are able to traverse the organizational boundaries that matter for delivery and innovation.
  • Broker concentration index: share of bridging paths attributable to the top brokers. This measure quantifies dependency risk within the collaboration network and highlights whether cross-boundary flow is resilient or fragile.
  • Discovery friction score: average routing distance between priority groups. This KPI reflects how easily employees can locate and access relevant expertise across the organization.

Each KPI must have a clearly assigned owner, a defined threshold that triggers intervention, and a standardized response playbook. Without ownership and action thresholds, network indicators remain observational and fail to influence organizational outcomes.

Connecting ONA to innovation outcomes

ONA measures collaboration structure. Innovation governance evaluates outcomes. Connect the two with a tiered design.

Leading indicators from ONA

  • Boundary connectivity increases in the parts of the org responsible for discovery and delivery.
  • Broker concentration decreases without collapsing overall connectivity.
  • Discovery friction declines across functions required to deliver value.

Lagging indicators from the business

Use indicators you already defend in leadership reviews, such as delivery cycle time, reuse rate of internal assets, and cross-team throughput for work that spans functions.

Interpretation rule: treat ONA as useful when network shifts precede durable changes in these outcomes, not when a network map looks intuitive. If network scores move without any operational decision being taken, the measurement is not yet connected to management.

Operationalizing measurement and action

To convert ONA into an operating rhythm, formalize three elements: cadence, thresholds, and decision ownership.

Cadence

  • Monthly: review KPI movements and highlight significant shifts by segment.
  • Quarterly: decide interventions that change structure or ways of working, then set target ranges for the next period.
  • Annually: evaluate the effect of structural changes (reorgs, acquisitions, major tooling changes) using long-window trends.

Thresholds

Set numeric thresholds that trigger action, not discussion. A broker concentration index above the defined ceiling triggers a redundancy plan. A boundary connectivity score below the defined floor triggers an operating model review for that interface. A sustained increase in discovery friction triggers a decision on ownership clarity and knowledge infrastructure.
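
A sketch of thresholds encoded as executable checks rather than talking points; the numbers and playbook names are placeholders to be calibrated against your own baseline:

```python
# Illustrative thresholds; calibrate against your baseline before use.
THRESHOLDS = {
    "broker_concentration_index":  {"ceiling": 0.40, "playbook": "redundancy plan"},
    "boundary_connectivity_score": {"floor": 0.10, "playbook": "operating model review"},
    "discovery_friction_score":    {"ceiling": 4.0, "playbook": "ownership and findability review"},
}

def triggered_actions(kpis):
    """Return the playbooks triggered by the current KPI values."""
    actions = []
    for name, value in kpis.items():
        rule = THRESHOLDS.get(name, {})
        if "ceiling" in rule and value > rule["ceiling"]:
            actions.append(rule["playbook"])
        if "floor" in rule and value < rule["floor"]:
            actions.append(rule["playbook"])
    return actions

print(triggered_actions({"broker_concentration_index": 0.55,
                         "boundary_connectivity_score": 0.22}))
```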

Decision ownership

Assign owners who can change the system: operating model owners, functional leaders, and transformation leads. Knowledge-sharing analytics cannot be improved by reporting alone. It improves when owners execute interventions and you re-measure to verify sustained change.

Governance and privacy requirements

Knowledge sharing analytics fails when employees experience it as surveillance. Governance must be designed as a foundational control, not a compliance afterthought, to ensure sustained trust and participation. The objective is to measure organizational patterns while explicitly preventing individual-level monitoring or misuse.

Use a privacy risk framework

Align governance to the NIST Privacy Framework, which treats privacy as an enterprise risk management discipline. This approach ensures privacy risks are identified, assessed, and mitigated alongside operational and security risks. Applying a formal framework also enables consistent decision-making as data sources, analytics scope, and organizational needs evolve.

Minimize data and avoid content

Use only the interaction metadata required to model collaboration patterns. Excluding message content reduces exposure, simplifies governance, and reinforces that the intent is structural analysis rather than behavioral inspection. Retention periods should align strictly with analytic time windows to prevent unnecessary data accumulation.

Apply pseudonymisation when identity is not required

Pseudonymisation reduces identifiability while preserving analytic value for segmentation and trend analysis. It supports responsible analysis at scale without introducing unnecessary personal data risk. Because pseudonymised data remains regulated personal data in many regimes, purpose limitation, access controls, and documented use cases must remain enforced.
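
As an illustration only (not the Worklytics implementation), persistent pseudonyms can be produced with a keyed hash; the secret must be managed outside the analytics pipeline:

```python
import hashlib
import hmac

SECRET_KEY = b"rotate-and-store-in-a-vault"  # placeholder; use a secrets service

def pseudonymize(identifier: str) -> str:
    """Persistent pseudonym: same input -> same token, irreversible without the key."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

print(pseudonymize("ana@corp.com"))  # stable token usable for longitudinal analysis
```

Because the mapping is deterministic, the same person keeps the same token across periods, which preserves trend and segmentation analysis without exposing identities.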

Set acceptable uses

Document allowed uses (organizational design, change management, collaboration health) and prohibited uses (individual performance scoring based on message volume or response latency). This boundary determines whether employees trust the system.

Worklytics as an ONA solution for knowledge sharing analytics

If the priority is operational measurement rather than building and maintaining custom analytics pipelines, Worklytics provides an enterprise-grade platform purpose-built for Organizational Network Analysis at scale. The platform is designed to operationalize knowledge sharing analytics by combining broad data coverage, rigorous privacy controls, and decision-ready network metrics in a single system. This enables organizations to move beyond exploratory analysis and treat ONA as a governed, repeatable management capability.

Broad coverage through pre-built platform connectors

Worklytics offers pre-built connectors for 25+ work platforms, enabling passive and comprehensive ONA data collection across collaboration systems such as Slack, Google Workspace, Microsoft 365, email, and calendar. This breadth of integration ensures that cross-boundary knowledge flow is captured where work actually happens, reducing analytical blind spots caused by tool fragmentation or partial data coverage.

The Worklytics ONA process

Privacy-by-design through anonymization and pseudonymisation

Worklytics includes built-in anonymization and pseudonymisation controls, including the ability to anonymize data at ingestion before analysis and to apply persistent pseudonyms for longitudinal and segmented analysis. These capabilities allow organizations to analyze structural collaboration patterns while maintaining strict safeguards against individual-level monitoring, reinforcing trust and regulatory alignment.

The Worklytics privacy design

Historical network reconstruction for longitudinal insight

Worklytics supports historical network reconstruction from enterprise collaboration systems, enabling analysis of collaboration patterns over multi-year periods, including up to three years of historical data. This longitudinal perspective is critical for distinguishing structural change from short-term variation, particularly following reorganizations, integrations, or shifts in operating model.

Enterprise-scale network metrics and live dashboards

Worklytics computes core ONA metrics such as degree, eigenvector centrality, and betweenness centrality at enterprise scale and surfaces them through continuously updating dashboards. These dashboards are designed for executive consumption, allowing leaders to monitor changes in connectivity, broker dependency, and discovery friction as ongoing operational signals rather than static reports.

Sample Worklytics dashboards: team collaboration and organizational network health

Benchmarking to support target-setting and prioritization

Worklytics provides benchmarking capabilities across roles, functions, and peer organizations, enabling teams to contextualize network measures and set informed performance targets. Benchmarking elevates ONA from descriptive analytics to a comparative management tool, supporting prioritization of interventions where improved knowledge flow will have the greatest organizational impact.

Used effectively, Worklytics enables organizations to institutionalize ONA as a core component of knowledge-sharing analytics. Rather than producing one-off network studies, teams can establish an ongoing measurement system that informs operating model decisions, mitigates collaboration risks, and supports sustained innovation by improving knowledge flow.

Sample Worklytics benchmarking report comparing typical networks

FAQs

What is the difference between knowledge sharing analytics and ONA?

Knowledge sharing analytics is the measurement objective (discovery, diffusion, reuse). ONA is the method that models collaboration relationships and quantifies those objectives.

Do you need message content to measure knowledge sharing?

No. Use interaction metadata to model patterns of collaboration. Content inspection is not required for the KPIs used in organizational decisions.

Which KPIs should leadership review first?

Boundary connectivity, broker concentration, and discovery friction. Together, they cover silos, single points of failure, and expertise routing inefficiency.

How often should ONA be reviewed?

Review leading indicators monthly. Use quarterly reviews to compare segmentation and make operating model decisions. Use annual views for structural change detection.

Can ONA be used for individual performance evaluation?

Do not use ONA for individual performance scoring. Use it to improve the system of work by reducing bottlenecks and strengthening cross-team connectivity.

How do you keep an ONA program compliant and trusted?

Use purpose limitation, data minimization, pseudonymisation, restricted access, and documented acceptable uses, aligned to a privacy risk framework and regulator guidance.

Request a demo

Schedule a demo with our team to learn how Worklytics can help your organization.

Book a Demo