Guide · April 18, 2026 · 23 mins · The D23 Team

PE Portfolio Cost Analytics: Identifying Cross-Portco Savings

Learn how PE firms use cross-portfolio cost analytics to identify savings across cloud, software, and procurement. Real strategies for value creation.

Private equity firms manage dozens—sometimes hundreds—of portfolio companies simultaneously. Each operates with its own vendor contracts, cloud infrastructure, software licenses, and procurement processes. The result: massive redundancy, negotiating power left on the table, and millions in avoidable spend scattered across the portfolio.

Yet most PE firms lack the analytics infrastructure to see this waste, let alone quantify it. They rely on quarterly calls with CFOs, spreadsheet audits, and manual cost reviews that surface only a fraction of the opportunity. By the time an insight reaches the investment committee, the quarter is over and the savings window has closed.

This is where PE portfolio cost analytics changes the game. A centralized, real-time view of cross-portfolio spend—cloud infrastructure, software subscriptions, vendor contracts, procurement categories—lets you identify patterns, consolidate suppliers, renegotiate terms, and unlock value that competitors miss. The firms doing this best are building dashboards and analytics systems that surface these insights continuously, not quarterly.

This guide walks you through how to build and operationalize cross-portfolio cost analytics, the metrics that matter most, and the technical architecture that makes it work at scale.

Why Cross-Portfolio Cost Analytics Matters in PE Value Creation

Cost optimization is not a new concept in private equity. Firms have long recognized that operational efficiency and SG&A reduction drive EBITDA expansion. But traditional cost-cutting is reactive and fragmented: a CFO identifies a problem within a single company, negotiates a fix, and moves on. The portfolio-wide view—the opportunity to consolidate software licenses across ten companies, or to renegotiate cloud terms with a single provider across the entire fund—remains invisible.

Cross-portfolio cost analytics inverts this. Rather than waiting for individual companies to report inefficiencies, you proactively scan the entire portfolio for patterns. You ask questions like:

  • Which vendors appear in multiple portfolio companies’ contracts, and what volume discounts are we leaving on the table?
  • How much are we spending on cloud infrastructure per dollar of revenue in each portco, and which outliers warrant investigation?
  • Which software categories (CRM, HR, analytics, security) show the most variance in per-user cost across the portfolio?
  • How much of our procurement spend is concentrated with a single vendor, and what risks or opportunities does that create?
  • Which portcos are paying the highest rates for similar services, and what’s driving the difference?

According to research on 10 Cost Optimization Strategies For Private Equity Portfolio Companies, firms that systematically analyze cost structures across their portfolio identify potential savings equal to 10-20% of EBITDA. But this only works if you have visibility. Without dashboards and analytics, you’re operating blind.

The financial impact is substantial. A $5B PE fund with a typical portfolio of 15-20 companies averaging $300M in revenue generates roughly $900M in combined annual spend across the portfolio. Even a 2-3% reduction in spend through better visibility and consolidation translates to $18-27M in value creation—enough to meaningfully improve IRR. And unlike operational improvements that take time to implement, procurement consolidation and vendor renegotiation can move quickly, often within 90 days of identifying the opportunity.

The Data Foundation: What to Measure and Where to Source It

Before you build dashboards, you need data. Cross-portfolio cost analytics requires integrating multiple data sources, each with its own schema, grain, and update cadence. The good news: the data already exists in your portfolio companies. The challenge is centralizing it.

Cloud Infrastructure Costs

Cloud spend is often the easiest to standardize because AWS, Azure, and GCP provide detailed cost and usage reports. Most portfolio companies export these reports monthly or can grant read-only access to their cloud billing APIs. The key metrics:

  • Total monthly spend by company, service (compute, storage, database, networking), and cost center
  • Cost per revenue dollar (cloud spend as % of revenue)
  • Reserved instance utilization (are teams actually using the capacity they’ve purchased?)
  • Unused resources (idle instances, orphaned storage, unattached volumes)
  • Data transfer and egress costs (often overlooked, but can be 10-15% of cloud bills)

When you aggregate this across the portfolio, patterns emerge. One company might be running at 2% cloud-to-revenue, another at 8%. The difference could be architecture inefficiency, underutilized reserved instances, or simply different business models. But without the dashboard, you never ask the question.
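
The benchmarking math itself is simple; the value is in computing it consistently across every portco. A minimal sketch in Python, with made-up company names and figures standing in for real billing and revenue data:

```python
# Sketch: cloud-spend-to-revenue ratios across portcos.
# Company names and figures below are illustrative, not real data.

monthly = [
    {"company": "PortcoA", "cloud_spend": 180_000, "revenue": 9_000_000},
    {"company": "PortcoB", "cloud_spend": 95_000,  "revenue": 4_750_000},
    {"company": "PortcoC", "cloud_spend": 640_000, "revenue": 8_000_000},
]

def cloud_ratio(row):
    """Cloud spend as a percentage of revenue for one company-month."""
    return 100 * row["cloud_spend"] / row["revenue"]

ratios = {r["company"]: round(cloud_ratio(r), 1) for r in monthly}
print(ratios)  # → {'PortcoA': 2.0, 'PortcoB': 2.0, 'PortcoC': 8.0}
```

Here PortcoC runs at 8% cloud-to-revenue against a 2% peer baseline, which is exactly the kind of outlier that should trigger a conversation.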

Software and SaaS Spend

Software is messier. Most companies don’t have a centralized software inventory. Licenses are scattered across departments, purchased through different vendors, and often forgotten after renewal. To build visibility, you need:

  • Software inventory data from IT or finance (list of active licenses, cost per user, renewal date, business owner)
  • Headcount data by company and department (to normalize per-user cost)
  • Duplicate tool detection (how many CRM platforms, analytics tools, or project management systems are in use?)
  • Contract terms (annual vs. monthly, auto-renewal clauses, volume discounts available)

Many portfolio companies don’t have this data formalized. You may need to conduct a 4-6 week software audit across the portfolio, collecting spreadsheets from IT and finance teams, normalizing the data, and loading it into a centralized system. The effort pays for itself in the first month of consolidation.

Procurement and Vendor Spend

Procurement data is critical but often the most fragmented. Each company uses different procurement systems (SAP, NetSuite, Coupa, or simple spreadsheets). To create a unified view:

  • Export transaction-level spend data from each portco’s procurement system or accounting software
  • Standardize vendor names (the same vendor may appear as “Acme Corp,” “Acme,” and “Acme Inc.” across different companies)
  • Classify spend by category (IT, office supplies, logistics, professional services, etc.) using a standard taxonomy
  • Identify the business unit or cost center responsible for each transaction
  • Flag contract data (when does each vendor agreement expire, what are the terms?)

Once standardized, this data reveals the portfolio’s true supplier concentration. You might discover that five vendors account for 40% of spend, or that the same logistics provider is used by eight portcos but each negotiated independently. These are your consolidation opportunities.
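
Once transactions are standardized, concentration is a simple aggregation over grouped spend. A small illustration (vendor names and amounts are invented):

```python
# Sketch: measuring supplier concentration from standardized
# transaction-level spend. Vendor names and amounts are made up.
from collections import defaultdict

transactions = [
    ("Acme Logistics", 1_200_000), ("Acme Logistics", 800_000),
    ("CloudCo", 950_000), ("StaffPro", 400_000),
    ("OfficeMax Supplies", 150_000), ("CloudCo", 500_000),
]

by_vendor = defaultdict(float)
for vendor, amount in transactions:
    by_vendor[vendor] += amount

total = sum(by_vendor.values())
top = sorted(by_vendor.items(), key=lambda kv: kv[1], reverse=True)

def concentration(n):
    """Share of total spend held by the top-n vendors."""
    return sum(amount for _, amount in top[:n]) / total

print(f"Top-2 vendors hold {concentration(2):.0%} of spend")  # → 86%
```

The same function answers the top-10, top-20, and top-50 concentration questions on the real dataset.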

Financial Data Integration

To normalize spend across companies of different sizes and revenue profiles, you need baseline financial data:

  • Revenue (monthly or quarterly)
  • EBITDA (to measure spend as % of profitability)
  • Headcount (to normalize SaaS and HR spend)
  • Number of locations (for facilities, logistics, and regional services)

This data typically comes from the accounting system or management reporting already in place. The integration point is usually straightforward—most PE firms have a data room or FP&A system where this information is already centralized.

Building the Analytics Architecture

Once you’ve identified and sourced the data, the next step is infrastructure. You have two main approaches: build custom analytics in-house, or use a managed analytics platform optimized for PE operations.

The Case for Managed Analytics on Apache Superset

Many PE firms attempt to build cost analytics in-house using SQL, Python, and tools like Tableau or Looker. This approach works if you have a dedicated data team, but it’s expensive and slow. You’re building infrastructure instead of focusing on insights.

A better approach is to use D23, a managed Apache Superset platform designed for teams that need production-grade analytics without the platform overhead. Here’s why it works for PE cost analytics:

Speed to insight: Instead of spending weeks building data pipelines and dashboards from scratch, you start with pre-built templates for cost analysis. Superset’s SQL-based querying means your finance and operations teams can write and modify queries without waiting for engineers. You go from raw data to interactive dashboard in days, not months.

No vendor lock-in: Apache Superset is open source. Your data stays yours, your queries are portable, and you’re not dependent on a single vendor. This matters in PE, where flexibility and control are non-negotiable.

Embedded analytics for portcos: As you build cost analytics dashboards, many portfolio companies want access to their own cost data. With Superset’s embedding capabilities, you can give each CFO a personalized view of their company’s spend without building separate systems. D23 handles the infrastructure, multi-tenancy, and access controls automatically.

Text-to-SQL and AI assistance: Modern analytics platforms can generate SQL from natural language. Instead of asking your data team “Show me cloud spend by service for companies with >$100M revenue,” you can ask the dashboard directly. This democratizes analytics and speeds up ad-hoc exploration.

Data Pipeline Architecture

Regardless of which platform you choose, the underlying architecture should follow this pattern:

Data ingestion layer: Extract data from source systems (cloud billing APIs, procurement systems, accounting software) on a daily or weekly cadence. Use tools like Fivetran, Stitch, or custom scripts to automate this. The goal is to minimize manual data entry and keep everything fresh.

Data warehouse or lake: Land all ingested data in a centralized repository (Snowflake, BigQuery, Redshift, or Postgres). This is your source of truth. All downstream analytics read from here, not from source systems. This decouples your analytics from operational systems and prevents query load from impacting production.

Data transformation layer: Use dbt (data build tool) or similar to clean, standardize, and combine data from multiple sources. This is where you normalize vendor names, classify spend categories, and create reusable metrics like “cloud spend as % of revenue.” The transformation layer should be version-controlled and documented, so anyone on your team can understand how metrics are calculated.
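
To make the transformation layer concrete, here’s a toy version of one such model, run against an in-memory SQLite database purely for illustration. The table and column names are hypothetical; in practice the same SELECT would live in a version-controlled dbt model against your warehouse:

```python
# Sketch of a transformation-layer model: joining spend to revenue and
# deriving "cloud spend as % of revenue". Table/column names are assumed.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_cloud_spend (company TEXT, month TEXT, spend REAL);
    CREATE TABLE raw_revenue    (company TEXT, month TEXT, revenue REAL);
    INSERT INTO raw_cloud_spend VALUES
        ('PortcoA', '2026-01', 180000), ('PortcoB', '2026-01', 640000);
    INSERT INTO raw_revenue VALUES
        ('PortcoA', '2026-01', 9000000), ('PortcoB', '2026-01', 8000000);
""")

# The SELECT below is the logic you would version-control as a dbt model.
rows = conn.execute("""
    SELECT s.company,
           s.month,
           ROUND(100.0 * s.spend / r.revenue, 1) AS cloud_pct_of_revenue
    FROM raw_cloud_spend s
    JOIN raw_revenue r ON r.company = s.company AND r.month = s.month
    ORDER BY cloud_pct_of_revenue DESC
""").fetchall()
print(rows)  # → [('PortcoB', '2026-01', 8.0), ('PortcoA', '2026-01', 2.0)]
```

Because the metric definition lives in one documented model rather than in ad-hoc dashboard queries, every chart that references it agrees.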

Analytics and visualization layer: This is where Superset comes in. You query the transformed data and build interactive dashboards. Superset’s strength is that it lets non-technical users explore data without touching code, while power users can write complex SQL queries.

Key Dashboards and Metrics for PE Cost Analytics

Once your architecture is in place, you need to know what to measure. The following dashboards form the core of a PE cost analytics practice:

1. Portfolio-Wide Cost Overview

This is your executive dashboard. It shows:

  • Total portfolio spend (cloud, software, procurement) month-over-month and year-over-year
  • Spend as % of revenue across the portfolio and by company
  • Spend by category (cloud, SaaS, vendor, other) with trend lines
  • Top 20 vendors by total spend across the portfolio
  • Spend concentration (what % of total spend comes from the top 10 vendors?)
  • Cost per headcount across the portfolio (to identify outliers)

This dashboard answers the question: “Where is the portfolio’s money going, and what’s changing?” It’s updated weekly and shared with the investment committee.

2. Cloud Cost Analysis

For firms with significant cloud infrastructure, a dedicated cloud dashboard is essential:

  • Monthly cloud spend by company, with forecast for the year
  • Spend by service (EC2, RDS, S3, Lambda, etc.) to identify which services are driving costs
  • Cost per revenue dollar by company (benchmarking)
  • Reserved instance utilization (are we getting value from our commitments?)
  • Unused resources (instances not running, unattached volumes, idle databases)
  • Data transfer costs (often 10-15% of bills and frequently optimizable)
  • Top cost drivers (which instances, databases, or data transfers are most expensive?)

According to research on Data Strategy for PE Rollups: Maximize Enterprise Value, firms that actively monitor cloud costs typically reduce them by 20-30% through better resource utilization and rightsizing. This dashboard is your tool for finding those savings.

3. Software and SaaS Consolidation

This dashboard focuses on duplicate tools and renegotiation opportunities:

  • Software inventory by company and department (what tools are in use?)
  • Cost per user by tool and company (which companies are overpaying?)
  • Duplicate tools (how many CRM, HR, or analytics platforms are in the portfolio?)
  • Contract expiration calendar (when can we renegotiate?)
  • Underutilized licenses (tools with low active users relative to licenses purchased)
  • Consolidation opportunities (if we moved all ten companies to a single CRM, what’s the savings?)

This dashboard is typically owned by the Chief Operating Officer or a dedicated procurement team. It’s used to identify which tools to consolidate, which contracts to renegotiate, and which vendors to approach with portfolio-wide deals.

4. Vendor Consolidation and Spend Analysis

This dashboard reveals your supplier concentration and consolidation opportunities:

  • Top vendors by spend across the portfolio
  • Vendor concentration (what % of spend goes to top 10, 20, 50 vendors?)
  • Vendor overlap (which vendors appear in multiple portfolio companies?)
  • Spend by category (IT, professional services, logistics, etc.)
  • Vendor diversity (are we over-reliant on a single supplier?)
  • Contract terms (which vendors have annual vs. monthly contracts, and what are renewal dates?)
  • Spend variance (if Vendor X is used by three companies, why is Company A paying 30% more?)

This dashboard is typically shared with the Chief Procurement Officer or a dedicated value creation team. It’s used to identify vendors who can be consolidated, renegotiated, or eliminated entirely.

5. Benchmarking and Outlier Detection

This is where you identify which companies are spending abnormally:

  • Cloud spend per revenue dollar by company (benchmarked against median)
  • SaaS spend per headcount by company
  • Total SG&A as % of revenue by company (to identify outliers)
  • Procurement spend per revenue dollar by company
  • Vendors used by only one company (candidates for consolidation or elimination)

When you spot an outlier—a company spending 8% of revenue on cloud while the median is 3%, for example—you have a specific conversation with that CFO: “What’s driving the difference? Is it architecture, utilization, or pricing?” Often, the answer is something fixable (orphaned instances, underutilized reserved instances, or a vendor contract that wasn’t negotiated competitively).
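
The outlier check itself can be as simple as comparing each company to the portfolio median. A sketch, with an assumed 2x-median threshold and invented cloud-to-revenue ratios:

```python
# Sketch: flagging outlier portcos against the portfolio median.
# The 2x threshold and the ratios below are illustrative assumptions.
import statistics

cloud_pct = {"A": 2.9, "B": 3.1, "C": 8.2, "D": 3.0, "E": 2.4}

median = statistics.median(cloud_pct.values())
outliers = {c: pct for c, pct in cloud_pct.items() if pct > 2 * median}
print(f"median={median}%, outliers={outliers}")  # → median=3.0%, outliers={'C': 8.2}
```

In a dashboard this becomes a conditional-format rule or alert condition rather than a script, but the logic is identical.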

Operational Workflows: From Insight to Execution

Dashboards are only useful if they drive action. The best PE firms operationalize cost analytics by embedding them into their value creation workflows.

Weekly Operations Review

Every Monday morning, the operations team reviews the portfolio cost dashboard. They identify:

  • Which metrics moved significantly week-over-week
  • Which companies have unusual spend patterns
  • Which vendors or categories warrant deeper investigation

This takes 30 minutes and surfaces issues early, before they become quarter-end surprises.

Monthly Cost Deep-Dive

Once a month, the Chief Operating Officer and CFOs from the top 5-10 companies meet to discuss cost trends. The agenda is driven by the dashboard:

  • “Cloud spend is up 15% month-over-month in Company X. What’s changed?”
  • “We’re paying 40% more per user for our CRM than Company Y. Why?”
  • “Vendor Z appears in seven companies’ contracts. Can we consolidate and renegotiate?”

These conversations are specific and data-driven. They lead to action items: audit cloud resources, renegotiate a contract, consolidate a tool, or investigate why one company’s costs are out of line.

Quarterly Value Creation Planning

As you identify savings opportunities, you need a system to track them through execution. Create a simple tracker that includes:

  • Opportunity description (e.g., “Consolidate CRM platforms across eight companies”)
  • Estimated annual savings (based on current spend and consolidation economics)
  • Implementation timeline (when can we execute?)
  • Owner (who’s responsible?)
  • Status (identified, in progress, completed)
  • Actual savings realized (once implemented)

This tracker becomes part of your value creation scorecard. Over a 24-month period, a well-run PE firm can typically identify $5-15M in annual savings across a $5B fund, with 60-80% of that realized through actual execution.

Advanced Techniques: AI and Predictive Analytics

Once you’ve mastered the basics of cost analytics, you can layer in more advanced techniques.

Text-to-SQL for Faster Exploration

Modern analytics platforms like D23 support natural language queries. Instead of writing SQL, you can ask questions in plain English:

  • “Which companies are spending more than 5% of revenue on cloud?”
  • “Show me all vendors used by more than three companies.”
  • “What’s the month-over-month change in software spend for Company X?”

The system translates these questions into SQL and returns results in seconds. This democratizes analytics—your CFOs and operations teams can explore data without waiting for a data analyst to write queries.

Anomaly Detection

As your dashboard grows, manual monitoring becomes impractical. Use statistical methods to automatically flag unusual patterns:

  • Spend anomalies: When a company’s spend deviates significantly from its historical average or peer companies
  • Vendor anomalies: When a vendor’s pricing or volume changes unexpectedly
  • Utilization anomalies: When reserved instances suddenly drop in utilization, suggesting idle capacity

These alerts can be delivered via email or Slack, so your operations team is notified immediately when something unusual happens.
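
A basic version of spend anomaly detection needs nothing more than a z-score against trailing history. A sketch, using an assumed 3-sigma threshold and illustrative monthly figures:

```python
# Sketch: z-score spend anomaly flagging against trailing history.
# The 3-sigma threshold and sample figures are assumptions, not a standard.
import statistics

history = [102, 98, 105, 101, 99, 103, 100, 97, 104, 102, 98, 101]  # $K/month
latest = 131

mean = statistics.mean(history)
stdev = statistics.stdev(history)
z = (latest - mean) / stdev

if abs(z) > 3:
    print(f"ALERT: spend {latest} is {z:.1f} sigma above trailing mean {mean:.0f}")
```

Production systems typically layer on seasonality adjustments and peer-group comparisons, but this captures the core mechanic behind the email and Slack alerts described above.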

Predictive Spend Forecasting

Once you have 12+ months of historical data, you can build forecasts for future spend. This is useful for:

  • Budget planning: Forecasting next year’s cloud and software spend by company
  • Consolidation timing: Understanding when contract renewals occur, so you can plan consolidation initiatives
  • Headcount planning: Predicting how SaaS and other per-headcount costs will change as companies grow

According to insights on Private equity: Driving value creation with data-driven costing, firms that use activity-based costing and predictive analytics reduce their cost of operations by 5-15% and improve forecast accuracy by 20-30%.
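
Even before reaching for dedicated forecasting tools, a least-squares trend line gives a serviceable first-pass forecast. A sketch over twelve months of invented spend data:

```python
# Sketch: least-squares linear trend as a first-pass spend forecast.
# Real forecasting should handle seasonality and contract step-changes;
# the monthly figures below are illustrative.

spend = [100, 104, 103, 108, 110, 113, 112, 117, 119, 121, 124, 126]  # $K

n = len(spend)
xs = range(n)
x_mean = sum(xs) / n
y_mean = sum(spend) / n

sxy = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, spend))
sxx = sum((x - x_mean) ** 2 for x in xs)
slope = sxy / sxx
intercept = y_mean - slope * x_mean

next_month = slope * n + intercept
print(f"trend: +{slope:.1f}K/month, next month ≈ {next_month:.0f}K")
```

With 12+ months of real history per company, the same fit run per portco produces the budget-planning baseline described above.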

Integration with Existing PE Systems

Your cost analytics doesn’t exist in isolation. It needs to integrate with the systems and processes already in place at your firm.

Data Room Integration

Most PE firms maintain a data room for each portfolio company, containing financial statements, contracts, and operational data. Your cost analytics should pull from this data room, not duplicate it. If your data room system has an API (Intralinks, Datasite, etc.), use it to automate data extraction. If not, establish a weekly export process.

FP&A and Reporting Integration

Your cost analytics should feed into the quarterly and annual reporting processes. The dashboards you’ve built should be the source of truth for cost metrics in board presentations. This ensures consistency and reduces the time spent on manual reporting.

Portfolio Company Systems

Many portfolio companies want to see their own cost data. Rather than building separate systems, use your analytics platform to create personalized views. With D23, you can embed dashboards directly into each company’s internal portal, so their CFO sees only their company’s data while the PE firm sees the entire portfolio.

Challenges and How to Overcome Them

Building cross-portfolio cost analytics is not without challenges. Here’s how to address the most common ones:

Data Quality and Standardization

Challenge: Each portfolio company uses different systems, naming conventions, and charts of accounts. Vendor names are spelled differently across companies. Cloud infrastructure is organized differently.

Solution: Invest in a data transformation layer (dbt is ideal) that standardizes all incoming data. Create a master vendor list and use fuzzy matching to normalize vendor names. Document your standardization rules so they’re reproducible and maintainable.

Access and Privacy

Challenge: Portfolio companies are sensitive about sharing detailed cost data with the PE firm. You need to respect privacy while still providing visibility.

Solution: Use role-based access controls to ensure each company sees only their own data, while the PE firm sees aggregated views. Make it clear that the goal is value creation, not cost-cutting for its own sake. Many companies become enthusiastic participants once they see the benefits of benchmarking against their peers.

Keeping Data Fresh

Challenge: Cost data changes daily. If your dashboards are a week out of date, you’re making decisions on stale information.

Solution: Automate data ingestion so that cloud billing, procurement, and software data are updated daily or at least weekly. Use cloud APIs (AWS Cost Explorer, Azure Cost Management) to pull data automatically rather than relying on manual exports.

Organizational Buy-In

Challenge: CFOs and operations teams are busy. They may not see the value in spending time on dashboards and analysis.

Solution: Start with a quick win. Identify one low-hanging fruit (e.g., a software consolidation that saves $500K in the first 90 days) and execute it based on dashboard insights. Once the team sees value, they’ll be more engaged. Also, make the dashboards easy to use. If your CFOs have to write SQL or navigate complex systems, adoption will be low.

Real-World Example: A $5B Fund’s Cost Analytics Journey

Consider a mid-market PE firm with a $5B fund and 18 portfolio companies averaging $300M in revenue. Here’s how they built cost analytics:

Month 1-2: Audit and data collection

  • Finance team collected cloud billing exports, software inventories, and procurement data from each company
  • Standardized the data and loaded it into a data warehouse (Snowflake)
  • Identified data quality issues (duplicate vendors, missing contracts, inconsistent categories)

Month 3: Dashboard development

  • Built five core dashboards (portfolio overview, cloud, software, procurement, benchmarking)
  • Trained the operations team on how to use them
  • Shared initial findings with the investment committee

Month 4-6: Insight and action

  • Identified that eight companies were using different CRM platforms, with per-user costs ranging from $50 to $150
  • Discovered $2.3M in annual cloud savings through reserved instance consolidation and rightsizing
  • Found that five vendors accounted for 45% of procurement spend, creating consolidation opportunities

Month 6-12: Execution and value realization

  • Consolidated CRM platforms to a single vendor, saving $400K annually
  • Implemented cloud optimization recommendations, realizing $1.8M in annual savings
  • Renegotiated three major vendor contracts, saving $600K
  • Total annual savings: $2.8M, or 0.31% of combined portfolio spend

Year 2 and beyond: Continuous improvement

  • Expanded dashboards to include headcount, facilities, and logistics costs
  • Implemented predictive forecasting for budget planning
  • Enabled each portfolio company to use the analytics platform for their own operational cost management

This firm went from having no visibility into cross-portfolio costs to identifying nearly $3M in annual savings in the first year. The dashboard became the centerpiece of their value creation process, and CFOs began using it proactively to manage their own costs.

Choosing the Right Platform

When selecting a platform for PE cost analytics, you have several options:

Commercial BI tools (Looker, Tableau, Power BI) offer polished interfaces and strong visualization capabilities, but they’re expensive ($50K-500K+ annually depending on scale) and require significant IT infrastructure to manage. They’re also designed for large enterprises, not PE firms with specific needs around portfolio analytics.

Metabase and Mode are lighter-weight alternatives, but they lack some of the advanced features needed for complex cost analytics (role-based access controls, multi-tenancy, embedded analytics).

Apache Superset, the open-source platform underlying D23, strikes a balance. It’s powerful enough for complex analytics, flexible enough to customize, and cost-effective. With a managed service like D23, you get all the benefits of Superset without the operational overhead of managing the infrastructure yourself.

For PE firms specifically, D23 offers advantages:

  • Built for analytics at scale: Multi-tenancy and role-based access controls are built in, so you can easily share dashboards with portfolio companies without exposing sensitive data
  • API-first architecture: Integrate with your existing data room, FP&A systems, and portfolio company systems
  • Text-to-SQL and AI assistance: Democratize analytics so CFOs can ask questions without waiting for analysts
  • Cost-effective: Transparent pricing without enterprise licensing overhead
  • Open-source foundation: Your data and queries aren’t locked into a proprietary system

The choice ultimately depends on your firm’s size, technical capability, and specific needs. But for most PE firms building cross-portfolio cost analytics, a managed Superset platform is the fastest path to value.

Building Your Cost Analytics Roadmap

If you’re starting from scratch, here’s a realistic roadmap:

Phase 1 (Months 1-3): Foundation

  • Audit and standardize cost data across the portfolio
  • Build a centralized data warehouse
  • Create the five core dashboards (portfolio overview, cloud, software, procurement, benchmarking)
  • Train your operations team
  • Identify 3-5 quick wins (consolidation opportunities, obvious waste)

Phase 2 (Months 4-9): Execution and Expansion

  • Execute on identified savings opportunities
  • Expand dashboards to include additional cost categories (facilities, logistics, headcount)
  • Implement automated data ingestion so dashboards stay fresh
  • Begin predictive forecasting
  • Share dashboards with portfolio company CFOs

Phase 3 (Months 10-18): Optimization and Embedding

  • Refine dashboards based on user feedback
  • Implement anomaly detection and alerts
  • Integrate cost analytics into your formal value creation process
  • Use dashboards as the source of truth for board reporting
  • Expand to other value creation areas (revenue, margin, headcount)

Phase 4 (Month 18+): Continuous Improvement

  • Maintain and enhance dashboards based on evolving needs
  • Use analytics to drive strategic decisions about portfolio company operations
  • Explore advanced techniques (predictive analytics, scenario modeling)
  • Share best practices and learnings across the portfolio

Measuring Success

How do you know if your cost analytics initiative is working? Track these metrics:

  • Savings identified: Total dollar amount of cost reduction opportunities surfaced by dashboards
  • Savings realized: Actual cost reductions implemented and confirmed
  • Time to insight: How long does it take to go from question to answer using the dashboards?
  • User adoption: What percentage of CFOs and operations leaders actively use the dashboards?
  • Decision velocity: How much faster are decisions being made compared to the pre-analytics era?
  • Portfolio benchmarking: How many companies are now actively comparing themselves to peers and taking action?

A successful cost analytics initiative should deliver at least $2-3M in annual savings for a $5B fund within 12 months, with 60-80% of identified opportunities realized through execution.

Conclusion: Cost Analytics as Competitive Advantage

Cross-portfolio cost analytics is no longer a nice-to-have for PE firms. It’s a core competency that separates top-quartile performers from the rest. Firms that systematically identify and execute on cost savings opportunities create measurable value, improve portfolio company profitability, and enhance returns.

The technical barriers to building this capability have fallen. With tools like D23 built on Apache Superset, you can stand up production-grade analytics in weeks, not months. The data exists in your portfolio companies—you just need to centralize it and make it visible.

The firms winning at this today are those who’ve moved beyond quarterly cost reviews and spreadsheet audits. They’re building dashboards that surface insights continuously, embedding analytics into their value creation workflows, and empowering their operations teams to make data-driven decisions. They’re also consulting with experienced practitioners who understand both PE operations and the technical details of building scalable analytics systems.

If your firm isn’t yet doing cross-portfolio cost analytics at scale, now is the time to start. The opportunity is substantial, the path is clear, and the competitive advantage is real. Begin with a focused pilot on cloud or software costs, prove the model, and expand from there. Within 12 months, you’ll have identified millions in savings and built a capability that will drive value creation for years to come.

For guidance on implementing a cost analytics program tailored to your fund’s specific structure and portfolio, consider consulting with experts who specialize in PE operations and data strategy. According to research on PortCo Performance: Finding Gold in Ore, the firms that extract the most value from their portfolios combine structured data practices with dedicated operational expertise. Cost analytics is the foundation; the execution and change management are what turn insights into realized value.

To learn more about building analytics infrastructure that scales across your portfolio, visit D23 to explore how managed Apache Superset can accelerate your cost analytics journey. For questions about privacy and data handling, review our Privacy Policy and Terms of Service.