Big Tech Industry Packet


Core Packet

Industry Role

You are the CEO of a Big Tech platform company with a dominant global market position (Google, Meta, Microsoft, Amazon, Apple-class). You operate across cloud infrastructure, advertising, enterprise software, devices, search, social media, and e-commerce, with billions of active users and devices worldwide. Annual revenue is approximately $200B-$400B, with operating margins in the 20-30% range. You control critical infrastructure — cloud platforms, data centers, global distribution networks — and your capital allocation decisions cascade across every other sector in this exercise. Note that you represent only the non-AI-development operations of the company. The AI development/AI lab portion of the business (frontier model training, research lab operations, model licensing) is NOT under your control; those actions enter the exercise only through facilitator-controlled injects and scenario updates.


Strategic Context

You sit at the center of the AI economy — not as the builder of frontier models, but as the company that must figure out how to embed AI into every product, service, and customer relationship you operate. Your cloud infrastructure hosts the AI workloads of every other sector in this exercise. Your advertising business is being reshaped by AI-driven search disruption. Your enterprise software products must integrate AI features at speed or lose ground to nimbler competitors. Your devices and platforms are the distribution channel through which billions of people interact with AI for the first time.

Your competitive position is defined by scale and reach, but scale alone is not a strategy. Amazon, Google, Microsoft, and Meta are all racing to embed AI into their existing product lines — copilots in productivity suites, AI-powered search, generative ad creative, AI shopping assistants, smart device integration. The question is not whether AI gets embedded but how fast, at what cost, and with what customer experience. The companies that execute product integration well will deepen ecosystem lock-in. Those that stumble will face customer attrition to AI-native alternatives and open-source-powered competitors that can move faster without legacy constraints.

Cloud infrastructure is your highest-leverage business. Enterprise customers across every sector are migrating AI workloads to your platform, but they are also demanding commodity pricing as open-source models reduce the premium on proprietary cloud AI services. Your cloud margin is under pressure from both competition (AWS vs. Azure vs. GCP price wars) and customer consolidation (enterprises reducing multi-cloud sprawl). Winning in cloud means positioning as the infrastructure-of-choice for enterprise AI — not just raw compute, but managed AI services, fine-tuning platforms, and verticalized solutions.

The regulatory environment is tightening. Antitrust scrutiny is focused on your market concentration, data practices, and the degree to which your platform advantages create barriers to competition. Data privacy regulations (GDPR, CCPA, expanding state laws) constrain how you collect, retain, and monetize user data. AI amplifies both concerns: AI features require more data, and AI-powered products deepen the ecosystem lock-in that regulators are targeting. You must navigate this environment proactively — reactive compliance is more expensive and more disruptive than anticipatory strategy.

Talent competition is fierce across all sectors. You are bidding for the same AI engineers, product managers, and data scientists that finance, healthcare, supply chains, and SaaS companies want. Your scale is an advantage (interesting problems, competitive compensation, career breadth), but startups offer equity upside and agility that you cannot match. Retention of mid-career engineers — the backbone of product execution — is the most acute pressure point.


Objectives

| Objective | Target (Banded/Directional) | Driver |
|---|---|---|
| Cloud Revenue Growth | Material growth in cloud infrastructure revenue; secure AI workload migration from enterprise customers across sectors | Managed AI services, fine-tuning platforms, verticalized cloud solutions, competitive pricing |
| AI Product Launch Velocity | Rapid integration of AI features into core products (search, email, office, devices, cloud, advertising) with measurable adoption | Product engineering execution, ecosystem integration, customer communication |
| Advertising Revenue Protection | Maintain or grow ad revenue despite search fragmentation, consumer privacy regulations, and AI-driven disruption to ad-supported models | AI-improved targeting efficiency, generative ad creative, new ad formats, measurement innovation |
| Ecosystem Lock-in & Platform Stickiness | Increase switching costs and ecosystem stickiness through AI-native features integrated across the product portfolio | Integration depth, cross-product data flows, workflow automation, developer ecosystem |
| Talent Retention & Competitive Recruitment | Recruit and retain top AI engineering and product talent in a competitive market; maintain organizational execution capability | Competitive compensation, career development, interesting problems at scale, retention of institutional knowledge |

Constraints

| Constraint | Impact | Implications |
|---|---|---|
| Massive AI Infrastructure Capex | AI-related capex (data centers, custom silicon, inference infrastructure) is growing 30%+ annually. By 2026, AI capex could represent 40%+ of total capex. ROI timelines are uncertain; GPU/TPU utilization rates are lower than projected between workloads | Heavy capex burden constrains profitability, dividend capacity, M&A dry powder, and ability to invest in other areas. Overinvestment risk if enterprise AI adoption is slower than expected; underinvestment risk if competitors outspend you on capex. Capital allocation discipline is critical |
| Model Commoditization (External Market Force) | Open-source AI models (LLaMA, Mistral, Phi) are improving rapidly and reaching feature parity with proprietary models for many enterprise use cases. This is an external market force — you do not control frontier model development decisions | Proprietary cloud AI service premiums are eroding. Customers are testing open-source models and finding acceptable performance at lower cost. Your differentiation must come from managed services, integration, reliability, and ecosystem value — not model exclusivity |
| Antitrust Regulatory Pressure | Antitrust scrutiny on market concentration, data practices, platform bundling, and AI service tying. Potential enforcement actions include forced data separation, open API access mandates, and platform conduct restrictions | Compliance cost and execution risk are high. Business model and product design must be defensible. Proactive transparency and access measures may reduce enforcement risk but also reduce competitive advantage |
| Data Privacy & Regulation | GDPR, state privacy laws, and sector-specific regulations limit data collection, retention, and cross-product data sharing. Expanding regulatory scope increases compliance cost and constrains data-driven product improvement | Reduces training data availability for product optimization. Limits data monetization across products. Privacy-preserving techniques (federated learning, differential privacy) are required but add complexity and cost |
| Talent Saturation | Limited global supply of AI engineering, product, and data science talent; bidding war with other Big Tech, SaaS, finance, healthcare, and startups. Retention risk if employees perceive better opportunities elsewhere | Compensation pressure (stock + cash rising faster than historical rates). Attrition of mid-career engineers is the most acute risk; organizational churn can derail product roadmaps. Cannot be solved with hiring alone — retention and internal mobility matter more |
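A back-of-envelope sketch makes the capex constraint concrete. The packet gives only the growth rate (30%+ annually) and the 2026 share (40%+ of total capex); the starting dollar figures below are hypothetical assumptions chosen for illustration, not numbers from the packet.

```python
# Hypothetical illustration of the Massive AI Infrastructure Capex constraint.
# Assumed 2024 baseline (NOT from the packet): $50B total capex,
# of which $15B is AI-related; non-AI capex held roughly flat.
ai_capex = 15.0     # $B, assumed AI-related capex in 2024
other_capex = 35.0  # $B, assumed non-AI capex, held flat
ai_growth = 0.30    # 30% annual AI capex growth (rate from the packet)

for year in (2025, 2026):
    ai_capex *= 1 + ai_growth
    share = ai_capex / (ai_capex + other_capex)
    print(f"{year}: AI capex ${ai_capex:.1f}B, {share:.0%} of total capex")
```

Under these assumptions the AI share climbs from 30% to roughly 42% by 2026, consistent with the 40%+ figure in the table: compounding the AI line while the rest of the budget stays flat shifts the mix quickly.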

Resources & Levers

Platform Assets:

  • Vast first-party data: Search queries, social graph, cloud usage, consumer behavior, enterprise usage, device telemetry. Petabytes of behavioral and operational data
  • Compute infrastructure: Global data center footprint, custom silicon (TPUs, GPUs), distributed inference capability. Competitive advantage in scale and reliability
  • Capital: Cash reserves ($50B-$100B+). Ability to fund massive capex and M&A. Access to capital markets at favorable terms
  • Ecosystem reach: Billions of active users/devices. Cloud platform with hundreds of thousands of enterprise customers. Distribution advantages for new AI-powered products
  • Brand trust: Market leadership, brand recognition, enterprise relationships, consumer familiarity

Potential Paths Forward:

  • AI Product Launches: Embed AI into core products (search, email, office, cloud, devices, advertising). Leverage ecosystem reach and distribution. High impact but execution complexity and customer communication challenges are significant. This is the primary lever for capturing AI-era value
  • Cloud Infrastructure Positioning: Position cloud platform as infrastructure-of-choice for enterprise AI workloads. Compete on managed services, fine-tuning platforms, and verticalized solutions — not just raw compute. Cloud margin under pressure; differentiation is critical
  • Open-Source Strategy: Contribute to or sponsor open-source models and frameworks to maintain developer goodwill, talent recruitment pipeline, and ecosystem influence. Must calibrate carefully to avoid commoditizing your own premium services
  • M&A for AI Talent & Capabilities: Acquire AI startups, vertical AI companies, or talent teams to fill capability gaps faster than organic development. Expensive, but execution risk is lower than an organic build in a talent-constrained market
  • Data Strategy & Privacy: Defend data assets and optimize data flows across products within regulatory constraints. Invest in privacy-preserving techniques to maintain data advantage while demonstrating compliance. Proactive data governance reduces regulatory risk

AI Adoption Arc — Foundation Phase

Foundation (2025 - Q1 2026): Your AI product integration effort is in early execution. AI-powered features have launched in core products — search, email, productivity suites, cloud services — with mixed adoption. Enterprise cloud customers are piloting AI workloads on your platform, but conversion from pilot to production is slower than projected. GPU/TPU utilization rates at your data centers are below target as inference demand has not yet scaled to match infrastructure buildout. Advertising AI (generative creative, AI-powered targeting) is in limited deployment with promising early signals but no material revenue impact yet. Internally, engineering teams are stretched across too many AI initiatives; prioritization and focus are emerging as execution risks. Capex is committed and accelerating. Margin impact so far: moderate compression from infrastructure costs, partially offset by cloud revenue growth.


Strategic Considerations

  1. Strategic AI embedding requires focus, not comprehensiveness. The tension is between high-impact, high-stickiness features (copilots in core productivity tools, AI-powered search, smart device integration) and broad AI deployment across the product portfolio. Consider whether concentrated investment in a few transformative features creates more differentiation than distributing AI capability thinly — focus creates speed, but breadth defends against competitors targeting specific product gaps.

  2. Cloud competitive advantage shifts from model exclusivity to managed services. Model commoditization is an external reality that redefines where cloud differentiation lies. Managed AI services, fine-tuning platforms, verticalized solutions, and enterprise trust may matter more than proprietary model superiority — but the trade-off is ceding the narrative on AI leadership to foundation model companies while competing on integration and reliability.

  3. Regulatory posture is a strategic choice with asymmetric consequences. Antitrust and privacy enforcement are intensifying. Proactive transparency and access measures are cheaper than reactive compliance and litigation — but the degree of proactivity involves trade-offs between competitive flexibility and regulatory defensibility. Consider what posture is sustainable before enforcement arrives rather than adapting after it does.

  4. Talent retention competes with infrastructure investment for strategic priority. Engineering talent — especially mid-career product engineers who ship features — is the binding constraint on AI product velocity. Competitive compensation alone is insufficient; interesting problems, career development, and internal mobility matter as much. The question is whether talent investment receives the same strategic weight as capex in resource allocation decisions.

  5. Capex discipline determines whether infrastructure buildout creates or destroys value. Infrastructure investment is necessary but must be tied to demonstrable demand signals and utilization targets. The risk is that momentum-driven buildout outpaces customer adoption — consider what utilization thresholds and payback periods should trigger acceleration or deceleration of infrastructure spending.
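The capex-gating logic in consideration 5 can be sketched as a simple decision rule. The utilization and payback thresholds below are illustrative assumptions for the exercise, not figures from the packet; a participant would argue for their own numbers.

```python
# Sketch of a capex gating rule for Strategic Consideration 5.
# All threshold values are illustrative assumptions, not from the packet.
def capex_decision(utilization: float, payback_years: float) -> str:
    """Return a directional capex recommendation.

    utilization:   trailing GPU/TPU fleet utilization (0-1)
    payback_years: projected payback period for the next tranche
    """
    if utilization >= 0.70 and payback_years <= 4:
        return "accelerate"  # demand is proven and returns are visible
    if utilization < 0.50 or payback_years > 7:
        return "decelerate"  # buildout is outrunning customer adoption
    return "hold"            # ambiguous signals: keep pace, re-check quarterly

# Example matching the Foundation Phase state (utilization below target):
print(capex_decision(0.55, 5.0))  # -> hold
```

The point of writing it down is that the thresholds, not the rule, carry the strategy: where the "accelerate" and "decelerate" lines sit encodes how much overinvestment risk the company is willing to carry against the risk of being outbuilt.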