April 2026

Veterans + AI Playbook

Three tracks. Two weeks. Everything you need to move from thesis to traction. LearnAIR is one piece of this. The thesis is bigger.

The Plan: April 16-30

You've got a strong thesis, a real network, and a window right now where the right moves compound. Three tracks running in parallel. All three matter. None of them wait on anyone.

Track 1: Learn the Landscape

Today proved you can build rapport and hold a room. Justin said the coffee was "critical." That's your strength. The next step is pairing that presence with the depth to walk someone through the thesis cold, on your feet, in your own words.

This week, start here:

  • Read the Claude Code docs end to end. Not skim. Read.
  • Subscribe to the Every newsletter. They've been covering agent workflows, AI adoption, and the shift in how work gets done. That's the water you're swimming in.
  • Work through this playbook. The full research archive is below: competitive landscape, employer targeting, policy research, Oregon landscape, target contacts. This is the briefing book behind the decks.

Start with these three sections from this page:

  1. Read The Thesis. The hypothesis in one page. Internalize this first.
  2. Read The Market. 23 competitors mapped. Nobody owns the upper-right quadrant. 12-18 month window.
  3. Read The Business Model. How the consultancy flywheel works. Employer pain tiers. Veteran-to-role matching.

Then go deeper:

  1. Read Oregon Landscape. WorkSource, state funding, local orgs, employer list.
  2. Read Curriculum Model. Why Alpha School's 2.3x model matters. LearnAIR's gaps. What we build on top.
  3. Reference Conversation Scripts. Pull these up before any employer, VSO, or veteran conversation.
  4. Reference Target Contacts. Five named Oregon employers with decision-maker profiles.

By end of week 2, the bar is: someone asks you "why veterans and AI, and what have you found?" and you give a 5-minute answer that hits the competitive gap, the funding landscape, the employer demand, and your network advantage. Your words, your conviction.

What from this list do you already feel strongest on? That's where you spend the least time. Go deep on the gaps.

Track 2: Have Conversations and Find Warm Leads

Your Rolodex is real. You've named Honor Foundation, BreakLine, Deloitte Veterans, BetterUp, Mission 22. Those aren't cold leads. Those are people who know you or are one degree away.

Two purposes here. First, these conversations sharpen the thesis. You learn what they're seeing, what's working, what's not. Second, some of these lead somewhere: a partnership, a role, a paid engagement. Both outcomes are good.

This week: pick two, reach out.

Not pitching. Not "we're doing a veteran program." Just: "I'm working on veterans and AI upskilling. Curious what you're seeing in your world." Let the conversation breathe.

Suggested first two:

  1. Honor Foundation. Derek coached there. You have a warm intro. They work with transitioning senior military leaders.
  2. BreakLine. Tech career transitions for underrepresented groups including veterans. They'd have signal on what employers are actually asking for.

Next week: two more conversations, plus start the job search.

Put feelers out for roles. Your background (Marine officer, Deloitte consulting, therapy/coaching, Forest Service) positions you for a range of roles.

LinkedIn, your network, direct outreach. The goal isn't to land something in two weeks. The goal is to have motion so you're not waiting by the phone.

These conversations are yours regardless of what happens with LearnAIR. If they say yes, you walk in with fresh network intel and external validation. If they say no or stall, you already have momentum.

Which two from your network feel like the most natural first calls? Go with your gut there.

Track 3: Push the Thesis Forward

LearnAIR is one vehicle for the veterans + AI thesis. It might be the right one. It might not. Either way, the thesis gets sharper by working it.

Trailheads to pursue this week:

  1. WorkSource Oregon. Teresa confirmed state funding is unlocked for veterans transitioning. This is the fastest path to a funded cohort. You don't need anyone's permission to learn more. Start with the WorkSource Oregon website and see if there's a veteran services coordinator for Central Oregon. See the Oregon section below for the full picture.
  2. Mission 22. Justin mentioned Magnus and their prayer leader are interested in training. This is a lead that exists independent of any one company. What does Mission 22 actually need? What's their capacity?
  3. F3 Nation as a distribution channel. You're already embedded. 18-20 guys at Riverbend, 3-5 in Redmond. Some of them are veterans. Some of them know veterans. That's organic distribution for a pilot cohort that costs nothing to test. Just: "hey, I'm putting together a thing for vets getting into AI. Know anyone who'd be interested?"
  4. The "agentic curriculum" angle. Justin disclosed their existing build series (custom GPT, n8n, make.com) is already obsolete. The tools are moving so fast that anyone who built curriculum 18 months ago is behind. You learning Claude Code, agent workflows, MCP: that's not just personal development. That's building the muscle to design what comes next. When you can demo a working agent workflow and explain why it matters for a veteran transitioning into tech, that's the pitch no deck can deliver.
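Item 4 above hinges on being able to show, not just describe, what "a human supervising an AI workflow" means. Here is a minimal sketch of that shape: an agent produces a first pass, and an explicit review gate decides whether it ships. Every name in it (`draft_summary`, `review_output`, the checks, the ticket string) is an illustrative stand-in, not any real framework's API.

```python
# Minimal sketch of a supervised agent workflow: the "agent" drafts,
# a separate supervision layer reviews. All names are hypothetical.

def draft_summary(ticket: str) -> str:
    """Stand-in for an AI first pass (a model call, in a real workflow)."""
    return f"DRAFT: proposed resolution for '{ticket}'"

def review_output(draft: str, checks: list) -> dict:
    """The supervision layer: run explicit checks before anything ships."""
    failures = [name for name, check in checks if not check(draft)]
    return {"approved": not failures, "failed_checks": failures}

checks = [
    ("non_empty", lambda d: bool(d.strip())),
    ("flagged_as_draft", lambda d: d.startswith("DRAFT:")),
]

draft = draft_summary("customer refund request (example)")
verdict = review_output(draft, checks)
print(verdict)  # the reviewer decides; the agent never self-approves
```

The design point is the one the pitch needs: the agent and the approval path are separate, named pieces, so "who governs the outputs" has a concrete answer.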

By end of week 2, the bar is: you've talked to at least one person at WorkSource or Mission 22, you've floated the idea to someone in your F3 network, and you've built or walked through at least one agent workflow yourself.

Where LearnAIR Fits

LearnAIR is on standby until next week. Here's the honest picture.

The Week at a Glance

April 16-18: Start here, one per track

  • Learn: Read The Thesis and The Market sections below.
  • Do: Send one message to your Honor Foundation contact.
  • Do: Look up WorkSource Oregon veteran services for Central Oregon.

Then build from there:

| Action | Track |
| --- | --- |
| Read Claude Code docs | Learn |
| Subscribe to Every newsletter | Learn |
| Read Business Model + Oregon sections | Learn |
| Reach out to BreakLine | Conversations |
| Float the idea to 1-2 guys in F3 network | Thesis |

April 21-25 (next week)

| Action | Track |
| --- | --- |
| Two more network conversations | Conversations |
| Start job search: LinkedIn, direct outreach | Conversations |
| Build or walk through first agent workflow | Thesis |
| Follow up with Mission 22 | Thesis |
| Read Curriculum + Scripts sections | Learn |
| LearnAIR decision expected | LearnAIR |

You have a strong thesis, a real network, and three months to make something happen. The worst thing you can do right now is wait. The best thing you can do is be in motion on all three tracks so that when the next conversation happens, whether it's with Justin, with Honor Foundation, or with a hiring manager, you're not presenting a deck. You're telling them what you've already started doing.

What feels like the highest energy move for you right now? Start there.

Stay moving.

The Thesis

Strategic clarity. Read this first.

The Hypothesis

Employers will pay for veterans trained in AI operations supervision.

Not "veterans can learn AI." Not "mission command maps to AI supervision." Not "there's a gap in the competitive landscape." Those are supporting arguments. The load-bearing question is whether someone writes a check.

The One Proof Point

3 employer discovery conversations. The question isn't "would you hire an AI operations supervisor?" It's:

  1. Where are you at with AI adoption?
  2. What's working, what's not?
  3. Where's the gap between what your team can do and what you need them to do?
  4. If that gap had a name, what would you call it?
  5. What would it be worth to close it?

If three different employers describe a problem that sounds like "my people are using AI but nobody's governing the outputs," we've validated demand. If they describe something different, the thesis adapts.

The Consultancy Framing

We are not employees. We are not volunteers. We are a consultancy that owns its IP.

Key People + Competencies

| Person | What They Bring | Open Question |
| --- | --- | --- |
| Mike Deal | Marine vet, Deloitte, USFS facilitator, BreakLine network | Technical depth (ramping) |
| Justin Coats | LearnAIR brand, SDVOSB, OpenAI partnership, 100+ clients | Is the vet program his priority or Teresa's? |
| Teresa Coats | Operations, sport science, passion for vet program | The execution gate. More aligned than Justin. |
| Trisha Sargent | L&D (Consumer Cellular), Easter Seals HVRP | Advisor/guide role, not operator |
| Bobby Napiltonia | AppExchange creator, Twilio first CRO, enterprise GTM | Not engaged directly yet. Massive if activated. |

What the Program Needs That's Still Weak

The Market

23 competitors mapped. The gap is real. The window is 12-18 months.

The Differentiation

Every veteran AI program in 2026 does one of two things: it teaches veterans to use AI tools (Google certs, OpenAI sprints, VetsinTech workshops), or it helps veterans get hired (Hiring Our Heroes, BreakLine, FourBlock, ACP). None trains veterans to supervise AI-enabled workflows.

This is the exact skill that enterprise AI adoption demands and that military service uniquely prepares people to deliver. Veterans already operate in mission command environments: decentralized execution under commander's intent, where human judgment governs semi-autonomous systems under uncertainty.

The market has AI literacy programs and veteran career programs. It has no veteran AI leadership program. That's the gap.

Competitor Landscape (Key Players)

Veteran-Specific Programs

Hiring Our Heroes (U.S. Chamber)

Corporate Fellowship at Google, Microsoft, Salesforce, Amazon. Some rotations touch AI/data. Strong brand, weak AI depth. AI exposure is incidental to placements.

BreakLine Education

Free remote immersive sprints. Hiring partners include Anduril, Meta, Google, Palantir. Zero AI technical depth. Career accelerator, not a training program.

VetsinTech / Vets in AI

Dedicated "Vets in AI" program launched 2025. ML, data analytics, AI ethics. Events with Nvidia, Microsoft, Google. Closest veteran-specific competitor on AI depth. But focused on making veterans into AI practitioners, NOT training them to supervise workflows.

OpenAI Academy + Veterans Forge

Pilot: 4-hour hands-on AI sprint (Mar 2026, 100 cap). Free ChatGPT Plus for transitioning veterans. Tool literacy, not workflow supervision. Still in pilot phase, tiny scale.

Accenture Veterans Initiative (2025) HIGHEST THREAT

New national initiative targeting Officers and Senior Enlisted. LearnVantage platform with "agentic AI" courses. White House AI Education Taskforce participant. But: it's a corporate hiring funnel, not an independent training program. Veterans become Accenture employees, not broadly skilled AI supervisors.

General AI Upskilling Programs

Google AI Professional Certificate

Coursera-hosted, ~$200 total. Free for veterans via Google Launchpad. Covers AI fundamentals and prompt engineering. No workflow management or supervision framework.

Code Platoon (GI Bill + VET TEC)

AI + Fullstack Engineering bootcamp. VA-approved. Strongest current competitor combining veteran focus + AI content + government funding. But trains coders who use AI, not managers who supervise AI workflows.

VET TEC 2.0

Reauthorized Dec 2024, funded from Oct 2025. Still not operational. VA has not approved any providers yet. ~4,000 spots/year expected. Applications may open June 2026. When it reopens, approved providers will have a massive distribution advantage.

Positioning Matrix

Two axes: veteran-specificity (horizontal) and AI depth (vertical).

Competitive Risks

| Risk | Severity | Mitigation |
| --- | --- | --- |
| Hiring Our Heroes adds "AI Management" track | HIGH | Move fast. Establish "mission command" as the recognized vocabulary. Build curriculum IP (case studies, simulations) that can't be replicated by bolting a webinar onto a fellowship. |
| Accenture scales its veteran AI initiative | HIGH | Accenture funnels to Accenture. Emphasize independence: "We don't train you for one employer, we train you to lead AI operations anywhere." |
| VET TEC 2.0 approval gatekeeping | MED | File VET TEC 2.0 provider application immediately. Simultaneously pursue GI Bill approval as parallel path. |
| "AI supervisor" role doesn't materialize as distinct job | MED | Frame credential as additive, not replacement. "AI Operations Leadership" is a skill overlay on existing management. |
| Google/OpenAI deepen free veteran AI training | MED | Depth beats breadth. Position as post-certification: "You got your Google AI cert. Now learn to lead AI operations." |

The window is 12-18 months. Accenture's veteran initiative and VET TEC 2.0 will reshape the landscape by late 2026 or early 2027. The advantage is being first to name the role (AI operations supervisor), first to connect it to a military framework (mission command), and first to build a credentialing program around it. Speed and curriculum depth are the moat.

Target Contacts: First 5 Employers

Real people at real companies. The contact phase starts here.

1. Les Schwab Tire Centers

Brian Buch, VP Information and Digital Services

35-year Les Schwab veteran who owns IT and digital strategy. Recently earned all 12 AWS certifications including AI Practitioner. Actively thinking about AI/data strategy at enterprise scale. HQ Bend, OR. ~8,000 employees, 500+ stores. Warmest lead.

2. St. Charles Health System

Rebecca Berry, MBA, VP & Chief Human Resources Officer

At St. Charles since 2007. Led their workforce turnaround (reducing reliance on traveling nurses). AI workforce readiness lands squarely in her domain. HQ Bend, OR. Largest healthcare provider in Central Oregon, ~3,800 employees.

3. BASX Solutions / AAON

Mark Nordstrom, EVP Operations

15+ years driving operational excellence at a company that builds cooling infrastructure for hyperscale data centers (Meta, Apple). Based in Bend. HQ Redmond, OR.

4. Perkins & Co

Jared Holum, CPA, President

Largest locally-owned accounting firm in Oregon, 165+ employees. Professional services firms face the sharpest gap between individual AI usage and firm-wide coordination. Portland, OR.

5. Dutch Bros

Venki Krishnababu, Chief Technology and Information Officer

Joined Dec 2024 from lululemon (7 years as CTO). Nearly 30 years of enterprise technology leadership. A new CTIO building out tech strategy at a fast-growing company. 900+ locations. Grants Pass, OR (NYSE: BROS).

Outreach Template

"I'm building a program that trains veterans to manage AI operations. I'm looking for employers feeling the gap between individual AI productivity and organizational coordination. Would you be open to a 15-minute call?"

Don't send all 5 at once. Start with the warmest (Brian Buch at Les Schwab) and iterate based on response.

Conversation Scripts

Three scripts for three audiences. 25-35 minutes each. Signal extraction, not selling.

Script A: Employer / Hiring Manager

Target: VP of Operations, Program Manager, or HR Director. Prioritize orgs with existing veteran hiring commitments.

"Thanks for making time. We're exploring a workforce training concept focused on veterans transitioning into AI-adjacent roles. Not building AI models, but supervising AI systems in operational settings: reviewing outputs, catching errors, managing workflows where AI is doing the first pass and a human makes the final call. We're early. We haven't built anything yet. And we're talking to people like you to understand whether this solves a real hiring problem. Nothing to buy, no pitch. I just want to learn from your experience."

Questions:

  1. Where is AI showing up in your operations today, and where do you expect it in the next 12-18 months?
  2. When you think about the people who'll sit between the AI system and the final decision, what does that role look like?
  3. How hard has it been to hire for roles that require judgment under ambiguity?
  4. Have you ever hired a veteran specifically because of their operational background? How did that work out?
  5. If a training program produced candidates who could supervise AI-assisted workflows, and those candidates were veterans with security clearances, what would that pipeline be worth to you per hire?
  6. What would you need to see from a program like this before you'd consider hiring from it?

Commitment test: "If we ran a pilot cohort of 10-15 veterans this summer, trained specifically on AI workflow supervision with a capstone project, would you be willing to interview the top graduates? No obligation to hire, just review their work and give us feedback on readiness."

Green Flags

  • Names a specific AI use case already in production
  • Describes the "human in the loop" problem without you prompting it
  • Puts a dollar figure on hiring cost or pipeline value
  • Volunteers to introduce you to a colleague
  • Asks when the first cohort graduates

Red Flags

  • "AI is still a few years out for us"
  • Only talks about needing engineers or PhDs
  • Enthusiastic but won't commit to reviewing candidates
  • Defers everything to a hiring freeze or budget cycle

Script B: Veteran-Serving Organization Leader

Target: Program Director or Executive at Hiring Our Heroes, BreakLine, a state workforce board, a Vet Center, or a VSO with a career transition program.

"Appreciate you taking this. We're designing a training program that would prepare transitioning service members for AI workflow supervision roles. The jobs where a person reviews, validates, and manages AI system outputs in fields like healthcare operations, government services, and logistics. We're not trying to turn veterans into software engineers. We're focused on the oversight and judgment layer. Before we build anything, we want to learn from organizations like yours."

Questions:

  1. What are the top three job categories your transitioning veterans are landing in today? Where do they struggle most in placement?
  2. When employers come to you, what skills or qualities do they ask for most often? Has that changed in the past two years?
  3. How do you evaluate new training programs before recommending them to your veterans?
  4. Have you seen AI-specific training programs targeting veterans? What's worked and what's fallen short?
  5. If a program could demonstrate 70%+ of graduates landed relevant roles within 90 days, what would a partnership look like from your side?
  6. What's the typical timeline for your org to endorse or partner with a new workforce program?

Commitment test: "If we designed a 4-6 week pilot this summer and needed 10-15 veteran participants, would your organization help us recruit them? Even informally? And would you be open to a follow-up conversation once we have a curriculum outline?"

Script C: Transitioning Veteran

Target: E-5 to O-4, 6-15 years of service, within 12-24 months of transition. Any branch, especially operations, intelligence, logistics, medical, or maintenance backgrounds.

"Thanks for your time. I know transition is a busy period. We're building a training program for veterans that focuses on AI workflow supervision. That means: jobs where AI does a first draft or a first pass, and a human reviews it, catches mistakes, and makes the final call. Think quality assurance for AI systems, not programming them. It maps pretty directly to the kind of process discipline and judgment you've been using in the military."

Questions:

  1. What's your plan right now for your next career step? How's the search going?
  2. When you hear "AI," what comes to mind? Do you see it as relevant to the kind of work you want to do?
  3. In your military role, how often did you review someone else's work, validate information, or catch errors in a process?
  4. If a program could help you land a role paying $65K-$85K in 90 days, what would that need to look like for you to invest 4-6 weeks?
  5. What have you already tried for career transition? TAP, SkillBridge, bootcamps? What worked and what didn't?
  6. If this program cost $2,000 out of pocket, or was free if covered by GI Bill, which path would you pursue?

Commitment test: "We're planning a pilot cohort this summer. If you qualified, would you apply? And, separate question, can you think of two or three people who'd also be interested?"

90-Day Timeline with Kill Criteria

Month 1: Validate or Kill (Weeks 1-4)

15-20 validation conversations across employers, VSOs, and veterans. Pattern analysis. Draft v0.1 curriculum outline. Identify pilot partners. Research funding (WIOA, SkillBridge, state workforce board).

Day 30 Kill Criteria

Stop and reassess if:

  • Fewer than 3 of 8+ employer conversations produced a concrete next step
  • No VSO expressed willingness to help recruit a pilot cohort
  • Veterans consistently said "I don't see how this is different"
  • Zero employers could name an AI-supervision-adjacent role they're hiring for

Month 2: Build and Launch Pilot (Weeks 5-8)

10-15 participants, 4 weeks, remote-first. Screen applicants. Onboard. Teach. Collect feedback after every session. Adjust in real time. Submit first funding application.

Month 3: Complete, Measure, Decide (Weeks 9-12)

Capstone projects presented to employer panel. Compile results. Write the Pilot Results Brief (your primary credibility asset). Go/no-go on Cohort 2.

Day 90 Scale Decision

| Signal | Strong (Scale) | Mixed (Iterate) | Weak (Pause) |
| --- | --- | --- | --- |
| Completion rate | 80%+ | 60-79% | <60% |
| Employer "would hire" | 3+ employers | 1-2 | 0 |
| Participant NPS | 50+ | 20-49 | <20 |
| 90-day placement | 50%+ | 25-49% | <25% |
| Funding for next cohort | Secured | Pending | No leads |

Oregon Landscape

Veterans, AI, employers, and funding in your backyard.

Top Findings

  1. Oregon-NVIDIA $10M AI Partnership. Signed April 2025 by Gov. Kotek + Jensen Huang. AI Ambassador Program on campuses. An AI operations supervisor program aligns directly with the state's stated AI workforce priority.
  2. East Cascades Works is the local workforce board. WIOA-designated entity for Bend/Deschutes County (10 counties). The right partner for federal workforce funding. HQ in Bend.
  3. Health Elements AI is in Bend. AI-powered clinical data abstraction. Real AI company, BVC finalist. A concrete local employer for the "would you hire an AI-skilled veteran?" conversation.
  4. Prineville data centers are 35 min from Bend. Meta (3.2M sq ft, $2B+) + Apple (3.2M sq ft). ~300 permanent jobs. Infrastructure jobs that need supervision.
  5. COCC already has veteran infrastructure. STRIVE Program (free 8-week veteran entrepreneurship, Syracuse IVMF partnership). Justin won STRIVE there. Neither COCC nor OSU-Cascades offers AI operations training. Gap.

Veteran-Serving Organizations (Central Oregon)

Employer Target List (30 companies)

Central Oregon (Priority 1)

| Company | Industry | Why |
| --- | --- | --- |
| St. Charles Health System | Healthcare, ~4,500 emp | Largest employer in region. Clinical AI + compliance. |
| BASX Solutions | Mfg (data center cooling), ~500+ emp | Hyperscale clients demand process rigor. |
| Les Schwab | Auto services, 500+ locations | Already bought ChatGPT Team. Warmest lead. |
| Health Elements AI | AI/Healthcare, 10-30 emp | Actual AI company. Would hire veterans directly. |
| Dutchie | Cannabis tech, 100-300 emp | Fast-growing SaaS, heavy AI users. |
| Meta (Prineville) | Data center, ~150 permanent | 3.2M sq ft, $2B+ invested. |
| Apple (Prineville) | Data center, ~150 permanent | 3.2M sq ft campus. |

Portland Metro (Priority 2)

| Company | Industry | Why |
| --- | --- | --- |
| Intel (Hillsboro) | Semiconductor/AI, 22,300 emp | Largest private employer in OR. AI at massive scale. |
| Nike (Beaverton) | Sportswear/tech, ~12,000 HQ | AI across design, supply chain, marketing. |
| Daimler Trucks NA | Manufacturing, 34,000 emp | Heavy manufacturing + AI in logistics. |
| Northrop Grumman | Defense R&D, 500+ emp | Defense contractor, hires veterans. |
| Dutch Bros Coffee | F&B, 16,500 emp | 900+ locations. Same distributed challenge as Les Schwab. |
| Perkins & Co | Accounting, 200+ emp | Professional services = acute AI pain. |

Funding Sources

Active / Accessible

| Source | Amount | Details |
| --- | --- | --- |
| ODVA Veteran Services Grant | $972K (2025-27) | Competitive grants to nonprofits. Contact: Brenna Bandstra, brenna.bandstra@odva.oregon.gov |
| JVSG | $2.4M | 100% federal (DOL-VETS). Funds LVERs/DVOPs. |
| Oregon WIOA Title I | Part of ~$2.9B national | Priority of service for veterans. Administered by OR Employment Dept. |
| Oregon-NVIDIA AI Partnership | $10M state | AI Ambassador Program, campus AI integration. |

Closing / Limited

| Source | Amount | Details |
| --- | --- | --- |
| Future Ready Oregon | $200M total (nearly spent) | Must spend by end of 2026. Veterans explicitly listed as target. |

Key Stats

The Business Model

Employer pain tiers, the consultancy flywheel, and veteran-to-role matching.

The Core Problem

Individual workers bought ChatGPT and got 10x more productive at their tasks. But the organization didn't change its processes, workflows, approval chains, or quality controls. Now you have powerful individuals operating in an organizational vacuum. Someone needs to supervise the AI ecosystem. That's the AI Ecosystem Supervisor.

Key stats: 28% of workers currently use generative AI at work. More than half use AI without employer approval. 70% have never received AI training from their employer. 87% of healthcare workers say their employer lacks clear AI policies. 64% of workers have presented AI-generated work as their own.

Employer Pain Tiers

Tier 1: Acute Pain. Bought AI, don't know how to manage it.

Companies with ChatGPT Team/Enterprise licenses but no governance. SMBs that did a workshop and now ask "now what?" Professional services firms with partners using AI for drafts, junior staff using it unsupervised. Healthcare organizations (87% lack AI policies. HIPAA + AI = liability).

Tier 2: Structural Pain. High process complexity + AI adoption pressure.

Manufacturing (quality control, safety documentation). Healthcare systems (clinical documentation, billing). Construction (permitting, compliance). Logistics (distributed operations).

Tier 3: Growth Pain. Scaling with AI but struggling with coordination.

Tech startups (small teams using AI heavily but no dedicated AI ops role). Data center adjacent companies. E-commerce/DTC brands.

Tier 4: Succession Pain. Baby boomer handoff + AI transition.

Central Oregon trades (HVAC, plumbing, electrical). Professional practices (dental, veterinary, small law). Agriculture/ranching. Hospitality/tourism.

The Consultancy Flywheel

  1. LearnAIR (or similar) trains a company on ChatGPT Team
  2. You do an AI Ecosystem Assessment
  3. Build governance framework + workflows
  4. Company realizes they need a full-time person to maintain it
  5. Place a veteran as AI Ecosystem Supervisor
  6. Provide 90-day transition support
  7. Consulting transitions to quarterly retainer. Repeat at next company.

Every consulting engagement generates demand for a veteran placement. Every veteran placement demonstrates the model works, generating referrals for more consulting.

Engagement Pricing

| Phase | What | Price |
| --- | --- | --- |
| Phase 1: AI Readiness Assessment | Audit AI usage, map workflows, interview staff. 1-2 weeks. | $5,000-10,000 |
| Phase 2: Ecosystem Build-Out | Governance policies, workflow templates, monitoring, training. 4-8 weeks. | $15,000-30,000 |
| Phase 3: Ongoing Retainer | Monthly health check, new use cases, staff training, quarterly exec briefing. | $3,000-5,000/mo |
| Phase 4: Veteran Placement | Source from program, 90-day onboarding support. | 15-20% first-year salary |

Veteran-to-Employer Matching

| Tier | Military Rank | Role | Salary Range |
| --- | --- | --- | --- |
| Tier 1 | E4-E6 (Tactical NCOs) | AI Operations Coordinator | $55,000-75,000 |
| Tier 2 | E7-E9 + O1-O3 | AI Ecosystem Manager | $80,000-120,000 |
| Tier 3 | O4+ (Senior Officers) | Chief AI Officer / VP AI Ops | $130,000-200,000+ |

The naming advantage: No one has standardized what to call this role. The program that names it owns the category. The military rank mapping gives employers an instant credibility signal.

The Curriculum Model

Why Alpha School's 2.3x model matters. LearnAIR's gaps. What gets built on top.

Alpha School: The Load-Bearing Insight

Software alone = zero acceleration.

Alpha School's homeschool version uses the same software platform but without guides, incentives, or peers. It produces only 1x learning velocity, no acceleration at all. The load-bearing elements are: guide relationships + incentive system + peer effects + physical environment. The AI handles content delivery. The humans handle motivation, accountability, and judgment.

Alpha School puts students in the top 0.1% on standardized tests (97th-99th percentile MAP, 1535 average SAT) with ~2 hours of AI-personalized academics per day. The rest is project-based learning. Results: 2.3x-2.6x national average growth rate.

For adults, the case is even stronger.

LearnAIR's Foundation Series: What's Good, What's Missing

LearnAIR's Foundation Series is 3 live sessions x 1.5 hours each. Groups of 5-20. It covers ChatGPT basics, interface, prompting (DIRECT framework), personas, Custom GPTs, and agent mode. It produces real results: "John completed 4-6 months of work in one week."

Eight Gaps

  1. No organizational coordination layer. Everything is individual. Zero discussion of what happens when 50 people each build their own setup.
  2. ChatGPT-only, platform-locked. No model-agnostic thinking. No Claude, Gemini, or open-source.
  3. No file system / version control. No git, no version history.
  4. No CLI / terminal exposure. Everything is GUI.
  5. No security / governance depth. Privacy toggles, but no output review workflows or approval chains.
  6. No workforce transition framework. A skills course, not a transition program.
  7. No measurement / outcomes. No framework for measuring ROI.
  8. Troubleshooting is reactive, not systemic. Hallucinations treated as bugs, not architectural realities.

What We Build On Top

| LearnAIR Teaches | We Add |
| --- | --- |
| Build a digital employee | Manage a team of digital employees |
| Personal productivity | Organizational productivity |
| ChatGPT-specific skills | Model-agnostic principles |
| GUI workflows | CLI + voice-first workflows |
| Individual personas | System-level governance |
| "Here's what AI can do" | "Here's how to be responsible for what AI does" |
| 18 minutes of hands-on | Weeks of applied experiential learning |
| Certificate of completion | Job placement tied to outcomes |

The Proposed Structure

Compress instruction. Expand practice. Never remove the guide.

Based on Alpha's architecture, applied to a veteran learning AI supervision:

| Time | Activity |
| --- | --- |
| 0:00-0:05 | Daily Dash: review progress, today's targets, adaptive path |
| 0:05-0:30 | Concept Module: AI-personalized lesson, mastery-gated |
| 0:30-0:50 | Hands-On Lab: supervised practice in sandboxed AI environment |
| 0:50-1:00 | Break |
| 1:00-1:20 | Scenario Drill: branching decision trees, judgment checks |
| 1:20-1:35 | Spaced Review: quick-fire on material at risk of being forgotten |
| 1:35-1:50 | Mentor Check-in (2x/wk) or Peer Discussion (3x/wk) |
| 1:50-2:00 | Reflection + Tomorrow Preview |

After the 2-hour instruction block: hands-on practice with a guide present. Ratio: 1:2 or 1:3. For every hour of AI-delivered instruction, 2-3 hours of supervised practice. This maps to the 70-20-10 model: 70% experiential, 20% social, 10% formal.
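The ratio math above can be sanity-checked with the section's own numbers: 10 hr/wk of instruction at a 1:2 or 1:3 practice ratio. A quick back-of-envelope run (nothing here is new data, just arithmetic on the figures already stated):

```python
# Weekly-hours arithmetic for the proposed structure:
# 10 hr/wk AI-delivered instruction, supervised practice at 1:2 to 1:3.
instruction_hours = 10

for ratio in (2, 3):
    practice_hours = instruction_hours * ratio   # supervised, hands-on
    total = instruction_hours + practice_hours
    experiential_share = practice_hours / total  # compare to the 70% target
    print(f"1:{ratio} -> {practice_hours} practice hr, "
          f"{total} total hr/wk, {experiential_share:.0%} experiential")
```

The two ratios yield 30-40 total hr/wk, with the experiential share landing at 67-75%, bracketing the 70% figure from the 70-20-10 model.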

Don't build a 40-hour/week classroom program. Build a 10-hour/week AI-personalized instruction program with 20-30 hours/week of supervised hands-on practice. That's the Alpha model for adults.