CubeSTEM Digital Twin — V3.4 Mission Pack

Space Mission Challenge Pack

A browser-based CubeSat + space robotics mission-control challenge pack for classrooms, workshops, STEM labs, and pilot demos. Students operate virtual rover and satellite assets through realistic mission scenarios, diagnose telemetry divergence, and produce evidence-based mission reports.

Two Interactive Missions · Telemetry Divergence Engine · 100-Point Scorecard · Competition Mode

Recommended flow

Full challenge sequence (classroom or workshop)

Follow these steps in order for a coherent Space Mission Challenge story. Step 4 is the advanced technical layer — optional for younger cohorts, expected for technical reviewers and university-style demos.

  1. Learn mission context

    Open the challenge pack hero and teacher runbook — roles, constraints, and how scoring works (formative, local-only).

    Jump to teacher runbook

  2. Run Lunar Rover Rescue

    Interactive rover mission with telemetry divergence, scorecard, and in-mission realism panel (V3.6B).

    Open Lunar Rover Rescue

  3. Run CubeSat–Rover Relay

    Pass-window relay mission; compare command queue and download behavior with the rover mission.

    Open CubeSat–Rover Relay

  4. Mission Realism Lab (advanced technical layer)

    Parameter-driven teaching models: contact windows, link budget, packet protocol, ADCS coupling — same narrative as the in-mission panels, deeper knobs. Not certified RF/orbit/ADCS.

    Open Mission Realism Lab

  5. Diagnose telemetry divergence

    Use expected-vs-observed comparison inside either mission; tie findings to scorecard evidence dimensions.

    Back to challenge pack overview

  6. Generate a local mission report

    Manually paste highlights from the missions or the Realism Lab into the optional realism field — no auto-fill, no backend.

    Open report generator

Two Mission Challenges

Both missions include the Telemetry Divergence Engine v0 for expected-vs-observed telemetry comparison, and a unified 100-point Mission Challenge Scorecard across 7 dimensions.
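The expected-vs-observed comparison at the heart of the Telemetry Divergence Engine can be sketched as a simple per-channel threshold check. This is illustrative only — the channel names and the 10% tolerance here are assumptions, not the engine's actual rule set:

```python
def flag_divergence(expected: dict, observed: dict, tolerance: float = 0.1) -> list:
    """Return telemetry channels whose observed value strays more than
    `tolerance` (as a fraction of the expected value) from prediction."""
    flags = []
    for channel, exp in expected.items():
        obs = observed.get(channel)
        if obs is None:
            flags.append((channel, "missing"))  # channel dropped out entirely
        elif exp != 0 and abs(obs - exp) / abs(exp) > tolerance:
            flags.append((channel, f"off by {obs - exp:+.2f}"))
    return flags

# Example: battery voltage sags more than expected mid-traverse,
# while wheel speed stays within tolerance.
expected = {"battery_v": 12.0, "wheel_rpm": 30.0}
observed = {"battery_v": 10.2, "wheel_rpm": 29.5}
print(flag_divergence(expected, observed))  # only battery_v is flagged
```

A rule-based check like this matches the deterministic, non-AI design boundary stated later in this pack.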

Lunar Rover Rescue Mission

Command a virtual rover through a CubeSat relay window. Manage battery life, diagnose faults from telemetry, and produce an evidence-based mission report. Includes Telemetry Divergence Engine v0, 100-point Mission Challenge Scorecard, and an in-mission Mission realism panel (V3.6B) for packet health, link quality, and teaching-grade ADCS/link coupling copy.

20–30 minutes · 2–4 students
Rover command · Battery management · Fault diagnosis · Evidence reporting
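Power-aware commanding in the rover mission comes down to one rule: never accept a command that would drop the battery below a safety floor. A minimal sketch, assuming illustrative per-command energy costs (the mission's real costs and thresholds may differ):

```python
# Hypothetical energy cost per command, in watt-hours (illustrative values).
COMMAND_COST_WH = {"drive_1m": 0.8, "drill_sample": 2.5, "transmit": 1.2}

def try_command(battery_wh: float, command: str, floor_wh: float = 5.0):
    """Return (accepted, remaining battery). Commands that would drop the
    battery below `floor_wh` are rejected — the Safety Officer's abort rule."""
    cost = COMMAND_COST_WH[command]
    if battery_wh - cost < floor_wh:
        return False, battery_wh  # rejected: battery state unchanged
    return True, battery_wh - cost

battery = 10.0
for cmd in ["drive_1m", "drill_sample", "drill_sample"]:
    ok, battery = try_command(battery, cmd)
    print(f"{cmd}: {'OK' if ok else 'REJECTED'} (battery {battery:.1f} Wh)")
```

The second drill attempt is rejected because it would breach the 5 Wh floor — the kind of recoverable failure the facilitation tips encourage students to learn from.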

CubeSat–Rover Relay Mission

Manage command uploads and telemetry downloads around satellite pass windows. Balance packet priority, queue management, and time-limited contacts. Includes Telemetry Divergence Engine v0, 100-point Mission Challenge Scorecard, and an in-mission Mission realism panel (V3.6B) using the same contact-window, link-budget, and packet-protocol teaching models as the Mission Realism Lab.

25–35 minutes · 2–4 students
Pass window planning · Command queuing · Telemetry download · Priority trade-offs
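The relay mission's core trade-off — draining a priority queue inside a time-limited contact — can be sketched with a heap. The packet names, durations, and lower-number-is-more-urgent convention are assumptions for illustration:

```python
import heapq

def drain_queue(packets: list, window_s: float) -> list:
    """Teaching-grade sketch: send the highest-priority packets that fit in a
    time-limited contact window. `packets` are (priority, duration_s, name)
    tuples; a lower priority number means more urgent (assumed convention)."""
    heapq.heapify(packets)
    sent, elapsed = [], 0.0
    while packets:
        prio, duration, name = heapq.heappop(packets)
        if elapsed + duration > window_s:
            continue  # too big for the remaining window — skip, try the next packet
        elapsed += duration
        sent.append(name)
    return sent

queue = [(2, 40.0, "science_images"), (1, 10.0, "health_beacon"), (3, 30.0, "housekeeping")]
print(drain_queue(queue, window_s=60.0))  # housekeeping no longer fits
```

Students face the same decision manually: a 60-second pass forces them to choose which low-priority packets to defer to the next contact.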

Advanced layer — Mission Realism Lab (after missions)

The challenge pack is now paired with a parameter-driven teaching layer: students adjust orbit, pass, link, and packet knobs in the lab, then see applied realism summaries inside each mission. This is still browser-local and teaching-grade — not certified RF or orbital analysis.

Open Mission Realism Lab →

Suggested Classroom Flows

45-Minute Class Flow

  1. 5 min: Briefing — introduce mission story and team roles
  2. 5 min: Planning — review constraints and assign roles
  3. 25 min: Mission execution — run Lunar Rover Rescue or CubeSat–Rover Relay
  4. 10 min: Debrief — review scorecard, discuss decisions, capture evidence

90-Minute Workshop Flow

  1. 10 min: Space robotics context — why relay windows matter
  2. 15 min: Team formation + role assignment
  3. 20 min: Mission 1 — complete Lunar Rover Rescue with full scorecard
  4. 10 min: Break + interim debrief
  5. 25 min: Mission 2 — complete CubeSat–Rover Relay
  6. 10 min: Final debrief — compare missions, lessons learned

Half-Day Challenge Day Flow

  1. 30 min: Opening — context, rules, team formation (4–6 teams)
  2. 45 min: Mission Sprint 1 — all teams run Lunar Rover Rescue
  3. 15 min: Inter-team sharing + leaderboard snapshot (manual)
  4. 45 min: Mission Sprint 2 — all teams run CubeSat–Rover Relay
  5. 30 min: Evidence report completion
  6. 45 min: Final presentations + teacher assessment

Teacher Runbook

Facilitation guide for running the Space Mission Challenge Pack in classrooms, workshops, and challenge days. All scoring is local and formative — share manually via screenshots or exported evidence text.

Team Roles

Assign one role per student (teams of 2–4 can combine roles)

  • Mission Commander

    Final decision authority, timeline management, go/no-go calls

  • Rover Operator

    Execute movement and science commands, monitor power state

  • Telemetry Analyst

    Watch expected vs observed divergence, flag anomalies

  • Communications Officer

    Track pass windows, manage command/telemetry queues

  • Safety Officer

    Monitor battery critical thresholds, abort triggers

  • Evidence Reporter

    Capture key decisions, screenshot evidence, draft report

Materials Needed

  • Devices with modern browsers (Chrome/Edge/Firefox/Safari)
  • Projector or screen for mission timeline visibility
  • Printed role cards (optional — can use on-screen references)
  • Student worksheets for planning and evidence logging
  • Timer for pass windows and mission phases

Facilitation Tips

  • Set expectations upfront — this is a teaching-grade simulation, not a certified flight system
  • Emphasize team communication over individual speed
  • Pause at divergence alerts — use them as teaching moments
  • Let students fail safely — better learning from recoverable mistakes
  • Require evidence capture before showing final scorecard
  • Debrief on decision process, not just final score

Debrief Questions

  • What was your most critical decision? What data supported it?
  • Where did expected vs observed telemetry diverge? How did you spot it?
  • What would you do differently with 20/20 hindsight?
  • How did team roles help or hinder mission execution?
  • What does this simulation capture well? Where are its limits?

Student Evidence & Report Flow

Students capture evidence during missions and produce a final mission report. Evidence is browser-local — copy/export text or screenshot to share.

During the Mission

  • Mission Objective

    State the mission goal in your own words. What defines success?

  • Planning Questions

    What constraints matter most? What trade-offs did you anticipate?

  • Command Log

    List your key commands and the reasoning behind each.

  • Telemetry Clue Log

    What did expected vs observed telemetry tell you?

  • Divergence Diagnosis

    Where did telemetry diverge? What was the root cause?

  • Scorecard Reflection

    Which dimensions did you score well on? Where could you improve?

Final Evidence Report Structure

  1. Mission name and team members
  2. Date and duration
  3. Executive summary (2–3 sentences)
  4. Key decisions made
  5. Divergence events and diagnosis
  6. Scorecard results (self-assessment)
  7. Lessons for a future mission
  8. Appendix: Screenshots or exported evidence text

Evidence Export

Each mission includes a "Copy Evidence" button that exports mission decisions, telemetry log, and scorecard results as formatted text. Paste into a document or learning management system.
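As a rough sketch of what such a plain-text export might look like — the field layout and section labels here are assumptions, not the button's actual format:

```python
def format_evidence(mission: str, team: list, decisions: list, score: dict) -> str:
    """Assemble mission decisions and scorecard results as pasteable text
    (hypothetical layout; the real "Copy Evidence" output may differ)."""
    lines = [f"MISSION: {mission}", f"TEAM: {', '.join(team)}", "", "KEY DECISIONS:"]
    lines += [f"  {i}. {d}" for i, d in enumerate(decisions, 1)]
    lines += ["", "SCORECARD:"]
    lines += [f"  {dim}: {pts} pts" for dim, pts in score.items()]
    return "\n".join(lines)

report = format_evidence(
    "Lunar Rover Rescue", ["Aisha", "Bilal"],
    ["Aborted drive at 20% battery", "Re-queued science download"],
    {"Mission Success": 24, "Safety": 18},
)
print(report)
```

Because the output is plain text, it pastes cleanly into a document or a learning management system, as the pack intends.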

Local Mission Report Generator

After completing your mission, generate a clean printable report from your evidence. Choose a template, paste your evidence, and save as PDF locally.

Generate local mission report →

Competition Mode

Space Mission Challenge Day guide for classroom competition. Teams complete missions, receive 100-point formative scoring across 7 dimensions, and present evidence-based mission reports. All scoring is local-only — share manually via screenshots or exported text.

100-Point Mission Challenge Scorecard

Unified rubric across both missions — formative assessment only

| Dimension | Points | Focus Area |
| --- | --- | --- |
| Mission Success | 30 pts | Objective completion, waypoint achievement |
| Safety | 20 pts | Battery protection, abort discipline, fault response |
| Resource Management | 15 pts | Power-aware commanding, queue efficiency |
| Telemetry Reasoning | 15 pts | Correct interpretation of telemetry streams |
| Divergence Diagnosis | 10 pts | Identifying expected vs observed mismatches |
| Evidence Quality | 5 pts | Clear command logs and reasoning |
| Team Reflection | 5 pts | Post-mission debrief participation |
| **Total** | **100 pts** | |
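The rubric arithmetic is simple enough to sketch: clamp each self-assessed dimension to its maximum and sum to the 100-point total. Dimension names and weights are taken from the rubric above; the clamping behavior is an assumption about how over-generous self-scores should be handled:

```python
# Rubric maxima from the 100-point Mission Challenge Scorecard.
RUBRIC = {
    "Mission Success": 30, "Safety": 20, "Resource Management": 15,
    "Telemetry Reasoning": 15, "Divergence Diagnosis": 10,
    "Evidence Quality": 5, "Team Reflection": 5,
}

def total_score(raw: dict) -> int:
    """Clamp each dimension to [0, max] and sum; missing dimensions score 0."""
    return sum(min(max(raw.get(dim, 0), 0), cap) for dim, cap in RUBRIC.items())

assert sum(RUBRIC.values()) == 100  # sanity check: the rubric totals 100
# Safety self-score of 22 is clamped to its 20-pt cap; unscored dimensions count 0.
print(total_score({"Mission Success": 27, "Safety": 22, "Evidence Quality": 5}))  # 52
```

Keeping the rubric in one table makes it easy for teams to self-assess consistently across both missions.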

Competition Guidelines

  • All scoring is local and formative — not official grades
  • Teams manually report scores via screenshot or exported evidence
  • Teacher uses rubric notes to assess team presentations
  • No online leaderboard — manual tracking only
  • Encourage learning over winning — debrief matters more than score
  • Multiple runs allowed — best score or improvement story counts

Pilot & Commercial Value

Space Mission Challenge Day for pilot validation, grant proposals, CSR programs, and school outreach. Software-first approach enables immediate deployment without hardware procurement.

Who It Serves

  • Schools & Colleges

    STEM enrichment without hardware investment. Runs in computer labs with standard browsers.

  • STEM Labs & Makerspaces

    Space robotics concepts before physical rover builds. Software-first, hardware-ready later.

  • Space Clubs & Competitions

    Mission-control practice for CanSat, CubeSat, and rover challenge teams.

  • NAVTTC-Style Training

    Technical vocational training in satellite operations and telemetry analysis.

  • CSR & Grant Programs

    Demonstrable STEM outreach with clear learning outcomes and evidence reporting.

Demo Timing Options

  • 5-minute demo — show one mission card, highlight scorecard, tease divergence
  • 15-minute demo — run one mission phase, show telemetry comparison, discuss team roles
  • 45-minute session — complete one full mission with debrief and evidence export
  • Half-day challenge — run both missions with multiple teams and competition mode

Pricing Note

Pricing will be defined after pilot validation; the current release is for demonstration and evaluation use.

Advanced technical layer — Mission Realism Lab

The Mission Realism Lab upgrades the challenge pack from lesson/demo to parameter-driven mission reasoning. Students explore ground track coverage, contact windows, link budget, rain/fade attenuation, packet protocol errors, and ADCS pointing coupling — all in one coherent simulation story. Teaching-grade, browser-local.
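The lab's link-budget knob can be illustrated with the standard teaching-grade form: transmit power plus antenna gains, minus free-space path loss (Friis) and a rain-fade margin. The frequencies, gains, and range below are illustrative, not values from the lab:

```python
import math

def received_power_dbm(tx_dbm: float, tx_gain_dbi: float, rx_gain_dbi: float,
                       freq_hz: float, range_m: float, rain_fade_db: float = 0.0) -> float:
    """Teaching-grade link budget: transmit power plus antenna gains minus
    free-space path loss and an assumed rain-fade attenuation."""
    c = 299_792_458.0  # speed of light, m/s
    fspl_db = 20 * math.log10(4 * math.pi * range_m * freq_hz / c)
    return tx_dbm + tx_gain_dbi + rx_gain_dbi - fspl_db - rain_fade_db

# Illustrative UHF downlink: 1 W (30 dBm) at 437 MHz over a 600 km slant range,
# 2 dBi CubeSat antenna, 12 dBi ground station yagi, 1 dB fade margin.
p = received_power_dbm(30, 2.0, 12.0, 437e6, 600e3, rain_fade_db=1.0)
print(f"received power ≈ {p:.1f} dBm")
```

As the pack stresses, this is a teaching model for reasoning about why link margin shrinks with range and weather — not certified RF analysis.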

Open Mission Realism Lab →

What This Is (and Is Not)

Boundaries — Not Included

  • Software-only simulation

    Browser-based teaching-grade prototype. Not a certified flight simulator.

  • No remote hardware control

    Does not control real rovers, satellites, or ground stations.

  • No real RF or SDR

    Simulated communications only. No actual radio transmission.

  • Local-only scoring

    No backend, no accounts, no online leaderboard. Share manually.

  • Formative assessment only

    Not official grades or certifications. Teacher discretion.

  • Deterministic teaching models

    Telemetry divergence uses rule-based scenarios, not AI/ML diagnosis.

What It Is

  • Classroom-ready space robotics mission simulation
  • Telemetry divergence detection and diagnosis practice
  • Team-based mission control roleplay
  • Evidence-based reporting and reflection
  • Browser-local formative assessment with 100-point scorecard

Hardware-ready interface specification (V3.5B): The mission protocol defined in this challenge pack is compatible with the hardware-ready interface spec. Future supervised lab sessions may connect rover or CubeSat kits via a local bridge adapter using the same telemetry and command schema. View interface spec →
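To make "same telemetry and command schema" concrete, here is the shape such a frame might take as JSON over a local bridge. These field names are hypothetical — the actual V3.5B interface spec defines its own schema:

```python
from dataclasses import dataclass, asdict
import json
import time

@dataclass
class TelemetryFrame:
    """Hypothetical frame layout for a local bridge adapter — illustrative
    only; consult the V3.5B interface spec for the real field names."""
    asset_id: str    # e.g. "rover-01" or a CubeSat kit identifier
    timestamp: float  # seconds since epoch, producer's clock
    battery_v: float  # bus voltage in volts
    mode: str         # coarse operating mode, e.g. "TRAVERSE"

frame = TelemetryFrame("rover-01", time.time(), 11.8, "TRAVERSE")
print(json.dumps(asdict(frame)))  # the same JSON a simulated asset would emit
```

A shared schema like this is what would let a physical kit replace the simulated asset without changing the mission UI.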

Ready to explore? Start with a mission or browse the demo pack.