Mobile Computing Lab Course

About This Open Source Initiative

In today's rapidly evolving mobile computing landscape, a critical gap separates traditional computer science education from the practical skills needed for next-generation mobile applications. As we stand at the intersection of mobile computing, artificial intelligence, and robotics, the demand for hands-on, practical learning experiences has never been greater.

This open-source lab collection addresses several key educational needs:

  • Bridging Theory and Practice: Moving beyond textbook concepts to real-world implementation
  • AI-Mobile Integration: Preparing students for the convergence of AI and mobile computing
  • Humanoid Robotics Preparation: Foundation skills for the emerging field of mobile AI robotics
  • Accessibility: Making high-quality mobile computing education available to all students
  • Industry Relevance: Labs designed around current and emerging mobile technologies

Connection to AI and Robotics

These labs form a crucial foundation for students entering fields such as:

  • Mobile AI Applications: Real-time on-device machine learning
  • Humanoid Robotics: Mobile platforms with AI capabilities for human-robot interaction
  • Autonomous Systems: Mobile robots requiring sensor fusion and intelligent navigation
  • Edge Computing: Distributed AI systems running on mobile and IoT devices

By open-sourcing these materials, we hope to accelerate innovation in mobile computing education and provide a standardized foundation for institutions worldwide to build upon.


Course Overview

This comprehensive lab series covers the essential aspects of modern mobile computing, from low-level sensor processing to high-level AI integration. Each lab builds practical skills through hands-on Python implementation.

Learning Objectives

  • Master mobile sensor data processing and fusion techniques
  • Understand wireless communication protocols and optimization
  • Implement computer vision and acoustic sensing systems
  • Build secure mobile applications with biometric authentication
  • Develop mobile health monitoring and robotics applications
  • Prepare for careers in mobile AI and robotics

Prerequisites

  • Basic Python programming knowledge
  • Understanding of linear algebra and signal processing
  • Familiarity with basic mobile computing concepts

Week 4: Mobile Sensor Processing & Activity Tracking


Lab Focus: Building a Mobile Activity Tracker
Duration: 30-40 minutes
Key Skills: Sensor fusion, signal processing, activity recognition

Project Overview

Build a Mobile Activity Tracker that processes smartphone sensor data to detect steps, validate movement, and recognize activities. Each task builds upon the previous one.

Final Deliverable: A Python-based tracker that detects steps, filters false positives, and recognizes walking vs. sitting activities.

Setup

pip install numpy matplotlib scipy

For GUI display:

import matplotlib
# matplotlib.use('TkAgg')      # Uncomment if needed
# %matplotlib inline           # For Jupyter notebooks

TASK 1: Smart Pedometer Foundation (10 minutes)

Build step detection using accelerometer data

View Python Code Implementation
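The lab's full implementation is linked above; as a minimal sketch of the core idea, step candidates can be found as peaks in the accelerometer magnitude signal. The sample rate, threshold, refractory interval, and simulated walking signal below are illustrative assumptions, not the lab's exact values:

```python
import numpy as np
from scipy.signal import find_peaks

def detect_steps(accel_magnitude, fs=50, threshold=1.5, min_step_interval=0.3):
    """Count steps as peaks in accelerometer magnitude (illustrative thresholds).

    fs: sample rate in Hz. min_step_interval enforces a refractory period
    so a single stride is not double-counted.
    """
    peaks, _ = find_peaks(accel_magnitude,
                          height=threshold,
                          distance=int(min_step_interval * fs))
    return peaks

# Simulate 10 s of walking at ~2 steps/s: a 2 Hz sinusoid riding on gravity plus noise
fs = 50
t = np.arange(10 * fs) / fs
rng = np.random.default_rng(0)
accel = 1.0 + 0.8 * np.sin(2 * np.pi * 2.0 * t) + 0.05 * rng.standard_normal(t.size)

steps = detect_steps(accel, fs=fs)
print(f"Detected {len(steps)} steps")  # ~20 for 2 steps/s over 10 s
```

Raising the height threshold suppresses noise at the cost of missing gentle steps; the distance parameter handles the double-peak shape of a real stride.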

TASK 2: Enhanced Detection with Gyroscope (10 minutes)

Add gyroscope validation to reduce false positives

View Python Code Implementation
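One way to sketch the gyroscope check (the lab's exact rule may differ): accept an accelerometer step candidate only if rotation energy around it is elevated, since walking swings the phone while shaking it in place produces little sustained rotation. The window length and threshold here are illustrative assumptions:

```python
import numpy as np

def validate_steps(step_indices, gyro_magnitude, fs=50, window=0.2, gyro_threshold=0.5):
    """Keep only step candidates whose surrounding gyroscope magnitude is
    elevated (window and threshold are illustrative assumptions)."""
    half = int(window * fs)
    valid = []
    for i in step_indices:
        lo, hi = max(0, i - half), min(len(gyro_magnitude), i + half)
        if np.mean(gyro_magnitude[lo:hi]) > gyro_threshold:
            valid.append(i)
    return valid

# Two candidates: one backed by real rotation, one not
fs = 50
gyro = np.zeros(2 * fs)
gyro[40:60] = 1.2              # rotation around sample 50 (a genuine step)
candidates = [50, 90]          # sample 90 has no accompanying rotation
result = validate_steps(candidates, gyro, fs=fs)
print(result)  # [50]
```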

TASK 3: Activity Recognition System (15 minutes)

Complete the tracker with activity classification

View Python Code Implementation
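A minimal sketch of how window features can feed a classifier; the lab's own model and feature set may differ, and the centroid values below are illustrative numbers chosen to match the simulated signals:

```python
import numpy as np

def extract_features(accel_window):
    """Window features: mean, standard deviation, peak-to-peak range."""
    return np.array([accel_window.mean(), accel_window.std(), np.ptp(accel_window)])

def classify(features, centroids):
    """Nearest-centroid classification over labelled feature centroids."""
    labels = list(centroids)
    dists = [np.linalg.norm(features - centroids[k]) for k in labels]
    return labels[int(np.argmin(dists))]

# Simulated 2 s windows: rhythmic walking vs nearly flat sitting
rng = np.random.default_rng(1)
t = np.linspace(0, 2, 100)
walking = 1.0 + 0.8 * np.sin(2 * np.pi * 2 * t) + 0.05 * rng.standard_normal(100)
sitting = 1.0 + 0.02 * rng.standard_normal(100)

centroids = {  # rough expected statistics per activity (illustrative)
    "walking": np.array([1.0, 0.57, 1.7]),
    "sitting": np.array([1.0, 0.02, 0.1]),
}
pred_walk = classify(extract_features(walking), centroids)
pred_sit = classify(extract_features(sitting), centroids)
print(pred_walk, pred_sit)  # walking sitting
```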

Discussion & Wrap-up (5 minutes)

What You Built:

  1. Step Detection: Accelerometer-based walking analysis
  2. Movement Validation: Gyroscope filtering of false positives
  3. Activity Recognition: Multi-sensor activity classification
  4. Analytics System: Comprehensive tracking dashboard

Real-World Impact:

  • Fitness Apps: Foundation for step counting and activity tracking
  • Health Monitoring: Basis for sedentary behavior detection
  • Research Applications: Human activity recognition systems

Key Concepts Demonstrated:

  • Sensor data simulation and processing
  • Signal filtering and peak detection
  • Multi-sensor fusion techniques
  • Machine learning classification
  • Real-time data visualization

Week 5: Acoustic Sensing & Gesture Recognition


Lab Focus: Building an Acoustic Gesture Recognition System
Duration: 45-50 minutes
Key Skills: Active acoustic sensing, Doppler analysis, gesture classification

Project Overview

Build an Acoustic Gesture Recognition System that uses smartphone speakers/microphones to detect and classify hand gestures through active acoustic sensing. Each task builds upon the previous one to create a complete sensing pipeline.

Final Deliverable: A Python-based system that transmits acoustic signals, processes echoes, and recognizes different hand gestures using Doppler shift analysis.

Setup

pip install numpy matplotlib scipy

TASK 1: Acoustic Signal Generation & Echo Simulation (12 minutes)

Build the foundation for active acoustic sensing

View Python Code Implementation
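As a hedged sketch of this task's building blocks: a near-ultrasonic FMCW chirp and a toy echo model. The frequency range, duration, and delay below are illustrative assumptions, not the lab's exact parameters:

```python
import numpy as np

def generate_chirp(f0=18000, f1=20000, duration=0.01, fs=48000):
    """Linear FMCW chirp sweeping f0 -> f1 Hz; near-ultrasonic, so barely audible.
    Phase is the integral of the linearly swept instantaneous frequency."""
    n = int(round(duration * fs))
    t = np.arange(n) / fs
    phase = 2 * np.pi * (f0 * t + (f1 - f0) / (2 * duration) * t ** 2)
    return t, np.sin(phase)

def simulate_echo(signal, delay_samples=48, attenuation=0.3):
    """Toy echo model: an attenuated copy of the chirp after a round-trip delay."""
    echo = np.zeros_like(signal)
    echo[delay_samples:] = attenuation * signal[:-delay_samples]
    return echo

fs = 48000
t, chirp = generate_chirp(fs=fs)
echo = simulate_echo(chirp)
# 48 samples at 48 kHz = 1 ms round trip, i.e. ~17 cm one-way range at 343 m/s
print(len(chirp), float(np.abs(echo[:48]).max()))
```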

TASK 2: Doppler Analysis for Motion Detection (15 minutes)

Analyze frequency shifts to detect hand movement patterns

View Python Code Implementation
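The Doppler idea can be sketched in a few lines: a reflector moving at velocity v shifts the echo of a tone at f0 by roughly 2·v·f0/c, so the dominant FFT frequency of the received signal reveals the motion. Carrier, window length, and velocity below are illustrative assumptions:

```python
import numpy as np

def doppler_shift(f0, velocity, c=343.0):
    """Echo frequency shift for a reflector moving at `velocity` m/s
    (positive = toward the phone); the round trip doubles the shift."""
    return 2 * velocity * f0 / c

def estimate_velocity(received, fs, f0, c=343.0):
    """Invert the Doppler relation using the dominant FFT frequency of the echo."""
    spectrum = np.abs(np.fft.rfft(received))
    freqs = np.fft.rfftfreq(len(received), 1 / fs)
    f_peak = freqs[np.argmax(spectrum)]
    return (f_peak - f0) * c / (2 * f0)

fs, f0 = 48000, 19000
t = np.arange(int(0.1 * fs)) / fs   # 100 ms window -> 10 Hz FFT resolution
v_true = 0.5                        # hand moving toward the phone at 0.5 m/s
received = np.sin(2 * np.pi * (f0 + doppler_shift(f0, v_true)) * t)

v_est = estimate_velocity(received, fs, f0)
print(f"estimated velocity: {v_est:.2f} m/s")  # near 0.5, limited by bin width
```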

TASK 3: Complete Gesture Recognition System (18 minutes)

Build a real-time gesture recognition pipeline with multiple gesture support

View Python Code Implementation
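As one hedged illustration of the final classification stage: once Doppler analysis yields a velocity trace, simple rules on its sign pattern can assign gesture labels. The thresholds and label set here are illustrative, not the lab's actual ones:

```python
import numpy as np

def classify_gesture(velocity_trace, v_min=0.1):
    """Rule-based gesture labels from a Doppler-derived velocity trace
    (thresholds and labels are illustrative assumptions)."""
    v = np.asarray(velocity_trace, dtype=float)
    if np.abs(v).max() < v_min:
        return "none"
    first = v[: len(v) // 2].mean()
    second = v[len(v) // 2:].mean()
    if first > v_min and second < -v_min:
        return "push-pull"            # toward the phone, then away
    if first > v_min:
        return "push"
    if first < -v_min:
        return "pull"
    return "wave"                     # motion without a clear net direction

print(classify_gesture([0.4, 0.5, 0.4, -0.4, -0.5, -0.4]))  # push-pull
print(classify_gesture([0.01, -0.02, 0.0, 0.01]))           # none
```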

Discussion & Wrap-up (5 minutes)

What You Built:

  1. Acoustic Signal Generation: FMCW chirp transmission and echo simulation
  2. Doppler Analysis: Motion detection through frequency shift analysis
  3. Gesture Recognition: Complete classification system with multiple gesture types

Real-World Applications:

  • Smart Home Control: Gesture-based device interaction
  • Accessibility Interfaces: Hands-free control for users with disabilities
  • Automotive Systems: Driver gesture recognition for infotainment
  • Security Systems: Contactless authentication methods

Week 6: Wireless Communication Systems


Lab Focus: Wireless Communication Systems Simulator
Duration: 45-50 minutes
Key Skills: Multiple access techniques, 6G technology, network optimization

Project Overview

Build a Wireless Communication Systems Simulator that demonstrates traditional multiple access techniques (FDMA, TDMA, CDMA) and explores next-generation 6G communication patterns. Each task builds upon the previous one to create a comprehensive wireless network analyzer.

Final Deliverable: A Python-based simulator that compares communication efficiency across different access methods and predicts 6G performance characteristics.

Setup

pip install numpy matplotlib scipy

TASK 1: Traditional Multiple Access Simulator (15 minutes)

Implement and compare FDMA, TDMA, and CDMA techniques

View Python Code Implementation
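As a small, hedged illustration of the CDMA side of this task: orthogonal spreading codes let two users transmit simultaneously in the same band and still be separated at the receiver by correlation. The four-chip Walsh codes and user names below are illustrative:

```python
import numpy as np

# Orthogonal Walsh codes let two users share the same band at the same time (CDMA)
walsh = {
    "alice": np.array([1, 1, 1, 1]),
    "bob":   np.array([1, -1, 1, -1]),
}

def spread(bits, code):
    """Spread each data bit (+/-1) across the user's chip sequence."""
    return np.concatenate([b * code for b in bits])

def despread(signal, code):
    """Correlate the combined channel against one user's code to recover
    that user's bits; the other user's contribution cancels (orthogonality)."""
    chips = signal.reshape(-1, len(code))
    return np.sign(chips @ code)

alice_bits = np.array([1, -1, 1])
bob_bits = np.array([-1, -1, 1])
channel = spread(alice_bits, walsh["alice"]) + spread(bob_bits, walsh["bob"])

print(despread(channel, walsh["alice"]))  # [ 1. -1.  1.]
print(despread(channel, walsh["bob"]))    # [-1. -1.  1.]
```

FDMA and TDMA achieve the same separation by splitting frequency or time instead of code space.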

TASK 2: 6G Network Traffic Predictor (15 minutes)

Model explosive mobile traffic growth and 6G requirements

View Python Code Implementation
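A compound-growth sketch of the traffic model; the 40% annual growth rate and base volume are illustrative assumptions for demonstration, not forecasts from the lab:

```python
import numpy as np

def project_traffic(base_eb_per_month, annual_growth, years):
    """Exponential traffic projection: traffic_t = base * (1 + g)^t."""
    t = np.arange(years + 1)
    return base_eb_per_month * (1 + annual_growth) ** t

# Illustrative: 100 EB/month today, growing ~40% per year over five years
traffic = project_traffic(base_eb_per_month=100, annual_growth=0.4, years=5)
print(np.round(traffic, 1))
```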

TASK 3: Advanced 6G Technology Enabler Simulator (15 minutes)

Implement mmWave beamforming and AI-driven network optimization

View Python Code Implementation
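One way to sketch mmWave beamforming is the normalized array factor of a uniform linear array: energy is concentrated at the steered angle and suppressed elsewhere. The element count and angles below are illustrative assumptions:

```python
import numpy as np

def array_gain_db(n_elements, steer_deg, look_deg):
    """Normalized array factor (dB) of an N-element uniform linear array with
    half-wavelength spacing, steered to steer_deg and observed at look_deg."""
    psi = np.pi * (np.sin(np.radians(look_deg)) - np.sin(np.radians(steer_deg)))
    af = np.abs(np.exp(1j * np.arange(n_elements) * psi).sum()) / n_elements
    return 20 * np.log10(max(af, 1e-12))  # floor avoids log(0) at perfect nulls

on_beam = array_gain_db(16, steer_deg=30, look_deg=30)
off_beam = array_gain_db(16, steer_deg=30, look_deg=-30)
print(f"on-beam {on_beam:.1f} dB, off-beam {off_beam:.1f} dB")
```

More elements narrow the main lobe, which is why massive MIMO arrays can serve users with such high spatial selectivity.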

Discussion & Wrap-up (5 minutes)

What You Built:

  1. Multiple Access Simulator: Implemented and compared FDMA, TDMA, and CDMA techniques
  2. 6G Traffic Predictor: Modeled explosive growth and next-generation requirements
  3. Advanced 6G Technologies: Simulated mmWave beamforming and AI network optimization

6G Technology Enablers Explored:

  • Spectrum: mmWave and THz frequency bands
  • AI Integration: Intelligent network optimization
  • Massive MIMO: Advanced beamforming techniques
  • Network Architecture: Multi-tier heterogeneous networks

Week 7: Mobile Computer Vision


Lab Focus: Mobile Computer Vision with Simple Tools
Duration: 30-40 minutes
Key Skills: Lightweight image processing, mobile optimization, edge deployment

Project Overview

Build a Mobile Computer Vision System using only basic Python libraries. Focus on understanding core mobile vision concepts through hands-on implementation rather than complex dependencies.

Final Deliverable: A complete mobile vision pipeline that demonstrates image classification, mobile optimizations, and real-world deployment considerations.

Setup (Ultra-Simple!)

pip install numpy matplotlib pillow

That's it! No OpenCV, PyTorch, or complex dependencies needed.

TASK 1: Build a Mobile-Optimized Image Classifier (15 minutes)

Create a lightweight image classifier using only NumPy and basic image processing

View Python Code Implementation
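As a hedged sketch of the approach (the lab's 9-feature classifier follows the same idea with more features): compute a few cheap global statistics with NumPy alone and classify by distance to per-class prototypes. The toy images and class names below are illustrative:

```python
import numpy as np

def extract_features(img):
    """Cheap global features using NumPy only: brightness, contrast,
    and horizontal/vertical edge energy."""
    gx = np.abs(np.diff(img, axis=1)).mean()
    gy = np.abs(np.diff(img, axis=0)).mean()
    return np.array([img.mean(), img.std(), gx, gy])

def nearest_class(features, prototypes):
    """Classify by Euclidean distance to per-class prototype vectors."""
    names = list(prototypes)
    dists = [np.linalg.norm(features - prototypes[n]) for n in names]
    return names[int(np.argmin(dists))]

# Toy 8x8 grayscale images: a flat patch vs a high-contrast checkerboard
flat = np.full((8, 8), 0.5)
checker = (np.indices((8, 8)).sum(axis=0) % 2).astype(float)

prototypes = {"flat": extract_features(flat), "textured": extract_features(checker)}
pred = nearest_class(extract_features(checker), prototypes)
print(pred)  # textured
```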

TASK 2: Mobile Vision Optimization Simulator (20 minutes)

Demonstrate mobile-specific optimizations and deployment considerations

View Python Code Implementation
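The quantization part of this task can be sketched with symmetric int8 quantization, the trick mobile runtimes use to cut model size roughly 4x; the weight distribution here is a stand-in for real model weights:

```python
import numpy as np

def quantize_int8(weights):
    """Symmetric linear quantization of float32 weights to int8:
    map the largest magnitude to 127 and round everything to that grid."""
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights for inference-time math."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal(1000).astype(np.float32)  # stand-in for model weights
q, scale = quantize_int8(w)

size_ratio = q.nbytes / w.nbytes
err = float(np.abs(dequantize(q, scale) - w).max())
print(f"size: {size_ratio:.2f}x original, max error {err:.4f}")
```

The accuracy cost is bounded by half the quantization step, which is why int8 works well for weights with a compact dynamic range.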

Discussion & Wrap-up (5 minutes)

What You Built:

  1. Ultra-Lightweight Classifier: 9-feature image classifier using only basic Python libraries
  2. Mobile Optimization Simulator: Quantization, pruning, and device deployment analysis
  3. Real-World Deployment Model: Performance analysis across different mobile devices

Key Advantages of This Approach:

  • Zero Complex Dependencies: Only NumPy, Matplotlib, and Pillow (lightweight, widely available packages)
  • Instant Setup: No environment issues or installation problems
  • Educational Focus: Students see the core concepts without library complexity
  • Mobile-First Design: Actually deployable on real mobile devices

Week 9: Mobile Health Monitoring


Lab Focus: Building a Mobile Health Monitoring System
Duration: 30-45 minutes
Key Skills: Vital signs processing, anomaly detection, personalized health analytics

Project Overview

Build a Mobile Health Monitoring System that simulates patient vital signs, detects health anomalies, and provides personalized health recommendations. Each task builds upon the previous one to create a comprehensive mHealth application.

Final Deliverable: A Python-based health monitor that tracks multiple vital signs, detects anomalies, and generates personalized health insights.

Setup

pip install numpy matplotlib scipy pandas seaborn

TASK 1: Vital Signs Monitoring Foundation (15 minutes)

Build multi-sensor health data collection and basic anomaly detection

View Python Code Implementation
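The threshold-based anomaly detection can be sketched in a few lines. The ranges below are illustrative values for a classroom simulation, not medical guidance, and may differ from the lab's configuration:

```python
import numpy as np

# Illustrative normal ranges per vital sign (not medical guidance)
THRESHOLDS = {
    "heart_rate":  (50, 120),   # beats per minute
    "spo2":        (92, 100),   # blood oxygen saturation, %
    "temperature": (35.0, 38.0) # degrees Celsius
}

def detect_anomalies(reading):
    """Flag any vital sign that falls outside its normal range."""
    alerts = []
    for name, (lo, hi) in THRESHOLDS.items():
        value = reading[name]
        if not lo <= value <= hi:
            alerts.append(f"{name}={value} outside [{lo}, {hi}]")
    return alerts

normal = {"heart_rate": 72, "spo2": 98, "temperature": 36.6}
abnormal = {"heart_rate": 135, "spo2": 89, "temperature": 36.6}

print(detect_anomalies(normal))    # []
print(detect_anomalies(abnormal))  # two alerts
```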

TASK 2: Personalized Health Insights & Recommendations (15-20 minutes)

Add intelligent health analytics and personalized recommendations

View Python Code Implementation

Discussion & Wrap-up (5 minutes)

What You Built:

  1. Vital Signs Monitoring: Multi-sensor health data simulation and real-time tracking
  2. Anomaly Detection: Clinical threshold-based health event detection
  3. Health Analytics: Personalized health scoring and risk assessment
  4. Smart Recommendations: AI-driven health insights and actionable advice

mHealth Application Categories:

  • Personal Care: Fitness tracking, nutrition monitoring
  • Clinical Monitoring: Vital signs, chronic disease management
  • Health Education: Patient information, medication reminders
  • Social Health: Community support, health information sharing

Week 12: Mobile Security & Biometric Authentication


Lab Focus: Mobile Security & Biometric Authentication System
Duration: 30-45 minutes
Key Skills: CIA triad implementation, biometric authentication, threat detection

Project Overview

Build a Mobile Security & Biometric Authentication System that implements the CIA triad (Confidentiality, Integrity, Availability) through biometric authentication, secure data transmission, and threat detection. Each task builds upon the previous one to create a comprehensive mobile security framework.

Final Deliverable: A Python-based security system that authenticates users via biometrics, encrypts sensitive data, and detects security threats in real-time.

Setup

pip install numpy matplotlib scipy cryptography

Note: hashlib ships with the Python standard library and does not need to be installed.

TASK 1: Biometric Authentication Foundation (15 minutes)

Implement fingerprint-based user authentication with security scoring

View Python Code Implementation
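One hedged sketch of similarity-based matching: treat enrolled fingerprints and fresh scans as feature vectors and accept a login when their cosine similarity clears a threshold. The random vectors and the 0.9 threshold are illustrative assumptions:

```python
import numpy as np

def similarity(template, sample):
    """Cosine similarity between an enrolled feature vector and a fresh scan."""
    return float(template @ sample /
                 (np.linalg.norm(template) * np.linalg.norm(sample)))

def authenticate(template, sample, threshold=0.9):
    """Accept only if the scan is close enough to the enrolled template;
    the threshold trades false accepts against false rejects."""
    return similarity(template, sample) >= threshold

rng = np.random.default_rng(42)
enrolled = rng.standard_normal(64)                    # enrolled user's features
genuine = enrolled + 0.1 * rng.standard_normal(64)    # same finger, sensor noise
impostor = rng.standard_normal(64)                    # a different finger

print(authenticate(enrolled, genuine))   # True
print(authenticate(enrolled, impostor))  # False
```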

TASK 2: Secure Data Transmission & Threat Detection (20 minutes)

Implement CIA triad with encryption, integrity checks, and real-time threat monitoring

View Python Code Implementation
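The integrity leg of the CIA triad can be sketched with standard-library hashlib: store a SHA-256 digest alongside the data and recompute it on receipt, so any tampering changes the tag. (Encryption for confidentiality, e.g. via the cryptography package, is omitted here for brevity; the record contents are illustrative.)

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    """Integrity tag: any change to the data changes the digest."""
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, expected_digest: str) -> bool:
    """Detect tampering by recomputing the hash and comparing tags."""
    return sha256_digest(data) == expected_digest

record = b'{"user": "alice", "heart_rate": 72}'
tag = sha256_digest(record)            # stored or transmitted alongside the data

tampered = b'{"user": "alice", "heart_rate": 172}'
print(verify(record, tag))    # True: intact
print(verify(tampered, tag))  # False: modification detected
```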

Discussion & Wrap-up (5 minutes)

What You Built:

  1. Biometric Authentication: Fingerprint-based user verification with similarity scoring
  2. Secure Data Transmission: End-to-end encryption implementing confidentiality
  3. Data Integrity: Hash-based verification preventing tampering
  4. Availability Assurance: Redundant storage with failover mechanisms
  5. Threat Detection: Real-time monitoring for brute force and unauthorized access

CIA Triad Implementation:

  • Confidentiality: AES encryption, secure session management
  • Integrity: SHA-256 hashing, tamper detection
  • Availability: Redundant storage, session management, fault tolerance

Week 13: Mobile Robot Navigation


Lab Focus: Building a Mobile Robot Navigation System
Duration: 30-45 minutes
Key Skills: Robot kinematics, path planning, obstacle avoidance, autonomous navigation

Project Overview

Build a Mobile Robot Navigation System that simulates robot movement, processes sensor data for obstacle detection, and implements path planning algorithms. Each task builds upon the previous one to create a complete autonomous navigation system.

Final Deliverable: A Python-based robot simulator that can navigate through obstacles, detect collisions, and find optimal paths to targets.

Setup

pip install numpy matplotlib scipy

TASK 1: Robot Motion Simulation Foundation (15 minutes)

Build basic robot movement and sensor simulation

View Python Code Implementation
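The motion model behind this task can be sketched with a unicycle-style kinematic update integrated with small Euler steps; the speed, turn rate, and step size below are illustrative:

```python
import numpy as np

def step_pose(x, y, theta, v, omega, dt):
    """Unicycle-model update: forward speed v (m/s), turn rate omega (rad/s)."""
    x += v * np.cos(theta) * dt
    y += v * np.sin(theta) * dt
    theta += omega * dt
    return x, y, theta

# Equal speed and turn rate trace a circle of radius v/omega = 1 m
x, y, theta = 0.0, 0.0, 0.0
for _ in range(628):                  # ~2*pi / (omega * dt) steps = one full lap
    x, y, theta = step_pose(x, y, theta, v=1.0, omega=1.0, dt=0.01)

print(f"after one lap: x={x:.2f}, y={y:.2f}")  # back near the start
```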

TASK 2: Advanced Navigation with Path Planning (15-20 minutes)

Implement obstacle avoidance and autonomous navigation

View Python Code Implementation
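A hedged sketch of the reactive layer: head for the goal, but if an obstacle's bearing lies within a clearance cone of that heading, steer to skirt the obstacle's edge. The bearing convention and clearance angle are illustrative assumptions:

```python
def avoid_heading(goal_bearing, obstacle_bearings, clearance=0.6):
    """Reactive avoidance: return the goal bearing (radians, robot frame)
    unless an obstacle bearing falls within `clearance` of it, in which
    case steer to the obstacle's edge on the goal's side."""
    for b in obstacle_bearings:
        diff = goal_bearing - b
        if abs(diff) < clearance:
            direction = 1.0 if diff >= 0 else -1.0
            return b + direction * clearance
    return goal_bearing

print(avoid_heading(0.0, []))      # 0.0: path clear, drive at the goal
print(avoid_heading(0.0, [0.1]))   # -0.5: veer around the obstacle
```

Purely reactive rules like this can get trapped in concave obstacles, which is where the geometric path planning from this task takes over.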

Discussion & Wrap-up (5 minutes)

What You Built:

  1. Robot Motion Simulation: Realistic movement physics and sensor modeling
  2. Environmental Sensing: LIDAR simulation and obstacle detection
  3. Path Planning: Geometric planning with obstacle avoidance
  4. Autonomous Navigation: Reactive behaviors and target seeking

Real-World Applications:

  • Warehouse Robotics: Automated inventory and delivery systems
  • Autonomous Vehicles: Self-driving car navigation algorithms
  • Service Robots: Hospital, hotel, and home assistance robots
  • Exploration Robots: Mars rovers and underwater vehicles

Course Outcomes & Future Directions

Upon completing these labs, students will have gained practical experience in:

  • Mobile Sensor Processing: Foundation for wearable and IoT applications
  • AI-Mobile Integration: Essential skills for on-device machine learning
  • Robotics Preparation: Core concepts for mobile autonomous systems
  • Security Implementation: Critical skills for enterprise mobile development
  • Health Technology: Foundation for digital health and telemedicine
  • Wireless Optimization: Understanding of next-generation mobile networks

Connection to Emerging Fields

These labs prepare students for careers in:

  • Humanoid Robotics: Mobile platforms requiring human interaction
  • Autonomous Vehicles: Self-driving cars and delivery robots
  • Smart Cities: IoT and mobile sensing infrastructure
  • Digital Health: Telemedicine and personalized healthcare
  • Edge AI: Distributed intelligence in mobile and IoT systems

License & Usage

This open-source educational content is provided for academic and educational use. Feel free to adapt, modify, and distribute these materials while maintaining attribution to CU Denver's Mobile Computing course.

For questions or contributions, please contact the course instructors or submit issues through the appropriate academic channels.