Activity 001: Scanner Technology Comparison Lab

Activity ID: U11M1-ACT-001
Duration: 40 minutes
Objective: Learners will compare structured light, laser triangulation, and photogrammetry scanning technologies by analyzing sample scan data and matching scanner capabilities to real-world application scenarios.

Overview

In this activity, students examine provided scan datasets from three different scanner technologies, evaluate data quality differences, and develop recommendations for scanner selection based on specific project requirements. This builds the critical evaluation skills needed before hands-on scanner operation in Module 2.

Materials & Equipment Needed

  • Computer workstation with 3D viewer software (MeshLab or CloudCompare — free, open source)
  • Provided sample datasets:
      • Dataset A: Structured light scan of a mechanical part (PLY format)
      • Dataset B: Laser triangulation scan of the same part (PLY format)
      • Dataset C: Photogrammetry reconstruction of the same part (PLY format)
  • Scanner specification sheets (3 scanners, printed or digital)
  • Technology Comparison Worksheet (provided)
  • Projector/screen for group discussion

Instructions & Procedure

Phase 1: Dataset Loading and Initial Observation (10 minutes)

  1. Open CloudCompare or MeshLab on your workstation
  2. Load all three datasets (A, B, C) into the same workspace
  3. For each dataset, record in your worksheet:
      • Approximate point count (displayed in the software)
      • Visual point density (dense, moderate, sparse)
      • Presence of color/texture data (yes/no)
      • Overall impression of data quality (clean, moderate noise, noisy)
  4. Use the cross-section tool to view a slice through each dataset, noting differences in surface smoothness and noise levels
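The point count recorded in step 3 is declared in the PLY file's own header, so it can also be read without any point-cloud software. A minimal Python sketch of that idea (the `ply_point_count` helper and the tiny demo file are illustrative only, not part of the activity materials):

```python
import tempfile, os

def ply_point_count(path):
    """Read only the PLY header and return the declared vertex count."""
    with open(path, "rb") as f:
        for raw in f:
            line = raw.decode("ascii", errors="ignore").strip()
            if line.startswith("element vertex"):
                return int(line.split()[-1])
            if line == "end_header":
                break
    raise ValueError("no 'element vertex' line found in PLY header")

# Demo: write a minimal ASCII PLY with 3 points and read its count back.
demo = b"""ply
format ascii 1.0
element vertex 3
property float x
property float y
property float z
end_header
0 0 0
1 0 0
0 1 0
"""
with tempfile.NamedTemporaryFile(suffix=".ply", delete=False) as tmp:
    tmp.write(demo)
    path = tmp.name

print(ply_point_count(path))  # → 3
os.remove(path)
```

Because only the header is read, this works the same way for the multi-million-point datasets in this activity as for the toy file above.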

Phase 2: Detailed Quality Analysis (15 minutes)

  1. Select a flat surface region on each dataset:
      • Fit a plane using the software's fitting tool
      • Record the standard deviation of points from the fitted plane; this approximates the noise level
  2. Select a sharp edge or corner region:
      • Compare how each technology resolved the edge
      • Note any edge rounding, noise scatter, or missing data
  3. Examine areas where multiple scans were aligned (registration boundaries):
      • Look for double-wall artifacts or step discontinuities
  4. Check for occlusion holes:
      • Identify undercuts, deep features, or hidden surfaces that each scanner missed
  5. Complete the Data Quality Comparison table on your worksheet
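The plane-fit noise estimate from step 1 is just a least-squares plane fit followed by the standard deviation of point-to-plane distances, which CloudCompare and MeshLab compute internally. A minimal numpy sketch of the same calculation on synthetic data (the simulated patch and its 0.05 noise sigma are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated "flat surface" patch: points on the plane z = 0.5x + 0.2y + 1
# plus Gaussian scanner noise with a known standard deviation.
n, true_noise = 2000, 0.05
xy = rng.uniform(-1, 1, size=(n, 2))
z = 0.5 * xy[:, 0] + 0.2 * xy[:, 1] + 1 + rng.normal(0, true_noise, n)
points = np.column_stack([xy, z])

def plane_fit_noise(points):
    """Least-squares plane fit via SVD; returns std of point-to-plane distances."""
    centroid = points.mean(axis=0)
    centered = points - centroid
    # The right singular vector with the smallest singular value is the
    # direction of least variance, i.e. the fitted plane's normal.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    distances = centered @ normal  # signed point-to-plane distances
    return distances.std()

print(f"estimated noise sigma: {plane_fit_noise(points):.4f}")
```

The estimate recovers the injected noise level closely, which is why a flat region is a good proxy for sensor noise: any residual after subtracting the best-fit plane is, by assumption, noise rather than geometry.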

Phase 3: Application Scenario Matching (10 minutes)

For each of the following scenarios, recommend the best scanning technology and justify your choice:

Scenario A: Quality inspection of a machined aluminum aerospace bracket requiring ±0.02 mm accuracy
Scenario B: Creating a 3D model of a large outdoor sculpture (2 m tall) for a museum digital archive
Scenario C: Scanning a patient's ear canal impression for custom hearing aid manufacturing
Scenario D: Documenting a crime scene for forensic reconstruction
Scenario E: Reverse engineering a broken vintage car dashboard knob for 3D printing a replacement

Phase 4: Group Discussion (5 minutes)

  1. Each group shares their most surprising finding from the data comparison
  2. Discuss: "Is there one best scanning technology, or does it always depend on the application?"
  3. Identify the single most important specification for each of the five scenarios

Discussion Points

  1. Why did the photogrammetry dataset have more noise on featureless surfaces?
  2. How did the laser triangulation scan handle the sharp edges compared to structured light?
  3. What would happen if you tried to 3D print directly from a noisy point cloud without mesh processing?
  4. When would you combine two scanning technologies on the same project?

Expected Outcomes

  • Completed Technology Comparison Worksheet with quantitative noise measurements
  • Written recommendations for five application scenarios with justified technology selections
  • Understanding that scanner selection is application-driven, not one-size-fits-all
  • Familiarity with CloudCompare or MeshLab basic navigation and analysis tools

Assessment Rubric

| Criterion | 4 (Excellent) | 3 (Good) | 2 (Fair) | 1 (Needs Improvement) |
|---|---|---|---|---|
| Data Quality Analysis | Thorough quantitative comparison with noise measurements; identifies all major artifacts | Good comparison with most artifacts identified | Basic comparison; misses some artifacts | Incomplete or inaccurate analysis |
| Scenario Recommendations | Clear, well-justified technology selection for all scenarios; considers multiple factors | Good recommendations with reasonable justification | Adequate selection but weak justification | Poor selection or missing justification |
| Software Proficiency | Confident use of viewing, sectioning, and plane-fitting tools | Competent use with minor difficulties | Basic navigation only; needed help with analysis tools | Unable to use software effectively |
| Discussion Participation | Insightful contributions; connects findings to broader concepts | Active participation with relevant comments | Minimal participation | Does not participate |

Safety Considerations

  • This is a computer-based activity with no physical hazards
  • Ensure proper ergonomic workstation setup (monitor height, chair position)
  • Take screen breaks every 20 minutes to reduce eye strain

Last Updated: 2026-03-19