# When the Data Surprised Us
## Counterintuitive Findings from the Field
### A 45-Minute Presentation for DTS Internal Staff
---
# PRESENTATION OVERVIEW
**Presenter:** Ben, Application Engineer
**Audience:** DTS engineering and internal staff
**Duration:** 45 minutes
**Goal:** Engage engineers with surprising stories, reinforce value of measurement, create memorable learning moments
**Key Themes:**
1. Reality often contradicts expectations
2. Data reveals what the eye cannot see
3. Trust the data, not assumptions
4. This is why rigorous measurement matters
**Tone:** Engaging, interactive, puzzle-like. Audience guessing before reveals.
---
# SLIDE-BY-SLIDE SCRIPT AND SPEAKER NOTES
---
## SLIDE 1: Title Slide
**Visual:** Question marks and data traces
**Title:** When the Data Surprised Us
**Subtitle:** Counterintuitive Findings from the Field
### Speaker Notes:
> Welcome to my favorite presentation to give. This is the one where I get to be a game show host.
>
> Over the years in the field, I've seen things that made me say, "Wait, that can't be right." Results that defied expectations. Data that contradicted what everyone assumed.
>
> Today I'm going to share some of those stories. And I'm going to make you guess before I reveal the answer.
>
> Fair warning: Some of these will mess with your head. That's the point.
---
## SLIDE 2: The Rules of the Game
**Visual:** Numbered list
**Content:**
1. I'll show you a scenario
2. You guess what happened
3. I reveal the data
4. We discuss why it matters
### Speaker Notes:
> Here are the rules.
>
> I'll show you a scenario. You'll have a few seconds to think about what you expect the data to show.
>
> Then I'll reveal what actually happened. Sometimes you'll be right. Often you won't.
>
> And then we'll talk about why it matters for the work we do.
>
> Ready? Let's go.
---
## SLIDE 3: Puzzle 1 Introduction
**Visual:** Section divider
**Title:** PUZZLE 1: The Beautiful Wreck
### Speaker Notes:
> Puzzle one. I call this one "The Beautiful Wreck."
---
## SLIDE 4: The Setup - Two Cars
**Visual:** Two post-crash photos side by side
**Car A:** Severely crumpled front end, pushed back, visually destroyed
**Car B:** Relatively intact, minimal visible damage
### Speaker Notes:
> Here are two cars after frontal crash tests. Same test. Same speed. Same barrier.
>
> Car A - on the left - is destroyed. The front end is pushed back almost to the firewall. The hood is crumpled like paper. It looks catastrophic.
>
> Car B - on the right - looks pretty good. Some crumpling, sure, but the structure held. The passenger compartment looks intact.
>
> *[Pause]*
>
> Quick poll: Which car do you think had better occupant protection? Which car would you rather be in during a crash?
>
> *[Wait for audience response - most will say Car B]*
---
## SLIDE 5: The Reveal
**Visual:** Data comparison
**Car A (destroyed):** HIC 485, Chest 38mm, Femur 4.2 kN - ALL GOOD
**Car B (intact):** HIC 890, Chest 58mm, Femur 9.1 kN - MARGINAL/POOR
### Speaker Notes:
> *[Dramatic pause]*
>
> The destroyed car - Car A - had excellent dummy numbers. HIC of 485. Chest deflection of 38 millimeters. Femur load of 4.2 kilonewtons.
>
> The intact-looking car - Car B - had terrible dummy numbers. HIC of 890. Chest deflection of 58 millimeters. Femur load just under the injury threshold.
>
> *[Let this sink in]*
>
> You would rather be in the car that looked destroyed.
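The HIC numbers on this slide come from integrating the dummy's head acceleration trace. As a reference for anyone who hasn't computed one, here is a minimal sketch of the standard HIC calculation (HIC15 window assumed; the trace and function name are illustrative, not from an actual test):

```python
import numpy as np

def hic(accel_g, dt, max_window_s=0.015):
    """Head Injury Criterion over a resultant head acceleration trace.

    accel_g:      acceleration samples in g
    dt:           sample interval in seconds
    max_window_s: window limit (0.015 s for HIC15)
    """
    n = len(accel_g)
    # Running trapezoidal integral of a(t) dt, so the average over
    # [t1, t2] is (cum[j] - cum[i]) / (t2 - t1).
    cum = np.concatenate(
        ([0.0], np.cumsum((accel_g[:-1] + accel_g[1:]) * 0.5 * dt)))
    max_k = int(round(max_window_s / dt))
    best = 0.0
    for i in range(n):
        for j in range(i + 1, min(i + max_k, n - 1) + 1):
            T = (j - i) * dt
            avg = (cum[j] - cum[i]) / T
            best = max(best, T * avg ** 2.5)
    return best
```

A constant 50 g pulse held for the full 15 ms window gives 0.015 × 50^2.5 ≈ 265, which is why short, sharp spikes dominate the score: the average acceleration is raised to the 2.5 power.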
---
## SLIDE 6: Why This Happens
**Visual:** Energy absorption diagram
**Caption:** Crush zone is designed to absorb energy - that's the point
### Speaker Notes:
> Why does this happen?
>
> Because crush zones are supposed to crush. That's their job. They absorb energy by deforming. The more they crumple, the more energy they absorb before it reaches the occupant.
>
> Car A crumpled beautifully. It extended the deceleration pulse. It gave the occupant more time to slow down. Less force on the body.
>
> Car B didn't crumple enough. It was too stiff. The deceleration pulse was short and sharp. The occupant hit a wall of g-forces.
>
> Looking at the car tells you nothing about occupant safety. Only the data does.
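The "more time to slow down" point is just kinematics. A quick illustrative calculation (the speed and pulse durations here are made-up round numbers, not data from these two tests) shows how doubling the pulse duration halves the average deceleration:

```python
# Average deceleration for the same speed change over two pulse durations.
# Numbers are illustrative, not from a specific crash test.
G = 9.81    # m/s^2 per g
v0 = 15.6   # ~35 mph in m/s

for label, pulse_s in [("long pulse (car crumples)", 0.120),
                       ("short pulse (car too stiff)", 0.060)]:
    avg_g = v0 / pulse_s / G
    print(f"{label}: {avg_g:.1f} g average")
```

Same crash energy either way; the crumpling car simply spreads it over more time, and the occupant feels roughly half the average g.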
---
## SLIDE 7: The Lesson
**Visual:** Quote
**Text:** "Never judge a crash by looking at the car. Judge it by looking at the data."
### Speaker Notes:
> The lesson: Never judge a crash by looking at the car.
>
> I've seen this mistake made by journalists writing about crash tests. "Look how much damage!" they say. And they conclude the car is unsafe.
>
> They've got it backwards. The damage is the safety. The crush zone did its job.
>
> This is why we need instrumentation. This is why we need data. Because what you see is not what matters.
---
## SLIDE 8: Puzzle 2 Introduction
**Visual:** Section divider
**Title:** PUZZLE 2: The Survivor's Paradox
### Speaker Notes:
> Puzzle two. The Survivor's Paradox.
---
## SLIDE 9: The Setup - Rollover Statistics
**Visual:** Statistics
**Text:** "Rollovers are only 3% of crashes, but account for 33% of fatalities"
### Speaker Notes:
> Here's a statistic that drove safety engineers crazy for years.
>
> Rollovers are only about 3% of crashes. But they account for roughly a third of fatalities.
>
> Obviously, we need stronger roofs, right? Prevent the roof from crushing the occupants during rollover.
>
> In 2009, NHTSA doubled the roof crush requirement - from 1.5 times vehicle weight to 3 times vehicle weight.
>
> Question: How much did this reduce rollover fatalities?
---
## SLIDE 10: The Reveal
**Visual:** Graph showing minimal change
**Text:** "Rollover fatalities: Essentially unchanged after roof crush upgrade"
### Speaker Notes:
> *[Pause]*
>
> Essentially unchanged.
>
> Despite doubling the roof strength requirement, rollover fatalities barely moved.
>
> *[Wait for reaction]*
>
> How is that possible? Wasn't roof crush the problem?
---
## SLIDE 11: The Data Revealed the Real Problem
**Visual:** Diagram showing ejection during rollover
**Text:** "70%+ of rollover fatalities involve ejection"
### Speaker Notes:
> The data revealed the real problem.
>
> Over 70% of rollover fatalities involve ejection. The occupant isn't killed by the roof crushing - they're killed by being thrown from the vehicle during the roll.
>
> Once you're outside the vehicle, everything is hard. The ground. Other cars. Poles. The vehicle itself rolling over you.
>
> Roof crush was a secondary problem. Ejection was the primary problem.
---
## SLIDE 12: The Real Solution
**Visual:** Side curtain airbag deployment
**Text:** "Side curtain airbags with rollover sensors: 41% reduction in ejection"
### Speaker Notes:
> The real solution was side curtain airbags.
>
> Airbags that deploy during rollover and stay inflated for 5-6 seconds. Long enough to keep occupants inside the vehicle through multiple rolls.
>
> Combined with better seatbelt pretensioners and rollover-sensing systems, ejection rates dropped significantly.
>
> The data - ejection statistics, injury mechanisms, survival analysis - pointed to a completely different solution than the intuitive one.
>
> Without data, we'd still be making roofs stronger and wondering why people were dying.
---
## SLIDE 13: The Lesson
**Visual:** Quote
**Text:** "The obvious solution isn't always the right solution. Let the data tell you what's actually happening."
### Speaker Notes:
> The lesson: The obvious solution isn't always the right solution.
>
> Everyone assumed roof crush was the killer. The data said ejection was the killer.
>
> This happens more than you'd think. The assumed problem isn't the actual problem. And the only way to know is to measure.
---
## SLIDE 14: Puzzle 3 Introduction
**Visual:** Section divider
**Title:** PUZZLE 3: The Luxury Failure
### Speaker Notes:
> Puzzle three. I mentioned this briefly in another context, but let's really dig into it.
>
> The Luxury Failure.
---
## SLIDE 15: The Setup
**Visual:** Logos of BMW, Mercedes, Audi, Lexus
**Text:** "The most expensive cars. The most prestigious brands. Decades of safety engineering."
### Speaker Notes:
> BMW. Mercedes-Benz. Audi. Lexus.
>
> These are the most expensive, most prestigious car brands in the world. They employ thousands of engineers. They've been building cars for decades.
>
> They all passed the moderate overlap frontal test with flying colors. Great safety records. Premium pricing partly justified by "superior safety engineering."
>
> In 2012, IIHS introduced the small overlap frontal test. Twenty-five percent overlap instead of forty. Rigid barrier instead of deformable.
>
> What do you think happened?
---
## SLIDE 16: The Reveal
**Visual:** Test results with photos
**Results:**
- BMW 5-Series: MARGINAL
- Mercedes C-Class: MARGINAL
- Audi A4: MARGINAL
- Lexus ES 350: POOR
### Speaker Notes:
> *[Let the slide speak first]*
>
> They failed.
>
> BMW 5-Series: Marginal.
> Mercedes C-Class: Marginal.
> Audi A4: Marginal.
> Lexus ES 350: Poor.
>
> Some of the most expensive vehicles on the road couldn't pass a new crash test.
---
## SLIDE 17: What the Data Showed
**Visual:** Crash photos showing intrusion
**Annotations:** Footwell collapse, steering wheel displacement, head contact with structure
### Speaker Notes:
> The data told a brutal story.
>
> In these vehicles, when the impact was offset to the corner, the crash forces bypassed the main structural rails. The wheel got pushed back into the footwell. The firewall collapsed. The steering column displaced into the occupant.
>
> The instrumented dummy data showed:
> - Elevated HIC from head striking interior structures
> - Femur loads near the injury threshold
> - Lower leg compression in the danger zone
> - Feet trapped in crushed footwells
>
> These vehicles were engineered for the tests that existed. They weren't engineered for this test.
---
## SLIDE 18: The Deeper Lesson
**Visual:** Text
**Quote:** "Engineering to pass a test is not the same as engineering for safety."
### Speaker Notes:
> Here's the deeper lesson.
>
> Engineering to pass a test is not the same as engineering for safety.
>
> These automakers had optimized for the 40% overlap test. Their structures were brilliant - for that test. But they'd inadvertently created a vulnerability at the corner.
>
> This is why test protocols must evolve. This is why consumer information testing matters. And this is why we need precise data - because without it, you don't know if you've actually solved the problem or just solved one specific scenario.
---
## SLIDE 19: Puzzle 4 Introduction
**Visual:** Section divider
**Title:** PUZZLE 4: The 46.2g Man
### Speaker Notes:
> Puzzle four. This one goes back to the early days of safety testing.
>
> The 46.2g Man.
---
## SLIDE 20: The Setup
**Visual:** 1950s photo or rocket sled
**Text:** "1954: Conventional wisdom said humans could not survive more than 18g"
### Speaker Notes:
> It's 1954. The jet age is dawning. Pilots are flying faster than ever before. And they're dying.
>
> Not from crashes - from ejections. Pilots ejecting from aircraft were experiencing massive g-forces. And medical authorities believed that humans could not survive more than 18g of deceleration.
>
> If that was true, high-speed ejection was a death sentence. There was no point designing ejection seats for speeds beyond a certain threshold.
>
> One researcher decided to test this assumption. Not with dummies. With himself.
---
## SLIDE 21: John Paul Stapp
**Visual:** Photo of John Paul Stapp
**Text:** "Colonel John Paul Stapp - Air Force flight surgeon"
### Speaker Notes:
> Colonel John Paul Stapp. Air Force flight surgeon. And the bravest - or craziest - researcher in the history of safety testing.
>
> Stapp strapped himself into a rocket sled. And he ran test after test, pushing higher and higher g-forces.
---
## SLIDE 22: The Reveal
**Visual:** Data display
**Text:** "December 10, 1954: 46.2g sustained. 632 mph to zero in 1.4 seconds."
### Speaker Notes:
> On December 10, 1954, Stapp rode the rocket sled to 632 miles per hour - faster than a .45 caliber bullet.
>
> Then he hit the brakes.
>
> In 1.4 seconds, he decelerated from 632 mph to zero. He experienced 46.2g of sustained deceleration.
>
> That's more than 2.5 times what "experts" said would kill him.
>
> He survived. He walked away - temporarily blinded, with burst blood vessels in his eyes, but alive and conscious.
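Worth noting when presenting: 632 mph to zero in 1.4 seconds works out to about 21 g on average; the famous 46.2 g figure is the peak during the pulse. A quick sanity check, assuming constant deceleration:

```python
# Back-of-envelope check on the Stapp run: 632 mph to zero in 1.4 s.
# This gives the AVERAGE deceleration; 46.2 g was the peak.
MPH_TO_MS = 0.44704   # exact conversion factor
G = 9.81              # m/s^2 per g

v0 = 632 * MPH_TO_MS        # ~282.5 m/s
avg_decel = v0 / 1.4        # m/s^2, assuming constant deceleration
print(f"average: {avg_decel / G:.1f} g")
```

Useful if someone in the audience does the arithmetic and asks why it doesn't come out to 46.2 g.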
---
## SLIDE 23: What the Data Revealed
**Visual:** Comparison chart
**18g "limit" vs. 46.2g survival**
### Speaker Notes:
> Stapp proved that the human body could survive far more than anyone believed.
>
> This data - captured by instrumentation on his sled and his body - revolutionized aviation safety. It meant ejection seats could be designed for higher speeds. It meant survival was possible in scenarios previously considered hopeless.
>
> And it laid the foundation for automotive crash testing. If humans could survive 46g, then the question became: How do we design vehicles so that occupants never experience more than they can tolerate?
>
> Stapp's data made modern crash safety possible.
---
## SLIDE 24: The Lesson
**Visual:** Quote
**Text:** "Assumptions are not data. Test it. Measure it. Know for certain."
### Speaker Notes:
> The lesson: Assumptions are not data.
>
> For years, everyone "knew" that 18g was fatal. Nobody had tested it. They just assumed.
>
> Stapp tested it. He measured it. And he proved them wrong.
>
> This is the foundation of everything we do. Don't assume. Test. Measure. Know for certain.
---
## SLIDE 25: Puzzle 5 Introduction
**Visual:** Section divider
**Title:** PUZZLE 5: The Airbag Arms Race
### Speaker Notes:
> Last puzzle. I call this one the Airbag Arms Race.
---
## SLIDE 26: The Setup
**Visual:** Timeline graphic
**1970:** Airbags proposed
**1984:** First mandates
**1998:** Full mandate
**2000s:** "Advanced" airbags
### Speaker Notes:
> The history of airbag regulation is fascinating.
>
> 1970: Airbags first proposed as an alternative to seatbelts.
> 1984: First automatic restraint requirements (airbag OR automatic belt).
> 1998: Airbags mandatory in all vehicles.
> 2000s: "Advanced" airbags with occupant classification.
>
> Each generation, airbags got more sophisticated. More sensors. More inflation stages. More intelligence.
>
> Here's the question: How much faster did airbag technology advance during the mandate period versus before?
---
## SLIDE 27: The Reveal
**Visual:** Comparison
**Pre-mandate (1970-1984):** Minimal innovation
**Post-mandate:** Explosion of patents, technologies, features
### Speaker Notes:
> The answer is dramatic.
>
> Before mandates, airbag technology advanced slowly. Why invest in something that wasn't required?
>
> After mandates, innovation exploded. Multi-stage inflators. Occupant sensors. Position detection. Seatbelt pretensioner integration. Side airbags. Curtain airbags. Knee airbags.
>
> The regulation created the market. The market drove innovation. And the innovation saved lives.
---
## SLIDE 28: The Data That Drove It
**Visual:** Graph
**Text:** "Lives saved by airbags: From 0 (1985) to ~2,800/year (today)"
### Speaker Notes:
> Today, airbags save approximately 2,800 lives per year in the United States.
>
> That didn't happen because of voluntary adoption. It happened because regulation required it, testing proved it worked, and data validated each improvement.
>
> Without crash testing data, you couldn't prove an airbag design was better. Without that proof, you couldn't justify the R&D investment. Without the investment, you couldn't innovate.
>
> Data is the engine of safety progress.
---
## SLIDE 29: The Meta-Lesson
**Visual:** Text
**Quote:** "Measurement isn't just about understanding what happened. It's about enabling what comes next."
### Speaker Notes:
> Here's the meta-lesson from all five puzzles.
>
> Measurement isn't just about understanding what happened. It's about enabling what comes next.
>
> The beautiful wreck taught us that crush zones work.
> The rollover data taught us that ejection was the real problem.
> The small overlap test taught us that we'd missed a vulnerability.
> Stapp's sled taught us what humans can survive.
> The airbag arms race taught us that mandates drive innovation.
>
> In every case, data didn't just describe - it enabled. It pointed the way forward.
---
## SLIDE 30: Implications for DTS
**Visual:** Connection to DTS products
**Text:** "Our instrumentation is the source of these surprises."
### Speaker Notes:
> What does this mean for DTS?
>
> Our instrumentation is the source of these surprises. When data contradicts expectations, it's because someone measured something that hadn't been measured before. Or measured it more precisely. Or measured it in a new context.
>
> Every sensor we design, every data acquisition system we build, every calibration we perform - it's potentially the source of the next surprise. The next counterintuitive finding that changes how vehicles are built.
>
> That's a remarkable responsibility. And a remarkable opportunity.
---
## SLIDE 31: Your Turn
**Visual:** Challenge
**Text:** "What assumptions are we making today that data will overturn tomorrow?"
### Speaker Notes:
> I want to leave you with a challenge.
>
> What assumptions are we making today that data will overturn tomorrow?
>
> In the 1950s, everyone assumed 18g was fatal. Wrong.
> In the 2000s, everyone assumed roof crush was the rollover killer. Wrong.
> In 2012, everyone assumed luxury cars were safer. Wrong.
>
> What are we assuming now that we haven't tested? What do we "know" that might not be true?
>
> Those are the questions that lead to the next breakthrough.
>
> Thank you.
---
## SLIDE 32: Questions
**Visual:** "Questions?"
**Subtitle:** Ben - Application Engineer
### Speaker Notes:
> I'm happy to take questions. And if you've got your own "data surprised me" story, I'd love to hear it.
>
> *[Open for Q&A]*
---
# APPENDIX: ADDITIONAL PUZZLES (BACKUP)
## Puzzle: The Safer Smaller Car
- Setup: Large SUV vs. small car in head-on collision
- Assumption: SUV occupant always safer
- Reality: Depends on crash compatibility; some small cars protect better than some SUVs
- Lesson: Mass isn't everything; structure and restraints matter
## Puzzle: The Helmet That Failed
- Setup: Two helmets - one heavy, one light
- Assumption: Heavier = more protection
- Reality: Lighter helmet had better SI scores due to better energy-absorbing liner
- Lesson: Material science matters more than mass
## Puzzle: The Zero-Star Five-Star
- Setup: Same vehicle, two different NCAP programs
- Assumption: Rating should be consistent
- Reality: Different test protocols, different dummies, different results
- Lesson: Test protocol details matter enormously
## Puzzle: The Unbreakable Neck
- Setup: Early dummy neck designs
- Assumption: Stiff neck provides realistic injury measurement
- Reality: Too stiff = underestimate injury risk; too flexible = overestimate
- Lesson: Biofidelity is a constant calibration challenge
---
# TIMING GUIDE
| Section | Duration | Cumulative |
|---------|----------|------------|
| Opening (Slides 1-2) | 3 min | 3 min |
| Puzzle 1 - Beautiful Wreck (Slides 3-7) | 7 min | 10 min |
| Puzzle 2 - Survivor's Paradox (Slides 8-13) | 7 min | 17 min |
| Puzzle 3 - Luxury Failure (Slides 14-18) | 8 min | 25 min |
| Puzzle 4 - 46.2g Man (Slides 19-24) | 8 min | 33 min |
| Puzzle 5 - Airbag Arms Race (Slides 25-28) | 6 min | 39 min |
| Closing (Slides 29-32) | 4 min | 43 min |
| Q&A Buffer | 2 min | 45 min |
---
# ENGAGEMENT TECHNIQUES
1. **Polling:** Ask audience to guess before reveals. Show of hands or verbal responses.
2. **Pause:** Give time for guesses. Don't rush to the answer.
3. **Callback:** Reference earlier puzzles when discussing later ones.
4. **Humor:** The surprises are inherently engaging - lean into the "gotcha" moment.
5. **Validation:** When audience guesses right, acknowledge it.
6. **Discussion:** For some puzzles, open brief discussion before revealing.
---
# VISUAL NOTES
- Use high-contrast reveals (gray "hidden" state, then color)
- Include actual data traces where possible
- Photos of actual crashes add credibility
- Keep slides uncluttered - one main point per slide
- Consider animation for reveals (build in PowerPoint)
---
*End of Script and Speaker Notes*