Prophesee: Neuromorphic Event-Based Vision

Pioneering the Metavision® architecture: asynchronous, data-driven sensing that mimics the human retina for ultra-low latency.

1. The Metavision® Event Pipeline

Traditional cameras capture redundant frames. Prophesee's Metavision sensor architecture captures only *changes* in light at the pixel level, producing a sparse, high-speed stream of temporal events that flows asynchronously through the pipeline below.

👁️ Pixel Trigger → Event Stream → 🧠 Algorithm Engine → 🎯 Action Output

Asynchronous Pixel Trigger

Each pixel in the Metavision sensor acts independently. It only sends data when it detects a relative change in light intensity (temporal contrast).

Input Data

  • Incident Photons
  • Temporal Contrast Thresholds

Output Data

  • Discrete Event Trigger
  • Pixel Address (X, Y)
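The trigger logic above can be sketched as a toy model (illustrative only: the real sensor implements this in analog circuitry per pixel, and the class name and threshold value here are assumptions, not Prophesee's design):

```python
import math

# Toy model of a temporal-contrast pixel. An event fires when the
# log-intensity moves by more than a contrast threshold since the last event.
class ContrastPixel:
    def __init__(self, x, y, threshold=0.2, init_intensity=1.0):
        self.x, self.y = x, y
        self.threshold = threshold               # temporal contrast threshold (log units)
        self.log_ref = math.log(init_intensity)  # reference level at last event

    def observe(self, intensity, t_us):
        """Return an event (x, y, t_us, polarity) or None if nothing changed."""
        delta = math.log(intensity) - self.log_ref
        if abs(delta) >= self.threshold:
            self.log_ref = math.log(intensity)   # reset reference to current level
            return (self.x, self.y, t_us, 1 if delta > 0 else -1)
        return None                              # static scene -> zero output data

pixel = ContrastPixel(x=10, y=20)
samples = [1.0, 1.05, 1.5, 1.5, 0.5]             # brightness over time
events = [e for t, i in enumerate(samples) if (e := pixel.observe(i, t))]
# events -> [(10, 20, 2, 1), (10, 20, 4, -1)]: one ON and one OFF event
```

Note that the small fluctuation at t=1 and the static interval at t=3 produce no data at all, which is exactly where the sparsity of the event stream comes from.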

2. Metavision vs. Frame-Based Workloads

Prophesee sensors operate with **microsecond precision**, delivering temporal resolution equivalent to >10,000 fps while consuming 10x–100x less power than comparable frame-based systems. The data is sparse: compute is expended only where motion occurs.
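A back-of-envelope illustration of that sparsity (the resolution matches the HD sensor, but the activity fraction, event rate, and byte sizes below are assumptions for the sake of the arithmetic, not measured figures):

```python
# Data-rate comparison: a frame camera reads every pixel every frame,
# while an event sensor emits data only for pixels whose brightness changed.
width, height = 1280, 720          # HD sensor resolution
fps = 1000                         # frame rate needed for ~1 ms latency
bytes_per_pixel = 1

frame_rate_Bps = width * height * bytes_per_pixel * fps    # 921.6 MB/s

active_fraction = 0.01             # assume 1% of pixels see motion
events_per_pixel_per_s = 1000      # assumed event rate at active pixels
bytes_per_event = 8                # x, y, timestamp, polarity (packed)

event_rate_Bps = (width * height * active_fraction
                  * events_per_pixel_per_s * bytes_per_event)

ratio = frame_rate_Bps / event_rate_Bps    # 12.5x less data under these assumptions
```

With sparser scenes or lower per-pixel event rates the gap widens further, since the event stream scales with activity rather than with resolution times frame rate.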

🔍 Key Performance Metric

Metavision sensors achieve >120 dB dynamic range, allowing clear vision in extreme lighting (e.g., resolving detail in shadows while facing direct sunlight).
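The dB figure maps to an intensity ratio via the standard definition DR = 20·log10(I_max / I_min), so 120 dB corresponds to roughly a 1,000,000 : 1 spread between the brightest and darkest resolvable intensities:

```python
import math

# Dynamic range in decibels versus intensity ratio (standard definition;
# the values just illustrate the 120 dB figure quoted above).
def dynamic_range_db(i_max, i_min):
    return 20 * math.log10(i_max / i_min)

ratio_for_120db = 10 ** (120 / 20)   # 1,000,000 : 1 illumination ratio
```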

3. Foundational Research

The Metavision paradigm is supported by over a decade of neuromorphic engineering research and collaboration with industrial partners like Sony.

Event-based Vision: A Survey

Gallego et al. (IEEE TPAMI)

The definitive academic survey of event-based sensing, processing, and application paradigms.

Sony/Prophesee Stacked Sensor

ISSCC 2020 Whitepaper

Technical details on the first industrial-grade HD event-based vision sensor (IMX636).

Metavision Intelligence SDK

Prophesee Documentation

Proprietary algorithms for optical flow, tracking, and vibration analysis in sparse event spaces.
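For a flavor of what vibration analysis over sparse events can look like, here is a toy sketch (this is not Metavision SDK code, and the function below is not part of its API; the SDK's actual algorithms are proprietary):

```python
from statistics import median

# Toy vibration-frequency estimate from ON events at a single pixel.
# Assumption: a vibrating edge sweeps past the pixel once per cycle,
# so the median inter-event interval equals the oscillation period.
def vibration_frequency_hz(on_timestamps_us):
    t = sorted(on_timestamps_us)
    intervals = [b - a for a, b in zip(t, t[1:])]   # microseconds between events
    return 1e6 / median(intervals)                  # median is robust to jitter

# Synthetic 50 Hz vibration: one ON event every 20 ms for one second.
ts = [k * 20_000 for k in range(50)]
# vibration_frequency_hz(ts) -> 50.0
```

Because timestamps carry microsecond precision, frequency can be recovered per pixel without any frame rate ceiling, which is what makes event sensors attractive for predictive-maintenance monitoring.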

Strategic Partnership Opportunities

Prophesee's technology provides a massive competitive advantage for high-speed industrial, mobile, and safety-critical applications.

🏭

Industrial Automation

High-speed counting (>1,000 obj/sec) and predictive maintenance through vibration analysis.

🚗

Automotive ADAS

Ultra-fast obstacle detection and driver monitoring systems (DMS) with microsecond latency.

📱

Mobile & AR/VR

Low-power eye tracking and hand gesture recognition for next-gen spatial computing devices.
