Prophesee event-based vision kits and Raspberry Pi

Event-based vision (EBV) is really taking off. In this post we review the concepts and applications, along with Prophesee's products for EBV. Here's a reminder diagram and a short video for context; then we'll dig into using Prophesee EBV kits with Raspberry Pi.

Event-based vision is a new paradigm – Courtesy Prophesee

Frame-based vs. Event-based approach to eye tracking – for example:

So I want to try a Prophesee Metavision Evaluation Kit!

Enough theory already – I want to get hands-on with this! Given the Prophesee Metavision Evaluation Kits, how do I get started?

1st Vision is the official partner for Prophesee in the US, so start with a quote and purchase one of the Raspberry Pi 5 CSI modules built around the 320×320-pixel GenX320 sensor.

The main difference between the two is the lens mount: the M12 version lets you change lenses yourself, while the M6 version has a smaller camera front and a wider field of view. You can see the variations here:  https://www.1stvision.com/cameras/GenX320-Starter-Kit-for-Rasberry-Pi-5

You’ll need to purchase the Raspberry Pi elsewhere, as 1st Vision does not sell them.

Prophesee recommends the 8 GB Raspberry Pi 5 at a minimum, with the 16 GB model recommended for on-board vision computation.

Prophesee also recommends the Active Cooler, the 27 W power supply, and an NVMe SSD kit:

  1. Active Cooler (SC1148) – Buy a Raspberry Pi Active Cooler
  2. 27 W USB-C Power Supply – Buy a Raspberry Pi 27W USB-C Power Supply
  3. NVMe SSD Kit – Buy a Raspberry Pi SSD Kit

Software options

A further note: the Metavision 5 SDK will not run on the Raspberry Pi 5, as the Pi's CPU is insufficient for that computational load. You'll need to use the Metavision 4 OpenEB SDK instead. To be clear, the SDK choices are:

SDK for Raspberry Pi: Metavision 4 OpenEB (no cost)
SDK for PC: Metavision 5 SDK (bundled offer or standalone purchase)

Metavision SDK options by processor preference
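
If you'd like a feel for the code, a first "hello events" program with OpenEB is only a few lines. Below is a minimal sketch using OpenEB's Python bindings (Metavision 4.x); it assumes metavision_core is installed on the Pi and the GenX320 module is connected:

```python
# Minimal "hello events" sketch with OpenEB's Python bindings (Metavision 4.x).
# Assumes metavision_core is installed and an event camera is connected.
from metavision_core.event_io import EventsIterator

# An empty input path opens the first available live camera;
# delta_t groups the stream into 10 ms slices of events.
mv_iterator = EventsIterator(input_path="", delta_t=10000)

for evs in mv_iterator:
    if evs.size == 0:
        continue
    # evs is a numpy structured array with fields x, y, p (polarity), t (µs)
    e = evs[0]
    print(f"{evs.size} events; first at x={e['x']}, y={e['y']}, "
          f"p={e['p']}, t={e['t']} µs")
```

Each slice arrives as a numpy structured array, so familiar numpy tooling applies directly.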

If you would like to talk it through, just call us at 978-474-0044. Or use the link below to request we get back to you by either e-mail or phone.

1st Vision’s sales engineers have over 100 years of combined experience to assist in your camera and components selection. With a large portfolio of cameras, lenses, cables, NIC cards and industrial computers, we can provide a full vision solution!

About you: We want to hear from you!  We’ve built our brand on our know-how and like to educate the marketplace on imaging technology topics…  What would you like to hear about?… Drop a line to info@1stvision.com with what topics you’d like to know more about.

Whitepaper: Event-based sensing paradigm

With the exception of line-scan imaging, which is sometimes the compelling choice, machine vision has been dominated by frame-based approaches. (Compare Area-scan vs. Line-scan). With an area-scan camera, the entire two-dimensional sensor array of x pixels by y pixels is read out and transmitted over the digital interface to the PC host. Whether USB3, GigE, CoaXPress, CameraLink, or any other interface, that’s a lot of image data to transport.
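
To put a rough number on "a lot of image data", here is a back-of-the-envelope calculation; the resolution, bit depth, and frame rate are illustrative figures, not any particular camera's specs:

```python
# Back-of-the-envelope data rate for a frame-based camera.
# Illustrative figures: 1920 x 1080 monochrome, 8 bits/pixel, 60 fps.
width, height = 1920, 1080
bits_per_pixel = 8
fps = 60

bits_per_second = width * height * bits_per_pixel * fps
print(f"{bits_per_second / 1e9:.2f} Gbit/s")  # ~1.0 Gbit/s -- enough to saturate GigE
```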

Download whitepaper
Event-based sensing as alternative to frame-based approach

If your application is about motion, why transmit the static pixels?

The question above is intentionally provocative, of course. One might ask, “do I have a choice?” With conventional sensors, one really doesn’t, as their pixels just convert light to electrons according to the physics of CMOS, and readout circuits move the array of charges on down the interface to the host PC, for algorithmic interpretation. There’s nothing wrong with that! Thousands of effective machine vision applications use precisely that frame-based paradigm. Or the line-scan approach, arguably a close cousin of the area-scan model.

Consider the four-frame sequence to the left, in the context of a candidate golf-swing analysis application. As the post-processing markup in the legend shows, the blue-tinged golfer, club, and ball are undersampled, in the sense that there are phases of the swing no frame captures.

Meanwhile the non-moving tree, grass, and sky are needlessly re-sampled in each frame.

It takes an expensive high-frame-rate sensor and interface to significantly increase the sample rate. Plus storage capacity for each frame. And/or processing capacity – for automated applications – to separate the motion segments from the static segments.

With event-based sensing, introduced below, one can achieve the equivalent of 10k fps – by just transmitting the pixels whose values change.

Images courtesy Prophesee Metavision.

Event-based sensing only transmits the pixels that changed

Unlike photography for social media or commercial advertising, where real-looking images are usually the goal, for machine vision it’s all about effective (automated) applications. In motion-oriented applications, we’re just trying to automatically control the robot arm, drive the car, monitor the secure perimeter, track the intruder(s), monitor the vibration, …

We’re NOT worried about color rendering, pretty images, or the static portions in the field of view (FOV). With event-based sensing, “high temporal imaging” is possible, since one need only pay attention to the pixels whose values change.
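
Concretely, an event sensor's output is not a frame at all, but a sparse stream of change events. The sketch below shows the common (x, y, polarity, timestamp) convention; it is purely illustrative, not any particular camera's wire format:

```python
from dataclasses import dataclass

@dataclass
class Event:
    x: int         # pixel column
    y: int         # pixel row
    polarity: int  # 1 = brightness increased, 0 = brightness decreased
    t: int         # timestamp, in microseconds

# A static scene produces no events at all; only changing pixels "speak up".
stream = [Event(412, 227, 1, 1_000_013), Event(413, 227, 0, 1_000_021)]
```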

Consider the short video below. The left side shows a succession of frame-based images of a machine driven by an electric motor and belt. But the left-hand image sequence is not a helpful basis for monitoring vibration with an eye to scheduling (or skipping) maintenance, or anticipating breakdowns.

The right-hand sequence was obtained with an event-based vision sensor (EVS), and clearly reveals components with both “medium” and “significant” vibration. Here those thresholds have triggered color-mapped pseudo-images to aid comprehension. But an automated application could map the coordinates to take action, such as gracefully shutting down the machine, scheduling maintenance according to calculated risk, etc.

Courtesy Prophesee Metavision
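
To make the idea concrete, here is one simple way such a vibration map could be built: count events per pixel over a time window, convert counts to an approximate frequency, and threshold into "medium" and "significant" bands. Prophesee's SDK ships dedicated vibration-analysis tooling; this numpy sketch merely illustrates the principle, and the thresholds are made-up numbers:

```python
import numpy as np

def vibration_map(events, width, height, window_us,
                  medium_hz=30.0, significant_hz=120.0):
    """Crude vibration map: events/second per pixel, thresholded into bands.

    `events` is assumed to be a structured array with 'x' and 'y' fields,
    all falling within one `window_us` time window.
    """
    counts = np.zeros((height, width), dtype=np.int64)
    np.add.at(counts, (events['y'], events['x']), 1)

    # An oscillating edge generates roughly two events per cycle
    # (one per polarity), so events/sec divided by 2 approximates frequency.
    freq_hz = counts / 2.0 / (window_us * 1e-6)

    bands = np.zeros_like(counts, dtype=np.uint8)  # 0 = quiet
    bands[freq_hz >= medium_hz] = 1                # 1 = "medium" vibration
    bands[freq_hz >= significant_hz] = 2           # 2 = "significant" vibration
    return bands
```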

Another example to help make it real:

Here’s another short video, which brings to mind applications like autonomous vehicles and security. It’s not meant to be pretty – it’s meant to show that the sensor detects and transmits just the pixels that correspond to change:

Courtesy Prophesee Metavision

Event-based sensing – it really is a different paradigm

Even (especially?) if you are seasoned at line-scan or area-scan imaging, it’s a paradigm shift to understand event-based sensing. Inspired by human vision, and built on the foundation of neuromorphic engineering, it’s a new technology – and it opens up new kinds of applications. Or alternative ways to address existing ones.

Download whitepaper
Event-based sensing as alternative to frame-based approach

Download the whitepaper and learn more about it! Or fill out our form below – we’ll follow up. Or just call us at 978-474-0044.


#EVS #event-based #neuromorphic

Prophesee event-based vision – a new paradigm

We don’t use terms like paradigm shift lightly. Event-based vision (EBV) is shaking things up just as much as the arrival of frame-based and line-scan imaging once did. It’s that different – and that’s why we are excited to offer Prophesee sensors and cameras.

Event-based vision applications areas – Courtesy Prophesee

Applications examples… and just enough about concepts

This informational blog skews towards application examples, with just enough about concepts and EBV technology to lend balance. Event-based vision is so new, and so different from previous vision technologies, that we believe our readers may appreciate an examples-driven approach to understanding this radically new branch of machine vision.

Unless you’re an old hand at event-based vision (EBV) …

…and as of this writing in Summer 2025 few could be, let’s show a couple of short teaser videos before we explain the concepts or go deeper on applications.

Example 1: High-speed multiple-object counting and sizing

The following shows event-based imaging used both to count particles or objects and to estimate their size, all in a high-speed environment with many concurrent targets.

Courtesy Prophesee
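
For intuition, here is a generic sketch of how counting and sizing can work with an event stream: accumulate a short slice of events into a binary image, then run connected-components labeling to count blobs and measure their pixel areas. This is a common approach, not Prophesee's actual algorithm, and the helper below is hypothetical:

```python
import numpy as np
from scipy import ndimage

def count_and_size(events, width, height):
    """Count objects in one short event slice and estimate their pixel areas.

    `events` is assumed to be a structured array with 'x' and 'y' fields,
    from a slice short enough that each object appears as a single blob.
    """
    img = np.zeros((height, width), dtype=bool)
    img[events['y'], events['x']] = True

    labels, n_objects = ndimage.label(img)  # connected-components labeling
    areas = ndimage.sum(img, labels, index=range(1, n_objects + 1))
    return n_objects, areas  # object count, and per-object area in pixels
```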

Example 2: Eye-tracking – only need to track pupil changes

Now consider eye-tracking: in the video below, we see synchronized side-by-side views of the same scene. The left side was obtained with a frame-based sensor. The right side used a Prophesee event-based sensor. Why “waste” bandwidth and processing resources separating a pupil from an iris, eyelids, and eyebrows, when the goal is eye-tracking?

Courtesy Prophesee

Radically increased speed; massively less data

Prophesee Metavision’s sensor tracks “just” what changes in the field of view, instead of the frame-based approach of reading out whole sensor arrays, transmitting voluminous data from the camera to the host, and algorithmically looking for edges, blobs, or features.

Contact us

Example 3: Object tracking

Not unlike the eye-tracking example above, but this time with multiple moving targets – vehicular traffic. With frame-based vision, bandwidth and processing have to be spent repeatedly on static pavement, light poles, guard rails, and other “clutter” comprising 80% of the field of view. But with event-based vision, the sensor detects and transmits only the moving vehicles.

Whether counting traffic for congestion planning/reporting, or collision avoidance aboard a given vehicle, we don’t care what the vehicle looks like – only that it’s there, moving along a certain trajectory, and at a detected speed.

Example 4: Surveillance and security

Prophesee named this video “Optical Flow Crowd”, which we suggest is best understood in the context of security and surveillance. It’s not unlike the vehicular-flow example above, except that cars and trucks mostly stay in lane, whereas pedestrians move at diverse angles – and their arm, leg, head, and torso movements also convey information.

The (computed) vector overlays indicate speed and directional changes, important for revealing potentially dangerous actions already taken or likely to emerge. For example, does a raised arm indicate a handgun or knife being readied, or is it just a prelude to scratching an itchy nose? Is there a pursuer turning towards another pedestrian, to whom a nearby police officer should be alerted?

Example 5: Vibration monitoring and preventative maintenance

Motorized equipment is typically serviced on a preventative-maintenance schedule, on a break-fix approach, or both, depending on costs, legal liabilities, risks, etc. What if one could inexpensively identify vibration patterns that reveal wearing belts or bearings before a breakdown – and before there is preventable collateral damage to other components?

Courtesy Prophesee

Enough with the examples – how can I get event-based sensors?

1stVision is pleased to represent Prophesee with a wide range of sensors, evaluation kits, board-level and housed cameras, and an SDK designed for event-based vision applications.

Built on neuromorphic engineering principles inspired by the brain’s neural networks and human vision, Prophesee products are surprisingly affordable, enabling new fields of event-based vision – and improving on frame-based vision applications that may be done better, faster, or less expensively using an event-based approach.


IDS Imaging XCP-E Event-Based Cameras – and the Sony Prophesee IMX636 Sensor

Recently we introduced the IDS Imaging event-based cameras, the uEye XCP-E models. These cameras are a paradigm shift – they detect changes in pixel brightness and transmit ONLY those changes. This dramatically reduces data load, lowers latency, and improves efficiency.

uEye XCP-E event based cameras – housed and board-level options – Courtesy IDS Imaging

Speed, speed, speed

Temporal resolution is better than 100 µs, so rapid changes can be detected: 1/(100 µs) = 10,000 samples per second, equivalent to an area-scan camera operating at more than 10,000 fps.

Paradigm shift – from “frames” to “events”

This is one of those aha moments. Too often we get jaded, believing that things only evolve incrementally – Moore’s law, electronics getting faster and less expensive, and so on. Yawn. But this really is a game changer, worth getting one’s head wrapped around.

Paradigm shift – an event-based sensor enables a new applications approach – Courtesy Sony / Prophesee

The event-based vision sensor (EVS) was developed by Sony and Prophesee.

Contact us to talk to an expert!

Industrial use cases

With a radically new technology, even machine vision veterans may appreciate seeing example applications already utilizing event-based imaging:

Some typical applications for event-based imaging – Courtesy Sony / Prophesee

Example applications

The following are meant to be suggestive rather than exhaustive – just enough to whet the appetite.

Courtesy Sony / Prophesee

Let’s enlarge that “alternatives” comparison from the above graphic:

Which level of complexity, performance outcomes, and costs do you prefer? Courtesy Sony / Prophesee

Another class of applications

Courtesy Sony / Prophesee
Vibration monitoring can alert the need for preventative maintenance – Courtesy Sony / Prophesee

Take the guesswork out of when to do preventative maintenance. Maximize uptime. Reduce the risks of catastrophic failure. These cameras are affordable enough to let them do the vibration monitoring – just set your alert threshold!

Let the sensor monitor vibration frequencies – Courtesy Sony / Prophesee

Software for event-based imaging applications

New paradigm best served by Metavision Technology Software – Courtesy Sony / Prophesee
Metavision SDK modules built for event-based imaging applications – Courtesy Sony / Prophesee

Get it? Got it? Need it? Want it?

This is a new way of coming at machine vision applications. It may give you an edge over your competition by enabling you to improve quality, reduce costs, and/or innovate new products or services. It really is different. See the IDS Imaging uEye XCP-E event-based cameras and datasheets. Powered by the Sony IMX636 developed in conjunction with Prophesee.

Tell us more about your application, using the link below. Or just call us at 978-571-5552. We can help you determine whether event-based imaging is optimal for your application.
