Whitepaper: Event-based sensing paradigm

Aside from line-scan imaging, which is compelling for certain applications, machine vision has long been dominated by frame-based approaches. (Compare Area-scan vs. Line-scan.) With an area-scan camera, the entire two-dimensional sensor array of x pixels by y pixels is read out and transmitted over the digital interface to the host PC. Whether USB3, GigE, CoaXPress, Camera Link, or any other interface, that’s a lot of image data to transport.


If your application is about motion, why transmit the static pixels?

The question above is intentionally provocative, of course. One might ask, “Do I have a choice?” With conventional sensors, one really doesn’t: their pixels simply convert light to electrons according to the physics of CMOS, and readout circuits move the resulting array of values down the interface to the host PC for algorithmic interpretation. There’s nothing wrong with that! Thousands of effective machine vision applications use precisely that frame-based paradigm – or the line-scan approach, arguably a close cousin of the area-scan model.

Consider the four-frame sequence to the left, in the context of a candidate golf-swing analysis application. Per the legend, the blue-tinted golfer, club, and ball (highlighted with post-processing markup) are undersampled: entire phases of the swing fall between the frames and go unrecorded.

Meanwhile the non-moving tree, grass, and sky are needlessly re-sampled in each frame.

It takes an expensive high-frame-rate sensor and interface to significantly increase the sample rate. Plus storage capacity for each frame. And/or processing capacity – for automated applications – to separate the motion segments from the static segments.

With event-based sensing, introduced below, one can achieve the equivalent of 10,000 fps – by transmitting only the pixels whose values change.

Images courtesy Prophesee Metavision.

Event-based sensing only transmits the pixels that changed

Unlike photography for social media or commercial advertising, where real-looking images are usually the goal, for machine vision it’s all about effective (automated) applications. In motion-oriented applications, we’re just trying to automatically control the robot arm, drive the car, monitor the secure perimeter, track the intruder(s), monitor the vibration, …

We’re NOT worried about color rendering, pretty images, or the static portions of the field of view (FOV). With event-based sensing, high-temporal-resolution imaging becomes possible, since one need only pay attention to the pixels whose values change.
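To make that idea concrete, here is a minimal sketch in Python (using numpy) of how events can be derived from two successive frames: emit an event only for pixels whose log-intensity changed by more than a contrast threshold. A real EVS pixel does this asynchronously in analog circuitry, not frame-by-frame; the threshold value and function names here are purely illustrative.

```python
import numpy as np

CONTRAST_THRESHOLD = 0.15  # illustrative; real sensors use per-pixel analog thresholds

def events_from_frames(prev_frame, curr_frame, threshold=CONTRAST_THRESHOLD):
    """Approximate event generation: emit (x, y, polarity) for each pixel whose
    log-intensity changed by more than the contrast threshold. A real EVS pixel
    does this asynchronously in analog circuitry, not frame-by-frame."""
    diff = (np.log1p(curr_frame.astype(np.float64))
            - np.log1p(prev_frame.astype(np.float64)))
    ys, xs = np.nonzero(np.abs(diff) > threshold)
    polarities = (diff[ys, xs] > 0).astype(np.int8)  # 1 = brighter, 0 = darker
    return xs, ys, polarities

# Two synthetic 8-bit frames: only a small patch changes between them.
prev = np.full((480, 640), 100, dtype=np.uint8)
curr = prev.copy()
curr[200:210, 300:310] = 180  # a "moving object" brightens a 10x10 patch

xs, ys, pol = events_from_frames(prev, curr)
print(f"{xs.size} events from {prev.size} pixels "
      f"({100 * xs.size / prev.size:.3f}% of the array)")
```

Note how little leaves the sensor: 100 events out of 307,200 pixels, because only the patch that changed generates output.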

Consider the short video below. The left side shows a succession of frame-based images of a machine driven by an electric motor and belt. Because successive frames look nearly identical, the left-hand sequence is not a helpful basis for monitoring vibration with an eye to scheduling (or skipping) maintenance, or for anticipating breakdowns.

The right-hand sequence was obtained with an event-based vision sensor (EVS), and it clearly reveals components with both “medium” and “significant” vibration. Here those thresholds have triggered color-mapped pseudo-images to aid comprehension, but an automated application could map the event coordinates to actions, such as gracefully shutting down the machine or scheduling maintenance according to calculated risk.

Courtesy Prophesee Metavision
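As a sketch of how such an automated application might work, the following hypothetical Python fragment maps event coordinates into regions of interest drawn around machine components and applies “medium” and “significant” thresholds to the event counts. All region coordinates and threshold values are invented for illustration, as is the random stand-in event data.

```python
import numpy as np

# One time-window of events; here random stand-ins for real sensor output.
rng = np.random.default_rng(0)
events_x = rng.integers(0, 640, 5000)
events_y = rng.integers(0, 480, 5000)

# Regions of interest drawn around machine components (invented coordinates).
REGIONS = {
    "motor":   (0, 0, 320, 240),      # x0, y0, x1, y1
    "belt":    (320, 0, 640, 240),
    "bearing": (0, 240, 640, 480),
}
MEDIUM, SIGNIFICANT = 1000, 3000      # events per window; invented thresholds

for name, (x0, y0, x1, y1) in REGIONS.items():
    inside = ((events_x >= x0) & (events_x < x1) &
              (events_y >= y0) & (events_y < y1))
    count = int(inside.sum())
    if count >= SIGNIFICANT:
        print(f"{name}: SIGNIFICANT vibration ({count} events) - shut down gracefully")
    elif count >= MEDIUM:
        print(f"{name}: medium vibration ({count} events) - schedule maintenance")
    else:
        print(f"{name}: normal ({count} events)")
```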

Another example to help make it real:

Here’s another short video, which brings to mind applications like autonomous vehicles and security. It’s not meant to be pretty – it’s meant to show that the sensor detects and transmits just the pixels that correspond to change:

Courtesy Prophesee Metavision

Event-based sensing – it really is a different paradigm

Even (especially?) if you are seasoned at line-scan or area-scan imaging, understanding event-based sensing requires a paradigm shift. Inspired by human vision and built on the foundation of neuromorphic engineering, it’s a new technology – and it opens up new kinds of applications, as well as alternative ways to address existing ones.

Download whitepaper
Event-based sensing as alternative to frame-based approach

Download the whitepaper and learn more about it! Or fill out our form below – we’ll follow up. Or just call us at 978-474-0044.



Prophesee event-based vision – a new paradigm

We don’t use terms like paradigm shift lightly. Event-based vision (EBV) is shaking things up just as much as the arrival of frame-based and line-scan imaging once did. It’s that different – and it’s why we are excited to offer Prophesee sensors and cameras.

Event-based vision applications areas – Courtesy Prophesee

Applications examples… and just enough about concepts

This informational blog skews towards applications examples, with just enough about concepts and EBV technology to lend balance. Event-based vision is so new, and so different from previous vision technologies, that we believe our readers may appreciate an examples-driven approach to understanding this radically new branch of machine vision.

Unless you’re an old hand at event-based vision (EBV) …

…and as of this writing in Summer 2025, few could be – let’s show a couple of short teaser videos before we explain the concepts or go deeper on applications.

Example 1: High-speed multiple-object counting and sizing

The following shows event-based imaging used both to count particles or objects and to estimate their size – all in a high-speed environment with multiple concurrent targets.

Courtesy Prophesee
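One simple way to count and size concurrent targets from an event stream – a sketch, not necessarily how Prophesee’s analytics do it – is to accumulate a short time-slice of events into a binary image and label its connected components. The fragment below assumes numpy and scipy are available; the synthetic event coordinates and the `min_events` noise filter are illustrative.

```python
import numpy as np
from scipy import ndimage

def count_and_size(events_x, events_y, shape=(480, 640), min_events=5):
    """Accumulate one short time-slice of events into a binary image, then
    label connected components: one component ~ one particle. The component's
    pixel area serves as a rough size estimate."""
    img = np.zeros(shape, dtype=np.uint8)
    img[events_y, events_x] = 1
    labels, n = ndimage.label(img)
    sizes = ndimage.sum(img, labels, index=range(1, n + 1))
    keep = sizes >= min_events          # suppress isolated noise events
    return int(keep.sum()), sizes[keep]

# Synthetic slice: two "particles" of different sizes (10x10 and 4x4 pixels).
ex = np.concatenate([np.arange(100, 110).repeat(10), np.arange(400, 404).repeat(4)])
ey = np.concatenate([np.tile(np.arange(50, 60), 10), np.tile(np.arange(300, 304), 4)])
count, sizes = count_and_size(ex, ey)
print(f"{count} particles, areas (pixels): {sizes}")
```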

Example 2: Eye-tracking – only need to track pupil changes

Now consider eye-tracking: in the video below, we see synchronized side-by-side views of the same scene. The left side was obtained with a frame-based sensor. The right side used a Prophesee event-based sensor. Why “waste” bandwidth and processing resources separating a pupil from an iris, eyelids, and eyebrows, when the goal is eye-tracking?

Courtesy Prophesee

Radically increased speed; massively less data

Prophesee Metavision’s sensor tracks “just” what changes in the field of view, instead of the frame-based approach of reading out whole sensor arrays, transmitting voluminous data from the camera to the host, and algorithmically searching for edges, blobs, or features.

Contact us

Example 3: Object tracking

Not unlike the eye-tracking example above, but this time with multiple moving targets – vehicular traffic. With frame-based vision, bandwidth and processing are repeatedly spent on static pavement, light poles, guard rails, and other “clutter” that can comprise 80% of the field of view. With event-based vision, the sensor detects and transmits only the moving vehicles.

Whether counting traffic for congestion planning and reporting, or avoiding collisions aboard a given vehicle, we don’t care what the vehicle looks like – only that it’s there, moving along a certain trajectory at a detected speed, as sketched below.
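Here is a hypothetical fragment illustrating the trajectory-and-speed idea: track a vehicle as the centroid of its event cluster across two time slices, then convert pixel displacement to speed and heading. The pixel-to-meter calibration, slice interval, and event coordinates are all assumed values for illustration.

```python
import numpy as np

PIXELS_PER_METER = 12.0   # assumed calibration of the roadway view
SLICE_DT = 0.100          # assumed seconds between the two event slices

def centroid(xs, ys):
    """Centre of one vehicle's event cluster."""
    return np.mean(xs), np.mean(ys)

# Event coordinates for the same vehicle in two successive slices (synthetic).
x1, y1 = centroid(np.array([100, 102, 104]), np.array([200, 201, 199]))
x2, y2 = centroid(np.array([130, 132, 134]), np.array([200, 202, 198]))

dx, dy = x2 - x1, y2 - y1
speed_mps = np.hypot(dx, dy) / PIXELS_PER_METER / SLICE_DT
heading_deg = np.degrees(np.arctan2(dy, dx))
print(f"speed ~ {speed_mps:.1f} m/s, heading ~ {heading_deg:.0f} degrees")
```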

Example 4: Surveillance and security

Prophesee named this video “Optical Flow Crowd”, which is best understood in the context of security and surveillance, we suggest. It’s not unlike the vehicular-flow example above, except that cars and trucks mostly stay in-lane, whereas pedestrians move at diverse angles – and their arm, leg, head, and torso movements also convey information.

The (computed) vector overlays indicate speed and directional changes, important for revealing potentially dangerous actions, whether already under way or about to emerge. For example, does a raised arm indicate a handgun or knife being readied, or is it just a prelude to scratching an itchy nose? Is a pursuer turning towards another pedestrian, such that a nearby police officer should be alerted?

Example 5: Vibration monitoring and preventative maintenance

Motorized equipment is typically serviced either on a preventative-maintenance schedule or with a break-fix approach, depending on costs, legal liabilities, risks, etc. What if one could inexpensively identify vibration patterns that reveal wearing belts or bearings before a breakdown – and before there is preventable collateral damage to other components?

Courtesy Prophesee
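One plausible way to recover vibration frequencies from an event stream – a simplification of a real analysis pipeline, with invented bin sizes and a synthetic 120 Hz source – is to bin a pixel’s (or region’s) event timestamps into a rate signal and take the FFT peak:

```python
import numpy as np

BIN_US = 100            # 100 us bins -> can resolve vibration up to 5 kHz (Nyquist)

def dominant_frequency(timestamps_us):
    """Estimate the dominant vibration frequency for one pixel (or region)
    from its event timestamps, by binning them into a rate signal and
    taking the FFT peak."""
    ts = np.asarray(timestamps_us)
    n_bins = int((ts.max() - ts.min()) // BIN_US) + 1
    signal, _ = np.histogram(ts, bins=n_bins)
    signal = signal - signal.mean()            # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(n_bins, d=BIN_US * 1e-6)
    return freqs[np.argmax(spectrum)]

# Synthetic events from a component vibrating at ~120 Hz over one second.
t = np.arange(0, 1_000_000, BIN_US)            # microseconds
rate = 1 + np.sin(2 * np.pi * 120 * t * 1e-6)  # event rate oscillates at 120 Hz
rng = np.random.default_rng(1)
events = t[rng.random(t.size) < rate / 2]      # thin to a random event stream
print(f"dominant frequency ~ {dominant_frequency(events):.0f} Hz")
```

An automated monitor could compare such per-region frequency estimates against a baseline and raise the “medium” or “significant” alerts discussed above.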

Enough with the examples – how can I get event-based sensors?

1stVision is pleased to represent Prophesee with a wide range of sensors, evaluation kits, board-level and housed cameras, and an SDK designed for event-based vision applications.

Built on neuromorphic engineering principles inspired by the brain’s neural networks and human vision, Prophesee products are surprisingly affordable, enabling new fields of event-based vision – and improving on frame-based vision applications that can be done better, faster, or less expensively using an event-based approach.


IDS Imaging XCP-E Event-Based Cameras – and the Sony Prophesee IMX636 Sensor

Recently we introduced the IDS Imaging event-based cameras, the uEye XCP-E models. These cameras are a paradigm shift – they detect changes in pixel brightness and transmit ONLY those changes. This dramatically reduces data load, lowers latency, and improves efficiency.

uEye XCP-E event based cameras – housed and board-level options – Courtesy IDS Imaging

Speed, speed, speed

Temporal resolution is better than 100 μsec! Since 1/(100 μsec) is 10,000 samples per second, rapid changes can be detected that would otherwise require an area-scan camera operating at more than 10,000 fps.

Paradigm shift – from “frames” to “events”

This is one of those aha moments. Too often we become jaded, believing that things only evolve incrementally – Moore’s law, electronics getting faster and less expensive, etc. Yawn. But this really is a game changer worth wrapping one’s head around.

Paradigm shift – an event-based sensor enables a new applications approach – Courtesy Sony / Prophesee

The event-based vision sensor (EVS) was developed by Sony and Prophesee.

Contact us to talk to an expert!

Industrial use cases

With a radically new technology, even machine vision veterans may appreciate seeing example applications already utilizing event-based imaging:

Some typical applications for event-based imaging – Courtesy Sony / Prophesee

Example applications

The following are meant to be suggestive rather than exhaustive – just to whet the appetite.

Courtesy Sony / Prophesee

Let’s enlarge that “alternatives” comparison from the above graphic:

Which level of complexity, performance outcomes, and costs do you prefer? Courtesy Sony / Prophesee

Another class of applications

Vibration monitoring can alert you to the need for preventative maintenance – Courtesy Sony / Prophesee

Take the guesswork out of when to do preventative maintenance. Maximize uptime. Reduce the risks of catastrophic failure. These cameras are affordable enough to let them do the vibration monitoring – just set your alert threshold!

Let the sensor monitor vibration frequencies – Courtesy Sony / Prophesee

Software for event-based imaging applications

New paradigm best served by Metavision Technology Software – Courtesy Sony / Prophesee
Metavision SDK modules built for event-based imaging applications – Courtesy Sony / Prophesee

Get it? Got it? Need it? Want it?

This is a new way of coming at machine vision applications. It may give you an edge over your competition by enabling you to improve quality, reduce costs, and/or innovate new products or services. It really is different. See the IDS Imaging uEye XCP-E event-based cameras and datasheets. Powered by the Sony IMX636 developed in conjunction with Prophesee.

Tell us more about your application, using the link below. Or just call us at 978-571-5552. We can help you determine whether event-based imaging is optimal for your application.


IDS Imaging uEye XCP-E event-based cameras

The uEye XCP-E event-based camera utilizes the Sony IMX636 sensor, developed with Prophesee. So, by design, it captures only relevant image changes. Event-based imaging can be a game changer for certain applications. Unlike area-scan or line-scan imaging – which capture every pixel and render a “full image” – event imaging only senses and delivers changes.

IDS uEye XCP-E housed camera (leftmost) and forthcoming XLS-E board-level models – Courtesy IDS Imaging

Event-based imaging captures the changes:

Left: uEye XCP-E image vs. Right: Area scan image – Courtesy IDS Imaging

Less is more

Playing on the “less is more” adage reveals key insights into event-based imaging.

The human eye is adept at delivering an entire scene, of course, and that’s how most of us imagine we see the world around us. But our overall visual perception also builds upon the eye’s ability to sense brightness changes within small segments of the overall scene.

Consider a baseball batter awaiting a pitched ball. The overall scene is relatively static: the outfield fence, bases, and foul lines aren’t moving, and the infielders are almost static – relative to the motion of the ball. But a good batter can identify the pitched ball approaching at 80 to 90 miles per hour, to gauge “strike or ball” and “swing or take”.

The batter’s visual processing does NOT have time to capture the full scene at each instant of “ball release”, “just released”, “mid-way”, and “arriving soon”. Rather, the ball’s trajectory is discerned as successive changes against a static background. So too with an event-based camera.

Less data -> More speed: In other circumstances, less data might seem like a handicap. For area scan applications it often would be. Finding defects on a static surface requires ingesting a lot of detail – all the pixels – in order to do edge detection, blob analysis, or other algorithmic processing. But by detecting “just the brightness changes”, transmitting less data is exactly what delivers the increased speed!
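A back-of-envelope comparison makes the point. Every number below – resolution, frame rate, activity fraction, bytes per event – is an illustrative assumption, not a measured figure for any particular camera:

```python
# Back-of-envelope data rates; every number here is an illustrative assumption.
width, height, fps, bytes_per_pixel = 1280, 720, 1000, 1  # 8-bit mono at 1000 fps
frame_based = width * height * fps * bytes_per_pixel       # bytes per second

active_fraction = 0.01   # assume ~1% of pixels change in any given millisecond
bytes_per_event = 8      # x, y, timestamp, polarity (packing varies by format)
event_based = width * height * active_fraction * 1000 * bytes_per_event

print(f"frame-based : {frame_based / 1e6:,.0f} MB/s")   # ~922 MB/s
print(f"event-based : {event_based / 1e6:,.0f} MB/s")   # ~74 MB/s
```

Even with a timestamp attached to every single event, the sparse stream is an order of magnitude smaller when the scene is mostly static – and the gap widens as the static fraction grows.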

Applications example: motion detection and analysis

Airport security application – Courtesy IDS Imaging

What is delivered are pixel coordinates, timestamps, and change polarity – NOT full frames of brightness values. So you get usable results directly, rather than having to compute them algorithmically from a traditional area-scan image. Track moving objects easily.
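To illustrate what “coordinates and timestamps” means in practice, here is a toy model of a decoded event plus a consumer that flags sustained change. The on-wire encodings (e.g. Prophesee’s EVT formats) pack events far more tightly; the field names, the `report_motion` helper, and the burst threshold are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Event:
    """One decoded event: where and when brightness changed, and in which
    direction. Wire formats pack this more tightly than shown here."""
    x: int          # pixel column
    y: int          # pixel row
    t: int          # timestamp in microseconds
    polarity: int   # 1 = brighter, 0 = darker

def report_motion(events, min_burst=3):
    """Toy consumer: flag any pixel that fires several events in a row,
    i.e. coordinates with sustained change worth tracking."""
    seen = {}
    for ev in events:
        seen[(ev.x, ev.y)] = seen.get((ev.x, ev.y), 0) + 1
        if seen[(ev.x, ev.y)] == min_burst:
            print(f"sustained change at ({ev.x}, {ev.y}) around t={ev.t} us")

stream = [Event(320, 240, t, 1) for t in (100, 250, 400)]  # synthetic burst
report_motion(stream)
```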

How much?

Already intrigued? The housed model, UE-39B0XCP-E, is available now, as this blog releases in early March 2025. Board-level models to be released soon.

Temporal resolution better than 100 μsec

Detect rapid changes – a conventional camera would need > 10,000 fps to capture this – Courtesy IDS Imaging
Courtesy IDS Imaging

Efficient data processing

Courtesy IDS Imaging

IDS Imaging uEye XCP-E event-based cameras can be directly integrated with the sensor manufacturers’ software tools, called Metavision – thanks to Sony’s partnership with Prophesee. Since event-based imaging is a paradigm shift away from conventional machine vision approaches, the visualization tools, API, and training videos help you get up to speed quickly.
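As a taste of what that looks like in code, here is a minimal sketch modeled on Prophesee’s published Metavision Python examples – verify module, class, and parameter names against the current SDK documentation for your version:

```python
# Minimal sketch, modeled on Prophesee's published Metavision Python examples;
# verify names and behavior against the current SDK documentation.
from metavision_core.event_io import EventsIterator

# An empty input path opens a live camera in the SDK examples;
# the path of a recorded .raw file works as well.
mv_iterator = EventsIterator(input_path="", delta_t=10000)  # 10 ms event slices

for evs in mv_iterator:
    # Each slice is a numpy structured array with fields 'x', 'y', 'p', 't'.
    if evs.size:
        print(f"{evs.size} events, t = {evs['t'][0]}..{evs['t'][-1]} us")
```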

1st Vision’s sales engineers have over 100 years of combined experience to assist in your camera and components selection. With a large portfolio of cameras, lenses, cables, NIC cards, and industrial computers, we can provide a full vision solution!

About you: We want to hear from you! We’ve built our brand on our know-how and like to educate the marketplace on imaging technology topics. What would you like to hear about? Drop a line to info@1stvision.com with the topics you’d like to know more about.