Drone detection with event-based cameras from Prophesee

Event-based cameras outperform frame-based approaches for many applications. We provided insight into the event-based paradigm in a recent blog, or you can download our whitepaper on event-based sensing.

In this piece, we focus on drone detection, a task at which event-based imaging excels. For full impact, please view the following in full-screen mode using the “four corners” button.

Find the drone – Event-based approach beats frame-based method – Courtesy scientific paper attribution

As discussed in the event-based paradigm introductions linked above, frame-based approaches would struggle to track a drone moving in a visually complex environment (above left), having to parse drone shapes and orientations, occlusions, etc., even when most of the imagery is static.

Meanwhile, as seen in the event-based video (above right), the new paradigm looks only for “what’s changed”, which amounts to showing “what’s moving”. For drone detection, as well as other perimeter-intrusion applications, vibration monitoring, etc., that’s ideal.
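To make “what’s moving” concrete: an event sensor outputs a sparse stream of per-pixel change records rather than full frames. Here’s a minimal sketch in Python/NumPy – the record layout is an illustrative assumption for this sketch, not any particular camera’s wire format:

```python
import numpy as np

# Illustrative event-record layout -- an assumption for this sketch, not any
# particular camera's wire format. Each event says which pixel changed, in
# which direction, and exactly when; nothing is sent for static pixels.
event_dtype = np.dtype([("x", np.uint16), ("y", np.uint16),
                        ("p", np.int8),    # polarity: +1 brighter, -1 darker
                        ("t", np.int64)])  # timestamp in microseconds

events = np.array([(412, 300,  1, 1_000_001),
                   (413, 300,  1, 1_000_004),
                   (412, 301, -1, 1_000_009)], dtype=event_dtype)

on, off = (events["p"] == 1).sum(), (events["p"] == -1).sum()
span_us = events["t"].max() - events["t"].min()
print(f"{len(events)} events ({on} ON, {off} OFF) over {span_us} µs")
```

A hovering drone against a static background shows up as exactly this kind of tight, time-stamped cluster of events, with nothing transmitted for the rest of the scene.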

1stVision represents Prophesee’s event-based sensors and cameras, built on neuromorphic engineering principles inspired by human vision. Call us at 978-474-0044 to learn more or request a quote.

Contact us for a quote

For some applications, one only needs an event-based sensor – problem solved. For other applications, one might combine different imaging approaches. Consider the juxtaposition of three methods shown below:

Visible, polarization, and event-based approaches – Courtesy Prophesee and EOPTIC

The multimodal approach above is utilized in a proprietary system developed by EOPTIC, which integrates visible, polarization, and event-based sensors. Certain applications demand the best of speed, detail, and situational awareness – for automated confidence and accuracy, for example.

Here’s another side-by-side video on drone detection and tracking:

Visible vs. (hybrid) event-based imaging – Courtesy Prophesee and NEUROBUS

The above-left video uses conventional frame-based imaging, where it’s quite hard to see the drone until it rises above the trees. But the event-based approach used by Prophesee’s customer Neurobus, together with their own neuromorphic technologies, identifies the drone even amidst the trees – a level of early warning that could make all the difference.

By the numbers:

Enough with the videos – it looks compelling, but can you quantify Prophesee event-based sensors for me, please?

Quantifying key attributes – Courtesy Prophesee

Ready to evaluate event-based vision in your application?

1stVision offers Prophesee Metavision® evaluation kits designed to help engineers and developers quickly assess event-based sensing for high-speed motion detection, drone tracking, robotics, and other dynamic vision applications. Each kit provides everything needed to get started with Prophesee’s Metavision technology, including hardware, software tools, and technical support from our experienced machine vision team. Request a quote to discuss kit availability, configuration options, and how we can help accelerate your proof-of-concept or system deployment.

Technical note: The GenX320 Starter Kit for Raspberry Pi 5 utilizes Prophesee’s GenX320 sensor, expressly designed for event-based sensing.

Kit or camera? You choose.

The kits described and linked above are ideal for those pursuing embedded designs. If you prefer a full camera – still very compact at less than 5 cm per side – with a USB3 interface, see the IDS uEye event-based cameras. You’ve got options.

1st Vision’s sales engineers have over 100 years of combined experience to assist in your camera and components selection. With a large portfolio of cameras, lenses, cables, NIC cards, and industrial computers, we can provide a full vision solution!

About you: We want to hear from you! We’ve built our brand on our know-how and like to educate the marketplace on imaging technology topics. What would you like to hear about? Drop a line to info@1stvision.com with the topics you’d like to know more about.

Prophesee event-based vision kits and Raspberry Pi

Event-based vision (EBV) is really taking off. We’ve previously provided an overview of the concepts and applications, as well as Prophesee’s products for EBV. So here’s a reminder diagram and a short video for context; then we’ll dig into using Prophesee EBV kits with Raspberry Pi.

Event-based vision is a new paradigm – Courtesy Prophesee

Frame-based vs. Event-based approach to eye tracking – for example:

So I want to try a Prophesee Metavision Evaluation Kit!

Enough theory already – I want to get hands-on with this! Per the Prophesee Metavision Evaluation Kits, how do I get started?

1st Vision is the official partner for Prophesee in the US, so start with a quote and purchase a Raspberry Pi 5 CSI module with the 320×320-pixel GenX320 sensor, or the GenX320 Raspberry Pi 5 Module.

The main difference between the two is the lens mount: the M12 mount allows you to change lenses yourself, while the M6 version gives you a smaller camera front and a wider field of view. You can see the variations here: https://www.1stvision.com/cameras/GenX320-Starter-Kit-for-Rasberry-Pi-5

You’ll need to purchase the Raspberry Pi elsewhere, as 1st Vision does not sell them.

Prophesee recommends the 8 GB version at a minimum, with 16 GB recommended for on-board vision computation.

Prophesee also recommends getting the Active Cooler, 27 W power supply, and NVMe adapter:

  1. Active Cooler (SC1148) – Buy a Raspberry Pi Active Cooler – Raspberry Pi
  2. 27 W Power Supply – Buy a Raspberry Pi 27W USB-C Power Supply – Raspberry Pi
  3. SSD Kit – Buy a Raspberry Pi SSD Kit – Raspberry Pi

Software options

A further note: the Metavision 5 SDK will not run on the Raspberry Pi 5, as its CPU power is insufficient for that computational load. You’ll need to use the Metavision 4 OpenEB SDK instead. So, to be clear, the SDK choices are:

SDK for Raspberry Pi: Metavision 4 OpenEB (no cost)
SDK for PC: Metavision 5 SDK (bundled offer or standalone purchase)

Metavision SDK options by processor preference
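To give a feel for getting started, here’s a minimal sketch of streaming events with the open-source OpenEB Python API; the module and class names follow Prophesee’s OpenEB documentation, but verify them against your installed Metavision 4.x version:

```python
# Minimal sketch: streaming events with Prophesee's open-source OpenEB
# Python API (Metavision 4.x). An empty input path opens the first live
# camera found; a file path replays a recording instead.
from metavision_core.event_io import EventsIterator

mv_iterator = EventsIterator(input_path="", delta_t=10_000)  # 10 ms slices

for evs in mv_iterator:
    if evs.size == 0:
        continue
    # evs is a structured NumPy array with fields x, y, p (polarity), t (µs)
    print(f"{evs.size} events, t = {evs['t'][0]} .. {evs['t'][-1]} µs")
```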

If you would like to talk it through, just call us at 978-474-0044. Or use the link below to request we get back to you by either e-mail or phone.


Whitepaper: Event-based sensing paradigm

Except for sometimes-compelling line-scan imaging, machine vision has been dominated by frame-based approaches. (Compare Area-scan vs. Line-scan.) With an area-scan camera, the entire two-dimensional sensor array of x pixels by y pixels is read out and transmitted over the digital interface to the host PC. Whether USB3, GigE, CoaXPress, Camera Link, or any other interface, that’s a lot of image data to transport.

Download whitepaper
Event-based sensing as alternative to frame-based approach

If your application is about motion, why transmit the static pixels?

The question above is intentionally provocative, of course. One might ask, “do I have a choice?” With conventional sensors, one really doesn’t, as their pixels just convert light to electrons according to the physics of CMOS, and readout circuits move the array of charges on down the interface to the host PC, for algorithmic interpretation. There’s nothing wrong with that! Thousands of effective machine vision applications use precisely that frame-based paradigm. Or the line-scan approach, arguably a close cousin of the area-scan model.

Consider the four-frame sequence to the left, for a candidate golf-swing analysis application. Per the legend’s post-processing markup, the blue-tinged golfer, club, and ball are undersampled, in the sense that there are unshown phases of the swing.

Meanwhile the non-moving tree, grass, and sky are needlessly re-sampled in each frame.

It takes an expensive high-frame-rate sensor and interface to significantly increase the sample rate – plus storage capacity for each frame, and/or processing capacity, for automated applications, to separate the motion segments from the static segments.

With event-based sensing, introduced below, one can achieve the equivalent of 10,000 fps – by transmitting just the pixels whose values change.

Images courtesy Prophesee Metavision.
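A back-of-envelope comparison makes the point. All numbers below are illustrative assumptions – a hypothetical 1280×720 monochrome sensor at a 1000 fps-equivalent rate, with roughly 1% of pixels changing – not measured figures for any particular sensor:

```python
# Back-of-envelope data-rate comparison. All numbers are illustrative
# assumptions, not measured figures for any specific sensor.
width, height, fps = 1280, 720, 1000        # hypothetical high-speed capture
bytes_per_pixel = 1                         # 8-bit monochrome
frame_bytes_per_s = width * height * bytes_per_pixel * fps

active_fraction = 0.01                      # assume ~1% of pixels change
bytes_per_event = 8                         # x, y, polarity, timestamp packed
event_bytes_per_s = width * height * active_fraction * fps * bytes_per_event

print(f"frame-based: {frame_bytes_per_s / 1e9:.2f} GB/s")
print(f"event-based: {event_bytes_per_s / 1e9:.3f} GB/s "
      f"(~{frame_bytes_per_s / event_bytes_per_s:.0f}x less)")
```

Even with these conservative assumptions, the event stream is an order of magnitude smaller – and in mostly static scenes the gap widens further.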

Event-based sensing only transmits the pixels that changed

Unlike photography for social media or commercial advertising, where real-looking images are usually the goal, for machine vision it’s all about effective (automated) applications. In motion-oriented applications, we’re just trying to automatically control the robot arm, drive the car, monitor the secure perimeter, track the intruder(s), monitor the vibration, …

We’re NOT worried about color rendering, pretty images, or the static portions of the field of view (FOV). With event-based sensing, “high temporal resolution” imaging is possible, since one need only pay attention to the pixels whose values change.
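To build intuition, here’s a toy software mimic of the idea: compare two frames and emit “events” only where the log intensity changed beyond a threshold. Real EVS pixels make this comparison in analog circuitry, per pixel and asynchronously – this sketch only illustrates the principle:

```python
import numpy as np

def frames_to_events(prev, curr, threshold=0.15):
    """Toy software mimic of an event pixel: emit (x, y, polarity) wherever
    log intensity changed by more than `threshold`. Real EVS pixels make this
    comparison in analog circuitry, per pixel and asynchronously."""
    d = np.log1p(curr.astype(np.float32)) - np.log1p(prev.astype(np.float32))
    ys, xs = np.nonzero(np.abs(d) > threshold)
    return np.stack([xs, ys, np.sign(d[ys, xs])], axis=1).astype(np.int32)

rng = np.random.default_rng(0)
prev = rng.integers(0, 255, (480, 640), dtype=np.uint8)
curr = prev.copy()
curr[200:210, 300:320] = 255                 # a small, bright "mover"
events = frames_to_events(prev, curr)
print(f"{len(events)} events from a {prev.size}-pixel frame")  # sparse output
```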

Consider the short video below. The left side shows a succession of frame-based images of a machine driven by an electric motor and belt. But that left-hand image sequence is not a helpful basis for monitoring vibration with an eye to scheduling (or skipping) maintenance, or anticipating breakdowns.

The right-hand sequence was obtained with an event-based vision sensor (EVS), and clearly reveals components with both “medium” and “significant” vibration. Here those thresholds have triggered color-mapped pseudo-images to aid comprehension. But an automated application could map the coordinates to take action, such as gracefully shutting down the machine or scheduling maintenance according to calculated risk.

Courtesy Prophesee Metavision
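For intuition about how an automated application might do this, here’s one simple way to turn an event stream into a per-pixel vibration map, sketched in Python with synthetic data. The thresholds and time window are assumptions for illustration – this is not Prophesee’s algorithm:

```python
import numpy as np

# Synthetic event slice: a 20x20-pixel patch "vibrating" and firing heavily.
rng = np.random.default_rng(1)
n = 50_000
ev = np.zeros(n, dtype=[("x", np.int64), ("y", np.int64)])
ev["x"] = rng.integers(300, 320, n)
ev["y"] = rng.integers(100, 120, n)

counts = np.zeros((480, 640), dtype=np.int64)
np.add.at(counts, (ev["y"], ev["x"]), 1)         # events per pixel

window_us = 100_000                              # assume 100 ms of data
rate_hz = counts * (1e6 / window_us)             # events per pixel per second
medium = (rate_hz > 100) & (rate_hz <= 10_000)   # assumed thresholds
significant = rate_hz > 10_000
print(f"{int(medium.sum())} medium-vibration px, "
      f"{int(significant.sum())} significant px")
```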

Another example to help make it real:

Here’s another short video, which brings to mind applications like autonomous vehicles and security. It’s not meant to be pretty – it’s meant to show that the sensor detects and transmits just the pixels that correlate with change:

Courtesy Prophesee Metavision

Event-based sensing – it really is a different paradigm

Even (especially?) if you are seasoned at line-scan or area-scan imaging, it’s a paradigm shift to understand event-based sensing. Inspired by human vision, and built on the foundation of neuromorphic engineering, it’s a new technology – and it opens up new kinds of applications. Or alternative ways to address existing ones.

Download whitepaper
Event-based sensing as alternative to frame-based approach

Download the whitepaper and learn more about it! Or fill out our form below – we’ll follow up. Or just call us at 978-474-0044.


#EVS

#event-based

#neuromorphic

Prophesee event-based vision – a new paradigm

We don’t use terms like paradigm shift lightly. Event-based vision (EBV) is shaking things up just as much as the arrivals of frame-based and line-scan imaging once did. It’s that different – and that’s why we are excited to offer Prophesee sensors and cameras.

Event-based vision applications areas – Courtesy Prophesee

Applications examples… and just enough about concepts

This informational blog skews towards application examples, with just enough about concepts and EBV technology to lend balance. Event-based vision is so new, and so different from previous vision technologies, that we believe our readers may appreciate an examples-driven approach to understanding this radically new branch of machine vision.

Unless you’re an old hand at event-based vision (EBV) …

…and as of this writing in Summer 2025, few could be. So let’s show a couple of short teaser videos before we explain the concepts or go deeper on applications.

Example 1: High-speed multiple-object counting and sizing

The following shows event-based imaging used both to count particles or objects and to estimate their size – all in a high-speed, multiple-concurrent-target environment.

Courtesy Prophesee

Example 2: Eye-tracking – only need to track pupil changes

Now consider eye-tracking: in the video below, we see synchronized side-by-side views of the same scene. The left side was obtained with a frame-based sensor. The right side used a Prophesee event-based sensor. Why “waste” bandwidth and processing resources separating a pupil from an iris, eyelids, and eyebrows, when the goal is eye-tracking?

Courtesy Prophesee
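As a toy illustration of why this pays off: with a mostly static face, events cluster at the moving pupil/iris boundary, so even a per-slice event centroid roughly follows gaze. A sketch of the principle only – a real eye tracker adds filtering, outlier rejection, and model fitting:

```python
import numpy as np

def gaze_estimate(evs):
    """Toy gaze estimate: the median event position in a time slice. With a
    mostly static face, events cluster at the moving pupil/iris boundary, so
    this roughly follows the pupil. Illustration of the principle only."""
    return float(np.median(evs["x"])), float(np.median(evs["y"]))

# Synthetic slice: 500 events clustered around a "pupil" at (320, 240)
rng = np.random.default_rng(2)
ev = np.zeros(500, dtype=[("x", np.float32), ("y", np.float32)])
ev["x"] = rng.normal(320, 5, 500)
ev["y"] = rng.normal(240, 5, 500)
print(gaze_estimate(ev))   # ~ (320.0, 240.0)
```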

Radically increased speed; massively less data

Prophesee Metavision’s sensor tracks “just” what changes in the field of view, instead of the frame-based approach of reading out whole sensor arrays, transmitting voluminous data from the camera to the host, and algorithmically looking for edges, blobs, or features.

Contact us

Example 3: Object tracking

Not unlike the eye-tracking example above, but this time with multiple moving targets – vehicular traffic. With frame-based vision, bandwidth and processing have to repeatedly handle the static pavement, light poles, guard rails, and other “clutter” comprising 80% of the field of view. But with event-based vision, the sensor only detects and transmits the moving vehicles.

Whether counting traffic for congestion planning/reporting, or avoiding collisions aboard a given vehicle, we don’t care what the vehicle looks like – only that it’s there, moving along a certain trajectory and at a detected speed.
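For intuition, here’s one way a short slice of events might be clustered into moving-object detections – a toy connected-components approach with assumed parameters (using NumPy and SciPy), not a production tracker:

```python
import numpy as np
from scipy import ndimage

def track_movers(events, shape, min_events=50):
    """Cluster one time slice of events (structured array with 'x' and 'y'
    fields) into blobs; each blob's centroid is a moving-object detection.
    A toy sketch -- a real tracker adds noise filtering, data association,
    and trajectory/speed estimation."""
    img = np.zeros(shape, dtype=np.int32)
    np.add.at(img, (events["y"], events["x"]), 1)
    labels, n = ndimage.label(img > 0)           # connected event clusters
    centroids = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        if img[ys, xs].sum() >= min_events:      # ignore isolated noise
            centroids.append((float(xs.mean()), float(ys.mean())))
    return centroids

# Synthetic slice: one dense event blob (a "vehicle") plus scattered noise
rng = np.random.default_rng(3)
ev = np.zeros(350, dtype=[("x", np.int64), ("y", np.int64)])
ev["x"][:300] = np.repeat(np.arange(100, 120), 15)   # 20x15 solid blob
ev["y"][:300] = np.tile(np.arange(200, 215), 20)
ev["x"][300:] = rng.integers(0, 640, 50)             # sparse noise
ev["y"][300:] = rng.integers(0, 480, 50)
print(track_movers(ev, (480, 640)))   # -> roughly [(109.5, 207.0)]
```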

Example 4: Surveillance and security

Prophesee named this video “Optical Flow Crowd”, which is best understood in the context of security and surveillance, we suggest. It’s not unlike the vehicular-flow example above, except that cars and trucks mostly stay in lane, whereas pedestrians move at diverse angles – and their arm, leg, head, and torso movements also convey information.

The (computed) vector overlays indicate speed and directional changes, important for revealing potentially dangerous actions taken or likely to emerge. For example, does a raised arm indicate a handgun or knife being readied, or is it just a prelude to scratching an itchy nose? Is there a pursuer turning towards another pedestrian – something a nearby police officer should be alerted to?

Example 5: Vibration monitoring and preventative maintenance

Motorized equipment is typically serviced either on a preventative-maintenance schedule or with a break-fix approach, depending on costs, legal liabilities and risks, etc. What if one could inexpensively identify vibration patterns that reveal wearing belts or bearings before a breakdown – and before there is preventable collateral damage to other components?

Courtesy Prophesee

Enough with the examples – how can I get event-based sensors?

1stVision is pleased to represent Prophesee with a wide range of sensors, evaluation kits, board-level and housed cameras, and an SDK designed for event-based vision applications.

Built on neuromorphic engineering principles inspired by the brain’s neural networks and human vision, Prophesee products are surprisingly affordable, enabling new fields of event-based vision – or improving on frame-based vision applications that may be done better, faster, or less expensively using an event-based approach.
