{"id":4748,"date":"2025-07-30T12:15:39","date_gmt":"2025-07-30T16:15:39","guid":{"rendered":"https:\/\/www.1stvision.com\/machine-vision-solutions\/?p=4748"},"modified":"2025-09-04T15:42:11","modified_gmt":"2025-09-04T19:42:11","slug":"whitepaper-event-based-sensing-paradigm","status":"publish","type":"post","link":"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html","title":{"rendered":"Whitepaper:  Event-based sensing paradigm"},"content":{"rendered":"\n<p>Except for sometimes compelling line-scan imaging, machine vision has been dominated by frame-based approaches. (Compare <a href=\"https:\/\/www.1stvision.com\/machine-vision-solutions\/2022\/02\/area-scan-camera-ideal-for-general-purpose-machine-vision-imaging-while-line-scan-best-for-continuous-materials-high-speed-inspection.html?siq_name=$[FNAME]$$[LNAME]$&amp;siq_email=$[EMAIL]$\" target=\"_blank\" rel=\"noreferrer noopener\">Area-scan vs. Line-scan<\/a>). With an area-scan camera, the entire two-dimensional sensor array of x pixels by y pixels is read out and transmitted over the digital <a href=\"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/guidelines-selecting-machine-vision-camera-interface.html?siq_name=$[FNAME]$$[LNAME]$&amp;siq_email=$[EMAIL]$\" target=\"_blank\" rel=\"noreferrer noopener\">interface<\/a> to the PC host. 
Whether USB3, GigE, CoaXPress, CameraLink, or any other interface, that&#8217;s a lot of image data to transport.<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full\"><a href=\"https:\/\/www.1stvision.com\/knowledgebase2\/1018\/?siq_name=$[FNAME]$$[LNAME]$&amp;siq_email=$[EMAIL]$\" target=\"_blank\" rel=\"noreferrer noopener\"><img loading=\"lazy\" decoding=\"async\" width=\"390\" height=\"90\" src=\"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-content\/uploads\/2021\/04\/RED-Download-Whitepaper.jpg\" alt=\"Download whitepaper\" class=\"wp-image-1846\" srcset=\"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-content\/uploads\/2021\/04\/RED-Download-Whitepaper.jpg 390w, https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-content\/uploads\/2021\/04\/RED-Download-Whitepaper-300x69.jpg 300w\" sizes=\"auto, (max-width: 390px) 100vw, 390px\" \/><\/a><figcaption class=\"wp-element-caption\">Event-based sensing as alternative to frame-based approach<\/figcaption><\/figure>\n<\/div>\n\n\n<h2 class=\"wp-block-heading\">If your application is about motion, why transmit the static pixels?<\/h2>\n\n\n\n<p>The question above is intentionally provocative, of course.  One might ask, &#8220;Do I have a choice?&#8221;  With conventional sensors, one really doesn&#8217;t: their pixels just convert light to electrons according to the physics of CMOS, and readout circuits move the array of charges down the interface to the host PC for algorithmic interpretation.  There&#8217;s nothing wrong with that!  Thousands of effective machine vision applications use precisely that frame-based paradigm.  
Or the line-scan approach, arguably a close cousin of the area-scan model.<\/p>\n\n\n\n<div class=\"wp-block-media-text is-stacked-on-mobile\"><figure class=\"wp-block-media-text__media\"><img loading=\"lazy\" decoding=\"async\" width=\"240\" height=\"966\" src=\"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-content\/uploads\/2025\/07\/Over-vs-undersampled-area-scan.jpg\" alt=\"\" class=\"wp-image-4750 size-full\"\/><\/figure><div class=\"wp-block-media-text__content\">\n<p>Consider the four-frame sequence at left, from a candidate golf-swing analysis application.  Per the legend, the post-processing markup tints the golfer, club, and ball blue; these elements are undersampled, in the sense that phases of the swing fall between frames and go uncaptured. <\/p>\n\n\n\n<p>Meanwhile, the non-moving tree, grass, and sky are needlessly re-sampled in each frame.<\/p>\n\n\n\n<p>Significantly increasing the sample rate takes an expensive high-frame-rate sensor and interface, plus storage capacity for each frame, and &#8211; for automated applications &#8211; the processing capacity to separate the motion segments from the static ones.<\/p>\n\n\n\n<p>With event-based sensing, introduced below, one can achieve the equivalent of 10k fps &#8211; by transmitting just the pixels whose values change.<\/p>\n\n\n\n<p>Images courtesy Prophesee Metavision.<\/p>\n<\/div><\/div>\n\n\n\n<h2 class=\"wp-block-heading\">Event-based sensing <strong>only<\/strong> transmits the pixels that changed<\/h2>\n\n\n\n<p>Unlike photography for social media or commercial advertising, where real-looking images are usually the goal, for machine vision it&#8217;s all about effective (automated) applications. 
In motion-oriented applications, we&#8217;re just trying to automatically control the robot arm, drive the car, monitor the secure perimeter, track the intruder(s), monitor the vibration, &#8230;<\/p>\n\n\n\n<p>We&#8217;re NOT worried about color rendering, pretty images, or the static portions in the field of view (FOV).  With event-based sensing, &#8220;high temporal imaging&#8221; is possible, since one need only pay attention to the pixels whose values change.<\/p>\n\n\n\n<p>Consider the short video below. The left side shows a succession of frame-based images of a machine driven by an electric motor and belt. But the left-hand image sequence is not a helpful basis for monitoring vibration with an eye to scheduling (or skipping) maintenance, or anticipating breakdowns.<\/p>\n\n\n\n<p>The right-hand sequence was obtained with an event-based vision sensor (EVS), and clearly reveals components with both &#8220;medium&#8221; and &#8220;significant&#8221; vibration. Here those thresholds trigger color-mapped pseudo-images to aid comprehension. 
But an automated application could map the coordinates to take action, such as gracefully shutting down the machine or scheduling maintenance according to calculated risk.<\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"PROPHESEE Vibration Monitoring Event-Based Vision\" width=\"525\" height=\"295\" src=\"https:\/\/www.youtube.com\/embed\/mOp-GbrTzOM?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><figcaption class=\"wp-element-caption\">Courtesy Prophesee Metavision<\/figcaption><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\">Another example to help make it real:<\/h2>\n\n\n\n<p>Here&#8217;s another short video, which brings to mind applications like autonomous vehicles and security.  
It&#8217;s not meant to be pretty &#8211; it&#8217;s meant to show that the sensor detects and transmits just the pixels that correspond to change:<\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-4-3 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"Metavision\u00ae Intelligence Machine Learning - Inference\" width=\"525\" height=\"394\" src=\"https:\/\/www.youtube.com\/embed\/Aih62lPlD-c?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><figcaption class=\"wp-element-caption\">Courtesy Prophesee Metavision<\/figcaption><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\">Event-based sensing &#8211; it really is a different paradigm<\/h2>\n\n\n\n<p>Even (especially?) if you are seasoned at line-scan or area-scan imaging, it&#8217;s a paradigm shift to understand event-based sensing. Inspired by human vision, and built on the foundation of neuromorphic engineering, it&#8217;s a new technology &#8211; and it opens up new kinds of applications. 
Or alternative ways to address existing ones.<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full\"><a href=\"https:\/\/www.1stvision.com\/knowledgebase2\/1018\/?siq_name=$[FNAME]$$[LNAME]$&amp;siq_email=$[EMAIL]$\" target=\"_blank\" rel=\"noreferrer noopener\"><img loading=\"lazy\" decoding=\"async\" width=\"390\" height=\"90\" src=\"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-content\/uploads\/2021\/04\/RED-Download-Whitepaper.jpg\" alt=\"Download whitepaper\" class=\"wp-image-1846\" srcset=\"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-content\/uploads\/2021\/04\/RED-Download-Whitepaper.jpg 390w, https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-content\/uploads\/2021\/04\/RED-Download-Whitepaper-300x69.jpg 300w\" sizes=\"auto, (max-width: 390px) 100vw, 390px\" \/><\/a><figcaption class=\"wp-element-caption\">Event-based sensing as alternative to frame-based approach<\/figcaption><\/figure>\n<\/div>\n\n\n<p>Download the whitepaper and learn more about it!  Or fill out our form below &#8211; we&#8217;ll follow up.  
Or just call us at 978-474-0044.<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter\"><a href=\"https:\/\/www.1stvision.com\/contactForm?siq_name=$[FNAME]$$[LNAME]$&amp;siq_email=$[EMAIL]$\" target=\"_blank\" rel=\"noreferrer noopener\"><img loading=\"lazy\" decoding=\"async\" width=\"173\" height=\"41\" src=\"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-content\/uploads\/2021\/05\/Contact-Us-1.jpg\" alt=\"\" class=\"wp-image-1866\"\/><\/a><\/figure>\n<\/div>\n\n\n<p><a href=\"https:\/\/www.1stvision.com\/?siq_name=$%5bFNAME%5d$%20$%5bLNAME%5d$&amp;siq_email=$%5bEMAIL%5d$\" target=\"_blank\" rel=\"noreferrer noopener\"><strong>1st Vision\u2019s<\/strong><\/a>&nbsp;sales engineers have over 100 years of combined experience to assist in your camera and components selection.&nbsp; With a large portfolio of&nbsp;<a href=\"https:\/\/www.1stvision.com\/cameras\/industrialCameras?siq_name=$[FNAME]$$[LNAME]$&amp;siq_email=$[EMAIL]$\" target=\"_blank\" rel=\"noreferrer noopener\">cameras<\/a>,&nbsp;<a href=\"https:\/\/www.1stvision.com\/lens\/machine-vision-lenses?siq_name=$%5bFNAME%5d$%20$%5bLNAME%5d$&amp;siq_email=$%5bEMAIL%5d$\" target=\"_blank\" rel=\"noreferrer noopener\">lenses<\/a>,&nbsp;<a href=\"https:\/\/www.1stvision.com\/cameras\/accessories\/Data-cables?siq_name=$%5bFNAME%5d$%20$%5bLNAME%5d$&amp;siq_email=$%5bEMAIL%5d$\" target=\"_blank\" rel=\"noreferrer noopener\">cables<\/a>,&nbsp;<a href=\"https:\/\/www.1stvision.com\/cameras\/accessories\/NIC-cards?siq_name=$%5bFNAME%5d$%20$%5bLNAME%5d$&amp;siq_email=$%5bEMAIL%5d$\" target=\"_blank\" rel=\"noreferrer noopener\">NIC cards<\/a>&nbsp;and&nbsp;<a href=\"https:\/\/www.1stvision.com\/1stvision-industrial-computer-systems\/embedded-industrial-computer?siq_name=$%5bFNAME%5d$%20$%5bLNAME%5d$&amp;siq_email=$%5bEMAIL%5d$\" target=\"_blank\" rel=\"noreferrer noopener\">industrial computers<\/a>, we can provide a&nbsp;<a 
href=\"https:\/\/www.1stvision.com\/components-needed-for-machine-vision-and-industrial-imaging-systems?siq_name=$%5bFNAME%5d$%20$%5bLNAME%5d$&amp;siq_email=$%5bEMAIL%5d$\" target=\"_blank\" rel=\"noreferrer noopener\">full vision solution<\/a>!<\/p>\n\n\n\n<p><strong><u>About you:<\/u><\/strong>\u00a0We want to hear from you!\u00a0 We\u2019ve built our brand on our know-how and like to educate the marketplace on imaging technology topics\u2026\u00a0 What would you like to hear about?\u2026 Drop a line to info@1stvision.com with what topics you\u2019d like to know more about.<\/p>\n\n\n\n<p>#EVS<\/p>\n\n\n\n<p>#event-based<\/p>\n\n\n\n<p>#neuromorphic<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Except for sometimes compelling line-scan imaging, machine vision has been dominated by frame-based approaches. (Compare Area-scan vs. Line-scan). With an area-scan camera, the entire two-dimensional sensor array of x pixels by y pixels is read out and transmitted over the digital interface to the PC host. 
Whether USB3, GigE, CoaXPress, CameraLink, or any other interface, &hellip; <\/p>\n<p class=\"link-more\"><a href=\"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;Whitepaper:  Event-based sensing paradigm&#8221;<\/span><\/a><\/p>\n","protected":false},"author":10,"featured_media":4750,"comment_status":"closed","ping_status":"closed","sticky":true,"template":"","format":"standard","meta":{"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"categories":[8,396],"tags":[],"class_list":["post-4748","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-cameras","category-event-based-vision"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v23.1 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Whitepaper: Event-based sensing paradigm<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Whitepaper: Event-based sensing paradigm\" \/>\n<meta property=\"og:description\" content=\"Except for sometimes compelling line-scan imaging, machine vision has been dominated by frame-based approaches. (Compare Area-scan vs. Line-scan). With an area-scan camera, the entire two-dimensional sensor array of x pixels by y pixels is read out and transmitted over the digital interface to the PC host. 
Whether USB3, GigE, CoaXPress, CameraLink, or any other interface, &hellip; Continue reading &quot;Whitepaper: Event-based sensing paradigm&quot;\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html\" \/>\n<meta property=\"og:site_name\" content=\"1stVision Inc. - Machine Vision Articles\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/pages\/1st-Vision\/944658058935262?fref=ts\" \/>\n<meta property=\"article:published_time\" content=\"2025-07-30T16:15:39+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-09-04T19:42:11+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-content\/uploads\/2025\/07\/Over-vs-undersampled-area-scan.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"240\" \/>\n\t<meta property=\"og:image:height\" content=\"966\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"1stVision\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@1stvision_\" \/>\n<meta name=\"twitter:site\" content=\"@1stvision_\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"1stVision\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"5 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html#article\",\"isPartOf\":{\"@id\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html\"},\"author\":{\"name\":\"1stVision\",\"@id\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/#\/schema\/person\/d4d7cc92f4d51d8337c7c37ca33bcfad\"},\"headline\":\"Whitepaper: Event-based sensing paradigm\",\"datePublished\":\"2025-07-30T16:15:39+00:00\",\"dateModified\":\"2025-09-04T19:42:11+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html\"},\"wordCount\":754,\"publisher\":{\"@id\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/#organization\"},\"image\":{\"@id\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-content\/uploads\/2025\/07\/Over-vs-undersampled-area-scan.jpg\",\"articleSection\":[\"Cameras\",\"Event-based vision\"],\"inLanguage\":\"en-US\"},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html\",\"url\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html\",\"name\":\"Whitepaper: Event-based sensing 
paradigm\",\"isPartOf\":{\"@id\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html#primaryimage\"},\"image\":{\"@id\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-content\/uploads\/2025\/07\/Over-vs-undersampled-area-scan.jpg\",\"datePublished\":\"2025-07-30T16:15:39+00:00\",\"dateModified\":\"2025-09-04T19:42:11+00:00\",\"breadcrumb\":{\"@id\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html#primaryimage\",\"url\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-content\/uploads\/2025\/07\/Over-vs-undersampled-area-scan.jpg\",\"contentUrl\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-content\/uploads\/2025\/07\/Over-vs-undersampled-area-scan.jpg\",\"width\":240,\"height\":966},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Whitepaper: Event-based sensing 
paradigm\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/#website\",\"url\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/\",\"name\":\"1stVision Inc. - Machine Vision Articles\",\"description\":\"Industrial Imaging technical blog\",\"publisher\":{\"@id\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/?s={search_term_string}\"},\"query-input\":\"required name=search_term_string\"}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/#organization\",\"name\":\"1stVision\",\"url\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-content\/uploads\/2021\/09\/1stvsionLogo.png\",\"contentUrl\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-content\/uploads\/2021\/09\/1stvsionLogo.png\",\"width\":205,\"height\":51,\"caption\":\"1stVision\"},\"image\":{\"@id\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/www.facebook.com\/pages\/1st-Vision\/944658058935262?fref=ts\",\"https:\/\/x.com\/1stvision_\",\"https:\/\/www.linkedin.com\/company\/1stvision-inc-\",\"https:\/\/www.youtube.com\/user\/1stVisionInc\u00a0\"]},{\"@type\":\"Person\",\"@id\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/#\/schema\/person\/d4d7cc92f4d51d8337c7c37ca33bcfad\",\"name\":\"1stVision\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avat
ar\/df386023d69fefe87c592dadad973428a888e3cde98443bc1b4ac816ed942d34?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/df386023d69fefe87c592dadad973428a888e3cde98443bc1b4ac816ed942d34?s=96&d=mm&r=g\",\"caption\":\"1stVision\"},\"url\":\"https:\/\/www.1stvision.com\/machine-vision-solutions\/author\/scott-smith\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Whitepaper: Event-based sensing paradigm","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html","og_locale":"en_US","og_type":"article","og_title":"Whitepaper: Event-based sensing paradigm","og_description":"Except for sometimes compelling line-scan imaging, machine vision has been dominated by frame-based approaches. (Compare Area-scan vs. Line-scan). With an area-scan camera, the entire two-dimensional sensor array of x pixels by y pixels is read out and transmitted over the digital interface to the PC host. Whether USB3, GigE, CoaXPress, CameraLink, or any other interface, &hellip; Continue reading \"Whitepaper: Event-based sensing paradigm\"","og_url":"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html","og_site_name":"1stVision Inc. 
- Machine Vision Articles","article_publisher":"https:\/\/www.facebook.com\/pages\/1st-Vision\/944658058935262?fref=ts","article_published_time":"2025-07-30T16:15:39+00:00","article_modified_time":"2025-09-04T19:42:11+00:00","og_image":[{"width":240,"height":966,"url":"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-content\/uploads\/2025\/07\/Over-vs-undersampled-area-scan.jpg","type":"image\/jpeg"}],"author":"1stVision","twitter_card":"summary_large_image","twitter_creator":"@1stvision_","twitter_site":"@1stvision_","twitter_misc":{"Written by":"1stVision","Est. reading time":"5 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html#article","isPartOf":{"@id":"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html"},"author":{"name":"1stVision","@id":"https:\/\/www.1stvision.com\/machine-vision-solutions\/#\/schema\/person\/d4d7cc92f4d51d8337c7c37ca33bcfad"},"headline":"Whitepaper: Event-based sensing paradigm","datePublished":"2025-07-30T16:15:39+00:00","dateModified":"2025-09-04T19:42:11+00:00","mainEntityOfPage":{"@id":"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html"},"wordCount":754,"publisher":{"@id":"https:\/\/www.1stvision.com\/machine-vision-solutions\/#organization"},"image":{"@id":"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html#primaryimage"},"thumbnailUrl":"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-content\/uploads\/2025\/07\/Over-vs-undersampled-area-scan.jpg","articleSection":["Cameras","Event-based 
vision"],"inLanguage":"en-US"},{"@type":"WebPage","@id":"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html","url":"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html","name":"Whitepaper: Event-based sensing paradigm","isPartOf":{"@id":"https:\/\/www.1stvision.com\/machine-vision-solutions\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html#primaryimage"},"image":{"@id":"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html#primaryimage"},"thumbnailUrl":"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-content\/uploads\/2025\/07\/Over-vs-undersampled-area-scan.jpg","datePublished":"2025-07-30T16:15:39+00:00","dateModified":"2025-09-04T19:42:11+00:00","breadcrumb":{"@id":"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html#primaryimage","url":"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-content\/uploads\/2025\/07\/Over-vs-undersampled-area-scan.jpg","contentUrl":"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-content\/uploads\/2025\/07\/Over-vs-undersampled-area-scan.jpg","width":240,"height":966},{"@type":"BreadcrumbList","@id":"https:\/\/www.1stvision.com\/machine-vision-solutions\/2025\/07\/whitepaper-event-based-sensing-paradigm.html#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.1stvision.com\/mac
hine-vision-solutions"},{"@type":"ListItem","position":2,"name":"Whitepaper: Event-based sensing paradigm"}]},{"@type":"WebSite","@id":"https:\/\/www.1stvision.com\/machine-vision-solutions\/#website","url":"https:\/\/www.1stvision.com\/machine-vision-solutions\/","name":"1stVision Inc. - Machine Vision Articles","description":"Industrial Imaging technical blog","publisher":{"@id":"https:\/\/www.1stvision.com\/machine-vision-solutions\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.1stvision.com\/machine-vision-solutions\/?s={search_term_string}"},"query-input":"required name=search_term_string"}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/www.1stvision.com\/machine-vision-solutions\/#organization","name":"1stVision","url":"https:\/\/www.1stvision.com\/machine-vision-solutions\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.1stvision.com\/machine-vision-solutions\/#\/schema\/logo\/image\/","url":"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-content\/uploads\/2021\/09\/1stvsionLogo.png","contentUrl":"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-content\/uploads\/2021\/09\/1stvsionLogo.png","width":205,"height":51,"caption":"1stVision"},"image":{"@id":"https:\/\/www.1stvision.com\/machine-vision-solutions\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/pages\/1st-Vision\/944658058935262?fref=ts","https:\/\/x.com\/1stvision_","https:\/\/www.linkedin.com\/company\/1stvision-inc-","https:\/\/www.youtube.com\/user\/1stVisionInc\u00a0"]},{"@type":"Person","@id":"https:\/\/www.1stvision.com\/machine-vision-solutions\/#\/schema\/person\/d4d7cc92f4d51d8337c7c37ca33bcfad","name":"1stVision","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.1stvision.com\/machine-vision-solutions\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/df386023d69fefe87c592dadad973428a888e3cde
98443bc1b4ac816ed942d34?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/df386023d69fefe87c592dadad973428a888e3cde98443bc1b4ac816ed942d34?s=96&d=mm&r=g","caption":"1stVision"},"url":"https:\/\/www.1stvision.com\/machine-vision-solutions\/author\/scott-smith"}]}},"_links":{"self":[{"href":"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-json\/wp\/v2\/posts\/4748","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-json\/wp\/v2\/users\/10"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-json\/wp\/v2\/comments?post=4748"}],"version-history":[{"count":6,"href":"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-json\/wp\/v2\/posts\/4748\/revisions"}],"predecessor-version":[{"id":4759,"href":"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-json\/wp\/v2\/posts\/4748\/revisions\/4759"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-json\/wp\/v2\/media\/4750"}],"wp:attachment":[{"href":"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-json\/wp\/v2\/media?parent=4748"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-json\/wp\/v2\/categories?post=4748"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1stvision.com\/machine-vision-solutions\/wp-json\/wp\/v2\/tags?post=4748"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}