GOSLOW: DESIGN AND IMPLEMENTATION OF A SCALABLE CAMERA ARRAY FOR HIGH-SPEED IMAGING

Cecill E. Etheredge
Computer Architecture for Embedded Systems (CAES)
Faculty of Electrical Engineering, Mathematics and Computer Science
University of Twente
July 2016

ABSTRACT

This thesis investigates the viability of using multiple low-cost components to create something much bigger: an embedded system capable of capturing high-speed video using a scalable and configurable array of sensors. Various hardware and software components are designed, implemented and verified as part of a new embedded system platform that not only achieves high-speed imaging capabilities by using an array of imaging sensors, but also provides a scalable and reusable design for future camera array systems.

ACKNOWLEDGMENTS

First of all, I would like to thank Prof. Marco Bekooij, whose belief in hands-on research and real-world practicality has enabled this research to go forward and develop into this thesis. Secondly, I would like to thank my family as an endless source of optimism to explore technology and to venture on research such as this. Thanks to all colleagues and folks in the CAES group. I would like to thank Oğuz Meteer, whose countless hours of insights and experience provided the fundamentals without which this thesis would not have been possible, and most of whose borrowed electronics still have to be returned at the time of writing. Thanks go out to Stephen Ecob from Silicon On Inspiration in Australia, who provided the much needed hardware building blocks on more than one occasion and whose continued support was essential during the production of the prototype. Also thanks to Alice for sourcing the much needed BGA components in Hong Kong, and to the folks at ProtoService for providing their field expertise in the final BGA assembly.

CONTENTS

1 Introduction
  1.1 Problem definition
  1.2 Contributions
  1.3 Thesis outline
2 Background
  2.1 Digital image sensors
  2.2 High-speed imaging
  2.3 Image sensor arrays
3 High level system design
4 Hardware domain implementation
  4.1 Sensor receiver interface
    4.1.1 Physical signaling
    4.1.2 Timing characteristics
    4.1.3 FPGA implementation
  4.2 Streaming DRAM controller
    4.2.1 Dynamic Random-Access Memory (DRAM)
    4.2.2 Access pattern predictability
    4.2.3 DRAM protocol
    4.2.4 Peak transfer rate
    4.2.5 Protocol state machine
    4.2.6 FPGA implementation
    4.2.7 Command arbitration
  4.3 Stream interleaver
    4.3.1 FPGA implementation
  4.4 Embedded control processor
  4.5 Sensor control interface
    4.5.1 I2C protocol overview
    4.5.2 Camera control interface (CCI)
    4.5.3 FPGA implementation
    4.5.4 Phased start
  4.6 Readout interface
    4.6.1 FPGA implementation
  4.7 Clock domain crossing
5 Software domain implementation
  5.1 Embedded control
  5.2 Stream decoder
    5.2.1 Deinterleaving
    5.2.2 Bit slip correction
    5.2.3 Protocol layer decoder
  5.3 Image rectification
    5.3.1 Mathematical models
    5.3.2 Camera intrinsics and extrinsics
    5.3.3 Rectification
    5.3.4 Camera calibration
6 Dataflow analysis
  6.1 Throughput analysis
    6.1.1 Scenario-aware dataflow graph
    6.1.2 Throughput analysis
    6.1.3 Effects on array size
  6.2 Buffer size analysis
    6.2.1 Latency-rate SRDF graph
    6.2.2 Real-world case analysis
7 Realization and results
  7.1 Sensor array configuration
  7.2 Choice of hardware
  7.3 Synthesis results
  7.4 Measurement setup
  7.5 Experimental results
8 Conclusion
  8.1 Future work
I Appendix
  A SADF graph listing
  B Stroboscope program listing
Bibliography

LIST OF FIGURES

Figure 1: Block diagram showing a simplified overview of the processes involved in a modern color image sensor, from incoming scene light to final output image.
Figure 2: Example of a Bayer pattern encoded image (left) and the resulting full color image after the debayering process (right).
Figure 3: Readout architectures for conventional CCD and CMOS sensors. Photodiodes and capacitive elements are respectively colored dark green and yellow.
Figure 4: Rolling shutter mechanism in action. Left shows the shutter's direction of movement in terms of pixel rows. Right contains a plot of row exposure over time, clearly showing the sliding window of time.
Figure 5: Simplified view of two active pixels with different shutters in a CMOS sensor at the silicon level. The addition of a memory element causes occlusion of an area of the photodiode.
Figure 6: Talbot's high speed photography experiment. A paper disc is captured at standstill (left), spinning and captured with an exposure time of 1 ms (middle), and with a shorter exposure time (right). Images courtesy of [VM11].
Figure 7: The Stanford Multi-Camera array setup as described in [Wil+05].
Figure 8: Abstract system diagram illustrating the hardware and software domain boundaries, their high-level components and the corresponding dataflows in the system.
Figure 9: Hardware domain diagram showing its various subsystems and components. Orange indicates a component external to the hardware domain, grey is a custom subsystem implemented by FPGA logic, yellow is a logic element and green represents software. Arrows indicate dataflows, with red representing the capture stage and blue representing the readout stage.
Figure 10: Software domain diagram showing its various subsystems and components, including the embedded control. Orange indicates a component external to the software domain, grey is a custom subsystem implemented in software and yellow is a specific software component. Arrows indicate dataflows, with red representing the primary video dataflow.
Figure 11: Standard parallel CMOS sensor interface. Data lines D0 through DN represent the N-bit encoded pixels of the output image.
Figure 12: LVDS output circuitry: a current-mode driver drives a differential pair line, with current flowing in opposite directions. Green and red denote the current flow for respectively high ("one") and low ("zero") output values.
Figure 13: Simplified fragment of a video data transmission using MIPI CSI-2 (top) and HiSPi (bottom). LP indicates logic high and low using single-ended signaling, while all other signals are LVDS. Green represents image data, all other colors represent control words as defined by the respective protocols.
Figure 14: Typical CMOS sensor layout in terms of physical pixels (left) and corresponding structure of the image frame readout data (right), as found in the Aptina MT9M021 CMOS sensor [Apt12b].
Figure 15: Topology within a rank of DRAM devices. A bank consists of multiple DRAMs, each of which supplies an N-bit data word using row and column addressing.
Figure 16: Common layout of a DDR3 DIMM, populated at both sides with multiple DRAMs. The DIMM is typically structured in such a way that each side represents a single DRAM rank.
Figure 17: Simplified illustration of the pipelined DRAM command timing in case of two BL8 write requests. Green and blue indicate any commands, addresses and data bits belonging to the same request.
Figure 18: Original simplified DRAM protocol state diagram as in [JED08]. Power and calibration states have been omitted.
Figure 19: Redesigned DRAM state diagram optimized for sequential burst writes. Burst-related states are highlighted in blue.
Figure 20: I2C data transmission using 7-bit slave addressing with two data words as described in [Phi03]. Grey blocks are transmissions from master to slave, white are from slave to master and green depends on the type of operation. Bit widths are denoted below.
Figure 21: Example CCI data transmission addressing an 8-bit register using a 16-bit address. Grey blocks are transmissions from master to slave, white are from slave to master and green depends on the type of operation. Bit widths are denoted below.
Figure 22: Phased start with 8 sensors plotted against time. Dark regions represent exposure duration, and light regions represent readout duration for all sensors.
Figure 23: The two different types of synchronizers used in our system. Green and blue represent the two different clock domains, and red acts as stabilizing logic in between.
Figure 24: Simplified fragment of a video data transmission using MIPI CSI-2 (top) and HiSPi (bottom). LP indicates logic high and low using single-ended signaling, while all other signals are LVDS. Green represents image data, all other colors represent control words as defined by the respective protocols.
Figure 25: OpenCV feature detection using a planar checkerboard pattern as mounted to a wall. The visual marker in the center is not actually used in the algorithm.
Figure 26: SADF graph representing the video streaming behaviour in our system and used as a basis for dataflow analysis.
Figure 27: Corresponding FSM for the Dref node in our dataflow model, representing write and refresh states with write as initial state.
Figure 28: Imaginary FSM for the Dref node in our dataflow model, using counters to switch between states and scenarios.
Figure 29: Simplified latency-rate model of our system, with the SRDF graph at the top and the corresponding task graph at the bottom. Dotted lines represent the imaginary flow of data in the system, and are not part of the actual graph.
Figure 30: Stages of development for the camera module PCB, from CAD design to production to pick and placing of SMD and BGA components.
Figure 31: Final prototype hardware, showing the five individual sensor modules (left) and the final setup as evaluated with the modules and FPGA hardware connected (right).
Figure 32: Stroboscope hardware, showing an array of 10 area LED lights driven by MOSFETs and connected to a real-time embedded system.
Figure 33: Timing characteristics of the stroboscope, strobing at a 1000 Hz rate, showing two LED-driving MOSFETs being subsequently triggered at 1 ms with an error of only 0.02%. Measured using a 4 GHz logic analyzer.
Figure 34: Final frames as captured by four sensors of the prototype at 240 Hz after being processed in the software domain, ordered from top-left to bottom-right and showing a seamlessly overlapping capture of the timed stroboscope lights from sensor to sensor. Each subsequent frame is captured by a different sensor, at different subsequent points in time. Stroboscope lights run from bottom left to top right in time, as indicated by the white arrow in the first image.
Figure 35: Sensor timing diagram corresponding to our four sensor prototype at 240 Hz and the frames in Figure 34. Our stroboscope LED timing is shown on top, with each number and color representing one of the 10 LEDs that is turned on at that time for 1 ms. Dark green represents a single frame, or the light integrated by a single sensor in the time domain for a single frame exposure, i.e. the first frame captures light from 4 different strobe LEDs. Light green represents the unavoidable time in which a sensor's shutter is closed and image data is read out to our system. When all frames are combined, they in fact form a continuous integration of physical light with an exposure time and capture rate equal to 4.2 ms or 240 Hz, as witnessed in Figure 34.
Figure 36: Timing characteristics of the I2C exposure start commands as sent to different sensors in case of a hypothetical 1200 Hz (833 us) capture rate. Measured using a 4 GHz logic analyzer.

1 INTRODUCTION

Over the past century, the world has seen a steady increase in technological advancement in the field of digital photography. Now more than ever, consumers rely on the availability of advanced digital cameras in personal devices such as mobile phones, tablets and personal computers to snap high resolution pictures and videos, all the while cinematographers in the field find themselves with an ever increasing variety of very high-end cameras. And in between these two markets, we have witnessed the rise of an entirely new "prosumer" segment, wielding action cameras and 4K camcorders in the lower high-end spectrum to capture semi-professional video.

Some of the most obvious advancements in cameras can be found in the higher resolution imaging capabilities of sensors, as well as in semiconductor production techniques that have seen vast improvements over the years. As a result, sensors have become smaller, more capable and cheaper to produce, and the cost of including such a sensor of acceptable quality in a new embedded consumer product is relatively low, especially in the lower segment of products. Furthermore, some parts of these sensors have become standardized in the industry, practically turning many imaging sensors and related technology into commercial off-the-shelf components.

However, one area of videography that has seen increasing demand but quite conservative technological innovation is the field of high-speed imaging. With the advent of television and on-line documentary series revolving around the capture of slow motion footage, consumers and prosumers have been voicing their interest in creating slow motion video at home. While professional high-speed cameras have long been available, their prices are far outside the reach of these markets, often costing up to three orders of magnitude more than conventional cameras. The reason for this is quite simple: high-speed imaging puts extreme demands on an imaging sensor and its surrounding platform in terms of bandwidth, noise and timing, as we will see in this thesis, raising the cost price of a single high-speed imaging sensor into the hundreds or thousands of dollars, let alone the required effort and expertise to design the surrounding hardware.

Of course, this does not mean that there is no other way to provide technological innovation in this field. Taking into account that, at the lower end, imaging sensors are becoming more standardized and cheaper, the question arises as to whether it is now possible to use multiple low-end sensors to achieve the same as a single high-end sensor.
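The phased-start principle behind this question (detailed later in Section 4.5.4) can be made concrete with a small timing sketch. The Python snippet below is a minimal illustration, not taken from the thesis: the function name phased_start_offsets and the 4-sensor, 60 Hz parameters are assumptions, chosen only so the numbers line up with the four-sensor, 240 Hz prototype described in Figures 34 and 35.

    def phased_start_offsets(num_sensors: int, sensor_rate_hz: float) -> list[float]:
        """Start-time offsets (in seconds) that spread the sensors'
        exposure windows evenly across one sensor frame period."""
        frame_period = 1.0 / sensor_rate_hz      # one sensor's frame period
        step = frame_period / num_sensors        # effective array frame period
        return [k * step for k in range(num_sensors)]

    if __name__ == "__main__":
        n, f = 4, 60.0                           # assumed: 4 sensors at 60 Hz each
        offsets = phased_start_offsets(n, f)
        print("start offsets (ms):", [round(t * 1e3, 2) for t in offsets])
        print("effective rate (Hz):", n * f)
        # start offsets (ms): [0.0, 4.17, 8.33, 12.5]
        # effective rate (Hz): 240.0

Each sensor still runs at its native rate; the array's effective rate comes from staggering the exposure windows by 1/240 s (about 4.2 ms), matching the sensor-to-sensor offset seen in Figure 35. In the prototype, such offsets are realized by timed I2C exposure start commands sent to each sensor (Section 4.5.4, Figure 36).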
