Remote Sensing
Article

Capability Assessment and Performance Metrics for the Titan Multispectral Mapping Lidar

Juan Carlos Fernandez-Diaz 1,2,*, William E. Carter 1,2, Craig Glennie 1,2, Ramesh L. Shrestha 1,2, Zhigang Pan 1,2, Nima Ekhtari 1,2, Abhinav Singhania 1,2, Darren Hauser 1,2 and Michael Sartori 1,2

1 National Center for Airborne Laser Mapping (NCALM), Houston, TX 77204, USA; [email protected] (W.E.C.); [email protected] (C.G.); [email protected] (R.L.S.); [email protected] (Z.P.); [email protected] (N.E.); [email protected] (A.S.); [email protected] (D.H.); [email protected] (M.S.)
2 Department of Civil and Environmental Engineering, University of Houston, Houston, TX 77204, USA
* Correspondence: [email protected]; Tel.: +1-832-842-8884

Academic Editors: Jie Shan, Juha Hyyppä, Guoqing Zhou, Wolfgang Wagner and Prasad S. Thenkabail
Received: 17 August 2016; Accepted: 3 November 2016; Published: 10 November 2016
Remote Sens. 2016, 8, 936; doi:10.3390/rs8110936

Abstract: In this paper we present a description of a new multispectral airborne mapping light detection and ranging (lidar) system along with performance results obtained from two years of data collection and test campaigns. The Titan multiwave lidar is manufactured by Teledyne Optech Inc. (Toronto, ON, Canada) and emits laser pulses at the 1550, 1064 and 532 nm wavelengths simultaneously through a single oscillating mirror scanner at pulse repetition frequencies (PRF) that range from 50 to 300 kHz per wavelength (max combined PRF of 900 kHz). The Titan system can perform simultaneous mapping in terrestrial and very shallow water environments, and its multispectral capability enables new applications, such as the production of false color active imagery derived from the lidar return intensities and the automated classification of targets and land covers. Field tests and mapping projects performed over the past two years demonstrate capabilities to classify five land covers in urban environments with an accuracy of 90%, map bathymetry under more than 15 m of water, and map thick vegetation canopies at sub-meter vertical resolutions. In addition to its multispectral and performance characteristics, the Titan system is designed with several redundancies and diversity schemes that have proven to be beneficial for both operations and the improvement of data quality.

Keywords: airborne laser scanning; mapping lidar; multispectral lidar; lidar bathymetry; lidar accuracy; lidar range resolution; active imagery

1. Introduction

Over the past two decades airborne mapping light detection and ranging (lidar), also known as airborne laser scanning (ALS), has become one of the prime remote sensing technologies for sampling the Earth's surface and land cover in three dimensions (3D), especially in areas covered by vegetation canopies [1]. In addition to the range (spatial) information derived from time-of-flight (ToF) measurements, pulsed airborne lidar sensors deliver an arbitrarily scaled measure of the strength of the optical backscattered signal that is proportional to the radiance incident on the detector, typically referred to as intensity. Intensity is correlated with a target's reflectance at the given laser wavelength, making it useful in the interpretation of the lidar spatial information [2] or as a standalone data source for identifying the characteristics of the likely backscattering surface for each return [3]. The intensity also depends on other target characteristics such as roughness and the lidar cross-section, and on sensor-target geometry parameters including range and incidence angle. Currently the general practice is to digitally scale the "intensity" signal to normalize it to a given range.
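As a simple illustration of this range scaling (not the processing applied to the Titan data in this paper), the sketch below renormalizes raw intensities to a common reference range using an inverse-square assumption; the reference range, the exponent and the sample values are illustrative assumptions only.

    import numpy as np

    def normalize_intensity(intensity, slant_range_m, ref_range_m=1000.0, exponent=2.0):
        """Scale raw return intensities to a common reference range.

        Assumes the received signal falls off roughly as 1/R**exponent;
        exponent=2 is the common inverse-square choice, although some
        workflows fit the exponent empirically.
        """
        intensity = np.asarray(intensity, dtype=float)
        slant_range_m = np.asarray(slant_range_m, dtype=float)
        return intensity * (slant_range_m / ref_range_m) ** exponent

    # Two hypothetical returns recorded at 900 m and 1500 m slant range
    print(normalize_intensity([1200.0, 700.0], [900.0, 1500.0]))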
However, the quality of intensity information can be improved through geometric and radiometric corrections for other factors such as incidence angle and atmospheric attenuation [4–6]. Taken a step further, radiometric calibration methods transform intensities to physical quantities such as target reflectance [7], allowing interpretation with respect to known material spectra.

In the past, intensity information has been used for various applications, including land cover classification [3,5]; enhancement of lidar ground return classification [8,9]; fusion with multispectral and hyperspectral data to enable a better characterization of terrain or land cover [10]; derivation of forest parameters and tree species mapping [11–13]; and production of greyscale stereo-pair imagery to generate breaklines traditionally used in photogrammetry through a technique named lidargrammetry [14]. However, the usefulness of lidar intensity information has been fundamentally limited because it provides a measure of backscatter at a single, narrow laser wavelength band. This is usually a near-infrared (NIR) wavelength (1064 or 1550 nm) for topographic lidar systems and a green-blue wavelength [15] for bathymetric lidar systems. Currently, the second harmonic of neodymium-doped yttrium aluminum garnet (Nd:YAG) lasers, 532 nm, is a common choice for bathymetric lidars [16]. This single wavelength spectral limitation has been previously recognized, and several experiments have attempted to mitigate it by combining data obtained from individual sensors that operate at different laser wavelengths [17,18] or by observations from prototype multispectral lidar systems [19–21]. These multispectral lidar experiments are testament to the potential of this newly developing remote sensing technique, and as with any other technology, the potential is coupled with challenges. Some of these challenges are related to physical principles (atmospheric transparency, background solar radiation, etc.) and hardware limitations (available laser wavelengths, eye safety) [22], while other limitations are related to software and algorithms (e.g., radiometric calibration of the raw lidar intensities) [17].

This paper presents a general overview of the design and performance of the first operational multispectral airborne laser scanner that collects range, intensity and optionally full waveform return data at three different laser wavelengths (1550, 1064, 532 nm) through a single scanning mechanism. This airborne multispectral lidar scanner is the Teledyne Optech Titan MW (multi-wavelength), which was developed based on the specifications and technical requirements of the National Science Foundation (NSF) National Center for Airborne Laser Mapping (NCALM). The system was delivered to the University of Houston (UH) in October of 2014. Since then, the sensor has undergone significant testing, improvement and fine tuning in a wide range of environments [23].

This paper is intended to highlight the flexibility of the Titan to perform 3D and active multispectral mapping for different applications (bathymetry, urban mapping, ground cover classification, forestry, archeology, etc.) without delving too deeply into one specific application. Individual papers that provide more details for specific applications are currently in preparation. The basic research question addressed in this paper relates to the performance of the Titan system, which, given its operational flexibility, has to be assessed using a variety of metrics that vary according to the application.
The performance metrics discussed within this work are: (a) the accuracy of experimental ground cover classification based on un-calibrated multispectral lidar return intensity and structural metrics in an urban environment (Houston, TX, USA); (b) maximum water penetration, and (c) accuracy of the measured water depths under ideal bathymetric mapping conditions (Destin, FL, USA, and San Salvador Island, Bahamas); (d) canopy penetration; (e) range resolution in tropical rain forests (Guatemala, Belize and Mexico); and (f) precision and accuracy of topographically derived elevations.

It is important to clarify that since the delivery of NCALM's original Titan system, the manufacturer has produced other Titan sensors; however, not all units have the same engineering specifications or system design. The design of the sensor is flexible, allowing the exact configuration and performance of the unit to be adapted to the specific needs of a customer. The discussion presented below is specific to the design and performance of the NCALM unit and may not be applicable or reproducible for other Titan units, even though they carry the same make and model designation.

This paper is structured as follows: Section 2 presents a high-level description of the Titan system design and operational characteristics; Section 3 presents the results and discussion of performance tests related to (a) ground cover classification based on multispectral intensity; (b) bathymetric capabilities; (c) canopy penetration and characterization; (d) redundancy and diversity design schemes; and (e) vertical positional precision and accuracy; finally, conclusions are presented in Section 4.

2. Titan Instrument Description

The first Titan MW lidar sensor (Serial number 14SEN/CON340) was developed to meet operational specifications and requirements established by NCALM. The specifications called for a multipurpose integrated multichannel/multispectral airborne mapping lidar unit, with an integrated high resolution digital camera, that could seamlessly map terrain and shallow water bathymetry environments from flying heights between 300 and 2000 m above ground level (AGL). NCALM's operational experience with airborne lidar units operating at 1064 and 532 nm wavelengths, and the requirement to perform simultaneous terrestrial and bathymetry mapping, determined two of the three laser wavelengths.
There were several laser wavelengths considered for the third channel, including 950 and 1550 nm; however, the 1550 nm option was selected because of the ease in complying with eye safety regulations, the proven reliability of the laser sources, and because it results in nearly equally spaced (500 nm) spectral sampling wavelengths when combined with the 1064 and 532 nm wavelengths (Figure 1).

Figure 1. The Titan's operational wavelengths with reference to reflectance spectra of different land cover features and the Landsat 8 Operational Land Imager (OLI) passive imaging bands.

NCALM's Titan has two fiber laser sources. The first source, "Laser A", has a primary output at 1064 nm. Part of the 1064 nm output is directly used as channel two for the lidar and another part of the output is passed through a frequency-doubling crystal to obtain 532 nm wavelength pulses for channel three of the lidar unit. The second laser source, "Laser B", has its output at the 1550 nm wavelength and is used as the source for lidar channel one. Both lasers are synchronized and can produce pulse rates between 50 and 300 kHz, programmable at 25 kHz intervals. The laser output of the Titan corresponds to Class IV as per United States Food and Drug Administration 21 Code of Federal Regulations 1040.10 and 1040.11, and International Electrotechnical Commission 60825-1. The characteristics of the individual laser sources, as well as other characteristics of each lidar channel required for assessing system behavior and performance, are presented in Table 1.

Table 1. Performance specifications of each of the Titan's channels.

Parameter                        | Channel 1   | Channel 2 | Channel 3
Laser Wavelength (nm)            | 1550        | 1064      | 532
Look angle (degrees)             | 3.5 forward | nadir     | 7.0 forward
Pulse Repetition Frequency (kHz) | 50–300      | 50–300    | 50–300
Beam Divergence (mrad)           | ~0.36       | ~0.3      | ~1.0
Pulse Energy (µJ)                | 50–20       | ~15       | ~30
Pulse Width (ns)                 | ~2.7        | 3–4       | ~3.7

Other equipment providers offer single-pass multispectral lidar systems by mounting, either in tandem or side by side, individual sensor heads operating at different wavelengths [22]. However, for the Titan system, we required the operation of the different wavelengths' lasers through a single scanning mechanism to provide near-identical swaths from the three wavelength channels. The channels are arranged such that the 1064 nm channel points at the nadir, and the 1550 and 532 nm channels are pointed 3.5° and 7° forward of the nadir, respectively. The primary reason for this configuration was to minimize returns from the water surface and maximize the probability of water penetration for the 532 nm pulses. A secondary reason was to maximize correlation with legacy lidar datasets collected with a 1064 nm channel pointing to the nadir. The Titan scanner has a ±30° field of scan and a maximum scanner product (half scan angle × scan frequency) of 800 degrees-Hertz.
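To make the scanner product constraint concrete, the sketch below (our illustration, not manufacturer software) computes the maximum scan frequency allowed for a chosen half scan angle and the corresponding flat-terrain swath width; the 800 degrees-Hertz product and the ±30° field of scan come from the text above, while the 1000 m flying height is an assumed example.

    import math

    SCANNER_PRODUCT_DEG_HZ = 800.0  # half scan angle (deg) x scan frequency (Hz)

    def max_scan_frequency_hz(half_angle_deg):
        """Largest scan frequency that still respects the scanner product limit."""
        return SCANNER_PRODUCT_DEG_HZ / half_angle_deg

    def swath_width_m(agl_m, half_angle_deg):
        """Cross-track swath width on flat terrain for a nadir-centered scan."""
        return 2.0 * agl_m * math.tan(math.radians(half_angle_deg))

    for half_angle in (30.0, 20.0, 10.0):  # degrees
        print(f"+/-{half_angle:4.1f} deg: max scan freq {max_scan_frequency_hz(half_angle):5.1f} Hz, "
              f"swath at 1000 m AGL {swath_width_m(1000.0, half_angle):6.1f} m")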
The beam divergence values are close to 0.3 milliradians for the 1064 and 1550 nm channels and one milliradian for the 532 nm channel (see Table 1). Boresight parameters for each channel are currently determined independently, and combined with the sensor model in the manufacturer's proprietary software to obtain a geometrically correct and consistent point cloud. Individual point cloud files are generated for each flight line and channel.

The laser return signal is analyzed in real time through an analogue constant fraction discriminator (CFD) [24,25] which detects and records discrete laser returns (ranges and intensities) at all pulse repetition frequencies. While the system can detect a large number of returns per pulse, it only records up to four returns for each outgoing laser pulse (first, second, third and last). In addition to the analogue CFD, the outgoing and return waveforms of all the channels can optionally be digitized at a 12 bit amplitude quantization resolution and at a rate of 1 gigasample/s. Currently, this digitization can only be done for each outgoing pulse and return waveform at a maximum PRF of 100 kHz; for higher PRFs, full waveform digitization is only performed for a decimated subset of the emitted pulses.

The Titan is capable of ranging beyond the single pulse-in-the-air limit (range ambiguity), meaning that it is able to obtain accurate ranges at high PRFs when there are several laser pulses from each channel in the air simultaneously before a return from the first emitted pulse is obtained [26,27]. The Titan measures in a fixed multi-pulse mode, which means that the sensor needs to be aware of how many pulses are planned to be simultaneously in the air in order to compute accurate ranges. This is done in the planning phase of the data collection. If, for some reason, the actual number of pulses-in-the-air (PIA) is different than the planned value, the system will produce erroneous range values. This fixed multi-pulse capability has certain combinations of ranges and PRFs where the sensor is not able to resolve the range ambiguities, which the manufacturer calls "blind zones". The PRFs and range regions where the Titan can work without suffering range ambiguity are displayed in Figure 2 as the white regions between the colored bands, labeled according to the number of pulses-in-the-air (PIA) at a given time.

Figure 2. Pulse repetition frequency (PRF) versus range operation regions for NCALM's Titan sensor. The graph shows both regions of operation (white) as well as range ambiguity regions depicted as the solid colored bands.

Technically, the range ambiguity only occurs at a specific range or multiples of this range value. However, due to the wide scan angle and the forward-looking channels on the Titan (channel 1 and channel 3), the specific range at which the ambiguity occurs turns into a band of range values as it applies to the entire sensor. These blind zones, or, more accurately, ambiguity zones, are depicted as the solid color bands in Figure 2. Figure 2 also illustrates that as the PRF increases, the different PIA regions of operation get smaller. Another way of visualizing these regions of operation is to relate them to how much range variation the sensor can experience within a single flight line as the result of terrain variation or elevation of manmade structures. The higher the PRF, the less terrain relief can be tolerated by the sensor without entering into the range ambiguity regions depicted in the figure.
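The fixed multi-pulse bookkeeping can be illustrated with the following sketch, which counts how many pulses are in the air for a given PRF and slant range and flags ranges that fall near a pulses-in-the-air boundary. This is a simplified one-dimensional model, not the manufacturer's blind-zone definition, and the ±100 m band width is an arbitrary illustrative value.

    # Minimal sketch of the pulses-in-the-air (PIA) bookkeeping described above.
    C_M_S = 299_792_458.0  # speed of light in m/s

    def unambiguous_interval_m(prf_hz):
        """Slant-range interval spanned by one inter-pulse period: c / (2 * PRF)."""
        return C_M_S / (2.0 * prf_hz)

    def pulses_in_air(slant_range_m, prf_hz):
        """Number of pulses simultaneously in flight for a given range and PRF."""
        return int(slant_range_m // unambiguous_interval_m(prf_hz)) + 1

    def near_pia_boundary(slant_range_m, prf_hz, band_m=100.0):
        """True when the range lies within an assumed +/- band of a PIA transition."""
        interval = unambiguous_interval_m(prf_hz)
        offset = slant_range_m % interval
        return min(offset, interval - offset) < band_m

    prf = 300e3  # 300 kHz per channel, the Titan maximum
    for rng in (400.0, 500.0, 900.0, 1500.0):  # example slant ranges in metres
        print(rng, pulses_in_air(rng, prf), near_pia_boundary(rng, prf))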
The laser shot densities obtainable with the Titan are mainly a function of instrument parameters (laser PRF, scan angle, scan frequency) and flying parameters (ground speed and flying height above terrain). However, eye safety regulations and range ambiguity also limit the maximum measurement density obtainable for a specific flying height. Figure 3 illustrates the laser shot density operational envelope of the Titan as a function of the flying height for a single channel and single pass (i.e., no swath overlap). The figure is for reference only and is based on the specific assumptions described below; density values outside the envelope can be obtained under specific circumstances but are not considered to be normal mapping operations.

Figure 3. Laser shot density envelope for a single pass and single channel of the Titan sensor.

In Figure 3, the lower limit of the envelope is obtained by assuming the lowest PRF of 50 kHz, a ground speed of 150 knots and a scanner operating at 25 Hz at the maximum field of view (±30°). At lower flight heights the maximum density is limited by eye safety considerations. Each laser PRF has a nominal ocular hazardous distance (NOHD) that increases as the PRF increases. While it is technically possible to operate at higher PRFs at low altitudes, this does not comply with applicable eye safety regulations and thus it is necessary to use the maximum PRF that ensures exceeding the NOHD at ground level (see Figure 3). The upper limit of the envelope is obtained by assuming the highest possible PRF for a given flying height (taking into account the range ambiguity regions), a ground speed of 150 knots and a scanner operating at 70 Hz with a very narrow field of view (±10°). The peaks and valleys in the upper envelope limit are caused by the range ambiguity regions (Figure 2), given that for specific flying heights the Titan's maximum operation PRF of 300 kHz per channel falls within a range ambiguity region and thus the PRF has to be reduced to measure unambiguous ranges.
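A first-order version of the density calculation behind Figure 3 is sketched below: for one channel and one pass over flat terrain, the average shot density is simply the PRF divided by the area swept per second (ground speed times swath width). The 150 knot ground speed and the ±30° and ±10° scan angles come from the envelope assumptions above; treating the shots as uniformly distributed is a simplification.

    import math

    KNOTS_TO_M_S = 0.514444  # metres per second per knot

    def shot_density_per_m2(prf_hz, agl_m, half_angle_deg, ground_speed_knots=150.0):
        """Average laser shots per square metre for one channel and one pass."""
        swath_m = 2.0 * agl_m * math.tan(math.radians(half_angle_deg))
        speed_m_s = ground_speed_knots * KNOTS_TO_M_S
        return prf_hz / (speed_m_s * swath_m)

    # Lower-envelope style case: 50 kHz PRF, wide +/-30 deg scan, 1000 m AGL
    print(round(shot_density_per_m2(50e3, 1000.0, 30.0), 2), "shots/m^2")
    # Upper-envelope style case: 300 kHz PRF, narrow +/-10 deg scan, 1000 m AGL
    print(round(shot_density_per_m2(300e3, 1000.0, 10.0), 2), "shots/m^2")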
The digital camera integrated into the Titan sensor is a DIMAC Ultralight (also known as D-8900). The D-8900 is based on a charged coupled device (CCD) with 60 megapixels, each with a dimension of 6 µm × 6 µm. The pixels are arranged in an array of 8984 pixels oriented perpendicular to the flight direction and 6732 pixels along the flight direction, which translates to a CCD physical frame size of 5.39 cm × 4.04 cm. The image is formed on the focal plane through a compound lens with a nominal focal length of 70 mm. The combination of the lens and CCD array yields a total field-of-view (FOV) of 42.1° × 32.2° and a ground sample distance (GSD) of 0.0000825 × flying height. The position of the CCD is adjusted during flight by a piezo actuator to compensate for the motion of the aircraft during an exposure, reducing the pixel smear at very small GSDs. The digital camera can be triggered in a variety of modes (time interval, position) from the Titan control software, or independently through its own control software. In addition to its integrated camera, the Titan system can interface with other imaging sensors; for example, NCALM has integrated the Titan with an ITRES CASI-1500 hyperspectral camera.
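The published camera geometry reduces to a couple of one-line calculations, sketched below for a few assumed flying heights: the GSD follows the 0.0000825 × flying height rule stated above, and the frame footprint follows from the 42.1° × 32.2° field of view under a flat-terrain, nadir-looking assumption.

    import math

    GSD_PER_M_AGL = 0.0000825  # GSD in metres per metre of flying height (from the text)
    FOV_ACROSS_DEG = 42.1      # across-track field of view
    FOV_ALONG_DEG = 32.2       # along-track field of view

    def gsd_m(agl_m):
        """Ground sample distance at a given flying height above ground."""
        return GSD_PER_M_AGL * agl_m

    def frame_footprint_m(agl_m):
        """Approximate ground coverage (across-track, along-track) of one frame."""
        across = 2.0 * agl_m * math.tan(math.radians(FOV_ACROSS_DEG / 2.0))
        along = 2.0 * agl_m * math.tan(math.radians(FOV_ALONG_DEG / 2.0))
        return across, along

    for agl in (300.0, 1000.0, 2000.0):  # assumed example flying heights (m AGL)
        across, along = frame_footprint_m(agl)
        print(f"{agl:6.0f} m AGL: GSD {gsd_m(agl) * 100:4.1f} cm, "
              f"footprint {across:5.0f} m x {along:5.0f} m")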
Physically, the Titan consists of a sensor head and a sensor control rack. The sensor rack in its base configuration consists of an electrical power unit and a control computer. Depending on what components are being operated in conjunction with the lidar sensor, the control rack can also incorporate up to three additional control computers for the waveform digitizers (one per channel) and the control computer for the digital camera. The sensor head is basically a cylinder with a box on the top; the cylinder houses the optical components (scanner, camera, lasers) and the system inertial measurement unit (IMU), and the box on top encloses the electronic components of the sensor. The cylinder has an approximate diameter of 45 cm and a height of 55.5 cm. The sensor control rack is enclosed in an Edak transit case with a standard 19-in-wide rack and a footprint of 2600 cm². Figure 4 shows photos of the Titan sensor integrated on a DHC-6 Twin Otter airplane.

Figure 4. The Titan multispectral lidar sensor integrated into a DHC-6 Twin Otter aircraft: (a) Overview of installation layout from the port side of the sensor head; (b) View from front looking aft, sensor control rack is in the foreground, sensor head in the background; (c) View of the sensor head through the mapping port of the aircraft. The laser output window is the rectangular window on the right, and the DIMAC camera lens is behind the circular window.

3. Field Testing of Capabilities

Since the delivery of the Titan sensor to NCALM in October 2014, the sensor has been used to collect data for more than 30 projects across the United States and in remote locations such as Antarctica and the Petén jungle in Guatemala.
With over 140 h of laser-on operation, the Titan sensor has been tested in a wide variety of environments, for diverse applications and throughout its design operational envelope. Table 2 presents details of the mapping and test projects acquired with the Titan sensor. In addition to project data collection, NCALM has also performed tests to assess the performance of the sensor under different operational conditions. This paper presents results from some of these performance tests, focused on ground cover classification based on multispectral lidar intensity and structural metrics in urban environments, bathymetry (maximum detectable depth and accuracy of bathymetric elevations), the ability to penetrate thick tropical forest canopies, the ability to finely resolve the vertical structure of vegetated environments (range resolution), and its vertical positional accuracy.

Table 2. Summary of projects and test collections to date with the Titan system.

Project/Test Location                     | Collection Year and Day of Year              | Primary Application           | Laser on Time (h)
Baytown, TX, USA                          | 2014: 289, 300; 2015: 47, 222; 2016: 127–129 | System test                   | -
Houston, TX, USA                          | 2014: 289                                    | System test                   | 0.4
Jordan, MT, USA                           | 2014: 291                                    | Geomorphology                 | 0.7
Hebgen Lake, MT, USA                      | 2014: 292                                    | Tectonics                     | 1.1
Big Creek River, ID, USA                  | 2014: 293                                    | River bathymetry              | 1.0
Greys River, WY, USA                      | 2014: 295                                    | River bathymetry              | 1.3
Bishop, CA, USA                           | 2014: 296                                    | Geomorphology                 | 0.7
Wheeler Ridge, CA, USA                    | 2014: 296                                    | Geomorphology                 | 0.8
Yucaipa, CA, USA                          | 2014: 297                                    | Forestry                      | 0.9
Beaver, UT, USA                           | 2014: 299                                    | River morphology              | 0.7
McMurdo Dry Valleys, Antarctica           | 2014: 338 to 2015: 19                        | Geomorphology                 | 47.5
El Ceibal, Peten, Guatemala               | 2015: 77–82                                  | Archaeology                   | 5.5
Zacapu, Michoacan, Mexico                 | 2015: 88                                     | Archaeology                   | 0.7
Angamuco, Michoacan, Mexico               | 2015: 88                                     | Archaeology                   | 0.8
Teotihuacan, Mexico                       | 2015: 91–92                                  | Archaeology                   | 1.69
(Laser servicing)
Trinity River, TX, USA                    | 2015: 219–222                                | River morphology              | 4.8
NASA JSC Clear Lake, TX, USA              | 2015: 223–227                                | Climate change resiliency     | 5.1
Barataria Bay, LA, USA                    | 2015: 228–230                                | Marsh response to oil spill   | 6.8
Destin Inlet, FL, USA                     | 2015: 231                                    | Bathymetry test               | 1.5
Apalachicola, FL                          | 2015: 232–233                                | Aquatic ecosystem             | 1.6
Redfish Bay, TX                           | 2015: 235                                    | Bathymetry test               | 0.8
Texas Gulf Coast, USA                     | 2015: 235                                    | Coastal morphology            | 0.5
Reynolds Creek, ID, USA                   | 2015: 289, 294                               | Ecology                       | 2.6
Santa Clara River, CA, USA                | 2015: 291–293                                | River morphology              | 3.4
Calhoun Creek, SC, USA                    | 2016: 057                                    | Ecology                       | 2.2
(Laser servicing)
Campeche, Mexico                          | 2016: 138–141                                | Archaeology                   | 7.6
Lake Peak Fault, CA, USA                  | 2016: 155                                    | Tectonics                     | 1
Inyo Domes, CA, USA                       | 2016: 158                                    | Geomorphology                 | 0.5
Monterey, CA, USA                         | 2016: 157, 159                               | Urban spectral classification | 0.7
Bastrop, TX, USA                          | 2016: 161                                    | Orthophotos                   | 0.3
North Western Belize                      | 2016: 184–186                                | Archaeology, Geomorphology    | 5.0
San Salvador, Bahamas                     | 2016: 189–191                                | Island hydrology              | 7.0
Mayan Biosphere Reserve, Peten, Guatemala | 2016: 197–207                                | Archaeology, Ecology          | 23.7
3.1. Multispectral Capabilities

The Titan sensor was designed as a flexible multi-purpose and multi-application system; therefore, it may not outperform systems that were designed exclusively for bathymetry or for topographic mapping from high altitudes. However, one application for which it can excel is providing high resolution active multispectral data derived from lidar intensity. In theory and in practice, multispectral lidar intensity can be used for many applications, including return/target classification [28,29]; individual tree identification, parameterization and classification [30]; water/land interface identification; and archaeological feature detection [31], among many others. Several years will go by before scientists in each of the above-mentioned fields will have an opportunity to assess the utility of the Titan multispectral lidar datasets as compared to what has been theorized or experimented so far. For brevity, this paper will only briefly describe the use of Titan multispectral data to generate active lidar intensity images, describe the advantages and limitations of these intensity products and provide an example of ground cover classification using Titan data.

It does not take much imagination to realize that the simplest way to use multispectral lidar intensity is to enhance the visualization of lidar data, just as single-wavelength intensity values have been used in the past. As such, the most basic way of utilizing the multispectral intensity is for rendering false colored point clouds, where the intensity of each return is used to assign hue, saturation and brightness values. In this application, the hue and saturation are assigned based on the laser wavelength, and the lidar return intensity value determines or modulates the brightness value. The advantage of this approach is that the renderings can retain the precise and complex 3D spatial nature of the lidar data despite the return locations being irregularly spaced.

Multispectral lidar intensity can also be used to generate two-dimensional (2D) active intensity images by combining the spatial and spectral information of each laser return. There are several approaches to produce intensity imagery from irregularly spaced lidar point cloud data. The first and simpler method consists of assigning a greyscale value corresponding to the lidar intensity to each return and then rendering the point cloud in two dimensions to create a top-view intensity image. A second approach, which is more elaborate and provides more control over the spatial integrity of the image, consists of interpolating the intensity values into a regular two-dimensional horizontal array (grid) using methods such as triangulation with linear interpolation, kriging, or inverse distance weighting. This method is particularly useful when the horizontal distribution of the lidar returns is sparse as compared to the desired grid (raster) resolution.

When the return density is high enough to provide several laser returns per raster cell, it is possible to use a third approach. This approach consists of deriving a single intensity value for each raster cell from the intensity values of all the returns within the given raster cell. The resultant intensity raster can be produced by averaging intensity values using simple or weighted methods, or by determining the minimum, maximum, standard deviation or any other statistical metric to characterize the returns' intensity within the raster [32]. Herein, we refer to this approach as "binning". One advantage of binning, especially when it uses some sort of averaging mechanism, is that it reduces the variability of intensity values due to factors that are difficult to account for, such as irregular incidence angles and varying surface roughness. However, the greatest advantage of this method is that it enables the generation of false color imagery from multi-wavelength lidar intensities even when the centers of the footprints at the different wavelengths may not be exactly collocated, which is the case for the Titan sensor.
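As a minimal sketch of this binning approach (not NCALM's production workflow), the code below grids irregularly spaced returns into a raster of mean intensity per cell using only NumPy; the 1 m cell size, the choice of the mean as the statistic, and the toy input arrays are illustrative assumptions.

    import numpy as np

    def bin_mean_intensity(x, y, intensity, cell_size_m=1.0):
        """Grid irregular lidar returns into a raster of mean intensity per cell.

        x, y      : per-return horizontal coordinates in metres
        intensity : per-return raw intensity values
        """
        x, y, intensity = (np.asarray(a, dtype=float) for a in (x, y, intensity))
        col = ((x - x.min()) // cell_size_m).astype(int)
        row = ((y.max() - y) // cell_size_m).astype(int)   # row 0 at the northern edge
        sums = np.zeros((row.max() + 1, col.max() + 1))
        counts = np.zeros_like(sums)
        np.add.at(sums, (row, col), intensity)             # accumulate per-cell sums
        np.add.at(counts, (row, col), 1.0)                 # and per-cell return counts
        with np.errstate(invalid="ignore", divide="ignore"):
            return np.where(counts > 0, sums / counts, np.nan)  # NaN for empty cells

    # Toy example: four returns that fall into two adjacent 1 m cells
    print(bin_mean_intensity([0.2, 0.8, 1.3, 1.6], [0.5, 0.4, 0.6, 0.2], [10, 20, 30, 50]))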
Figure 5a–c show intensity images generated using the binning method for data collected with the Titan sensor over the campus of the University of Houston. When intensity information is available for three or more independent wavelengths, it is possible to generate false color RGB imagery. In the case of the Titan, with three different wavelengths, it is possible to combine the intensity images in six different arrangements. Figure 5d shows a false color RGB combination using the intensity from the 1550 nm wavelength as the red channel and the 1064 and 532 nm wavelengths' intensities for the green and blue image channels, respectively. By combining independent intensity images into a false color RGB image, it is possible to access the multidimensionality of the color space, which can highlight features that perhaps are not easily identified in a single-wavelength grayscale intensity image.

In addition to using binning to generate intensity rasters, the same technique can be applied to the elevation values of the returns within a bin to generate what in this paper are referred to as "structural" images (Figure 5e–g), as they provide information on the vertical structure of the targets within a raster cell. Similar to the way the intensity images are produced, the structural images are produced by assigning a single value to the raster based on some statistical measure derived from the elevations of the returns within a cell. These structural rasters can be generated from either absolute geodetic elevations (ellipsoidal or orthometric) or relative elevations such as height above local ground. For land cover classification purposes, statistical dispersion metrics such as elevation spread and standard deviation can be computed from either absolute or relative elevation values. However, for averaged elevation metrics to be of any use for classification, they need to be computed from relative elevation above local ground values. These structural images have great potential for assisting land cover classification tasks because different kinds of targets (buildings, roads, trees, power lines, etc.) have very distinct elevation signatures. Other structural metrics can be derived not only from elevation values but from return statistics such as the number of returns detected from a given raster cell (return density) and the ratio of returns to laser shots (Figure 5g). These return metrics images can be used to segregate impervious (roads, ground, building roofs) and diffuse targets (vegetation and water). Structural metrics from different lidar wavelengths can also aid in land cover classification. For example, a difference in mean elevation between structural images based on the 532 nm and either of the NIR wavelengths, for a given set of pixels, will indicate a potential water body, because the 532 nm wavelength may produce multiple returns from the water surface, the water column and the benthic layer, while the NIR wavelengths will detect returns only from the water surface or no returns at all.
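A hedged sketch of that water test is given below: given two co-registered "mean height" structural rasters, one from the 532 nm channel and one from a NIR channel, cells whose green-wavelength mean elevation sits noticeably below the NIR mean elevation are flagged as potential water. The 0.5 m threshold and the toy rasters are illustrative assumptions, and a real workflow would also need to handle cells with no returns.

    import numpy as np

    def flag_potential_water(mean_z_532, mean_z_nir, min_drop_m=0.5):
        """Flag raster cells whose 532 nm mean elevation is well below the NIR one."""
        mean_z_532 = np.asarray(mean_z_532, dtype=float)
        mean_z_nir = np.asarray(mean_z_nir, dtype=float)
        drop = mean_z_nir - mean_z_532  # positive where green returns come from below the surface
        return drop > min_drop_m

    # Toy 1 x 2 rasters: the second cell behaves like water (green mean well below NIR mean)
    print(flag_potential_water([[10.0, 2.1]], [[10.1, 4.0]]))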
Figure 5. Intensity and structural images generated from the Titan multispectral data. (a) Intensity image generated from the 1550 nm channel; (b) intensity image for the 1064 nm channel; (c) intensity image for the 532 nm channel; (d) false color multispectral intensity image generated by using the 1550 nm intensity for the red channel and the 1064 and 532 nm intensities for the green and blue channels; (e) structural image based on the spread of the returns' height; (f) structural image based on the height above ground; (g) structural image based on the number of returns per pulse; (h) ground cover classification results map.

Active intensity and structural images have some advantages over traditional passive images; specifically, they: (a) eliminate the dependency on solar illumination (imagery can be collected at night or below the cloud ceiling); (b) greatly reduce the effects of shadowing and occlusions caused by buildings and topography; (c) provide good knowledge of the illumination source (wavelength, amplitude, phase, polarization, etc.) which, in principle, should allow for easier calibration to remotely obtain target physical properties; and (d) provide images that are almost perfectly orthorectified, only limited by the positional accuracy of the lidar returns. However, there are also limitations or disadvantages of active intensity imagery, including:
However,thereare available laser wavelengths defined by the energy level transitions of the lasing materials and also limitations or disadvantages of active intensity imagery, including: (a) the limited number processes; (b) atmospheric attenuation of laser energy in the visible and near-infrared wavelengths of available laser wavelengths defined by the energy level transitions of the lasing materials and by low altitude (below sensor flying level) atmospheric phenomena such as clouds, haze and fog; processes;(b)atmosphericattenuationoflaserenergyinthevisibleandnear-infraredwavelengths and (c) the target surface is only partially illuminated (spatial sampling) by the lidar beams rather by low altitude (below sensor flying level) atmospheric phenomena such as clouds, haze and fog; than fully illuminated as in the case of passive imagery where almost the entire area (with the and(c)thetargetsurfaceisonlypartiallyilluminated(spatialsampling)bythelidarbeamsratherthan exception of areas with shadows) is illuminated by the sun. The active lidar intensity and structural fullyilluminatedasinthecaseofpassiveimagerywherealmosttheentirearea(withtheexceptionof images can be analyzed with the same techniques that are used to analyze images derived from areaswithshadows)isilluminatedbythesun. Theactivelidarintensityandstructuralimagescan passive optical sensors, for example to generate land cover classification maps (Figure 5h). beanalyzedwiththesametechniquesthatareusedtoanalyzeimagesderivedfrompassiveoptical sensors,forexampletogeneratelandcoverclassificationmaps(Figure5h).