sensors

Article

Optical Sensing to Determine Tomato Plant Spacing for Precise Agrochemical Application: Two Scenarios

Jorge Martínez-Guanter 1, Miguel Garrido-Izard 2, Constantino Valero 2, David C. Slaughter 3 and Manuel Pérez-Ruiz 1,*

1 Aerospace Engineering and Fluid Mechanics Department, University of Seville, 41013 Seville, Spain; [email protected]
2 Laboratorio de Propiedades Físicas (LPF_TAGRALIA), Universidad Politécnica de Madrid (UPM), 28040 Madrid, Spain; [email protected] (M.G.-I.); [email protected] (C.V.)
3 Department of Biological and Agricultural Engineering, University of California, Davis, CA 95616, USA; [email protected]
* Correspondence: [email protected]; Tel.: +34-954-481-389

Academic Editor: Dimitrios Moshou
Received: 31 March 2017; Accepted: 8 May 2017; Published: 11 May 2017

Abstract: The feasibility of automated individual crop plant care in vegetable crop fields has increased, resulting in improved efficiency and economic benefits. A systems-based approach is a key feature in the engineering design of mechanization that incorporates precision sensing techniques. The objective of this study was to design new sensing capabilities to measure crop plant spacing under different test conditions (California, USA and Andalucía, Spain). For this study, three different types of optical sensors were used: an optical light-beam sensor (880 nm), a Light Detection and Ranging (LiDAR) sensor (905 nm), and an RGB camera. Field trials were conducted on newly transplanted tomato plants, using an encoder as a local reference system. Test results achieved a 98% accuracy in detection using light-beam sensors, while a 96% accuracy in plant detection was achieved in the best of the replications using LiDAR. These results can contribute to decision-making regarding the use of these sensors by machinery manufacturers. This could lead to an advance in physical or chemical weed control on row crops, allowing significant reductions in, or even the elimination of, hand-weeding tasks.

Keywords: LiDAR; light-beam; plant localization; Kinect

1. Introduction

Precision agriculture requires accurate plant or seed distribution across a field. This distribution is to be optimized according to the size and shape of the area in which nutrients and light are provided to the plant to obtain the maximum possible yield. These factors are controlled by the spacing between crop rows and the spacing of plants/seeds within a row [1]. For many crops, row spacing is determined as much by the physical characteristics of the agricultural machinery used to work in the field as by the specific biological spacing requirements of the crop [2]. According to the crop and machinery used, the accuracy of planting by the precision transplanter/seeder to the desired square grid pattern must be adequate for the operation of agricultural machinery in both the longitudinal and transverse crop directions.

The current designs of vegetable crop transplanters and seeders utilize several uncoordinated planting modules mounted on a common transport frame. These systems use sub-optimal open-loop methods that neglect the dynamic and kinematic effects of the mobile transport frame and of plant motion relative to the frame and the soil. The current designs also neglect to employ complete mechanical control of the transplant during the entire planting process, producing an error in the final planting position due to the increased uncertainty of plant location as a result of natural variations in plant size, plant mass, soil traction and soil compaction [3].

Accurately locating the crop plant, in addition to allowing automatic control of weeds, allows individualized treatment of each plant (e.g., spraying, nutrients).
Seeking to ensure minimum physical interaction with plants (i.e., non-contact), different remote sensing techniques have been used for the precise localization of plants in fields. Among these localization methods, some authors have addressed automatic weed control by localizing crop plants with centimetre accuracy during seed drilling [4] or transplanting [5,6] using a real-time kinematic global navigation satellite system (RTK-GNSS). These studies, conducted at UC Davis, have shown differences between the RTK-GNSS-based expected seed location and the actual plant position. The position uncertainty ranged from 3.0 to 3.8 cm for seeds, and for tomato transplants, the mean system RMS error was 2.67 cm in the along-track direction. Nakarmi and Tang used an image acquisition platform after planting to estimate the inter-plant distance along the crop rows [7]. This system could measure inter-plant distance with a minimum error of ±30 cm and a maximum error of ±60 cm.

Today, one of the biggest challenges in agricultural row crop production in industrialized countries is the non-chemical control of intra-row (within the crop row) weed plants. Systems such as those developed by Pérez-Ruiz et al. [8] or the commercial platforms based on computer-controlled hoes developed by Dedousis et al. [9] are relevant examples of innovative mechanical weeding systems. However, the current effectiveness of mechanical weed removal is constrained by plant spacing, the proximity of the weeds to the plant, the plant height and the operation timing. Other methods for non-chemical weed control, such as the robotic platform developed by Blasco et al. [10] (capable of killing weeds using a 15-kV electrical discharge), the laser weeding system developed by Shah et al. [11] or the cross-flaming weed control machine designed for the RHEA project by Frasconi et al. [12], demonstrate that research to create a robust and efficient system is ongoing. A common feature of all these technological developments is the need for an accurate measurement of the distance between plants.

Spatial distribution and plant spacing are considered key parameters for characterizing a crop. The current trend is towards the use of optical sensors or image-based devices for these measurements, despite the possible limitations of such systems under uncontrolled conditions such as those in agricultural fields. These image-based tools aim to determine and accurately correlate several quantitative aspects of crops to enable plant phenotypes to be estimated [13,14].

Dworak et al. [15] categorized research studying inter-plant location measurements into two types: airborne and ground-based. Research on plant location and weed detection using airborne sensors has increased due to the growing potential of unmanned aerial systems in agriculture, which have been used in multiple applications in recent years [16]. For ground-based research, one of the most widely accepted techniques for plant location and classification is the use of Light Detection and Ranging (LiDAR) sensors [17]. These sensors provide distance measurements along a line scan at a very fast scanning rate and have been widely used for various applications in agriculture, including 3D tree representation for precise chemical applications [18,19] and in-field plant location [20]. This research continues the approach developed in [21], in which a combination of LiDAR + IR sensors mounted on a mobile platform was used for the detection and classification of tree stems in nurseries.

Based on the premise that accurate localization of the plant is key for precision chemical or physical removal of weeds, we propose in this paper a new methodology to precisely estimate tomato plant spacing. In this work, non-invasive methods using optical sensors such as LiDAR, infrared (IR) light-beam sensors and RGB-D cameras have been employed.
For this purpose, a platform was developed on which different sensor configurations were tested in two scenarios: North America (UC Davis, CA, USA) and Europe (University of Seville, Andalucía, Spain). The specific objectives, given this approach, were the following:

- To design and evaluate the performance of multi-sensor platforms attached to a tractor (a UC Davis platform mounted on the rear of the tractor and a University of Seville platform mounted on the front of the tractor).
- To refine the data-processing algorithm to select the most reliable sensor for the detection and localization of each tomato plant.

2. Materials and Methods

To develop a new sensor platform to measure the spacing between plants in the same crop row accurately, laboratory and field tests were conducted in Andalucía (Spain) and in California (USA). This allowed the researchers to obtain more data under different field conditions and to implement the required system improvements, considering the plant spacing objective. These tests are described below, characterizing the sensors used and the parameters measured.

2.1. Plant Location Sensors

2.1.1. Light-Beam Sensor Specifications

IR light-beam sensors (Banner SM31EL/RL, Banner Engineering Co., Minneapolis, MN, USA) were used in two configurations: first as a light curtain (with three pairs of sensors set vertically; Figure 1, centre, and Figure 4) and later in a simpler setup using only one pair of sensors (Figure 2), which simplifies the system while still allowing the objective (plant spacing measurement) to be attained. In the light curtain, the light-beam sensors were placed transversely in the middle of the platform to detect and discriminate the plant stem, in a cross configuration to prevent crossing signals between adjacent sensors. Due to the short range and focus required in the laboratory tests, it was necessary to reduce the field of view and the strength of the light signal by masking the emitter and receiver lenses with a 3D-printed conical element. In the laboratory tests, the height of the first emitter and receiver pair above the platform was 4 cm, and the height of the 3D plants (artificial plants were used in laboratory tests; see Section 2.2) was 13 cm. In the field tests, the sensor was placed 12 cm from the soil (the average height measured manually for real plants in the outdoor tests was 19.5 cm) to avoid obstacles in the field (e.g., dirt clods, slight surface undulations). In both cases, the receiver was set to output a TTL pulse each time the IR light-beam was blocked by any part of the plant. The signals generated by the sensors were collected and time-stamped by a microcontroller in real time and stored for off-line analysis. The technical features of the IR light-beam sensors are presented in Table 1.

Table 1. IR light-beam sensor features.

Operational voltage (V): 10–30 V
Detection range (m): 30 m
Response time (ms): 1 ms
Sinking and sourcing outputs (mA): 150 mA

2.1.2. Laser Scanner

An LMS 111 LiDAR laser scanner (SICK AG, Waldkirch, Germany) was used in the laboratory and field testing platforms to generate a high-density point cloud on which to perform the localization measurements. Its main characteristics are summarized in Table 2. The basic operating principle of the LiDAR sensor is the projection of an optical signal onto the surface of an object at a certain angle and range. Processing the corresponding reflected signal allows the sensor to determine the distance to the plant. The LiDAR sensor was interfaced with a computer through an RJ45 Ethernet port for data recording. Data resolution was greatly affected by the speed of the platform's movement; thus, maintaining a constant speed was of key importance for accurate measurements.
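The dependence of resolution on speed can be made concrete: the along-track spacing between consecutive scan lines is simply the travel speed divided by the scanning frequency. The short MATLAB sketch below (illustrative only; variable names are ours) applies this relation to the nominal speeds used in this work, namely the laboratory belt speed of Section 2.2 and the average field speed of Section 2.3.

```matlab
% Along-track spacing between consecutive LiDAR scan lines as a function of
% platform speed and scanning frequency (LMS 111 at 50 Hz, Table 2).
scanFreq = 50;                      % scanning frequency (Hz)
vLab     = 1.35 / 3.6;              % laboratory belt speed, 1.35 km/h -> m/s
vField   = 0.36;                    % average tractor speed in the field (m/s)

spacingLab   = vLab   / scanFreq * 1000;   % ~7.5 mm between scan lines
spacingField = vField / scanFreq * 1000;   % ~7.2 mm between scan lines
fprintf('Scan spacing: lab %.1f mm, field %.1f mm\n', spacingLab, spacingField);
```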
During data acquisition, two digital filters were activated to optimize the measured distance values: a fog filter (making the sensor less sensitive in the near range, up to approximately 4 m) and an N-pulse-to-1-pulse filter, which filters out the first reflected pulse in case two pulses are reflected by two objects during a measurement [22]. Different LiDAR scan orientations were evaluated: scanning vertically with the sensor looking downwards (Figure 1), scanning with a 45° inclination (push-broom) and a lateral-scanning orientation (side-view).

Figure 1. Details of the sensors on the laboratory platform (vertical LiDAR and light-beam sensors) for the detection and structure of the modular 3D plant.

Table 2. LMS 111 technical data.

Operational range: from 0.5 to 20 m
Scanning field of view: 270°
Scanning frequency: 50 Hz
Angular resolution: 0.5°
Light source: 905 nm
Enclosure rating: IP 67

2.1.3. RGB-D Camera

A Kinect V2 commercial sensor (Microsoft, Redmond, WA, USA), originally designed for indoor video games, was mounted sideways on the research platform during the field trials. This sensor captured RGB, NIR and depth images (based on time-of-flight) of tomato plants, although for further analysis, only the RGB images were used for the validation of the stick/tomato locations obtained from the LiDAR scans, as detailed in Section 2.4.3. Kinect RGB images have a resolution of 1920 × 1080 pixels and a field of view (FOV) of 84.1 × 53.8°, resulting in an average of approximately 22 × 20 pixels per degree. The NIR and depth images have a resolution of 512 × 424 pixels, with an FOV of 70 × 60° and a depth-sensing maximum distance of 4.5–5 m. Although systems such as the Kinect sensor were primarily designed for use under controlled light conditions, the second version of this sensor has a higher RGB resolution (640 × 480 in v1), and its infrared sensing capabilities were also improved, enabling a more lighting-independent view and supporting its use outdoors under high-illumination conditions. Despite this improvement, we observed that the quality of the RGB images was somewhat affected by luminosity and direct incident light; therefore, the images must be post-processed to obtain usable results.
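As a quick check of the angular resolution quoted above, dividing the RGB resolution by the FOV reproduces the approximate pixels-per-degree figure (a two-line illustrative computation):

```matlab
% Approximate angular resolution of the Kinect V2 RGB stream.
pixPerDeg = [1920 1080] ./ [84.1 53.8];   % ~[22.8 20.1] pixels per degree
```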
The images taken by the Kinect sensor were simultaneously acquired and synchronized with the LiDAR scans and the encoder pulses. Because the LabVIEW software (National Instruments, Austin, TX, USA) used for obtaining the scan data was developed to collect three items (the scans themselves, the encoder pulses and the timestamp), specific Kinect recording software had to be developed to embed the timestamp value in the image data. With the same timestamp available for both the LiDAR and the image, the data could be matched and the images used to provide information about the forward movement of the platform.
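A minimal sketch of this timestamp-based matching is given below. The file names and column layout are assumptions for illustration (the actual LabVIEW and Kinect recording formats are not reproduced here); the principle is simply a nearest-timestamp lookup followed by conversion of the encoder count to a distance.

```matlab
% Match each Kinect image to the LiDAR/encoder record closest in time and
% place it along the direction of travel. File layout is illustrative.
scanLog = readtable('lidar_log.txt');    % assumed columns: t (s), encoder (pulses)
imgLog  = readtable('kinect_log.txt');   % assumed columns: t (s), filename

pulseToMm = 0.98;                        % field conversion factor (mm/pulse, Section 2.4.1)
imgX = zeros(height(imgLog), 1);
for k = 1:height(imgLog)
    [~, idx] = min(abs(scanLog.t - imgLog.t(k)));  % nearest timestamp
    imgX(k)  = scanLog.encoder(idx) * pulseToMm;   % image position (mm)
end
```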
2.2. Lab Platform Design and Tests

To maximize the accuracy of the distance measurements obtained by the sensors, an experimental platform was designed to avoid the seasonal limitations of testing outdoors. Instead of working in a laboratory with real plants, the team designed and created model plants (see Figure 1) using a 3D printer (Prusa i3, BQ, Madrid, Spain). These plants were mounted on a conveyor chain at a predetermined distance. This conveyor chain system, similar to that of a bicycle, was driven by a small electric motor able to move the belt at a constant speed of 1.35 km·h−1. For the odometry system, the shaft of an incremental optical encoder (63R256, Grayhill Inc., Chicago, IL, USA) was attached directly to the gear shaft and used to measure the distance travelled, thus serving as a localization reference system. Each channel in this encoder generates 256 pulses per revolution, providing a 3-mm resolution in the direction of travel. The data generated by the light-beam sensors and the cumulative odometer pulse count were collected using a low-cost, open-hardware Arduino Leonardo microcontroller (Arduino Project, Ivrea, Italy) programmed in a simple integrated development environment (IDE). This device enabled the recording of data, which were stored in a text file for further computer analysis. Several repetitions of the tests were made on the platform to optimize the functions of both the light-beam and LiDAR sensors. From the three possible LiDAR orientations, lateral scanning was selected for the field trials because it provided the best information on the structure of the plant, as concluded in [17]. In the lab tests, two arrangements of light-beam sensors were assessed: one in a light curtain assembly with three sensor pairs at different heights and another using only one emitter-receiver pair.

2.3. Field Tests

The initial tests, performed in Davis, CA (USA), were used to assess the setup of the light-beam sensor system, which detected only the stem of the plants rather than locating it within a local reference system. Once the tomato plants were placed in the field, tests were conducted at the Western Center for Agriculture Equipment (WCAE) at the University of California, Davis campus farm to evaluate the performance of the sensor platform for measuring row crop spacing. For this test, an implement was designed to house the sensors as follows.
The same IR light-beam sensor and encoder, both described in Section 2.1, were used (Figure 2). The output signals of the sensors were connected to a bidirectional digital module (NI 9403, National Instruments Co., Austin, TX, USA), while the encoder signal was connected to a digital input module (NI 9411, National Instruments Co.). Both modules were integrated into an NI cRIO-9004 (National Instruments Co.), and all data were recorded using LabVIEW (National Instruments Co.). In these early field trials, the team worked on three lines of a small plot of land 20 m in length, where the methodology for detecting the plants within a crop line was tested.

Figure 2. (a) Light-beam sensors mounted on the experimental platform designed for field trials at UC Davis, California; (b) Progressive monitoring flowchart using light-beam sensors.

To continue the study of plant localization in a different scenario, additional experiments were designed at the University of Seville, in which a refinement of the LiDAR sensors and data processing was performed. These tests were conducted on several lines of tomato plants manually transplanted from trays, with the plants placed at an approximate, though intentionally non-uniform, spacing of 30 cm. Two of these lines were analysed further, one with 55 tomato plants and the other with 51, and a line of 19 wooden sticks was also placed to provide an initial calibration of the instruments. Due to the initial test conditions, where the tomato plants had been recently transplanted and had a height of less than 20 cm, the team built an inverted U-shaped platform attached to the front of a small tractor (Boomer 35, New Holland, New Holland, PA, USA; Figure 3). The choice of the small tractor was motivated by the width of its track, as the wheels of the tractor needed to fit on the sides of the tomato bed, leaving the row of tomatoes clear for scanning and sensors.
Figure 3. (a) Structure housing the sensors mounted on the tractor (left) and detail of the LiDAR and Kinect setup (right) for field trials at the University of Seville; (b) Progressive monitoring flowchart using LiDAR and Kinect sensors.

As was done on the laboratory platform, the encoder described in Section 2.1 was used as an odometric system, this time interfaced with an unpowered ground wheel, to determine the instantaneous location of the platform along the row.

During the tests, the platform presented several key points that were addressed: (i) the encoder proved to be sensitive to vibrations and sudden movements, so it was integrated into the axis of rotation of an additional wheel, welded to the structure and dampened as much as possible from the vibrations generated by the tractor; in addition, the team had to reduce the slippage of the encoder wheel on the ground to avoid losing pulses; (ii) the correct orientation of the sensors was also key, because the mounting angles of the LiDAR sensors would condition the subsequent analysis of the scan data and the determination of which data contributed the most valuable information; (iii) the speed of the tractor had to be as low as possible and remain uniform (during the test the average speed was 0.36 m/s), while maintaining a steady course without steering wheel movements to follow a straight path.

2.4. Data-Processing Methodology

To precisely detect the plants and properly determine the distances between them in both the laboratory and field tests, the data provided by the sensors were merged and analysed.

2.4.1. Plant Characterization Using Light-Beam Sensors

The methodology followed to analyse the data obtained from the light-beam curtain (which was formed by three light-beam sensors in line) was similar to that described in [21]. The algorithm outputs the moment at which the beam was interrupted and associates the interruption with an encoder pulse. Because the 3D plants had a wider shape at the top (leaves) than at the bottom (stem), and therefore more interruptions were received, the algorithm had to be adapted to each sensor pair and each height for plant detection.
To discriminate correctly between plants in the light curtain case, the developed algorithm implemented a distance range, measured in pulses from the encoder, that allowed the verification of the presence or absence of a plant after the detection of the stem, inferring that interruptions received from the sensors placed at the middle and top heights before and after the stem corresponded to the leaves and the rest of the plant structure, respectively. For the analysis of the data obtained from the single pair of IR light-beam sensors, a Matlab routine (MATLAB R2015b, MathWorks, Inc., Natick, MA, USA) was developed. System calibration was performed using 11 artificial plants in the laboratory test and 122 real tomato plants in the UC Davis field test. The methodology used for the detection of tomato plants was based on the following steps:

1. Selection of values for the variables used by the programme for detection:

a) Pulse_distance_relation: This variable allowed us to convert the pulses generated by the encoder into the distances travelled by the platforms. In the laboratory trials, the encoder was coupled to the shaft that provided motion to the 3D plants, and in the field, it was coupled to a wheel installed inside the structure of the platform. The conversion factors used for the tests were 1.18 and 0.98 mm per pulse for the laboratory and the field, respectively.

b) Detection_filter: To eliminate possible erroneous detections, especially during field trials due to the interaction of leaves, branches and even weeds, the detections were first filtered. We filtered out every detection that corresponded to an along-track distance of less than 4 mm while the sensor was active (continuous detection).

c) Theoretical_plant_distance: The value for the theoretical distance between plants in a crop line. The values set during testing were 100 mm and 380 mm for the laboratory and the field, respectively.

d) Expected_plant_distance: The expected distance between plants in a crop line, defined as the theoretical plant distance plus an error of 20%.

2. Importing of the raw data recorded by the sensors (encoder value and existence "1" or absence "0" of detection by the IR sensors). The conversion factor (pulse_distance_relation) provided the distance in mm for each encoder value.

3. Data were filtered by removing all detections whose length or distance travelled, while the sensors were active, was less than the set value (detection_filter). Thus, potential candidates were selected by registering the following: a) the distance at the start of the detection; b) the distance at the end of the detection; c) the distance travelled during the detection; and d) the mean distance during the detection, which was considered the location of the stem of the plant.

4. Distance evaluation between the current candidate and the previous potential plant (a sketch of steps 2–4 is given after this list):

a) If the evaluated distance was greater than the set value (expected_plant_distance), we considered this candidate a potential new plant, registering in a new matrix: the number of the plant, the detections that defined it, the midpoint location and the distance from the previous potential plant.

b) If the evaluated distance was less than the set value (expected_plant_distance), the plant candidate data were added to the previous potential plant, recalculating all components for this potential plant. The new midpoint was considered the detection closest to the theoretical midpoint.
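A condensed MATLAB sketch of steps 2–4 follows. It assumes the raw log has been imported as two vectors, pulses (cumulative encoder counts) and beam (1 = beam interrupted, 0 = clear); the merge in step 4 is simplified here to an averaged midpoint rather than the detection closest to the theoretical midpoint.

```matlab
% Sketch of the light-beam detection routine (field parameter values).
pulse_distance_relation    = 0.98;   % mm per pulse
detection_filter           = 4;      % minimum detection length (mm)
theoretical_plant_distance = 380;    % mm
expected_plant_distance    = theoretical_plant_distance * 1.2;  % +20% error

x = pulses * pulse_distance_relation;        % along-track position (mm)

% Step 3: group consecutive interrupted samples into detections and
% remove those shorter than the filter value.
d   = diff([0; beam(:); 0]);
on  = find(d == 1);                          % detection start indices
off = find(d == -1) - 1;                     % detection end indices
keep = (x(off) - x(on)) >= detection_filter;
mid  = (x(on(keep)) + x(off(keep))) / 2;     % candidate stem locations

% Step 4: candidates closer than the expected spacing are folded into the
% previous potential plant; otherwise a new plant is registered.
plants = mid(1);
for k = 2:numel(mid)
    if mid(k) - plants(end) > expected_plant_distance
        plants(end+1) = mid(k);                   %#ok<AGROW> new plant
    else
        plants(end) = (plants(end) + mid(k)) / 2; % simplified midpoint update
    end
end
```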
2.4.2. Plant Characterization Using a Side-View LiDAR

For the analysis of the data obtained from the LiDAR, it is important to mention the high complexity of its data, in both volume and format, compared with the data obtained by the light-beam sensors. This is reflected in the following section, which explains the proposed methodology for obtaining both the aerial point clouds of the tomato rows, referenced to the encoder sensor, and the tomato plant identification. This is a prerequisite for tomato plant localization. For this purpose, it was necessary to pre-process the data, followed by a transformation and translation from the LiDAR sensor to the scanned point.

Pre-Processing of Data

(i) Data pre-processing was performed at the LiDAR sensor.

An off-line Matlab process was used with the actual field data collected during the field experiments. Data were filtered to eliminate false positives or points that did not contribute relevant information, considering only those detections with a distance greater than 0.05 m. Later, the resulting detections were transformed from polar to Cartesian coordinates using a horizontal-orientation coordinate system as a reference.

(ii) From the LiDAR sensor to the scanned point: transformations and data delimitation.

To transform the horizontal LiDAR coordinates to the actual LiDAR orientation (lateral in our case), the following steps were followed:

a) The starting points were the Cartesian coordinates obtained using the horizontal orientation as a reference (x′_point, y′_point, z′_point).

b) To integrate the scan from the LiDAR into the platform coordinate system, a transformation defined by the rotation angles (φ, θ, ψ) about the x, y and z axes was applied (see Equation (1)), considering the actual orientation of the LiDAR (see Table 3). Each LiDAR scanned point in the platform coordinate system (x_point, y_point, z_point) was obtained as:

$$
\begin{bmatrix} x_{\mathrm{point}} & y_{\mathrm{point}} & z_{\mathrm{point}} & 1 \end{bmatrix}
=
\begin{bmatrix} x'_{\mathrm{point}} & y'_{\mathrm{point}} & z'_{\mathrm{point}} & 1 \end{bmatrix}
\times
\begin{bmatrix}
1 & 0 & 0 & 0 \\
0 & \cos\varphi & \sin\varphi & 0 \\
0 & -\sin\varphi & \cos\varphi & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}
\times
\begin{bmatrix}
\cos\theta & 0 & -\sin\theta & 0 \\
0 & 1 & 0 & 0 \\
\sin\theta & 0 & \cos\theta & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}
\times
\begin{bmatrix}
\cos\psi & \sin\psi & 0 & 0 \\
-\sin\psi & \cos\psi & 0 & 0 \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}
\times
\begin{bmatrix}
1 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 \\
0 & 0 & 1 & 0 \\
Enc_t & 0 & 0 & 1
\end{bmatrix}
\qquad (1)
$$

Table 3. Transformation and translation values applied to LiDAR data with a lateral orientation.

Roll "φ" (°): 0
Pitch "θ" (°): −180
Yaw "ψ" (°): 0
x translation (m): Enct

Once transformed, the x translation was applied to the coordinates obtained for the actual LiDAR orientation. The encoder values recorded at each scan time were used to update the point cloud x coordinate related to the tractor advance. Additionally, the height values (z coordinate) were readjusted by subtracting the minimum obtained.
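The following MATLAB fragment sketches how Equation (1) and the x translation can be applied in practice. It assumes P is an N × 3 matrix of points (x′, y′, z′) from the polar-to-Cartesian conversion and Enct is the encoder-derived advance for the scan; with the row-vector convention of Equation (1), the 4 × 4 homogeneous product reduces to three 3 × 3 rotations plus an explicit translation.

```matlab
% Rotate horizontal-reference LiDAR points into the lateral mounting
% orientation (Table 3) and translate along x by the encoder reading.
phi = 0; theta = -180; psi = 0;      % roll, pitch, yaw (degrees)

Rx = [1 0 0; 0 cosd(phi) sind(phi); 0 -sind(phi) cosd(phi)];
Ry = [cosd(theta) 0 -sind(theta); 0 1 0; sind(theta) 0 cosd(theta)];
Rz = [cosd(psi) sind(psi) 0; -sind(psi) cosd(psi) 0; 0 0 1];

Ppoint = P * Rx * Ry * Rz;                     % Equation (1), row vectors
Ppoint(:,1) = Ppoint(:,1) + Enct;              % x translation (tractor advance)
Ppoint(:,3) = Ppoint(:,3) - min(Ppoint(:,3));  % readjust heights (z)
```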
Plant Localization

The 3D point cloud processing was performed for each stick or tomato row. Thus, using manual distance delimitation, the point clouds were limited to the data above the three seedbeds used during the tests.

(i) Aerial Point Cloud Extraction

The aerial point data cloud was extracted using a succession of pre-filters. First, all points that did not provide new information were removed using a gridding filter, reducing the size of the point cloud. A plane-fitting function was then applied to distinguish the aerial points from the ground points. In detail, the applied pre-filters were:

a) Gridding filter: returns a downsampled point cloud using a box grid filter. GridStep specifies the size of a 3D box; points within the same box are merged to a single point in the output (see Table 4).

b) pcfitplane [23]: this Matlab function fits a plane to a point cloud using the M-estimator SAmple Consensus (MSAC) algorithm, a variant of the RANdom SAmple Consensus (RANSAC) algorithm. The function inputs were: the distance threshold between a data point and the defined plane used to determine whether a point is an inlier, the reference orientation constraint and the maximum absolute angular distance. To perform plane (soil) detection and removal, the evaluations were conducted at every defined evaluation interval (see Table 4).

Table 4. Aerial point cloud extraction parameters selected.

Test: GridStep (m³); MSAC threshold (m); Theoretical distance between plants (m); Evaluation intervals (m); Reference vector; Maximum absolute angular distance (°)
Sticks: (3 × 3 × 3) × 10−9; 0.04; 0.240; 0.08; [0,0,1]; 5
Tomatoes 1: (3 × 3 × 3) × 10−9; 0.04; 0.290; 0.096; [0,0,1]; 5
Tomatoes 2: (3 × 3 × 3) × 10−9; 0.04; 0.290; 0.096; [0,0,1]; 5

Table 4 shows the parameter values chosen during the aerial point cloud extraction. The values were selected by trial and error, choosing those that yielded better results without losing much useful information.

(ii) Plant Identification and Localization

• Plant Clustering

A k-means clustering [24] was performed on the resulting aerial points to partition the point cloud data into individual plant point cloud data. The parameters used to perform the k-means clustering were as follows:

- An initial number of clusters: Floor((distance_travelled_mm/distance_between_plants_theoretical) + 1) × 2.
- The squared Euclidean distance measure for the cluster centroids, where each centroid is the mean of the points in the cluster.
- The k-means++ algorithm for choosing the initial cluster centre positions.
- The clustering was repeated five times, using the initial cluster centroid positions from the previous iteration.

A reduction in the number of clusters was determined directly by evaluating the cluster centroids: if a pair of centroids was closer than the min_distance_between_plants, the process was repeated with the number of clusters reduced by one (Table 5).

Table 5. Plant identification and stem identification parameters.

Minimum distance between plants: Distance_between_plants_theoretical × 0.2
Minimum cluster size: 5
Histogram jumps (mm): 4

A cluster size evaluation was then performed, excluding clusters smaller than min_cluster_size.

(iii) Plant Location

Three different plant locations were considered (a sketch of this pipeline is given after this list):

• Centre of the cluster.
• Location of the lowest point of each plant cluster.
• Intersection of the estimated stem line and the ground line:
  - By dividing the aerial plant data into slices defined by "histogram jumps", the x limits at the maximal number of counts were obtained. For the z limits, data belonging to the bottom half of the aerial point cloud were considered.
  - The remaining data inside these limits were used to obtain a line of best fit, which was considered the stem line.
  - The plant location was defined as the intersection between the stem line and the corresponding ground line obtained previously from the MSAC function.
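A minimal sketch of this point cloud pipeline, from the gridding filter to the cluster reduction, is shown below. It relies on the MATLAB Computer Vision and Statistics Toolboxes and uses the tomato parameter values of Tables 4 and 5; the five repetitions with carried-over centroids are approximated here by five k-means replicates.

```matlab
% Gridding filter, MSAC ground plane removal and k-means plant clustering
% (tomato test parameters; Ppoint comes from the transformation above).
pc = pointCloud(Ppoint);
pc = pcdownsample(pc, 'gridAverage', 3e-3);        % 3-mm box grid filter

[~, ~, outIdx] = pcfitplane(pc, 0.04, [0 0 1], 5); % MSAC soil plane
aerial = select(pc, outIdx);                       % keep above-ground points

spacing   = 0.290;                                 % theoretical spacing (m)
travelled = max(aerial.Location(:,1)) - min(aerial.Location(:,1));
k = (floor(travelled / spacing) + 1) * 2;          % initial cluster count

while k > 1
    [lbl, C] = kmeans(aerial.Location, k, 'Distance', 'sqeuclidean', ...
                      'Start', 'plus', 'Replicates', 5);
    if min(pdist(C)) < spacing * 0.2               % centroids too close:
        k = k - 1;                                 % repeat with one fewer cluster
    else
        break
    end
end
```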
2.4.3. Validation of Plant Location Using RGB Kinect Images

To obtain the distance between the stems of two consecutive plants using the Kinect camera, it is necessary to characterize the plants correctly and then locate the stems in the RGB images from the Kinect camera. This characterization and location of the stem was conducted as follows: a sequence of images of the entire path was obtained (~250 images in each repetition), where the camera's shooting frequency was fixed at 1-s intervals. Obtaining the string with the timestamp of each image was a key aspect of the routine developed in Matlab, as this string would later be used for integration with the LiDAR measurements. The relationship between the timestamp and its corresponding encoder value was used to spatially locate each image (x-axis, corresponding to the tractor advance).

Image processing techniques applied in the characterization of tomato plants from Kinect images generally followed these steps:

(i) According to [25], the first step in most works regarding image analysis is the pre-processing of the image. In this work, an automatic cropping of the original images was performed (Figure 4a),