NASA Technical Reports Server (NTRS) 20150002731: Raven: An On-Orbit Relative Navigation Demonstration Using International Space Station Visiting Vehicles
AAS 15-111

RAVEN: AN ON-ORBIT RELATIVE NAVIGATION DEMONSTRATION USING INTERNATIONAL SPACE STATION VISITING VEHICLES

Matthew Strube∗, Ross Henry†, Eugene Skelton‡, John Van Eepoel§, Nat Gill¶, Reed McKenna‖

Since the last Hubble Servicing Mission five years ago, the Satellite Servicing Capabilities Office (SSCO) at the NASA Goddard Space Flight Center (GSFC) has been focusing on maturing the technologies necessary to robotically service orbiting legacy assets–spacecraft not necessarily designed for in-flight service. Raven, SSCO's next orbital experiment to the International Space Station (ISS), is a real-time autonomous relative navigation system that will mature the estimation algorithms required for rendezvous and proximity operations for a satellite-servicing mission. Raven will fly as a hosted payload as part of the Space Test Program's STP-H5 mission, which will be mounted on an external ExPRESS Logistics Carrier (ELC) and will image the many visiting vehicles arriving at and departing from the ISS as targets for observation. Raven will host multiple sensors: a visible camera with a variable field of view lens, a long-wave infrared camera, and a short-wave flash lidar. This sensor suite can be pointed via a two-axis gimbal to provide a wide field of regard to track the visiting vehicles as they make their approach. Various real-time vision processing algorithms will produce range, bearing, and six degree of freedom pose measurements that will be processed in a relative navigation filter to produce an optimal relative state estimate. In this overview paper, we will cover top-level requirements, experimental concept of operations, system design, and the status of Raven integration and test activities.

INTRODUCTION

The next generation of missions undertaken by the National Aeronautics and Space Administration (NASA) will require the ability to autonomously rendezvous and mate space vehicles in a variety of Earth orbits and interplanetary locations.
These future missions will be focused on exploring beyond Earth orbit with humans, building large-scale science observatories, returning samples from ancient asteroids or comets, and rescuing aging and ailing satellites. As such, NASA continues to invest in both commercial- and government-owned Autonomous Rendezvous and Capture (AR&C) technologies that will enable rendezvous and capture across a wide spectrum of applications: from capturing man-made objects to landing on primitive bodies; from navigating to cooperative targets to tracking natural features; from operating under ground control supervision to reacting autonomously to the dynamically changing environment.

Raven is NASA's newest flight experiment to develop and validate these new AR&C technologies. Specifically, Raven will demonstrate the necessary components—next-generation sensors; vision processing algorithms; and high-speed, space-rated avionics—of an autonomous navigation system. Over its two-year mission on the ISS, Raven will estimate in real time the relative navigation state of the various visiting vehicles (VVs) that come and go from the station each year: the Progress and Soyuz vehicles from Russia, the H-II Transfer Vehicle (HTV) from Japan, and the Cygnus and Dragon spacecraft from the United States. Over the course of Raven's nominal on-orbit lifespan, we expect to monitor approximately 50 individual rendezvous or departure trajectories.

∗Aerospace Engineer, NASA GSFC, Code 596, Greenbelt, MD 20771, USA.
†Optical Engineer, NASA GSFC, Code 551, Greenbelt, MD 20771, USA.
‡Aerospace Engineer, Lockheed Martin Space Systems Company, Greenbelt, MD 20771, USA.
§Aerospace Engineer, NASA GSFC, Code 591, Greenbelt, MD 20771, USA.
¶Aerospace Engineer, NASA GSFC, Code 596, Greenbelt, MD 20771, USA.
‖Software Designer, Solo Effects LLC, Logan, UT 84321, USA.

Raven is a subsystem of the ISS SpaceCube Experiment–Mini (ISEM) experiment which is being developed by GSFC. ISEM and Raven will fly aboard the Department of Defense (DoD) Space Test Program-Houston 5 (STP-H5), an ISS hosted payload, along with many other experiments from various agencies, universities, and NASA centers.
The STP-H5 payload will be delivered to the ISS from Kennedy Space Center (KSC) in a Dragon capsule during the launch of the SpaceX Commercial Resupply Service 10 (SpX-10) mission. STP-H5 will be placed on the Nadir-Inboard-Wake node of ELC-1 for its two-year mission. Figure 1 shows the Raven experiment and the full STP-H5 payload. Figure 2 shows the location of ELC-1 on ISS.

Figure 1. Computer-Aided Design (CAD) models of Raven (left) and Space Test Program-Houston 5 (STP-H5) (right). The Raven assembly is comprised of two main parts: the Raven core and the SpaceCube 2.0 electronics, which are mounted separately to the STP-H5 payload.

Figure 2. ELC-1 location on ISS (left) and Space Test Program-Houston 5 (STP-H5) and Raven location and orientation on ELC-1 (right).

Raven Extensibility

NASA's recent focus and investment in the AR&C arena has been in developing a common sensor specification that can support the variety of planned AR&C-related missions. In 2014, NASA completed an assessment of the various concepts of operations for multiple upcoming missions including the crewed and robotic segments of the Asteroid Redirect Mission (ARM), as well as deep-space sample return; satellite servicing; and entry, descent, and landing missions. The results of this assessment were that a suite of sensors consisting of varying numbers of visible cameras, infrared cameras, and 3D imaging lidars—each built to a common specification—could meet these missions' challenging requirements, without the expense of each mission developing unique hardware. Such an à la carte capability will allow the Agency to realize reduced non-recurring engineering costs on a per-sensor basis, while allowing for a multitude of different mission scenarios through a multitude of different sensor combinations. NASA acted on these findings through the 2014 Asteroid Redirect Mission Broad Agency Announcement (BAA),1 which is funding industry to build the aforementioned sensor components to the common specification goals laid out in Appendix B of the BAA.
In addition to using it to mature critical AR&C technologies for a satellite servicing mission concept, NASA is also planning to utilize the flexibility of Raven to verify the AR&C Common Sensor Specification paradigm. Raven uses the same sensor methodology as specified in the Common Specification, as well as high-performing, space-rated, reprogrammable processors that can host the myriad algorithms that are needed for these demanding future missions. By using Raven's reprogrammable platform, visiting researchers from the United States Government, universities, research development centers, and commercial companies can host their own algorithms on Raven and demonstrate the flexibility of the Common Sensor Specification.

The remainder of the paper will cover the Raven experiment in more detail, including discussions on key mission objectives, the concept of operations, sensing and avionics hardware, and the baseline flight software to support satellite servicing.

MISSION OBJECTIVES

Raven's technology demonstration objectives are three-fold:

1. Provide an orbital testbed for satellite-servicing related relative navigation algorithms and software. The role of the SSCO is to develop and mature, through on-orbit and ground demonstrations, the technologies required to robotically service legacy satellites in orbit. By flying the baseline servicing algorithm suite, Raven will provide on-orbit experience to the SSCO Guidance, Navigation, and Control (GN&C) team. Raven continues the technology maturation effort that was started with the Dextre Pointing Package (DPP) effort2 and the Argon ground demonstrations.3

2. Demonstrate that multiple rendezvous paradigms can be accomplished with a similar hardware suite. NASA is interested in using similar hardware to perform cooperative docking tasks as well as landing on a non-uniform asteroid. By utilizing sensors within the NASA common sensor suite and flying multiple algorithms, Raven will demonstrate relative navigation to a defined cooperative target and natural feature tracking of a non-cooperative vehicle.

3. Demonstrate an independent VV monitoring capability.
For each rendezvous by a VV to the ISS, the crew and ground operators get different amounts of information concerning the accuracy of the VV in following its nominal flight path. All of this information is provided to the crew and ground operators from the VV: there is no relative navigation solution that is independently sensed and computed by NASA. Additionally, if the necessary sensors for VV rendezvous are located on the station, the Agency can reduce cost by reusing hardware over multiple missions instead of having duplicate hardware on each vehicle, which may be disposed of after the resupply mission.

From these high-level objectives, NASA developed the following minimum success criteria for Raven:

1. Collect sensor data (raw visible, infrared, and lidar images) at distances within 100 meters at rates up to 1 Hz. These images, along with multiple ISS GN&C telemetry points, will allow engineers to compute a VV best-estimated relative trajectory on the ground. This trajectory will be the truth against which the realtime Raven navigation solution will be compared.

2. Collect rendezvous imagery for four different vehicles. By collecting imagery of multiple vehicles, ground operators can understand the sensitivity of the flight software to variations in the size and shape of servicing clients. One specific algorithm of interest builds a simplified math model of the client vehicle to be serviced; such a capability will allow a servicer to update an a priori model of the client vehicle with on-orbit data showing its current configuration.

3. Collect rendezvous imagery for at least five different approaches per vehicle. Engineers can use the VV trajectory dispersions, and the corresponding varying Raven navigation solution, to develop statistics on how well Raven will perform against similarly sized and shaped servicing clients. Additionally, even though Raven will operate during both approaches and departures, approaches are far more relevant to satellite servicing. The small corrective burns performed by the VV more accurately represent our servicing clients, as opposed to a departing VV freely drifting away from the station.

4. Demonstrate realtime vision processing solutions within 85 meters for three different vehicles. The vision processing algorithms are the workhorses of the relative navigation system being tested on Raven; they also are the most critical part of a non-cooperative navigation system that uses a natural-feature approach to determine relative position and attitude. By accomplishing this criterion, SSCO can validate that the hardware and software components in Raven are flexible enough to navigate to multiple vehicles without new hardware or software, something that has not been demonstrated before.

5. Demonstrate gimbal tracking of two different vehicles using navigation filter feedback. Unlike the others, demonstrating gimbal tracking will require a fully operational system including sensors, vision processing software, navigation filter, and closed-loop control logic for the gimbal. Demonstrating an end-to-end closed-loop, multi-sensor AR&C system is a significant improvement in the state of the art. Current systems can do some of these items, but not all.

CONCEPT OF OPERATIONS

Launch, Installation, and Checkout

After delivery to and check-out at Kennedy Space Center (KSC), Raven will be delivered to the ISS on SpX-10, along with the rest of STP-H5. SpaceX will install STP-H5 into the unpressurized trunk of its Dragon spacecraft, which will be sent to orbit via the SpaceX Falcon 9 launch vehicle. After the Dragon spacecraft autonomously rendezvouses with the ISS and the Space Station Remote Manipulator System (SSRMS) berths the vehicle to a port, the two-armed Special Purpose Dexterous Manipulator (SPDM) will remove STP-H5 from the Dragon trunk. Figure 3 shows both STP-H5 in the Dragon trunk and the robotic infrastructure present on ISS. STP-H5 will be attached to ELC-1 at the Flight Releasable Attachment Mechanism (FRAM)-8 site, after which the ELC will provide its full complement of services4 to STP-H5.

Figure 3. Raven launch and installation activities: Space Test Program-Houston 5 (STP-H5) shown installed in the Dragon trunk (left) and robotic infrastructure of the International Space Station (ISS) (right).
Approximately one week after STP-H5 installation on the ELC, during which the STP-H5 operations team will check out the payload's command and data handling system, Raven operators will perform a complete checkout of the flight hardware. The avionics will be checked out to verify that all command, telemetry, and data paths are working. Next, ground operators will send the command to release the launch locks, which will allow the gimbal to move the sensors to where they can see the Earth, ISS structure, already docked or berthed visiting vehicles, and a cooperative target placed on STP-H5. Operators will use these scenes to perform a functional checkout of the sensors, measure the inter-sensor alignment, and calibrate feedback potentiometers in the gimbal, all in preparation for Raven science operations.

Science Operations

Having an AR&C demonstration on the ISS offers a unique opportunity to view the many VVs coming to the station every year. Given its location underneath the station (on the nadir side), Raven can observe all of the ISS VVs starting at 1 km away all the way to docking or berthing at a separation distance of approximately 20 meters. The list of the different VVs flying to ISS during Raven's two-year mission is shown in Table 1. The ISS docking/undocking attitudes are parameterized by which ISS body axis is aligned with the velocity vector: for example, +XVV, read as "+X Velocity Vector," denotes a configuration where the positive X axis and velocity vectors are aligned. These different docking attitudes are important to Raven because the effective field of regard in the Local Vertical, Local Horizontal (LVLH) coordinate system will change as shown in Figure 4. Table 1 also shows whether the approach trajectory is from the ISS radial (R-bar) or velocity (V-bar) directions, as well as the ISS port locations a VV is compatible with. From the ELC-1 vantage point, the distance between Raven and the different port locations listed in Table 1 is between 20 and 30 meters.

Vehicle  | ISS Docking Attitude                    | ISS Undocking Attitude                   | Trajectory | Port Locations                         | Expected # During 2-yr Raven Lifetime
---------|-----------------------------------------|------------------------------------------|------------|----------------------------------------|--------------------------------------
Dragon   | +XVV +Z-Nadir                           | +XVV +Z-Nadir                            | R-bar      | N2 Nadir                               | 2
Cygnus   | +XVV +Z-Nadir                           | +XVV +Z-Nadir                            | R-bar      | N2 Nadir                               | 2
Soyuz    | -XVV +Z-Nadir (+XVV +Z-Nadir sometimes) | +/-ZVV -X-Nadir for zenith & nadir ports | V-bar      | MLM/RS Node/SM Nadir or MRM1/FGB Nadir | 7
Progress | -XVV +Z-Nadir (+XVV +Z-Nadir sometimes) | +/-ZVV -X-Nadir for zenith & nadir ports | V-bar      | MRM2/SM Zenith or SM AU                | 8
HTV      | +XVV +Z-Nadir                           | +XVV +Z-Nadir                            | R-bar      | N2 Nadir                               | 1
USCV     | TBD                                     | TBD                                      | TBD        | N2 Fwd/PMA2                            | 1
USTV     | TBD                                     | TBD                                      | TBD        | N2 Nadir or N1 Nadir                   | 8
TV       | TBD                                     | TBD                                      | TBD        | N2 Nadir                               | 1

Table 1. Visiting vehicles expected during Raven operations: ISS docking and undocking attitudes, approach trajectory, compatible port locations, and expected number of visits during Raven's two-year lifetime. Based on an early April version of the ISS Long Term Schedule, approximately 30 vehicles will visit the ISS during Raven's lifetime, with an average time between observation opportunities of approximately 14 days (σ of approximately 8 days).

Figure 4. Different ISS attitude and visiting vehicle (VV) approach vectors overlaid with Raven's operational field of regard.

Raven science operations typically start a day prior to the arrival of the VV to ISS. The operations team will power up the Raven processor and perform a standard checkout of the system approximately T-27 hours before a VV berthing takes place. Immediately following power up, operators will upload new software or firmware to the system. These updates provide two main functions: incorporating lessons learned from previous approaches and changing VV model and trajectory files.
At T-3 hours, the two-axis gimbal is powered up and verified. Shortly afterward, the three rendezvous sensors are powered up and calibrated. Part of this rendezvous sensor calibration uses ISS structure and dedicated targets placed on STP-H5 to verify inter-sensor alignments. At T-1 hour, the Inertial Measurement Unit (IMU) is calibrated through a scripted sequence of gimbal commands, and the sensors are open-loop pointed in the general direction from which the VV will approach the ISS. At this point, Raven is ready for science operations.

Acquisition

Once the sensors are oriented toward the VV, the vision processing algorithms can acquire the vehicle. Nominally, VV acquisition will occur once the vehicle is within 1 km of the ISS. Image telemetry will be evaluated in real time to determine if the target is in the optical sensors' Field of View (FOV); if not, the gimbal pointing can be adjusted from the ground if necessary. For target locations that are uncertain, a raster scan of a wide field of regard can be used to acquire the target. If necessary, ground operators can intervene to keep the target in the FOV until it is acquired by the vision processing algorithms using different range and bearing methods.

Tracking

Once the vehicle has been acquired, Raven will nominally track a vehicle—keeping the sensors pointed at the vehicle—until the vehicle berths or docks with the ISS. There are several approaches that Raven can use to track a vehicle, each more autonomous than the last. Over numerous rendezvous, Raven will progress through each, starting with a hands-on approach and finishing with an autonomous one. The most basic uses the Teleop mode, where Raven receives desired joint angle commands from the Raven ground terminal, with the operator determining desired pointing from realtime image telemetry. Next is Path Tracking mode, which uses scripted pointing commands uploaded before the science interval to change desired optical sensor pointing. An onboard IMU can be either enabled or disabled to make this mode closed or open loop, respectively.
Raven also has an autonomous mode, Visual Tracking mode, which is used in concert with the relative navigation solution to automatically control the optical sensors' pointing relative to the target.

Transition

At the farther ranges, the onboard vision processing algorithms provide only relative position measurements to the VV. For the cameras, two degree of freedom bearing measurements are provided; for the lidar, both a range and a bearing measurement are provided. At closer ranges, the algorithms calculate a full relative position and attitude vector (pose) to the VV. The transition from range/bearing-only mode to full pose mode usually occurs when the subtended angle of the vehicle is ≥30% of the sensor's FOV. Raven's onboard relative navigation filter will automatically incorporate these pose solutions into the estimate as they become available.

Characterization

During intervals where VVs are not conducting proximity operations, Raven will still be conducting limited operations. Ground operators can point the sensors to predetermined locations on the ISS and, if present, a berthed VV, to collect imagery and continue testing the vision processing algorithms. Additionally, the ISS and other externally attached payloads are interested in IMU measurements of the ELC environment. The disturbance environment is a large contributor to the complexity of external payloads, which often require secondary pointing mechanisms to provide stable science platforms. Currently, ISS has externally mounted accelerometers along the truss to measure position disturbances, but not the attitude disturbances. The Raven IMU will be able to provide these correlated attitude and position disturbance measurements to the community. There will be dedicated intervals during characterization and throughout the Raven mission to collect IMU data and correlate against ISS crew activities.
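As a rough illustration of the ≥30% transition rule described in the Transition section, the full angle subtended by a vehicle of characteristic size s at range r is 2·atan(s/2r). The sketch below is illustrative only; the vehicle size, FOV value, and use of a single characteristic dimension are assumptions for the example, not Raven flight parameters.

```python
import math

def subtended_angle_deg(vehicle_size_m: float, range_m: float) -> float:
    """Full angle subtended by a vehicle of the given characteristic size."""
    return math.degrees(2.0 * math.atan(vehicle_size_m / (2.0 * range_m)))

def use_pose_mode(vehicle_size_m: float, range_m: float, fov_deg: float) -> bool:
    """Switch from range/bearing-only to full pose once the vehicle
    subtends at least 30% of the sensor field of view."""
    return subtended_angle_deg(vehicle_size_m, range_m) >= 0.30 * fov_deg

# Illustrative numbers: a ~4 m vehicle viewed with a 16 deg FOV camera.
for r in (500.0, 100.0, 40.0):
    print(r, subtended_angle_deg(4.0, r), use_pose_mode(4.0, r, 16.0))
```

With these assumed numbers the pose transition occurs somewhere inside roughly 50 m, which is consistent in spirit with the paper's 85 m / 100 m success-criteria ranges for larger vehicles and wider zoom settings.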
HARDWARE DESCRIPTION

At its core, Raven is comprised of two main hardware groups: the Sensor Enclosure and the Gimbal Baseplate Assembly. Given the tight volume constraints of accommodating the many payloads on STP-H5, the Raven SpaceCube electronics are not co-located with Raven's core and are instead mounted to the bunker lid next to the rest of Raven. Figure 5 shows the Raven CAD model and the flight assembly.

Figure 5. Raven Computer-Aided Design (CAD) model (left) with hardware callouts and flight hardware (right) with ground support equipment (GSE) support stand under the Sensor Enclosure.

The Sensor Enclosure is a light-weight mechanical structure that contains the Raven sensors. The basic structure is six panels attached at the edges. Inside the Sensor Enclosure, the Raven Optical Bench, shown in Figure 6, sits on two side ledges built into the two side panels. This optical bench holds the visible and infrared cameras, along with an IMU, and provides a rigid surface to constrain inter-sensor alignment shifts during launch and on-orbit operations. The optical bench also provides thermal accommodations—heaters, thermostats, and thermistors—and volume to house connectors, connector brackets, and harnessing to support sensor operations. The Raven flash lidar is not mounted directly to the optical bench but instead to the front panel of the Sensor Enclosure. Alignment between the lidar and the optical bench is controlled through multiple alignment pins between the front panel, the two side panels, and the Optical Bench.

Figure 6. (Left) Picture of the flight Raven optical bench supported by mechanical ground support equipment. Three of the four Raven sensors can be seen: the VisCam (left, with black stray light baffle and red remove-before-flight hardware), the IRCam (center, with black lens and brushed aluminum body), and the Inertial Measurement Unit (IMU) (right, in brushed aluminum with alignment cube installed on the top). (Right) Raven flash lidar showing both transmit and receive optics. The mechanical interface is through the 16 fasteners shown on the front face.
The Raven VisCam is a visible wavelength camera that was originally manufactured for the Hubble Space Telescope (HST) Servicing Mission 4 (SM4) on STS-125. The camera saw operation during SM4 on the Space Shuttle Atlantis as part of GSFC's Relative Navigation Sensor (RNS) Demonstration Test Objective (DTO) in May 2009.5,6 The 28 Volt camera contains a Cypress Semiconductor IBIS5-B-1300 focal plane array that outputs a 1000 x 1000 pixel monochrome image over a dual data-strobed Low Voltage Differential Signaling (LVDS) physical interface. For Raven, this camera has been paired with a commercially available, radiation-tolerant 8-24 mm zoom lens that has been ruggedized for spaceflight by GSFC. This Motorized Zoom Lens (MZL) provides zoom and focus capabilities via two one-half-inch stepper motors. The adjustable iris on the commercial version of the lens has been replaced with a fixed f/4.0 aperture. The VisCam provides a 45° x 45° FOV at the 8 mm lens setting and a 16° x 16° FOV at the 24 mm lens setting. The combination of the fixed aperture and the variable focal length and focus adjustments in the lens yields a depth of field from approximately four inches from the lens out to infinity. The VisCam assembly includes a stray light baffle coated with N-Science's ultra-black Deep Space Black coating. The stray light baffle protects the VisCam from unwanted optical artifacts that arise from the dynamic lighting conditions in Low Earth Orbit (LEO).

The Raven IRCam is a Long Wave Infrared (LWIR) camera sensitive in the 8-14 µm wavelength range and is a build-to-print version of the LWIR camera used in the TriDAR DTOs.7,8 It contains a DRS Technologies U6010 Vanadium Oxide (VOx) 640 x 480 pixel micro-bolometer array. The IRCam includes an internal shutter for on-orbit camera calibration and flat field correction. The camera operates via a Universal Serial Bus (USB) 2.0 interface that handles all power, video, command, and telemetry functions. The camera includes an athermalized, 50 mm f/1.0 lens that yields an 18° x 14° FOV.
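As a sanity check on the quoted VisCam fields of view, the full FOV of a lens of focal length f over an active array of width w is 2·atan(w/2f). The ~6.7 µm pixel pitch assumed below comes from the commercial IBIS5 part family and is not stated in this paper; with that assumption, a 1000-pixel window reproduces the quoted 45° and 16° values.

```python
import math

def fov_deg(active_width_mm: float, focal_length_mm: float) -> float:
    """Full field of view along one axis of an ideal (pinhole) camera."""
    return math.degrees(2.0 * math.atan(active_width_mm / (2.0 * focal_length_mm)))

# Assumed pixel pitch of ~6.7 um, so a 1000 x 1000 pixel window is ~6.7 mm wide.
width_mm = 1000 * 6.7e-3

print(fov_deg(width_mm, 8.0))   # wide end of the 8-24 mm MZL, ~45 deg
print(fov_deg(width_mm, 24.0))  # narrow end of the MZL, ~16 deg
```

The agreement with the paper's numbers suggests the quoted FOVs follow directly from the array size and the two zoom-limit focal lengths.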
The Raven Inertial Measurement Unit (IMU) is a commercially available STIM300 that has been screened for spaceflight by GSFC. It contains three Micro-Electro-Mechanical System (MEMS) gyroscopes, three MEMS accelerometers, and three (unused for Raven) inclinometers. The IMU also contains internal temperature sensors (one for each gyro, one for each accelerometer, and one for each inclinometer) which are used with calibration data from thermal chamber testing to compensate for temperature effects while in operation.

The Raven flash lidar, shown unintegrated in Figure 6, collects data by first illuminating the relative scene with a wide-beam 1572 nm laser pulse and then collecting the reflected light through a set of receive optics, which funnel the light onto a 256 x 256 focal plane array. By clocking the time between transmission and reception, the focal plane array can accurately measure the distance to the reflecting surface, as well as the return intensity, at a rate of up to 30 Hz. Additionally, the lidar utilizes a Liquid Crystal Variable Retarder (LCVR) which acts as an optical switch to provide either a 20° or 12° transmit FOV: this ability to toggle between two FOVs allows operators to maximize the receive signal-to-noise ratio across large separation distances. The lidar uses a variety of interfaces such as a 28 Volt power input, RS-422 command and telemetry links, and a set of LVDS data link pairs. The flash lidar is functionally similar to a unit that flew as part of the Sensor Test for Orion Rel-Nav Risk Mitigation (STORRM) DTO on STS-134 in May 2011.9

The second main subassembly in Raven is the Gimbal Baseplate Assembly, which is comprised of the baseplate, gimbal tower assembly, active/passive launch locks, and secondary support electronics. The baseplate serves as Raven's interface to STP-H5 and provides mounting accommodations for most of the Raven flight hardware.
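The time-of-flight principle behind the flash lidar's per-pixel range measurement described above reduces to range = c·Δt/2, since the clocked interval covers the round trip to the reflecting surface. A minimal sketch, with illustrative numbers only:

```python
# Time-of-flight ranging as used by a flash lidar: each focal plane array
# pixel clocks the delay between the transmitted pulse and its echo.
C = 299_792_458.0  # speed of light, m/s

def range_from_delay(delay_s: float) -> float:
    """Convert a measured round-trip delay to a one-way range in meters."""
    return C * delay_s / 2.0

def delay_from_range(range_m: float) -> float:
    """Round-trip delay expected for a surface at the given range."""
    return 2.0 * range_m / C

# A surface at 100 m returns its echo roughly 667 ns after the pulse,
# so centimeter-level ranging requires sub-nanosecond timing resolution.
print(delay_from_range(100.0) * 1e9)  # ~667 ns
print(range_from_delay(667e-9))       # ~100 m
```

This also illustrates why the LCVR's switchable transmit FOV matters: concentrating the fixed pulse energy into the narrower 12° beam raises the return signal at the long end of the separation range, where the echo is weakest.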
Thegimbaltowerassemblylocatedonthetopofthebaseplateisactuallytwopieces: the EnhancedDeploymentandPositioningMechanism(EDAPM)—aCommercialOff-The-Shelf(COTS)azimuth- over-elevationgimbalwhichpointstheSensorEnclosureoverawidefieldofregard—andthegimbaltower— a mechanical interface adapter between the gimbal and the baseplate. Also on the top of the baseplate are launchlocktiedowns—usedtosecurethegimbalandSensorEnclosureduringlaunch—andthermalaccom- modationstoprotectthesecondaryhardwaremountedtothebottomofthebaseplate. Mountedunderneath the baseplate are the Enhanced Motor Power Drive (EMPD) and the Auxiliary Electronics Box (AEB). The EMPDwillcontrolboththeazimuthandelevationmotorsontheEDAPMaswellasthezoomandfocusmotors ontheVisCamMZL. TheAEBisacatch-allelectronicsboxthattiestogetherthevariouselectricalsystemsof Raven: itcontainsvoltageconvertersforthevariouspiecesofhardware,currentinrushlimiters,andlaunch lockpowerandcontrolservices. RavenusesaSpaceCube2.010 EngineeringModelasitsprimarycommandanddatahandlingprocessor. The same processor design flew aboard Space Test Program-Houston 4 (STP-H4) as part of the ISS Space- Cube2.0experiment. Figure7showsthetopandbottomsidesoftheboardwiththethreeXilinxVirtex-5 Field Programmable Gate Arrays (FPGAs) used in the Engineering Model architecture. One FPGA is the main board controller, providing interfaces to radiation hardened static random-access memory (SRAM), double data rate (DDR) Synchronous Dynamic Random Access Memory (SDRAM), a custom 16-channel analog to digital convertor (ADC), two flash memory modules, and 10/100 Ethernet. The other two FPGAs hostapplication-specificcodeandarewheretheRavenflightsoftwareandVHSIChardwaredescriptionlan- 8 majority of the circuitry was constrained. Figure 3 shows the bottom side of the board in a test fixture. Figure 4 - Flight Processor Card, Top Side guage(VHDL)willrun. EachoftheseFPGAshastwoDDRmemorymodules,oneflashmodules,andavariety ofexternalinterfaces. 
TheSpaceCube2.0architectureincludestheabilitytoreprogramthetwoapplication- spec ific FPGAs;11 Raven developers plan to use this capability to implement a variety of new software and VHDLmaojvoreitryt hofe tlhief eciorcfutihtrey ewxaps ecroinmsteranint.ed. Figure 3 shows the bottom side of the board in a test fixture. Figure 2 - Processor Engineering Model, Top Side Figure 5 - Flight Processor Card, Bottom Side The architecture selected consists of a radiation hardened Figure 4 - Flight Processor Card, Top Side Aeroflex FPGA, two Xilinx Virtex-5 FPGAs, various memory modules, multiple gigabit and LVDS/RS422 front- panel and backplane interconnects, 10/100 Ethernet, 16 analog channels, and secondary power circuitry that is cutting edge for space flight. A high-level diagram is shown in Figure 6. The Xilinx footprint handles either the 1752 Figure 2 - Processor Engineering Model, Top Side Figure 3 - Processor Engineering Model, Bottom Side CGA space grade device or the 1738 BGA commercial Figure7. Ravenflightsciencedataprocessor: basedontheSpaceCube2.0architecture. grade device. The commercial grade XC5VFX130T B. Flight Card Overview contains two embedded PowerPC440 processors. The SpaceCube v2.0 flight processor card design leverages design experience from all prior SpaceCube processor BASELINEPROCESSINGDESCRIPTION versions. The overall processing architecture strikes a perfect balance between computational potential, memory Figure 5 - Flight Processor Card, Bottom Side TheRavenhybridcomputingsystemfeaturesamixaonfdh ianrtderwcoanrnee-catc creesloeurracteesd, finercmeswsaarrye scuoprpeosrta nfedateumrebs efdord ed softwarelibrariesthatareeachdesignedtorunoptimarlelyliaoTbnhileiS tyap,r cabhcoietaeCrcdtuu brseiez ’esse, lmeacnutdeld t pi-coFowPnesGrisA. ts aMorfca hnai ytr ealcdeitsaustorionens. 
The basic building blocks are VHDL cores that perform all of the interface logic to the flight components—gimbal, sensors, auxiliary electronics, etc. Flight software is executed on multiple PowerPC processors, each running a custom real-time Linux operating system. Device drivers bridge the gap between the hard real-time operations of the VHDL cores and the soft real-time processing of flight software. Inter-processor communication allows software processes to synchronize operations with each other across the multiple hardware processors.

Raven's GN&C processing utilizes a mix of firmware and software components across two embedded PowerPC440 processors to maximize overall system throughput. The high-speed vision processing algorithms are implemented in firmware to leverage the massively parallel processing capability of FPGAs, while more mathematically intense algorithms, such as the navigation filter, run as software processes on the embedded processors. Each vision processing algorithm will generate either a bearing measurement or a full six-state pose measurement depending on the range to the vehicle. The lidar algorithm will additionally provide a direct range measurement during bearing mode operations.

Raven's GN&C-related sensors—the IMU and gimbal potentiometers—are processed using the Sensor Processing (SP) application. Using raw potentiometer data,5 the SP application produces position and rate estimates for both the azimuth and elevation joints. Additionally, the SP application generates the field of view and focus estimates for the optical zoom lens and forwards IMU data to the other GN&C applications.

Goddard Natural Feature Image Recognition (GNFIR)

GNFIR is a natural feature image recognition application developed initially for the RNS experiment on HST SM4.5,6 It requires no fiducials on the target vehicle and utilizes a model of the prominent features to track. The algorithm first reduces the imagery to edges through standard edge detection, then projects the model into the image plane using the previous pose measurement, searches for nearby edges, and then performs a least squares minimization of the search results to arrive at a pose measurement.12 Figure 9 shows a simulated image of a Dragon rendezvous overlaid with the GNFIR algorithm's tracking detail. Figure 10 shows GNFIR working with infrared camera imagery. The figure also highlights recent updates to GNFIR to track objects in images with cluttered backgrounds.

Figure 8.
Raven's Guidance, Navigation, and Control (GN&C) software architecture

Flash Pose (FPose)

FPose utilizes the flash lidar range and intensity point clouds to develop a pose measurement at close range and a range and bearing measurement at farther distances. The algorithm leverages standard Iterative Closest Point (ICP) methods to match a model to the point cloud data13 and then converts the least squares minimization result to a pose measurement using the Horn algorithm.14 Figure 11 shows simulated lidar imagery of a Dragon rendezvous and the FPose algorithm tracking the vehicle.

Navigation Filter

The next step in the processing pipeline is to fuse the bearing and pose measurements from GNFIR-Vis, GNFIR-IR, and FPose, along with measurements of the Raven joint angles and rates, into a state estimate of the vehicle. The Raven navigation filter is built off the standard extended Kalman filter architecture.15 The filter performs the state and measurement updates at 1 Hz, while an output processing loop generates gimbal pointing input at a higher 25 Hz rate, propagating the state in between measurement updates. Two different dynamic formulations are implemented in the filter—a simple kinematic force model and a Clohessy-Wiltshire dynamic model—and ground operators can choose between them based on the information currently available. Initially, operators will use the simpler kinematic force model because it does not rely on external, imprecise information, such as Raven's location relative to the ISS center of mass. The filter solution is used onboard exclusively to provide the desired pointing vector for the Visual Tracking mode and is transmitted to the ground operators for evaluation. To increase processing speed, the Raven filter also implements a UDU formulation—in which the covariance matrix is factored into unit upper triangular and diagonal matrices—given the large number of pose bias states included in the filter state vector.16 The filter also incorporates measurement editing using underweighting techniques and supports consider states using the Schmidt-Kalman consider covariance.17

PROGRAM STATUS

As of January 2015, most of the Raven flight hardware has been integrated.
The Sensor Enclosure and Gimbal Baseplate Assembly have been mechanically integrated and have successfully passed both azimuth and elevation range-of-motion tests. The Sensor Enclosure is completely electrically integrated, with all safe-to-mates between the sensors and the processing and power electronics complete. The flight SpaceCube electronics box is undergoing environmental testing in parallel with Raven core vibration testing. Figure 12 shows the remaining Raven testing activities and the ship date to STP-H5.
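For readers who want a concrete feel for the Clohessy-Wiltshire option described in the Navigation Filter section, the sketch below propagates a relative state with the textbook closed-form CW (Hill) solution. This is an illustrative implementation, not Raven flight code; the frame convention (x radial, y along-track, z cross-track), the function name, and the sample mean motion are assumptions made for the example.

```python
import math

def cw_propagate(state, n, t):
    """Propagate a relative state (x, y, z, vx, vy, vz) by t seconds using
    the closed-form Clohessy-Wiltshire solution, where n is the chief
    orbit's mean motion in rad/s. Axes: x radial, y along-track,
    z cross-track."""
    x0, y0, z0, vx0, vy0, vz0 = state
    s, c = math.sin(n * t), math.cos(n * t)
    x = (4 - 3 * c) * x0 + (s / n) * vx0 + (2 / n) * (1 - c) * vy0
    y = 6 * (s - n * t) * x0 + y0 + (2 / n) * (c - 1) * vx0 \
        + ((4 * s - 3 * n * t) / n) * vy0
    z = c * z0 + (s / n) * vz0
    vx = 3 * n * s * x0 + c * vx0 + 2 * s * vy0
    vy = 6 * n * (c - 1) * x0 - 2 * s * vx0 + (4 * c - 3) * vy0
    vz = -n * s * z0 + c * vz0
    return (x, y, z, vx, vy, vz)
```

A convenient sanity check: at the ISS altitude the mean motion is roughly 0.00113 rad/s, and a pure along-track offset with zero relative velocity (x = z = 0, vx = vy = vz = 0) is an equilibrium of these equations, so the state should remain unchanged under propagation.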