
Regression Analysis in Medical Research for Starters and 2nd Levelers PDF

422 pages · 2018 · 28.933 MB · English

Preview: Regression Analysis in Medical Research for Starters and 2nd Levelers

Ton J. Cleophas · Aeilko H. Zwinderman
Regression Analysis in Medical Research for Starters and 2nd Levelers

Ton J. Cleophas
Department of Medicine
Albert Schweitzer Hospital
Sliedrecht, The Netherlands

Aeilko H. Zwinderman
Department of Epidemiology and Biostatistics
Academic Medical Center
Amsterdam, Noord-Holland, The Netherlands

Additional material to this book can be downloaded from http://extras.springer.com.

ISBN 978-3-319-71936-8
ISBN 978-3-319-71937-5 (eBook)
https://doi.org/10.1007/978-3-319-71937-5
Library of Congress Control Number: 2017963262
© Springer International Publishing AG 2018

Preface

The authors, as professors in statistics and machine learning at European universities, are worried that their students find regression analyses harder than any other methodology in statistics. This is serious, because almost all of the novel methodologies in current data mining and data analysis include elements of regression analysis. It is the main incentive for writing this 26-chapter edition, consisting of:

– over 26 major fields of regression analysis,
– their condensed maths,
– their applications in medical and health research as published so far,
– step-by-step analyses for self-assessment,
– conclusion and reference sections.

The edition is a fairly complete textbook and tutorial for medical and health-care students, as well as a recollection/update bench and help desk for professionals. Novel approaches to regression analyses already applied in published clinical research will be addressed: matrix analyses, alpha spending, gatekeeping, kriging, interval censored regressions, causality regressions, canonical regressions, quasi-likelihood regressions and novel non-parametric regressions. Each chapter can be studied as a stand-alone text and covers one of the many fields in the fast-growing world of regression analyses. Step-by-step analyses of over 40 data files, both hypothesized and real data, stored at extras.springer.com, are included for self-assessment purposes.

Traditional regression analysis is adequate for epidemiology but lacks the precision required for clinical investigations. However, in the past two decades, modern regression methods have proven to be much more precise. And so it is time that a book described regression analyses for clinicians. The current edition is the first to do so.

Sliedrecht, The Netherlands    Ton J. Cleophas
Amsterdam, NH, The Netherlands    Aeilko H. Zwinderman

Contents

1 Continuous Outcome Regressions
  1.1 Introduction
  1.2 Data Example
  1.3 Data Plot
  1.4 Defining the Intercept "a" and the Regression Coefficient "b" from the Regression Equation y = a + bx
  1.5 Correlation Coefficient (R) Varies Between −1 and +1
  1.6 Computing R, Intercept "a" and Regression Coefficient "b": Ordinary Least Squares and Matrix Algebra
  1.7 SPSS Statistical Software for Windows for Regression Analysis
  1.8 A Significantly Positive Correlation, X Significant Determinant of Y
  1.9 Simple Linear Regression Uses the Equation y = a + bx
  1.10 Multiple Regression with Three Variables Uses Another Equation
  1.11 Real Data Example
  1.12 SPSS Statistical Software for Windows for Regression Analysis
  1.13 Summary of Multiple Regression Analysis of 3 Variables
  1.14 Purposes of Multiple Linear Regression
  1.15 Multiple Regression with an Exploratory Purpose, First Purpose
  1.16 Multiple Regression for the Purpose of Increasing Precision, Second Purpose
  1.17 Multiple Regression for Adjusting Confounding, Third Purpose
  1.18 Multiple Regression for Adjusting Interaction, Fourth Purpose
  1.19 Conclusion

2 Dichotomous Outcome Regressions
  2.1 Introduction
  2.2 Logistic Regression
  2.3 Cox Regression
  2.4 Conclusion

3 Confirmative Regressions
  3.1 Confirmative Regressions
  3.2 High Performance Regression Analysis
  3.3 Example of a Multiple Linear Regression Analysis as Primary Analysis from a Controlled Trial
  3.4 Example of a Multiple Logistic Regression Analysis as Primary Analysis from a Controlled Trial
  3.5 Example of a Multiple Cox Regression Analysis as Primary Analysis from a Controlled Trial
  3.6 Conclusion

4 Dichotomous Regressions Other Than Logistic and Cox
  4.1 Introduction
  4.2 Binary Poisson Regression
  4.3 Negative Binomial Regression
  4.4 Probit Regression
  4.5 Tetrachoric Regression
    4.5.1 The Yule Approximation
    4.5.2 The Ulrich Approximation
  4.6 Quasi-Likelihood Regressions
  4.7 Conclusion

5 Polytomous Outcome Regressions
  5.1 Introduction
  5.2 Multinomial Regression
  5.3 Ordinal Regression
  5.4 Negative Binomial and Poisson Regressions
  5.5 Random Intercepts Regression
  5.6 Logit Loglinear Regression
  5.7 Hierarchical Loglinear Regression
  5.8 Conclusion

6 Time to Event Regressions Other Than Traditional Cox
  6.1 Introduction
  6.2 Cox with Time Dependent Predictors
  6.3 Segmented Cox
  6.4 Interval Censored Regressions
  6.5 Autocorrelations
  6.6 Polynomial Regressions
  6.7 Conclusion

7 Analysis of Variance
  7.1 Introduction
  7.2 Little Difference Between Anova and Regression Analysis
  7.3 Paired and Unpaired Anovas
  7.4 Conclusion

8 Repeated Outcomes Regression Methods
  8.1 Introduction
  8.2 Repeated Measures Analysis of Variance (Anova)
  8.3 Repeated Measures Anova Versus Ancova
  8.4 Repeated Measures Anova with Predictors
  8.5 Mixed Linear Model Analysis Without Random Interaction
  8.6 Mixed Linear Model with Random Interaction
  8.7 Doubly Repeated Measures Multivariate Anova
  8.8 Conclusion

9 Methodologies for Better Fit of Categorical Predictors
  9.1 Introduction
  9.2 Restructuring Categories into Multiple Binary Variables
  9.3 Variance Components Regressions
  9.4 Contrast Coefficients Regressions
  9.5 Conclusion

10 Laplace Regressions, Multi- instead of Mono-exponential Regressions
  10.1 Introduction
  10.2 Regression Analysis with Laplace Transformations with Due Respect to Those Clinical Pharmacologists Who Routinely Use It
  10.3 Laplace Transformations: How Does It Work
  10.4 Laplace Transformations and Pharmacokinetics
  10.5 Conclusion

11 Regressions for Making Extrapolations
  11.1 Introduction
  11.2 Kriging Regression
    11.2.1 Semi Variography
    11.2.2 Correlation Levels between Observed Places and Unobserved Ones
    11.2.3 The Correlation between the Known Places and the Place "?"
  11.3 Markov Regression
  11.4 Conclusion

12 Standardized Regression Coefficients
  12.1 Introduction
  12.2 Path Analysis
  12.3 Structural Equation Modeling
  12.4 Bayesian Networks
  12.5 Conclusion

13 Multivariate Analysis of Variance and Canonical Regression
  13.1 Introduction
  13.2 Multivariate Analysis of Variance (Manova)
  13.3 Canonical Regression
  13.4 Conclusion

14 More on Poisson Regressions
  14.1 Introduction
  14.2 Poisson Regression with Event Outcomes per Person per Period of Time
  14.3 Poisson Regression with Yes/No Event Outcomes per Population per Period of Time
  14.4 Poisson Regressions Routinely Adjusting Age and Sex Dependence, Intercept-Only Models
  14.5 Loglinear Models for Assessing Incident Rates with Varying Incident Risks
  14.6 Conclusion

15 Regression Trend Testing
  15.1 Introduction
  15.2 Linear Trend Testing of Continuous Data
  15.3 Linear Trend Testing of Discrete Data
  15.4 Conclusion

16 Optimal Scaling and Automatic Linear Regression
  16.1 Introduction
  16.2 Optimal Scaling with Discretization and Regularization Versus Traditional Linear Regression
  16.3 Automatic Regression for Maximizing Relationships
  16.4 Conclusion

17 Spline Regression Modeling
  17.1 Introduction
  17.2 Linear and the Simplest Nonlinear Models of the Polynomial Type
  17.3 Spline Modeling
  17.4 Conclusion

18 More on Nonlinear Regressions
  18.1 Introduction
  18.2 Testing for Linearity
  18.3 Logit and Probit Transformations
  18.4 "Trial and Error" Method, Box-Cox Transformation, ACE/AVAS Packages
  18.5 Sinusoidal Data with Polynomial Regressions
  18.6 Exponential Modeling
  18.7 Spline Modeling
  18.8 Loess Modeling
  18.9 Conclusion
  Appendix

19 Special Forms of Continuous Outcomes Regressions
  19.1 Introduction
  19.2 Kernel Regressions
  19.3 Gamma and Tweedie Regressions
  19.4 Robust Regressions
  19.5 Conclusion

20 Regressions for Quantitative Diagnostic Testing
  20.1 Introduction
  20.2 Example
  20.3 Deming Regression
  20.4 Passing-Bablok Regression
  20.5 Conclusion

21 Regressions, a Panacea or at Least a Widespread Help for Data Analyses
  21.1 Introduction
  21.2 How Regressions Help You Make Sense of the Effects of Small Changes in Experimental Settings
  21.3 How Regressions Can Assess the Sensitivity of Your Predictors
  21.4 How Regressions Can Be Used for Data with Multiple Categorical Outcome and Predictor Variables
  21.5 How Regressions Are Used for Assessing the Goodness of Novel Qualitative Diagnostic Tests
  21.6 How Regressions Can Help You Find Out About Data Subsets with Unusually Large Spread
  21.7 Conclusion

22 Regression Trees
  22.1 Introduction
  22.2 Data Example, Principles of Regression Trees
  22.3 Automated Entire Tree Regression from the LDL Cholesterol Example
  22.4 Conclusion

23 Regressions with Latent Variables
  23.1 Introduction
  23.2 Factor Analysis
  23.3 Partial Least Squares
  23.4 Discriminant Analysis
  23.5 Conclusion
  References

24 Partial Correlations
  24.1 Introduction
  24.2 Data Example
  24.3 Conclusion
  References

25 Functional Data Analysis I
  25.1 Introduction
  25.2 Principles of Principal Components Analysis and Optimal Scaling, a Brief Review
  25.3 Functional Data Analysis, Data Example
    25.3.1 Principal Components Analysis
    25.3.2 Optimal Scaling with Spline Smoothing
    25.3.3 Optimal Scaling with Spline Smoothing Including Regularized Regression Using either Ridge, Lasso, or Elastic Net Shrinkages
  25.4 Conclusion

26 Functional Data Analysis II
  26.1 Introduction
  26.2 Statistical Model
  26.3 An Example
  26.4 Applications in Medical and Health Research
  26.5 Conclusion
  References

References

Index

Chapter 1
Continuous Outcome Regressions
General Principles Regression Analysis I

Abstract  The current chapter reviews the general principles of the most popular regression models in a nonmathematical fashion.
First, simple and multiple linear regressions are explained as methods for making predictions about outcome variables, otherwise called dependent variables, from exposure variables, otherwise called independent variables. Second, additional purposes of regression analyses are addressed, including:

1. an exploratory purpose,
2. increasing precision,
3. adjusting for confounding,
4. adjusting for interaction.

Particular attention has been given to common-sense reasoning and more intuitive explanations of these rather complex statistical methodologies, rather than bloodless algebraic proofs of the methods.

Keywords  Linear regression · Multiple linear regression · Exploratory purpose · Precision · Confounding · Interaction

1.1 Introduction

The authors, as teachers in statistics at universities in France and the Netherlands, have witnessed that students find regression analysis harder than any other methodology in statistics. Medical and health-care students in particular rapidly get lost, because of the dependent data and covariances that must be accounted for all the time. The problem is that high-school algebra is familiar with equations like y = a + bx, but it never addresses equations like y = a + b1x1 + b2x2, let alone y = a + b1x1 + b2x2 + b1x1 · b2x2.

In the past 30 years the theoretical basis of regression analysis has changed little, but an important step, taken by the European Medicines Agency (EMA) last year, was the decision to include directives regarding baseline characteristics in the statistical analysis of controlled clinical trials. And regression ...
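To make the step from y = a + bx to the multiple-variable equations in the introduction above more concrete, below is a minimal sketch, not taken from the book (which works through SPSS, see Sect. 1.7), that estimates the coefficients with ordinary least squares in Python. The variable names x1, x2, y and the data values are invented purely for illustration.

```python
# Minimal illustrative sketch (not from the book): ordinary least squares
# estimates for y = a + b*x, y = a + b1*x1 + b2*x2, and the interaction
# model y = a + b1*x1 + b2*x2 + b3*(x1*x2). Data values are invented.
import numpy as np

# hypothetical data: two predictors and one continuous outcome
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([0.5, 1.5, 1.0, 2.5, 2.0, 3.0])
y  = np.array([2.1, 3.9, 5.2, 7.8, 8.9, 11.1])

# simple linear regression: y = a + b*x1
X_simple = np.column_stack([np.ones_like(x1), x1])           # intercept column + x1
coef_simple, *_ = np.linalg.lstsq(X_simple, y, rcond=None)   # [a, b]

# multiple linear regression: y = a + b1*x1 + b2*x2
X_multi = np.column_stack([np.ones_like(x1), x1, x2])
coef_multi, *_ = np.linalg.lstsq(X_multi, y, rcond=None)     # [a, b1, b2]

# multiple regression with an interaction term x1*x2
X_inter = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2])
coef_inter, *_ = np.linalg.lstsq(X_inter, y, rcond=None)     # [a, b1, b2, b3]

print("simple:      a = %.3f, b = %.3f" % tuple(coef_simple))
print("multiple:    a = %.3f, b1 = %.3f, b2 = %.3f" % tuple(coef_multi))
print("interaction: a = %.3f, b1 = %.3f, b2 = %.3f, b3 = %.3f" % tuple(coef_inter))
```

The least-squares solve above is the same matrix-algebra route that the table of contents announces for Sect. 1.6 (ordinary least squares and matrix algebra); in a point-and-click workflow the equivalent estimates come from a statistical package's linear regression module.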
