
Automated Usage Tracing and Analysis PDF

119 Pages·2013·4.49 MB·English

Automated Usage Tracing and Analysis: a comparison with web survey
Master of Science Thesis in Software Engineering

MIKAEL BOLLE
EMIL BACKLUND

Chalmers University of Technology
University of Gothenburg
Department of Computer Science and Engineering
Gothenburg, Sweden, September 2013

The Author grants to Chalmers University of Technology and University of Gothenburg the non-exclusive right to publish the Work electronically and, for a non-commercial purpose, make it accessible on the Internet. The Author warrants that he/she is the author of the Work, and warrants that the Work does not contain text, pictures or other material that violates copyright law.

The Author shall, when transferring the rights of the Work to a third party (for example a publisher or a company), inform the third party about this agreement. If the Author has signed a copyright agreement with a third party regarding the Work, the Author warrants hereby that he/she has obtained any necessary permission from this third party to let Chalmers University of Technology and University of Gothenburg store the Work electronically and make it accessible on the Internet.

© MIKAEL BOLLE, September 2013.
© EMIL BACKLUND, September 2013.

Examiner: MIROSLAW STARON

Chalmers University of Technology
University of Gothenburg
Department of Computer Science and Engineering
SE-412 96 Gothenburg
Sweden
Telephone +46 (0)31-772 1000

Department of Computer Science and Engineering
Gothenburg, Sweden, September 2013

Abstract

A challenge in making decisions on how to improve a software product is gaining knowledge of how end-users interact with it. One way of obtaining this knowledge is to ask the users through a web survey. Another approach is to trace what the users do and then run an analysis on the gathered data.
This paper presents an approach called Automated Usage Tracing and Analysis (AUTA) to automatically gather usage data for a software system with Aspect-Oriented Programming (AOP) and analyze the gathered data through data mining. A brainstorming workshop with the developers of the software system was used to define a set of questions that the data mining should answer. The questions were implemented in AUTA, and a web survey was conducted to compare the two methods. The comparison showed a resemblance between the results of AUTA and the web survey; however, the resemblance is not strong enough to conclude with certainty that AUTA can replace web surveys. It was also discovered that some questions identified in the workshop could not be answered with a web survey but could be answered with AUTA. The recommendation is therefore that AUTA and web surveys be used as complements to each other.

Keywords: AOP, Aspect-Oriented Programming, Data mining, Web survey, Correlation, Usage tracing

Acknowledgements

First of all, we would like to thank Matthias Tichy, our academic supervisor, and Carl Stein, our industrial supervisor; they have both been a great source of support and guidance throughout the process. We would also like to thank ATEA Global Services for providing the opportunity to conduct this study, and the employees who gave valuable input during the brainstorming workshop and took the time to answer our questions when needed. A big thanks also goes out to all participants who were test subjects in the task workshop. Finally, we would like to thank our friends and family for their support; this study would not have been possible without them.

Mikael Bolle & Emil Backlund, Göteborg 12/08/13

Glossary

Advice: The actions that are executed when a join-point is reached.
AOP: Aspect-Oriented Programming.
Aspect: A stand-alone module or class for implementing cross-cutting concerns.
Aspect language: The programming language for implementing aspects.
Aspect weaver: Combines the aspect language and the component language.
AUTA: Automated Usage Tracing and Analysis.
Bounce rate: A bounce occurs when a user navigates to a service and then on to another service within a matter of seconds. The bounce rate expresses how often this happens for a service, i.e. how often users bounce in and out of a page.
CA: Cluster Analysis.
Component language: The programming language for implementing the main concern of the system. It can be either a procedural or an object-oriented language; in this study it is an object-oriented language.
Join-point: A point in the execution where aspects are coordinated with the component language.
Pointcut: Defines which join-points an aspect should be applied to.
PostSharp: An Aspect-Oriented framework for .NET.
Proxy pattern: A design pattern where interaction with a class is passed through a proxy that implements the same interface.
SUMMON: Chalmers library search system.
UI: User interface.
UserControl: Class responsible for the presentation of a web page.
UX: User experience design.

Contents

1 Introduction
  1.1 Problem Statement
  1.2 Scope
  1.3 Related Work
  1.4 Case Description
  1.5 Method
  1.6 Outline
2 Fundamentals
  2.1 Aspect-Oriented Programming
  2.2 Data Mining
    2.2.1 Cluster Analysis
    2.2.2 Outlier Detection
  2.3 Surveys
    2.3.1 Web Surveys
3 Workshop
  3.1 Set Up
  3.2 Which parts of Accelerator need improved user experience?
  3.3 What would you like to know about the users' interaction with Accelerator?
  3.4 What is important in a mobile version of Accelerator, from the users' standpoint?
  3.5 Result
4 Question Analysis
  4.1 Breakdown Question 1
  4.2 Breakdown Question 2
  4.3 Breakdown Question 3
  4.4 Breakdown Question 4
  4.5 Breakdown Question 5
  4.6 Breakdown Question 6
  4.7 Breakdown Question 7
  4.8 Breakdown Question 8
5 Solution
  5.1 Usage Tracing
    5.1.1 Selection of Data Points
    5.1.2 Selection of Technology
    5.1.3 Implementation with AOP
    5.1.4 Example: User Action to Database Storage
  5.2 Data Mining
    5.2.1 Breakdown Question 1
    5.2.2 Breakdown Question 2
    5.2.3 Breakdown Question 3
    5.2.4 Breakdown Question 5
    5.2.5 Breakdown Question 6
    5.2.6 Breakdown Question 7
    5.2.7 Breakdown Question 8
6 Evaluation
  6.1 User Testing
    6.1.1 Subject Selection
    6.1.2 Task Design
    6.1.3 Execution
    6.1.4 Result
  6.2 Web Survey
    6.2.1 Design
    6.2.2 Result
  6.3 Correlation
    6.3.1 Breakdown Question 1
    6.3.2 Breakdown Question 3
    6.3.3 Breakdown Question 5
    6.3.4 Breakdown Question 7
7 Discussion
  7.1 Aspect-Oriented Programming for Usage Tracing
  7.2 Conducting a Workshop for Guidance of Implementation of Data Analysis
  7.3 Possible Replacement of Web Survey
    7.3.1 Data Mining
    7.3.2 User Testing
    7.3.3 Web Survey
    7.3.4 Correlation
  7.4 Threats to Validity
8 Conclusion
  8.1 Future Work
Bibliography
A Correlation
  A.1 Breakdown Question 1
  A.2 Breakdown Question 3
  A.3 Breakdown Question 5
  A.4 Breakdown Question 7
B Survey Questions
C Workshop Tasks
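The glossary's AOP terms (aspect, join-point, pointcut, advice) can be made concrete with a short sketch. The thesis implements its tracing with PostSharp aspects in C#; the Python decorator below is only an illustrative analogue, and every name in it is hypothetical: the decorator plays the role of an aspect whose advice runs at each method-call join-point, and applying it to a function acts like a pointcut selecting that join-point.

```python
# Illustrative analogue only: the thesis uses PostSharp/C#, not Python.
# The decorator is the "aspect", the wrapper body is the "advice", and
# each call to a decorated function is the "join-point" being traced.
import functools

trace_log = []  # hypothetical stand-in for AUTA's database storage

def usage_trace(func):
    """Advice: record every call to the wrapped function."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        trace_log.append({"action": func.__name__, "args": args})
        return func(*args, **kwargs)
    return wrapper

@usage_trace  # applying the decorator acts like a pointcut
def open_service(name):
    return f"opened {name}"

open_service("Dashboard")
open_service("Reports")
```

The appeal of this style, as in AOP generally, is that the tracing concern stays in one place instead of being scattered through every traced method.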
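The bounce-rate definition in the glossary can likewise be sketched as a computation over navigation events. This is a minimal sketch under stated assumptions: the excerpt does not specify the time threshold that counts as "a matter of seconds", so the 5-second value below is invented for illustration, as is the event format.

```python
# Hypothetical sketch: bounce rate from (service, timestamp) navigation
# events. A visit to `service` counts as a bounce if the user navigates
# on within BOUNCE_THRESHOLD_S seconds (an assumed value).
BOUNCE_THRESHOLD_S = 5.0

def bounce_rate(events, service):
    """Fraction of visits to `service` left within the threshold."""
    visits = bounces = 0
    # Pair each event with the next one; the final event has no
    # successor and therefore cannot be classified, so it is skipped.
    for (svc, t), (_, t_next) in zip(events, events[1:]):
        if svc == service:
            visits += 1
            if t_next - t < BOUNCE_THRESHOLD_S:
                bounces += 1
    return bounces / visits if visits else 0.0

events = [("Dashboard", 0.0), ("Reports", 2.0),
          ("Dashboard", 10.0), ("Reports", 60.0)]
```

Here `bounce_rate(events, "Dashboard")` yields 0.5: the first Dashboard visit lasted 2 seconds (a bounce), the second lasted 50 seconds (not a bounce).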
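The abstract and Chapter 6 describe comparing AUTA's results with survey answers via correlation. As a rough sketch of what such a comparison could look like: the feature-usage counts and survey scores below are entirely invented, and Pearson's r is only one plausible coefficient, since this excerpt does not name the exact measure the thesis uses.

```python
# Hypothetical sketch of correlating traced usage with survey answers.
# All data is invented; Pearson's r is an assumed choice of coefficient.
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

auta_counts = [120, 45, 80, 10]       # traced uses per feature (invented)
survey_scores = [4.5, 2.0, 3.8, 1.2]  # mean self-reported use (invented)
r = pearson_r(auta_counts, survey_scores)
```

A high r would indicate that the two methods rank features similarly; the thesis's finding is that the observed resemblance was not strong enough to replace the survey outright.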

Description:
Master's thesis comparing automated usage tracing and analysis with a web survey. The study is conducted on a system developed with C#. Keywords: AOP, Aspect-Oriented Programming, Data mining, Web survey, Correlation, Usage tracing.
