US008732587B2

(12) United States Patent — Narayanan
(10) Patent No.: US 8,732,587 B2
(45) Date of Patent: May 20, 2014

(54) SYSTEMS AND METHODS FOR DISPLAYING TRUSTWORTHINESS CLASSIFICATIONS FOR FILES AS VISUALLY OVERLAID ICONS

(75) Inventor: Aravinthan Narayanan, West KK Nagar (IN)

(73) Assignee: Symantec Corporation, Mountain View, CA (US)

( * ) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 230 days.

(21) Appl. No.: 13/103,031

(22) Filed: May 6, 2011

(65) Prior Publication Data: US 2012/0246598 A1, Sep. 27, 2012

(30) Foreign Application Priority Data: Mar. 21, 2011 (IN) 370/KOL/2011

(51) Int. Cl.: G06F 3/00 (2006.01)

(52) U.S. Cl.: USPC 715/745; 715/846; 715/837

(58) Field of Classification Search: USPC 715/846, 837, 745. See application file for complete search history.

(56) References Cited

U.S. PATENT DOCUMENTS

7,269,851 B2      9/2007   Ackroyd
7,392,477 B2*     6/2008   Plastina et al. ............. 715/210
7,640,590 B1*    12/2009   McCorkendale et al. ......... 726/25
7,694,328 B2*     4/2010   Joshi et al. ................ 726/2
7,979,544 B2*     7/2011   Cancel et al. ............... 709/224
8,256,000 B1*     8/2012   Krishnappa .................. 726/24
2004/0119757 A1*  6/2004   Corley et al. ............... 345/837
2005/0149726 A1*  7/2005   1031114131 .................. 713/164
2005/0283831 A1  12/2005   Ryu et al.
2006/0253584 A1  11/2006   Dixon et al.
2007/0168266 A1*  7/2007   Questembert ................. 705/35
2007/0179883 A1*  8/2007   Questembert ................. 705/39
2009/0049557 A1*  2/2009   Friedman et al. ............. 726/27
2009/0150812 A1*  6/2009   Baker et al. ................ 715/764
2010/0030894 A1*  2/2010   Cancel et al. ............... 709/224
2011/0067101 A1*  3/2011   Seshadri et al. ............. 726/22
2012/0246725 A1*  9/2012   Osipkov ..................... 726/23

OTHER PUBLICATIONS

Sourabh Satish; Systems and Methods for Determining and Quantifying the Impact of an Application on the Health of a System; U.S. Appl. No. 12/049,751, filed Mar. 17, 2008.
Sourabh Satish; Systems and Methods for Determining the Health Impact of an Application Based on Information Obtained From Like-Profiled Computing Systems; U.S. Appl. No. 12/056,379, filed Mar. 27, 2008.
Sourabh Satish; Social Trust Based Security Model; U.S. Appl. No. 11/394,846, filed Mar. 31, 2006.
Carey Nachenberg; Systems and Methods for Using Reputation Data to Detect Shared-Object-Based Security Threats; U.S. Appl. No. 12/415,834, filed Mar. 31, 2009.
Daniel Asheghian; Methods and Systems for Evaluating the Health of Computing Systems Based on When Operating-System Changes Occur; U.S. Appl. No. 12/476,782, filed Jun. 2, 2009.
William E. Sobel; Systems and Methods for Digitally Signing Executables With Reputation Information; U.S. Appl. No. 12/858,085, filed Aug. 17, 2010.
Himanshu Dubey; Systems and Methods for Generating Reputation Based Ratings for Uniform Resource Locators; U.S. Appl. No. 13/101,472, filed May 5, 2011.
Non-Final Office Action Received in Related U.S. Appl. No. 11/394,846; Mar. 6, 2009.
Final Office Action Received in Related U.S. Appl. No. 11/394,846; Sep. 1, 2009.
Non-Final Office Action Received in Related U.S. Appl. No. 11/394,846; Feb. 23, 2010.
Final Office Action Received in Related U.S. Appl. No. 11/394,846; Jul. 22, 2010.
Non-Final Office Action Received in Related U.S. Appl. No. 11/394,846; Dec. 9, 2010.

* cited by examiner

Primary Examiner — William Titcomb
(74) Attorney, Agent, or Firm — ALG Intellectual Property, LLC

(57) ABSTRACT

A computer-implemented method for displaying trustworthiness classifications for files as visually overlaid icons may include (1) identifying a file, (2) identifying a file icon that graphically represents the file within a file manager interface on a computing device, (3) obtaining a trustworthiness classification assigned to the file that identifies the trustworthiness of the file, and then (4) visually overlaying the file icon with a trustworthiness icon that graphically represents the trustworthiness classification assigned to the file. Various other systems, methods, and computer-readable media are also disclosed.

20 Claims, 6 Drawing Sheets

[Front-page representative figure — the FIG. 3 overlay process: a File Icon for "Wincalc.exe" is combined with a Trustworthiness Icon to produce an Overlaid File Icon; the associated Trustworthiness Classification reads: Wincalc.exe, File Hash 0xAD930931, Trustworthiness Score 99%.]

[Drawing Sheet 1 of 6 — FIG. 1: block diagram of System 100, containing Modules 102 (Identification Module 104, Classification Module 106, Overlay Module 108) and Classification Database 120 holding Trustworthiness Classification(s) 122.]

[Drawing Sheet 2 of 6 — FIG. 2: flow diagram of method 200: Identify a file (202); Identify a file icon that graphically represents the file within a file manager interface on a computing device (204); Obtain a trustworthiness classification assigned to the file that identifies the trustworthiness of the file (206); Visually overlay the file icon with a trustworthiness icon that graphically represents the trustworthiness classification assigned to the file, wherein the trustworthiness icon enables a user of the computing device to visually identify the file's trustworthiness without having to request a separate evaluation of the file's trustworthiness; End.]

[Drawing Sheet 3 of 6 — FIG. 3: illustration of the overlay process: File Icon 302 for "Wincalc.exe" plus a Trustworthiness Icon yields an Overlaid File Icon; Trustworthiness Classification: File Name: Wincalc.exe, File Hash: 0xAD930931, Trustworthiness Score: 99%.]

[Drawing Sheet 4 of 6 — FIG. 4: exemplary file manager interface 400; drawing text not legible in the scan.]
[Drawing Sheet 6 of 6 — FIG. 6: block diagram of an exemplary network architecture; drawing text not legible in the scan.]

SYSTEMS AND METHODS FOR DISPLAYING TRUSTWORTHINESS CLASSIFICATIONS FOR FILES AS VISUALLY OVERLAID ICONS

BACKGROUND

Security software may attempt to determine the trustworthiness of a file using various heuristics and/or based on various community-supplied information about the file. For example, security software may attempt to determine whether a file is malicious by determining whether the file matches a unique digital signature or fingerprint associated with a known-malicious file. Additionally or alternatively, the security software may attempt to assess the trustworthiness of the file by obtaining a reputation score for the file from a reputation service. In this example, the reputation service may assign the reputation score to the file by collecting, aggregating, and analyzing data from potentially millions of user devices within a community (such as the user base of a security-software vendor) that identify, among other details, the file's origin, age, and prevalence within the community (such as whether the file is predominantly found on at-risk or "unhealthy" machines within the community).

Unfortunately, while a user may (in some cases) see the results of such a trustworthiness evaluation immediately upon its completion, the user may fail to recall the results of the trustworthiness evaluation at a future point in time (e.g., when viewing files within a file manager interface). Thus, the user may be unable to identify the trustworthiness of a file displayed within a file manager interface without requesting a separate evaluation of the file's trustworthiness. Consequently, if the user fails to request a separate trustworthiness evaluation prior to opening or executing a file, the user may unknowingly open or execute a file that is less trustworthy than another file that performs a substantially similar function. For example, the user may unknowingly execute a calculator application that is less trustworthy than another calculator application stored in the same directory.

As such, the instant disclosure identifies a need for systems and methods for enabling users to quickly and easily identify (and/or compare) the trustworthiness of files without having to request a separate evaluation of the files' trustworthiness.

SUMMARY

As will be described in greater detail below, the instant disclosure generally relates to systems and methods for displaying trustworthiness classifications for files as visually overlaid icons. In one example, a shell extension (such as a security plug-in) for a file manager interface may accomplish such a goal by (1) identifying a file icon that graphically represents a file within a file manager interface (such as MICROSOFT WINDOWS EXPLORER) on a computing device, (2) obtaining a trustworthiness classification assigned to the file that identifies the trustworthiness or reputation of the file, and then (3) visually overlaying the file icon with a trustworthiness icon that graphically represents the trustworthiness classification assigned to the file. In this example, the trustworthiness icon may enable a user of the computing device to quickly and easily visually identify the file's trustworthiness or reputation without having to request a separate evaluation of the file.

As will be explained in greater detail below, by visually overlaying file icons with corresponding trustworthiness icons, the various systems and methods described herein may enable users to visually identify (and/or compare) the trustworthiness or reputations of files within a file manager interface without having to request separate trustworthiness evaluations of the same.

Features from any of the above-mentioned embodiments may be used in combination with one another in accordance with the general principles described herein. These and other embodiments, features, and advantages will be more fully understood upon reading the following detailed description in conjunction with the accompanying drawings and claims.

BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings illustrate a number of exemplary embodiments and are a part of the specification. Together with the following description, these drawings demonstrate and explain various principles of the instant disclosure.

FIG. 1 is a block diagram of an exemplary system for displaying trustworthiness classifications for files as visually overlaid icons.

FIG. 2 is a flow diagram of an exemplary method for displaying trustworthiness classifications for files as visually overlaid icons.

FIG. 3 is an illustration of an exemplary overlay process.

FIG. 4 is an illustration of an exemplary file manager interface including trustworthiness classifications for files displayed as visually overlaid icons.

FIG. 5 is a block diagram of an exemplary computing system capable of implementing one or more of the embodiments described and/or illustrated herein.

FIG. 6 is a block diagram of an exemplary computing network capable of implementing one or more of the embodiments described and/or illustrated herein.

Throughout the drawings, identical reference characters and descriptions indicate similar, but not necessarily identical, elements. While the exemplary embodiments described herein are susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and will be described in detail herein. However, the exemplary embodiments described herein are not intended to be limited to the particular forms disclosed. Rather, the instant disclosure covers all modifications, equivalents, and alternatives falling within the scope of the appended claims.

DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS

As will be described in greater detail below, the instant disclosure generally relates to systems and methods for displaying trustworthiness classifications for files as visually overlaid icons. The following will provide, with reference to FIG. 1, detailed descriptions of exemplary systems for displaying trustworthiness classifications for files as visually overlaid icons. Detailed descriptions of corresponding computer-implemented methods will also be provided in connection with FIGS. 2-4. In addition, detailed descriptions of an exemplary computing system and network architecture capable of implementing one or more of the embodiments described herein will be provided in connection with FIGS. 5 and 6, respectively.

FIG. 1 is a block diagram of an exemplary system 100 for displaying trustworthiness classifications for files as visually overlaid icons. As illustrated in this figure, exemplary system 100 may include one or more modules 102 for performing one or more tasks. For example, and as will be explained in greater detail below, exemplary system 100 may include an identification module 104 programmed to identify a file icon that graphically represents a file within a file manager interface. Exemplary system 100 may also include a classification module 106 programmed to obtain a trustworthiness classification assigned to the file that identifies the trustworthiness or reputation of the file.

In addition, and as will be described in greater detail below, exemplary system 100 may include an overlay module 108 programmed to visually overlay the file icon with a trustworthiness icon that graphically represents the trustworthiness classification assigned to the file. Although illustrated as separate elements, one or more of modules 102 in FIG. 1 may represent portions of a single module or application (such as a shell extension, a security plug-in, or a file manager interface).

In certain embodiments, one or more of modules 102 in FIG. 1 may represent one or more software applications or programs that, when executed by a computing device, may cause the computing device to perform one or more tasks. For example, as will be described in greater detail below, one or more of modules 102 may represent software modules stored and configured to run on one or more computing devices, such as computing system 510 in FIG. 5 and/or portions of exemplary network architecture 600 in FIG. 6. One or more of modules 102 in FIG. 1 may also represent all or portions of one or more special-purpose computers configured to perform one or more tasks.

As illustrated in FIG. 1, exemplary system 100 may also include one or more databases, such as classification database 120. Classification database 120 may represent portions of a single database or computing device or a plurality of databases or computing devices. In one embodiment, and as will be explained in greater detail below, classification database 120 may store trustworthiness classifications 122 assigned to files.

Classification database 120 in FIG. 1 may represent a portion of one or more computing devices. For example, classification database 120 may represent a portion of computing system 510 in FIG. 5 and/or portions of exemplary network architecture 600 in FIG. 6. Alternatively, classification database 120 in FIG. 1 may represent one or more physically separate devices capable of being accessed by a computing device, such as computing system 510 in FIG. 5 and/or portions of exemplary network architecture 600 in FIG. 6.

Exemplary system 100 in FIG. 1 may be deployed in a variety of ways. In one example, all or a portion of exemplary system 100 may represent portions of computing system 510 in FIG. 5. For example, and as will be described in greater detail below, in some examples modules 102 may program computing system 510 to display trustworthiness classifications assigned to files as visually overlaid icons by (1) identifying a file icon that graphically represents a file within a file manager interface (such as file manager interface 400 in FIG. 4) on computing system 510, (2) obtaining a trustworthiness classification assigned to the file (from, e.g., classification database 120 in FIG. 1) that identifies the trustworthiness or reputation of the file, and then (3) visually overlaying the file icon with a trustworthiness icon that graphically represents the trustworthiness classification assigned to the file.

FIG. 2 is a flow diagram of an exemplary computer-implemented method 200 for displaying trustworthiness classifications for files as visually overlaid icons. The steps shown in FIG. 2 may be performed by any suitable computer-executable code and/or computing system. In some embodiments, the steps shown in FIG. 2 may be performed by one or more of the components of system 100 in FIG. 1, computing system 510 in FIG. 5, and/or exemplary network architecture 600 in FIG. 6.

As illustrated in FIG. 2, at step 202 the various systems described herein may identify a file. For example, identification module 104 may, as part of computing system 510 in FIG. 5, identify a file encountered by computing system 510.

The systems described herein may perform step 202 in a variety of ways. In one example, identification module 104 may identify a file upon observing or detecting creation of the file. For example, identification module 104 may identify an executable file that is created and stored within a directory on computing system 510 as part of an installation process. In another example, identification module 104 may identify a file within a directory as an application (or operating system) analyzes the contents of the directory prior to causing the contents of the directory to be displayed within a file manager interface (such as file manager interface 400 in FIG. 4). Identification module 104 may also identify files upon encountering the same on local and/or remote storage devices (e.g., files stored on removable storage devices and/or remote servers). The file identified in step 202 may be any type of file, including an executable file or a dynamic-link library file.

Returning to FIG. 2, at step 204 the various systems described herein may identify a file icon used to graphically represent the file within a file manager interface on a computing device. For example, identification module 104 may, as part of computing system 510 in FIG. 5, identify a file icon 302 in FIGS. 3 and 4 that is used to graphically represent the file "Wincalc.exe" within file manager interface 400 in FIG. 4.

The term "file icon," as used herein, generally refers to any type or form of pictogram used to visually or graphically represent an object (such as a file, folder, application, device, etc.) within a graphical user interface of a computing system. As will be explained in greater detail below, file icons may graphically represent objects using any of a variety of shapes, text, sizes, colors, and/or animations.

In addition, the term "file manager interface," as used herein, generally refers to any type or form of user interface for enabling users to view and/or manipulate files within a file system. File manager interfaces may display various images that graphically represent one or more files, relative or absolute paths, computing resources, and/or any other type of suitable information. In some examples, file manager interfaces may be presented to a user of a computing system in response to one or more user actions (e.g., upon selecting an application icon that represents the file manager interface). Examples of file manager interfaces include, without limitation, orthodox file managers (such as MICROSOFT SE-EXPLORER or WINSCP), navigational file managers (such as MICROSOFT WINDOWS EXPLORER or MAC OS X FINDER), spatial file managers, 3D file managers, web-based file managers, or any other suitable type of file manager or user interface.

The systems described herein may perform step 204 in a variety of ways. In one example, identification module 104 may identify a file icon associated with a file upon creation of the same by an application or operating system (e.g., during installation of the file or an application associated with the file). In other examples, identification module 104 may identify a file icon associated with a file upon encountering the file (e.g., upon encountering a file on a removable storage device and/or remote server).

Returning to FIG. 2, at step 206 the various systems described herein may obtain a trustworthiness classification assigned to the file that identifies the trustworthiness or repu
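The lookup described in steps 202-206 (identify a file, fingerprint it, and obtain its trustworthiness classification) can be sketched in ordinary Python. This is a minimal illustration only, not the patented implementation: the in-memory dictionary standing in for classification database 120, the choice of SHA-256 as the fingerprint, and the 90%/50% score thresholds are all invented assumptions for the sketch.

```python
import hashlib

# Hypothetical stand-in for classification database 120: maps a file
# fingerprint to a community-derived trustworthiness score (0-100).
CLASSIFICATION_DATABASE = {}


def fingerprint(path):
    """Step 202 sketch: identify a file and compute a unique fingerprint
    for it (SHA-256 here; FIG. 3 only shows a short 'File Hash')."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()


def classify(score):
    """Step 206 sketch: map a reputation score to a coarse
    trustworthiness classification. Thresholds are invented."""
    if score is None:
        return "unknown"
    if score >= 90:
        return "trusted"
    if score >= 50:
        return "neutral"
    return "untrusted"


def trustworthiness_of(path):
    """Look up the classification assigned to the file at `path`."""
    return classify(CLASSIFICATION_DATABASE.get(fingerprint(path)))
```

For the "Wincalc.exe" example of FIG. 3 (trustworthiness score 99%), `classify(99)` would yield the "trusted" classification under these assumed thresholds.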
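The overlay operation itself (the FIG. 3 overlay process) amounts to alpha-compositing a small trustworthiness badge into a corner of the file icon. The sketch below is again an invented illustration rather than the patented implementation: it represents icons as rows of RGBA tuples and blends a badge into the bottom-right corner.

```python
def _blend(dst, src):
    """Source-over alpha blend of one RGBA pixel onto another."""
    a = src[3] / 255.0
    rgb = tuple(round(src[i] * a + dst[i] * (1.0 - a)) for i in range(3))
    return rgb + (255,)


def overlay_icon(file_icon, badge):
    """Composite `badge` onto the bottom-right corner of `file_icon`.

    Both arguments are lists of rows of (R, G, B, A) tuples; the result
    is the overlaid file icon. The input icon is left unmodified.
    """
    out = [row[:] for row in file_icon]
    oy = len(file_icon) - len(badge)        # badge offset, bottom edge
    ox = len(file_icon[0]) - len(badge[0])  # badge offset, right edge
    for y, badge_row in enumerate(badge):
        for x, px in enumerate(badge_row):
            out[oy + y][ox + x] = _blend(out[oy + y][ox + x], px)
    return out
```

A shell extension of the kind the summary describes would hand the composited pixels back to the file manager in place of the original icon; on Windows, badge-style overlays of this sort are conventionally surfaced through Explorer's icon-overlay handler mechanism.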