Mechanizing Alice: Automating the Subject Matter Eligibility Test of Alice v. CLS Bank

Ben Dugan**

1. INTRODUCTION AND OVERVIEW

In Alice v. CLS Bank, the Supreme Court established a new test for determining whether a patent claim is directed to patent-eligible subject matter.1 The impact of the Court's action is profound: the modified standard means that many formerly valid patents are now invalid, and that many pending patent applications that would have been granted under the old standard will now not be granted.

This article describes a project to mechanize the subject matter eligibility test of Alice v. CLS Bank. The Alice test asks a human to determine whether or not a patent claim is directed to patent-eligible subject matter. The core research question addressed by this article is whether it is possible to automate the Alice test. Is it possible to build a machine that takes a patent claim as input and outputs an indication that the claim passes or fails the Alice test? We show that it is possible to implement just such a machine, by casting the Alice test as a classification problem that is amenable to machine learning.

This article describes the design, development, and applications of a machine classifier that approximates the Alice test. Our machine classifier is a computer program that takes the text of a patent claim as input, and indicates whether or not the claim passes the Alice test. We employ supervised machine learning to construct the classifier.2 Supervised machine learning is a technique for training a computer program to recognize patterns.3 Training comprises presenting the program with positive and negative examples, and automatically adjusting associations between particular features in those examples and the desired output.4 The examples we use to train our machine classifier are obtained from the United States Patent Office.
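The supervised-learning setup just described can be illustrated with a small sketch. The article does not disclose its actual implementation or toolkit, so the following is only a minimal bag-of-words naive Bayes classifier over invented toy claims; the class, labels, and example claim texts are all assumptions made for illustration, not the author's real system or data.

```python
import math
from collections import Counter

def tokenize(claim):
    """Lowercase bag-of-words tokenization of claim text."""
    return [w.strip(".,;:()").lower() for w in claim.split() if w.strip(".,;:()")]

class NaiveBayesClaimClassifier:
    """Toy naive Bayes with Laplace smoothing; illustration only."""

    def fit(self, claims, labels):
        self.classes = sorted(set(labels))
        n = len(labels)
        # Log prior for each class, from label frequencies.
        self.log_priors = {c: math.log(labels.count(c) / n) for c in self.classes}
        # Per-class word counts over the training claims.
        self.word_counts = {c: Counter() for c in self.classes}
        for text, label in zip(claims, labels):
            self.word_counts[label].update(tokenize(text))
        self.vocab = set().union(*self.word_counts.values())
        return self

    def predict(self, claim):
        def log_score(c):
            denom = sum(self.word_counts[c].values()) + len(self.vocab)
            return self.log_priors[c] + sum(
                math.log((self.word_counts[c][t] + 1) / denom)
                for t in tokenize(claim))
        return max(self.classes, key=log_score)

# Invented toy training examples, labeled as an examiner might decide them:
claims = [
    "A method of mitigating settlement risk by calculating an escrow balance",
    "A method of hedging risk comprising computing a price according to a formula",
    "An apparatus comprising a rotor mechanically coupled to a drive shaft",
    "A machine comprising a sensor and a valve that vents coolant when heated",
]
labels = ["fails", "fails", "passes", "passes"]
clf = NaiveBayesClaimClassifier().fit(claims, labels)
```

On these toy examples, a claim worded like the "fails" training claims (e.g., one reciting hedging or settlement risk) is scored toward "fails," while mechanically worded claims score toward "passes" — the same word-to-outcome association the article exploits at scale.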
Within a few months of the Alice decision, examiners at the Patent Office began reviewing claims in patent applications for subject matter compliance under the new framework.5 Each decision of an examiner is publicly reported in the form of a written office action.6 We programmatically obtained and reviewed many thousands of these office actions to build a dataset that associates patent claims with corresponding eligibility decisions. We then used this dataset to train, test, and validate our machine classifier.

* An early discussion draft of this article appeared as Estimating the Impact of Alice v. CLS Bank Based on a Statistical Analysis of Patent Office Subject Matter Rejections (February 23, 2016), available at SSRN: https://ssrn.com/abstract=2730803. This article significantly refines the statistical analysis of subject matter rejections at the Patent Office. This article also clarifies the performance results of our machine classifier, and better accounts for classifier performance when estimating the number of patents invalidated under Alice v. CLS Bank.
** Member, Lowe Graham Jones, PLLC. Affiliate Instructor of Law, University of Washington School of Law. Opinions expressed herein are those of the author only. Copyright 2017 Ben Dugan. I would like to thank Bob Dugan and Jane Winn for their feedback, advice, and support, and Sarah Dugan for her love and encouragement.
1 Alice Corp. v. CLS Bank Int'l, 134 S. Ct. 2347 (2014).
2 STUART RUSSELL & PETER NORVIG, ARTIFICIAL INTELLIGENCE: A MODERN APPROACH 693-95 (3d ed. 2010).
3 Id.
4 Id.
5 See, e.g., USPTO, Preliminary Examination Instructions in View of the Supreme Court Decision in Alice v. CLS Bank (June 25, 2014), http://www.uspto.gov/sites/default/files/patents/announce/alice_pec_25jun2014.pdf [hereinafter Preliminary Examination Instructions]. See generally USPTO, Subject Matter Eligibility, https://www.uspto.gov/patent/laws-and-regulations/examination-policy/subject-matter-eligibility.
6 35 U.S.C. § 132; 37 C.F.R. § 1.104; MPEP § 706.

Revised: August 25, 2017

A. Table of Contents

1. Introduction and Overview
   A. Table of Contents
   B. Organization of the Article
2. Brief Review of the Alice Framework
3. Rendering Legal Services in the Shadow of Alice
   A. Intuition-Based Legal Services
   B. Data-Driven Patent Legal Services
   C. Predicting Subject Matter Rejections Yields Economic Efficiencies
4. Data Collection Methodology
5. Data Analysis Results
6. Predicting Alice Rejections with Machine Classification
   A. Word Clouds
   B. Classifier Training
   C. Performance of a Baseline Classifier
   D. Performance of an Improved Classifier
   E. Extensions, Improvements, and Future Work
7. A Patent Claim Evaluation System
   A. System Description
   B. Claim Evaluation System Use Cases
   C. Questions Arising From the Application of Machine Intelligence to the Law
8. Estimating the Impact of Alice on Issued Patents
   A. The Classifier
   B. Classifier Validation
   C. Evaluation of Issued Patent Claims
9. Conclusion

B. Organization of the Article

This article is organized in the following manner. In Section 2, we provide an overview of the Alice framework for determining the subject matter eligibility of a patent claim. The Alice test first asks whether a given patent claim is directed to a non-patentable law of nature, natural phenomenon, or abstract idea.7 If so, the claim is not patent eligible unless the claim recites additional elements that amount to significantly more than the recited non-patentable concept.8

In Section 3, we motivate a computer-assisted approach for rendering legal advice in the context of Alice.
Alice creates a new patentability question that must be answered before and during the preparation, prosecution, and enforcement of a patent. Section 3 provides inspiration for a data-driven, computer-assisted, predictive approach for efficiently answering the Alice patentability question. Such a predictive approach can be usefully performed at various stages of the lifecycle of a patent, including during initial invention analysis, application preparation and claim development, and litigation risk analysis. Computer-assisted prediction of Alice rejections stands in contrast to traditional, intuition-driven methods of legal work, and can yield considerable economic efficiencies, by eliminating the legal fees associated with the preparation and prosecution of applications for unpatentable inventions, or by eliminating baseless litigation of invalid patent claims. In addition, a predictive approach can be used to assist a patent practitioner in crafting patent claims that are less likely to be subjected to Alice rejections, thereby reducing the number of applicant-examiner interactions and corresponding legal fees during examination.

In Section 4, we describe our data collection methodology. Section 4 lays out our process for generating a dataset for training our machine classifier. In brief, we automatically download thousands of patent application file histories, each of which is a record of the interaction between a patent examiner and an applicant. From these file histories, we extract office actions, each of which is a written record of an examiner's analysis and decision of a particular application. We then process the extracted office actions to determine whether the examiner has accepted or rejected the claims of the application under Alice. Finally, we construct our dataset with the obtained information.
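The office-action processing step can be sketched crudely as a pattern match over the action's text. This is an illustration only: the article does not publish its extraction code, real office actions are scanned PDFs requiring OCR and far more careful parsing, and the phrase list and function names below are hypothetical simplifications of ours, not the author's pipeline.

```python
import re

# Hypothetical phrases suggesting a subject matter (Alice / § 101) rejection.
ALICE_PATTERNS = [
    r"rejected under 35 U\.S\.C\.?\s*§?\s*101",
    r"directed to an abstract idea",
    r"judicial exception",
]

def has_alice_rejection(office_action_text):
    """Heuristically flag an office action as containing an Alice rejection."""
    text = " ".join(office_action_text.split())  # collapse line breaks
    return any(re.search(p, text, re.IGNORECASE) for p in ALICE_PATTERNS)

def dataset_row(claim_text, office_action_text):
    """One row of the training table: claim text plus the examiner's decision."""
    return {"claim": claim_text,
            "alice_rejected": has_alice_rejection(office_action_text)}
```

A rejection under, say, § 103 (obviousness) would not be flagged, which is the point: the dataset records only the examiner's subject matter eligibility decision for each claim.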
Our dataset is a table that associates, in each row, a patent claim with an indication of whether the claim passes or fails the Alice test, as decided by a patent examiner.

In Section 5, we present results from an analysis of our dataset. Our analysis identifies trends and subject matter areas that are disproportionately subject to rejections under Alice. Our dataset shows that the subject matter areas that contain many applications with Alice rejections include data processing, business methods, games, educational methods, and speech processing. This result is consistent with the focus of the Alice test on detecting claims that are directed to abstract ideas, including concepts such as economic practices, methods of organizing human activity, and mathematical relationships.9

In Section 6, we build a machine that is capable of predicting whether a claim is likely to pass the Alice test. In this section, we initially perform an analysis that identifies particular words that are associated with eligibility or ineligibility under Alice. The presence of such associations indicates that there exist patterns that can be learned by way of machine learning. Next, we describe the training, testing, and performance of a baseline classifier. Our classifiers are trained in a supervised manner, using as examples the thousands of subject matter patentability decisions made by examiners at the Patent Office. We then describe an improved classifier that uses an ensemble of multiple distinct classifiers to improve upon the performance of our baseline classifier. We conclude this section with a brief outline of possible extensions, improvements, and future work.

In Section 7, we describe a claim evaluation system. The system is a Web-based application that takes a patent claim as input from a user, and provides the text of the claim to a back-end classifier trained as described above.

7 Alice, 134 S. Ct. at 2354.
8 Id. at 2354-56.
The system provides the decision of the classifier as output to the user. It is envisioned that a system such as this can be used by a patent practitioner to provide improved Alice-related legal services at various stages of the lifecycle of a patent, as discussed in Section 3.

In Section 8, we utilize our machine classifier to quantitatively estimate the impact of Alice on the universe of issued patents. While other studies have tracked the actual impact of Alice in cases before the Federal Courts, our effort is the first to quantitatively estimate the impact of Alice on the entire body of issued patents.10 To obtain our estimate, we first determine whether our classifier can be used as a proxy for the decision-making of the Federal Courts. Since our classifier is trained based on decisions made by examiners at the Patent Office, it is natural to ask whether the classifier reasonably approximates the way that the courts apply the Alice test. To answer this question, we evaluate the performance of our classifier on patent claims that have been analyzed by the Court of Appeals for the Federal Circuit. The results of this evaluation show that the outputs produced by our classifier are largely in agreement with the decisions of the CAFC. Finally, we turn our classifier to the task of processing claims from a random sample of 40,000 issued patents dating back to 1996. Extrapolating the results obtained from our sample, we estimate that as many as 100,000 issued patents have been invalidated due to the reduced scope of patent-eligible subject matter under Alice v. CLS Bank. This large-scale invalidation of patent rights represents a significant realignment of intellectual property rights at the stroke of a judge's pen.

9 Id. at 2354-56.
10 Jasper Tran, Two Years After Alice v. CLS Bank, 98 JOURNAL OF THE PATENT AND TRADEMARK OFFICE SOCIETY 354, 358 (2016).

2. BRIEF REVIEW OF THE ALICE FRAMEWORK

The following procedure outlines the current test for evaluating a patent claim for subject matter eligibility under 35 U.S.C. § 101. We will refer to this test as the "Alice test," although it was earlier articulated by the Supreme Court in Mayo Collaborative Services v. Prometheus Laboratories, Inc.11

Step 1: Is the claim to a process, machine, manufacture, or composition of matter? If YES, proceed to Step 2A; if NO, the claim is not eligible subject matter under 35 U.S.C. § 101.

Step 2A: Is the claim directed to a law of nature, a natural phenomenon, or an abstract idea? If YES, proceed to Step 2B; if NO, the claim qualifies as eligible subject matter under 35 U.S.C. § 101.

Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception? If YES, the claim is eligible; if NO, the claim is ineligible.12

The test has two main parts. The first part of the test, in Step 1, asks whether the claim is to a process, machine, manufacture, or composition of matter. This is simply applying the plain text of Section 101 of the patent statute to ask whether a patentable "thing" is being claimed.13 As a general matter, this part of the test is easy to satisfy. If the claim recites something that is recognizable as an apparatus/machine, process, manufacture, or composition of matter, Step 1 of the test should be satisfied. If Step 1 of the test is not satisfied, the claim is not eligible, and the analysis ends.14

The second part of the test attempts to identify claims that are directed to judicial exceptions to the statutory categories.15 The second part of the test has two subparts. Step 2A is designed to ferret out claims that, on their surface, claim something that is patent eligible (e.g., a computer), but contain within them a judicial exception. Step 2A asks whether the claim is directed to one of the judicial exceptions. If not, then the claim qualifies as patent eligible. If so, Step 2B must be evaluated.

11 Mayo Collaborative Services v. Prometheus Labs., Inc., 132 S. Ct. 1289 (2012) (addressing a method for administering a drug, and holding that a newly discovered law of nature is unpatentable and that the application of that law is also normally unpatentable if the application merely relies on elements already known in the art); Alice, 134 S. Ct. at 2355-60 (applying the Mayo analysis to claims to a computer system and method for electronic escrow; holding the claims invalid because they were directed to an abstract idea, and did not include sufficiently more to transform the abstract idea into a patent-eligible invention).
12 USPTO, 2014 Interim Guidance on Patent Subject Matter Eligibility, 79 FR 74618, 74621 (December 16, 2014) [hereinafter 2014 Guidance].
13 35 U.S.C. § 101 ("Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor.").
14 E.g., In re Ferguson, 558 F.3d 1359, 1364-66 (Fed. Cir. 2009) (contractual agreements and companies are not patentable subject matter); In re Nuijten, 500 F.3d 1346, 1357 (Fed. Cir. 2007) (transitory signals are not patentable subject matter).
15 Alice, 134 S. Ct. at 2354.
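Structurally, the test described above is a short decision procedure. The following sketch makes that control flow explicit; the three boolean inputs stand in for legal judgments that in practice are made by a human examiner or court, and the function and parameter names are our own illustrative choices, not Patent Office terminology.

```python
def alice_test(is_statutory_category,
               directed_to_judicial_exception,
               recites_significantly_more):
    """Control-flow skeleton of the Alice/Mayo eligibility framework."""
    # Step 1: is the claim to a process, machine, manufacture,
    # or composition of matter?
    if not is_statutory_category:
        return "ineligible"
    # Step 2A: is the claim directed to a law of nature, a natural
    # phenomenon, or an abstract idea?
    if not directed_to_judicial_exception:
        return "eligible"
    # Step 2B: do additional elements amount to "significantly more"
    # than the judicial exception?
    return "eligible" if recites_significantly_more else "ineligible"
```

Note that Step 2B is reached only when the claim both falls within a statutory category and is directed to a judicial exception, mirroring the two-part structure of the test.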
The judicial exceptions in Step 2A include laws of nature, abstract ideas, and natural phenomena.16 The category of abstract ideas can be broken down into four subcategories: fundamental economic practices, ideas in and of themselves, certain methods of organizing human activity, and mathematical relationships and formulas.17 Fundamental economic practices include, for example, creating contractual relationships, hedging, or mitigating settlement risk.18 Ideas in and of themselves include, for example, collecting and comparing known information, diagnosing a condition by performing a test and thinking about the results, and organizing information through mathematical correlation.19 Methods of organizing human activity include, for example, creating contractual relationships, hedging, mitigating settlement risk, or managing a game of bingo.20 Mathematical relationships and formulas include, for example, an algorithm for converting number formats, a formula for computing alarm limits, or the Arrhenius equation.21

In Step 2B, the test asks whether the claim recites additional elements that amount to "significantly more" than the judicial exception.
In the computing context, this part of the test is trying to catch claims that merely apply an abstract idea within a computing system, without adding significant additional elements or limitations.22 Limitations that may be enough to qualify as "significantly more" when recited in a claim with a judicial exception include, for example: improvements to another technology or technical field; improvements to the functioning of the computer itself; effecting a transformation or reduction of a particular article to a different state or thing; or adding unconventional steps that confine the claim to a particular useful application.23 Limitations that have been found not to be enough to qualify as "significantly more" when recited in a claim with a judicial exception include, for example: adding the words "apply it" with the judicial exception; mere instructions to implement an abstract idea on a computer; simply appending well-understood, routine and conventional activities previously known to the industry, specified at a high level of generality, to the judicial exception; or adding insignificant extra-solution activity to the judicial exception.24

16 Id. at 2354.
17 Id. at 2355-56.
18 E.g., Bilski v. Kappos, 561 U.S. 593 (2010) (mitigating settlement risk).
19 E.g., Digitech Image Tech., LLC v. Electronics for Imaging, Inc., 758 F.3d 1344 (Fed. Cir. 2014) (organizing information through mathematical correlations).
20 E.g., buySAFE, Inc. v. Google, Inc., 765 F.3d 1350 (Fed. Cir. 2014) (contractual relationships).
21 E.g., Gottschalk v. Benson, 409 U.S. 63 (1972) (algorithm for converting number formats); Diamond v. Diehr, 450 U.S. 175 (1981) (Arrhenius equation).
22 Alice, 134 S. Ct. at 2357-58.
23 2014 Guidance, supra note 12, at 74624 (citations omitted).
24 Id.

The Alice test is now being applied by federal agencies and courts at the beginning and end of the patent lifecycle.
With respect to the application phase of a patent, shortly after the Alice decision, the Patent Office issued to the examination corps instructions for implementing the Alice test.25 These preliminary instructions were supplemented in December 2014 by the 2014 Guidance.26 As we will show in Section 5, below, the Patent Office has applied this test widely, with significant numbers of rejections appearing in specific subject matter areas. With respect to the enforcement phase of the patent lifecycle, the Federal Courts have been actively applying the Alice test to analyze the validity of patent claims in the litigation context.27 As of June 2016, over 500 patents had been challenged under Alice, with a resulting invalidation rate exceeding 65%.28 The Court of Appeals for the Federal Circuit has itself heard over 50 appeals that have raised the Alice issue.29

Note that when we speak of the "Alice test" in the context of the Patent Office, we include the entire body of case law that has developed in the wake of the Mayo and Alice decisions.30 The cases following Alice have refined and clarified the Alice two-step analysis with respect to particular fact contexts. The Patent Office has made considerable effort to keep abreast of these decisions and to train the examining corps as to their import.31 To a large degree, then, the Patent Office embodies the current state of subject matter eligibility law. And while this law is never static, it is also not changing so quickly as to undermine one of the central premises of this article, which is that the Patent Office can be used as a source of examples of a decision maker (in this case, a sort of "hive mind" comprising many thousands of individual examiners) applying a legal rule to determine whether a patent claim is subject matter eligible. Assuming that the application of the rule is not completely random, as we will show in Section 5, it should be possible to train a machine to learn the rule (or its approximation) based on our collection of examples.

25 Preliminary Examination Instructions, supra note 5.
26 2014 Guidance, supra note 12.
27 At the time of this writing, a Shepard's Report indicates that Alice has been cited in over 500 Federal Court decisions. Lexis Search (May 2017).
28 Tran, supra note 10, at 358.
29 USPTO, Chart of Subject Matter Eligibility Court Decisions (updated July 31, 2017), https://www.uspto.gov/sites/default/files/documents/ieg-sme_crt_dec.xlsx.
30 E.g., Ultramercial, Inc. v. Hulu, LLC, 772 F.3d 709 (Fed. Cir. 2014); DDR Holdings, LLC v. Hotels.com, L.P., 773 F.3d 1245 (Fed. Cir. 2014); Enfish LLC v. Microsoft Corp., 822 F.3d 1327 (Fed. Cir. 2016); Bascom Global Internet Services, Inc. v. AT&T Mobility LLC, 827 F.3d 1341 (Fed. Cir. 2016); McRO, Inc. v. Bandai Namco Games America Inc., 837 F.3d 1299 (Fed. Cir. 2016); Amdocs (Israel) Ltd. v. Openet Telecom, Inc., 841 F.3d 1288 (Fed. Cir. 2016).
31 The Patent Office has released a number of memoranda discussing decisions of the Court of Appeals for the Federal Circuit, including Enfish, McRO, and Bascom. USPTO, Recent Subject Matter Eligibility Decisions (May 19, 2016), https://www.uspto.gov/sites/default/files/documents/ieg-may-2016_enfish_memo.pdf; USPTO, Recent Subject Matter Eligibility Decisions (November 2, 2016), https://www.uspto.gov/sites/default/files/documents/McRo-Bascom-Memo.pdf.

3. RENDERING LEGAL SERVICES IN THE SHADOW OF ALICE

In this section, we motivate a computer-assisted approach for rendering legal advice in the context of Alice. Alice creates a new patentability question that must be answered before and during the preparation, prosecution, and enforcement of a patent.
Increased access to data allows us to implement a data-driven, predictive computer system for efficiently answering the Alice patentability question, possibly yielding economic efficiencies.

Alice casts a shadow over virtually every phase of the lifecycle of a patent, including preparation, prosecution, and enforcement. Inventors want to understand as an initial matter whether to even attempt to obtain patent protection for their inventions. The cost to prepare and file a patent application of moderate complexity can easily exceed $10,000, and inventors would like to know whether it is worth it even to begin such an undertaking.32 In addition, there are hundreds of thousands of "in flight" patent applications, all prepared and filed prior to the Alice decision. These applications likely do not include the necessary subject matter or level of detail that may be required to overcome a current or impending Alice rejection. These applications may not contain evidence of how the invention improves the operation of a computing system or other technology. In such cases, patent applicants want to know whether it is even worth continuing the fight, given that they must pay thousands of dollars for every meaningful interaction with a patent examiner.33

In the enforcement phase of the patent lifecycle, litigants want to know the likelihood that an asserted patent will be invalidated under Alice. Both parties to a suit rely on such information when deciding whether to settle or continue towards trial. For plaintiffs, the increased likelihood of fee shifting raises the stakes even further.34 From an economic welfare perspective, providing patentees with accurate information regarding the likelihood of invalidation should result in a reduction in the inefficient allocation of resources, by shortening or reducing the number of lawsuits.

32 American Intellectual Property Law Association, 2015 REPORT OF THE ECONOMIC SURVEY, I-85 (median legal fee to draft a relatively complex electrical/computer patent application is $10,000).
33 Id. at I-86 (median legal fee to prepare a response to an examiner's rejection for a relatively complex electrical/computer application is $3,000).
34 Octane Fitness, LLC v. Icon Health & Fitness, Inc., 134 S. Ct. 1749 (2014). See, e.g., Edekka LLC v. 3Balls.com, Inc., E.D. Texas Case 2:15-cv-00541-JRG, Document No. 133 (order by Judge Gilstrap awarding attorney fees under 35 U.S.C. § 285 in a case dismissed for claims found invalid under Alice).

A. Intuition-Based Legal Services

Historically, attorneys have provided the above-described guidance by applying intuition, folk wisdom, heuristics, and their personal and shared historical experience. For example, in the context of patent prosecution generally, the field is rife with (often conflicting) guiding principles,35 such as:

• Make every argument you possibly can
• To advance prosecution, amending claims is better than arguing
• Keep argument to a minimum, for fear of creating prosecution history estoppel or disclaimer
• File appeals early and often
• Interviewing the examiner expedites examination
• Interviewing the examiner is a waste of time and money
• Use prioritized examination – you'll get a patent in 12 months!36
• You're playing a lottery: if your case is assigned to a bad examiner, give up hope!

Unfortunately, the above approaches are not necessarily effective or applicable in all contexts. For example, while some approaches may have worked in the past (e.g., during the first years of practice when the attorney received her training), they may no longer be effective, given changes in Patent Office procedures and training, changes in the law, and so on. Nor do the above approaches necessarily consider client goals.
Different clients may desire different outcomes, depending on their market, funding needs, budget, and the like. Example client requirements include short prosecution time (e.g., get a patent as quickly as possible), long prosecution time (e.g., delay prosecution during clinical trials), obtaining broad claims, minimizing the number of office actions (because each office action costs the client money), or the like. It is clear that any one maxim or approach to patent prosecution is not going to optimize the outcome for every client in every possible instance. While a truly optimal outcome may not be possible, in view of the randomness and variability in the examination system, it is undoubtedly possible to do better. In the following subsection, we assert that a data-driven approach can yield improved outcomes and economic efficiencies for the client.

B. Data-Driven Patent Legal Services

A data-driven approach promises to address at least some of the shortcomings associated with the traditional approach to providing patent-related legal services. As a simple example, many clients are concerned with the number of office actions required to obtain a patent. This is because each office action may cost the client around $3,000 in attorney fees to formulate a response.37 For large clients, with portfolios numbering in the thousands of yearly applications, reducing the average number of office actions (even by a fractional amount on average) can yield significant savings in yearly fees to outside counsel. For small clients and individual inventors, one less office action may be the difference between pushing forward and abandoning a case. Is it possible to use data about the functioning of the patent office to better address the needs of these different types of clients?

In the academic context, prior studies considering patent-related data have focused largely on understanding or measuring patent breadth, quality, and/or value using empirical patent features. One body of literature uses patent citation counts and other features (e.g., claim count, classification identifiers) of an issued patent to attempt to determine patent value.38 Others have studied the relationship between patent scope and firm value.39 Other empirical work has analyzed prosecution-related data in order to determine patent quality.40 For this project, we are more interested in predicting how decision makers (e.g., judges or patent examiners) will evaluate patent claims. We make such predictions based on the prior behaviors and actions of those decision makers. Fortunately, it is now becoming increasingly possible to cheaply obtain and analyze large quantities of data about the behaviors of patent examiners and judges.

35 The following list is based on the author's personal experience as a patent prosecutor. At one time or another the author has worked with a client, supervisor, or colleague who has insisted on following one or more of the presented guidelines.
36 Prioritized examination is a Patent Office program that promises to provide a final disposition for a patent application within one year. USPTO, Prioritized Examination, 76 FR 59050 (September 23, 2011).
37 American Intellectual Property Law Association, supra note 32, at I-86 (median legal fee to prepare an amendment/argument for a relatively complex electrical/computer application is $3,000).
In the patent prosecution context, the Patent Office hosts the PAIR (Patent Application Information Retrieval) system, which provides the "file wrapper" for every published application or issued patent.41 The patent file wrapper includes every document, starting with the initial application filing, filed by the applicant or examiner during prosecution of a given patent application.42 A number of commercial entities provide services that track and analyze prosecution-related data.43 These services provide reports that summarize examiner- or group-specific behaviors and trends within the Patent Office, including allowance rates, appeal dispositions, timing information, and the like.44 Such information can be used to tailor prosecution techniques to a specific examiner or examining group. For example, if the examiner assigned to a particular application has, based on his work on other cases, shown himself to be stubborn (e.g., as evidenced by a high appeal rate, high number of

38 See, e.g., Mark Carpenter et al., Citation Rates to Technologically Important Patents, 3 WORLD PATENT INFORMATION 160 (1981); John Allison et al., Valuable Patents, 92 GEO. L.J. 435 (2004); Nathan Falk and Kenneth Train, Patent Valuation with Forecasts of Forward Citations, JOURNAL OF BUSINESS VALUATION AND ECONOMIC LOSS ANALYSIS (2016).
39 See, e.g., Joshua Lerner, The Importance of Patent Scope: An Empirical Analysis, 25 RAND JOURNAL OF ECONOMICS 319 (1994) (patent classification is used as a proxy for scope).
40 See, e.g., Ronald Mann and Marian Underweiser, A New Look at Patent Quality: Relating Patent Prosecution to Validity, 9 J. EMPIRICAL LEGAL STUD. 1 (2012).
41 Patent Application Retrieval System, http://portal.uspto.gov/pair/PublicPair. In addition, bulk data downloads are available at: Google USPTO Bulk Downloads, https://www.google.com/googlebooks/uspto-patents.html; Reed Tech USPTO Data Sets: http://patents.reedtech.com/index.php.
42 37 C.F.R. § 1.2; Manual of Patent Examining Procedure § 719.
43 E.g., Juristat, https://www.juristat.com/; LexisNexis PatentAdvisor, http://www.reedtech.com/products-services/intellectual-property-solutions/lexisnexis-patentadvisor.
44 Juristat, Juristat Primer, https://www.juristat.com/primers/.
