Collaborative Development and Assessment of Student Learning Outcomes for LIS Electives

Bradley Wade Bishop, School of Information Sciences, University of Tennessee, Email: [email protected]
Tony H. Grubesic, Center for Spatial Reasoning & Policy Analytics, College of Public Service & Community Solutions, Arizona State University, Email: [email protected]
Theresa Parrish, School of Information Sciences, University of Tennessee, Email: [email protected]

In higher education's environment of accountability, the development and assessment of student learning outcomes (SLOs) are driven by both external stakeholder requirements for accreditation and internal institutional pressures to demonstrate student learning as the core function of universities and colleges. This paper presents a framework to reduce faculty workload and increase standardization of SLOs for LIS electives across multiple schools. The framework includes a value-added assessment with results that show a significant increase in the overall scores and specifically in areas of focus for the elective. This approach provides a framework for other popular LIS electives to collaborate across schools and standardize SLOs to determine the overall effectiveness of courses and programs without reinventing course objectives, SLOs, and evaluation techniques at each institution.

Keywords: student learning outcomes; value-added assessment; educational effectiveness; geographic information systems; geographic information

J. of Education for Library and Information Science, Vol. 56, No. 4—(Fall) October 2015. ISSN: 0748-5786 © 2015 Association for Library and Information Science Education. doi:10.12783/issn.2328-2967/56/4/1

The creation and assessment of student learning outcomes is an essential part of higher education, with an increasing focus on evaluating educational effectiveness, especially in North America. While colleges and universities have integrated assessment into their everyday practices, the topic continues to evolve. In particular, learning outcomes at the student and program levels are now the primary way of measuring student success, as well as the effectiveness of individual courses, instructors, programs, colleges, and institutions. Though assessment may look different at each level, ranging from student, to course, to program, the assessment process is recognized as playing a crucial role in the totality of student learning (Praslova, 2010).

Assessment has been defined in a number of ways as applied to various contexts and purposes. In the past, whereas assessment often referred solely to assigning grades in the classroom, it now occurs at multiple levels. Data collected help refine student learning through course revision, program change, and teaching approaches. Assessment provides valuable feedback regarding the effectiveness of specific courses and programs, allowing educators to see the link between their courses and what students are (or are not) learning (Hock & Mellard, 2011). Thus assessment is more than determining whether students learn what educators intend for them to learn; it also involves improving both future instruction and student learning.

In an era of increasing accountability within higher education, the development of measurable student learning outcomes (SLOs) and the meaningful assessment of those SLOs are at the core of student learning (McNeill, Gosper, & Xu, 2012). Although there are pragmatic reasons, such as external stakeholder requirements, that increase the importance of SLOs, there are also growing internal institutional pressures. North American universities and colleges are realizing "the centrality of student learning to fulfilling their mission and to meeting . . . expectations that investments in higher education do in fact result in students acquiring knowledge, skills, attitudes, and behaviors aligned with course and program objectives" (Bers, 2008, pp. 38–39). Assessment ultimately drives the development of pedagogy (teaching methods, instructional theories, and learning objectives), which in turn provides stakeholders with evidence of a course or program's success. This paper focuses on the assessment of SLOs derived from a survey validation study for a new elective in library and information science. It also details the approach used, the findings, and the implications for other course assessments in North American LIS education.

Literature Review

Within recent years, a growing body of literature has emerged that recognizes the crucial role assessment plays in supporting student learning (McNeill et al., 2012). However, much of the higher education literature concerning assessment approaches it from a theoretical standpoint, with significantly fewer publications that present "actual assessment results or detail how these results were used by a department or institution" (Bers, 2008, p. 31). Nevertheless, recent works provide insight that shapes how educators understand the purpose and practice of carrying out assessments and aid in building effective assessments and measurable, evidence-based SLOs (Crisp, 2012).

While assessments can be formal or informal, direct or indirect, and carried out through a variety of methods, all assessments should be informed by standards or goals that are articulated in SLOs developed either by faculty members or by institutions as a whole (Ewell, 2006; Walvoord & Banta, 2010). Though some sort of assessment always occurs within education, in the 1990s the rhetoric and practice surrounding it began to evolve. A greater focus was placed on student-centered learning and on the need to tie course content to SLOs that develop knowledge, skills, and abilities that are transferable beyond one specific learning environment and are based on measurable evidence (Boud & Falchikov, 2005).

Within the United States, there are expectations from government and accrediting bodies that institutions should embody a culture of assessment, with attitudes and behaviors reflecting a support of assessment at all stages (Weiner, 2009). In K–12 schools, the Department of Education emphasizes assessment: the evaluation of courses and programs is at the forefront of the No Child Left Behind Act (NCLB) of 2001 and the more recent Common Core State Standards Initiative (National Governors Association, 2010), which has been adopted by many states. As for higher education, the 2002 Greater Expectations National Panel Report published by the Association of American Colleges and Universities (AACU) asserts that "assessment is part and parcel of the teaching/learning process" (Ramaley & Leakes, 2002, p. 39). Likewise, the Southern Association of Colleges and Schools (SACS) Commission on Colleges stresses the value of assessment, advocating for a teaching-learning-assessment cycle that begins with learning goals and opportunities and ends with using the results of assessments to take action and inform new goals for student learning. SACS further underlines the focus on measurable SLOs and evidence-based assessment, and the need to inform educators on how to carry out successful, purposeful assessments that support a "culture of improvement and innovation" (Suskie, 2013, pp. 1–6).

Assessment in ALA-Accredited Schools

Standards of graduate-level accreditation bodies reiterate the undergraduate-level emphasis on assessment. In 1992, the American Library Association (ALA) Office of Accreditation (OA) issued standards for the accreditation of many North American library schools that emphasize a continuous planning process involving the assessment of student learning as one of the primary criteria for accreditation (American Library Association, 1992). During a study of alternate means of measuring teacher effectiveness, researchers at the University of South Florida developed an assessment model that focuses on SLOs as part of a "continuous improvement cycle" (Perrault, Gregory, & Carey, 2002, p. 270). This cycle involves not only the creation of a mission, goals, and objectives, but encompasses a "system that includes determining desirable programmatic outcomes, measuring how well those outcomes are achieved, and applying results for improvement of program delivery" (Perrault et al., p. 270). This framework and ones like it reiterate the fact that assessment of SLOs must focus not only on student achievement, but must also include a continuous review of the curriculum and teaching strategies to allow for "ongoing appraisal, to make improvements, and to plan for the future" (Applegate, 2006, p. 324).

Assessment is less standardized and visible at the graduate level in library and information science, and for each ALA-accredited program, compared to other professional degrees. LIS in North America operates with more variety than other fields due to the diverging foci of individual programs' missions and clientele. Each new iteration of the ALA standards remains influential to the overall demonstration of student learning within core courses at each school (Orzoff, Peinovich, & Riedel, 2008). Still, these standards focus on core courses taken by all students and do not directly inform the SLOs of electives offered in this multi-disciplinary field. With a lack of standards handed down from accrediting bodies or at the programmatic level for elective courses in the field, it is vital that instructors of these courses acknowledge the value of assessment, the instructor's role in the process, and the potential benefits of sharing some commonalities in SLOs. Instructors must carry out some type of assessment and continuously review their teaching methods and course material to ensure that students are gaining the knowledge, skills, and abilities from elective courses that will ultimately aid them in their studies and prepare them for their future careers.

This paper focuses on a course evaluation conducted via a value-added assessment. Value-added refers to changes in student learning over an interval of time, with students being subject to the same evaluation before and after a "higher education intervention" (Fulcher & Willse, 2007, p. 10). Value-added assessments are typically carried out through the use of pre-tests and post-tests. A 2006 Department of Education report states that "results of student learning assessments, including value-added measurements that indicate how students' skills have improved over time," are important factors in building successful educational systems (Spellings, 2006, p. 24). The American Association of State Colleges and Universities (AASCU) likewise advocates for value-added methods, recognizing them as "viable measures" of student learning (Boyas, Bryan, & Lee, 2012, p. 429). As the use of pre-tests and post-tests increases, change scores witnessed from value-added methods can be used in conjunction with other means of assessment to determine the overall effectiveness of courses and programs. For this type of change to be measured, SLOs must be developed and questions written to assess the outcomes.

Background

Within the field of library and information science, preparing students to work with geographic information is a growing area of focus, as the supply of and demand for geospatial data are both increasing (Bishop, Grubesic, & Prasertong, 2013). As a result, the Laura Bush 21st Century Librarian Program of the Institute of Museum and Library Services (IMLS) funded the Geographic Information Librarianship (GIL) project at Drexel University and the University of Tennessee, designed to bolster the data curation education of librarians and to improve their abilities to deal with the varied resources generated by geospatial tools. The first step of the collaborative project between the two programs was to build a model that allows existing subject area core competencies and job incumbents to inform coursework. This survey validation approach has been used extensively by other professions, such as teaching, nursing, and law, to inform curricula with empirical data from current workers (Raymond, 2005).

The survey validation used core competencies created by members of the American Library Association's Map and Geography Round Table (MAGERT) in 2008. In response to the sea change in geographic information types and uses, the MAGERT Education Committee created the core competencies with the purpose "to assist in the professional development life cycle: from student/faculty curriculum development to new professional to mid-career professionals or others who are new to the specialization, as well as administrators or personnel officers to assist in job descriptions and hiring in this area" (MAGERT, 2008, p. 2). In 2011, members voted to change the name to the Map and Geospatial Information Round Table (MAGIRT) to reflect a new focus and direction. The development of the core competencies included a draft created by twelve members of the Education Committee that was subsequently circulated to the Round Table's members (~300) for suggestions; the final document to guide the group's future education endeavors was then made available for educators and practicing librarians. This study used the core competencies to guide curriculum development, especially student learning outcomes.

By surveying practicing librarians, archivists, and other information professionals working with geographic information in 2012, mean scores were produced and the most important core competencies were adapted into SLOs for a subject area elective. A total of 157 librarians, archivists, and other information professionals working with geographic information responded to the survey. The findings led to the creation of the SLOs used for this course-level assessment; a more detailed discussion of the approach and findings can be found in another publication (Bishop, Cadle, & Grubesic, 2015). This paper focuses only on assessment of the existing SLOs derived from the survey validation study, listed in Table 1.

These SLOs are derived from the most important survey-validated core competencies from MAGIRT. This reduced set is necessary, in part, because covering the seventy-five original core competencies in one elective would be impossible. In addition, the course was an attempt to teach a broader range of information professionals working with geographic information.

Table 1. Student Learning Outcomes from Survey Validation of Core Competencies.

1. Geography and Cartography
   1.1. Students will demonstrate geographic and cartographic principles, including geographic and cartographic scale, projection, grids, and geographic coordinate systems
2. Collection Development/Records Appraisal/Collection Maintenance
   2.1. Students will demonstrate knowledge of local, state/provincial, federal and international mapping agencies and private map publishers; map series and similar publication patterns; and gazetteers, data portals, volunteered geographic information, and aspects of the Federal Depository Library Program
   2.2. Students will select strategies to obtain different types of maps, imagery, and other geospatial data
   2.3. Students will describe copyright considerations and the ability to negotiate licensing agreements for databases and collections of geographic information
   2.4. Students will explain how to assess the strengths and specialties in a collection and the needs of users to inform collection development
   2.5. Students will describe proper materials handling, especially for rare and fragile materials
3. Reference and Instruction
   3.1. Students will demonstrate the ability to locate geospatial data and software support
   3.2. Students will gain awareness of GIS tutorials and training
   3.3. Students will develop and deliver geographic information consultations
4. Metadata/Cataloging
   4.1. Students will explain metadata standards, schemas, and issues
   4.2. Students will understand and interpret existing metadata in geospatial records
   4.3. Students will define projections, coordinate systems, and other physical characteristics of cartographic items to create metadata records
   4.4. Students will interpret and calculate cartographic scale

Methodology

This paper provides actual value-added assessment results for these SLOs for one elective offered for the first time at these two iSchools. A pre-test/post-test was chosen to assess the SLOs, as most fit neatly into questions with one clear answer (e.g., scale question: Which representative fraction shows "one inch equals one mile"?). The Geography and Cartography questions were easily written because of the existing foundation developed by the University Consortium for Geographic Information Science and its Geographic Information Science & Technology Body of Knowledge (BoK) (DiBiase, DeMers, Johnson, Kemp, Luck, & Plewe, 2006). The BoK was consulted to inform SLOs related to Geography and Cartography because the MAGIRT core competencies included only a brief paragraph noting the obvious criticality of knowing these concepts. The BoK increased the effectiveness and standardization of GIS education by providing example SLOs and test questions to guide curriculum development and assessment (DiBiase et al., 2006). An equivalent encyclopedic list of SLOs does not exist for library and information science, so the latter three sections, (2) Collection Development/Records Appraisal/Collection Maintenance, (3) Reference and Instruction, and (4) Metadata/Cataloging, required writing new test questions. Table 2 provides course context for the corresponding sections and the number of test questions, and Table 3 gives detailed examples of how student learning outcomes were written into test questions.

Once created, these test questions were used to assess student understanding of concepts in the elective class at Drexel and UT. Both programs are conducted online, and the same course management software was used to administer the tests to all enrolled students. Students were invited to take part in a value-added assessment that consisted of two questionnaires: (1) a pre-test in the first week of each course and (2) a post-test at the completion of the course. Students were informed that these tests would in no way affect their course grades and that participation was completely voluntary. Fortunately, 100% of students in each elective participated in both pre- and post-tests—14 from the University of Tennessee and 11 from Drexel University.

Those students who chose to take part in the project completed the two tests, each consisting of 30 multiple choice questions to be answered within a time limit of 60 minutes. The questions on the pre- and post-tests were the same within each course, allowing a change score to be calculated for each individual as well as at the course and overall project levels. Each question correlated to a specific SLO, allowing section- and outcome-specific findings to inform future iterations of the course where student learning varied. All results were kept confidential and anonymous and are reported here in the aggregate.
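The example scale item above reduces to unit arithmetic: a representative fraction expresses map and ground distance in the same unit, so "one inch equals one mile" converts the mile to inches (5,280 ft × 12 in = 63,360 in). A minimal sketch of that conversion (the helper function name is ours, for illustration only, not part of the study's materials):

```python
# Convert a verbal map scale ("X inches equal Y miles") into a
# representative fraction (RF), which uses the same unit on both sides.

INCHES_PER_MILE = 5280 * 12  # 63,360 inches in one mile

def representative_fraction(map_inches, ground_miles):
    """Return the RF string '1:n' for a verbal scale."""
    ground_inches = ground_miles * INCHES_PER_MILE
    return f"1:{round(ground_inches / map_inches):,}"

# "One inch equals one mile" -> 1:63,360, answer (a) on the Table 3 item
print(representative_fraction(1, 1))
```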
Table 2. Number of Test Questions Related to Each SLO Section.

Course Section                                                        SLOs      # of Test Questions
1. Geography and Cartography                                          1.1       12
2. Collection Development/Records Appraisal/Collection Maintenance    2.1–2.5   7
3. Reference and Instruction                                          3.1–3.3   5
4. Metadata/Cataloging                                                4.1–4.4   6

Table 3. Examples of Test Questions Correlated to Student Learning Outcomes.

1.1. Which representative fraction shows "one inch equals one mile"?
     (a) 1:63,360  (b) 1:21,120  (c) 1:24,000  (d) 1:48,000
2.1. __________ allows geospatial data and/or software to be disseminated to an institution's students, faculty, and staff who provide valid identification.
     (a) A site license  (b) A single-use license  (c) A general public license  (d) A workstation license
3.1. A remote sensing faculty member needs 1-meter orthoimagery for a rural part of the U.S. The data must be free and be able to be delivered quickly. Which resource would be best to meet this user's needs?
     (a) The National Map  (b) GeoCommunity  (c) MapServer  (d) Web Map Tile Service
4.1. The Content Standard for Digital Geospatial Metadata (CSDGM) increases the value of geospatial data by allowing researchers to do each of the following except:
     (a) automate metadata creation  (b) avoid duplication of work  (c) limit liability  (d) create institutional memory

Findings

The findings from this value-added assessment in both electives are characterized in Tables 4–6. A total of twenty-five students completed the assessment within the two courses, and the average scores and changes from pre- to post-test are included as descriptive statistics. Table 4 provides the average pre-test/post-test scores from the two programs, as well as the average percentage change in score (the quantitative value added) from pre-instruction to post-instruction. Table 5 characterizes average student scores within each course section and correlating SLO, allowing one to see the specific areas that were comprehended more fully and those that should be targeted in the next instance of the course. Table 6 breaks this down further, showing average student scores on specific objective sub-sections for both programs.

Using a multivariate test with the pre-test and post-test results for both programs, we found a significant difference in the increase in scores (F(1, 23) = 20.913, p < .001). The University of Tennessee students started off much lower, but caught up to the Drexel University students in the post-test. This may be because Drexel students had prior exposure to other GIS courses and content.
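The "overall change" figures reported in Table 4 are straightforward arithmetic on the published mean scores: points gained from pre-test to post-test, expressed as a share of the 30 available points. A quick check using the reported means (note that Drexel's 2.27/30 rounds to 7.6%, a hair off the table's 7.5%, presumably because the published means are themselves rounded):

```python
# Value added = mean post-test score minus mean pre-test score, out of 30
# points (Table 4). Percentage = point change as a share of the 30 points.
TOTAL_POINTS = 30

means = {  # program: (mean pre-test, mean post-test)
    "University of Tennessee": (14.79, 20.07),
    "Drexel University": (18.55, 20.82),
    "Combined": (16.46, 20.40),
}

for program, (pre, post) in means.items():
    change = post - pre
    print(f"{program}: +{change:.2f} points, or {100 * change / TOTAL_POINTS:.1f}%")
```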
However, we looked for differences across the two programs' overall scores; that difference was not significant, and as a result further analysis into differences between the two programs was not explored.

To inform how the course performed across its different student learning outcomes, we looked at pre-test/post-test differences for each of the four course sections. The t-test results show that a significant change occurred for the Geography/Cartography (p < 0.001) and Metadata/Cataloging (p < 0.003) objectives. However, the other two sections showed no significant change. This knowledge may be learned in other courses across the curricula in each program, because topics and assignments related to Collection Development/Records Appraisal/Collection Maintenance and Reference and Instruction appear across courses, as these are relevant throughout different types of information agencies. Although scores did increase for all four sections of the course, the student learning outcomes most specialized to this elective—Geography and Geospatial Metadata—were those whose change was statistically significant.

When comparing the pre-test and post-test results from the value-added assessment at each institution, one can see that the knowledge gained through the elective course was reflected in student achievement. Breaking down the scores by each objective (Table 6) shows that, in rare cases, scores did decrease from pre-test to post-test. Nevertheless, overall student scores improved for questions relating to each SLO (Table 5), and students at both programs raised their scores by an average of 3.94 points, or 13.1 percent (Table 4).

Table 4. Average Scores and Overall Change from Pre-Test to Post-Test.

Program                   Pre-Test Score   Post-Test Score   Overall Change
University of Tennessee   14.79 / 30       20.07 / 30        +5.28 points, or 17.6%
Drexel University         18.55 / 30       20.82 / 30        +2.27 points, or 7.5%
Combined                  16.46 / 30       20.40 / 30        +3.94 points, or 13.1%

Table 5. Average Scores and Overall Change from Pre-Test to Post-Test by Section.

Course Section and SLOs                                                              Combined Pre-Test   Combined Post-Test   Overall Change
1. Geography and Cartography (SLO 1.1)                                               6.69 / 12           8.56 / 12            +1.87 points, or 15%
2. Collection Development/Records Appraisal/Collection Maintenance (SLOs 2.1–2.5)    4.31 / 7            4.88 / 7             +0.57 points, or 8%
3. Reference and Instruction (SLOs 3.1–3.3)                                          2.85 / 5            3.48 / 5             +0.63 points, or 12.6%
4. Metadata/Cataloging (SLOs 4.1–4.4)                                                2.38 / 6            3.48 / 6             +1.1 points, or 18%

Table 6. Average Scores and Overall Change from Pre-Test to Post-Test by Each SLO.

SLO    Combined Pre-Test Score   Combined Post-Test Score   Overall Change
1.1.   6.69 / 12                 8.56 / 12                  +1.87 points, or 15%
2.1.   1.88 / 3                  1.64 / 3                   –0.24 points, or 8%
2.2.   0.54 / 1                  0.56 / 1                   +0.02 points, or 2%
2.3.   0.73 / 1                  0.92 / 1                   +0.19 points, or 19%
2.4.   0.85 / 1                  0.80 / 1                   –0.05 points, or 5%
2.5.   0.81 / 1                  0.96 / 1                   +0.15 points, or 15%
3.1.   1.31 / 2                  1.60 / 2                   +0.29 points, or 14.5%
3.2.   0.58 / 1                  0.36 / 1                   –0.22 points, or 22%
3.3.   0.96 / 2                  1.52 / 2                   +0.56 points, or 28%
4.1.   0.31 / 2                  0.76 / 2                   +0.45 points, or 22.5%
4.2.   0.85 / 2                  1.36 / 2                   +0.51 points, or 25.5%
4.3.   0.62 / 1                  0.76 / 1                   +0.14 points, or 14%
4.4.   0.62 / 1                  0.60 / 1                   –0.02 points, or 2%

Conclusions and Implications

Despite much discussion of assessment in recent years, there remains a gap in the literature, with few works presenting actual assessments or their results. This paper works toward filling this gap by offering a view of a real assessment as applied to a graduate elective course taught in two programs involved in North American LIS education. Using a value-added method and change scores on the courses' thirteen student learning outcomes, the instructors of each course were able to compare the pre-test and post-test scores to measure improvement and to what degree it occurred. This approach not only allowed the instructors to immediately gauge student knowledge via the pre-test, but will more importantly allow them to use the findings from the change scores when planning future iterations of the course. The results from this specific study demonstrate that the development of measurable, evidence-based SLOs and the implementation of a student assessment (value-added or otherwise) are valuable practices at the graduate level to improve student learning. With evidence of student understanding and information on gained knowledge, skills, and abilities, educators can better plan instruction and prepare students to succeed in the course, in the program, and into the workforce.

At the graduate level, methods of and reasoning for assessments tend to vary across fields and institutions. This is the case within LIS as well, with assessment of student learning at the institution, program, and course levels often dependent on local assessment requirements and instructor evaluations. Accrediting bodies, in the case of LIS the ALA Committee on Accreditation (COA), perform assessments of programs, led by external practitioners and academics, that determine whether each program qualifies for accreditation and meets the Standards for Accreditation of Master's Programs in Library and Information Studies (American Library Association, 2014). However, these national standards focus on the core competencies covered in core courses and do not directly inform the other SLOs that exist in a myriad of elective courses. This is especially the case within LIS, as its multi-disciplinary nature lends itself to a variety of electives and specialized topics that are not taught in every LIS program. This specialization means electives are not even a secondary concern of COA standards, but those teaching in similar areas could benefit from cross-program collaborations like this project to develop some standardization across the field's electives.

The results of this study demonstrate that creating SLOs and using them to evaluate teaching effectiveness is a valuable practice in elective courses, and one that could be standardized across graduate programs regardless of topic. While electives are varied within the LIS field, courses that cover common topics such as data curation, information visualization, data science, health informatics, bioinformatics, and geographic information are increasingly taught in greater numbers. The frameworks of such courses currently differ from program to program, and both instructors and students would likely benefit from the standardization of SLOs across institutions that offer similar electives. This would allow for frequent assessment of student learning and the improvement of course content to better fill curricular gaps. Value-added assessments such as the one included in this study offer a simple, objective method of determining whether students have learned what the instructor intended for them to learn in the class. If paired with standardized SLOs, instructors could more effectively use their time to plan the course to best benefit students, and each program would not need to reinvent course objectives, SLOs, and evaluation techniques. Student success should also increase, as courses will be tailored to where most students are in terms of background knowledge and where personal student achievement advancements are needed.
Through Knowledge (BoK) was created to increase grant making, policy development, and re- the effectiveness of GIS education, con- search, IMLS helps communities and indi- taining sample SLOs and test questions viduals thrive through broad public access in the subject areas of geography and car- to knowledge, cultural heritage, and life- tography. Similar to frameworks in other long learning. The authors would also like fields, this document guides coursework to thank all the students that participated and student assessment by allowing in- in the courses and Hallie Pritchett, instruc- structors to build off of professional SLOs tor, and Head of the Map and Government and effectively evaluate student learning. Information Library Map at the University An equivalent BoK framework does not of Georgia. exist within LIS for the core courses or electives. While ALA does provide ac- References creditation standards, an encyclopedic list of SLOs or assessment questions has not American Library Association. (1992). Standards yet been created or adopted for use by LIS for accreditation of master’s programs in library and information studies. Retrieved from http:// programs. The Atlas of New Librarianship www.ala.org/accreditedprograms/standards/stan- offers a vision for the field that includes dards visual representations, statements about American Library Association. (2014). Assuring librarianship, and discussions of theory quality, innovation, and value in library and and practice (Lankes, 2011). Although information studies education. Retrieved from this work should be commended, it is not http://www.ala.org/accreditedprograms/home designed to be a pedagogical assessment Applegate, R. (2006). Student learning outcomes assessment and LIS program presentations. Jour- tool paired with standardized learning ob- nal of Education for Library & Information Sci- jectives and questions for the education ence, 47(4), 324–336. aspects of the field. Bers, T. H. (2008). 
The role of institutional assess- As this study demonstrates, clear, ef- ment in assessing student learning outcomes. fectual SLOs, and measurable, evidence- New Directions for Higher Education, 141, 31– based assessments are important factors 39. doi: 10.1002/he.291 in developing and improving courses that Bishop, B.W., Grubesic, T. H., Prasertong, S. in turn improve institutions and benefit (2013). Digital curation and the GeoWeb: An emerging role of Geographic Information Librar- all students. With this in mind, it is clear ians (GILs). Journal of Map and Geography Li- that the development of SLOs that can be braries, 9(3), 1–17. measured, assessed, and put into practice Bishop, B. W., Cadle, A. W., & Grubesic, T. H. would be beneficial to the LIS field, both (2015). Job analyses of emerging information within its core courses and its electives, as professions: A survey validation of core compe- they would ultimately improve teaching tencies to inform curricula. Library Quarterly, 85(1), 64–84. effectiveness. This would allow for a cy- clical approach to the teaching and learn- Boud, D., & Falchikov, N. (2005). Redesigning as- sessment for learning beyond higher education. ing experience, measuring weakness and Research and Development in Higher Education, strengths across the curricula by sharing 28, 34–41. this piece of the teaching workload across Boyas, E., Bryan, L. D., & Lee, T. (2012). Condi- instructors in the same area. tions affecting the usefulness of pre-and post-