
Developing a Consistent Method for Evaluating Orthopaedic Hip Implants

Frank Murphy, Kx Simulation Technologies, Inc.
Paul R. Tomaszewski, DePuy Orthopaedics, Inc.

Abstract

This paper describes the purpose, evolution, and implementation of a custom user interface for the analysis of orthopaedic [artificial] hip stem designs. The orthopaedic implant industry is regulated by the U.S. Food and Drug Administration [FDA], and thorough testing is required before an implant is approved for general use. Typically, this involves a variety of physical and analytical assessments, including the determination of sustainable bending stresses in a particular hip stem and the corresponding peak load-carrying capacity. Ultimately, a physical prototype is fatigue tested and will be released only after passing all required tests.

A combination of closed-form textbook solutions, often in spreadsheet format, has traditionally been used for initial evaluations. The difficulties and dangers in this approach are manifold and commonly cited: they include poorly understood and inappropriately applied formulas, as well as the limited nature of the resulting data. Despite these hazards, up-front analysis of each hip stem design is necessary to reduce the number of prototypes that are manufactured and tested, and the ability to complete the analysis phase quickly is critical to shortening design cycles.

A customized solution was needed that would work within the context of the current finite element tools to provide reliable design information directly to the designers. This was accomplished by writing a UIDL-based [User Interface Design Language] "wrapper" for ANSYS [Canonsburg, PA] simulation software. It guides the user through a series of input screens for the selection and analysis of a variety of common hip stem evaluations. Through the use of drop-down menus, engineers are limited to evaluations that make sense for their particular product family.
In a convenient and consistent way, the tool provides the means for obtaining and documenting a wealth of information, some of which was previously unavailable to designers without analyst intervention. The criticality of computational analysis in the orthopaedic industry is established, various strategies for meeting that need are presented, and one solution, that of codifying a set of standard physical tests into virtual form, is described. Minimally trained engineers are then able to investigate their designs more quickly, more comprehensively, and more consistently.

Note: This paper was originally presented at the NAFEMS World Congress in Orlando, FL, May 29, 2003.

Introduction

Orthopaedic implants are devices that replace skeletal structures and restore biomechanical function. The hip implants considered herein are commonly made of expensive alloys such as CoCrMo and Ti6Al4V. Reducing time and cost in the design and product life phases is no less important here than in most other industries. Given the direct impact on human life, however, an additional premium is placed on the performance of these devices. Ultimately, the design is tested under fatigue loading conditions and will go into production only after passing all required assessments.

Figure 1. Examples of Orthopaedic Hip Implants

Throughout the design cycle, many common engineering approaches are utilized to better understand function. One means of acquiring stress and strain information is finite element analysis [FEA]. While such a tool is powerful, experienced analysts are well aware of the variety of results that can be generated for the same problem when it is attempted by several designers/engineers. Hence, a key concern in the implementation of an automated analysis tool is the degree to which the results can be insulated from inter-operator variability.
The goal of this project is a software tool robust enough to resist this variability while providing an elevated level of engineering information for design evaluation. From such a platform, additional applications are easily added. Furthermore, the use of various uncontrolled spreadsheets, with the inherent drawbacks previously mentioned, can be eliminated or curtailed.

Background

Historical FEA Utilization

Having decided to pursue an FEA solution, it is wise to gain a historical perspective on finite element analysis in support of product development:

• Twenty years ago - beginnings of FEA; no dedicated personnel; homegrown software; product development analysis done in a unique way by each engineer, using classical techniques;
• A decade ago - common, though not regular, FEA use; typically a handful of concurrent analyses, total; most engineering analysis still done by product development engineers;
• Recent years - use of formula-based hand calculations varies widely among engineers; FE analysis has become routine, due primarily to stable, time-tested software, increased understanding and acceptance of FE capabilities, and a desire for expanded insight into design function [immediate] and performance [longer term].

Rationale for a "Scripted" Solution

The challenge of meeting the increased demand for analysis output can be addressed in a variety of ways:

• Hire more analysts [spread workload among the highly trained];
• Outsource [spread workload among the questionably trained] - can be expensive, and an honest and efficient relationship must be established and maintained;
• Train more designers [spread workload among the minimally trained] - adequate training and oversight are large concerns.

In the end, all three strategies will be employed. This paper focuses on the last one: training in-house non-analysts to do proper analysis. Training, in FEA or most anything else, can cover either a broad set of skills or a narrow one.
Here, training refers to application training rather than theory training.

Requirements

Synthesizing Mandates

From the analysts' perspective, the tool must be accurate, appropriate, consistent, and easily supported. Designers also desire these things, but put a premium on the tool being fast and easy [accuracy secondary] and providing relevant results that can be readily integrated into existing reporting protocols. We will consider each of these analysts' and designers' requirements in turn and explain how they were met in an automated analysis tool.

Analysts' Requirements

Accuracy

Our use of the term accuracy relates to the degree to which the results match those of a true physical sample under identical conditions, could such a situation be attained. This points to the issue of convergence. As an enhancement, error-based mesh adaptation could be implemented. In the current embodiment, however, trial-and-error mesh refinements made during development have produced a density that errs on the conservative side. The fact that these are linear analyses, and that contemporary desktop computing power is adequate, allows us to include more degrees of freedom than are probably necessary.

Appropriateness

Appropriateness can be thought of as akin to validity: that aspect of a modeling approach which guarantees, in practical terms, that the necessary details of the physical test are properly conveyed to the virtual model. In providing testing results [physical or otherwise] to the FDA, many of our primary joint products are required to undergo, at a minimum, a few standard tests. Considering a hip implant, both the ISO and the ASTM organizations have defined standards for the evaluation of stem structural integrity. The boundary conditions [BC] for these tests are such that they can be readily defined within finite element analysis.
Throughout development, various subtleties of BC application were considered, in which the desire to maintain solution linearity and overall simplicity was balanced against the requirement for consistency in the result. One example is the way in which load is applied to the stem. In the physical test, a load cell applies force through a bearing-and-head construct. This produces three non-linear interfaces: between the load cell and bearing, between the bearing and head, and between the head and stem. One obvious way to reduce this complexity is to find an appropriate way to apply the forces directly to the bearing or head. In its current form, this idea has been taken one step further: loads are applied directly to the taper portion of the stem using constraint equations. As ideas like this were considered, analyses were run under both the original and proposed conditions, with all else equal, and the results were evaluated to ensure that there was no detriment to the outcome.

Consistency and Support

One main goal for this tool was that it continue to see increased usage by the product designers. As much of the tool as possible had to be automated, so that, given the same analysis intent, two different users would obtain consistent results. Toward this end, selection of model inputs via drop-down menus was favored over fill-in-the-blank input fields. This same feature [i.e., drop-down menus] also makes the tool easier to support, as there is a much more limited set of trouble spots to investigate when problems arise, compared to a full-blown analysis package. Consistency in results reporting was also important, and was obtained using automatic, customized [not customizable] report generation. More will be said on the report in the next section.

Designers' Requirements

Fast and Easy

Turning to the designers' requirements, the software must be, first and foremost, fast and easy.
While information on regions of peak stress, and their magnitudes, is of great importance, the design engineer's project responsibilities dictate that only a small portion of their time may be dedicated to analysis efforts. Any translation problems, GUI uncertainties, crashes [or even less drastic software glitches], and so on will almost certainly result in great reluctance to consider the tool in the future. It is therefore important that users not suffer through incremental tweaking of the tool as they encounter problems. Instead, this process is taken off-line, and the affected user is returned not just the minimal fix but also the subsequent steps toward a complete answer. A drop-down menu scheme certainly helps make the process straightforward. This customized menu system reduces the default menus of the base analysis code to the minimum set required to accomplish the goal of linear hip stem analysis, ensuring that only relevant material choices are included and that stem configuration inputs are limited to valid component combinations. Of course, the developer or an authorized user can change any of these hard-coded items if a special case needs to be considered.

Result Relevancy

In considering the designer requirement for relevant results, we return to the "what" and "how" of report generation. Informally, the results are used in a variety of ways to evaluate and improve the designs. In a more formal sense, they are valuable both as an addition to the design history file and as a portion of any possible submission to the FDA. This process is no different from any other design process, regardless of how the results in question are generated. Von Mises and first principal stresses are valuable results, depending on the expected failure mode. To these outputs we add displacement information, to provide an assessment of the overall behavior of the device.
A plot of expected fatigue behavior, i.e., whether or not the stem will last 10 million [10M] cycles given the endurance limit, is also included. The specific default results plots mentioned above are automatically included in the report. As the user evaluates other specific results, they have the option to include the corresponding plot in the base report. This is more fully explained below.

Methods

Overview

ViStA is used to simulate three typical fatigue tests that are conducted on hip stem implants. These tests are similar in that they all include a cyclic load applied through the head of the hip stem and a potting material into which the stem is set. The differences among the three tests are the magnitude and direction of the cyclic load and the orientation and height of the potting material. Although the lab tests incorporate cyclic fatigue loading, they can be simulated using simple, static analyses. Stress results from these analyses can then be compared to the fatigue strength of the stem material to determine whether it meets the criterion of 10M cycles. These are very well understood analyses that correlate solidly with test data. However, their implementation in an easy-to-use analysis tool for the casual user required a great deal of up-front planning and programming. The development of ViStA was therefore broken down into four distinct phases:

Phase 1: Data Gathering / Prototype Creation
Phase 2: Geometry Translation / Meshing
Phase 3: Programming / Interface Creation
Phase 4: Documentation

Phase 1

A tool like ViStA is of little value if the engineers for whom it was developed do not use it. Therefore, it is critical to create a tool that is user-friendly and produces information that is accurate and relevant to the engineer. The engineering analysis group at DePuy primarily dictated the simulation specifications of ViStA, but usability issues were identified with the input of the engineering user community.
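The static screening described in the Overview above, comparing peak stress from a static solution against the material's fatigue strength to judge the 10M-cycle criterion, can be sketched as follows. The function name, stress values, endurance limit, and safety-factor form are all illustrative assumptions for this sketch, not DePuy acceptance values or ViStA internals.

```python
# Illustrative fatigue screening: compare peak stress from a static FE
# solution against the material endurance limit at 10M cycles. All
# numbers below are placeholders, not actual acceptance criteria.

def passes_10m_cycles(peak_stress_mpa, endurance_limit_mpa, safety_factor=1.0):
    """True if the peak cyclic stress stays at or below the endurance
    limit reduced by the chosen safety factor."""
    allowable = endurance_limit_mpa / safety_factor
    return peak_stress_mpa <= allowable

# Assumed example values [MPa]: peak stress from a static run, and an
# illustrative endurance limit for a titanium-alloy stem.
peak = 310.0
endurance = 500.0

print(passes_10m_cycles(peak, endurance, safety_factor=1.5))  # True  (310 <= 333.3)
print(passes_10m_cycles(peak, endurance, safety_factor=2.0))  # False (310 >  250.0)
```

Because the analyses are linear and static, a single solve per test configuration feeds this pass/fail check directly.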
The primary task of Phase 1 was gathering information and recommendations from potential users to ensure that the tool would meet their goals and would be utilized effectively during hip stem design and development. At the onset of the project, potential users were gathered together and presented with two basic prototype options. A "black box" approach would solicit user input for analysis conditions through a web-based form and then provide an HTML report. An alternative, interactive tool would allow the user to see the geometry, rotate the part, and view a variety of results. This latter paradigm proved more appealing, as it would enable users to interrogate their designs and better understand how and where they might fail.

Figure 2. Interactive Prototype

Phase 2

Two of the biggest obstacles in creating customized engineering analysis tools that work with CAD geometry are translating the geometry cleanly into the finite element software and successfully meshing the parts for solution. Although tightly connected, these two tasks are distinct in their challenges. Clean geometry translation has always been difficult to achieve on a consistent basis, and even when this obstacle is overcome, other geometry-related issues surface. Often geometry appears to have been translated cleanly, only to reveal small problems that cause errors during geometric or meshing operations. Phase 2 focused on finding the most effective method of cleanly translating geometry from Unigraphics [UG] into ANSYS and ViStA. Ten sample stems were used to help determine the best approach. These stems varied in complexity, allowing both the geometry translation and the meshing algorithms to be fully tested. There are a variety of ways to translate geometry from UG into ANSYS, including using a generic neutral format such as IGES. Direct IGES translation from UG to ANSYS proved to be of poor quality and inconsistent.
Few of the test stems could be read into ANSYS cleanly using this method. Another relatively robust translation path is to use the UG-specific Parasolid format to read the parts directly into ANSYS. Direct Parasolid translation from UG into ANSYS proved to be a better solution than IGES: in tests, geometry was typically read into ANSYS more cleanly, and volumes were created a higher percentage of the time. However, there were often small lines or sliver areas that caused problems in the meshing routines. A software tool called CADfix [ITI, Milford, OH] was then investigated. CADfix is a geometry translation tool that also incorporates routines to clean and repair geometry, which is especially useful when translating CAD data into finite element analysis software. CADfix facilitates the removal of small lines and sliver areas, reducing the possibility of mesh failures and creating more efficient models for solution. All ten of the stems were tested using CADfix. Eight of the ten were read into ANSYS cleanly and could be meshed with an appropriately chosen global element size. It was decided that using CADfix as the translation and healing tool between UG and ANSYS was the most effective way to consistently obtain accurate, meshable stem geometry inside ANSYS and, therefore, inside ViStA.

Figure 3. Typical geometry / meshing problem overcome using CADfix – sliver removal

CADfix can be run in interactive mode or as a batch routine, and it allows user-defined tolerance settings during automatic healing in either mode of operation. Using the DePuy stems that were tested, these settings were tailored to overcome geometry issues that can sometimes appear in hip stem geometry. A batch routine was incorporated into ViStA which utilizes the predefined tolerance settings and works "behind the scenes" with no user interaction.
The user simply chooses the stem geometry file to read, and CADfix launches in the background to repair, heal, and translate the geometry into ANSYS.

Figure 4. Geometry translation flowchart

Phase 3

Phase 3 of this development effort was the most time-consuming phase. Phase 1 defined the technical and usability specifications for ViStA, while Phase 2 defined the method required to achieve clean translations and successful meshing of parts. In Phase 3, the information gathered and the methods proven in the first two phases were implemented in the form of programs and macros written inside ANSYS. Two ANSYS development tools were used to create ViStA. The ANSYS User Interface Design Language [UIDL] was used to customize the ANSYS Graphical User Interface [GUI]. With this programming language, the ANSYS interface can be modified or completely replaced; in the case of ViStA, it was completely replaced with a very simple set of menus specific to the analysis of hip stems. The underlying code to read geometry, prepare the model, solve, post-process, and create a report was written using the ANSYS Parametric Design Language [APDL]. Both APDL and UIDL are powerful programming tools that are intimately intertwined. More than 60 macros were written to perform all the tasks required to simulate the lab tests. Examples of the tasks these macros complete include defining the potting material geometry and its interface with the stem, meshing all components with appropriate density, setting up the solution parameters, and annotating the graphics window with relevant information. Using APDL, macros can be written that act just like ANSYS commands. The syntax of these user-defined command macros consists of a primary command based on the name of the macro and a series of comma-delimited fields where command options are input. The user-defined commands can then be called through the ANSYS GUI or the command line.
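The command convention just described, a primary command name followed by comma-delimited option fields, can be mimicked with a short dispatcher sketch. The command names, handlers, and option fields below are invented for illustration; they are not the actual ViStA macros or their arguments.

```python
# Sketch of an APDL-style user command: a primary command name followed
# by comma-delimited option fields, dispatched to a handler function.
# Command names and handlers here are hypothetical, not ViStA's.

def set_potting(height, modulus):
    """Hypothetical handler: record potting geometry/material options."""
    return {"task": "potting", "height": float(height), "modulus": float(modulus)}

def set_load(magnitude, angle):
    """Hypothetical handler: record load magnitude and direction."""
    return {"task": "load", "magnitude": float(magnitude), "angle": float(angle)}

HANDLERS = {"POTSET": set_potting, "LODSET": set_load}

def run_command(line):
    """Split 'NAME,field1,field2,...' and call the matching handler."""
    name, *fields = [tok.strip() for tok in line.split(",")]
    return HANDLERS[name.upper()](*fields)

result = run_command("POTSET, 40.0, 2500")
print(result)  # {'task': 'potting', 'height': 40.0, 'modulus': 2500.0}
```

In the real tool the GUI builds such command strings from the drop-down selections, which is one reason the menu system can keep inputs restricted to valid combinations.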
For the purposes of this tool, all macros were called from the customized GUI created for ViStA. Some of the macros written for ViStA are quite complex, and their details are beyond the scope of this paper. The balance of this paper explains some of the tasks these macros complete, but focuses on the user interface and how it was customized to create a streamlined analysis environment.

Figure 5. ViStA menu system and graphics window

Phase 4

After capturing the needs of the users in Phase 1, defining methods for translating geometry cleanly in Phase 2, and developing the interface and underlying code in Phase 3, Phase 4 focused on documenting the process and training engineers to use ViStA effectively. ViStA was written with the casual user in mind. The menu system and workflow assist the user in preparing the simulation, solving the model, and viewing results. A full HTML help system was also implemented within the user interface. At any point during the use of ViStA, the user can click the HELP button on any of the menus to access the on-line help. The on-line help explains the methodology implemented to simulate all three lab tests and includes the steps required to complete each simulation. It is structured with the same graphical layout as the ViStA menu system, so the user can work through the help system using the same workflow logic implemented in ViStA.

Figure 6. ViStA on-line documentation

The Analysis

Although this paper is primarily focused on the development of the user interface created for ViStA, an overview of the finite element modeling details is included here to better explain the interaction of the user interface with the finite element code.

Boundary Conditions

The three possible lab test simulations that ViStA is capable of solving are similar from a finite element perspective. Although potting geometry and loading vary from test to test, the basic model preparation and boundary conditions are similar.
The model is constrained in all directions on the outside diameter and bottom surface of the potting material. The interface surfaces between the stem and the potting material are meshed identically, and nodal couples are used between them. This allows the potting material to be rendered translucent to better visualize the stresses in the stem. It also keeps the solid geometry separated, leaving open the possibility of incorporating contact between these surfaces in future releases of ViStA.

Element Selection

Due to the complexity of the geometry and the automatic meshing routines required, the stem and potting material are modeled using ten-noded tetrahedral elements. A single mass element is located at the center of the head. Constraint equations representing rigid links are automatically generated between the mass element node and the nodes on the surface of the neck taper. Thus, the entire load applied to the center of the head is assumed to transfer rigidly to the neck at the taper surface.

Materials

Originally, the potting material was not included in the analysis, and the stem was rigidly fixed along its outer surface from the top of the potting material down to the tip. To avoid exaggerated stress values at the rigid fixation, the potting material was added to the analysis at very little cost in solution time. This material has an elastic modulus about 2-5% of that of the stem material and allows more compliance along the fixation surface. Material properties for the stem and potting are assumed to act in the linear-elastic range.

Figure 7. Standard ISO test loading schematic

If more than one test simulation is chosen by the user, all models, including the various potting geometries, are incorporated into one ViStA database. The appropriate elements and loading are selected for each test simulation, and the simulations are solved sequentially.
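The rigid load transfer described under Element Selection can be illustrated with a closed-form beam sketch. The stem is idealized here as a cantilever of circular cross-section fixed at the potting level; the diameter, lever arms, and load are hypothetical numbers chosen for illustration, not values from the paper. The point is that a force at the head center and the statically equivalent force-plus-moment at the taper produce the same bending stress at a section below the taper, which is consistent with the equivalence runs described in the Appropriateness section.

```python
# Hypothetical closed-form sketch: a hip stem idealized as a cantilever
# of circular cross-section, fixed at the potting level. Moving the
# applied force from the head center to the taper, together with the
# statically equivalent transfer moment, leaves the section stress
# unchanged. All dimensions and loads are illustrative.
import math

F = 2300.0       # applied load, N (illustrative test load)
d = 12.0         # stem diameter at the potting section, mm
L_head = 80.0    # lever arm: head center to potting section, mm
L_taper = 55.0   # lever arm: taper surface to potting section, mm

S = math.pi * d**3 / 32.0            # section modulus of a circle, mm^3

# Case 1: force applied at the head center.
sigma_head = F * L_head / S          # bending stress, MPa

# Case 2: same force moved to the taper, plus the transfer moment
# M = F * (L_head - L_taper) that makes the two load sets equivalent.
M_transfer = F * (L_head - L_taper)
sigma_taper = (F * L_taper + M_transfer) / S

print(f"sigma (head load):  {sigma_head:.1f} MPa")
print(f"sigma (taper load): {sigma_taper:.1f} MPa")
assert math.isclose(sigma_head, sigma_taper)
```

This equivalence is exact in beam theory; in the full FE model it must still be checked numerically, which is why the paper reports comparison runs with all else equal before adopting the constraint-equation approach.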
