User:Csinfe/C-complementarities in the finite element method

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Csinfe (talk | contribs) at 08:10, 12 August 2008 (References). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

Introduction

The Finite Element Method (FEM) is an excellent example of a body of knowledge originating as Technology (Techne = art or craft) and rapidly becoming a commercial success. A hundred thousand or more engineers, technicians, teachers and students routinely use finite element analysis packages (of which there are thousands, ranging from small dedicated in-house programmes to large general-purpose mega-line codes) in design, analysis, teaching or study environments. There are billions of dollars' worth of installed software and hardware dedicated to finite element analysis all over the world, and perhaps billions of dollars are spent on analysis costs using this software every year. The primary archival literature has grown rapidly: at last count there were more than 50,000 papers (excluding papers on fluid mechanics), with nearly 3,800 papers published annually. There are about 400 textbooks and primers, about 400 conference proceedings and perhaps thousands of handbooks, course notes and documentation manuals. The science emerged slowly. Here, one has to make the crucial distinction between Technology and Science: while the former proceeds in an autonomous way from the perceived needs of man, society and industry and is governed by utility, the latter always deals with economy and unity of understanding of basic principles.

There are very good reasons why the science of the FEM grew in fitful and uncertain steps. It may serve us well to realize that the finite element method progressed as far as it did precisely because there was more 'art' and 'engineering' than mathematical rigour and 'science' in the early years of its development. The invention of the method by engineers in very intuitive ways was the heroic phase of the subject, led entirely by bold pioneers. As Robert M. Pirsig described it so graphically in his Zen and the Art of Motorcycle Maintenance: "Pioneers [are] invariably, by their nature, mess makers. They go forging ahead, seeing only their noble, distant goal, and never notice any of the crud and debris they leave behind. Someone else gets to clean that up." Now that the noble, distant goal has been fully realized, it is the right time to clean up. That is the main task of this account.

When Prathap and Bhashyam started work on the development of simple finite elements (their joint paper on the shear-flexible beam element appeared in 1982), there were only two canonical concepts governing the crucial discretization step which is the essence of the FEM: the completeness and continuity requirements on the special functions used to model the deformation within a structural region in an approximate (i.e. numerical) way. While these guiding rules were successful in a very large number of structural problems, it also became clear that classes of problems existed where they were recipes for disaster. (Typically, one could get numerical results that were in error by more than 99.9%! This came to be called, very reverently, the locking problem.) The scientific framework which had emerged alongside the growth of the method was unable to account for these paradoxes.
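The locking pathology described above is easy to reproduce numerically. The sketch below is a minimal construction of my own (not the authors' code): a very thin cantilever is modelled with two-node shear-flexible (Timoshenko) beam elements. Full two-point Gauss integration of the shear strain energy makes the computed tip deflection a tiny fraction of the exact value, while one-point (reduced, field-consistent) integration of the same element recovers it almost exactly:

```python
import numpy as np

def tip_deflection(n_el, L, EI, kGA, P, reduced):
    """Cantilever tip deflection using n_el two-node Timoshenko beam
    elements (dofs per node: transverse deflection w, rotation theta)."""
    n_nodes = n_el + 1
    h = L / n_el
    K = np.zeros((2 * n_nodes, 2 * n_nodes))
    # Bending stiffness: EI*(dtheta/dx)^2 with linear theta, couples only rotations.
    kb = np.zeros((4, 4))
    kb[np.ix_([1, 3], [1, 3])] = (EI / h) * np.array([[1, -1], [-1, 1]])
    # Shear stiffness: kGA*(dw/dx - theta)^2, integrated by the chosen Gauss rule.
    pts = [(0.0, 2.0)] if reduced else [(-1/np.sqrt(3), 1.0), (1/np.sqrt(3), 1.0)]
    ks = np.zeros((4, 4))
    for xi, wgt in pts:
        N1, N2 = (1 - xi) / 2, (1 + xi) / 2
        B = np.array([-1/h, -N1, 1/h, -N2])   # gamma = B . [w1, t1, w2, t2]
        ks += kGA * wgt * (h / 2) * np.outer(B, B)
    ke = kb + ks
    for e in range(n_el):
        dofs = np.arange(2 * e, 2 * e + 4)
        K[np.ix_(dofs, dofs)] += ke
    free = np.arange(2, 2 * n_nodes)          # clamp w and theta at node 0
    f = np.zeros(2 * n_nodes)
    f[2 * n_el] = P                           # tip load on the last w dof
    u = np.zeros(2 * n_nodes)
    u[free] = np.linalg.solve(K[np.ix_(free, free)], f[free])
    return u[2 * n_el]

# A very thin beam (slenderness L/t = 1000), where locking is dramatic.
E, nu, b, t, L, P = 1.0e7, 0.3, 1.0, 0.01, 10.0, 1.0
EI = E * b * t**3 / 12
kGA = 5 / 6 * E / (2 * (1 + nu)) * b * t
exact = P * L**3 / (3 * EI) + P * L / kGA     # Timoshenko closed-form answer
w_full = tip_deflection(8, L, EI, kGA, P, reduced=False)
w_consistent = tip_deflection(8, L, EI, kGA, P, reduced=True)
```

With eight elements the fully integrated mesh locks almost completely, while the reduced-integration element is within a fraction of a percent of the closed-form deflection.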

Albert Camus once said, "All great deeds and all great thoughts have ridiculous beginnings." The locking problem, where errors were ridiculously large, could not be rationalised using either the continuity or the completeness requirement. Mark Buchanan, reviewing a book by Etienne Klein in Nature, remarked that if Camus had been a scientist, he might have added that all great theories have paradoxical beginnings. Klein's thesis was: "Paradox is truth standing on its head to get attention." In this view, a "paradox focuses and amplifies intellectual stress [and] sets the stage for upheaval, discovery and sometimes scientific revolution."

A bold move was to establish that another canonical virtue, called consistency, was behind these dramatic failures. Until then, all explanations offered for the locking phenomenon were heuristic; some of them were tautological or circular. This kind of deception is not uncommon in the world of science. Karl Popper has a delightful tale which reveals the essence of such a trap. Envisage the following conversation:

Q: Why is the sea so rough?
A: Because Neptune is angry.
Q: How do you know Neptune is angry?
A: Why, can't you see that the sea is rough?

The conventional wisdom on locking went along similar lines: Why are the results so low? Because there is locking. How do you know it is locking? Why, can't you see that the results are low? Prathap and Bhashyam were able to show that the consistency paradigm had more than heuristic appeal: they were able to derive a priori error models from this viewpoint, which could then be tested successfully against numerical computations. No other competing explanatory scheme could do this. Also, a variety of similar phenomena, many of them not originally recognized as belonging to the locking category, could be interpreted in a unified way. Further, the consistency paradigm allowed predictions to be made of phenomena which had not been recognized, or had been ignored, and these were later found to be true.

"There is nothing so practical as a good theory," said Ludwig Boltzmann. The consistency theory now offered a means to build a library of practical elements (beam, plate, shell and solid) which were required for general-purpose structural analysis. By this time, Bhashyam had made his way to more beckoning shores. Prathap continued the development of the linear elements with Ramesh Babu (whom the cruel hands of fate snatched away in 1986), and of the quadratic, higher-order and non-linear elements with B. P. Naganarayana, Ramamohan and others.

Typically, a finite element computation produces parameters which guide a designer in making decisions to optimise the structural efficiency and integrity of a load-carrying system. Apart from the pattern of deformation, an engineer is principally guided by the state of strains (and stresses) in a system. There are prescribed allowable limits for each material used which cannot be exceeded without the system giving way (yielding, hence yield stress, etc.). Thus, the computation of strain and stress is of primary concern in finite element analysis. Around 1986, Babu and Prathap noticed that their field-consistent library of elements could produce strains and stresses of remarkable accuracy at strategic points within each element. These were already known as optimal points, or Barlow points, after Barlow, who first noticed them in the 1970s.
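The optimal-point phenomenon can be seen in miniature in the simplest possible setting. In this sketch (my own construction, not code from the sources), a bar fixed at one end carries a uniform axial load. With two-node linear elements the computed axial force is constant within each element and noticeably wrong at the nodes, yet it is exact at each element's midpoint, which is the Barlow (optimal) point of the linear element:

```python
import numpy as np

def bar_solution(n_el, L, EA, q):
    """Fixed-free bar under uniform axial load q, two-node linear elements."""
    h = L / n_el
    K = np.zeros((n_el + 1, n_el + 1))
    f = np.zeros(n_el + 1)
    ke = (EA / h) * np.array([[1, -1], [-1, 1]])
    for e in range(n_el):
        K[e:e+2, e:e+2] += ke
        f[e:e+2] += q * h / 2               # consistent load vector for constant q
    u = np.zeros(n_el + 1)
    u[1:] = np.linalg.solve(K[1:, 1:], f[1:])   # u(0) = 0 at the fixed end
    return u

n_el, L, EA, q = 4, 1.0, 1.0, 1.0
u = bar_solution(n_el, L, EA, q)
h = L / n_el
exact_force = lambda x: q * (L - x)         # exact axial force, linear in x
mid_errs, node_errs = [], []
for e in range(n_el):
    fe_force = EA * (u[e+1] - u[e]) / h     # constant within each element
    mid_errs.append(abs(fe_force - exact_force((e + 0.5) * h)))  # Barlow point
    node_errs.append(abs(fe_force - exact_force(e * h)))         # element end
```

At the nodes the element force is in error by q·h/2, while at the midpoints the error vanishes to machine precision.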

Barlow attempted to derive the theory behind the existence of these points using the conventional interpretation that the finite element method is a piecewise application of the principle of minimum total potential (MTP) energy. The MTP is one variation of what are called the least action principles, among the most basic statements defining matter, motion and energy, equally valid for quantum mechanics (as Feynman showed) as for classical mechanics (where least-action ideas have been known since antiquity, from Archimedes and Hero onwards). In the MTP interpretation, one is guided by the metaphysical assumption that what underlies the finite element procedure is the approximation of displacement (or deformation) fields. Let us call this the displacement correspondence viewpoint.

To understand this more clearly, let us think of the finite element computation as a process of sampling an incoming signal (a very useful metaphor introduced by Richard MacNeal, the prime mover behind the industry-standard MSC-NASTRAN package; the M in MSC stands for MacNeal). Thus the MTP approach, which Barlow adopted, recognized that the displacement fields in the actual structure were being replaced by approximate displacement fields, and Barlow's prediction of optimal points followed from this paradigm. By the late 80s, it was obvious to Prathap that some of these predictions were wrong (Barlow's procedure could not identify the optimal points in cubic elements), and that this approach could not explain why, in many problems, stresses at optimal points could be remarkably accurate (a phenomenon soon named superconvergence).

Around this time, a variational theorem proposed as early as 1955 by a then little-known Chinese engineer, Hu Haichang, and published in an equally obscure Chinese journal began to be noticed. Hu's principle (now generally known as the Hu-Washizu principle) is another variation of the least action principle. Unlike the MTP, which implements the least action principle in terms of displacement fields alone, Hu's principle is a multi-field approach which enforces the least action principle by manipulating displacement, strain and stress fields simultaneously. In classical analytical solutions (the FEM hardly existed when Hu proposed his theorem, and almost all structural methods were based on analytical approaches), Hu's method offered no advantages (and no extra information) over the MTP principle. Hence it was largely ignored, and even in 1977 authoritative textbooks dismissed it as having no particular value.

In the early 90s, Prathap proposed what he called the stress correspondence viewpoint as being central to the finite element process, as opposed to the displacement correspondence paradigm, which had been the framework accepted in all the 50,000-plus papers and 350 or so books written on the method. This is a typical situation in the scientific enterprise: the same data may have several different, and often opposite, explanations. FEM computations show that displacements are generally reasonably accurate, while strains and stresses (the terms can be used interchangeably here, being what are called conjugate quantities) are less accurate overall throughout the region. However, there are points where stresses are very accurate; often, more accurate than displacements are anywhere. Both the displacement and the stress correspondence paradigms could anticipate these general trends.

This is a classical stand-off, where two competing paradigms are equally appealing in a superficial way. Usually, there is a psychological preference for the explanation that arrived first. This tyranny of the first explanation can prolong periods of misunderstanding for centuries: although Aristarchus had proposed the heliocentric model in the 3rd century BC, it was the Ptolemaic earth-centred cosmology which ruled supreme for about fourteen centuries before Copernicus revived the Sun-centred model on the basis of the same facts.

A systematic programme of work by Prathap and Vinayak Ugru provided very accurate quantitative predictions based on the stress correspondence paradigm, which could be confirmed by numerical experimentation. The displacement correspondence paradigm could not match this explanatory power. The field-consistent library of elements now bases all its stress recovery procedures on this paradigm and Hu's principle, but it may take some more time for the stress correspondence paradigm to overcome the tyranny of the displacement correspondence metaphor.
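One concrete reading of the stress correspondence paradigm (sometimes called the best-fit paradigm) can be illustrated with the same one-dimensional model problem. In the sketch below (my own construction, assuming a fixed-free bar with constant EA), a linearly varying axial load makes the exact axial force quadratic; the constant element force computed by the FEM then turns out to be the least-squares best fit, i.e. the element average, of that quadratic force:

```python
import numpy as np

def bar_solution(n_el, L, EA, q_fun):
    """Fixed-free bar, two-node linear elements, distributed load q_fun(x)."""
    h = L / n_el
    K = np.zeros((n_el + 1, n_el + 1))
    f = np.zeros(n_el + 1)
    ke = (EA / h) * np.array([[1, -1], [-1, 1]])
    gauss = [(-1/np.sqrt(3), 1.0), (1/np.sqrt(3), 1.0)]  # exact for quadratics
    for e in range(n_el):
        K[e:e+2, e:e+2] += ke
        x1 = e * h
        for xi, w in gauss:                 # consistent load vector, f_i = int N_i q dx
            x = x1 + (xi + 1) * h / 2
            N = np.array([(1 - xi) / 2, (1 + xi) / 2])
            f[e:e+2] += w * (h / 2) * q_fun(x) * N
    u = np.zeros(n_el + 1)
    u[1:] = np.linalg.solve(K[1:, 1:], f[1:])
    return u

n_el, L, EA = 4, 1.0, 1.0
q = lambda x: x                              # linearly varying axial load
exact_force = lambda x: (L**2 - x**2) / 2    # N(x) = integral of q from x to L
u = bar_solution(n_el, L, EA, q)
h = L / n_el
errs = []
for e in range(n_el):
    fe_force = EA * (u[e+1] - u[e]) / h      # constant element force
    # element average of the exact quadratic force = its least-squares
    # best fit by a constant over the element
    x1, x2 = e * h, (e + 1) * h
    avg = (L**2 - (x1**2 + x1 * x2 + x2**2) / 3) / 2
    errs.append(abs(fe_force - avg))
```

In this 1D setting the computed force matches the best-fit (average) value to machine precision in every element; the stress correspondence view takes such projection behaviour, rather than displacement approximation, as the organising description of what the FEM does.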

In going directly from the consistency concept to the correspondence concept in this account, some anachronistic liberties have been taken with the actual historical sequence in which the usefulness of Hu's principle was discovered. Along the way, Prathap and his group also established the currency of a concept they called the correctness requirement. We must now look back at the locking problem again. When locking problems were first encountered, a variety of tricks (some successful, some not) were used to remove locking. Consistency offered an explanation for the success of many of these tricks. In fact, it turned out that the consistency requirement was so precisely defining that only one unique procedure existed by which element strain fields could be reconstituted to ensure that there was no locking. The MTP principle could not supply this procedure; it turned out that only Hu's principle could provide the set of orthogonality conditions which assured that locking was removed in a variationally correct way. Unlike many commercially available general-purpose packages, the field-consistent library of elements by Prathap and his group has reconstituted all its elements using this correctness concept.

As of now, we recognize five canonical principles which guide the FE method: completeness, continuity, consistency, correctness and correspondence. While the first four are prescriptive rules (what you are enjoined to do to ensure robust elements), the last is a descriptive rule (defining exactly how the procedure works from an organising principle like the least action principle). The Group is proud of these ideas and hopeful that the consistency, correctness and correspondence concepts will one day be recognized as intellectual territory that will be forever Indian.

In science, as in art and life, one finds that it is not possible to achieve all virtues at the same time. Confucius noticed that life is full of contradictions and conflicts, and that what is most important under such circumstances is to seek harmony and balance. It is the same with finite element computation. These canonical virtues (completeness, continuity, consistency and correctness) cannot all be achieved at the same time; in other words, they are complementarities. Thus locking is what happens when completeness and consistency cannot be ensured at the same time, so some compromise has to be made. Similarly, one finds with mesh distortion sensitivity that, for distorted meshes, it is not possible to achieve completeness and continuity at the same time in the same co-ordinate system: the parametric system is to be used for achieving continuity of the test basis functions, while the trial functions must be expressed in the Cartesian (metric) system. This article will elucidate the role such considerations play in obtaining effective and robust models.

References

  1. Prathap, G. and Mukherjee, S. (2004), Management-by-stress Model of Finite Element Computation, Research Report CM 0405, CSIR Centre for Mathematical Modelling and Computer Simulation, Bangalore, November 2004.
  2. Vinayak, RU and Prathap, Gangan and Naganarayana, BP (1996), Beam elements based on a higher order theory - I: Formulation and analysis of performance, Computers & Structures, 58 (4). pp. 775-789.
  3. Prathap, Gangan and Vinayak, RU and Naganarayana, BP (1996), Beam elements based on a higher order theory-II - Boundary layer sensitivity and stress oscillations, Computers & Structures, 58 (4). pp. 791-796.
  4. Prathap, Gangan (1996), Barlow points and Gauss points and the aliasing and best fit paradigms, Computers & Structures, 58 (2). pp. 321-325.
  5. Prathap, Gangan and Vinayak, RU (1996), Best-fit stress performance of a higher-order beam element, Communications in Numerical Methods in Engineering, 12 (4). pp. 229-234. ISSN 0748-8025
  6. Murthy, MVV and Prathap, Gangan (1996), Shear force predictions from RBF corrected QUAD4 elements, Communications in Numerical Methods in Engineering, 12 (2). pp. 135-140.
  7. Prathap, Gangan and Naganarayana, BP (1995), Consistent thermal stress evaluation in finite elements, Computers & Structures, 54 (3). pp. 415-426.
  8. Prathap, G. (1996), Finite Element Analysis and the Stress Correspondence Paradigm, Sadhana, 21. pp. 525-546.
  9. Prathap, G. (1993), The Finite Element Method in Structural Mechanics, Kluwer Academic Press, Dordrecht.
  10. Prathap, Gangan and Naganarayana, BP (1992), Stress oscillations and spurious load mechanisms in variationally inconsistent assumed strain formulations, International Journal for Numerical Methods in Engineering, 33 (10). pp. 2181-2197.
  11. Prathap, Gangan and Naganarayana, BP and Sudhakar, B (1991), Development of general purpose finite element package for structural analysis of isotropic, anisotropic and layered composite structures: Progress Report. Technical Report. National Aeronautical Laboratory, Bangalore, India.
  12. Prathap, Gangan and Naganarayana, BP (1990), Consistency force resultant distributions in displacement elements with varying sectional properties, International Journal for Numerical Methods in Engineering, 29 (4). pp. 775-783.
  13. Somashekar, BR and Prathap, Gangan and Ramesh Babu, C (1987), A field-consistent, four-noded, laminated anisotropic plate/shell element, Computers & Structures, 25 (3). pp. 345-353.
  14. Prathap, Gangan and Ramesh Babu, C (1987), Accurate force evaluation with a simple bi-linear plate bending element, Computers & Structures, 25 (2). pp. 259-270.
  15. Ramesh Babu, C and Subramanian, G and Prathap, Gangan (1987), Mechanics of field-consistency in finite element analysis - a penalty function approach, Computers & Structures, 25 (2). pp. 161-173.
  16. Prathap, Gangan and Ramesh Babu, C and Subramanian, G (1987), Stress oscillations in plane stress modelling of flexure - a field-consistency interpretation, International Journal for Numerical Methods in Engineering, 24 (4). pp. 711-724.
  17. Prathap, Gangan (1986), Field-consistency - toward a science of constrained multi-strain-field finite element formulations, Sadhana - Academy Proceedings in Engineering Sciences, 9 (4). pp. 319-343.
  18. Prathap, Gangan (1985), Simple plate/shell triangle, International Journal for Numerical Methods in Engineering, 21 (6). pp. 1149-1156.
  19. Prathap, Gangan (1985), A C0 continuous four-noded cylindrical shell element, Computers & Structures, 21 (5). pp. 995-999.
  20. Prathap, Gangan (1985), Additional stiffness parameter measure of error of the second kind in the finite element method, International Journal for Numerical Methods in Engineering, 21 (6). pp. 1001-1012.
  21. Prathap, Gangan (1985), Poor bending response of the four-node plane stress quadrilateral, International Journal for Numerical Methods in Engineering, 21 (5). pp. 825-835.
  22. Prathap, Gangan (1984), An optimally constrained 4-node quadrilateral thin plate bending element, Computers & Structures, 18 (5). pp. 789-794.
  23. Prathap, Gangan and Bhashyam, GR (1982), Reduced integration and the shear-flexible beam element, International Journal for Numerical Methods in Engineering, 18. pp. 195-210.