
Details

Author(s) / Contributors
Title
Recent Advances in Algorithmic Differentiation [electronic resource]
Edition
1st ed. 2012
Place / Publisher
Berlin, Heidelberg : Springer Berlin Heidelberg
Year of publication
2012
Link to full text
Descriptions/Notes
  • Description based upon print version of record.
  • Includes bibliographical references.
  • Recent Advances in Algorithmic Differentiation; Preface; Contents; Contributors
  • A Leibniz Notation for Automatic Differentiation: 1 Historical Background; 2 The Leibniz Notation; 2.1 The Forward Mode; 2.2 The Reverse Mode; 2.3 Forward over Forward; 2.4 Forward over Reverse; 3 Second Order Approaches Involving Reverse Mode; 3.1 Forward over Reverse; 3.2 Reverse over Forward; 3.3 Reverse over Reverse; 4 The Equivalence Theorem; References
  • Sparse Jacobian Construction for Mapped Grid Visco-Resistive Magnetohydrodynamics: 1 Introduction; 1.1 Model; 1.2 Implicit Solver Framework; 2 Preconditioner Construction; 2.1 Code Reconfiguration; 2.2 OpenAD Usage; 2.3 Code Integration; 3 Results; 3.1 Conclusions and Future Work; References
  • Combining Automatic Differentiation Methods for High-Dimensional Nonlinear Models: 1 Introduction; 2 Methodology; 3 Implementation; 3.1 Gradients with OpenAD; 3.2 Higher-Order Derivatives with Rapsodia; 4 Test Cases; References
  • Application of Automatic Differentiation to an Incompressible URANS Solver: 1 Introduction; 2 Incompressible URANS Equations and Flow Solver; 3 Generation of a Discrete Adjoint Solver; 3.1 Reversal of the Time Loop; 3.2 Adjoining the Outer Iterations; 3.3 Parallelization of the Adjoint Solver; 4 Validation of the Adjoint Solver; References
  • Applying Automatic Differentiation to the Community Land Model: 1 Introduction; 2 Background; 3 Automatic Differentiation and OpenAD; 4 AD Development Process; 4.1 Code Comprehension; 4.2 Preprocessing; 4.3 Transformation; 4.4 Postprocessing; 5 Results; 6 Conclusion; References
  • Using Automatic Differentiation to Study the Sensitivity of a Crop Model: 1 The Application Domain: The Agronomic Crop Model STICS; 2 Sensitivity Analysis; 3 Automatic Differentiation of STICS; 3.1 The TAPENADE Automatic Differentiation Tool; 3.2 STICS Adjoint: The Pains and Sufferings of an AD End-User; 3.3 Validation of the Adjoint Model; 4 Results: Sensitivity Analysis of STICS; 4.1 Selection of Input Parameters for Sensitivity Analysis of Output Variables; 4.2 Sensitivity Results of LAI and Biomass; 5 Conclusion and Outlook; References
  • Efficient Automatic Differentiation of Matrix Functions: 1 Introduction; 1.1 Terminology; 1.2 Matrix Derivatives in Matrix Form; 2 Kronecker Products; 3 Box Products; 3.1 Box Products of Identity Matrices; 4 Differentiation Rules; 4.1 A Simple Example; 4.2 The Hessian and Newton's Method; 4.3 An Example Taylor Series; 5 Future Work; References
  • Native Handling of Message-Passing Communication in Data-Flow Analysis: 1 Introduction; 2 Context-Sensitive and Flow-Sensitive Data-Flow Analysis; 3 Impact of Message-Passing on Data-Flow Analysis; 4 Data-Flow Analysis with Flow Graph Local Restart; 5 Performance Discussion; 6 Choosing a Good Set of Channels; 7 Implementation and Outlook; References
  • Increasing Memory Locality by Executing Several Model Instances Simultaneously: 1 Introduction; 2 Cloning; 3 Applications; 4 Conclusions
  • The proceedings represent the state of knowledge in the area of algorithmic differentiation (AD). The 31 contributed papers presented at the AD2012 conference cover the application of AD to many areas in science and engineering as well as aspects of AD theory and its implementation in tools. For all papers, the referees, selected from the program committee and the wider community, as well as the editors have emphasized that the presented ideas be accessible to non-AD experts as well. In the AD tools arena, new implementations are introduced that cover, for example, Java and graphical modeling environments, or that join the set of existing tools for Fortran. New developments in AD algorithms target the efficiency of matrix-operation derivatives, the detection and exploitation of sparsity, partial separability, the treatment of nonsmooth functions, and other high-level mathematical aspects of the numerical computations to be differentiated. Applications stem from the Earth sciences, nuclear engineering, fluid dynamics, and chemistry, to name just a few. In many cases the applications in a given area of science or engineering share characteristics that require specific approaches to enable AD capabilities or that provide an opportunity for efficiency gains in the derivative computation. The description of these characteristics and of the techniques for using AD successfully should make the proceedings a valuable source of information for users of AD tools.
  • English
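To make the subject of the record concrete, the following is a minimal sketch of forward-mode algorithmic differentiation using dual numbers. It is not taken from any of the contributed papers; the Dual class, the function f, and all names are hypothetical and chosen only for this illustration.

    # Minimal, illustrative forward-mode AD via dual numbers.
    # Not from the AD2012 proceedings; names are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Dual:
        """A value paired with its derivative (a dual number)."""
        val: float
        dot: float = 0.0

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val + other.val, self.dot + other.dot)

        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            # Product rule: (u*v)' = u'*v + u*v'
            return Dual(self.val * other.val,
                        self.dot * other.val + self.val * other.dot)

        __rmul__ = __mul__

    def f(x):
        # Example function f(x) = x^2 + 3x + 1
        return x * x + 3 * x + 1

    # Seed the input derivative with 1 to obtain df/dx at x = 2.
    x = Dual(2.0, 1.0)
    y = f(x)
    print(y.val, y.dot)  # prints 11.0 7.0, i.e. f(2) and f'(2)

Running the sketch prints f(2) = 11.0 together with f'(2) = 7.0 in a single pass. Reverse mode, also treated in several of the papers, would instead propagate adjoints from the output back to the inputs.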