Recurrent Neural Networks for Prediction: Learning Algorithms, Architectures and Stability (Adaptive and Learning Systems for Signal Processing, Communications and Control)

  • Web store price: ¥47,815 (¥43,469 excl. tax)
  • John Wiley & Sons Inc (published August 2001)
  • List price: US$274.95
  • Binding: Hardcover / 285 p.
  • Language: ENG
  • Product code: 9780471495178
  • DDC classification: 006.32

Full Description

New technologies in engineering, physics and biomedicine are demanding increasingly complex methods of digital signal processing. By presenting the latest research, the authors demonstrate how real-time recurrent neural networks (RNNs) can be implemented to expand the range of traditional signal processing techniques and to address the problem of prediction. Within this text, neural networks are treated as massively interconnected nonlinear adaptive filters.
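
To make the "nonlinear adaptive filter" view concrete, below is a minimal Python/NumPy sketch, not taken from the book, of a single recurrent tanh neuron used as a one-step-ahead predictor. The function name recurrent_predictor, the model orders p and q, and the step size eta are illustrative assumptions, and the gradient step is deliberately simplified: it ignores the recursive sensitivity terms of full real-time recurrent learning.

    import numpy as np

    def recurrent_predictor(x, p=4, q=4, eta=0.01, seed=0):
        # Single tanh neuron used as a nonlinear adaptive filter for
        # one-step-ahead prediction: the regressor holds the last p inputs
        # and the last q fed-back outputs (a NARMA-style structure).
        rng = np.random.default_rng(seed)
        w = rng.normal(scale=0.1, size=1 + p + q)   # bias + input + feedback weights
        y_hist = np.zeros(q)                        # fed-back past outputs
        y_hat = np.zeros(len(x))

        for n in range(p, len(x)):
            u = np.concatenate(([1.0], x[n - p:n][::-1], y_hist))
            y_hat[n] = np.tanh(w @ u)               # filter output = prediction of x[n]
            e = x[n] - y_hat[n]                     # instantaneous prediction error
            # Simplified gradient step: uses only the direct derivative of the
            # tanh output, not the recursive sensitivities of full RTRL.
            w += eta * e * (1.0 - y_hat[n] ** 2) * u
            y_hist = np.roll(y_hist, 1)
            y_hist[0] = y_hat[n]
        return y_hat, w

    # Example: predict a noisy sine wave one step ahead.
    t = np.arange(2000)
    x = np.sin(0.05 * t) + 0.1 * np.random.default_rng(1).standard_normal(t.size)
    y_hat, w = recurrent_predictor(x)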

• Analyses the relationships between RNNs and various nonlinear models and filters, and introduces spatio-temporal architectures together with the concepts of modularity and nesting
• Examines stability and relaxation within RNNs
• Presents on-line learning algorithms for nonlinear adaptive filters and introduces new paradigms which exploit the concepts of a priori and a posteriori errors, data-reusing adaptation, and normalisation (see the sketch after this list)
• Studies convergence and stability of on-line learning algorithms based upon optimisation techniques such as contraction mapping and fixed-point iteration
• Describes strategies for the exploitation of inherent relationships between parameters in RNNs
• Discusses practical issues such as predictability and nonlinearity detection, and includes several practical applications in areas such as air pollutant modelling and prediction, attractor discovery and chaos, ECG signal processing, and speech processing
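
As a rough illustration of how the a priori error, a posteriori error, data-reusing and normalisation mentioned above fit together, here is a Python/NumPy sketch of a single adaptation step for a tanh neuron. It is not the book's algorithm: the function name normalised_data_reusing_step, the step size, the number of reuse passes and the NLMS-style normalisation are simplifying assumptions made for illustration.

    import numpy as np

    def normalised_data_reusing_step(w, u, d, eta=0.5, reuse=3, eps=1e-6):
        # One adaptation step for a tanh neuron with regressor u and target d,
        # combining data-reusing (repeat the update on the same sample),
        # normalisation (divide the step by the gradient energy, NLMS-style)
        # and the a priori / a posteriori error pair.
        e_apriori = d - np.tanh(w @ u)              # error before any update
        for _ in range(reuse):                      # data-reusing iterations
            y = np.tanh(w @ u)
            grad = (1.0 - y ** 2) * u               # derivative of the output w.r.t. w
            w = w + eta * (d - y) * grad / (eps + grad @ grad)
        e_aposteriori = d - np.tanh(w @ u)          # error after the update(s)
        return w, e_apriori, e_aposteriori

    # Example: for a modest step size the a posteriori error is typically
    # smaller in magnitude than the a priori error on the same sample.
    rng = np.random.default_rng(0)
    w0 = rng.normal(scale=0.1, size=8)
    u = rng.standard_normal(8)
    w1, e_pre, e_post = normalised_data_reusing_step(w0, u, d=0.3)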

Recurrent Neural Networks for Prediction offers new insight into the learning algorithms, architectures and stability of recurrent neural networks and, consequently, will have instant appeal. It provides an extensive background for researchers, academics and postgraduates, enabling them to apply such networks in new applications.

Contents

Preface.

Introduction.

Fundamentals.

Network Architectures for Prediction.

Activation Functions Used in Neural Networks.

Recurrent Neural Networks Architectures.

Neural Networks as Nonlinear Adaptive Filters.

Stability Issues in RNN Architectures.

Data-Reusing Adaptive Learning Algorithms.

A Class of Normalised Algorithms for Online Training of Recurrent Neural Networks.

Convergence of Online Learning Algorithms in Neural Networks.

Some Practical Considerations of Predictability and Learning Algorithms for Various Signals.

Exploiting Inherent Relationships Between Parameters in Recurrent Neural Networks.

Appendix A: The O Notation and Vector and Matrix Differentiation.

Appendix B: Concepts from the Approximation Theory.

Appendix C: Complex Sigmoid Activation Functions, Holomorphic Mappings and Modular Groups.

Appendix D: Learning Algorithms for RNNs.

Appendix E: Terminology Used in the Field of Neural Networks.

Appendix F: On the A Posteriori Approach in Science and Engineering.

Appendix G: Contraction Mapping Theorems.

Appendix H: Linear GAS Relaxation.

Appendix I: The Main Notions in Stability Theory.

Appendix J: Deseasonalising Time Series.

References.

Index.