Book price comparison. Covering 12,310,293 books and 12 stores.

Author

Zhang Yi

Books and works in one place: 12 books, published 2003-2024; among the most popular is Neural Networks: Computational Models and Applications. Compare prices and check availability in Finnish bookstores.

12 books

Publication years range from 2003 to 2024.

Intelligent Medicine on Prediction of Pelvic Lymph Node Metastasis

Haixian Zhang; Zhang Yi; Ziqiang Wang

ELSEVIER SCIENCE PUBLISHING CO INC
2024
paperback
Intelligent Medicine on Prediction of Pelvic Lymph Node Metastasis focuses on leveraging intelligent medical techniques to predict lymph node metastasis, using pelvic cancer as a primary case study. Drawing on real clinical application scenarios, the book introduces deep neural network models and application systems built around the four major tasks of lymph node location, partition, segmentation, and metastasis prediction, aiming to provide a theoretical and experimental reference for researchers in this field. In 8 chapters, the title introduces the reader to intelligent medicine and deep neural networks, summarises intelligent biological neural networks and classical artificial neural networks, introduces several commonly used network architectures and a new neural network model, and presents deep-learning-based algorithms for lymph node metastasis prediction, summarising the methods and experimental results. This book is a friendly learning tool, providing beginners and researchers with in-depth knowledge of deep learning and of how to develop intelligent medicine methods for lymph node metastasis prediction.
Towards Neuromorphic Machine Intelligence

Hong Qu; Xiaoling Luo; Zhang Yi

ELSEVIER SCIENCE PUBLISHING CO INC
2024
paperback
Towards Neuromorphic Machine Intelligence: Spike-Based Representation, Learning, and Applications provides readers with an in-depth understanding of Spiking Neural Networks (SNNs), a burgeoning research branch of Artificial Neural Networks (ANNs), AI, and Machine Learning that sits at the heart of the integration between Computer Science and Neural Engineering. In recent years, neural networks have re-emerged in relation to AI, representing a well-grounded paradigm rooted in disciplines from physics and psychology to information science and engineering. This book represents one of the established cross-over areas where neurophysiology, cognition, and neural engineering coincide with the development of new Machine Learning and AI paradigms.

There are many excellent theoretical achievements in neuron models, learning algorithms, network architectures, and so on, but these achievements are numerous and scattered, with no straightforward systematic integration, making them difficult for researchers to assimilate and apply. As the third generation of ANNs, SNNs simulate the neuron dynamics and information transmission of a biological neural system in more detail, making them a cross-product of computer science and neuroscience.

The primary target audience falls into two categories: AI researchers unfamiliar with SNNs, and researchers already deeply versed in them. The former need to acquire fundamental knowledge of SNNs, but the challenge is that much of the existing literature only mentions the basics in passing or treats them too superficially; this book gives a systematic explanation from scratch. The latter need to learn about novel research achievements in the field, and this book introduces the latest results on different aspects of SNNs and provides detailed simulation processes to facilitate readers' replication.

In addition, the book introduces neuromorphic hardware architecture as a further extension of the SNN system. The book starts with the birth and development of SNNs, and then introduces the main research hotspots, including spiking neuron models, learning algorithms, network architectures, and neuromorphic hardware. The book therefore provides readers with easy access to both the foundational concepts and recent research findings in SNNs.
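As a concrete taste of the spiking neuron models such books survey, here is a minimal leaky integrate-and-fire (LIF) simulation. The LIF model itself is standard in the SNN literature, but the parameter values and input below are illustrative assumptions, not taken from the book.

```python
def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_threshold=1.0, v_reset=0.0):
    """Simulate an LIF neuron; return the time steps at which it spikes."""
    v = v_rest
    spikes = []
    for t, i_in in enumerate(input_current):
        # Membrane potential leaks toward rest while integrating the input.
        v += dt / tau * (-(v - v_rest) + i_in)
        if v >= v_threshold:   # threshold crossing emits a spike
            spikes.append(t)
            v = v_reset        # potential is reset after the spike
    return spikes

# A constant supra-threshold current produces regular, periodic firing.
spike_times = simulate_lif([1.5] * 100)
print(spike_times)
```

With the constant input above, the potential relaxes toward 1.5, crosses the threshold, resets, and repeats, so the neuron fires at a fixed interval.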
Convergence Analysis of Recurrent Neural Networks

Zhang Yi

Springer-Verlag New York Inc.
2013
paperback
Since Hopfield's outstanding and pioneering research work on recurrent neural networks (RNNs) in the early 1980s, neural networks have rekindled strong interest among scientists and researchers. Recent years have recorded remarkable advances in research and development work on RNNs, in theoretical research as well as in actual applications. The field of RNNs is now transforming into a complete and independent subject. From theory to application, from software to hardware, new and exciting results are emerging day after day, reflecting the keen interest RNNs have instilled in everyone, from researchers to practitioners. RNNs contain feedback connections among the neurons, which has led rather naturally to RNNs being regarded as dynamical systems. RNNs can be described by continuous-time differential systems, discrete-time systems, or functional differential systems, and more generally in terms of nonlinear systems. Thus, RNNs have at their disposal a huge set of mathematical tools from dynamical system theory, which has turned out to be very useful in enabling a rigorous analysis of RNNs.
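The dynamical-systems view described above can be made concrete with a tiny sketch: a two-neuron continuous-time network of the Hopfield type, dx/dt = -x + W tanh(x) + b, integrated with forward Euler. The weights and biases are illustrative assumptions; with this weak symmetric coupling the trajectory settles to a unique equilibrium, which is the kind of convergence behavior the book analyzes rigorously.

```python
import math

def simulate_rnn(x, W, b, dt=0.01, steps=5000):
    """Integrate dx/dt = -x + W*tanh(x) + b with forward Euler."""
    n = len(x)
    for _ in range(steps):
        y = [math.tanh(v) for v in x]                   # neuron outputs
        dx = [-x[i] + sum(W[i][j] * y[j] for j in range(n)) + b[i]
              for i in range(n)]
        x = [x[i] + dt * dx[i] for i in range(n)]       # Euler step
    return x

W = [[0.0, 0.5], [0.5, 0.0]]   # symmetric feedback connections
b = [0.1, -0.1]                # constant external inputs
state = simulate_rnn([1.0, -1.0], W, b)
print([round(v, 3) for v in state])
```

Because the feedback gains are small, the map is a contraction and the state converges to the same equilibrium from any initial condition; stronger or asymmetric weights can instead produce multiple equilibria or oscillations, which is why convergence analysis is nontrivial.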
Neural Networks: Computational Models and Applications

Huajin Tang; Kay Chen Tan; Zhang Yi

Springer-Verlag Berlin and Heidelberg GmbH & Co. KG
2010
paperback
Neural Networks: Computational Models and Applications covers a wealth of important theoretical and practical issues in neural networks, including the learning algorithms of feed-forward neural networks, various dynamical properties of recurrent neural networks, and winner-take-all networks, together with their applications across broad areas of computational intelligence: pattern recognition, uniform approximation, constrained optimization, NP-hard problems, and image segmentation. By presenting various computational models, the book aims to give readers a quick but insightful understanding of the broad and rapidly growing neural networks domain. Besides laying down the fundamentals of artificial neural networks, the book also studies biologically inspired neural networks: some typical computational models are discussed and subsequently applied to object recognition, scene analysis, and associative memory. These studies of bio-inspired models have important implications for computer vision and robotic navigation, as well as for new efficient algorithms for image analysis. Another significant feature of the book is that it begins with fundamental dynamical problems when presenting the mathematical techniques extensively used in analyzing neurodynamics, allowing non-mathematicians to develop and apply these analytical techniques easily. Written for a wide readership, engineers, computer scientists, and mathematicians interested in machine learning, data mining, and neural network modeling will find this book of value. It will also serve as a helpful reference for graduate students studying neural networks and complex dynamical systems.
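The winner-take-all networks mentioned in the blurb can be sketched in a few lines: each unit excites itself and inhibits the others until only the unit with the largest initial activation remains active. The specific gains below are illustrative assumptions, not parameters from the book.

```python
def winner_take_all(activations, self_excite=1.2, inhibit=0.3, steps=50):
    """Iterate mutual-inhibition dynamics until one unit dominates."""
    x = list(activations)
    for _ in range(steps):
        total = sum(x)
        # Self-excitation minus inhibition from every other unit,
        # with activities clipped to the range [0, 1].
        x = [min(1.0, max(0.0, self_excite * xi - inhibit * (total - xi)))
             for xi in x]
    return x

# Unit 1 starts with the largest activation and suppresses the rest.
result = winner_take_all([0.4, 0.9, 0.6])
print(result)
```

After a handful of iterations the weaker units are driven to zero and the winner saturates, which is how such networks implement a max-selection operation.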
Subspace Learning of Neural Networks

Jian Cheng Lv; Zhang Yi; Jiliu Zhou

CRC Press Inc
2010
hardcover
Using real-life examples to illustrate the performance of learning algorithms and instructing readers in how to apply them to practical applications, this work offers a comprehensive treatment of subspace learning algorithms for neural networks. The authors summarize a decade of high-quality research and offer a host of practical applications, demonstrating ways to extend the algorithms to fields such as encrypted communication, data mining, computer vision, and signal and image processing, to name just a few. The strength of the work lies in how it coherently builds a theoretical understanding of the convergence behavior of subspace learning algorithms through a summary of their chaotic behaviors.
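As a taste of the subject matter, the sketch below implements Oja's rule, one of the classic subspace learning algorithms in this literature: a single linear neuron whose weight vector converges to the principal direction of its input stream. The synthetic data and learning rate are illustrative assumptions, not examples from the book.

```python
import math

# Synthetic 2-D data: a strong component along (1, 1) plus a weak
# component along (1, -1), so the principal direction is (1, 1)/sqrt(2).
data = []
for k in range(500):
    t = 2.0 * math.sin(0.1 * k)    # dominant direction
    s = 0.2 * math.cos(0.7 * k)    # minor direction
    data.append((t + s, t - s))

w = [1.0, 0.0]                     # initial weight vector
eta = 0.02                         # learning rate
for x in data:
    y = w[0] * x[0] + w[1] * x[1]  # neuron output y = w . x
    # Oja update dw = eta * y * (x - y * w); the -y^2*w term keeps
    # the weight vector's norm close to 1 without explicit renormalization.
    w = [w[i] + eta * y * (x[i] - y * w[i]) for i in range(2)]

norm = math.hypot(w[0], w[1])
unit = [w[0] / norm, w[1] / norm]
print(unit)
```

After one pass over the data the weight vector aligns with (1, 1)/sqrt(2); generalizations of this rule to multiple neurons extract whole principal subspaces, which is the convergence behavior the book analyzes.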
Neural Networks: Computational Models and Applications

Huajin Tang; Kay Chen Tan; Zhang Yi

Springer-Verlag Berlin and Heidelberg GmbH & Co. KG
2007
hardcover
Neural Networks: Computational Models and Applications covers a wealth of important theoretical and practical issues in neural networks, including the learning algorithms of feed-forward neural networks, various dynamical properties of recurrent neural networks, and winner-take-all networks, together with their applications across broad areas of computational intelligence: pattern recognition, uniform approximation, constrained optimization, NP-hard problems, and image segmentation. By presenting various computational models, the book aims to give readers a quick but insightful understanding of the broad and rapidly growing neural networks domain. Besides laying down the fundamentals of artificial neural networks, the book also studies biologically inspired neural networks: some typical computational models are discussed and subsequently applied to object recognition, scene analysis, and associative memory. These studies of bio-inspired models have important implications for computer vision and robotic navigation, as well as for new efficient algorithms for image analysis. Another significant feature of the book is that it begins with fundamental dynamical problems when presenting the mathematical techniques extensively used in analyzing neurodynamics, allowing non-mathematicians to develop and apply these analytical techniques easily. Written for a wide readership, engineers, computer scientists, and mathematicians interested in machine learning, data mining, and neural network modeling will find this book of value. It will also serve as a helpful reference for graduate students studying neural networks and complex dynamical systems.
Convergence Analysis of Recurrent Neural Networks

Zhang Yi

Springer-Verlag New York Inc.
2003
hardcover
Since Hopfield's outstanding and pioneering research work on recurrent neural networks (RNNs) in the early 1980s, neural networks have rekindled strong interest among scientists and researchers. Recent years have recorded remarkable advances in research and development work on RNNs, in theoretical research as well as in actual applications. The field of RNNs is now transforming into a complete and independent subject. From theory to application, from software to hardware, new and exciting results are emerging day after day, reflecting the keen interest RNNs have instilled in everyone, from researchers to practitioners. RNNs contain feedback connections among the neurons, which has led rather naturally to RNNs being regarded as dynamical systems. RNNs can be described by continuous-time differential systems, discrete-time systems, or functional differential systems, and more generally in terms of nonlinear systems. Thus, RNNs have at their disposal a huge set of mathematical tools from dynamical system theory, which has turned out to be very useful in enabling a rigorous analysis of RNNs.