Book price comparison. Covering 12,390,323 books and 12 stores.

Author

Thomas B. Schön

Books and works in one place: 4 books, published between 2013 and 2022; among the most popular is Backward Simulation Methods for Monte Carlo Statistical Inference. Compare prices and check availability at Finnish bookstores.

Also listed under the alternative spelling: Thomas B. Schon

4 books

Publication years range from 2013 to 2022.

Machine Learning

Andreas Lindholm; Niklas Wahlström; Fredrik Lindsten; Thomas B. Schön

Cambridge University Press
2022
hardcover
This book introduces machine learning for readers with some background in basic linear algebra, statistics, probability, and programming. In a coherent statistical framework it covers a selection of supervised machine learning methods, from the most fundamental (k-NN, decision trees, linear and logistic regression) to more advanced methods (deep neural networks, support vector machines, Gaussian processes, random forests and boosting), plus commonly used unsupervised methods (generative modeling, k-means, PCA, autoencoders and generative adversarial networks). Careful explanations and pseudo-code are presented for all methods. The authors maintain a focus on the fundamentals by drawing connections between methods and discussing general concepts such as loss functions, maximum likelihood, the bias-variance decomposition, ensemble averaging, kernels and the Bayesian approach along with generally useful tools such as regularization, cross validation, evaluation metrics and optimization methods. The final chapters offer practical advice for solving real-world supervised machine learning problems and on ethical aspects of modern machine learning.
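To give a flavour of the most fundamental method in the list above, k-nearest neighbours fits in a few lines of plain Python. This is a toy sketch with made-up 2-D data, not code from the book:

```python
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among the k nearest training
    points. `train` is a list of ((x, y), label) pairs; the distance
    used is squared Euclidean (monotone in Euclidean, so ranks agree)."""
    by_dist = sorted(
        train,
        key=lambda p: (p[0][0] - query[0]) ** 2 + (p[0][1] - query[1]) ** 2,
    )
    votes = Counter(label for _, label in by_dist[:k])
    return votes.most_common(1)[0][0]

# Hypothetical toy data: two well-separated clusters.
train = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"), ((0.2, 0.1), "a"),
         ((1.0, 1.0), "b"), ((0.9, 1.1), "b"), ((1.1, 0.9), "b")]
print(knn_predict(train, (0.15, 0.1)))  # prints: a
```

The choice of k trades off noise sensitivity (small k) against oversmoothing (large k), which is exactly the kind of bias-variance trade-off the book's general concepts chapter discusses.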
Elements of Sequential Monte Carlo

Christian A. Naesseth; Fredrik Lindsten; Thomas B. Schön

now publishers Inc
2019
paperback
A key strategy in machine learning is to break down a problem into smaller and more manageable parts, then process data or unknown variables recursively. Sequential Monte Carlo (SMC) is a technique for solving statistical inference problems recursively. Over the last 20 years, SMC has been developed to enable inference in increasingly complex and challenging models in signal processing and statistics. This monograph shows how this powerful technique can be applied to machine learning problems such as probabilistic programming, variational inference and inference evaluation, to name a few. Written in a tutorial style, Elements of Sequential Monte Carlo introduces the basics of SMC, discusses practical issues, and reviews theoretical results before guiding the reader through a series of advanced topics to give a complete overview of the topic and its application to machine learning problems. This monograph provides an accessible treatment for researchers of a topic that has recently gained significant interest in the machine learning community.
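As a toy illustration of the recursive idea, the most basic SMC algorithm, a bootstrap particle filter, can be sketched for an assumed linear-Gaussian state-space model (the model, names, and parameters here are illustrative, not from the monograph):

```python
import math
import random

def bootstrap_pf(ys, n=500, seed=0):
    """Bootstrap particle filter for the toy model
        x_t = 0.9 * x_{t-1} + v_t,   y_t = x_t + e_t,
    with v_t, e_t standard normal. Returns the filtered mean
    E[x_t | y_1:t] at each time step."""
    rng = random.Random(seed)
    parts = [rng.gauss(0.0, 1.0) for _ in range(n)]
    means = []
    for y in ys:
        # Propagate each particle through the transition model.
        parts = [0.9 * x + rng.gauss(0.0, 1.0) for x in parts]
        # Weight by the observation likelihood N(y; x, 1).
        w = [math.exp(-0.5 * (y - x) ** 2) for x in parts]
        tot = sum(w)
        w = [wi / tot for wi in w]
        means.append(sum(wi * x for wi, x in zip(w, parts)))
        # Multinomial resampling to avoid weight degeneracy.
        parts = rng.choices(parts, weights=w, k=n)
    return means
```

Each iteration is one recursive step: propagate, reweight, resample. The practical issues the monograph covers (resampling schemes, degeneracy, proposal design) are all refinements of exactly this loop.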
Using Inertial Sensors for Position and Orientation Estimation

Manon Kok; Jeroen D. Hol; Thomas B. Schon

now publishers Inc
2017
paperback
Microelectromechanical system (MEMS) inertial sensors have become ubiquitous in modern society. Built into mobile telephones, gaming consoles, and virtual reality headsets, such sensors are used on a daily basis. They also have applications in medical therapy devices, motion-capture filming, traffic monitoring systems, and drones. While these sensors provide accurate measurements over short time scales, their accuracy diminishes over longer periods. To date, this problem has been resolved by combining them with additional sensors and models, which adds both expense and size to the devices. This tutorial focuses on the signal processing aspects of position and orientation estimation using inertial sensors. It discusses different modelling choices and a selected number of important algorithms that engineers can use to select the best options for their designs. The algorithms include optimization-based smoothing and filtering as well as computationally cheaper extended Kalman filter and complementary filter implementations. Engineers, researchers, and students deploying MEMS inertial sensors will find that this tutorial is an essential monograph on how to optimize their designs.
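The complementary filter mentioned above is the simplest of these algorithms and can be sketched in a few lines: integrate the gyroscope for short-term accuracy, and blend in the accelerometer angle to correct the long-term drift. This is a generic one-axis toy sketch (the function name and the 0.98 blend factor are illustrative choices, not taken from the monograph):

```python
def complementary_filter(gyro_rates, accel_angles, dt=0.01, alpha=0.98):
    """Fuse gyroscope angular rates (rad/s) and accelerometer-derived
    tilt angles (rad) into one angle estimate per sample. The gyro term
    is effectively high-pass filtered (accurate short term, drifts),
    the accelerometer term low-pass filtered (noisy, but drift-free)."""
    angle = accel_angles[0]  # initialize from the drift-free sensor
    out = []
    for rate, acc in zip(gyro_rates, accel_angles):
        angle = alpha * (angle + rate * dt) + (1.0 - alpha) * acc
        out.append(angle)
    return out
```

With a zero gyro signal and a constant accelerometer reading, the estimate converges to the accelerometer angle, which is precisely the drift-correction behaviour that motivates combining the two sensors.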
Backward Simulation Methods for Monte Carlo Statistical Inference

Fredrik Lindsten; Thomas B. Schön

now publishers Inc
2013
paperback
Monte Carlo methods, in particular those based on Markov chains and on interacting particle systems, are by now tools that are routinely used in machine learning. These methods have had a profound impact on statistical inference in a wide range of application areas where probabilistic models are used. Moreover, there are many algorithms in machine learning that are based on the idea of processing the data sequentially; first in the forward direction, and then in the backward direction. Backward Simulation Methods for Monte Carlo Statistical Inference reviews a branch of Monte Carlo methods that are based on the forward-backward idea, and that are referred to as backward simulators. In recent years, the theory and practice of backward simulation algorithms have undergone a significant development, and the algorithms keep finding new applications. The foundation for these methods is sequential Monte Carlo (SMC). SMC-based backward simulators are capable of addressing smoothing problems in sequential latent variable models, such as general, nonlinear/non-Gaussian state-space models (SSMs). However, this book also clearly shows that the underlying backward simulation idea is by no means restricted to SSMs. Furthermore, backward simulation plays an important role in recent developments of Markov chain Monte Carlo (MCMC) methods. Particle MCMC is a systematic way of using SMC within MCMC. In this framework, backward simulation gives us a way to significantly improve the performance of the samplers. This monograph discusses several related backward-simulation-based methods for state inference as well as learning of static parameters, using both frequentist and Bayesian approaches. This is an excellent primer for anyone interested in this active research area.
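The backward pass itself can be sketched: given the particles and weights stored from a forward SMC run, one smoothed trajectory is drawn by starting from the final filter weights and, moving backward, reweighting each earlier particle by the transition density. The sketch below assumes a toy AR(1) model with transition density N(0.9*x, 1); it is an illustration of the forward-backward idea, not code from the monograph:

```python
import math
import random

def backward_simulate(particles, weights, rng, a=0.9):
    """Draw one smoothed trajectory x_{1:T} by backward simulation.
    `particles[t]` and `weights[t]` are assumed to come from a forward
    SMC pass; the transition density is f(x' | x) = N(a * x, 1)."""
    T = len(particles)
    traj = [None] * T
    # Sample x_T from the final filtering distribution.
    traj[-1] = rng.choices(particles[-1], weights=weights[-1], k=1)[0]
    for t in range(T - 2, -1, -1):
        # Reweight time-t particles by how well they explain x_{t+1}.
        bw = [w * math.exp(-0.5 * (traj[t + 1] - a * x) ** 2)
              for x, w in zip(particles[t], weights[t])]
        traj[t] = rng.choices(particles[t], weights=bw, k=1)[0]
    return traj
```

Repeating the draw yields independent trajectories from (an approximation of) the joint smoothing distribution, which is what particle-MCMC samplers exploit to improve mixing.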