Probabilistic Inference and Forecasting in the Sciences -- Syllabus

- Uncertainty in measurements: a critical introduction. Some metrological premises, following the ISO's GUM recommendations.
- Concept and evaluation of probability: basic rules; uncertain numbers and vectors (aka 'random variables'), discrete and continuous; remarkable distributions and relations among them. Main theorems of probability theory.
- Distribution of functions of uncertain numbers ('propagation of uncertainties'): exact and approximate methods. Introduction to Monte Carlo methods (see the first sketch after this list).
- Short introduction to the R language (students may instead use Python, Julia, or any other language they prefer).
- From probability of effects to probability of causes: meaning and use of the so-called 'Bayes rule'. Parametric inference and forecasting in the case of Gaussian, binomial and Poisson 'likelihoods' (see the second sketch after this list).
- Probabilistic ('Bayesian') networks: general idea and practical approach using HUGIN (and Netica).
- Probabilistic forecasting and decision making.
- Combination of results. Treatment of uncertainties due to systematic errors.
- Critical cases in frontier-type measurements: upper/lower bounds.
- Fits (just parametric inference): general approach and details in simple cases. Recovering approximate methods (e.g. maximum likelihood and least squares) under well-understood assumptions.
- Other applications, including unfolding experimental distributions distorted by instrumental and/or physical effects.
- Computational issues: Gaussian approximation; conjugate priors; Monte Carlo methods and, in particular, Markov Chain Monte Carlo (MCMC). Use of JAGS (Just Another Gibbs Sampler) for inference and forecasting.
- Role of the probabilistic treatment of uncertainty in Artificial Intelligence.
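
As an illustration of the 'propagation of uncertainties' and Monte Carlo items above, here is a minimal R sketch (illustrative only, not course material): it propagates the uncertainties on two hypothetical, independently measured quantities a and b, with assumed Gaussian uncertainties, through g = a / b^2 by direct sampling, and compares the result with the first-order ('GUM-like') linear approximation. All numerical values are invented for the example.

    # Minimal sketch: Monte Carlo propagation of uncertainty (illustrative only).
    # Assumed inputs: a = 9.81 +/- 0.05, b = 1.00 +/- 0.01 (Gaussian, independent);
    # quantity of interest: g = a / b^2.
    set.seed(1)                    # for reproducibility
    n <- 1e6                       # number of Monte Carlo samples
    a <- rnorm(n, mean = 9.81, sd = 0.05)
    b <- rnorm(n, mean = 1.00, sd = 0.01)
    g <- a / b^2                   # propagate by direct sampling
    c(mean = mean(g), sd = sd(g))  # Monte Carlo summary of g
    # First-order (linear) propagation for comparison:
    # sigma_g^2 ~= (dg/da)^2 * sigma_a^2 + (dg/db)^2 * sigma_b^2
    sqrt((1 / 1.00^2)^2 * 0.05^2 + (2 * 9.81 / 1.00^3)^2 * 0.01^2)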
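
Similarly, as a sketch of the 'Bayes rule', conjugate-prior and forecasting items, the following R snippet updates a Gamma prior with one observed Poisson count and simulates the posterior predictive distribution of a future count. The observed count (7) and the weak Gamma(1, 0.1) prior are assumptions made purely for illustration.

    # Minimal sketch: conjugate Bayesian inference for a Poisson rate (illustrative only).
    # Model: x | lambda ~ Poisson(lambda), prior lambda ~ Gamma(alpha0, beta0).
    # Posterior (one observation): lambda | x ~ Gamma(alpha0 + x, beta0 + 1).
    x      <- 7                      # assumed observed count
    alpha0 <- 1; beta0 <- 0.1        # assumed weak prior
    alpha1 <- alpha0 + x             # posterior shape
    beta1  <- beta0 + 1              # posterior rate
    c(post_mean = alpha1 / beta1,    # posterior mean and standard deviation
      post_sd   = sqrt(alpha1) / beta1)
    # 95% probability interval for lambda:
    qgamma(c(0.025, 0.975), shape = alpha1, rate = beta1)
    # Forecast of a future count (posterior predictive), by simulation:
    set.seed(1)
    lambda_draws <- rgamma(1e5, shape = alpha1, rate = beta1)
    x_future     <- rpois(1e5, lambda_draws)
    quantile(x_future, c(0.025, 0.5, 0.975))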