Review of Probability Theory and Random Variables Ppt

Lecture 2: Probability Review and Random Process

Presentation Transcript

  1. Lecture 2: Probability Review and Random Process

  2. Review of last lecture • The points worth noting are: • The source coding algorithm plays an important role in the code rate (compressing the data) • The channel encoder introduces redundancy into the information • The modulation scheme plays an important role in deciding the information rate and the immunity of the signal towards the errors introduced by the channel • The channel can introduce many types of errors, due to thermal noise etc. • The demodulator and decoder should provide a low Bit Error Rate (BER).

  3. Review: Layering of Source Coding • Source coding includes • Sampling • Quantization • Symbols to bits • Compression • Decoding includes • Decompression • Bits to symbols • Symbols to sequence of numbers • Sequence to waveform (reconstruction)

  4. Review: Layering of Source Coding

  5. Review: Layering of Channel Coding • Channel coding is divided into • Discrete encoder/decoder • Used to correct channel errors • Modulation/demodulation • Used to map bits to waveforms for transmission

  6. Review: Layering of Channel Coding

  7. Review: Resources of a Communication System • Transmitted power • Average power of the transmitted signal • Bandwidth (spectrum) • Band of frequencies allocated for the signal • Type of communication system • Power-limited systems • Space communication links • Band-limited systems • Telephone systems

  8. Review: Digital Communication System • Important features of a DCS: • The transmitter sends a waveform from a finite set of possible waveforms during a limited time • The channel distorts and attenuates the transmitted signal and adds noise to it • The receiver decides which waveform was transmitted from the noisy received signal • The probability of an erroneous decision is an important measure of system performance

  9. Review of Probability

  10. Sample Space and Probability • Random experiment: its outcome, for some reason, cannot be predicted with certainty. • Examples: throwing a die, flipping a coin and drawing a card from a deck. • Sample space: the set of all possible outcomes, denoted by S. Outcomes are denoted by E's and each E lies in S, i.e., E ∈ S. • A sample space can be discrete or continuous. • Events are subsets of the sample space for which measures of their occurrences, called probabilities, can be defined or determined.

  11. Three Axioms of Probability • For a discrete sample space S, define a probability measure P on S as a set function that assigns nonnegative values to all events, denoted by E, in S such that the following conditions are satisfied: • Axiom 1: 0 ≤ P(E) ≤ 1 for all E ∈ S • Axiom 2: P(S) = 1 (when an experiment is conducted there has to be an outcome) • Axiom 3: For mutually exclusive events E1, E2, E3, . . . we have P(E1 ∪ E2 ∪ · · ·) = P(E1) + P(E2) + · · ·

  12. Conditional Probability • We observe or are told that event E1 has occurred but are actually interested in event E2: knowledge that E1 has occurred changes the probability of E2 occurring. • If it was P(E2) before, it now becomes P(E2|E1), the probability of E2 occurring given that event E1 has occurred. • This conditional probability is given by P(E2|E1) = P(E2 ∩ E1)/P(E1). • If P(E2|E1) = P(E2), or P(E2 ∩ E1) = P(E1)P(E2), then E1 and E2 are said to be statistically independent. • Bayes' rule: P(E2|E1) = P(E1|E2)P(E2)/P(E1)
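A short Python sketch of the conditional-probability definition and Bayes' rule; the die events E1 (even) and E2 (greater than 3) are our own illustrative choices, not from the slides.

```python
# Conditional probability and Bayes' rule on a fair die.
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}   # sample space
E1 = {2, 4, 6}           # event: outcome is even
E2 = {4, 5, 6}           # event: outcome is greater than 3

def P(event):
    # Equally likely outcomes: P(E) = |E| / |S|
    return Fraction(len(event), len(S))

# Definition: P(E2 | E1) = P(E2 ∩ E1) / P(E1)
p_e2_given_e1 = P(E2 & E1) / P(E1)
print(p_e2_given_e1)     # 2/3

# Bayes' rule: P(E2|E1) = P(E1|E2) P(E2) / P(E1)
p_e1_given_e2 = P(E1 & E2) / P(E2)
assert p_e2_given_e1 == p_e1_given_e2 * P(E2) / P(E1)
```

Using `Fraction` keeps the probabilities exact, so Bayes' rule can be checked with equality rather than a floating-point tolerance.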

  13. Mathematical Model for Signals • Mathematical models for representing signals • Deterministic • Stochastic • Deterministic signal: no uncertainty with respect to the signal value at any time. • Deterministic signals or waveforms are modeled by explicit mathematical expressions, such as x(t) = 5 cos(10t). • Inappropriate for real-world problems??? • Stochastic/random signal: some degree of uncertainty in signal values before it actually occurs. • For a random waveform it is not possible to write such an explicit expression. • A random waveform/random process may exhibit certain regularities that can be described in terms of probabilities and statistical averages. • e.g. thermal noise in electronic circuits due to the random motion of electrons

  14. Energy and Power Signals • The performance of a communication system depends on the received signal energy: higher-energy signals are detected more reliably (with fewer errors) than lower-energy signals. • An electrical signal can be represented as a voltage v(t) or a current i(t) with instantaneous power p(t) across a resistor R defined by p(t) = v²(t)/R or p(t) = i²(t)R

  15. Energy and Power Signals • In communication systems, power is often normalized by assuming R to be 1. • The normalization convention allows us to express the instantaneous power as p(t) = x²(t), where x(t) is either a voltage or a current signal. • The energy dissipated during the time interval (−T/2, T/2) by a real signal with instantaneous power expressed by Equation (1.4) can then be written as: ExT = ∫−T/2..T/2 x²(t) dt • The average power dissipated by the signal during the interval is: PxT = (1/T) ∫−T/2..T/2 x²(t) dt

  16. Energy and Power Signals • We classify x(t) as an energy signal if, and only if, it has nonzero but finite energy (0 < Ex < ∞) for all time, where Ex = lim T→∞ ∫−T/2..T/2 x²(t) dt • An energy signal has finite energy but zero average power • Signals that are both deterministic and non-periodic are termed energy signals

  17. Energy and Power Signals • Power is the rate at which energy is delivered • We classify x(t) as a power signal if, and only if, it has nonzero but finite power (0 < Px < ∞) for all time, where Px = lim T→∞ (1/T) ∫−T/2..T/2 x²(t) dt • A power signal has finite power but infinite energy • Signals that are random or periodic are termed power signals
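The energy/power classification above can be checked numerically. This sketch uses two example signals of our own choosing (not from the slides): a decaying pulse, whose energy converges, and a sinusoid, whose average power converges.

```python
# Energy signal vs. power signal, checked by numerical integration.
import numpy as np

T = 200.0                       # observation window (-T/2, T/2)
dt = 1e-3
t = np.arange(-T/2, T/2, dt)

pulse = np.exp(-np.abs(t))      # energy signal: E = ∫ e^{-2|t|} dt = 1
sine = np.cos(2 * np.pi * t)    # power signal: average power = 1/2

E_pulse = np.sum(pulse**2) * dt       # ∫ x²(t) dt over the window
P_sine = np.sum(sine**2) * dt / T     # (1/T) ∫ x²(t) dt

print(E_pulse)   # ≈ 1.0: finite energy (grows no further as T increases)
print(P_sine)    # ≈ 0.5: finite average power (energy grows with T)
```

Enlarging T leaves E_pulse essentially unchanged but grows the sinusoid's energy without bound, which is exactly the distinction the slide draws.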

  18. Random Variable • A function whose domain is a sample space and whose range is some set of real numbers is called a random variable. • Types of RVs • Discrete • e.g. outcomes of flipping a coin etc. • Continuous • e.g. amplitude of a noise voltage at a particular instant of time

  19. Random Variables • All useful signals are random, i.e. the receiver does not know a priori what waveform is going to be sent by the transmitter • Let a random variable X(A) represent the functional relationship between a random outcome A and a real number. • The distribution function FX(x) of the random variable X is given by FX(x) = P(X ≤ x)

  20. Random Variable • A random variable is a mapping from the sample space to the set of real numbers. • We shall denote random variables by boldface, i.e., x, y, etc., while individual or specific values of the mapping x are denoted by x(w).

  21. Random Process • A random process is a collection of time functions, or signals, corresponding to various outcomes of a random experiment. For each outcome, there exists a deterministic function, which is called a sample function or a realization. (Figure: sample functions/realizations plotted as real numbers versus time t; the values across realizations at a fixed time form a random variable.)

  22. Random Process • A mapping from a sample space to a set of time functions.

  23. Random Process contd. • Ensemble: the set of possible time functions that one sees. • Denote this set by x(t), where the time functions x1(t, w1), x2(t, w2), x3(t, w3), . . . are specific members of the ensemble. • At any time instant, t = tk, we have the random variable x(tk). • At any two time instants, say t1 and t2, we have two different random variables x(t1) and x(t2). • Any relationship b/w any two random variables is called their joint PDF

  24. Classification of Random Processes • Based on whether its statistics change with time: the process is non-stationary or stationary. • Different levels of stationarity: • Strictly stationary: the joint pdf of any order is independent of a shift in time. • Nth-order stationary: the joint pdf does not depend on the time shift, but depends on the time spacing

  25. Cumulative Distribution Function (cdf) • The cdf gives a complete description of the random variable. It is defined as: FX(x) = P(E ∈ S : X(E) ≤ x) = P(X ≤ x). • The cdf has the following properties: • 0 ≤ FX(x) ≤ 1 (this follows from Axiom 1 of the probability measure). • FX(x) is non-decreasing: FX(x1) ≤ FX(x2) if x1 ≤ x2 (this is because the event X(E) ≤ x1 is contained in the event X(E) ≤ x2). • FX(−∞) = 0 and FX(+∞) = 1 (X(E) ≤ −∞ is the empty set, hence an impossible event, while X(E) ≤ ∞ is the whole sample space, i.e., a certain event). • P(a < X ≤ b) = FX(b) − FX(a).
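The cdf properties listed above can be verified on an empirical cdf. A sketch using simulated die throws (our own example, not from the slides):

```python
# Empirical cdf of die throws, checked against the listed cdf properties.
import numpy as np

rng = np.random.default_rng(0)
samples = rng.integers(1, 7, size=10_000)    # 10,000 fair-die throws

def F(x):
    # Empirical cdf: fraction of samples with X <= x
    return np.mean(samples <= x)

assert F(-np.inf) == 0.0 and F(np.inf) == 1.0   # F(-inf)=0, F(+inf)=1
assert F(2) <= F(5)                             # non-decreasing

p = F(5) - F(2)            # P(2 < X <= 5) = F(5) - F(2)
print(p)                   # ≈ 3/6 = 0.5 for a fair die
```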

  26. Probability Density Function • The pdf is defined as the derivative of the cdf: fX(x) = d/dx FX(x) • It follows that FX(x) = ∫−∞..x fX(u) du. • For a discrete random variable with probability masses pi, note that, for all i, one has pi ≥ 0 and Σ pi = 1.

  27. Joint cdf and Joint pdf • Often encountered when dealing with combined experiments or repeated trials of a single experiment. • Multiple random variables are basically multidimensional functions defined on a sample space of a combined experiment. • Experiment 1: S1 = {x1, x2, …, xm} • Experiment 2: S2 = {y1, y2, …, yn} • If we take any one element from S1 and any one from S2: • 0 ≤ P(xi, yj) ≤ 1 (joint probability of two or more outcomes) • Marginal probability distributions: • Σ over all j of P(xi, yj) = P(xi) • Σ over all i of P(xi, yj) = P(yj)
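A small numerical sketch of the joint and marginal distributions above; the 2×3 joint-probability table is a hypothetical example of our own, not from the slides.

```python
# Joint probabilities P(xi, yj) and their marginals.
import numpy as np

# Rows: outcomes xi of experiment 1; columns: outcomes yj of experiment 2.
P = np.array([[0.10, 0.20, 0.10],
              [0.15, 0.25, 0.20]])

assert np.isclose(P.sum(), 1.0)   # a valid joint distribution sums to 1

P_x = P.sum(axis=1)   # marginal: sum over j of P(xi, yj) = P(xi)
P_y = P.sum(axis=0)   # marginal: sum over i of P(xi, yj) = P(yj)

print(P_x)   # ≈ [0.4, 0.6]
print(P_y)   # ≈ [0.25, 0.45, 0.30]
```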

  28. Expectation of Random Variables (Statistical Averages) • Statistical averages, or moments, play an important role in the characterization of the random variable. • The first moment of the probability distribution of a random variable X is called the mean value mX, or expected value, of the random variable X • The second moment of a probability distribution is the mean-square value of X • Central moments are the moments of the difference between X and mX, and the second central moment is the variance of X. • The variance is equal to the difference between the mean-square value and the square of the mean

  29. Contd • The variance provides a measure of the variable's "randomness". • The mean and variance of a random variable give a partial description of its pdf.
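The moment definitions above can be worked through exactly for a fair die (our own example, not from the slides), including the identity that the variance equals the mean-square value minus the squared mean.

```python
# Mean (1st moment), mean-square (2nd moment), and variance
# (2nd central moment) of a fair die, computed from the pmf.
from fractions import Fraction

outcomes = range(1, 7)
p = Fraction(1, 6)                               # fair-die pmf

m_x = sum(p * x for x in outcomes)               # E[X]
ms_x = sum(p * x**2 for x in outcomes)           # E[X^2]
var_x = sum(p * (x - m_x)**2 for x in outcomes)  # E[(X - m_x)^2]

print(m_x)      # 7/2
print(ms_x)     # 91/6
print(var_x)    # 35/12

# Variance = mean-square value minus the square of the mean
assert var_x == ms_x - m_x**2
```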

  30. Time Averaging and Ergodicity • An ergodic process is one where any member of the ensemble exhibits the same statistical behavior as the whole ensemble. • For an ergodic process: to measure various statistical averages, it is sufficient to look at only one realization of the process and find the corresponding time average. • For a process to be ergodic it must be stationary. The converse is not true.
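A minimal sketch of ergodicity, assuming an i.i.d. Gaussian process (which is ergodic): the time average of a single realization agrees with the ensemble average taken across realizations at one instant.

```python
# Time average of one realization vs. ensemble average at one instant.
import numpy as np

rng = np.random.default_rng(1)
# 200 realizations, each 2000 time samples, of i.i.d. N(2, 1) noise.
ensemble = rng.normal(loc=2.0, scale=1.0, size=(200, 2000))

time_avg = ensemble[0].mean()       # average one realization over time
ensemble_avg = ensemble[:, 0].mean()  # average across realizations at t0

print(time_avg, ensemble_avg)       # both close to the true mean 2.0
```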

  31. Gaussian (or Normal) Random Variable (Process) • A continuous random variable whose pdf is fX(x) = (1/√(2πσ²)) exp(−(x − μ)²/(2σ²)), where μ and σ² are parameters. Usually denoted as N(μ, σ²). • The most important and frequently encountered random variable in communications.

  32. Central Limit Theorem • The CLT provides justification for using a Gaussian process as a model, provided that • The random variables are statistically independent • The random variables have distributions with the same mean and variance

  33. CLT • The central limit theorem states that • "The probability distribution of Vn approaches a normalized Gaussian distribution N(0, 1) in the limit as the number of random variables approaches infinity" • When N is finite, it may provide a poor approximation of the actual probability distribution
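A quick CLT sketch, taking uniform random variables as the (i.i.d.) summands; the choice of uniforms and of N = 50 is ours, not from the slides.

```python
# Normalized sums of i.i.d. uniform variables approach N(0, 1).
import numpy as np

rng = np.random.default_rng(2)
N = 50                                   # variables per sum
u = rng.uniform(size=(100_000, N))       # each has mean 1/2, variance 1/12

# V_n: the sum, normalized to zero mean and unit variance.
v = (u.sum(axis=1) - N / 2) / np.sqrt(N / 12)

print(v.mean(), v.std())                 # close to 0 and 1 respectively
# For N(0, 1) about 68% of the mass lies within one standard deviation.
print(np.mean(np.abs(v) < 1))            # ≈ 0.68
```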

  34. Autocorrelation of Energy Signals • Correlation is a matching process; autocorrelation refers to the matching of a signal with a delayed version of itself • The autocorrelation function of a real-valued energy signal x(t) is defined as: Rx(τ) = ∫−∞..∞ x(t) x(t + τ) dt • The autocorrelation function Rx(τ) provides a measure of how closely the signal matches a copy of itself as the copy is shifted τ units in time. • Rx(τ) is not a function of time; it is only a function of the time difference τ between the waveform and its shifted copy.

  35. Autocorrelation • The autocorrelation function of a real-valued energy signal has the following properties: • symmetrical in τ about zero • maximum value occurs at the origin • autocorrelation and ESD form a Fourier transform pair, as designated by the double-headed arrows • value at the origin is equal to the energy of the signal
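The symmetry, peak-at-origin, and energy-at-origin properties can be checked numerically. A sketch using a unit rectangular pulse (our own example; its autocorrelation is classically a triangle):

```python
# Autocorrelation of a rectangular pulse, checked against the
# listed properties: symmetric in tau, maximal at tau = 0,
# and R_x(0) equal to the pulse energy.
import numpy as np

dt = 1e-3
t = np.arange(0, 1, dt)
x = np.ones_like(t)                        # unit pulse of duration 1 s

R = np.correlate(x, x, mode="full") * dt   # R_x(tau) on a grid of shifts
mid = len(R) // 2                          # index of tau = 0

assert np.allclose(R, R[::-1])             # symmetric about tau = 0
assert np.argmax(R) == mid                 # maximum at the origin
print(R[mid])                              # ≈ 1.0 = energy of the pulse
```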

  36. Autocorrelation of a Periodic (Power) Signal • The autocorrelation function of a real-valued power signal x(t) is defined as: Rx(τ) = lim T→∞ (1/T) ∫−T/2..T/2 x(t) x(t + τ) dt • When the power signal x(t) is periodic with period T0, the autocorrelation function can be expressed as: Rx(τ) = (1/T0) ∫−T0/2..T0/2 x(t) x(t + τ) dt

  37. Autocorrelation of Power Signals • The autocorrelation function of a real-valued periodic signal has properties similar to those of an energy signal: • symmetrical in τ about zero • maximum value occurs at the origin • autocorrelation and PSD form a Fourier transform pair, as designated by the double-headed arrows • value at the origin is equal to the average power of the signal

  38. Spectral Density

  39. Spectral Density • The spectral density of a signal characterizes the distribution of the signal's energy or power in the frequency domain • This concept is particularly important when considering filtering in communication systems, where we evaluate the signal and noise at the filter output. • The energy spectral density (ESD) or the power spectral density (PSD) is used in the evaluation. • We need to determine how the average power or energy of the process is distributed in frequency.

  40. Spectral Density • Taking the Fourier transform of the random process does not work

  41. Energy Spectral Density • Energy spectral density describes the signal energy per unit bandwidth, measured in joules/hertz • It is represented as the squared magnitude spectrum, ψx(f) = |X(f)|² • According to Parseval's relation: Ex = ∫−∞..∞ x²(t) dt = ∫−∞..∞ |X(f)|² df • The energy spectral density is symmetrical in frequency about the origin, and the total energy of the signal x(t) can be expressed as Ex = 2 ∫0..∞ ψx(f) df
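Parseval's relation above can be verified numerically with a discrete Fourier transform. A sketch using a Gaussian pulse (the pulse shape and grid are our own illustrative choices):

```python
# Parseval check: energy computed in the time domain equals the
# integral of the (discretized) energy spectral density |X(f)|^2.
import numpy as np

dt = 1e-3
t = np.arange(-5, 5, dt)
x = np.exp(-t**2)                    # a Gaussian pulse

E_time = np.sum(x**2) * dt           # ∫ x²(t) dt

X = np.fft.fft(x) * dt               # approximate continuous X(f)
df = 1.0 / (len(x) * dt)             # frequency-grid spacing
E_freq = np.sum(np.abs(X)**2) * df   # ∫ |X(f)|² df

print(E_time)                        # ≈ 1.2533 (= sqrt(pi/2)) for this pulse
assert abs(E_time - E_freq) < 1e-6   # Parseval's relation holds
```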

  42. Power Spectral Density • The power spectral density (PSD) function Gx(f) of the periodic signal x(t) is a real, even, and nonnegative function of frequency that gives the distribution of the power of x(t) in the frequency domain. • For a periodic signal the PSD is represented as a Fourier series: Gx(f) = Σn |cn|² δ(f − nf0) • PSD of non-periodic signals: Gx(f) = lim T→∞ (1/T) |XT(f)|² • The average power of a periodic signal x(t) is then represented as: Px = ∫−∞..∞ Gx(f) df = Σn |cn|²

  43. Noise

  44. Noise in the Communication System • The term noise refers to unwanted electrical signals that are always present in electrical systems: e.g. spark-plug ignition noise, switching transients, and other electromagnetic signals, or atmospheric noise from the sun and other galactic sources • Thermal noise can be described as a zero-mean Gaussian random process • A Gaussian process n(t) is a random function whose value n at any arbitrary time t is statistically characterized by the Gaussian probability density function

  45. White Noise • The primary spectral characteristic of thermal noise is that its power spectral density is the same for all frequencies of interest in most communication systems • A thermal noise source emanates an equal amount of noise power per unit bandwidth at all frequencies, from dc to about 10¹² Hz. • Power spectral density: Gn(f) = N0/2 watts/hertz • The autocorrelation function of white noise is Rn(τ) = (N0/2) δ(τ) • The average power P of white noise is infinite
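In discrete time, the impulse-like autocorrelation above means that distinct noise samples are uncorrelated. A sketch with simulated Gaussian noise (the variance value is our own choice):

```python
# Discrete-time white Gaussian noise: the autocorrelation is
# (approximately) a spike of height sigma^2 at zero lag, and
# essentially zero at any nonzero lag.
import numpy as np

rng = np.random.default_rng(3)
sigma2 = 2.0
n = rng.normal(scale=np.sqrt(sigma2), size=100_000)

R0 = np.mean(n * n)            # R_n(0) = average power = sigma^2
R1 = np.mean(n[:-1] * n[1:])   # R_n at a one-sample shift

print(R0)   # ≈ 2.0
print(R1)   # ≈ 0.0: samples are uncorrelated
```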

  46. White Noise

  47. White Noise • Since Rn(τ) = 0 for τ ≠ 0, any two different samples of white noise, no matter how close in time they are taken, are uncorrelated. • Since the samples of white noise are uncorrelated, if the noise is both white and Gaussian (for example, thermal noise) then the noise samples are also independent.

  48. Additive White Gaussian Noise (AWGN) • The effect on the detection process of a channel with additive white Gaussian noise (AWGN) is that the noise affects each transmitted symbol independently • Such a channel is called a memoryless channel • The term "additive" means that the noise is simply superimposed or added to the signal, and that there are no multiplicative mechanisms at work
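The "additive" and "memoryless" points above can be illustrated with a small channel simulation. The BPSK mapping, noise level, and threshold detector are our own illustrative choices, not from the slides:

```python
# AWGN channel sketch: noise is simply added to each transmitted
# symbol, independently of all others (memoryless channel).
import numpy as np

rng = np.random.default_rng(4)
bits = rng.integers(0, 2, size=100_000)
symbols = 2 * bits - 1                    # BPSK mapping: 0 -> -1, 1 -> +1

noise = rng.normal(scale=0.5, size=symbols.shape)   # white Gaussian noise
received = symbols + noise                # "additive": noise superimposed

decisions = (received > 0).astype(int)    # simple threshold detector
ber = np.mean(decisions != bits)          # bit error rate
print(ber)                                # small but nonzero BER
```

With noise standard deviation 0.5 and unit-amplitude symbols, the expected error rate is Q(2) ≈ 0.023, so the simulated BER should land near that value.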

Source: https://www.slideserve.com/cmarianne/lecture-2-probability-review-and-random-process-powerpoint-ppt-presentation
