V. STATISTICS IN CLINICAL DECISION
We must extract some features from our pre-processed signals.
Random experiment: an experiment whose result is not known a priori (such as a coin toss). Biomedical experiments can be uncertain due to the biological variability of the subjects, measurement noise, etc. In practice, if we measure the heart rate, we do not know an exact value: we only know a range in which this value varies, or we have an approximate value.
Elementary events: a possible result of an experiment that cannot be decomposed into simpler components. For example, HR = 55 bpm for a specific event.
Sample space S: the totality of the outcomes of all elementary events. For the heart rate, all the possible values from 45 to 250 bpm.
Composed events: a subset of S (for example, even numbers when the experiment is the roll of a die). For the heart rate, HR during bradycardia.
Repeated experiments: if S and n(S) are the sample space and the number of elementary events of a single experiment, then the experiment repeated k times has n(S)^k possible elementary events (for example, a coin tossed k times has 2^k possible outcome sequences).
SET THEORY – it belongs to a mathematical field, but it is useful because its operations translate directly into code functions (see the sketch after this list):
- Complementary set. Given a space S and a set A, the complementary set of A is the set of points that do not belong to A. It is denoted as Ā. In formal logic, the complement corresponds to the NOT operation.
- Union set. Given two sets A and B, the union set consists of all the points that belong to A, to B, or to both. It is denoted as A∪B or A+B; one special case is Ā ∪ A = S. In formal logic, the union corresponds to the (non-exclusive) OR operation.
- Intersection set. Given two sets A and B, the intersection set consists of all points that belong to both A and B. It is denoted by A∩B or AB. The sets A and B are called disjoint if A∩B = ∅. In formal logic, the intersection corresponds to the AND operation.
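As a rough illustration, these three operations map directly onto Python's built-in set type; the sample space and the events A and B below (heart-rate values, bradycardia) are invented for the example.

```python
# A minimal sketch of the set operations above using Python's built-in sets.
# The sample space S and the events A and B are illustrative, not from the notes.

S = set(range(45, 251))                # sample space: HR values from 45 to 250 bpm
A = {hr for hr in S if hr < 60}        # event A: bradycardia (HR below 60 bpm)
B = {hr for hr in S if hr % 2 == 0}    # event B: even HR values

complement_A = S - A                   # complementary set Ā: NOT A
union = A | B                          # union A ∪ B: (non-exclusive) OR
intersection = A & B                   # intersection A ∩ B: AND

assert (A | complement_A) == S         # special case: Ā ∪ A = S
print(len(complement_A), len(union), len(intersection))
```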
2. Subjective definition
It allows us to estimate the value of P(A) based on prior knowledge and particular circumstances. Let N be the number of possible cases (N finite), and N_A the number of favourable cases:
P(A) = N_A / N
Since we cannot repeat the experiment an infinite number of times, we drop the limit.
3. Axiomatic definition – we will use it extensively
According to it, probability is a function defined on S that satisfies the axioms of Kolmogorov:
- Axiom of non-negativity: P(A) ≥ 0 for any event A ∈ S. Probability indicates uncertainty and is expressed with a non-negative number.
- Axiom of normalization: P(S) = 1. If we perform an experiment, the probability of obtaining some outcome is one: there will always be an outcome.
- Axiom of additivity: if A ∩ B = ∅ (no point in common), then P(A∪B) = P(A) + P(B). If one considers bradycardia and sinus rhythm, which have no point in common, the probability that the subject shows one or the other is equal to the probability of bradycardia plus the probability of sinus rhythm.
Probability of equally probable events - If n(S) is the number of equally probable events, then the probability of each event is:
p = 1/n(S)
Conditional probability - The event A is said to be dependent on the event B if the probability of event A depends on whether the event B has occurred or not. For dependent events, the following theorem of conditional probability holds:
P(A | B) = P(A ∩ B) / P(B)
where n(A) is the number of trials in which the event A occurs.
For independent events:
P(A ∩ B) = P(A) * P(B)
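As a toy sketch, with invented counts, both the conditional-probability formula and the independence check can be estimated from relative frequencies:

```python
# Estimating conditional probability from counts; all numbers are invented.

n_total = 1000    # total number of repeated experiments
n_A = 250         # experiments where event A occurred
n_B = 200         # experiments where event B occurred
n_A_and_B = 50    # experiments where both A and B occurred

P_A = n_A / n_total
P_B = n_B / n_total
P_A_and_B = n_A_and_B / n_total

# Conditional probability: P(A | B) = P(A ∩ B) / P(B)
P_A_given_B = P_A_and_B / P_B
print(P_A_given_B)                              # 0.25

# Independence check: here P(A ∩ B) = P(A) * P(B), so A does not depend on B
print(abs(P_A_and_B - P_A * P_B) < 1e-12)       # True
```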
RANDOM VARIABLES - A random variable represents the numerical result attributed to the outcome of an experiment:
X : S → R
Conventionally, X indicates the random variable, while x represents the associated numerical value. For example, if we consider gender, we will have 0 for female and 1 for male. The cumulative distribution function associated with X is:
F_X(x) = P(X ≤ x)
This function, defined on the random variable X, gives for each value x the probability that the random variable is less than or equal to that value. For example, the value of the cumulative distribution function at 80 bpm is the probability of having a heart rate lower than or equal to 80 bpm.
Properties of F_X(x):
- P{x1 < X ≤ x2} = F_X(x2) − F_X(x1)
- 0 ≤ F_X(x) ≤ 1
- F_X is non-decreasing and continuous: F_X(x1) ≤ F_X(x2) for every x1 ≤ x2
Continuous random variable – when the cumulative distribution function is continuous for every x.
The probability density function is defined as:
f_X(x) = dF_X(x)/dx,  with f_X(x) ≥ 0
∫_{−∞}^{+∞} f_X(x) dx = 1
F_X(x) = ∫_{−∞}^{x} f_X(z) dz
P{a ≤ X ≤ b} = ∫_{a}^{b} f_X(x) dx
Continuous distributions
- Gaussian or normal distribution (η = mean value; σ = standard deviation): what we have seen in the previous graph (a). There η is zero, the point at which the maximum occurs, while σ says how wide the shape is.
- Uniform distribution: a rectangular distribution for values between a and b; the function is zero before and after the rectangle, and inside the rectangle the density is 1/(b − a). Gender usually has a uniform distribution.
- Exponential distribution: the probability decreases with time following an exponential trend.
Discrete random variables – a random variable X is discrete when there exists a countable set N ⊂ R such that P{X ∈ N} = 1. Its cumulative distribution function is piecewise constant and has discontinuities only at the elements of N. If, as in (b), we have two points, 1 for male and 0 for female, F_X(x) is 0 for all values x < 0, then it jumps to 0.5 at x = 0. Since no values are allowed between 0 and 1, it stays constant, and at x = 1 it jumps to 1. Probability values for outcomes other than 0 and 1 are not allowed.
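As a rough sketch of these three continuous distributions, assuming scipy is available; the parameter values (η, σ, a, b, λ) are illustrative, not from the notes:

```python
# Evaluating the Gaussian, uniform, and exponential densities via scipy.stats.
from scipy import stats

gaussian = stats.norm(loc=0, scale=1)      # η = 0, σ = 1
uniform = stats.uniform(loc=-1, scale=2)   # rectangle between a = -1 and b = 1
exponential = stats.expon(scale=1.0)       # rate λ = 1 (scale = 1/λ)

print(gaussian.pdf(0))       # maximum of the Gaussian density, at x = η
print(uniform.pdf(0))        # 1/(b − a) = 0.5 inside the rectangle
print(exponential.pdf(0))    # λ at t = 0; the density then decays exponentially

# The CDF is the integral of the density (see the equations above):
print(gaussian.cdf(0))       # F_X(η) = 0.5: half of the area lies below the mean
```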
The probability mass function: suppose the possible outcomes are x1, x2, …, xn (e.g., male/female) and each event has an associated probability. If we sum the probabilities of all the events, the result must equal one. From this definition, the cumulative distribution function is the summation of the probabilities of all events with xi ≤ x.
The probability density function of a discrete random variable is defined as f_X(x) = Σi P(X = xi) δ(x − xi): the summation, over all possible outcomes, of the probability of that outcome times a delta function located at that value of the random variable. Considering male/female, we will have two delta functions, at 0 and 1, each with amplitude 0.5.
Discrete distributions
Binomial distribution: to be applied when the experiment has a binary output (a Bernoulli variable indicating success or failure) and consists of n trials where the probability of success is p.
The random variable X, called binomial, counts the number k of successes in the n trials; its pmf is P(X = k) = (n choose k) p^k (1 − p)^(n−k). The n experiments are independent, and the probability of success does not change across trials.
Poisson distribution. In this distribution the randomness is related to the time of occurrence of a certain phenomenon. It indicates the probability that, in a certain interval of duration T, k events related to the phenomenon occur, with λ being the average number of events per unit of time: P(k) = (λT)^k e^(−λT) / k!.
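A quick sketch of both pmfs with scipy.stats (assumed available); the values of n, p, λ, and T are illustrative:

```python
# Evaluating the binomial and Poisson pmfs; parameter values are invented.
from scipy import stats

n, p = 10, 0.3                    # binomial: 10 independent trials, P(success) = 0.3
binom = stats.binom(n, p)
print(binom.pmf(3))               # P(X = 3): exactly 3 successes in 10 trials

lam, T = 2.0, 1.5                 # Poisson: λ = 2 events per unit time, duration T = 1.5
poisson = stats.poisson(mu=lam * T)
print(poisson.pmf(4))             # P(k = 4): exactly 4 events in the interval

# Each pmf sums to 1 over all possible outcomes:
print(sum(binom.pmf(k) for k in range(n + 1)))   # ≈ 1.0
```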
RANDOM VARIABLES IN CLINICAL DECISION
Clinical decisions are always based on probability; no diagnosis is 100% certain. When a clinician says that you are sick, they are saying that there is a high probability that you are sick, and the same holds in the case of a healthy response.
We can have two kinds of probability:
A priori probability: the probability that a subject has a disease independently of the fact that we are doing the investigation. Age, smoking, and body mass increase the probability of having an infarction.
A posteriori probability: the probability that the subject has the disease after the test has been done. The a posteriori probability is estimated with Bayes' theorem, for both positive and negative tests. It is applied after the subject has the result of the test: if the test is positive, the probability of having the disease is much higher than the one estimated a priori; if the test is negative, the probability of being sick decreases with respect to the a priori probability (remember that this does not mean that you are not sick, but that there is a low probability of being sick).
Prior probability and current probability. Bayes' theorem for a positive test: the current probability of the hypothesis 'd' (diseased subject), given that the event 'p' (positive test) has occurred, depends on the prior probability of d, P(d), and on the conditional probability of a positive test under the hypothesis d:
P(d | p) = P(p | d) · P(d) / P(p)
The probability that there is a disease, under the condition that the test has given a positive response, is given by the probability that the test gives a positive result in diseased people, times the a priori probability of being diseased, over the probability that the test comes out positive. The goodness of the test is related to its ability to recognize both the presence and the absence of the disease.
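As a numeric sketch of the theorem, with invented sensitivity, specificity, and prevalence values:

```python
# Bayes' theorem for a positive diagnostic test; all numbers are invented.

prevalence = 0.01      # a priori probability P(d) of having the disease
sensitivity = 0.95     # P(p | d): probability of a positive test in diseased people
specificity = 0.90     # probability of a negative test in healthy people

# Total probability of a positive test, P(p), over diseased and healthy subjects
p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)

# A posteriori probability: P(d | p) = P(p | d) * P(d) / P(p)
posterior = sensitivity * prevalence / p_positive
print(f"P(d | positive test) = {posterior:.3f}")   # ≈ 0.088
```

Note how, for a rare disease, even a fairly good test yields a modest a posteriori probability: the ability to recognize the absence of the disease matters as much as the ability to recognize its presence.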