A (real) infinite series is an expression of the form
$$
a_1+a_2+a_3+\cdots,
$$
where the $a_i$'s are all real numbers. Of course, we cannot
really add up infinitely many numbers (we shall die before it is
over). So what we mean by this sum is
$$
\lim_{n\rightarrow\infty} (a_1+\cdots + a_n).
$$
The sum $a_1+\cdots+a_n$ is called
the $n$-th partial sum.
If this limit exists, then we say that the series converges to
that value. An example is $1+\frac 12+\frac{1}{2^2}+\frac{1}{2^3}+\cdots = 2.$
If the limit is $\infty$ or $-\infty,$
then we say that the series diverges to $\infty$
or $-\infty.$ A trivial example is $1+1+1+\cdots = \infty.$
It is possible that the series neither
converges nor diverges. We then say that the
series oscillates. An example is
$$
1-1+1-1+1-1+\cdots.
$$
Here the partial sums are
$$
1,0,1,0,1,0,...,
$$
which clearly oscillate.
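If you want to see these three behaviours numerically, here is a small Python sketch (my illustration, not part of the notes; the helper name partial_sums is just made up) that prints the first few partial sums of the three example series.

# Print the first few partial sums of the three example series:
# a convergent one, a divergent one, and an oscillating one.

def partial_sums(terms, n):
    """First n partial sums of the sequence terms(0), terms(1), ..."""
    sums, total = [], 0.0
    for k in range(n):
        total += terms(k)
        sums.append(total)
    return sums

print(partial_sums(lambda k: 1 / 2**k, 10))  # approaches 2: converges
print(partial_sums(lambda k: 1.0, 10))       # 1, 2, 3, ...: diverges to infinity
print(partial_sums(lambda k: (-1)**k, 10))   # 1, 0, 1, 0, ...: oscillates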
The definition of an infinite series as the limit of its partial sums
is a natural extension of the concept of usual addition. However,
an infinite series may show certain counterintuitive behaviours,
as we shall see next.
Ordinary addition is associative, i.e., you may group the numbers as
you please without affecting the sum. For instance, $(a+b)+(c+d) = (((a+b)+c)+d).$
Not so for an infinite series. For example, the oscillating series
$$
1-1+1-1+1-1+\cdots,
$$
when grouped like this
$$
(1-1)+(1-1)+(1-1)+\cdots,
$$
actually becomes
$$
0+0+0+\cdots,
$$
which converges to $0.$ (Don't think of this as $0\times\infty,$
which is undefined. Here the partial sums are all $0,$
and so the limit of the partial sums is $0.$) If you group the
terms like
$$
1+(-1+1)+(-1+1)+\cdots,
$$
then you'll get a series converging to $1.$
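Here is a tiny Python sketch (again mine, just for illustration) that carries out the two groupings explicitly and shows the resulting term sequences.

terms = [(-1)**k for k in range(20)]  # 1, -1, 1, -1, ... (twenty terms)

# Grouping (1-1)+(1-1)+... : every grouped term is 0, so every partial sum is 0.
pairs_from_start = [terms[i] + terms[i + 1] for i in range(0, 20, 2)]
print(pairs_from_start)   # [0, 0, ..., 0]

# Grouping 1+(-1+1)+(-1+1)+... : first term 1, the rest 0, so every partial sum is 1.
first_then_pairs = [terms[0]] + [terms[i] + terms[i + 1] for i in range(1, 19, 2)]
print(first_then_pairs)   # [1, 0, ..., 0]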
Changing the order of the summands does not change the sum. This is
called commutativity of addition. Unfortunately, this may fail
for an infinite series.
Consider again our old friend
$$
1-1+1-1+1-1+\cdots.
$$
All the odd-numbered terms are $1$ and all the even-numbered terms
are $-1,$ right? We shall rearrange these a bit to get
$$
1+1-1+1+1-1+1+1-1+\cdots.
$$
Don't think that we have increased the number of $1$'s. We
still have countably infinitely many $1$'s and just as
many $-1$'s.
This rearranged series has partial sums
$$
1,2,1,2,3,2,...
$$
The pattern is simple: the partial sums go up two steps, then come down by one. So
effectively they are increasing, and increasing without bound. In
short, the rearranged series diverges to $\infty.$
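To see this upward drift concretely, here is a short Python sketch (my illustration, not from the notes): every third term of the rearranged series is $-1$ and the rest are $+1$.

def rearranged_term(k):
    # Terms of 1 + 1 - 1 + 1 + 1 - 1 + ... : every third term is -1.
    return -1 if k % 3 == 2 else 1

total, sums = 0, []
for k in range(12):
    total += rearranged_term(k)
    sums.append(total)
print(sums)   # [1, 2, 1, 2, 3, 2, 3, 4, 3, 4, 5, 4]: creeps up without bound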
The absence of helpful and familiar properties like associativity
and commutativity makes it a bit difficult to work with infinite
series. Thankfully, there are certain types of infinite series
that behave nicely (i.e., regrouping and rearranging them do not change
the sums). One type is mentioned in the next theorem.
Thanks to this result, we never need to worry about adding
infinitely many probabilities when using the 3rd axiom.
Later in this course, we shall need to work with infinite series
with negative as well as positive terms. The following theorem
gives us a condition under which these series will behave
nicely.
If this condition is met (i.e., the absolute series converges),
then we call the original series absolutely convergent. It
is an interesting and very useful fact that any absolutely
convergent series must also be convergent.
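As a concrete illustration (my example, not one from the notes), take $a_k = (-1)^{k+1}/2^k.$ The absolute series is the geometric series $\frac 12+\frac{1}{2^2}+\frac{1}{2^3}+\cdots,$ which converges to $1,$ so the original series is absolutely convergent. A quick Python check shows both sums settling down.

total, abs_total = 0.0, 0.0
for k in range(1, 60):
    a_k = (-1) ** (k + 1) / 2**k   # the series 1/2 - 1/4 + 1/8 - ...
    total += a_k
    abs_total += abs(a_k)
print(total)      # about 0.3333: the series itself converges (to 1/3)
print(abs_total)  # about 1.0: the absolute series converges too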
You'll learn the proofs of all these theorems in your analysis
course.
In our probability course we shall work only with infinite series of
nonnegative terms or absolutely convergent series. So for us, infinite
series are as good as school-level addition: associative and commutative!