This is the second post regarding the notion of asymptotic comparison. The first one dealt with equivalents, the rigorous way of saying “these two things look alike”. This post is about little o, which describes what we mean by “this thing is much smaller than this other one”. Yes, the “o” is the letter o, and it is read “little o”.
If you are not familiar with these notions, I would very much recommend reading the first post in the series. It covers the motivation and intuition and plenty of basic examples, so let us cut straight to the math. As before, we will focus on sequences, and introduce the analogous notions for functions in a later post.
Formal definition
In all the following, to simplify, we assume that all sequences never take the value zero (at least for $n$ large enough). For all intents and purposes, it is not an issue.
Consider two sequences $(u_n)$ and $(v_n)$. We say that $(u_n)$ is a little o of $(v_n)$ if
$$\lim_{n \to \infty} \frac{u_n}{v_n} = 0.$$
We then write
$$u_n = o(v_n).$$
This can also be read “$u_n$ equals little o of $v_n$”. It is important to remark that this is not an equality in the usual sense of the term: things go from left to right, and we should not write
$$o(v_n) = u_n. \quad 😦$$
It has been suggested to use the notation $u_n \in o(v_n)$, where $o(v_n)$ is understood to be the set of all sequences $(w_n)$ such that $\lim_{n \to \infty} \frac{w_n}{v_n} = 0$. This notation, however, did not catch on. Allegedly, this is due to the fact that we can manipulate little o essentially as equalities, as we shall see below.
The symbol $o$ is part of what is collectively known as the Landau notation. In $\LaTeX$, it is written using simply the letter o. For more fancy options, check the answers here.
The only difference with equivalents is that the limit of $\frac{u_n}{v_n}$ is $0$, as opposed to $1$. It naturally makes a world of difference: the limit being $0$ means that $u_n$ is much smaller than $v_n$. The relationship between the two notions can be summarized as follows.
Take two sequences $(u_n)$ and $(v_n)$. Then
$$u_n \sim v_n \quad \Longleftrightarrow \quad u_n = v_n + o(v_n).$$
Show Theorem I.2.
If we defined little o first, Theorem I.2 could be used as a definition of equivalents. In any case, this relationship is good to remember.
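For a quick illustration (with a pair of sequences chosen just for this purpose): since $\frac{n}{n^2} \to 0$, i.e. $n = o(n^2)$, Theorem I.2 gives
$$n^2 + n = n^2 + o(n^2), \qquad \text{that is,} \qquad n^2 + n \sim n^2.$$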
Properties
Basic comparisons
As expected, we should have that $n = o(n^2)$, and this is easy to check, since
$$\frac{n}{n^2} = \frac{1}{n} \longrightarrow 0.$$
The generalization to other power sequences is easy to write down.
If $a < b$, then
$$n^a = o(n^b).$$
Note that this works for negative exponents as well, so for instance $\frac{1}{n^2} = o\left(\frac{1}{n}\right)$.
Similarly, the same kind of computation works for geometric sequences. Take for instance $2^n$ and $3^n$. Then
$$\frac{2^n}{3^n} = \left(\frac{2}{3}\right)^n \longrightarrow 0,$$
since $\frac{2}{3} < 1$. More generally, we have the following result.
If $0 < a < b$, then
$$a^n = o(b^n).$$
Show Proposition II.1 and II.2.
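If you want a quick numerical feeling for these two comparisons, a few lines of Python (with exponents and ratios chosen arbitrarily for the illustration) already show the decay of the ratios:

```python
# Numerical illustration of Propositions II.1 and II.2 (exponents chosen arbitrarily):
# n^2 / n^3 = 1/n and 2^n / 3^n = (2/3)^n both tend to 0.
for n in (10, 100, 1000, 10000):
    power_ratio = n**2 / n**3
    geometric_ratio = (2 / 3) ** n
    print(f"n={n:>6}   n^2/n^3 = {power_ratio:.2e}   2^n/3^n = {geometric_ratio:.2e}")
```

Of course, this is only an illustration, not a substitute for the proofs asked above.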
Transitivity
The word “equivalent” implies some sort of symmetry. On the other hand, “little o” means much smaller, so there should be no symmetry. However, we do expect a property similar to that of inequalities: transitivity.
Consider sequences $(u_n)$, $(v_n)$ and $(w_n)$. If $u_n = o(v_n)$ and $v_n = o(w_n)$, then $u_n = o(w_n)$.
Simplification
We always want the sequence inside the $o$ to be as simple as possible. This can sometimes be achieved by replacing it by an equivalent sequence, as follows.
Consider sequences $(u_n)$, $(v_n)$ and $(w_n)$. If $u_n = o(v_n)$ and $v_n \sim w_n$, then $u_n = o(w_n)$.
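For instance (with sequences chosen just for illustration): knowing that $\sqrt{n} = o(n^2 + n)$ and that $n^2 + n \sim n^2$, we can directly write the simpler statement
$$\sqrt{n} = o(n^2).$$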
Multiplication
There are also a few convenient algebraic rules. As for equivalents, little o can be multiplied, and exponentiated to any positive power.
Take sequences $(u_n)$, $(v_n)$, $(w_n)$ and $(x_n)$. Then we have the following properties.
- If $u_n = o(v_n)$ and $w_n = o(x_n)$, then $u_n w_n = o(v_n x_n)$.
- If $u_n = o(v_n)$, then for any $a > 0$, we have $u_n^a = o(v_n^a)$.
Be careful that, unlike equivalents, the exponent $a$ can only be positive! This is similar to inequalities: raising both sides to a negative power reverses the direction of the inequality.
Sum
Unlike equivalents, little o can be summed, though one needs to be careful about what this means. It would be very wrong to say that if $u_n = o(v_n)$ and $w_n = o(x_n)$, then
$$u_n + w_n = o(v_n + x_n). \quad 😦$$
Find a counterexample to show that this is not true.
However, this works if $v_n = x_n$: the sum of two “small” (compared to some quantity) things is small (compared to this quantity).
Take sequences $(u_n)$, $(v_n)$ and $(w_n)$. If $u_n = o(w_n)$ and $v_n = o(w_n)$, then
$$u_n + v_n = o(w_n).$$
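For instance, since $n = o(n^3)$ and $n^2 = o(n^3)$, we immediately get
$$n + n^2 = o(n^3).$$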
Division
For the same reason, it would be very wrong to divide little o: if $u_n = o(v_n)$ and $w_n = o(x_n)$, then it is not true in general that
$$\frac{u_n}{w_n} = o\left(\frac{v_n}{x_n}\right). \quad 😦$$
Find a counterexample to show that this is not true.
However, it is fine if we divide by the same quantity.
Take sequences $(u_n)$, $(v_n)$ and $(w_n)$. If $u_n = o(v_n)$, then
$$\frac{u_n}{w_n} = o\left(\frac{v_n}{w_n}\right).$$
Constants
Finally, a constant times something small is still small. In particular, this implies that it is useless to write multiplicative constants inside a little o.
Take sequences $(u_n)$ and $(v_n)$. If $u_n = o(v_n)$, then for any $\lambda \neq 0$, we have $\lambda u_n = o(v_n)$ and $u_n = o(\lambda v_n)$.
All of these results are easy to check from the definition, and we leave this as an exercise.
Prove Propositions II.3 to II.8.
It is worth noticing that most of these behave like inequalities. To remember all these, it is often easier to write $u_n \ll v_n$ rather than $u_n = o(v_n)$. Naturally, the little o notation has other advantages, as we will discuss below.
The derivative of $\cos$ at $0$ is $0$, therefore
$$\lim_{x \to 0} \frac{\cos(x) - 1}{x} = 0.$$
Replacing $x$ by $\frac{1}{n}$ gives
$$\frac{\cos(1/n) - 1}{1/n} \longrightarrow 0,$$
which means
$$\cos\left(\frac{1}{n}\right) - 1 = o\left(\frac{1}{n}\right).$$
We can then deduce, by transitivity (since $\frac{1}{n} = o(1)$), that
$$\cos\left(\frac{1}{n}\right) - 1 = o(1).$$
Little o and limits
Little o allows us to rephrase limits.
Let $(u_n)$ be a sequence and $\ell \in \mathbb{R}$.
- We have $u_n \to 0$ if and only if $u_n = o(1)$.
- More generally, $u_n \to \ell$ if and only if $u_n = \ell + o(1)$.
This is again simple from the definition.
Show Proposition II.9.
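For instance,
$$\frac{n + 3}{n} = 1 + \frac{3}{n} = 1 + o(1), \qquad \text{so} \qquad \frac{n + 3}{n} \longrightarrow 1.$$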
Change of time
Once we know the limit of a sequence $(u_n)$, replacing $n$ by any other sequence of integers that also tends to $+\infty$ does not change the limit. Since a little o is defined through a limit, the same holds for little o.
Take sequences $(u_n)$ and $(v_n)$, and a sequence of integers $(\varphi_n)$ that tends to $+\infty$. If $u_n = o(v_n)$, then $u_{\varphi_n} = o(v_{\varphi_n})$.
From Example II.1, $\cos\left(\frac{1}{n}\right) - 1 = o\left(\frac{1}{n}\right)$, whence we deduce $\cos\left(\frac{1}{2^n}\right) - 1 = o\left(\frac{1}{2^n}\right)$.
Classical comparisons
Comparative growth
I already mentioned that many L’Hospital-type limits can be obtained very easily thanks to comparative growth. That is, they can be computed by only comparing classical sequences, which is formalized using the little o notation.
For any $a, b > 0$ and $r > 1$, we have
$$(\ln n)^a = o(n^b), \qquad n^b = o(r^n), \qquad r^n = o\left((n!)^a\right).$$
We thus have a hierarchy of sequences: the logarithms grow to infinity much slower than the (positive) power sequences; the power sequences grow to infinity much slower than the geometric sequences (with ratio $r > 1$); and the geometric sequences grow to infinity much slower than the factorial (to any positive power).
Let us do the proof of all this as a big exercise. As we shall see, only basic calculus is needed: there is no need for L’Hospital’s rule, which would lead to ugly computations and would not work for factorials anyway.
- Using properties of little o, argue that it is enough to show that for any $b > 0$ and $r > 1$, we have
$$\ln n = o(n^b), \qquad n^b = o(r^n), \qquad r^n = o(n!).$$
- Take $b > 0$ and $c \in (0, b)$. Show that, for all $x \ge 1$, we have $\ln x \le \frac{x^c}{c}$, and conclude that
$$\ln n = o(n^b).$$
- Take $r > 1$. Check that
$$\frac{n^b}{r^n} = \exp\left(b \ln n - n \ln r\right) \longrightarrow 0.$$
- Take $r > 1$ and an integer $N$ with $N \ge 2r$. Show that, for $n \ge N$, we have
$$\frac{r^n}{n!} \le \frac{r^N}{N!} \left(\frac{1}{2}\right)^{n - N},$$
and conclude that $r^n = o(n!)$.
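Before moving on, here is a small numerical sanity check of the hierarchy in Python, with parameters chosen arbitrarily ($a = 1$, $b = 2$, $r = 1.5$); it is of course not a proof.

```python
import math

# Sanity check of the hierarchy of Theorem III.1, with arbitrarily chosen
# parameters: ln(n) << n^2 << 1.5^n << n! as n grows, so all ratios tend to 0.
for n in (10, 50, 100):
    log_over_power = math.log(n) / n**2
    power_over_geom = n**2 / 1.5**n
    geom_over_fact = 1.5**n / math.factorial(n)
    print(f"n={n:>4}   ln(n)/n^2 = {log_over_power:.2e}   "
          f"n^2/1.5^n = {power_over_geom:.2e}   1.5^n/n! = {geom_over_fact:.2e}")
```

Notice that the geometric sequence only takes over the power sequence for moderately large $n$: the hierarchy is an asymptotic statement.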
This allows us to compute many limits very easily: it is enough to identify who wins. Consider for instance the problem of finding the limit of
$$\frac{n^5 \ln n}{2^n}.$$
It is of the type $\frac{\infty}{\infty}$. Now, informally, the hierarchy of Theorem III.1 tells us that the $\ln n$ is much smaller than the power of $n$, which is itself much smaller than the geometric sequence $2^n$. So the denominator should win, and the limit should be $0$.
To be precise, Theorem III.1 and the properties of little o tell us that $\ln n = o(n)$ and $n^6 = o(2^n)$, and therefore
$$n^5 \ln n = o(n^6),$$
and thus
$$n^5 \ln n = o(2^n),$$
which means
$$\frac{n^5 \ln n}{2^n} \longrightarrow 0.$$
It is nice to use Theorem III.1 in conjunction with the previous properties. Let us find the limit of
$$\frac{n^2 + 2^n}{(\ln n)^{10} + 3^n}.$$
Obtaining equivalents
Theorem III.1, coupled with Theorem I.2, allows us to obtain equivalents easily, as follows. First, look at each sum, and only keep the largest term: this gives us an equivalent of the sum. Then multiply and divide equivalents, to find an equivalent of the final expression. Let us for instance look at
$$u_n = \frac{n^{10} + 3^n}{n! + 2^n}.$$
For the numerator, we have $n^{10} = o(3^n)$, so $n^{10} + 3^n \sim 3^n$ (by Theorem I.2). Similarly, for the denominator, $2^n = o(n!)$, and thus $n! + 2^n \sim n!$. We therefore deduce that
$$u_n \sim \frac{3^n}{n!} \longrightarrow 0,$$
where we use at the end that $3^n = o(n!)$, so $\frac{3^n}{n!} \to 0$, and therefore $u_n \to 0$.
You may want to ask your computer what it thinks of $u_{1000}$. Probably, you will get an overflow error. Yet, this simple computation tells us that $u_{1000}$ is extremely close to $0$.
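To make this concrete, here is what the computation looks like in Python, using the illustrative ratio written above (any similar mix of powers, geometric sequences and factorials would behave the same way):

```python
import math

# Naively evaluating the ratio (n^10 + 3^n) / (n! + 2^n) with floating-point
# numbers fails long before the ratio itself becomes interesting.
n = 1000
try:
    value = (n**10 + 3.0**n) / (math.factorial(n) + 2.0**n)
    print("u_n =", value)
except OverflowError as err:
    print("naive evaluation fails:", err)

# The asymptotics save the day: the ratio is equivalent to 3^n / n!,
# so we can work with logarithms instead.
log_equivalent = n * math.log(3) - math.lgamma(n + 1)  # log of 3^n / n!
print("log of the equivalent:", log_equivalent)        # hugely negative: u_n is tiny
```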
Asymptotic expansion
Definition
In general, we want to use little o to say: this sequence looks like this much simpler sequence, plus something which is small, where this “small” is given by a little o. This motivates the following definition.
Consider sequences $(u_n)$, $(v_n)$ and $(w_n)$. We write
$$u_n = v_n + o(w_n)$$
if
$$u_n - v_n = o(w_n).$$
Typically, $v_n$ and $w_n$ should be simple sequences (for instance made of power or geometric sequences), and $w_n$ should be a little o of $v_n$. It makes sense to write
$$u_n = 100 + o(n) \quad 😦$$
but it has no relevance: $100 = o(n)$. Take $n = 10^6$. We are saying that we have 100 bucks + much less than a million. The only relevant thing is that we have much less than a million!
An expression of the type $u_n = v_n + o(w_n)$, with $w_n = o(v_n)$, is called an asymptotic expansion of $u_n$. It is telling us that $u_n$ is well approximated by $v_n$. The accuracy of the approximation is given by the $o(w_n)$.
Let us try to find an asymptotic expansion of $\sqrt{1 + \frac{1}{n}}$. First, the limit of $\sqrt{1 + \frac{1}{n}}$ is $1$, and therefore
$$\sqrt{1 + \frac{1}{n}} = 1 + o(1).$$
Now, we would like a bit more, i.e. knowing how small $\sqrt{1 + \frac{1}{n}} - 1$ is. So we write it out and use the usual trick of the conjugate quantity:
$$\sqrt{1 + \frac{1}{n}} - 1 = \frac{\left(\sqrt{1 + \frac{1}{n}} - 1\right)\left(\sqrt{1 + \frac{1}{n}} + 1\right)}{\sqrt{1 + \frac{1}{n}} + 1} = \frac{1/n}{\sqrt{1 + \frac{1}{n}} + 1}.$$
The denominator tends to $2$. Therefore, we see that this should look like $\frac{1}{2n}$. To be precise, this shows that
$$n\left(\sqrt{1 + \frac{1}{n}} - 1\right) = \frac{1}{\sqrt{1 + \frac{1}{n}} + 1} \longrightarrow \frac{1}{2},$$
which means $n\left(\sqrt{1 + \frac{1}{n}} - 1\right) - \frac{1}{2} = o(1)$, and thus $\sqrt{1 + \frac{1}{n}} - 1 - \frac{1}{2n} = o\left(\frac{1}{n}\right)$, which means that we have the asymptotic expansion
$$\sqrt{1 + \frac{1}{n}} = 1 + \frac{1}{2n} + o\left(\frac{1}{n}\right).$$
A more expeditious way is to use equivalents: by the first computation, we get
$$\sqrt{1 + \frac{1}{n}} - 1 = \frac{1/n}{\sqrt{1 + \frac{1}{n}} + 1} \sim \frac{1}{2n},$$
which means exactly, by Theorem I.2,
$$\sqrt{1 + \frac{1}{n}} = 1 + \frac{1}{2n} + o\left(\frac{1}{2n}\right) = 1 + \frac{1}{2n} + o\left(\frac{1}{n}\right).$$
Algebra of little o
Example IV.1 is a bit tedious to unfold completely, but we in fact could guess the answer after the first couple lines of computation. In essence: we can do basic algebra on asymptotic expansions. We shall not write this as a formal proposition, which would be just cumbersome and incomprehensible. Working through a few examples should give us the gist of it.
Assume for instance that $u_n = 1 + \frac{2}{n} + o\left(\frac{1}{n}\right)$ and that $v_n = 3 - \frac{1}{n} + \frac{5}{n^2} + o\left(\frac{1}{n^2}\right)$. We can then make computations as in basic algebra. For instance, we can compute an expansion of the sum, as follows.
$$u_n + v_n = 1 + \frac{2}{n} + o\left(\frac{1}{n}\right) + 3 - \frac{1}{n} + \frac{5}{n^2} + o\left(\frac{1}{n^2}\right) = 4 + \frac{1}{n} + o\left(\frac{1}{n}\right).$$
Note how the $\frac{5}{n^2}$ of the expansion of $v_n$ disappears, or rather, is absorbed into the $o\left(\frac{1}{n}\right)$. As we only know $u_n$ up to a precision of $o\left(\frac{1}{n}\right)$, any extra precision on $v_n$ is irrelevant.
Similarly, we can compute a product, as follows.
$$u_n v_n = \left(1 + \frac{2}{n} + o\left(\frac{1}{n}\right)\right)\left(3 - \frac{1}{n} + \frac{5}{n^2} + o\left(\frac{1}{n^2}\right)\right)$$
$$= 3 - \frac{1}{n} + \frac{5}{n^2} + o\left(\frac{1}{n^2}\right) + \frac{6}{n} - \frac{2}{n^2} + \frac{10}{n^3} + o\left(\frac{1}{n^3}\right) + o\left(\frac{1}{n}\right)$$
$$= 3 + \frac{5}{n} + o\left(\frac{1}{n}\right).$$
Similarly, anything smaller than our worst precision, here $o\left(\frac{1}{n}\right)$, should be disregarded. With practice, we could shorten this much more: namely, the second line of this computation is far too explicit. We see that the worst precision we get is $o\left(\frac{1}{n}\right)$. Therefore, we do not need to write anything smaller. Similarly, let us try to compute an expansion of $u_n^2$. The worst we will get is $o\left(\frac{1}{n}\right)$. We can then disregard anything smaller, and quickly obtain
$$u_n^2 = \left(1 + \frac{2}{n} + o\left(\frac{1}{n}\right)\right)^2 = 1 + \frac{4}{n} + o\left(\frac{1}{n}\right).$$
Assume that $u_n = 2 + \frac{3}{n} + o\left(\frac{1}{n}\right)$ and $v_n = 1 - \frac{1}{n} + o\left(\frac{1}{n}\right)$. Compute an asymptotic expansion of $u_n + v_n$ and of $u_n v_n$.
Use Example IV.1 to compute, as efficiently as possible, an asymptotic expansion of $\sqrt{n^2 + n}$, of the form $a n + b + o(1)$.
Expansions of power functions
We already had plenty of fun with the asymptotic expansion
$$\sqrt{1 + \frac{1}{n}} = 1 + \frac{1}{2n} + o\left(\frac{1}{n}\right).$$
This can be generalized in two senses: either by changing the exponent, or by getting more terms in the expansion. With the techniques we introduced, getting more terms leads to more and more complicated computations. However, you can try your hand at the following.
Compute an asymptotic expansion of $\sqrt{1 + \frac{1}{n}}$ of the form
$$\sqrt{1 + \frac{1}{n}} = 1 + \frac{a}{n} + \frac{b}{n^2} + o\left(\frac{1}{n^2}\right),$$
for some well-chosen $a, b \in \mathbb{R}$.
Getting a first-order expansion (of the form $1 + a\, u_n + o(u_n)$) of $(1 + u_n)^a$ is however doable quite generally with basic algebra.
- Show that, for any $x > -1$ and $k \in \mathbb{N}^*$, we have
$$\sqrt[k]{1 + x} - 1 = \frac{x}{(1+x)^{\frac{k-1}{k}} + (1+x)^{\frac{k-2}{k}} + \dots + 1}.$$
You may want to remember that $a^k - b^k = (a - b)\left(a^{k-1} + a^{k-2} b + \dots + b^{k-1}\right)$.
- Conclude that for any sequence $(u_n)$ tending to $0$ and any $k \in \mathbb{N}^*$, we have
$$(1 + u_n)^{1/k} = 1 + \frac{u_n}{k} + o(u_n).$$
- Compute a first-order expansion of $(1 + u_n)^{p/q}$ for $p \in \mathbb{Z}$ and $q \in \mathbb{N}^*$.
Derivatives and little o
Similarly to equivalents, we can use derivatives to obtain new asymptotic expansions. As before, this relies on the fact that, when $h$ is close to $0$,
$$f(a + h) \approx f(a) + f'(a)\, h.$$
Replacing $h$ by a sequence converging to $0$ exactly yields the following result.
Assume that $f$ is differentiable at $a$, and that $(u_n)$ is a sequence converging to $0$. Then we have the asymptotic expansion
$$f(a + u_n) = f(a) + f'(a)\, u_n + o(u_n).$$
Note how the right-hand side of this expansion is much simpler than $f(a + u_n)$. Note also that, unlike equivalents, we do not need to assume $f'(a) \neq 0$; in this case, our expansion is just a constant (maybe $0$ as well), and a little o.
Again, the proof of this fact is essentially the definition of the derivative.
Show Theorem IV.2.
In most of the cases, we will have $a = 0$, and $f$ is a usual function, from which we can deduce the following.
If $u_n \to 0$, then we have the following asymptotic expansions.
- $e^{u_n} = 1 + u_n + o(u_n)$.
- $\ln(1 + u_n) = u_n + o(u_n)$.
- $\sin(u_n) = u_n + o(u_n)$.
- For any $a \in \mathbb{R}$,
$$(1 + u_n)^a = 1 + a\, u_n + o(u_n).$$
We may want to notice that the last expansion gives another, much simpler approach to Exercise IV.4. Arguably, however, it does not provide the insight that Exercise IV.4 does.
Now, this result allows us to do computations as follows. We have
$$\sin\left(\frac{1}{n}\right) = \frac{1}{n} + o\left(\frac{1}{n}\right).$$
From this asymptotic expansion, we deduce for instance that $n \sin\left(\frac{1}{n}\right) \to 1$.
Consider
$$v_n = \sin\left(\frac{1}{n}\right) - \frac{1}{n}.$$
In particular, $v_n$ tends to $0$. But we know even more: it tends to $0$ at a speed much faster than $\frac{1}{n}$.
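A quick numerical check of both statements, purely for illustration:

```python
import math

# sin(1/n) = 1/n + o(1/n): the ratio tends to 1, and the difference,
# rescaled by n, tends to 0.
for n in (10, 100, 1000, 10000):
    x = 1 / n
    print(f"n={n:>6}   sin(1/n)/(1/n) = {math.sin(x) / x:.8f}   "
          f"n*(sin(1/n) - 1/n) = {n * (math.sin(x) - x):.2e}")
```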
Give the best asymptotic expansion that you can of
$$e^{1/n} \sin\left(\frac{1}{n}\right).$$
Asymptotic expansion and equivalents
Recall from Theorem I.2 that $u_n \sim v_n$ if and only if $u_n = v_n + o(v_n)$. Remember as well that, in an equivalent, we do not write sums, but only keep the largest term. Therefore, to get an equivalent from an asymptotic expansion, we really want to reduce the expansion to $u_n = v_n + o(v_n)$, where $v_n$ only contains one term, absorbing everything else into the little o. In other words, an expression is equivalent to the first term of its asymptotic expansion.
For instance, we showed in Exercise IV.3 (spoiler alert!) that
$$\sqrt{1 + \frac{1}{n}} = 1 + \frac{1}{2n} - \frac{1}{8n^2} + o\left(\frac{1}{n^2}\right).$$
We can therefore deduce the following, in increasing order of precision.
- $\sqrt{1 + \frac{1}{n}} = 1 + o(1)$, so $\sqrt{1 + \frac{1}{n}} \sim 1$, i.e. $\sqrt{1 + \frac{1}{n}} \to 1$.
- $\sqrt{1 + \frac{1}{n}} - 1 = \frac{1}{2n} + o\left(\frac{1}{n}\right)$, so $\sqrt{1 + \frac{1}{n}} - 1 \sim \frac{1}{2n}$.
- $\sqrt{1 + \frac{1}{n}} - 1 - \frac{1}{2n} = -\frac{1}{8n^2} + o\left(\frac{1}{n^2}\right)$, so $\sqrt{1 + \frac{1}{n}} - 1 - \frac{1}{2n} \sim -\frac{1}{8n^2}$.
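Numerically, the three levels of precision are clearly visible (again, this is only a sanity check):

```python
import math

# sqrt(1 + 1/n) = 1 + 1/(2n) - 1/(8n^2) + o(1/n^2): the successive rescaled
# errors converge to 0.5 and -0.125 respectively.
for n in (10, 100, 1000):
    s = math.sqrt(1 + 1 / n)
    first_order = n * (s - 1)                      # should approach 1/2
    second_order = n**2 * (s - 1 - 1 / (2 * n))    # should approach -1/8
    print(f"n={n:>5}   sqrt(1+1/n) = {s:.6f}   "
          f"n*(s-1) = {first_order:.6f}   n^2*(s-1-1/(2n)) = {second_order:.6f}")
```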
What about ratios?
We already saw that we can divide a comparison by some fixed quantity, but there is no way to divide little o directly. There is however a way to divide by a nontrivial asymptotic expansion, by using the fact (Theorem IV.3) that if $u_n \to 0$, then
$$\frac{1}{1 + u_n} = 1 - u_n + o(u_n).$$
Therefore, we are justified to do the following computation, using the fact that $\left(\frac{2}{3}\right)^n \to 0$ and Proposition II.2:
$$\frac{3^n}{3^n + 2^n} = \frac{1}{1 + (2/3)^n} = 1 - \left(\frac{2}{3}\right)^n + o\left(\left(\frac{2}{3}\right)^n\right).$$
If the denominator (say $v_n$) is not of the form $1 + w_n$ where $w_n \to 0$, we just need to make it appear! Find a simple equivalent $a_n$ of $v_n$ by keeping the largest term, and write $v_n = a_n (1 + w_n)$ where $w_n \to 0$. We know we can divide by the (fixed and simple) $a_n$, then we can use the previous trick on $\frac{1}{1 + w_n}$, then multiply both expansions.
Consider $u_n = n^2 + n$ and $v_n = n^2 + 1$. Then
$$\frac{u_n}{v_n} = \frac{n^2 + n}{n^2 + 1} = \frac{1 + \frac{1}{n}}{1 + \frac{1}{n^2}} = \left(1 + \frac{1}{n}\right)\left(1 - \frac{1}{n^2} + o\left(\frac{1}{n^2}\right)\right) = 1 + \frac{1}{n} - \frac{1}{n^2} + o\left(\frac{1}{n^2}\right).$$
- Find an expansion of $\frac{n}{n + 1}$ of the type $a + \frac{b}{n} + o\left(\frac{1}{n}\right)$, without using Theorem IV.2.
- If $u_n = 1 + \frac{1}{n} + o\left(\frac{1}{n}\right)$ and $v_n = 2 - \frac{1}{n} + o\left(\frac{1}{n}\right)$, find the best expansion you can of $\frac{u_n}{v_n}$.
Some words of caution
Keep it simple
As mentioned before, the point of an asymptotic expansion is to make things simpler. In particular, the little o part conveys the error that we make, and as such, should only be an order of magnitude.
Careful. No sum or multiplicative constants in little o.
For instance, it is the same to write $o(2 n^2)$ or $o(n^2)$ (check the definition!), so we shall of course write the latter. Similarly, it is also exactly the same to write $o(2^n + n^3)$ or $o(2^n)$, since the geometric sequence wins over, and we shall therefore write $o(2^n)$ only.
No cancellations
In all our computations, we always kept the little o. They are meant to convey the imprecision, and imprecision always gets worse. If we have a mediocre count of the number of people in a classroom, and a mediocre count of the number of people who leave during the lecture, we will not magically get a very accurate count of the people remaining!
Careful. Little o do not cancel out.
In other words, if we do not want little o, we should never have any to begin with.
Functions of little o
Like equivalents, it is not possible to take functions of little o. In other words, if $u_n = o(v_n)$, and $f$ is some function, then we cannot guarantee that
$$f(u_n) = o(f(v_n)). \quad 😦$$
Take for instance $u_n = n$, $v_n = n^2$, and $f = \ln$.
Careful. It is wrong to take functions of little o.
It would be a good idea to compare this with Theorem II.8!
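With the counterexample suggested above ($u_n = n$, $v_n = n^2$ and $f = \ln$; many other choices work), a few values make the failure obvious:

```python
import math

# u_n = n is a little o of v_n = n^2, but ln(u_n)/ln(v_n) = 1/2 for every n >= 2,
# so ln(u_n) is certainly not a little o of ln(v_n).
for n in (10, 1000, 100000):
    print(f"n={n:>7}   u_n/v_n = {n / n**2:.1e}   "
          f"ln(u_n)/ln(v_n) = {math.log(n) / math.log(n**2):.3f}")
```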
Extra exercises
- For $a > 0$ and $b > 0$, find an asymptotic expansion of $\sqrt{n^2 + a n + b}$ of the form
$$\sqrt{n^2 + a n + b} = n + c + \frac{d}{n} + o\left(\frac{1}{n}\right),$$
for some well-chosen $c$ and $d$.
- Compute
$$\lim_{n \to \infty} \left(1 + \frac{1}{n}\right)^n.$$