So let us consider another crucial example of improper integrals: the integral of the power function, x to the power minus a, from 1 to plus infinity. For the sake of simplicity, let us first assume that a does not equal 1; I'm going to write this down as our first case. What we should do here is calculate the antiderivative. In order to do so, we raise the power by one and divide by it: the antiderivative is x^(1-a)/(1-a). Afterward, we substitute the upper and lower limits. So the question of convergence here actually rests on the existence of the limit at the upper bound: we need to establish when the limit of this function at plus infinity exists and in what cases it doesn't. It is easy to understand, because once again we have a constant multiplier 1/(1-a) and then a power function x^(1-a), so we need to decide whether the power function has a finite limit at plus infinity or not. That is pretty easy, because as you remember, a power function with positive power approaches plus infinity, while a power function with negative power approaches zero at plus infinity. So there is a simple rule: if the power of x is positive, the limit doesn't exist, the function approaches plus infinity; if the power is negative, the function approaches zero at plus infinity. So we are going to write these results down. If a is not 1, and if 1 - a is greater than 0, then the integral does not converge; I'm going to write that down symbolically. If 1 - a is actually negative, the integral converges. What about the case of a equal to 1? We are now going to look at the integral of 1 divided by x, and as you do know, the antiderivative is the logarithm of the absolute value of x.
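The case analysis described above can be written out as one formula (this is just a restatement of the computation in the lecture):

```latex
\int_{1}^{+\infty} x^{-a}\,dx
  \;=\; \lim_{N\to+\infty}\left[\frac{x^{1-a}}{1-a}\right]_{1}^{N}
  \;=\; \lim_{N\to+\infty}\frac{N^{1-a}-1}{1-a}
  \;=\;
  \begin{cases}
    \dfrac{1}{a-1}, & 1-a<0 \ \ (\text{i.e. } a>1),\\[2ex]
    +\infty, & 1-a>0 \ \ (\text{i.e. } a<1).
  \end{cases}
```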
We substitute the upper and lower bounds, and once again we need to understand what the limit of the logarithm at plus infinity is. As you well know, it's plus infinity, and thus the integral does not converge. That's fine: we can simply add this case to the non-convergent ones. Let us summarize our results here: the integral that we've talked about converges if a is greater than 1 and does not converge otherwise. That's fine, but why do we need this special and pretty simple case? The idea is the following: sometimes it's hard or impossible to calculate the actual antiderivative, and it's easier to compare the function to one for which we already know the answer. For example, assume that we're speaking about the improper integral of a function f, and we have some function g. We know that both functions lie above zero, they're both positive, and one function is always greater than the other: for example, let us assume that g is greater than f at every point and the improper integral of g converges. Then, understandably, the improper integral of f converges too, because the area under the f curve is contained in the area under the g curve, so it cannot approach infinity. Conversely, if the area under the f curve approaches infinity, then the area under the g curve also approaches infinity, because one actually includes the other; in that sense, that integral is also non-convergent. This is normally called the comparison test: a rule of comparison that tells us whether an integral converges. Let us just think of some basic example here, for example the integral of x divided by 1 plus x squared. Well, you could proceed with the actual calculation of the antiderivative, and we've done it on our magic board where we began talking about antiderivatives. But here, we are going to say something more general.
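The comparison idea can also be seen numerically. Here is a small sketch (the helper name `partial_integral` is my own, not from the lecture): partial integrals of the convergent 1/x² level off near their limit, while partial integrals of the divergent 1/x keep growing like ln N.

```python
import math

def partial_integral(f, a, b, steps=100_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / steps
    return h * sum(f(a + (i + 0.5) * h) for i in range(steps))

for N in (10, 100, 1000):
    conv = partial_integral(lambda x: 1 / x**2, 1, N)  # approaches 1 - 1/N -> 1
    div = partial_integral(lambda x: 1 / x, 1, N)      # grows like ln(N), no limit
    print(f"N={N:5d}  integral of 1/x^2 ~ {conv:.4f}  integral of 1/x ~ {div:.4f}")
```

The first column stabilizes (convergence), the second does not: exactly the dichotomy the comparison test exploits.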
For example, if we take a look at the function x divided by 1 plus x squared, you've actually seen its antiderivative: we started with it when we were using our magic board. But while it's easy to just derive the answer here and try to find the limits, I'm going to use a much easier method: our rule of comparison. So we have our function, which is x divided by 1 plus x squared, and what I'm going to say is that it is comparable with some function with a known answer. Right now, we know the answer for every power function of x, so maybe we can come up with a comparison with one of them. Asymptotically speaking, we are looking at a function which is somehow close to 1 divided by x, because the numerator grows as x and the denominator grows as x squared, and the ratio behaves as 1/x in this manner. So in order to understand whether this integral converges, we need to remember whether the integral of 1/x converges. As we previously stated, the integral of the function 1 divided by x does not converge. So we need to come up with some inequality here: since that integral does not converge, the function we compare with should be smaller than ours. So maybe we should write something like 1 divided by 2x here: for x greater than or equal to 1, our function is at least 1/(2x), and the integral of 1/(2x) still diverges. But informally speaking, equivalent functions are basically the same, each is bounded by a constant multiple of the other: if one of them converges, then the other one converges, and if one of them doesn't converge, then the other doesn't either. Thus we can even generalize our comparison rule to equivalent functions. So by using this very easy example, we can, without any calculations of additional antiderivatives, state the answer straight away. At the very same time, let us take a look at the Gaussian distribution exponent, e to the minus x squared.
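The inequality sketched above can be made precise in one line:

```latex
\text{For } x\ge 1:\quad 1+x^{2}\le 2x^{2}
\;\Longrightarrow\;
\frac{x}{1+x^{2}}\ge\frac{x}{2x^{2}}=\frac{1}{2x},
\qquad\text{and}\qquad
\int_{1}^{+\infty}\frac{dx}{2x}=+\infty
\;\Longrightarrow\;
\int_{1}^{+\infty}\frac{x\,dx}{1+x^{2}}=+\infty.
```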
Here, I'm going to draw a schematic for you, and we're going to look at whether this integral converges. So we need to come up with something which can be used as a comparison for this function. Let's just take a moment here and think about it. Firstly, we should recall, as we discussed when talking about function asymptotics, that this function approaches 0 extremely fast as x rises up to infinity: faster than any possible polynomial function, faster than any power function. So if we compare it with some known power function, we already know the answer on the segment from 1 to plus infinity. But pulling the lower bound down is tricky, because we've actually spoken about convergence on the segment from 1 to plus infinity, and now we're talking about the integral from 0 to plus infinity. As you all do understand, 0 is a more complicated point than 1 for functions like 1 divided by x or 1 divided by x squared, so that's another tricky point here. Therefore we're not going to compare it with a power function; we're going to compare it with a widely known function that we've actually calculated in the previous video. Do you remember, we worked with the function which is called the exponential distribution, a basic exponent of a linear argument: e to the minus x. So let me consider the following question: does this inequality hold, is the exponent of minus x squared greater or less than the exponent of minus x? Well, in order to understand it, we need to compare the exponents, because we're looking at exponentials of negative numbers, and the more negative the exponent, the smaller the value. Obviously, x squared is greater than or equal to x if x is greater or equal to one, so minus x squared is less than or equal to minus x there.
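This comparison can be checked numerically as well. A small sketch (the helper name `tail_integral` is my own): the inequality e^(-x²) ≤ e^(-x) holds at sample points of [1, +∞), and the partial integrals of e^(-x²) over [1, N] stabilize as N grows, which is what convergence of the tail means.

```python
import math

# Spot-check the comparison e^(-x^2) <= e^(-x) on [1, +inf).
for x in (1.0, 1.5, 2.0, 5.0, 10.0):
    assert math.exp(-x * x) <= math.exp(-x)

def tail_integral(N, steps=200_000):
    """Midpoint-rule approximation of the integral of e^(-x^2) over [1, N]."""
    h = (N - 1) / steps
    return h * sum(math.exp(-(1 + (i + 0.5) * h) ** 2) for i in range(steps))

# Partial integrals stabilize as N grows: the tail converges.
print(tail_integral(5), tail_integral(10), tail_integral(50))
```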
So for the segment from 1 to plus infinity this inequality holds, and thus we can easily call our integral convergent, because we've actually calculated the integral of the exponent of minus x in the previous video, and the limit obviously exists. So what about the actual integral we are looking at, which is not from 1 to plus infinity but from 0 to plus infinity? That's easy, because as we do know, we can add areas under curves: if you want to calculate the area of two figures together, you just need to sum the areas of the two separate parts. So we write the integral from 0 to 1 plus the integral from 1 to plus infinity. As for the second one, we already know that it converges. As for the first one, we do not even care about convergence, because it is a definite integral with finite limits, with finite bounds. More importantly, the function being integrated does not have any discontinuities on the segment, so it is necessarily integrable. So as a result, this is also a finite number, and the whole integral exists, that is, converges. That's a really interesting thing, because as we previously stated, we cannot express the antiderivative of the exponent of minus x squared in elementary functions, but we can actually quite simply prove it convergent. So it makes sense that all our tricky science here is extremely useful, for instance in understanding why the basic examples from probability theory are actually well defined. So at this very point, we are actually able to calculate any possible definite integral, both proper and improper, using our fundamental theorem of calculus. Now, it's time to think about how numerical methods for calculating integrals work when it is not possible to calculate the antiderivative, and what they stand for; that's the theme of our following video.
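The splitting argument above can be illustrated numerically. A sketch under one extra assumption not derived in the lecture: the classical Gaussian integral fact that the integral of e^(-x²) over [0, +∞) equals √π/2 (here we only use it to check the sum; the helper name `midpoint` is my own).

```python
import math

def midpoint(f, a, b, steps=200_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / steps
    return h * sum(f(a + (i + 0.5) * h) for i in range(steps))

def g(x):
    return math.exp(-x * x)

head = midpoint(g, 0, 1)    # ordinary definite integral: always a finite number
tail = midpoint(g, 1, 40)   # stand-in for the convergent tail over [1, +inf)
total = head + tail
print(total, math.sqrt(math.pi) / 2)  # the two numbers agree closely
```

Splitting at 1 mirrors the lecture exactly: the [0, 1] piece needs no convergence argument at all, and only the tail requires the comparison with e^(-x).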