What is the theme of analysis?


It is safe to say that every mathematician, at some point in their career, has had some form of exposure to analysis. Quite often, it appears first in the form of an undergraduate course in real analysis. It is there that one is exposed to a rigorous viewpoint on the techniques of calculus that one is already familiar with. At this stage, one might argue that real analysis is the study of the real numbers, but is it? A big chunk of it involves algebraic properties, and as such lies in the realm of algebra. It is the order properties, though, that have a distinctly analytic flavour. Sure, some of these aspects generalise to the level of topologies, but not all. Completeness, for one, is clearly something that is central to analysis.



Similar arguments can be made for complex analysis and functional analysis.



Now, the question is: for all the topics that are bunched together as analysis, is there any central theme to them? What topics would you say belong to this theme? And what are the underlying themes in these individual subtopics?



Added: It may be a subjective question, but having a rough idea of what the central themes of a certain field are helps one to construct appropriate questions. As such, I think it is important. I am not expecting a single answer, but rather a diverse set of opinions on the matter.










real-analysis complex-analysis functional-analysis analysis soft-question






edited Jun 11 at 12:10 by Max

asked Jun 10 at 12:53 by Sandesh Jr
  • 12
    IMO: The one unifying concept in analysis is the limit operation. Nearly any result you have in analysis has something to do with taking a limit, and nearly any result in analysis uses some form of the completeness of the reals (directly or indirectly).
    – rubikscube09, Jun 10 at 22:28

  • 4
    The central theme here is limits, but I feel amazed when I think that order relations combined with completeness can generate so much of mathematics. The structure of the real numbers is thus the essence and the root of all analysis.
    – Paramanand Singh, Jun 11 at 1:03

  • 4
    Limits? Naw. It is the triangle inequality. You can do analysis without limits, but not without the triangle inequality, $|a|+|b|\ge |c|$. It just keeps on showing up long after epsilon-delta limits do.
    – Yakk, Jun 11 at 14:49

  • 7
    Dear close-voters: I agree that this question is much "softer" than the usual questions we find on MSE, and less precise in the sense of having an objective answer. One might think it is too broad. But I think that a good answer to this question would be incredibly beneficial to new students of analysis, and perhaps give an alternative perspective even to experienced math students and mathematicians alike. Hence, I do not think this question should be closed.
    – YiFan, Jun 12 at 5:40

  • 5
    I agree this is very soft, but in practice most mathematicians have an intrinsic understanding of the flavour of proofs. For instance, in algebraic topology there is a clear intuition in the sense that a geometric concept is given an algebraic flavour through homotopy. Most often, in a working mathematician's toolset, there is an intrinsic understanding of where a certain idea or part of a proof comes from. This question asks when one would say that a particular idea comes from the analysis point of view. As broad as that may be, I feel that it is a useful question to ask.
    – Sandesh Jr, Jun 12 at 11:20











7 Answers

















79

















I think that I'd say that one of the underlying themes of analysis is, really, the limit. In pretty much every subfield of analysis, we spend a lot of time trying to control the size of certain quantities, with taking limits in mind. This is especially true in PDEs, where we consistently want norm estimates on various quantities. Let's just discuss the "basics" of the "basic" subjects (standard topics in real analysis, complex analysis, measure theory, and functional analysis). I'm going to keep this discussion loose, since we could quickly get into a very drawn-out and detailed discussion.



Real analysis is built on limits. Continuity, differentiability, integration, series, etc. all require the concept of limits. Complex analysis has a bit of a different "flavor" than the other core subjects, but it still requires limits to do pretty much everything. By Goursat's theorem, holomorphicity is equivalent to complex differentiability on a neighborhood: limits. Integration (and all that comes with it) and residue theory: limits. We can continue this through the entire subject (Laurent series, normal families, conformality, etc.). Lebesgue integration theory and the powerful theorems that come with it are primarily centered around, essentially, swapping limits, and much of measure theory is built around this. In functional analysis, we can certainly run into non-metrizable (or even non-Hausdorff) topologies, but limits are still central. Many of the common types of spaces, like Hilbert, Banach, and Fréchet spaces, all make use of a metric. We have things like the uniform boundedness principle, compact operators, spectral theory, semigroups, and Fourier analysis (a field in its own right, of course, but one that deals with a lot of functional analysis), and much more, all of which deal with limits (either explicitly or via objects related to previously-discussed material). A significant subfield of analysis is PDEs. As I said earlier, PDE theory often deals with obtaining proper norm estimates on certain quantities in appropriate function spaces to prove e.g. existence and regularity of solutions, once again highly dependent on limit arguments (and, of course, the norms themselves are limit-dependent).



Something else that I didn't touch on, but that is important to discuss, is just how many modes of convergence we use. Some common types of convergence of sequences of functions and operators are pointwise convergence, uniform convergence, local uniform convergence, almost everywhere convergence, convergence in measure, $L^p$ convergence, (more generally) convergence in norm, weak convergence, weak-star convergence, uniform operator convergence, strong operator convergence, weak operator convergence, etc. I didn't distinguish between convergence for operators and for functions too much here, but it is important to do so; e.g., weak-star convergence is pointwise convergence for elements of the dual, but I listed them separately.



EDIT: The OP asked for some details. Of course, writing everything above in detail would amount to me writing books! Instead of talking about everything, I'd like to talk about one pervasive concept in analysis that comes from limits: the integral. I'd like to note that much of the post deals with limits in many other ways as well, explicitly or otherwise. In real analysis, there are various equivalent ways that the integral is defined, but I'd like to use Riemann sums here: we say that a function is Riemann integrable on an interval $[a,b]$ if and only if there exists $I\in\mathbb{R}$ so that




$$I=\lim_{|P|\to 0}\sum_{i=1}^{n} f(t_i)\,(x_i-x_{i-1}),$$




where $|P|$ denotes the size (mesh) of the partition $P=\{x_0,\dots,x_n\}$ and $t_i\in [x_{i-1},x_i].$ We call $I$ the integral, and we denote it as $$I=\int_a^b f(s)\,ds.$$ The integral of a continuous (limits!) function is related to the derivative (limits!) through the fundamental theorem of calculus:




For $f\in C\left([a,b]\right),$ a function $F$ satisfies $$F(x)-F(a)=\int_a^x f(s)\,ds$$ for every $x\in [a,b]$ if and only if $F'=f$.




As the name of the theorem states, this is pretty important. All of this generalizes appropriately to higher dimensions, but I won't discuss that here. Sometimes, integrating a function on its own can be hard, so we approximate it with easier functions (or sometimes, we have a sequence of functions tending to something, and we want to know about the limit and how it integrates). A major theorem in an introductory real analysis class is that if we have a sequence of Riemann integrable functions $(f_n)$ converging uniformly on $[a,b]$ to $f$, then $f$ is Riemann integrable, and we can swap the limit and integral. So, we can swap these two limits. We can do the same for a series of functions that converges uniformly.
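
(As a quick numerical illustration of the limit definition above, here is a minimal Python sketch; NumPy, the integrand, and the choice of left-endpoint tags are assumptions for illustration only.)

    import numpy as np

    def riemann_sum(f, a, b, n):
        """Left-tagged Riemann sum of f over [a, b] with n equal subintervals."""
        x = np.linspace(a, b, n + 1)      # partition points x_0, ..., x_n
        t = x[:-1]                        # tags t_i = x_{i-1}
        return np.sum(f(t) * np.diff(x))  # sum of f(t_i) * (x_i - x_{i-1})

    # As the mesh |P| -> 0, the sums approach 1/3, the integral of x^2 on [0, 1]:
    # the integral is itself a limit.
    for n in (10, 100, 1000, 10000):
        print(n, riemann_sum(lambda s: s**2, 0.0, 1.0, n))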



Okay, let's move on. In complex analysis, the integral is still of central importance. Integrals in the complex plane are path integrals, which can be defined similarly. Complex analysis is centered on studying holomorphic functions, and a theorem of Morera relates holomorphy to the integral:




Let $g:\Omega\rightarrow\mathbb{C}$ be continuous, and suppose $$\int_\gamma g(z)\,dz=0$$ whenever $\gamma=\partial R$ and $R\subset\Omega$ is a rectangle (with sides parallel to the real and imaginary axes). Then $g$ is holomorphic.




Cauchy's theorem states that the integral of a holomorphic function along a closed curve is zero. This can be used to prove the Cauchy integral formula:




If $f\in C^1(\bar{\Omega})$ is holomorphic on a bounded region $\Omega$ with smooth boundary, then for any $z\in\Omega$, we have
$$f(z)=\frac{1}{2\pi i}\int_{\partial\Omega}\frac{f(\zeta)}{\zeta-z}\,d\zeta.$$




Taking the derivative and iterating proves that holomorphic functions are smooth, and further that they have a power series expansion whose coefficients are given by integration. This all hinges on integration.



The integral pops up in many other fundamental ways here. One is in the form of the mean-value property, which states that $$f(z_0)=\frac{1}{2\pi}\int_0^{2\pi} f(z_0+re^{i\theta})\,d\theta$$ whenever $f$ is holomorphic on an open set $\Omega$ and the closed disk centered at $z_0$ of radius $r$ is contained in $\Omega.$ We use the integral to prove other important theorems, such as the maximum modulus principle, Liouville's theorem, etc. We also use it to define a branch of the complex logarithm, to define the coefficients of a Laurent series, and to count zeros and poles of functions (the argument principle). We also like to calculate various types of integrals in the complex plane where the integrand has singularities (often as a trick to calculate real integrals, which is especially relevant for calculating Fourier transforms). This uses the residue theorem, and residues are themselves calculated via taking limits. The theorem states that
$$\int_{\partial\Omega}f(z)\,dz=2\pi i\sum_j \operatorname{Res}_{z_j}(f),$$ where $f$ is holomorphic on an open set $\Omega$ except at singularities $z_j,$ each of which has a relatively compact neighborhood on which $f$ has a Laurent series (the residue is the coefficient with index $-1$, and these coefficients are also integrals by construction of the Laurent series). I think that's enough about complex analysis.
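
(Before moving on, a small numerical sanity check of the mean-value property; the choice $f(z)=e^z$, center, and radius are assumptions for illustration.)

    import numpy as np

    # Mean-value property: f(z0) equals the average of f over a circle about z0.
    z0, r = 0.3 + 0.2j, 1.0
    theta = np.linspace(0.0, 2.0 * np.pi, 20000, endpoint=False)
    values = np.exp(z0 + r * np.exp(1j * theta))
    print(values.mean())   # discrete average over the circle
    print(np.exp(z0))      # f(z0); the two agree to high precision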



Now, let's talk a bit about measure theory. The Riemann integral is somewhat restrictive, so we generalize it to the Lebesgue integral (I have a post about the construction, see How to calculate an integral given a measure?). Note the involvement of limits in that post. If a function is Riemann integrable, then its Riemann and Lebesgue integrals agree. We can define the Lebesgue integral on any measure space. Two of the biggest theorems are the monotone and dominated convergence theorems:




If $f_j\in L^1(X,\mu)$, $0\leq f_1(x)\leq f_2(x)\leq\cdots,$ and $\|f_j\|_{L^1}\leq C<\infty,$ then $\lim_j f_j(x)=f(x)$ exists a.e., with $f\in L^1(X,\mu),$ and $\|f_j-f\|_{L^1}\rightarrow 0.$




and




If $f_j\in L^1(X,\mu)$, $\lim_j f_j(x)=f(x)$ $\mu$-a.e., and there is an $F\in L^1(X,\mu)$ that dominates each $|f_j|$ pointwise $\mu$-a.e., then $f\in L^1(X,\mu)$ and $\|f_j-f\|_{L^1}\rightarrow 0.$




We have immediate generalizations to $L^p$ spaces, as well. These theorems are used extensively to prove things in measure theory, functional analysis, and PDEs. The dominated convergence theorem generalizes the result of using uniform convergence to swap limit and integral. We can use these to show that $L^p$ is complete for $p\in [1,\infty),$ in fact a Banach space, as these define norms. We show that if $p$ is in this range and $X$ is $\sigma$-finite, then the dual of $L^p$ is $L^q$, where $1/p+1/q=1,$ and the pairing is defined, wait for it, via integration. We're beginning to overlap a bit with functional analysis, so I'll switch gears a bit. We often use the integral to define linear functionals, and one such example is in the Riesz representation theorem. Here, we find that the dual of $C(X)$, where $X$ is a compact metric space, is the space of finite, signed measures on the Borel $\sigma$-algebra (Radon measures). In particular, for any bounded linear functional $\omega$ on $C(X)$, there exists a unique Radon measure $\rho$ such that $$\omega(f)=\int_X f\,d\rho.$$
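
(A minimal numerical illustration of dominated convergence; the example $f_n(x)=x^n$ on $[0,1]$, dominated by $F\equiv 1$, is an assumption for illustration.)

    import numpy as np

    # f_n(x) = x**n -> 0 a.e. on [0, 1] with |f_n| <= 1, so the integrals
    # 1/(n + 1) must tend to the integral of the limit function, namely 0.
    x = np.linspace(0.0, 1.0, 100000, endpoint=False)
    for n in (1, 10, 100, 1000):
        print(n, np.mean(x**n))   # left-rule approximation of the integral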



Also, we get a generalization of the fundamental theorem of calculus using the Hardy-Littlewood maximal function:




Let $f\in L^1(\mathbb{R}^n,dx)$ and consider $$A_rf(x)=\frac{1}{m(B_r)}\int_{B_r(x)}f(y)\,dy,$$ where $r>0.$ Then $$\lim_{r\rightarrow 0} A_rf(x)=f(x)$$ a.e.




In fact, if $f\in L^p$, then $$\lim_{r\rightarrow 0}\frac{1}{m(B_r)}\int_{B_r(x)}|f(y)-f(x)|^p\,dy=0$$ for a.e. $x$.



Something of interest is swapping integrals themselves, which manifests in the theorems of Tonelli and Fubini. There are multiple versions, most of which involve objects that require definitions, so I'll just give a quick version: the Fubini-Tonelli theorem for complete measures.




Let $(X,M,\mu)$ and $(Y,N,\nu)$ be complete, $\sigma$-finite measure spaces, and let $(X\times Y,\mathcal{L},\lambda)$ be the completion of $(X\times Y,M\otimes N,\mu\times\nu).$ If $f$ is $\mathcal{L}$-measurable and $f\in L^1(X\times Y,\lambda),$ then $$\int f\,d\lambda=\iint f(x,y)\,d\mu(x)\,d\nu(y)=\iint f(x,y)\,d\nu(y)\,d\mu(x).$$
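
(A numerical sanity check of the iterated-integral swap; the product domain and integrand are assumptions for illustration.)

    import numpy as np

    # Fubini: for integrable f on [0, 1]^2, the two iterated integrals agree.
    x = np.linspace(0.0, 1.0, 500, endpoint=False)
    y = np.linspace(0.0, 1.0, 500, endpoint=False)
    X, Y = np.meshgrid(x, y, indexing="ij")
    f = np.exp(-(X + Y))
    dx = dy = 1.0 / 500
    print((f.sum(axis=1) * dy).sum() * dx)   # integrate in y, then in x
    print((f.sum(axis=0) * dx).sum() * dy)   # integrate in x, then in y; both are (1 - 1/e)^2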




EDIT: I added in a version of Fubini/Tonelli and material related to Cauchy's theorem and the Cauchy integral formula. I am a bit busy today, but I will post about functional analysis tomorrow!



Now, we'll move on to functional analysis. This is a pretty big topic, so I'm just going to pick a few things to talk about. We already had some discussion of $L^p$ spaces, and a lot of the remaining discussion will be related to them in some manner, making all of the discussion inherently reliant on the integral. For this reason, I'm going to stop outlining exactly when we're using integrals.



Recall that $L^p$ spaces constitute Banach spaces. It is important to know that if $p=2,$ then the space becomes a Hilbert space, which is a very nice structure and makes the use of $L^2$ extremely common in PDEs. If $\mu$ is $\sigma$-finite and the $\sigma$-algebra associated to $X$ is countably generated, then $L^2(X,\mu)$ is separable, allowing us to get an orthonormal basis. For example, $\{e^{in\theta}\}_{n\in\mathbb{Z}}$ is an orthonormal basis for $L^2(S^1,d\theta/2\pi)$ (see Fourier series).
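
(Orthonormality of this basis can be checked numerically; the sketch below, with an arbitrary choice of modes, approximates the $L^2(S^1,d\theta/2\pi)$ inner product by a discrete average.)

    import numpy as np

    theta = np.linspace(0.0, 2.0 * np.pi, 4096, endpoint=False)

    def inner(n, m):
        # <e_n, e_m> with respect to dtheta / (2 pi), approximated by a mean
        return np.mean(np.exp(1j * n * theta) * np.conj(np.exp(1j * m * theta)))

    print(abs(inner(3, 3)), abs(inner(3, 5)))   # about 1.0 and about 0.0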



Something that is very useful in PDEs is to define functions of operators, such as $\sqrt{-\Delta},$ where $\Delta$ is the Laplacian, and the integral arises naturally in various forms of functional calculus. One such example is the holomorphic functional calculus (I will discuss more later on). If we have a bounded linear operator $T$, a bounded set $\Omega\subset\mathbb{C}$ with smooth boundary containing the spectrum $\sigma(T)$ of $T$ in its interior, and a function $f$ holomorphic on a neighborhood of $\Omega,$ then we can define $$f(T)=\frac{1}{2\pi i}\int_{\partial\Omega} f(\zeta)R_\zeta\, d\zeta,$$ where $R_\zeta:=(\zeta-T)^{-1}$ is called the resolvent of $T$. This is one simple way of making rigorous the idea of functions of operators. The holomorphic functional calculus can be extremely useful, although functional calculus can be made more general (such as the Borel functional calculus).



One of the commonly-studied types of operators are integral operators, given in the form $$Ku(x)=\int_X k(x,y)u(y)\,d\mu(y)$$ on a measure space $(X,\mu)$. These arise naturally in solving differential/integral equations. You'll notice, for example, that the fundamental solutions of some of the basic PDEs are given via convolution, in which case our solution involves integral operators. You may have heard of the Hilbert-Schmidt kernel theorem, which says the following:




If $T:L^2(X_1,\mu_1)\rightarrow L^2(X_2,\mu_2)$ is a Hilbert-Schmidt operator, then there exists $K\in L^2(X_1\times X_2,\mu_1\times\mu_2)$ so that $$(Tu,v)_{L^2}=\iint K(x_1,x_2)u(x_1)\overline{v(x_2)}\,d\mu_1(x_1)\,d\mu_2(x_2).$$




The converse also holds: given $K\in L^2(X_1\times X_2,\mu_1\times\mu_2)$, then $T$ as written above defines a Hilbert-Schmidt operator, and it satisfies $\|T\|_{\mathrm{HS}}=\|K\|_{L^2}.$ For a generalization to tempered distributions (which we'll talk about later), look up the Schwartz kernel theorem.



Another area of interest is semigroups, which (once again) have applications in differential equations. If we take a contraction semigroup $\{S(t)\}_{t\geq 0}$ on a real Banach space $X$ with infinitesimal generator $A$, then we have the following cool result:




If $\lambda>0,$ then $\lambda$ is in the resolvent set of $A$, denoted $\rho(A),$ and if $R_\lambda=(\lambda-A)^{-1}$ denotes the resolvent of $A$, then $$R_\lambda u=\int_0^\infty e^{-\lambda t} S(t)u\,dt$$ for $u\in X,$ and $\|R_\lambda\|\leq\frac{1}{\lambda}.$




That is, we can write the resolvent as the Laplace transform of the semigroup. From here, one can prove some major results, like the Hille-Yosida theorem for contraction semigroups:




Let $A$ be a closed, densely-defined linear operator on $X$. Then $A$ is the generator of a contraction semigroup $\{S(t)\}_{t\geq 0}$ if and only if $$(0,\infty)\subset\rho(A)\quad\text{and}\quad \|R_\lambda\|\leq\frac{1}{\lambda}$$ for $\lambda>0.$




We can generalize past contraction semigroups to get a more general version of this theorem. Let me also state a related theorem, Stone's theorem:




If $A$ is self-adjoint, then $iA$ generates the unitary group $U(t)=e^{itA}.$ Conversely, if $\{U(t)\}_{t\geq 0}$ is a semigroup of unitary operators, then there exists a self-adjoint operator $A$ so that $U(t)=e^{itA}.$




Semigroup theory has direct application to PDEs. For example, if you have a parabolic equation, say $$\partial_t u+Lu=0$$ with appropriate boundary and initial conditions, where $L$ is uniformly elliptic, then you can apply semigroup theory to guarantee a unique solution given by $S(t)u(x,0)$. This is a generalization of solving linear ODEs on finite-dimensional spaces,
\begin{align*}
x'&=Ax\\
x(0)&=x_0,
\end{align*}
which has the solution $x(t)=e^{tA}x_0$; here $\{e^{tA}\}_{t\geq 0}$ is a one-parameter semigroup.
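
(The finite-dimensional picture is easy to play with numerically; this sketch, assuming SciPy's matrix exponential and a matrix chosen for illustration, checks the semigroup law $e^{(s+t)A}=e^{sA}e^{tA}$.)

    import numpy as np
    from scipy.linalg import expm

    A = np.array([[0.0, 1.0], [-1.0, -0.5]])
    x0 = np.array([1.0, 0.0])
    s, t = 0.3, 0.7
    print(expm((s + t) * A) @ x0)            # flow for time s + t
    print(expm(s * A) @ (expm(t * A) @ x0))  # flow for t, then for s: same vector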



As I mentioned earlier, Fourier analysis is certainly its own field, but it uses a lot of functional-analytic techniques, and it is monumentally important in PDEs. Let us first define the Fourier transform $\mathcal{F}:L^1(\mathbb{R}^n)\rightarrow L^\infty(\mathbb{R}^n)$ by
$$\mathcal{F}u(\xi)=\hat{u}(\xi):=(2\pi)^{-n/2}\int_{\mathbb{R}^n} u(x)e^{-ix\cdot\xi}\,dx.$$ We can also extend this to a bounded operator from $L^2$ to $L^2.$ For nice enough functions, the Fourier transform enjoys many useful properties, such as
$$D_\xi^\alpha(\mathcal{F}u)=\mathcal{F}((-x)^\alpha u)$$
and $$\mathcal{F}(D_x^\alpha u)=\xi^\alpha\,\mathcal{F}u,$$ where $D^\alpha=i^{-|\alpha|}\partial^\alpha$ and $\alpha$ is a multi-index. It turns out that a very natural place to define this is the Schwartz space $\mathcal{S},$ on which $\mathcal{F}$ is a topological isomorphism. We have the famed Fourier inversion formula
$$u(x)=(2\pi)^{-n/2}\int_{\mathbb{R}^n}\hat{u}(\xi)e^{ix\cdot\xi}\,d\xi,$$ and from here we can get the Plancherel theorem:
$$\|u\|^2_{L^2}=\|\mathcal{F}u\|^2_{L^2},$$ so $\mathcal{F}$ is an isometric isomorphism.
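
(Plancherel has a familiar discrete analogue: the unitarily normalized DFT preserves the $\ell^2$ norm. A quick sketch, with a random test vector chosen for illustration:)

    import numpy as np

    u = np.random.randn(1024)
    u_hat = np.fft.fft(u, norm="ortho")   # unitary normalization of the DFT
    print(np.sum(np.abs(u)**2))           # squared norm of u
    print(np.sum(np.abs(u_hat)**2))       # squared norm of its transform: the same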



We can define the Fourier transform on the dual space of $\mathcal{S},$ the space of tempered distributions (denoted $\mathcal{S}'$), via duality: $$\langle\hat{u},f\rangle=\langle u,\hat{f}\rangle,$$ where $u\in\mathcal{S}'$ and $f\in\mathcal{S}.$ Due to the algebra and calculus of distributions, one can extend all of the previously-listed properties to this setting. Now, we might as well mention a powerful and important generalization of the Hilbert-Schmidt kernel theorem, the Schwartz kernel theorem:




Let $A:\mathcal{S}\rightarrow\mathcal{S}'$ be a continuous linear map. Then there exists $K_A\in\mathcal{S}'(\mathbb{R}^n\times\mathbb{R}^n)$ so that for all $u,v\in\mathcal{S},$ $$\langle Au,v\rangle=\langle K_A,u\otimes v\rangle,$$ where $(u\otimes v)(x,y)=u(x)v(y)\in\mathcal{S}(\mathbb{R}^n\times\mathbb{R}^n).$




We sometimes abuse notation and write this as $Au(x)=\int K_A(x,y)u(y)\,dy,$ so that $$\langle Au,v\rangle=\iint K_A(x,y)v(x)u(y)\,dy\,dx.$$



We can use the Fourier transform to define another functional calculus via Fourier multipliers. Motivated by the Fourier transform of the Laplacian, we define $$f(D)u=\mathcal{F}^{-1}\left(f\left(|\xi|\right)\mathcal{F}u\right)$$ for a "nice" function $f$ (by "nice," I mean such that the above makes sense). This allows us to make sense of objects like $e^{t\Delta}$ or $\cos\left(t\sqrt{-\Delta}\right).$ The former is related to the fundamental solution of the heat equation and the latter to the fundamental solution of the wave equation (notice anything related to semigroups for the former?). Fourier multipliers generalize to pseudodifferential operators (which are also a generalization of singular integral operators), but I'd rather not get into this, since this section is already very long. Fourier multipliers can be used to give a general definition of Sobolev spaces, which are of fundamental importance in PDEs, as well. We define these spaces, for any real $k$, as $$W^{k,p}(\mathbb{R}^n)=\left\lbrace u\in\mathcal{S}'(\mathbb{R}^n):\ \mathcal{F}^{-1}\left(\langle\xi\rangle^k\,\mathcal{F}u\right)\in L^p(\mathbb{R}^n)\right\rbrace,$$ where $\langle\xi\rangle=\left(1+|\xi|^2\right)^{1/2}.$ These are Banach spaces, and when $p=2,$ we get a Hilbert space. For this reason, we use the provocative notation $W^{k,2}=H^k.$
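
(A discrete caricature of a Fourier multiplier, with the initial datum chosen for illustration: apply $e^{t\Delta}$ on the circle by damping the $n$-th Fourier mode by $e^{-tn^2}$.)

    import numpy as np

    n_modes = 256
    theta = np.linspace(0.0, 2.0 * np.pi, n_modes, endpoint=False)
    u = np.sign(np.sin(theta))                    # a rough initial datum
    n = np.fft.fftfreq(n_modes, d=1.0 / n_modes)  # integer frequencies
    t = 0.05
    u_heat = np.fft.ifft(np.exp(-t * n**2) * np.fft.fft(u)).real
    print(np.abs(u - u_heat).max())   # the multiplier has smoothed and damped u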



Something very significant that I haven't discussed yet is spectral theory, so I'll say a bit about it now. This is another field with significant application to PDEs (and quantum mechanics). Spectral theory is directly connected to the Hilbert space $L^2$, so the integral is slightly "hidden." Here is one version of the spectral theorem, which can be proven using the holomorphic functional calculus or the Fourier transform on $\mathcal{S}'$:




If $A$ is a bounded, self-adjoint operator on a Hilbert space $H$, then there exist a measure space $(X,\mathfrak{F},\mu)$, a unitary map $\Phi:H\rightarrow L^2(X,\mu),$ and $a\in L^\infty(X,\mu)$ so that $$\Phi A\Phi^{-1}f(x)=a(x)f(x)$$ for all $f\in L^2(X,\mu).$ Here, $a$ is real-valued, and $\|a\|_{L^\infty}=\|A\|.$




There is also a near-identical version for bounded unitary operators; the only difference is that $|a|=1$ on $X$. We can further extend to normal operators. This also generalizes for unbounded operators:




If $A$ is an unbounded, self-adjoint operator on a separable Hilbert space $H$, then there exist a measure space $(X,\mu)$, a unitary map $\Phi:L^2(X,\mu)\rightarrow H$, and a real-valued, measurable function $a$ on $X$ such that $$\Phi^{-1}A\Phi f(x)=a(x)f(x)$$ for $\Phi f\in\mathcal{D}(A).$ If $f\in L^2(X,\mu),$ then $\Phi f\in\mathcal{D}(A)$ if and only if $af\in L^2(X,\mu).$




This gives us a new functional calculus: if $f:\mathbb{R}\rightarrow\mathbb{C}$ is Borel, then we can define $f(A)$ via $$\Phi^{-1}f(A)\Phi g(x)=f(a(x))g(x).$$ If $f$ is bounded and Borel, then we can define this for any $g\in L^2(X,\mu),$ and $f(A)$ will be a bounded operator on $H$. If not, then we can define $$\mathcal{D}(f(A))=\left\{\Phi g\in H:\ g\in L^2(X,\mu)\text{ and } f(a(x))g\in L^2(X,\mu)\right\}.$$
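
(In finite dimensions the spectral theorem and its functional calculus are concrete; this sketch, with a symmetric matrix chosen for illustration, computes $f(A)$ by applying $f$ to the eigenvalues, i.e. to the "multiplier" $a$.)

    import numpy as np

    A = np.array([[2.0, 1.0], [1.0, 3.0]])
    eigvals, U = np.linalg.eigh(A)             # A = U diag(eigvals) U^T
    fA = U @ np.diag(np.exp(eigvals)) @ U.T    # f(A) with f = exp
    print(fA)   # the matrix exponential of A, since A is symmetric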



I think I'll stop here. PDE theory utilizes all of the above, but since it is an application of these subjects, rather than "pure analysis," I'm going to leave that out.



TL;DR Limits are fundamental in analysis, and one such example of their use is in the definition of the integral, which is ubiquitous in the field.



References:



1. Gerald Folland: Real Analysis
2. Elias M. Stein and Rami Shakarchi: Complex Analysis
3. Michael Taylor: Introduction to Complex Analysis (link to pdf on his website: http://mtaylor.web.unc.edu/files/2018/04/complex.pdf)
4. Michael Taylor: Measure Theory and Integration
5. Michael Taylor: Partial Differential Equations I
6. Michael Taylor: Partial Differential Equations II
7. John Conway: A Course in Functional Analysis
8. Lawrence Evans: Partial Differential Equations


EDIT: I've added in functional analysis (and, arguably, went a bit overboard) and listed some references, as well as tweaked some previous statements. I apologize in advance if there are typos; this was difficult to proofread!



EDIT: I've added in a statement of the Schwartz kernel theorem, which I forgot to write before.






  • 3
    If you don't mind, please do go into detail.
    – Sandesh Jr, Jun 10 at 14:23

  • 1
    @accumulation we can define limits in much more general settings, but its manifestation as a metric is fundamental to most core constructions in analysis. Also, I know very little about transfinite math, so I'll elect to not make any claims about it.
    – cmk, Jun 10 at 21:35

  • 1
    @SandeshJr I've edited my post to include functional analysis (and I referenced PDE throughout). I think I'm going to call it here!
    – cmk, Jun 12 at 12:14

  • 3
    Man, I bet this has taken some time to write out. However, quite delightful to read! (+1)
    – mrtaurho, Jun 12 at 13:57

  • 3
    Really great answer. And if you ever do end up writing a book, I would most definitely read it.
    – Sandesh Jr, Jun 13 at 13:54


















18

















From my viewpoint, Real Analysis is the study of functions of one or several real variables. Everything else (limits, derivatives, integrals, infinite series, etc.) is a tool serving this purpose. [There is a mild exception one has to make here for sequences and series of real numbers/vectors; these are functions defined on the set of natural numbers and, sometimes, integers.] The theory of real numbers and limits was developed (in the 19th century) in order to make the study of functions rigorous. Note that the theory of limits and the "epsilontics" is not the only way to go. The alternative (at least, the only alternative I know) is Nonstandard Analysis, which justifies the notion of infinitesimally small and infinitely large quantities used by Newton, Leibniz, and others for (about) the first 150 years of the existence of Real Analysis (before limits were introduced by Cauchy and, in their modern form, by Weierstrass).



For instance, what is the purpose (or, rather, purposes) of computing derivatives of functions? It is to determine if the given function is increasing/decreasing/concave/convex or to approximate the given function by some polynomial (usually a polynomial of degree one).



What is the purpose of computing limits? It is to determine "approximate" behavior of the function when the input variable is close to some (finite or infinite) value.



What is the purpose of computing integrals? It is to compute length (of curves), areas (of surfaces), volumes (of solids), or to find solutions of differential equations (which are equations on functions involving some derivatives). In the geometric problems (lengths, areas and volumes) one computes a single number "measuring" the given function (say, the length of a curve).



What is the purpose of computing Taylor (Fourier) series? It is to approximate functions with polynomials (or sums of trigonometric functions) which are (usually) easier to analyze than general smooth functions.
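
(As a small numerical illustration of that last point, with the function and degree chosen for illustration:)

    import numpy as np

    # sin(x) vs. its degree-5 Taylor polynomial x - x^3/6 + x^5/120 about 0.
    x = np.linspace(-1.0, 1.0, 201)
    taylor5 = x - x**3 / 6.0 + x**5 / 120.0
    print(np.max(np.abs(np.sin(x) - taylor5)))   # maximal error is tiny on [-1, 1]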



This is how it was from the very beginning of Real Analysis (Newton, Leibniz, Bernoulli, Euler, and many others).






  • "Real Analysis is a study of functions (of one or several) real variable." Some kind of little grammar snafu here, since if you take out the parenthetical statement it's missing a word between "functions" and "real." Did you mean, "Real Analysis is the study of functions of a real variable (or several variables)"?
    – jpmc26, Jun 12 at 21:14



















15

















Long ago someone told me this. I still remember it ...




Sometimes I find myself just pushing symbols around, and I wonder, "Am I really doing analysis?" But when an argument begins, "Let $\varepsilon > 0$," then I know it really is analysis.







  • 4
    "Let $\varepsilon>0$", I spot with my eyeball. \ Analysis! That much is clear to all. \ But much to my chagrin, \ no limits were within; \ the variable could've been integral.
    – Discrete lizard, Jun 11 at 17:54

  • @Discretelizard Did you come up with that on the fly? I quite like it.
    – user3067860, Jun 11 at 19:10

  • 2
    @user3067860 Yes, I did. Thanks! I first tried to let the punchline be that "Let $\epsilon>0$" is also a possible beginning of an argument about approximation algorithms, often having little to do with analysis, but didn't get that to work and ended up with the above.
    – Discrete lizard, Jun 11 at 19:51

  • Let $\varepsilon>0$ be a nonzero infinite ordinal ...
    – Gareth McCaughan, Jun 12 at 0:44


















9

















I would say that the central concept of analysis is the concept of limit, specifically the limit of a sequence. Everything that uses concepts built on the concept of limit I would classify as analysis, not algebra. That includes the limit of a series, the limit of a function, continuity of a function, the derivative, and the Riemann integral. Complex analysis then emerges from complex algebra when you introduce the complex derivative and integration over curves. Functional analysis also depends on the concepts of continuity and integral; otherwise it would be just the algebra of infinite-dimensional spaces.







7

















One additional central aspect of analysis is approximation. After all, the very limit notation is nothing but a qualitative approximation.

At its core it means: "The more effort you put into your approximation, the more precise it will be."

For example, continuity gives you the guarantee that even if you miss the true input value of the function, your output value will still be "close." Just imagine if reading a number off a ruler weren't continuous: there'd be hardly any point in making the measurement in the first place!

Differentiability gives you an approximation of an interval in which the function is monotone.

Integration gives you an approximation of the error you make when you approximate a function by segments.

In many cases, analysis gives us results saying that the difference between an approximation and the original object goes to 0 in the limit. If we look at this from the practical point of view, where everything has an error term, this is all the justification we need to swap out the original for the approximation. (Yes, if you have a calculation that isn't e.g. continuous, you'll get rubbish; but as any representation of the original object would have an error, you'd always get rubbish anyway.)
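
(Returning to the ruler example above, a tiny numerical sketch, with the functions and perturbation chosen for illustration: a small input error yields a small output error for continuous functions.)

    import numpy as np

    x_true, x_read = 2.000, 2.003        # e.g., a slightly misread ruler
    for f in (np.sqrt, np.exp, np.sin):
        print(f.__name__, abs(f(x_true) - f(x_read)))   # all outputs stay close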






  • I like this idea. In a sense, I always thought differential calculus was the idea that the information on a local neighbourhood is encoded at a given point in the form of derivatives and the associated Taylor series (for analytic functions). So, this idea resonates with me.
    – Sandesh Jr, Jun 13 at 11:06


















6

















My impression is that Analysis is largely in contrast to Finite/Discrete Math, and thus deals with continuous spaces, especially the real line. This is generalized to spaces with a metric, measure, and/or topology.






4

















Mathematical analysis is a mental edifice built up to describe and understand phenomena of geometry, physics, and technics in terms of formulas involving finite mathematical expressions. The core of this all is the study of functions $f:\mathbb{R}\to\mathbb{R}$ and their properties.






      share|cite|improve this answer










      $endgroup$
















        Your Answer








        StackExchange.ready(function()
        var channelOptions =
        tags: "".split(" "),
        id: "69"
        ;
        initTagRenderer("".split(" "), "".split(" "), channelOptions);

        StackExchange.using("externalEditor", function()
        // Have to fire editor after snippets, if snippets enabled
        if (StackExchange.settings.snippets.snippetsEnabled)
        StackExchange.using("snippets", function()
        createEditor();
        );

        else
        createEditor();

        );

        function createEditor()
        StackExchange.prepareEditor(
        heartbeatType: 'answer',
        autoActivateHeartbeat: false,
        convertImagesToLinks: true,
        noModals: true,
        showLowRepImageUploadWarning: true,
        reputationToPostImages: 10,
        bindNavPrevention: true,
        postfix: "",
        imageUploader:
        brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
        contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/4.0/"u003ecc by-sa 4.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
        allowUrls: true
        ,
        noCode: true, onDemand: true,
        discardSelector: ".discard-answer"
        ,immediatelyShowMarkdownHelp:true
        );



        );














        draft saved

        draft discarded
















        StackExchange.ready(
        function ()
        StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f3257398%2fwhat-is-the-theme-of-analysis%23new-answer', 'question_page');

        );

        Post as a guest















        Required, but never shown


























        7 Answers
        7






        active

        oldest

        votes








        7 Answers
        7






        active

        oldest

        votes









        active

        oldest

        votes






        active

        oldest

        votes









        79
















        $begingroup$

        I think that I'd say that one of the underlying themes of analysis is, really, the limit. In pretty much every subfield of analysis, we spend a lot of time trying to control the size of certain quantities, with taking limits in mind. This is especially true in PDEs, when we consistently desire norm estimates on various quantities. Let's just discuss the "basics" of the "basic" subjects (standard topics in real, complex, measure theory, functional). I'm going to keep this discussion loose, since we could quickly get into a very drawn-out and detailed discussion.



        Real analysis is built on limits. Continuity, differentiability, integration, series, etc. all require the concept of limits. Complex analysis has a bit of a different "flavor" than the other core types, but it still requires limits to do pretty much everything. By Goursat's theorem, holomorphicity is equivalent to complex differentiability over a neighborhood- limits. Integration (and all that comes with it) and residue theory- limits. We can continue this for the entire subject (Laurent series, normal families, conformity, etc.). Lebesgue integration theory and the powerful theorems that come with it are primarily centered around, essentially, swapping limits, and much of measure theory is built around this. In functional analysis, we certainly have times where we can run into not metrizable (or even Hausdorff) topologies, but limits are still central. Many of the common types of spaces like Hilbert, Banach, and Frechet spaces all make use of a metric. We have things like the uniform boundedness principle, compact operators, spectral theory, semigroups, Fourier analysis (this is a field in its own right, of course, but it deals with a lot of functional analysis), and much more, all of which deal with limits (either explicitly or via objects related to previously-discussed material). A significant subfield of analysis is PDEs. As I said earlier, PDEs often deals with obtaining proper norm estimates on certain quantities in appropriate function spaces to prove e.g. existence and regularity of solutions, once again highly dependent on limit arguments (and, of course, the norms themselves are limit-dependent).



        Something else that I didn't touch on, but is important to discuss, is just how many modes of convergence we use. Some common types of convergence of sequences of functions and operators are pointwise convergence, uniform convergence, local uniform convergence, almost everywhere convergence, convergence in measure, $L^p$ convergence, (more generally) convergence in norm, weak convergence, weak star convergence, uniform operator convergence, strong operator convergence, weak operator convergence, etc. I didn't distinguish between convergence for operators and functions too much here, but it is important to do so e.g. weak star convergence is pointwise convergence for elements of the dual, but I listed them as separate.



        EDIT: The OP asked for some details. Of course, writing everything above in details would amount to me writing books! Instead of talking about everything, I'd like to talk about one pervasive concept in analysis that comes from limits- the integral. I'd like to note that much of the post deals with limits in many other ways as well, explicitly or otherwise. In real analysis, there are various equivalent ways that the integral is defined, but I'd like to use Riemann sums here: we say that a function is Riemann integrable on an interval $[a,b]$ if and only if there exists $Iin mathbbR$ so that




        $$I=lim_sumlimits_i=1^n f(t_i)(x_i-x_i-1),$$




        where $|P|$ denotes the size of the partition, where $t_iin [x_i-1,x_i].$ We call $I$ the integral, and we denote it as $$I=intlimits_a^b f(s)ds.$$ The integral of a continuous (limits!) function is related to the derivative (limits!) through the fundamental theorem of calculus:




For $f\in C([a,b]),$ a function $F$ satisfies $$F(x)-F(a)=\int_a^x f(s)\,ds$$ for all $x\in[a,b]$ if and only if $F'=f$.




        As the name of the theorem states, this is pretty important. All of this generalizes appropriately to higher dimensions, but I won't discuss that here. Sometimes, integrating a function on its own can be hard, so we approximate it with easier functions (or sometimes, we have a sequence of functions tending to something, and we want to know about the limit and how it integrates). A major theorem in an introductory real analysis class is that if we have a sequence of Riemann integrable functions $(f_n)$ converging uniformly on $[a,b]$ to $f$, then $f$ is Riemann integrable, and we can swap the limit and integral. So, we can swap these two limits. We can do the same for a series of functions that converges uniformly.
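As a quick numerical illustration of both the Riemann-sum limit and the fundamental theorem of calculus (a sketch of my own, not from the original answer; the integrand $\cos$ on $[0,\pi/2]$ is just a convenient assumption), one can watch the sums converge to the exact antiderivative difference:

```python
import numpy as np

# Riemann sums for f = cos on [0, pi/2]; the FTC gives the exact value
# sin(pi/2) - sin(0) = 1, and the sums converge to it as the mesh -> 0.
a, b = 0.0, np.pi / 2
for n in (10, 100, 1000, 10000):
    x = np.linspace(a, b, n + 1)        # uniform partition with mesh (b-a)/n
    t = 0.5 * (x[:-1] + x[1:])          # sample points t_i in [x_{i-1}, x_i]
    s = np.sum(np.cos(t) * np.diff(x))  # the Riemann sum
    print(f"n={n:6d}  sum={s:.8f}  error={abs(s - 1.0):.2e}")
```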



Okay, let's move on. In complex analysis, the integral is still of central importance. Integrals in the complex plane are path integrals, which can be defined similarly. Complex analysis is centered on studying holomorphic functions, and a theorem of Morera relates holomorphy to the integral:




Let $g:\Omega\rightarrow\mathbb{C}$ be continuous, and suppose that $$\int_\gamma g(z)\,dz=0$$ whenever $\gamma=\partial R$ and $R\subset\Omega$ is a rectangle (with sides parallel to the real and imaginary axes). Then $g$ is holomorphic.




Cauchy's theorem states that the integral of a holomorphic function along a closed curve (one that is null-homotopic in the domain) is zero. This can be used to prove the Cauchy integral formula:




If $f\in C^1(\bar{\Omega})$ is holomorphic on a bounded region $\Omega$ with smooth boundary, then for any $z\in\Omega$, we have
$$f(z)=\frac{1}{2\pi i}\int_{\partial\Omega}\frac{f(\zeta)}{\zeta-z}\,d\zeta.$$




Differentiating under the integral and iterating proves that holomorphic functions are smooth, and further we get that they have a power series expansion, with the coefficients given by integration. This all hinges on integration.



The integral pops up in many other fundamental ways here. One is in the form of the mean-value property, which states that $$f(z_0)=\frac{1}{2\pi}\int_0^{2\pi} f(z_0+re^{i\theta})\,d\theta$$ whenever $f$ is holomorphic on an open set $\Omega$ and the closed disk centered at $z_0$ of radius $r$ is contained in $\Omega.$ We use the integral to prove other important theorems, such as the maximum modulus principle, Liouville's theorem, etc. We also use it to define a branch of the complex logarithm, to define the coefficients of a Laurent series, and to count zeros and poles of functions (argument principle). We also like to calculate various types of integrals in the complex plane where the integrand has singularities (often as a trick to calculate real integrals, which is especially relevant for calculating Fourier transforms). This uses the residue theorem, and residues are also calculated via taking limits. The theorem states that
$$\int_{\partial\Omega}f(z)\,dz=2\pi i\sum_j\text{Res}_{z_j}(f),$$ where $f$ is holomorphic on an open set $\Omega$ except at singularities $z_j,$ each of which has a relatively compact neighborhood on which $f$ has a Laurent series (the residue is the coefficient of index $-1$, and these coefficients are themselves integrals by construction of the Laurent series). I think that's enough about complex analysis.
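Both the Cauchy integral formula and the residue theorem are easy to check numerically; here is a rough sketch (my own, with arbitrarily chosen integrands) that discretizes circles with the trapezoid rule:

```python
import numpy as np

def circle_integral(g, center, radius, n=400):
    """Integrate g over a circle via the trapezoid rule (spectrally accurate
    for smooth periodic integrands)."""
    th = 2.0 * np.pi * np.arange(n) / n
    z = center + radius * np.exp(1j * th)
    dz = 1j * radius * np.exp(1j * th)        # dz/dtheta
    return np.sum(g(z) * dz) * (2.0 * np.pi / n)

# Cauchy integral formula: recover f(0) = 1 for f(z) = exp(z).
print(circle_integral(lambda z: np.exp(z) / z, 0.0, 1.0) / (2j * np.pi))

# Residue theorem: 1/(z^2+1) has residue 1/(2i) at z = i, so a small circle
# around i gives 2*pi*i * (1/(2i)) = pi.
print(circle_integral(lambda z: 1.0 / (z**2 + 1.0), 1j, 0.5))
```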



Now, let's talk a bit about measure theory. The Riemann integral is somewhat restrictive, so we generalize it to the Lebesgue integral (I have a post about the construction; see How to calculate an integral given a measure?). Note the involvement of limits in that post. If a function is Riemann integrable, then its Riemann and Lebesgue integrals agree. We can define the Lebesgue integral on any measure space. Two of the biggest theorems are the monotone and dominated convergence theorems:




If $f_j\in L^1(X,\mu)$, $0\leq f_1(x)\leq f_2(x)\leq\cdots,$ and $\|f_j\|_{L^1}\leq C<\infty,$ then $\lim_j f_j(x)=f(x)$ exists $\mu$-a.e., with $f\in L^1(X,\mu)$ and $\|f_j-f\|_{L^1}\rightarrow 0.$




        and




If $f_j\in L^1(X,\mu)$ and $\lim_j f_j(x)=f(x)$ $\mu$-a.e., and there is an $F\in L^1(X,\mu)$ so that $F$ dominates each $|f_j|$ pointwise $\mu$-a.e., then $f\in L^1(X,\mu)$ and $\|f_j-f\|_{L^1}\rightarrow 0.$




We have immediate generalizations to $L^p$ spaces, as well. These theorems are used extensively to prove things in measure theory, functional analysis, and PDEs. The dominated convergence theorem generalizes the result of using uniform convergence to swap limit and integral. We can use these to show that $L^p$ is complete for $p\in[1,\infty),$ in fact a Banach space, as these define norms. We can show that if $p$ is in this range and $X$ is $\sigma$-finite, then the dual of $L^p$ is $L^q$, where $1/p+1/q=1,$ and the pairing is defined, wait for it, via integration. We're beginning to overlap a bit with functional analysis, so I'll switch gears a bit. We often use the integral to define linear functionals, and one such example is in the Riesz representation theorem. Here, we find that the dual of $C(X)$, where $X$ is a compact metric space, is the space of finite signed measures on the Borel $\sigma$-algebra (Radon measures). In particular, for any bounded linear functional $\omega$ on $C(X)$, there exists a unique Radon measure $\rho$ such that $$\omega(f)=\int_X f\,d\rho.$$
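A small numerical sketch of dominated convergence, next to a standard failure mode (my own illustration, not from the original answer; the two sequences are textbook choices assumed just for the demo):

```python
import numpy as np

x = np.linspace(0.0, 1.0, 200000, endpoint=False)  # uniform grid on [0,1)

for n in (10, 100, 1000):
    # x^n is dominated by 1 in L^1([0,1]); DCT: integrals -> integral of limit = 0
    dct_ok = np.mean(x**n)
    # n * 1_{(0,1/n)} -> 0 a.e. but has no integrable dominating function,
    # and indeed every integral equals 1: the DCT conclusion fails
    spike = np.mean(n * (x < 1.0 / n))
    print(f"n={n:5d}  int x^n dx = {dct_ok:.5f}   int n*1_(0,1/n) dx = {spike:.5f}")
```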



        Also, we get a generalization of the fundamental theorem of calculus using the Hardy-Littlewood maximal function:




Let $f\in L^1(\mathbb{R}^n,dx)$ and consider $$A_rf(x)=\frac{1}{m(B_r)}\int_{B_r(x)}f(y)\,dy,$$ where $r>0.$ Then $$\lim_{r\rightarrow 0} A_rf(x)=f(x)$$ a.e.




In fact, if $f\in L^p$, then $$\lim_{r\rightarrow 0}\frac{1}{m(B_r)}\int_{B_r(x)}|f(y)-f(x)|^p\,dy=0$$ for a.e. $x$.
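Here is a one-dimensional sketch of these shrinking averages (my own; the step function and the point $x_0=0.3$, which is a Lebesgue point of it, are assumptions for the demo):

```python
import numpy as np

# A_r f(x0) = (1/2r) * integral_{x0-r}^{x0+r} f(y) dy for f = 1_{y>0};
# as r -> 0 the averages converge to f(x0) = 1 at the Lebesgue point x0 = 0.3.
f = lambda y: (y > 0).astype(float)
x0 = 0.3
for r in (0.5, 0.25, 0.1, 0.01):
    y = np.linspace(x0 - r, x0 + r, 100001)
    print(f"r={r:5.2f}  A_r f(x0) = {np.mean(f(y)):.4f}")
```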



Something else of interest is swapping integrals themselves, which manifests in the theorems of Tonelli and Fubini. There are multiple versions, most of which involve objects that require definitions, so I'll just give a quick version, the Fubini-Tonelli theorem for complete measures:




Let $(X,M,\mu)$ and $(Y,N,\nu)$ be complete, $\sigma$-finite measure spaces, and let $(X\times Y,\mathcal{L},\lambda)$ be the completion of $(X\times Y,M\otimes N,\mu\times\nu).$ If $f$ is $\mathcal{L}$-measurable and $f\in L^1(X\times Y,\lambda),$ then $$\int f\,d\lambda=\iint f(x,y)\,d\mu(x)\,d\nu(y)=\iint f(x,y)\,d\nu(y)\,d\mu(x).$$
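Numerically, Fubini-Tonelli is the statement that the two iterated (discretized) integrals agree; a quick sketch of mine, with an arbitrarily chosen integrable integrand:

```python
import numpy as np

# Midpoint discretization of f(x,y) = x * exp(-x*y) on [0,1]^2; the two
# iterated sums approximate the same double integral, in either order.
n = 2000
x = (np.arange(n) + 0.5) / n
y = (np.arange(n) + 0.5) / n
F = x[:, None] * np.exp(-x[:, None] * y[None, :])
dx = dy = 1.0 / n
print((F.sum(axis=1) * dy).sum() * dx)   # integrate in y first, then x
print((F.sum(axis=0) * dx).sum() * dy)   # integrate in x first, then y
```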




        EDIT: I added in a version of Fubini/Tonelli and material related to Cauchy's theorem and the Cauchy integral formula. I am a bit busy today, but I will post about functional analysis tomorrow!



        Now, we'll move on to functional analysis. This is a pretty big topic, so I'm just going to pick a few things to talk about. We already had some discussion of $L^p$ spaces, and a lot of the remaining discussion will be related to them in some manner, making all of the discussion inherently reliant on the integral. For this reason, I'm going to stop outlining exactly when we're using integrals.



Recall that $L^p$ spaces constitute Banach spaces. It is important to know that if $p=2,$ then the space becomes a Hilbert space, which is a very nice structure and makes the use of $L^2$ extremely common in PDEs. If $\mu$ is $\sigma$-finite and the $\sigma$-algebra associated to $X$ is countably generated, then $L^2(X,\mu)$ is separable, allowing us to get an orthonormal basis. For example, $\{e^{in\theta}\}_{n\in\mathbb{Z}}$ is an orthonormal basis for $L^2(S^1,d\theta/2\pi)$ (see Fourier series).
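For instance, one can verify Parseval's identity for this basis numerically; a sketch of my own (the test function $e^{\cos\theta}$ is an arbitrary smooth choice):

```python
import numpy as np

# Expand f(theta) = exp(cos(theta)) in the basis {e^{i n theta}} of
# L^2(S^1, dtheta/2pi) and compare ||f||^2 with sum |c_n|^2 (Parseval).
N = 4096
theta = 2.0 * np.pi * np.arange(N) / N
f = np.exp(np.cos(theta))

# c_n = (1/2pi) * integral f(theta) e^{-i n theta} dtheta, via a periodic
# trapezoid rule; the coefficients decay fast, so |n| <= 20 suffices here.
coeffs = [np.mean(f * np.exp(-1j * n * theta)) for n in range(-20, 21)]

print(np.mean(np.abs(f)**2))              # ||f||^2 w.r.t. dtheta/2pi
print(sum(abs(c)**2 for c in coeffs))     # nearly identical
```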



Something that is very useful in PDEs is to define functions of operators, such as $\sqrt{-\Delta},$ where $\Delta$ is the Laplacian, and the integral arises naturally in various forms of functional calculus. One such example is the holomorphic functional calculus (I will discuss more later on). If we have a bounded linear operator $T$, a bounded set $\Omega\subset\mathbb{C}$ with smooth boundary containing the spectrum $\sigma(T)$ of $T$ in its interior, and a function $f$ holomorphic on a neighborhood of $\Omega,$ then we can define $$f(T)=\frac{1}{2\pi i}\int_{\partial\Omega} f(\zeta)R_\zeta\,d\zeta,$$ where $R_\zeta:=(\zeta-T)^{-1}$ is called the resolvent of $T$. This is one simple way of making rigorous the idea of functions of operators. The holomorphic functional calculus can be extremely useful, although functional calculus can be made more general (such as the Borel functional calculus).
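In finite dimensions the contour formula can be tested directly; here is a sketch of my own (a $2\times 2$ symmetric matrix and $f=\exp$, both arbitrary assumptions) comparing the contour integral against the spectral computation of $e^T$:

```python
import numpy as np

T = np.array([[2.0, 1.0],
              [1.0, 3.0]])              # spectrum ~ {1.38, 3.62}
center, radius, n = 2.5, 3.0, 400       # circle enclosing sigma(T)

# f(T) = (1/(2 pi i)) * contour integral of f(zeta) (zeta - T)^{-1} d zeta
fT = np.zeros((2, 2), dtype=complex)
for t in 2.0 * np.pi * np.arange(n) / n:
    zeta = center + radius * np.exp(1j * t)
    R = np.linalg.inv(zeta * np.eye(2) - T)          # the resolvent R_zeta
    fT += np.exp(zeta) * R * (1j * radius * np.exp(1j * t)) * (2.0 * np.pi / n)
fT /= 2j * np.pi

# compare with exp(T) from the spectral decomposition of the symmetric T
w, V = np.linalg.eigh(T)
print(np.max(np.abs(fT.real - V @ np.diag(np.exp(w)) @ V.T)))   # tiny
```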



One commonly-studied class of operators is the integral operators, given in the form $$Ku(x)=\int_X k(x,y)u(y)\,d\mu(y)$$ on a measure space $(X,\mu)$. These arise naturally in solving differential/integral equations. You'll notice, for example, that the fundamental solutions of some of the basic PDEs are given via convolution, in which case our solution involves integral operators. You may have heard of the Hilbert-Schmidt kernel theorem, which says the following:




If $T:L^2(X_1,\mu_1)\rightarrow L^2(X_2,\mu_2)$ is a Hilbert-Schmidt operator, then there exists $K\in L^2(X_1\times X_2,\mu_1\times\mu_2)$ so that $$(Tu,v)_{L^2}=\iint K(x_1,x_2)u(x_1)\overline{v(x_2)}\,d\mu_1(x_1)\,d\mu_2(x_2).$$




The converse also holds: given $K\in L^2(X_1\times X_2,\mu_1\times\mu_2)$, $T$ as written above defines a Hilbert-Schmidt operator, and it satisfies $\|T\|_{\text{HS}}=\|K\|_{L^2}.$ For a generalization to tempered distributions (which we'll talk about later), look up the Schwartz kernel theorem.
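A discretized sketch (mine; the kernel $k(x,y)=\min(x,y)$ is a standard Green's-function-type example, assumed for illustration) showing that the Hilbert-Schmidt norm is just the $L^2$ norm of the kernel:

```python
import numpy as np

# Discretize k(x,y) = min(x,y) on [0,1]^2. The Hilbert-Schmidt norm of the
# induced integral operator equals ||k||_{L^2}, which here is sqrt(1/6).
n = 1000
x = (np.arange(n) + 0.5) / n
dx = 1.0 / n
K = np.minimum(x[:, None], x[None, :])

print(np.sqrt(np.sum(K**2) * dx * dx))   # discrete ||k||_{L^2([0,1]^2)}
print(np.sqrt(1.0 / 6.0))                # exact: iint min(x,y)^2 dx dy = 1/6
```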



Another area of interest is semigroups, which (once again) have applications in differential equations. If we take a contraction semigroup $\{S(t)\}_{t\geq 0}$ on a real Banach space $X$ with infinitesimal generator $A$, then we have the following cool result:




If $\lambda>0,$ then $\lambda$ is in the resolvent set of $A$, denoted $\rho(A),$ and if $R_\lambda=(\lambda-A)^{-1}$ denotes the resolvent of $A$, then $$R_\lambda u=\int_0^\infty e^{-\lambda t} S(t)u\,dt$$ for $u\in X,$ and $\|R_\lambda\|\leq\frac{1}{\lambda}.$




That is, we can write the resolvent as the Laplace transform of the semigroup. From here, one can prove some major results, like the Hille-Yosida theorem for contraction semigroups:




Let $A$ be a closed, densely-defined linear operator on $X$. Then $A$ is the generator of a contraction semigroup $\{S(t)\}_{t\geq 0}$ if and only if $$(0,\infty)\subset\rho(A)\quad\text{and}\quad\|R_\lambda\|\leq\frac{1}{\lambda}$$ for $\lambda>0.$




        We can generalize past contraction semigroups to get a more general version of this theorem. Let me also state a related theorem, Stone's theorem:




If $A$ is self-adjoint, then $iA$ generates the unitary group $U(t)=e^{itA}.$ Conversely, if $\{U(t)\}_{t\geq 0}$ is a strongly continuous group of unitary operators, then there exists a self-adjoint operator $A$ so that $U(t)=e^{itA}.$




Semigroup theory has direct applications to PDEs. For example, if you have a parabolic equation, say $$\partial_t u+Lu=0$$ with appropriate boundary and initial conditions, where $L$ is uniformly elliptic, then you can apply semigroup theory to this problem to guarantee a unique solution given by $S(t)u(\cdot,0)$. This generalizes solving linear ODEs on finite-dimensional spaces,
\begin{align*}x'&=Ax\\
x(0)&=x_0,
\end{align*}
which have the solution $x(t)=e^{tA}x_0,$ where $\{e^{tA}\}_{t\geq 0}$ is a one-parameter semigroup.
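The resolvent-as-Laplace-transform identity above is easy to check in this finite-dimensional setting; a sketch of my own (the stable matrix $A$ and $\lambda=0.7$ are arbitrary assumptions):

```python
import numpy as np

A = np.array([[-1.0, 1.0],
              [0.0, -2.0]])             # eigenvalues -1, -2: e^{tA} decays
lam = 0.7

R_direct = np.linalg.inv(lam * np.eye(2) - A)        # R_lam = (lam - A)^{-1}

# R_lam as the Laplace transform of the semigroup: integral of
# e^{-lam t} e^{tA} dt, truncated at T and discretized at midpoints.
w, V = np.linalg.eig(A)
Vinv = np.linalg.inv(V)
expm = lambda t: (V * np.exp(w * t)) @ Vinv          # e^{tA} = V e^{tD} V^{-1}
dt, T = 1e-3, 40.0
ts = np.arange(0.0, T, dt) + dt / 2
R_laplace = sum(np.exp(-lam * t) * expm(t) for t in ts) * dt

print(np.max(np.abs(R_direct - R_laplace.real)))     # small discretization error
```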



As I mentioned earlier, Fourier analysis is certainly its own field, but it uses a lot of functional-analytic techniques, and it is monumentally important in PDEs. Let us first define the Fourier transform $\mathcal{F}:L^1(\mathbb{R}^n)\rightarrow L^\infty(\mathbb{R}^n)$ by
$$\mathcal{F}u(\xi)=\hat{u}(\xi):=(2\pi)^{-n/2}\int_{\mathbb{R}^n} u(x)e^{-ix\cdot\xi}\,dx.$$ We can also extend this to a bounded operator from $L^2$ to $L^2.$ For nice enough functions, the Fourier transform enjoys many useful properties, such as
$$D_\xi^\alpha(\mathcal{F}u)=\mathcal{F}((-x)^\alpha u)$$
and $$\mathcal{F}(D_x^\alpha u)=\xi^\alpha\mathcal{F}u,$$ where $D^\alpha=\frac{1}{i^{|\alpha|}}\partial^\alpha$ and $\alpha$ is a multi-index. It turns out that a very natural place to define this is the Schwartz space $\mathcal{S},$ on which $\mathcal{F}$ is a topological isomorphism. We have the famed Fourier inversion formula
$$u(x)=(2\pi)^{-n/2}\int_{\mathbb{R}^n}\hat{u}(\xi)e^{ix\cdot\xi}\,d\xi,$$ and from here, we can get the Plancherel theorem:
$$\|u\|^2_{L^2}=\|\mathcal{F}u\|^2_{L^2},$$ so $\mathcal{F}$ is an isometric isomorphism.
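A discrete analogue of Plancherel's theorem can be seen with the FFT (a sketch of mine, not from the original answer; the Gaussian sample is an arbitrary choice, and `norm="ortho"` makes the discrete transform unitary):

```python
import numpy as np

# With the unitary normalization, the FFT preserves the l^2 norm of a sampled
# signal, mirroring ||u||_{L^2} = ||Fu||_{L^2}.
n = 2048
x = np.linspace(-20.0, 20.0, n, endpoint=False)
u = np.exp(-x**2 / 2.0)

U = np.fft.fft(u, norm="ortho")
print(np.sum(np.abs(u)**2), np.sum(np.abs(U)**2))   # equal up to rounding
```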



We can define the Fourier transform on the dual space of $\mathcal{S},$ the space of tempered distributions (denoted $\mathcal{S}'$), via duality: $$\langle\hat{u},f\rangle=\langle u,\hat{f}\rangle,$$ where $u\in\mathcal{S}'$ and $f\in\mathcal{S}.$ Due to the algebra and calculus of distributions, one can extend all of the previously-listed properties to this setting. Now, we might as well mention a powerful and important generalization of the Hilbert-Schmidt kernel theorem, the Schwartz kernel theorem:




Let $A:\mathcal{S}\rightarrow\mathcal{S}'$ be a continuous linear map. Then there exists $K_A\in\mathcal{S}'(\mathbb{R}^n\times\mathbb{R}^n)$ so that for all $u,v\in\mathcal{S},$ $$\langle Au,v\rangle=\langle K_A,u\otimes v\rangle,$$ where $(u\otimes v)(x,y)=u(x)v(y)\in\mathcal{S}(\mathbb{R}^n\times\mathbb{R}^n).$




We sometimes abuse notation and write this as $Au(x)=\int K_A(x,y)u(y)\,dy,$ so that $$\langle Au,v\rangle=\iint K_A(x,y)v(x)u(y)\,dy\,dx.$$



We can use the Fourier transform to define another functional calculus via Fourier multipliers. Motivated by the Fourier transform of the Laplacian, we define $$f(D)u=\mathcal{F}^{-1}\left(f(|\xi|)\,\mathcal{F}u\right)$$ for a "nice" function $f$ (by "nice," I mean such that the above makes sense). This allows us to make sense of objects like $e^{t\Delta}$ or $\cos(t\sqrt{-\Delta}).$ The former is related to the fundamental solution of the heat equation and the latter to the fundamental solution of the wave equation (notice anything related to semigroups for the former?). Fourier multipliers generalize to pseudodifferential operators (which are also a generalization of singular integral operators), but I'd rather not get into this, since this section is already very long. Fourier multipliers can also be used to give a general definition of the Sobolev spaces, which are of fundamental importance in PDEs. We define these spaces, for any real $k$, as $$W^{k,p}(\mathbb{R}^n)=\left\{u\in\mathcal{S}'(\mathbb{R}^n):\mathcal{F}^{-1}\left(\langle\xi\rangle^k\mathcal{F}u\right)\in L^p(\mathbb{R}^n)\right\},$$ where $\langle\xi\rangle=\left(1+|\xi|^2\right)^{1/2}.$ These are Banach spaces, and when $p=2,$ the space is a Hilbert space. For this reason, we use the provocative notation $W^{k,2}=H^k.$
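Here's a sketch (my own; the grid size, domain, and Gaussian initial datum are assumptions) of applying $e^{t\Delta}$ as the Fourier multiplier $e^{-t|\xi|^2}$ in one dimension, where a Gaussian initial datum gives an exact solution to compare against:

```python
import numpy as np

n, L, t = 4096, 40.0, 0.5
x = np.linspace(-L / 2, L / 2, n, endpoint=False)
xi = 2.0 * np.pi * np.fft.fftfreq(n, d=L / n)      # angular frequencies

u0 = np.exp(-x**2 / 2.0)                            # Gaussian initial data
# apply the heat semigroup e^{t Delta} as the multiplier e^{-t |xi|^2}
u_t = np.fft.ifft(np.exp(-t * xi**2) * np.fft.fft(u0)).real

# exact heat flow of this Gaussian: variance 1 grows to 1 + 2t
exact = np.sqrt(1.0 / (1.0 + 2.0 * t)) * np.exp(-x**2 / (2.0 * (1.0 + 2.0 * t)))
print(np.max(np.abs(u_t - exact)))                  # small (periodic-domain error)
```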



Something very significant that I haven't discussed yet is spectral theory, so I'll say a bit about it now. This is another field with significant applications to PDEs (and quantum mechanics). Spectral theory is directly connected to the Hilbert space $L^2$, so the integral is slightly "hidden." Here is one version of the spectral theorem, which can be proven using the holomorphic functional calculus or the Fourier transform on $\mathcal{S}'$:




If $A$ is a bounded, self-adjoint operator on a Hilbert space $H$, then there exist a measure space $(X,\mathfrak{F},\mu)$, a unitary map $\Phi:H\rightarrow L^2(X,\mu),$ and $a\in L^\infty(X,\mu)$ so that $$\Phi A\Phi^{-1}f(x)=a(x)f(x)$$ for all $f\in L^2(X,\mu).$ Here, $a$ is real-valued, and $\|a\|_{L^\infty}=\|A\|.$




There is also a near-identical version for bounded unitary operators; the only difference is that $|a|=1$ on $X$. We can further extend to normal operators. The theorem also generalizes to unbounded operators:




If $A$ is an unbounded, self-adjoint operator on a separable Hilbert space $H$, then there exist a measure space $(X,\mu)$, a unitary map $\Phi:L^2(X,\mu)\rightarrow H$, and a real-valued, measurable function $a$ on $X$ such that $$\Phi^{-1}A\Phi f(x)=a(x)f(x)$$ for $\Phi f\in\mathcal{D}(A).$ If $f\in L^2(X,\mu),$ then $\Phi f\in\mathcal{D}(A)$ if and only if $af\in L^2(X,\mu).$




This gives us a new functional calculus: if $f:\mathbb{R}\rightarrow\mathbb{C}$ is Borel, then we can define $f(A)$ via $$\Phi^{-1}f(A)\Phi g(x)=f(a(x))g(x).$$ If $f$ is bounded and Borel, then we can define this for any $g\in L^2(X,\mu),$ and $f(A)$ will be a bounded operator on $H$. If not, then we can define $$\mathcal{D}(f(A))=\{\Phi g\in H: g\in L^2(X,\mu)\text{ and } f(a(x))g\in L^2(X,\mu)\}.$$
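In finite dimensions, the spectral theorem and the Borel functional calculus reduce to "diagonalize and apply $f$ to the eigenvalues"; a small sketch of my own (the symmetric matrix and $f(s)=e^{is}$ are chosen arbitrarily), which also ties back to Stone's theorem since $e^{iA}$ comes out unitary:

```python
import numpy as np

# A symmetric matrix is unitarily equivalent to multiplication by its
# (real) eigenvalues; the functional calculus applies f on the spectrum.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])
w, Phi = np.linalg.eigh(A)               # A = Phi @ diag(w) @ Phi.T

f = lambda s: np.exp(1j * s)             # a bounded Borel function
fA = Phi @ np.diag(f(w)) @ Phi.T         # f(A) in the multiplication picture

print(np.allclose(fA @ fA.conj().T, np.eye(3)))   # True: e^{iA} is unitary
```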



I think I'll stop here. PDE theory utilizes all of the above, but since it is an application of these subjects, rather than "pure analysis," I'm going to leave it out.



        TL;DR Limits are fundamental in analysis, and one such example of their use is in the definition of the integral, which is ubiquitous in the field.



        References:



        1. Gerald Folland: Real Analysis
        2. Elias M. Stein and Rami Shakarchi: Complex Analysis
        3. Michael Taylor: Introduction to Complex Analysis (link to pdf on his website: http://mtaylor.web.unc.edu/files/2018/04/complex.pdf)
        4. Michael Taylor: Measure Theory and Integration
        5. Michael Taylor: Partial Differential Equations I
        6. Michael Taylor: Partial Differential Equations II
        7. John Conway: A Course in Functional Analysis
        8. Lawrence Evans: Partial Differential Equations


EDIT: I've added in functional analysis (and, arguably, went a bit overboard) and listed some references, as well as tweaked some previous statements. I apologize in advance if there are typos; this was difficult to proofread!



        EDIT: I've added in a statement of the Schwartz kernel theorem, which I forgot to write before.






Comments:

• If you don't mind, please do go into detail. – Sandesh Jr, Jun 10 at 14:23

• @accumulation we can define limits in much more general settings, but its manifestation as a metric is fundamental to most core constructions in analysis. Also, I know very little about transfinite math, so I’ll elect to not make any claims about it. – cmk, Jun 10 at 21:35

• @SandeshJr I've edited my post to include functional analysis (and I referenced PDE throughout). I think I'm going to call it here! – cmk, Jun 12 at 12:14

• Man, I bet this has taken some time to write out. However, quite delightful to read! (+1) – mrtaurho, Jun 12 at 13:57

• Really great answer. And if you ever do end up writing a book, I would most definitely read it. – Sandesh Jr, Jun 13 at 13:54















        79
















        $begingroup$

        I think that I'd say that one of the underlying themes of analysis is, really, the limit. In pretty much every subfield of analysis, we spend a lot of time trying to control the size of certain quantities, with taking limits in mind. This is especially true in PDEs, when we consistently desire norm estimates on various quantities. Let's just discuss the "basics" of the "basic" subjects (standard topics in real, complex, measure theory, functional). I'm going to keep this discussion loose, since we could quickly get into a very drawn-out and detailed discussion.



        Real analysis is built on limits. Continuity, differentiability, integration, series, etc. all require the concept of limits. Complex analysis has a bit of a different "flavor" than the other core types, but it still requires limits to do pretty much everything. By Goursat's theorem, holomorphicity is equivalent to complex differentiability over a neighborhood- limits. Integration (and all that comes with it) and residue theory- limits. We can continue this for the entire subject (Laurent series, normal families, conformity, etc.). Lebesgue integration theory and the powerful theorems that come with it are primarily centered around, essentially, swapping limits, and much of measure theory is built around this. In functional analysis, we certainly have times where we can run into not metrizable (or even Hausdorff) topologies, but limits are still central. Many of the common types of spaces like Hilbert, Banach, and Frechet spaces all make use of a metric. We have things like the uniform boundedness principle, compact operators, spectral theory, semigroups, Fourier analysis (this is a field in its own right, of course, but it deals with a lot of functional analysis), and much more, all of which deal with limits (either explicitly or via objects related to previously-discussed material). A significant subfield of analysis is PDEs. As I said earlier, PDEs often deals with obtaining proper norm estimates on certain quantities in appropriate function spaces to prove e.g. existence and regularity of solutions, once again highly dependent on limit arguments (and, of course, the norms themselves are limit-dependent).



        Something else that I didn't touch on, but is important to discuss, is just how many modes of convergence we use. Some common types of convergence of sequences of functions and operators are pointwise convergence, uniform convergence, local uniform convergence, almost everywhere convergence, convergence in measure, $L^p$ convergence, (more generally) convergence in norm, weak convergence, weak star convergence, uniform operator convergence, strong operator convergence, weak operator convergence, etc. I didn't distinguish between convergence for operators and functions too much here, but it is important to do so e.g. weak star convergence is pointwise convergence for elements of the dual, but I listed them as separate.



        EDIT: The OP asked for some details. Of course, writing everything above in details would amount to me writing books! Instead of talking about everything, I'd like to talk about one pervasive concept in analysis that comes from limits- the integral. I'd like to note that much of the post deals with limits in many other ways as well, explicitly or otherwise. In real analysis, there are various equivalent ways that the integral is defined, but I'd like to use Riemann sums here: we say that a function is Riemann integrable on an interval $[a,b]$ if and only if there exists $Iin mathbbR$ so that




        $$I=lim_sumlimits_i=1^n f(t_i)(x_i-x_i-1),$$




        where $|P|$ denotes the size of the partition, where $t_iin [x_i-1,x_i].$ We call $I$ the integral, and we denote it as $$I=intlimits_a^b f(s)ds.$$ The integral of a continuous (limits!) function is related to the derivative (limits!) through the fundamental theorem of calculus:




        For $fin Cleft([a,b]right),$ a function $F$ satisfies $$F(x)-F(a)=intlimits_a^x f(s)ds$$ for any $xin [a,b]$ if and only if $F'=f$.




        As the name of the theorem states, this is pretty important. All of this generalizes appropriately to higher dimensions, but I won't discuss that here. Sometimes, integrating a function on its own can be hard, so we approximate it with easier functions (or sometimes, we have a sequence of functions tending to something, and we want to know about the limit and how it integrates). A major theorem in an introductory real analysis class is that if we have a sequence of Riemann integrable functions $(f_n)$ converging uniformly on $[a,b]$ to $f$, then $f$ is Riemann integrable, and we can swap the limit and integral. So, we can swap these two limits. We can do the same for a series of functions that converges uniformly.



        Okay, let's move on. In complex analysis, the integral is still of importance. Integrals in the complex plane are path integrals, which can be defined similarly. Complex analysis is centered on studying holomorphic functions, and a theorem of Morera relates this to the integral




        Let $g:OmegarightarrowmathbbC$ be continuous, and $$intlimits_gamma g(z)dz=0$$ whenever $gamma=partial R$ and $RsubsetOmega$ is a rectangle (with sides parallel to the real and imaginary axes). Then, $g$ is holomorphic.




        Cauchy's theorem states that the integral of a holomorphic function along a closed curve is zero. This can be used to prove the Cauchy integral formula:




        If $fin C^1(barOmega)$ is holomorphic on a bounded region $Omega$ with smooth boundary, then for any $zinOmega$, we have
        $$f(z)=frac12pi iintlimits_partialOmegafracf(zeta)zeta-zdzeta.$$




        Taking the derivative and iterating proves that holomorphic functions are smooth, and further we get that they have a power series expansion, where the coefficients given by integration. This all hinges on integration.



        The integral pops up in many other fundamental ways here. One is in the form of the mean-value property, which states that $$f(z_0)=frac12piintlimits_0^2pi f(z_0+re^itheta)dtheta$$ whenever $f$ is holomorphic on $Omega$ open and the closed disk centered $z_0$ of radius $r$ is contained in $Omega.$ We use the integral to prove other important theorems, such as the maximum modulus principle, Liouville's theorem, etc. We also use it to define a branch of the complex logarithm, to define the coefficients of a Laurent series, and to count zeros and poles of functions (argument principle). We also like to calculate various types of integrals in the complex plane where the integrand has singularities (often as a trick to calculate real integrals, which is especially relevant for calculating Fourier transforms). This uses the residue theorem, and residues are also calculated via taking limits. The theorem states that
        $$intlimits_partialOmegaf(z)dz=2pi isum_jtextRes_z_j(f),$$ where $f$ is holomorphic on an open set $Omega$, except at singularities $z_j,$ each of which has a relatively compact neighborhood on which $f$ has a Laurent series (the residue is the $(-1)$'th indexed coefficient, which are also integrals by construction of the Laurent series). I think that's enough about complex analysis.



        Now, let's talk a bit about measure theory. The Riemann integral is somewhat restrictive, so we generalize it to the Lebesgue integral (I have a post about the construction, see How to calculate an integral given a measure?). Note the involvement of limits in the post. If a function is Riemann integrable, then it is equivalent to its Lebesgue integral. We can define the Lebesgue integral on any measure space. Two of the biggest theorems are the monotone and dominated convergence theorems:




        If $f_jin L^1(X,mu)$, $0leq f_1(x)leq f_2(x)leq cdots,$ and $|f_j|_L^1leq C<infty,$ then $lim_j f_j(x)=f(x),$ with $fin L^1(X,mu),$ and $|f_j-f|_L^1rightarrow 0.$




        and




        If $f_jin L^1(X,mu)$ and $lim_j f_j(x)=f(x)$ $mu$-a.e., and there is an $Fin L^1(X,mu)$ so that $F$ dominates each $|f_j|$ pointwise $mu$-a.e., then $fin L^1(X,mu)$ and $|f_j-f|_L^1rightarrow 0.$




        We have immediate generalization to $L^p$ spaces, as well. These theorems are used extensively to prove things in measure theory, functional analysis, and PDEs. The dominated convergence theorem generalizes the result of using uniform convergence to swap limit and integral. We can use these to show that $L^p$ is complete for $pin [1,infty),$ in fact a Banach space, as these define norms. We show that if $p$ is in the range and $X$ is $sigma$-finite, then the dual of $L^p$ is $L^q$, where $1/p+1/q=1,$ and this functional is defined, wait for it, via integration. We're beginning to overlap a bit with functional analysis, so I'll switch gears a bit. We often use the integral to define linear functionals, and one such example is in the Riesz representation theorem. Here, we find that the dual of $C(X)$, where $X$ is a compact metric space, is the space of finite, signed measure of the Borel sigma algebra (Radon measures). In particular, to any bounded linear function $omega$ on $C(X)$, there exists a unique Radon measure $rho$ such that $$omega (f)=intlimits_x fdrho.$$



        Also, we get a generalization of the fundamental theorem of calculus using the Hardy-Littlewood maximal function:




        Let $fin L^1(mathbbR^n, dx)$ and consider $$A_rf(x)=frac1m(B_r)intlimits_B_r(x)f(y)dy,$$ where $r>0.$ Then, $$lim_rrightarrow 0 A_rf(x)=f(x)$$ a.e.




        In fact, if $fin L^p$, then $$lim_rrightarrow 0frac1m(B_r)intlimits_B_r(x)|f(y)-f(x)|^p dy=0$$ for a.e. $x$.



        Something of interest is swapping integrals themselves, which manifests in the theorems of Tonelli and Fubini. There are multiple versions, most of which have objects which require definitions, so I'll just give a quick version, the Fubini-Tonelli theorem for complete measures:




        Let $(X,M,mu)$ and $(Y,N,nu)$ be complete, $sigma$-finite measure spaces, and let $(Xtimes Y,mathcalL,lambda)$ be the completion of $(Xtimes Y,Motimes N, mutimesnu).$ If $f$ is $mathcalL$-measurable, and $fin L^1(Xtimes Y,lambda),$ then $$int fdlambda=intint f(x,y)dmu(x)dnu(y)=intint f(x,y)dnu(y)dmu(x).$$




        EDIT: I added in a version of Fubini/Tonelli and material related to Cauchy's theorem and the Cauchy integral formula. I am a bit busy today, but I will post about functional analysis tomorrow!



        Now, we'll move on to functional analysis. This is a pretty big topic, so I'm just going to pick a few things to talk about. We already had some discussion of $L^p$ spaces, and a lot of the remaining discussion will be related to them in some manner, making all of the discussion inherently reliant on the integral. For this reason, I'm going to stop outlining exactly when we're using integrals.



        Recall that $L^p$ spaces constitute Banach spaces. It is important to know that if $p=2,$ then the space becomes a Hilbert space, which is a very nice structure and makes the use of $L^2$ extremely common in PDEs. If $mu$ is $sigma$-finite and the $sigma$-algebra associated to $X$ is finitely-generated, then $L^2(X,mu)$ is separable, allowing us to get an orthonormal basis. For example, $e^i n theta_ninmathbbZ$ is an orthonormal basis for $L^2(S^1,dx/2pi)$ (see Fourier series).



        Something that is very useful in PDEs is to define functions of operators, such as $sqrt-Delta,$ where $Delta$ is the Laplacian, and the integral arises naturally in various forms of functional calculus. One such example is the holomorphic functional calculus (I will discuss more later on). If we have a bounded linear operator $T$, a bounded set $OmegasubsetmathbbC$ with smooth boundary containing the spectrum $sigma(T)$ of $T$ in the interior, and a function $f$ holomorphic on a neighborhood of $Omega,$ then we can define $$f(T)=frac12pi iintlimits_partialOmega f(zeta)R_zeta dzeta,$$ where $R_zeta:=(zeta-T)^-1$ is called the resolvent of $T$. This is one, simple way of making rigorous the idea of functions of operators. The holomorphic functional calculus can be extremely useful, although functional calculus can be made more general (such as the Borel functional calculus).



        One of the commonly-studied types of operators are integral operators, given in the form $$Ku(x)=intlimits_X k(x,y)u(y)dmu (y)$$ on a measure space $(X,mu)$. These arise naturally in solving differential/integral equations. You'll notice, for example, that the fundamental solutions of some of the basic PDEs are given via convolution, in which case our solution involves integral operators. You may have heard of the Hilbert-Schmidt kernel theorem, which says the following:




        If $T:L^2(X_1,mu_1)rightarrow L^2(X_2,mu_2)$ is a Hilbert-Schmidt operator, then there exist $Kin L^2(X_1times X_2, mu_1times mu_2)$ so that $$(Tu,v)_L^2=intint K(x_1,x_2)u (x_1)overlinev(x_2)dmu_1(x_1)dmu_2(x_2).$$




        The converse also holds: given $Kin L^2(X_1times X_2, mu_1times mu_2)$, then $T$ as written above defines a Hilbert-Schmidt operator, and it satisfies $|T|_textHS=|K|_L^2.$ For a generalization to tempered distributions (which we'll talk about later), look up the Schwartz kernel theorem.



        Another area of interest is semigroups, which (once again) have applications in differential equations. If we take a contraction semigroup $S(t)_tgeq 0$ on a real Banach space $X$ with infinitesimal generator $A$, then we have the following cool result:




        If $lambda>0,$ then $lambda$ is in the resolvent set of $A$, denoted $rho(A),$ and if $R_lambda=(lambda-A)^-1$ denotes the resolvent of $A$, then $$R_lambda u=intlimits_0^infty e^-lambda t S(t) u dt$$ for $uin X,$ and $|R_lambda|leqfrac1lambda.$




        This is, we can write the resolvent as the Laplace transform of the semigroup. From here, one can prove some major results, like the Hille-Yosida theorem for contraction semigroups:




        Let $A$ be a closed, densely-defined linear operator $X$. Then, $A$ is the generator of a contraction semigroup $S(t)_tgeq 0$ if and only if $$(0,infty)subsetrho(A)hspace.25intext and hspace.25in |R_lambda|leqfrac1lambda$$ for $lambda>0.$




        We can generalize past contraction semigroups to get a more general version of this theorem. Let me also state a related theorem, Stone's theorem:




        If $A$ is self-adjoint, then $iA$ generates the unitary group $U(t)=e^itA.$ Conversely, if $U(t)_tgeq 0$ is a semigroup of unitary operators, then there exists a self-adjoint operator $A$ so that $S(t)=e^itA.$




        Semigroup theory has direct application to PDEs. For example, if you have a parabolic equation, say $$partial_t u+Lu=0$$ with appropriate boundary and initial conditions, where $L$ is uniformly elliptic, then you can apply semigroup theory to this problem to guarantee a unique solution given by $S(t)u(x,0)$. This is a generalization of solving linear ODEs on finite dimensional spaces
        beginalign*x'&=Ax\
        x(0)&=x_0
        endalign*
        which have the solution $x(t)=e^tAx_0,$ and $e^tA_tgeq 0$ generate a one-parameter semigroup.



        As I mentioned earlier, Fourier analysis is certainly its own field, but it uses a lot of functional analytic techniques, and it is monumentally important in PDEs. Let us first define the Fourier transform $mathcalF:L^1(mathbbR^n)rightarrow L^infty (mathbbR^n)$ by
        $$mathcalF u(xi)=hatu(xi):=(2pi)^-n/2intlimits_mathbbR^n u(x)e^-ixcdot xi dx.$$ We can also extend this to a bounded operator from $L^2rightarrow L^2.$ For nice enough functions, the Fourier transform enjoys many useful properties, such as
        $$D_xi^alpha (mathcalF u)=mathcalF((-x)^alpha u)$$
        and $$mathcalF(D_x^alpha u)=xi^alpha mathcalF u,$$ where $D^alpha=frac1i^partial^alpha$ and $alpha$ is a multi-index. It turns out that a very natural place to define this is on the Schwartz space $mathcalS,$ where $mathcalF$ is a topological isomorphism. We have the famed Fourier inversion formula
        $$u(x)=(2pi )^-n/2intlimits_mathbbR^nhatu(xi)e^ixcdot xi dxi,$$ and from here, we can get the Plancheral
        theorem:
        $$|u|^2_L^2=|mathcalFu|^2_L^2,$$ so $mathcalF$ is an isometric isomorphism.



        We can define the Fourier transform on the dual space of $mathcalS,$ the space of tempered distributions (denoted $mathcalS'$), via duality: $$langlehatu,frangle=langle u,hatfrangle,$$ where $uinmathcalS'$ and $finmathcalS.$ Due to the algebra and calculus of distributions, one can extend all of the previously-listed properties to here. Now, we might as well mention a powerful and important generalization of the Hilbert-Schmidt kernel theorem, the Schwartz kernel theorem:




        Let $A:mathcalSrightarrowmathcalS'$ be a continuous linear map. Then, there exists $K_AinmathcalS'(mathbbR^ntimesmathbbR^n)$ so that for all $u,vinmathcalS,$ $$langle Au,vrangle=langle K_A, uotimes vrangle,$$ where $(uotimes v)(x,y)=u(x)v(y)in mathcalS(mathbbR^ntimesmathbbR^n).$




        We sometimes abuse notation and write this as $Au(x)=int K_A(x,y) u(y), dy,$ so that $$langle Au,vrangle=iint K_A(x,y) v(x)u(y), dydx.$$



        We can use the Fourier transform to define another functional calculus via Fourier multipliers. Motivated by the Fourier transform of the Laplacian, we define $$f(D)u=mathcalF^-1left(fleft(|xi|right)mathcalFuright),$$ for a "nice" function $f$ (by "nice," I mean so that the above makes sense). This allows us to make sense of objects like $e^tDelta$ or $cosleft(tsqrt-Deltaright).$ The former is related to the fundamental solution of the heat equation and the latter to the fundamental solution of the wave equation (notice anything related to semigroups for the former?). Fourier multipliers generalize to pseudodifferential operators (which are also a generalization of singular integral operators), but I'd rather not get into this, since this section is already very long. Fourier multipliers can be used to give a general definition of Sobolev spaces, which are of fundamental importance in PDEs, as well. We define these spaces, for any real $k$, as $$W^k,p(mathbbR^n)=leftlbrace uin mathcalS'(mathbbR^n): mathcalF^-1left(langle xirangle ^k mathcalF uright)in L^p(mathbbR^nrightrbrace,$$ where $langle xirangle=left(1+|xi|^2right)^1/2.$ These are Banach spaces, and when $p=2,$ it is a Hilbert space. For this reason, we use the provocative notation $W^k,2=H^k.$



        Something very significant that I haven't discussed yet is spectral theory, so I'll say a bit about it now. This is another field with signficant application to PDEs (and quantum mechanics). Spectral theory is directly connected to the Hilbert space $L^2$, so the integral is slightly "hidden." Here is one version of the spectral theorem, which can be proven using holomorphic functional calculus or the Fourier transform on $mathcalS'$:




        If $A$ is a bounded, self-adjoint operator on a Hilbert space $H$, then there exists a measure space $(X,mathfrakF,mu)$, a unitary map $Phi:Hrightarrow L^2(X,mu),$ and $ain L^infty (X,mu)$ so that $$Phi APhi^-1f(x)=a(x)f(x)$$ for all $fin L^2(X,mu).$ Here, $a$ is real-valued, and $|a|_L^infty=|A|.$




        There is also a near-identical version for bounded unitary operators; the only difference is that $|a|=1$ on $X$. We can further extend to normal operators. This also generalizes for unbounded operators:




        If $A$ is an unbounded, self-adjoint operator on a separable Hilbert space $H$, then there is a measure space $(X,mu)$, a unitary map $Phi:L^2(X,mu)rightarrow H$, and a real-valued, measurable function $a$ on $X$ such that $$Phi^-1APhi f(x)=a(x)f(x)$$ for $Phi fin mathcalD(A).$ If $fin L^2(X,mu),$ then $Phi finmathcalD(A)$ if and only if $afin L^2(X,mu).$




        This givens us a new functional calculus: If $f:mathbbRrightarrow mathbbC$ is Borel, then we can define $f(A)$ via $$Phi^-1f(A)Phi g(x)=f(a(x))g(x).$$ If $f$ is bounded and Borel, then we can define this for any $gin L^2(X,mu),$ and $f(A)$ will be a bounded operator on $H$. If not, then we can define $$mathcalD(f(A))=Phi gin H: gin L^2(X,mu)text and f(a(x))gin L^2(X,mu).$$



        I think I'll stop here. PDEs utilizes all of the above, but since it is an application of these subjects, rather than "pure analysis," I'm going to leave that out.



        TL;DR Limits are fundamental in analysis, and one such example of their use is in the definition of the integral, which is ubiquitous in the field.



        References:



        1. Gerald Folland: Real Analysis
        2. Elias M. Stein and Rami Shakarchi: Complex Analysis
        3. Michael Taylor: Introduction to Complex Analysis (link to pdf on his website: http://mtaylor.web.unc.edu/files/2018/04/complex.pdf)
        4. Michael Taylor: Measure Theory and Integration
        5. Michael Taylor: Partial Differential Equations I
        6. Michael Taylor: Partial Differential Equations II
        7. John Conway: A Course in Functional Analysis
        8. Lawrence Evans: Partial Differential Equations


        EDIT: I've added in functional analysis (and, arguably, went a bit overboard) and listed some references, as well as tweaked some previous statements. I apologize in advance if there are typos- this was difficult to proofread!



        EDIT: I've added in a statement of the Schwartz kernel theorem, which I forgot to write before.






        share|cite|improve this answer












        $endgroup$










        • 3




          $begingroup$
          If you don't mind, please do go into detail.
          $endgroup$
          – Sandesh Jr
          Jun 10 at 14:23






        • 1




          $begingroup$
          @accumulation we can define limits in much more general settings, but its manifestation as a metric is fundamental to most core constructions in analysis. Also, I know very little about transfinite math, so I’ll elect to not make any claims about it.
          $endgroup$
          – cmk
          Jun 10 at 21:35







        • 1




          $begingroup$
          @SandeshJr I've edited my post to include functional analysis (and I referenced PDE throughout). I think I'm going to call it here!
          $endgroup$
          – cmk
          Jun 12 at 12:14






        • 3




          $begingroup$
          Man, I bet this has taken some time to write out. However, quite delightful to read! (+1)
          $endgroup$
          – mrtaurho
          Jun 12 at 13:57






        • 3




          $begingroup$
          Really great answer. And if you ever do end up writing a book, I would most definitely read it.
          $endgroup$
          – Sandesh Jr
          Jun 13 at 13:54













        79














        79










        79







        $begingroup$

        I think that I'd say that one of the underlying themes of analysis is, really, the limit. In pretty much every subfield of analysis, we spend a lot of time trying to control the size of certain quantities, with taking limits in mind. This is especially true in PDEs, when we consistently desire norm estimates on various quantities. Let's just discuss the "basics" of the "basic" subjects (standard topics in real, complex, measure theory, functional). I'm going to keep this discussion loose, since we could quickly get into a very drawn-out and detailed discussion.



        Real analysis is built on limits. Continuity, differentiability, integration, series, etc. all require the concept of limits. Complex analysis has a bit of a different "flavor" than the other core types, but it still requires limits to do pretty much everything. By Goursat's theorem, holomorphicity is equivalent to complex differentiability over a neighborhood- limits. Integration (and all that comes with it) and residue theory- limits. We can continue this for the entire subject (Laurent series, normal families, conformity, etc.). Lebesgue integration theory and the powerful theorems that come with it are primarily centered around, essentially, swapping limits, and much of measure theory is built around this. In functional analysis, we certainly have times where we can run into not metrizable (or even Hausdorff) topologies, but limits are still central. Many of the common types of spaces like Hilbert, Banach, and Frechet spaces all make use of a metric. We have things like the uniform boundedness principle, compact operators, spectral theory, semigroups, Fourier analysis (this is a field in its own right, of course, but it deals with a lot of functional analysis), and much more, all of which deal with limits (either explicitly or via objects related to previously-discussed material). A significant subfield of analysis is PDEs. As I said earlier, PDEs often deals with obtaining proper norm estimates on certain quantities in appropriate function spaces to prove e.g. existence and regularity of solutions, once again highly dependent on limit arguments (and, of course, the norms themselves are limit-dependent).



        Something else that I didn't touch on, but is important to discuss, is just how many modes of convergence we use. Some common types of convergence of sequences of functions and operators are pointwise convergence, uniform convergence, local uniform convergence, almost everywhere convergence, convergence in measure, $L^p$ convergence, (more generally) convergence in norm, weak convergence, weak star convergence, uniform operator convergence, strong operator convergence, weak operator convergence, etc. I didn't distinguish between convergence for operators and functions too much here, but it is important to do so e.g. weak star convergence is pointwise convergence for elements of the dual, but I listed them as separate.



        EDIT: The OP asked for some details. Of course, writing everything above in details would amount to me writing books! Instead of talking about everything, I'd like to talk about one pervasive concept in analysis that comes from limits- the integral. I'd like to note that much of the post deals with limits in many other ways as well, explicitly or otherwise. In real analysis, there are various equivalent ways that the integral is defined, but I'd like to use Riemann sums here: we say that a function is Riemann integrable on an interval $[a,b]$ if and only if there exists $Iin mathbbR$ so that




        $$I=lim_sumlimits_i=1^n f(t_i)(x_i-x_i-1),$$




        where $|P|$ denotes the size of the partition, where $t_iin [x_i-1,x_i].$ We call $I$ the integral, and we denote it as $$I=intlimits_a^b f(s)ds.$$ The integral of a continuous (limits!) function is related to the derivative (limits!) through the fundamental theorem of calculus:




        For $fin Cleft([a,b]right),$ a function $F$ satisfies $$F(x)-F(a)=intlimits_a^x f(s)ds$$ for any $xin [a,b]$ if and only if $F'=f$.




        As the name of the theorem states, this is pretty important. All of this generalizes appropriately to higher dimensions, but I won't discuss that here. Sometimes, integrating a function on its own can be hard, so we approximate it with easier functions (or sometimes, we have a sequence of functions tending to something, and we want to know about the limit and how it integrates). A major theorem in an introductory real analysis class is that if we have a sequence of Riemann integrable functions $(f_n)$ converging uniformly on $[a,b]$ to $f$, then $f$ is Riemann integrable, and we can swap the limit and integral. So, we can swap these two limits. We can do the same for a series of functions that converges uniformly.



        Okay, let's move on. In complex analysis, the integral is still of importance. Integrals in the complex plane are path integrals, which can be defined similarly. Complex analysis is centered on studying holomorphic functions, and a theorem of Morera relates this to the integral




        Let $g:OmegarightarrowmathbbC$ be continuous, and $$intlimits_gamma g(z)dz=0$$ whenever $gamma=partial R$ and $RsubsetOmega$ is a rectangle (with sides parallel to the real and imaginary axes). Then, $g$ is holomorphic.




        Cauchy's theorem states that the integral of a holomorphic function along a closed curve is zero. This can be used to prove the Cauchy integral formula:




        If $fin C^1(barOmega)$ is holomorphic on a bounded region $Omega$ with smooth boundary, then for any $zinOmega$, we have
        $$f(z)=frac12pi iintlimits_partialOmegafracf(zeta)zeta-zdzeta.$$




        Taking the derivative and iterating proves that holomorphic functions are smooth, and further we get that they have a power series expansion, where the coefficients given by integration. This all hinges on integration.



        The integral pops up in many other fundamental ways here. One is in the form of the mean-value property, which states that $$f(z_0)=frac12piintlimits_0^2pi f(z_0+re^itheta)dtheta$$ whenever $f$ is holomorphic on $Omega$ open and the closed disk centered $z_0$ of radius $r$ is contained in $Omega.$ We use the integral to prove other important theorems, such as the maximum modulus principle, Liouville's theorem, etc. We also use it to define a branch of the complex logarithm, to define the coefficients of a Laurent series, and to count zeros and poles of functions (argument principle). We also like to calculate various types of integrals in the complex plane where the integrand has singularities (often as a trick to calculate real integrals, which is especially relevant for calculating Fourier transforms). This uses the residue theorem, and residues are also calculated via taking limits. The theorem states that
        $$intlimits_partialOmegaf(z)dz=2pi isum_jtextRes_z_j(f),$$ where $f$ is holomorphic on an open set $Omega$, except at singularities $z_j,$ each of which has a relatively compact neighborhood on which $f$ has a Laurent series (the residue is the $(-1)$'th indexed coefficient, which are also integrals by construction of the Laurent series). I think that's enough about complex analysis.



        Now, let's talk a bit about measure theory. The Riemann integral is somewhat restrictive, so we generalize it to the Lebesgue integral (I have a post about the construction, see How to calculate an integral given a measure?). Note the involvement of limits in the post. If a function is Riemann integrable, then it is equivalent to its Lebesgue integral. We can define the Lebesgue integral on any measure space. Two of the biggest theorems are the monotone and dominated convergence theorems:




        If $f_jin L^1(X,mu)$, $0leq f_1(x)leq f_2(x)leq cdots,$ and $|f_j|_L^1leq C<infty,$ then $lim_j f_j(x)=f(x),$ with $fin L^1(X,mu),$ and $|f_j-f|_L^1rightarrow 0.$




        and




        If $f_jin L^1(X,mu)$ and $lim_j f_j(x)=f(x)$ $mu$-a.e., and there is an $Fin L^1(X,mu)$ so that $F$ dominates each $|f_j|$ pointwise $mu$-a.e., then $fin L^1(X,mu)$ and $|f_j-f|_L^1rightarrow 0.$




        We have immediate generalization to $L^p$ spaces, as well. These theorems are used extensively to prove things in measure theory, functional analysis, and PDEs. The dominated convergence theorem generalizes the result of using uniform convergence to swap limit and integral. We can use these to show that $L^p$ is complete for $pin [1,infty),$ in fact a Banach space, as these define norms. We show that if $p$ is in the range and $X$ is $sigma$-finite, then the dual of $L^p$ is $L^q$, where $1/p+1/q=1,$ and this functional is defined, wait for it, via integration. We're beginning to overlap a bit with functional analysis, so I'll switch gears a bit. We often use the integral to define linear functionals, and one such example is in the Riesz representation theorem. Here, we find that the dual of $C(X)$, where $X$ is a compact metric space, is the space of finite, signed measure of the Borel sigma algebra (Radon measures). In particular, to any bounded linear function $omega$ on $C(X)$, there exists a unique Radon measure $rho$ such that $$omega (f)=intlimits_x fdrho.$$



        Also, we get a generalization of the fundamental theorem of calculus using the Hardy-Littlewood maximal function:




        Let $fin L^1(mathbbR^n, dx)$ and consider $$A_rf(x)=frac1m(B_r)intlimits_B_r(x)f(y)dy,$$ where $r>0.$ Then, $$lim_rrightarrow 0 A_rf(x)=f(x)$$ a.e.




        In fact, if $fin L^p$, then $$lim_rrightarrow 0frac1m(B_r)intlimits_B_r(x)|f(y)-f(x)|^p dy=0$$ for a.e. $x$.



        Something of interest is swapping integrals themselves, which manifests in the theorems of Tonelli and Fubini. There are multiple versions, most of which have objects which require definitions, so I'll just give a quick version, the Fubini-Tonelli theorem for complete measures:




        Let $(X,M,mu)$ and $(Y,N,nu)$ be complete, $sigma$-finite measure spaces, and let $(Xtimes Y,mathcalL,lambda)$ be the completion of $(Xtimes Y,Motimes N, mutimesnu).$ If $f$ is $mathcalL$-measurable, and $fin L^1(Xtimes Y,lambda),$ then $$int fdlambda=intint f(x,y)dmu(x)dnu(y)=intint f(x,y)dnu(y)dmu(x).$$




        EDIT: I added in a version of Fubini/Tonelli and material related to Cauchy's theorem and the Cauchy integral formula. I am a bit busy today, but I will post about functional analysis tomorrow!



        Now, we'll move on to functional analysis. This is a pretty big topic, so I'm just going to pick a few things to talk about. We already had some discussion of $L^p$ spaces, and a lot of the remaining discussion will be related to them in some manner, making all of the discussion inherently reliant on the integral. For this reason, I'm going to stop outlining exactly when we're using integrals.



        Recall that $L^p$ spaces constitute Banach spaces. It is important to know that if $p=2,$ then the space becomes a Hilbert space, which is a very nice structure and makes the use of $L^2$ extremely common in PDEs. If $mu$ is $sigma$-finite and the $sigma$-algebra associated to $X$ is finitely-generated, then $L^2(X,mu)$ is separable, allowing us to get an orthonormal basis. For example, $e^i n theta_ninmathbbZ$ is an orthonormal basis for $L^2(S^1,dx/2pi)$ (see Fourier series).



        Something that is very useful in PDEs is to define functions of operators, such as $sqrt-Delta,$ where $Delta$ is the Laplacian, and the integral arises naturally in various forms of functional calculus. One such example is the holomorphic functional calculus (I will discuss more later on). If we have a bounded linear operator $T$, a bounded set $OmegasubsetmathbbC$ with smooth boundary containing the spectrum $sigma(T)$ of $T$ in the interior, and a function $f$ holomorphic on a neighborhood of $Omega,$ then we can define $$f(T)=frac12pi iintlimits_partialOmega f(zeta)R_zeta dzeta,$$ where $R_zeta:=(zeta-T)^-1$ is called the resolvent of $T$. This is one, simple way of making rigorous the idea of functions of operators. The holomorphic functional calculus can be extremely useful, although functional calculus can be made more general (such as the Borel functional calculus).



        One of the commonly-studied types of operators are integral operators, given in the form $$Ku(x)=intlimits_X k(x,y)u(y)dmu (y)$$ on a measure space $(X,mu)$. These arise naturally in solving differential/integral equations. You'll notice, for example, that the fundamental solutions of some of the basic PDEs are given via convolution, in which case our solution involves integral operators. You may have heard of the Hilbert-Schmidt kernel theorem, which says the following:




        If $T:L^2(X_1,mu_1)rightarrow L^2(X_2,mu_2)$ is a Hilbert-Schmidt operator, then there exist $Kin L^2(X_1times X_2, mu_1times mu_2)$ so that $$(Tu,v)_L^2=intint K(x_1,x_2)u (x_1)overlinev(x_2)dmu_1(x_1)dmu_2(x_2).$$




        The converse also holds: given $Kin L^2(X_1times X_2, mu_1times mu_2)$, then $T$ as written above defines a Hilbert-Schmidt operator, and it satisfies $|T|_textHS=|K|_L^2.$ For a generalization to tempered distributions (which we'll talk about later), look up the Schwartz kernel theorem.



        Another area of interest is semigroups, which (once again) have applications in differential equations. If we take a contraction semigroup $S(t)_tgeq 0$ on a real Banach space $X$ with infinitesimal generator $A$, then we have the following cool result:




        If $lambda>0,$ then $lambda$ is in the resolvent set of $A$, denoted $rho(A),$ and if $R_lambda=(lambda-A)^-1$ denotes the resolvent of $A$, then $$R_lambda u=intlimits_0^infty e^-lambda t S(t) u dt$$ for $uin X,$ and $|R_lambda|leqfrac1lambda.$




        This is, we can write the resolvent as the Laplace transform of the semigroup. From here, one can prove some major results, like the Hille-Yosida theorem for contraction semigroups:




        Let $A$ be a closed, densely-defined linear operator $X$. Then, $A$ is the generator of a contraction semigroup $S(t)_tgeq 0$ if and only if $$(0,infty)subsetrho(A)hspace.25intext and hspace.25in |R_lambda|leqfrac1lambda$$ for $lambda>0.$




        We can generalize past contraction semigroups to get a more general version of this theorem. Let me also state a related theorem, Stone's theorem:




        If $A$ is self-adjoint, then $iA$ generates the unitary group $U(t)=e^itA.$ Conversely, if $U(t)_tgeq 0$ is a semigroup of unitary operators, then there exists a self-adjoint operator $A$ so that $S(t)=e^itA.$




Semigroup theory has direct application to PDEs. For example, if you have a parabolic equation, say $$\partial_t u+Lu=0$$ with appropriate boundary and initial conditions, where $L$ is uniformly elliptic, then you can apply semigroup theory to this problem to guarantee a unique solution given by $S(t)u(x,0)$. This is a generalization of solving linear ODEs on finite-dimensional spaces,
\begin{align*}
x'&=Ax\\
x(0)&=x_0,
\end{align*}
which have the solution $x(t)=e^{tA}x_0,$ where $\{e^{tA}\}_{t\geq 0}$ forms a one-parameter semigroup with generator $A$.
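To make this concrete (a standard example, stated here from memory): for the heat equation, take $L=-\Delta$ on $\mathbb{R}^n$. The semigroup $S(t)=e^{t\Delta}$ acts by convolution with the Gaussian heat kernel, $$e^{t\Delta}u_0(x)=\int\limits_{\mathbb{R}^n}(4\pi t)^{-n/2}e^{-|x-y|^2/4t}u_0(y)\,dy,$$ and $u(t,x)=e^{t\Delta}u_0(x)$ solves $\partial_t u-\Delta u=0$ with initial data $u_0$. Note that the semigroup is, once again, an integral operator.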



As I mentioned earlier, Fourier analysis is certainly its own field, but it uses a lot of functional analytic techniques, and it is monumentally important in PDEs. Let us first define the Fourier transform $\mathcal{F}:L^1(\mathbb{R}^n)\rightarrow L^\infty(\mathbb{R}^n)$ by
$$\mathcal{F}u(\xi)=\hat{u}(\xi):=(2\pi)^{-n/2}\int\limits_{\mathbb{R}^n}u(x)e^{-ix\cdot\xi}\,dx.$$ We can also extend this to a bounded operator from $L^2\rightarrow L^2.$ For nice enough functions, the Fourier transform enjoys many useful properties, such as
$$D_\xi^\alpha(\mathcal{F}u)=\mathcal{F}((-x)^\alpha u)$$
and $$\mathcal{F}(D_x^\alpha u)=\xi^\alpha\mathcal{F}u,$$ where $D^\alpha=\frac{1}{i^{|\alpha|}}\partial^\alpha$ and $\alpha$ is a multi-index. It turns out that a very natural place to define this is on the Schwartz space $\mathcal{S},$ where $\mathcal{F}$ is a topological isomorphism. We have the famed Fourier inversion formula
$$u(x)=(2\pi)^{-n/2}\int\limits_{\mathbb{R}^n}\hat{u}(\xi)e^{ix\cdot\xi}\,d\xi,$$ and from here, we can get the Plancherel theorem:
$$\|u\|_{L^2}^2=\|\mathcal{F}u\|_{L^2}^2,$$ so $\mathcal{F}$ is an isometric isomorphism.
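A quick example worth remembering (standard, and consistent with the $(2\pi)^{-n/2}$ normalization above): the Gaussian $u(x)=e^{-|x|^2/2}$ is its own Fourier transform, $\hat{u}(\xi)=e^{-|\xi|^2/2},$ which makes both the inversion formula and the Plancherel identity easy to verify by direct computation in this case.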



We can define the Fourier transform on the dual space of $\mathcal{S},$ the space of tempered distributions (denoted $\mathcal{S}'$), via duality: $$\langle\hat{u},f\rangle=\langle u,\hat{f}\rangle,$$ where $u\in\mathcal{S}'$ and $f\in\mathcal{S}.$ Due to the algebra and calculus of distributions, one can extend all of the previously-listed properties to here. Now, we might as well mention a powerful and important generalization of the Hilbert-Schmidt kernel theorem, the Schwartz kernel theorem:




Let $A:\mathcal{S}\rightarrow\mathcal{S}'$ be a continuous linear map. Then, there exists $K_A\in\mathcal{S}'(\mathbb{R}^n\times\mathbb{R}^n)$ so that for all $u,v\in\mathcal{S},$ $$\langle Au,v\rangle=\langle K_A,u\otimes v\rangle,$$ where $(u\otimes v)(x,y)=u(x)v(y)\in\mathcal{S}(\mathbb{R}^n\times\mathbb{R}^n).$




We sometimes abuse notation and write this as $Au(x)=\int K_A(x,y)u(y)\,dy,$ so that $$\langle Au,v\rangle=\iint K_A(x,y)v(x)u(y)\,dy\,dx.$$
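For instance (a standard illustration of why distributions are genuinely needed here): the identity map $A=\mathrm{id}$ has kernel $K_A(x,y)=\delta(x-y)$, a perfectly good tempered distribution on $\mathbb{R}^n\times\mathbb{R}^n$ that is not a function, so this situation is invisible to the Hilbert-Schmidt framework.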



We can use the Fourier transform to define another functional calculus via Fourier multipliers. Motivated by the Fourier transform of the Laplacian, we define $$f(D)u=\mathcal{F}^{-1}\left(f\left(|\xi|\right)\mathcal{F}u\right),$$ for a "nice" function $f$ (by "nice," I mean so that the above makes sense). This allows us to make sense of objects like $e^{t\Delta}$ or $\cos\left(t\sqrt{-\Delta}\right).$ The former is related to the fundamental solution of the heat equation and the latter to the fundamental solution of the wave equation (notice anything related to semigroups for the former?). Fourier multipliers generalize to pseudodifferential operators (which are also a generalization of singular integral operators), but I'd rather not get into this, since this section is already very long. Fourier multipliers can be used to give a general definition of Sobolev spaces, which are of fundamental importance in PDEs, as well. We define these spaces, for any real $k$, as $$W^{k,p}(\mathbb{R}^n)=\left\lbrace u\in\mathcal{S}'(\mathbb{R}^n):\mathcal{F}^{-1}\left(\langle\xi\rangle^k\mathcal{F}u\right)\in L^p(\mathbb{R}^n)\right\rbrace,$$ where $\langle\xi\rangle=\left(1+|\xi|^2\right)^{1/2}.$ These are Banach spaces, and when $p=2,$ we get a Hilbert space. For this reason, we use the provocative notation $W^{k,2}=H^k.$
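A sketch of how this gets used (standard facts, stated without proof): by Plancherel, the $p=2$ space carries the norm $$\|u\|_{H^k}=\left\|\langle\xi\rangle^k\hat{u}\right\|_{L^2},$$ and for integer $k\geq 0$ membership in $H^k$ is equivalent to $D^\alpha u\in L^2$ for all $|\alpha|\leq k$. Likewise, solving $\partial_t u=\Delta u$ on the Fourier side amounts to the multiplier $e^{-t|\xi|^2}$, i.e. $e^{t\Delta}u_0=\mathcal{F}^{-1}\left(e^{-t|\xi|^2}\mathcal{F}u_0\right),$ matching the heat semigroup from before.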



Something very significant that I haven't discussed yet is spectral theory, so I'll say a bit about it now. This is another field with significant application to PDEs (and quantum mechanics). Spectral theory is directly connected to the Hilbert space $L^2$, so the integral is slightly "hidden." Here is one version of the spectral theorem, which can be proven using holomorphic functional calculus or the Fourier transform on $\mathcal{S}'$:




If $A$ is a bounded, self-adjoint operator on a Hilbert space $H$, then there exists a measure space $(X,\mathfrak{F},\mu)$, a unitary map $\Phi:H\rightarrow L^2(X,\mu),$ and $a\in L^\infty(X,\mu)$ so that $$\Phi A\Phi^{-1}f(x)=a(x)f(x)$$ for all $f\in L^2(X,\mu).$ Here, $a$ is real-valued, and $\|a\|_{L^\infty}=\|A\|.$
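In finite dimensions this is just diagonalization (a standard remark): for a Hermitian matrix $A$ on $\mathbb{C}^n$, take $X=\{1,\dots,n\}$ with counting measure, let $\Phi$ map a vector to its coefficients in an orthonormal eigenbasis, and set $a(j)=\lambda_j$, the $j$-th eigenvalue. Then $\Phi A\Phi^{-1}$ is multiplication by $a$, and $\|a\|_{L^\infty}=\max_j|\lambda_j|=\|A\|.$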




There is also a near-identical version for bounded unitary operators; the only difference is that $|a|=1$ on $X$. We can further extend to normal operators. This also generalizes to unbounded operators:




If $A$ is an unbounded, self-adjoint operator on a separable Hilbert space $H$, then there is a measure space $(X,\mu)$, a unitary map $\Phi:L^2(X,\mu)\rightarrow H$, and a real-valued, measurable function $a$ on $X$ such that $$\Phi^{-1}A\Phi f(x)=a(x)f(x)$$ for $\Phi f\in\mathcal{D}(A).$ If $f\in L^2(X,\mu),$ then $\Phi f\in\mathcal{D}(A)$ if and only if $af\in L^2(X,\mu).$




This gives us a new functional calculus: If $f:\mathbb{R}\rightarrow\mathbb{C}$ is Borel, then we can define $f(A)$ via $$\Phi^{-1}f(A)\Phi g(x)=f(a(x))g(x).$$ If $f$ is bounded and Borel, then we can define this for any $g\in L^2(X,\mu),$ and $f(A)$ will be a bounded operator on $H$. If not, then we can define $$\mathcal{D}(f(A))=\left\lbrace\Phi g\in H: g\in L^2(X,\mu)\text{ and } f(a(x))g\in L^2(X,\mu)\right\rbrace.$$
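As a concrete instance (the usual first example, nothing beyond standard quantum mechanics): let $A$ be the position operator on $L^2(\mathbb{R})$, so $X=\mathbb{R}$, $\Phi=\mathrm{id}$, and $a(x)=x$. Then $f(A)$ is simply multiplication by $f(x)$; taking $f(x)=e^{itx}$ gives a unitary operator for each $t$, recovering the group $e^{itA}$ from Stone's theorem above.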



I think I'll stop here. PDE theory utilizes all of the above, but since it is an application of these subjects, rather than "pure analysis," I'm going to leave that out.



        TL;DR Limits are fundamental in analysis, and one such example of their use is in the definition of the integral, which is ubiquitous in the field.



        References:



        1. Gerald Folland: Real Analysis
        2. Elias M. Stein and Rami Shakarchi: Complex Analysis
        3. Michael Taylor: Introduction to Complex Analysis (link to pdf on his website: http://mtaylor.web.unc.edu/files/2018/04/complex.pdf)
        4. Michael Taylor: Measure Theory and Integration
        5. Michael Taylor: Partial Differential Equations I
        6. Michael Taylor: Partial Differential Equations II
        7. John Conway: A Course in Functional Analysis
        8. Lawrence Evans: Partial Differential Equations


EDIT: I've added in functional analysis (and, arguably, went a bit overboard) and listed some references, as well as tweaked some previous statements. I apologize in advance if there are typos; this was difficult to proofread!



        EDIT: I've added in a statement of the Schwartz kernel theorem, which I forgot to write before.






answered Jun 10 at 13:11 by cmk · edited Jul 24 at 21:20










• If you don't mind, please do go into detail. – Sandesh Jr, Jun 10 at 14:23
• @accumulation we can define limits in much more general settings, but its manifestation as a metric is fundamental to most core constructions in analysis. Also, I know very little about transfinite math, so I’ll elect to not make any claims about it. – cmk, Jun 10 at 21:35
• @SandeshJr I've edited my post to include functional analysis (and I referenced PDE throughout). I think I'm going to call it here! – cmk, Jun 12 at 12:14
• Man, I bet this has taken some time to write out. However, quite delightful to read! (+1) – mrtaurho, Jun 12 at 13:57
• Really great answer. And if you ever do end up writing a book, I would most definitely read it. – Sandesh Jr, Jun 13 at 13:54

























From my viewpoint, Real Analysis is a study of functions of one or several real variables. Everything else (limits, derivatives, integrals, infinite series, etc.) is a tool serving this purpose. [There is a mild exception one has to make here for sequences and series of real numbers/vectors; these are functions defined on the set of natural numbers and, sometimes, integers.] The theory of real numbers and limits was developed (in the 19th century) in order to make the study of functions rigorous. Note that the theory of limits and the "epsilonetics" is not the only way to go. The alternative (at least, the only alternative I know) is Nonstandard Analysis, which justifies the notion of infinitesimally small and infinitely large quantities used by Newton, Leibniz and others for (about) the first 150 years of existence of Real Analysis (before limits were introduced by Cauchy and, in the modern form, by Weierstrass).



        For instance, what is the purpose (or, rather, purposes) of computing derivatives of functions? It is to determine if the given function is increasing/decreasing/concave/convex or to approximate the given function by some polynomial (usually a polynomial of degree one).



        What is the purpose of computing limits? It is to determine "approximate" behavior of the function when the input variable is close to some (finite or infinite) value.



        What is the purpose of computing integrals? It is to compute length (of curves), areas (of surfaces), volumes (of solids), or to find solutions of differential equations (which are equations on functions involving some derivatives). In the geometric problems (lengths, areas and volumes) one computes a single number "measuring" the given function (say, the length of a curve).



        What is the purpose of computing Taylor (Fourier) series? It is to approximate functions with polynomials (or sums of trigonometric functions) which are (usually) easier to analyze than general smooth functions.



        This is how it was from the very beginning of Real Analysis (Newton, Leibnitz, Bernoulli, Euler and many others).






answered Jun 10 at 15:55 by Moishe Kohan · edited Jun 12 at 21:20














• "Real Analysis is a study of functions (of one or several) real variable." Some kind of little grammar snafu here, since if you take out the parenthetical statement it's missing any word between "functions" and "real." Did you mean, "Real Analysis is the study of functions of a real variable (or several variables)"? – jpmc26, Jun 12 at 21:14





























        Long ago someone told me this. I still remember it ...




Sometimes I find myself just pushing symbols around, and I wonder, "Am I really doing analysis?" But when an argument begins, "Let $\varepsilon > 0$," then I know it really is analysis.







answered Jun 11 at 13:05 by GEdgar










• "Let $\varepsilon>0$", I spot with my eyeball. / Analysis! That much is clear to all. / But much to my chagrin, / no limits were within; / the variable could've been integral. – Discrete lizard, Jun 11 at 17:54
• @Discretelizard Did you come up with that on the fly? I quite like it. – user3067860, Jun 11 at 19:10
• @user3067860 Yes, I did. Thanks! I first tried to let the punchline be that "Let $\epsilon>0$" is also a possible beginning of an argument about approximation algorithms, often little to do with analysis, but didn't get that to work and ended up with the above. – Discrete lizard, Jun 11 at 19:51
• Let $\varepsilon>0$ be a nonzero infinite ordinal ... – Gareth McCaughan, Jun 12 at 0:44



























I would say that the central concept of analysis is the concept of limit, specifically the limit of a sequence. Everything that uses concepts built on the concept of limit I would classify as analysis, not algebra. That includes the limit of a series, the limit of a function, continuity of a function, the derivative, and the Riemann integral. Then complex analysis emerges from complex algebra when you introduce the complex derivative and integration over curves. Functional analysis also depends on the concepts of continuity and integral; otherwise it would be just the algebra of infinite-dimensional spaces.






answered Jun 10 at 13:10 by Adam Latosiński












































One additional central aspect of analysis is approximation. After all, the very limit notation is nothing but a qualitative approximation.

At its core it means: "The more effort you put into your approximation, the more precise it will be".

For example, continuity gives you the guarantee that even if you miss the true input value of the function, your output value will still be "close". Just imagine if reading a number off a ruler weren't continuous - there'd be hardly any point in making the measurement in the first place!
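In symbols (just the standard $\varepsilon$-$\delta$ definition, which makes this guarantee precise): $f$ is continuous at $x_0$ if for every $\varepsilon>0$ there exists a $\delta>0$ such that $|x-x_0|<\delta$ implies $|f(x)-f(x_0)|<\varepsilon$; that is, a measurement within $\delta$ of the true input is guaranteed to give a reading within $\varepsilon$ of the true output.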



                Differentiability gives you an approximation of an interval, in which the function is monotone.



                Integration gives you an approximation of the mistake you make, when you approximate a function into segments.



                In many cases, analysis gives us results that the difference of an approximation to the original object goes to 0 in the limit.

                If we look at this from the practical point of view, where everything has an error term, this is all the justification we need to swap out the original with the approximation.

                (Yes, if you have a calculation that isn't e.g. continuous, you'll get rubbish - but as any representation of the original object would have an error, you'd always anyway get rubbish)
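Read quantitatively, the continuity guarantee described above is the $\varepsilon$–$\delta$ definition: $f$ is continuous at $x_0$ when

$$\forall \varepsilon > 0 \;\, \exists \delta > 0 : \; |x - x_0| < \delta \implies |f(x) - f(x_0)| < \varepsilon,$$

where $\varepsilon$ is the output tolerance you can accept and $\delta$ is the input accuracy (the "effort") required to achieve it.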
I like this idea. In a sense, I have always thought of differential calculus as the idea that the information on a local neighbourhood is encoded at a given point in the form of the derivatives and the associated Taylor series (for analytic functions). So this idea resonates with me. – Sandesh Jr, Jun 13 at 11:06
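Stated explicitly, the Taylor-series picture from this comment: for an analytic function $f$, all local information near a point $x_0$ is encoded in the derivatives at that single point, via

$$f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(x_0)}{n!} \, (x - x_0)^n$$

for all $x$ in some neighbourhood of $x_0$.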
6

My impression is that analysis stands largely in contrast to finite/discrete mathematics, and thus deals with continuous spaces, especially the real line. This is then generalised to spaces equipped with a metric, a measure, and/or a topology.

answered Jun 10 at 21:30 by Acccumulation
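As one illustration of that generalisation: a metric space replaces the distance $|x - y|$ on the real line by any function $d : X \times X \to [0, \infty)$ satisfying

$$d(x,y) = 0 \iff x = y, \qquad d(x,y) = d(y,x), \qquad d(x,z) \le d(x,y) + d(y,z),$$

which is exactly enough structure for limits and continuity to keep making sense.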
4

Mathematical analysis is a mental edifice built up to describe and understand phenomena of geometry, physics, and technology in terms of formulas involving finite mathematical expressions. At the core of it all is the study of functions $f : \mathbb{R} \to \mathbb{R}$ and their properties.

answered Jun 10 at 18:16 by Christian Blatter