Examples of “unsuccessful” theories with afterlives
I am looking for examples of mathematical theories which were introduced with a certain goal in mind, which failed to achieve this goal, but which nevertheless developed on their own and continued to be studied for other reasons.
Here is a prominent example I know of:
Lie theory: It is my understanding that Lie introduced Lie groups with the idea that they would help in solving differential equations (I guess, by consideration of the symmetries of these equations). While symmetry techniques for differential equations to some extent continue to be studied (see differential Galois theory), they remain far from the mainstream of DE research. But of course Lie theory is nonetheless now seen as a central topic in mathematics.
Are there some other examples along these lines?
soft-question ho.history-overview big-list
6
Does the proof of Feit-Thompson really eschew representation theory? I don't know much about the area, but the linked wiki page gives quite the opposite impression.
– lambda, Sep 18 at 22:34
29
Non-Euclidean geometry was initially developed in hopes of deriving the parallel postulate from the other axioms of Euclidean geometry. Nowadays it describes two of the basic model geometries in Riemannian geometry - the sphere and hyperbolic space.
– Terry Tao, Sep 18 at 22:38
16
Many parts of graph theory originated or were stimulated in attempts to tackle the 4-colour conjecture. Perhaps the chromatic polynomial is the principal example of something explicitly introduced for the 4CC, which played no role in its solution, but which then went on to be significant in the study of phase transitions in the Potts model.
– Gordon Royle, Sep 18 at 23:45
5
I think there are plenty of results in differential equations which use Lie theory in a non-trivial way. Ask a physicist about the Schrödinger equation, the hydrogen atom and the representation theory of SO(3).
– Thomas Rot, Sep 19 at 10:27
2
Particularly relevant: "String Theory and Math: Why This Marriage May Last" by Mina Aganagic, arxiv.org/abs/1508.06642. Camps have been formed by supporters and detractors, fusillades fired, but no doubt string theory has invigorated large branches of math and given rise to offshoots.
– Tom Copeland, Sep 19 at 17:20
edited Sep 19 at 10:22
community wiki
Sam Hopkins
15 Answers
I quote at length from the Wikipedia essay on the history of knot theory:
In 1867 after observing Scottish physicist Peter Tait's experiments involving smoke rings, Thomson came to the idea that atoms were knots of swirling vortices in the æther. Chemical elements would thus correspond to knots and links. Tait's experiments were inspired by a paper of Helmholtz's on vortex-rings in incompressible fluids. Thomson and Tait believed that an understanding and classification of all possible knots would explain why atoms absorb and emit light at only the discrete wavelengths that they do. For example, Thomson thought that sodium could be the Hopf link due to its two lines of spectra.
Tait subsequently began listing unique knots in the belief that he was creating a table of elements. He formulated what are now known as the Tait conjectures on alternating knots. (The conjectures were proved in the 1990s.) Tait's knot tables were subsequently improved upon by C. N. Little and Thomas Kirkman.
James Clerk Maxwell, a colleague and friend of Thomson's and Tait's, also developed a strong interest in knots. Maxwell studied Listing's work on knots. He re-interpreted Gauss' linking integral in terms of electromagnetic theory. In his formulation, the integral represented the work done by a charged particle moving along one component of the link under the influence of the magnetic field generated by an electric current along the other component. Maxwell also continued the study of smoke rings by considering three interacting rings.
When the luminiferous æther was not detected in the Michelson–Morley experiment, vortex theory became completely obsolete, and knot theory ceased to be of great scientific interest. Modern physics demonstrates that the discrete wavelengths depend on quantum energy levels.
"The modern study of knots grew out an attempt by three 19th-century Scottish
physicists to apply knot theory to fundamental questions about the universe".
3
You beat me by 16 seconds.
– Gerry Myerson, Sep 19 at 4:15
3
@GerryMyerson well, there was a bit more typing in yours...
– Nik Weaver, Sep 19 at 11:04
5
Not really, it was all cut'n'paste from Wikipedia.
– Gerry Myerson, Sep 19 at 11:51
2
Perhaps the same thing will happen with string theory: in the future it may be remembered as something which gave birth to a lot of interesting mathematics, but without having direct physical relevance.
– Tom, Sep 20 at 12:42
1
Links tend to break over time, so it is best to give the title and author of any linked paper: "Knot Theory's Odd Origins" by Silver.
– Tom Copeland, Oct 5 at 21:10
Motives and the standard conjectures were developed by Grothendieck to prove the last of the Weil conjectures. They failed at this: none of the standard conjectures has been proven, and despite some progress I would say we are no closer to proving the Weil conjectures via the standard conjectures today than when they were first formulated. Meanwhile, Deligne showed that Grothendieck's earlier invention of étale cohomology was perfectly sufficient to prove the Weil conjectures.
However, since that time different notions of motive were constructed, with different useful properties, in addition to Grothendieck's, and many of them have found applications in areas of algebraic geometry and number theory, with the first really big one being Voevodsky's Fields medal-winning proof of the Milnor conjecture.
String Theory!
String theory was born in the 1960s–70s in the context of strong interactions inside atomic nuclei. The theory turned out to be unsuited to describing the strong force, and was supplanted around 1973 by the rising quantum chromodynamics (our current best model of strong interactions). Among the reasons for the failure was the mandatory presence of unwanted spin-2 particles...
Those particles are now interpreted as gravitons! And string theory is now seen as a theory of quantum gravity, describing all the known forces (electromagnetic, weak, strong) and gravity at the same time! That's a pretty big afterlife!
24
The "is describing" in your final paragraph is a trifle optimistic. Nobody has yet produced a string theory that describes the standard model in 4 dimensions as a low-energy limit.
– Robert Furber, Sep 19 at 15:32
5
Oh yes, those are the hidden terms and conditions of those words.
– Rexcirus, Sep 19 at 15:47
@RobertFurber That's not true. The problem is rather that there are too many ways to produce the Standard Model in string theory. As an example, take a look at arxiv.org/abs/1903.00009
– Vigod, Sep 28 at 2:08
(Converted from a comment to an answer as requested.)
Non-Euclidean geometry was initially developed in hopes of deriving the parallel postulate from the other axioms of Euclidean geometry, as can be seen in particular through the pioneering work of Saccheri in this area, who tried in vain to prove the parallel postulate by contradiction and ended up proving a large number of foundational results in what we would now call elliptic and hyperbolic geometry as a consequence. (See for instance this article of Fitzpatrick, or this McTutor article on Non-Euclidean geometry.)
Nowadays, the classical non-Euclidean geometries (the elliptic geometry of the sphere, and the hyperbolic geometry of hyperbolic space) play the important role of describing two of the basic model geometries in Riemannian geometry, namely the simply connected geometries of constant and isotropic positive or negative curvature respectively. (In two dimensions, where Riemann curvature is effectively a scalar quantity, these two geometries, together with Euclidean geometry, are the only models needed; in higher dimensions there are however other model geometries of interest, such as the remaining five Thurston geometries of the geometrisation conjecture in three dimensions.)
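For concreteness (an illustration added here, not part of the original answer), the three two-dimensional model geometries of constant curvature $K = +1, 0, -1$ can be presented, in one standard choice of coordinates, by the metrics

```latex
ds^2_{S^2} = d\theta^2 + \sin^2\theta \, d\varphi^2, \qquad
ds^2_{\mathbb{E}^2} = dx^2 + dy^2, \qquad
ds^2_{\mathbb{H}^2} = \frac{dx^2 + dy^2}{y^2} \quad (y > 0),
```

the unit sphere, the Euclidean plane, and the hyperbolic upper half-plane respectively.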
Great example of how a beautiful theory can be in the air until it finally sharply crystallizes in the minds of innovators looking at it from a radical angle.
– Tom Copeland, Sep 25 at 13:40
This is some history of the origins of free probability by Dan Voiculescu, extracted from a response by Roland Speicher, a developer of the field, to an MO question:
This is from his article "Background and Outlook" in the Lecture Notes "Free Probability and Operator Algebras"; see http://www.ems-ph.org/books/book.php?proj_nr=208
Just before starting in this new direction, I had worked with Mihai Pimsner, computing the K-theory of the reduced $C^*$-algebras of free groups. From the K-theory work I had acquired a taste for operator algebras associated with free groups and I became interested in a famous problem about the von Neumann algebras $L(\mathbb{F}_n)$ generated by the left regular representations of free groups, which appears in Kadison's Baton Rouge problem list. The problem, which may have already been known to Murray and von Neumann, is: are $L(\mathbb{F}_m)$ and $L(\mathbb{F}_n)$ non-isomorphic if $m \neq n$?
This is still an open problem. Fortunately, after trying in vain to solve it, I realized it was time to be more humble and to ask: is there anything I can do, which may be useful in connection with this problem? Since I had come across computations of norms and spectra of certain convolution operators on free groups (i.e., elements of $L(\mathbb{F}_n)$), I thought of finding ways to streamline some of these computations and perhaps be able to compute more complicated examples. This, of course, meant computing expectations of powers of such operators with respect to the von Neumann trace-state $\tau(T) = \langle T e_e, e_e \rangle$, $e_g$ being the canonical basis of the $\ell^2$-space.
The key remark I made was that if $T_1$, $T_2$ are convolution operators on $\mathbb{F}_m$ and $\mathbb{F}_n$, then the operator on $\mathbb{F}_{m+n} = \mathbb{F}_m \ast \mathbb{F}_n$ which is $T_1 + T_2$ has moments $\tau((T_1 + T_2)^p)$ which depend only on the moments $\tau(T_j^k)$, $j = 1, 2$, but not on the actual $T_1$ and $T_2$. This was like the addition of independent random variables, only classical independence had to be replaced by a notion of free independence, which led to a free central limit theorem, a free analogue of the Gaussian functor, free convolution, an abstract existence theorem for one variable free cumulants, etc.
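Voiculescu's remark that $\tau((T_1+T_2)^p)$ depends only on the individual moments can be illustrated numerically (my own sketch, not from Speicher's text): large random matrices in uniformly random relative position behave approximately like free random variables. Below, two matrices with $\pm 1$ spectra are added after conjugating one by a random orthogonal matrix; the moments of the sum approach those of the arcsine law, the free convolution of two symmetric Bernoulli distributions, with $m_2 = 2$ and $m_4 = 6$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Two "freely independent" self-adjoint elements, approximated by large
# matrices: both have spectrum {+1, -1} (half each), but sit in a
# uniformly random relative position.
spec = np.diag(np.repeat([1.0, -1.0], n // 2))
q, _ = np.linalg.qr(rng.standard_normal((n, n)))  # approximately Haar orthogonal
a = spec
b = q @ spec @ q.T

s = a + b
m2 = np.trace(s @ s) / n            # should approach 2 = tau(T1^2) + tau(T2^2)
m4 = np.trace(s @ s @ s @ s) / n    # arcsine law predicts 6
```

The point is that `m2` and `m4` are (to leading order in $n$) determined by the spectra of `a` and `b` alone, not by the particular random rotation `q`.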
Good intro to the topic: "Three lectures on free probability" by Jonathan Novak and Michael LaCroix, arxiv.org/abs/1205.2097
– Tom Copeland, Sep 18 at 23:55
1
More here: bcc.impan.pl/15TQG/uploads/pdf/Speicher-Post.pdf
– JP McCarthy, Sep 19 at 12:53
2
i.e., "Compact Quantum Groups and Free Combinatorics", a presentation by Roland Speicher
– Tom Copeland, Sep 19 at 14:58
Quaternion multiplication was introduced for use in physics, for purposes for which cross products of vectors came to be used instead and have been used ever since.
But today quaternions are used in computer graphics, and I suspect they also have other applications.
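The computer-graphics use can be sketched as follows (a minimal illustration, not any particular library's API): a rotation by angle $\theta$ about a unit axis $\mathbf{u}$ is encoded by the unit quaternion $q = (\cos\frac{\theta}{2}, \sin\frac{\theta}{2}\,\mathbf{u})$, which acts on a vector $v$ by $v \mapsto q v \bar{q}$.

```python
import math

def qmul(a, b):
    """Hamilton product of quaternions given as (w, x, y, z) tuples."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def rotate(v, axis, angle):
    """Rotate the 3-vector v about a unit axis by `angle` radians via q v q*."""
    s, c = math.sin(angle / 2), math.cos(angle / 2)
    q = (c, s * axis[0], s * axis[1], s * axis[2])
    qc = (q[0], -q[1], -q[2], -q[3])  # conjugate = inverse for a unit quaternion
    w = qmul(qmul(q, (0.0, *v)), qc)  # embed v as a pure quaternion
    return w[1:]

# A 90-degree rotation about the z-axis sends the x-axis to the y-axis.
vx = rotate((1.0, 0.0, 0.0), (0.0, 0.0, 1.0), math.pi / 2)
```

Composing two rotations is just one quaternion multiplication, which is one reason graphics code prefers quaternions to rotation matrices for interpolating orientations.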
9
I don't think quaternions were unsuccessful---they successfully provide an algebraic realization of rotations in 3-space. It just happens that this approach was to a large extent superseded by vectors and linear algebra.
– Kimball, Sep 19 at 9:36
1
Quaternions are extremely important in pure mathematics still, but Hamilton's original purpose of using them in mechanics was taken over by the algebra of finite-dimensional vectors, which is extremely simple and streamlined.
– Tom, Sep 20 at 12:46
3
Hypercomplex number systems in general can be included here: The dual quaternions are used to represent rigid body motions in computer graphics; the dual numbers are used to implement auto-diff (in forward mode only, and largely for pedagogical purposes); Clifford algebras are used to construct spinors in quantum physics.
– jkabrg, Sep 20 at 20:13
1
@Tom Is there any published expository summary of the most important or most interesting uses of quaternions in pure mathematics?
– Michael Hardy, Sep 23 at 4:13
2
The unit quaternions form a (Lie) group isomorphic to $SU(2)$ and $Spin(3)$, which is ubiquitous in physics.
– W. Edwin Clark, Sep 24 at 22:46
The chromatic polynomial of a graph was originally introduced as part of an attempt to prove the four-color conjecture (now a theorem), but was unsuccessful in that goal. However, the chromatic polynomial continues to be studied to this day as an interesting algebraic invariant of a graph.
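For illustration (a minimal sketch added here, not part of the original answer): the chromatic polynomial satisfies the deletion–contraction recurrence $P(G,k) = P(G-e,k) - P(G/e,k)$, which already suffices to evaluate it at any integer $k$.

```python
def chromatic(vertices, edges, k):
    """Number of proper k-colorings of a graph, via deletion-contraction:
    P(G, k) = P(G - e, k) - P(G / e, k)."""
    vertices = frozenset(vertices)
    edges = frozenset(frozenset(e) for e in edges)  # parallel edges collapse
    if any(len(e) == 1 for e in edges):             # a loop: no proper coloring
        return 0
    if not edges:                                   # edgeless graph: k^|V|
        return k ** len(vertices)
    e = next(iter(edges))
    u, v = tuple(e)
    deleted = edges - {e}
    # Contract e: identify v with u in every remaining edge.
    contracted = frozenset(frozenset(u if w == v else w for w in f) for f in deleted)
    return (chromatic(vertices, deleted, k)
            - chromatic(vertices - {v}, contracted, k))

# Triangle K3: P(K3, k) = k(k-1)(k-2), so 6 proper 3-colorings.
tri = [(0, 1), (1, 2), (0, 2)]
colorings = chromatic({0, 1, 2}, tri, 3)
```

Evaluating at enough integer points determines the polynomial itself, since $P(G,\cdot)$ has degree $|V|$.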
Logic and set theory were developed by Frege, Russell and Whitehead, Hilbert and others in the late 19th, early 20th centuries with the goal of providing a firm foundation for all of Mathematics. In this they failed miserably, but nevertheless they have continued to develop and to be studied for other reasons.
16
The current wording sounds like you’re suggesting “logic and set theory” failed miserably as a firm foundation for mathematics, which would be a pretty extreme claim (and I say that as a big proponent of non-set-theoretic foundations). Do you mean just that the specific systems Frege and Russell–Whitehead used failed as foundations? If so then that’s certainly true, but as far as I know they’re not studied much today except for historical interest.
– Peter LeFanu Lumsdaine, Sep 19 at 12:53
5
I would say rather "they failed in their project of reducing mathematics to logic". I'd also add that their work led directly to the development of modern computer science.
– Chris Sunami, Sep 19 at 15:32
5
Maybe the word "firm" is ambiguous. If we interpret "firm" as "absolutely certain, unassailable, and indubitable" then they indeed failed, but with a weaker notion of "firm" (and also a suitable notion of what a "foundation" is supposed to be) then I think they succeeded.
– Timothy Chow, Sep 19 at 16:02
3
I had a feeling that this answer would be a bit more controversial than some others, and perhaps I worded it in an unnecessarily inflammatory way. But I reckon that the idea was to show that arithmetic and other mathematical systems were complete and consistent, and that project was derailed by the incompleteness theorems.
– Gerry Myerson, Sep 19 at 22:45
3
@GerryMyerson: I guess this comes down to an interesting historical question: did most early logicians really view proving completeness+consistency of foundations (i.e. the “failed” goals) as an essential or major goal of the foundational project, or were their central motivations more in line with the aspects that succeeded (i.e. a formal system able to encode all mathematics, giving a clear consensus standard for proof correctness in principle)? My impression is more the latter, but I’m not enough of a historian to be certain.
– Peter LeFanu Lumsdaine, Sep 20 at 21:47
Ronald Fisher's theory of fiducial inference was introduced around 1930 or so (I think?) for the purpose of solving the Behrens–Fisher problem. It turned out that fiducial intervals for that problem did not have constant coverage rates; in what then came to be standard terminology, they are not confidence intervals. That's not necessarily fatal in some contexts, since Bayesian credible intervals don't have constant coverage rates either, but everyone understands that there are good reasons for that. Fisher wrote a paper saying that that criticism is unconvincing, and I wonder if anyone understands what Fisher was trying to say. Fisher was brilliant but irascible. (He was a very prolific author of research papers in statistical theory and in population genetics, a science of which he was one of the three major founders. I think he may have single-handedly founded the theory of design of experiments, but I'm not sure about that.)
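For context (an illustration added here, with made-up numbers): the Behrens–Fisher problem asks for inference on the difference of two normal means when the variances are unequal and unknown. The now-standard approximate answer is Welch's t statistic together with the Welch–Satterthwaite degrees of freedom:

```python
import math
from statistics import mean, variance

def welch_t(x, y):
    """Welch's approximate solution to the Behrens-Fisher problem:
    the t statistic and Welch-Satterthwaite degrees of freedom for
    comparing two means with unequal, unknown variances."""
    nx, ny = len(x), len(y)
    vx, vy = variance(x), variance(y)   # sample variances (n-1 denominator)
    se2 = vx / nx + vy / ny             # squared std. error of the mean difference
    t = (mean(x) - mean(y)) / math.sqrt(se2)
    df = se2 ** 2 / ((vx / nx) ** 2 / (nx - 1) + (vy / ny) ** 2 / (ny - 1))
    return t, df

# Hypothetical measurements, purely for illustration.
x_obs = [10.1, 9.8, 10.3, 10.0]
y_obs = [8.0, 12.0, 9.5, 10.5, 11.0]
t, df = welch_t(x_obs, y_obs)
```

The resulting `df` always lies between `min(nx, ny) - 1` and `nx + ny - 2`; the p-value is then read off a t distribution with that (non-integer) number of degrees of freedom.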
However, fiducial methods seem to be undergoing some sort of revival:
https://statistics.fas.harvard.edu/event/4th-bayesian-fiducial-and-frequentist-conference-bff4
1
Here is a book-length treatment!
– kjetil b halvorsen, Sep 25 at 7:44
Gauge theory might be another example at the border with physics. The original idea of deriving physics from gauge symmetries, and indeed the use of the term/prefix "gauge" (in German "Eich-") itself, goes back to a paper by Hermann Weyl in 1919 ("Eine neue Erweiterung der Relativitätstheorie"). In this paper he tried to unify electrodynamics and general relativity using this approach, by postulating that the notion of scale (or "gauge") might be a local symmetry. This was a total failure, as it contradicted several experiments.
It was only about a decade later that he and some others picked up the idea again, applied it to electromagnetism and quantum physics (this time with phase as the gauge) and made it work. And then of course in 1954 there came Yang and Mills, and now Weyl's "failed idea" is at the core of the Standard Model of physics. However, the original goal of adding general relativity to the mix still hasn't been achieved.
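In modern notation (one common sign convention; conventions vary), the successful "phase as gauge" version of Weyl's idea is that a local phase rotation of the matter field can be compensated by a shift of the electromagnetic potential:

```latex
\psi(x) \;\longmapsto\; e^{i\alpha(x)}\,\psi(x),
\qquad
A_\mu(x) \;\longmapsto\; A_\mu(x) - \frac{1}{e}\,\partial_\mu \alpha(x),
```

so that the covariant derivative $D_\mu \psi = (\partial_\mu + i e A_\mu)\psi$ transforms exactly like $\psi$ itself, and any Lagrangian built from $\psi$ and $D_\mu \psi$ is gauge invariant.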
Continuing what was said by @GerryMyerson, the project of providing foundations for mathematics started by Frege was presented in a treatise called Grundgesetze der Arithmetik (Basic laws of arithmetic). The axioms of this treatise were proven inconsistent by Bertrand Russell in what we know today as Russell's paradox.
This paradox also affects naive set theory, understood as the theory comprising the following two axioms:
Axiom of extensionality: $\forall x\,(x \in a \leftrightarrow x \in b) \rightarrow a = b$. That is, if two sets $a$ and $b$ have the same elements, then they're the same set.
Axiom (scheme) of unrestricted comprehension: $\exists a\,\forall x\,(x \in a \leftrightarrow P(x))$, for each formula $P(x)$. That is, to each property $P$ uniquely corresponds one set $a$.
Naive set theory, thus understood, follows from Frege's axioms and seems to capture very well the notion of set. But since $x \notin x$ is a formula, the axiom scheme of unrestricted comprehension guarantees that the following is an axiom:
- $x \in a \leftrightarrow x \notin x$
Now, when we ask whether $a \in a$ or $a \notin a$, we obtain contradictory situations in both cases.
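Spelled out as a one-line derivation (my formalisation of the argument above): comprehension applied to the formula $x \notin x$ produces a set $r$ such that

```latex
\forall x\,(x \in r \leftrightarrow x \notin x)
\quad\Longrightarrow\quad
(r \in r \leftrightarrow r \notin r),
```

and a biconditional of the form $\varphi \leftrightarrow \neg\varphi$ is already contradictory in intuitionistic logic, so classical reasoning is not to blame.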
This paradox was solved by discarding this axiomatisation of set theory and, hence, Frege's axiomatics. But some logicians, mathematicians and philosophers have considered that perhaps this wasn't the right way to solve it. Instead of rejecting naive set theory or Frege's theory, they propose to reject the principle of explosion, or ex contradictione sequitur quodlibet:
$P \wedge \neg P \rightarrow Q$. That is, from a contradiction any formula or statement follows.
This research programme is often known as the paraconsistent programme, because it works with paraconsistent logics. A logic is said to be paraconsistent iff the principle of explosion is not valid in general. Hence, if a theory is inconsistent, it doesn't mean that anything follows from it (which means that it may still be useful). You can find out more about this programme in:
- SEP: Paraconsistent Logic
- SEP: Inconsistent Mathematics
- SEP: Dialetheism
You will find there (especially in the second link) a whole programme for researching inconsistent mathematical theories, which are generally considered of no mathematical interest. (You will also find that it is mainly philosophers who are working on this programme.)
Whether this programme is of any scientific value is for you to judge. But I accept this probably wasn't the kind of answer you were looking for. There is a chance, however, that you find it very interesting. I hope it helps in any case.
It's perhaps slightly exaggerated (if at all), but the development of algebraic number theory (particularly the study of cyclotomic fields) was strongly motivated by attempts to prove Fermat's last theorem.
And everyone knows the end of that story ...
To quote Wikipedia:
Fermat's last theorem:
The unsolved problem stimulated the development of algebraic number theory in the 19th century and the proof of the modularity theorem in the 20th century.
Cyclotomic fields:
The cyclotomic fields played a crucial role in the development of modern algebra and number theory because of their relation with Fermat's last theorem. It was in the process of his deep investigations of the arithmetic of these fields (for prime n) – and more precisely, because of the failure of unique factorization in their rings of integers – that Ernst Kummer first introduced the concept of an ideal number and proved his celebrated congruences.
$endgroup$
add a comment
|
$begingroup$
I think Dirac's equation is a good example. Dirac was looking for a special-relativistic version of Schrödinger's equation. For the probabilistic interpretation to work, it had to have only first-order time derivatives, unlike the field equations known at the time.
He found a Lorentz invariant field equation with first-order derivatives, and it turned out to have enormous theoretical importance since it kicked off the study of relativistic field theories and Lie group representations in physics.
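For reference (a standard form added here, not quoted from the answer): in natural units, with $\gamma^\mu$ the $4 \times 4$ gamma matrices, the equation he found reads

```latex
(i\gamma^\mu \partial_\mu - m)\,\psi = 0,
```

which is first order in both time and space derivatives, in contrast to the second-order Klein–Gordon equation $(\partial^\mu \partial_\mu + m^2)\phi = 0$.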
But the Dirac equation isn't a relativistic version of Schrödinger's equation. It can't describe multiparticle entanglement, it doesn't violate Bell's inequality, you can't build a quantum computer in it, etc. From a modern perspective it's just the massive, spin-½ counterpart to Maxwell's equations.
A version of the Dirac equation appears in the Lagrangian of quantum electrodynamics and the Standard Model. But it's right alongside a version of Maxwell's equations, complete with second-order derivatives, which turned out not to be a problem after all.
It's often still taught in introductory courses that Dirac's equation explained the electron's spin and magnetic moment, but both of those retrodictions were essentially accidental. Dirac's argument for spin ½ would imply that all fundamental particles must have half-integer spin, which doesn't appear to be the case; and Weinberg says "there is really nothing in Dirac's line of argument that leads unequivocally to this particular value for the magnetic moment" (The Quantum Theory of Fields, Vol. 1, p. 14).
$endgroup$
add a comment
|
$begingroup$
The typical oracle methods of computability theory (a.k.a. recursion theory) were shown to be insufficient to settle the P vs. NP problem by Baker, Gill and Solovay in 1975.
Thus recursion theory became divorced from the problems of efficient computation and experienced a bit of a setback (not as many papers in Ann. Math. anymore, etc.).
Nevertheless it continued as the study of in-principle computability.
$endgroup$
8
$begingroup$
Oracle methods predate the interest in, or even the formulation of, P=NP. The failure of oracle methods for that problem surely highlighted the distance between computability and efficient computation, but I don't think the two subjects were ever very married.
$endgroup$
– Matt F.
Sep 18 at 23:28
$begingroup$
@MattF. Fair enough, but people became more interested in efficient computability than in-principle computability, because of practical applications.
$endgroup$
– Bjørn Kjos-Hanssen
Sep 19 at 6:49
add a comment
|
15 Answers
$begingroup$
I quote at length from the Wikipedia essay on the history of knot theory:
In 1867 after observing Scottish physicist Peter Tait's experiments involving smoke rings, Thomson came to the idea that atoms were knots of swirling vortices in the æther. Chemical elements would thus correspond to knots and links. Tait's experiments were inspired by a paper of Helmholtz's on vortex-rings in incompressible fluids. Thomson and Tait believed that an understanding and classification of all possible knots would explain why atoms absorb and emit light at only the discrete wavelengths that they do. For example, Thomson thought that sodium could be the Hopf link due to its two lines of spectra.
Tait subsequently began listing unique knots in the belief that he was creating a table of elements. He formulated what are now known as the Tait conjectures on alternating knots. (The conjectures were proved in the 1990s.) Tait's knot tables were subsequently improved upon by C. N. Little and Thomas Kirkman.
James Clerk Maxwell, a colleague and friend of Thomson's and Tait's, also developed a strong interest in knots. Maxwell studied Listing's work on knots. He re-interpreted Gauss' linking integral in terms of electromagnetic theory. In his formulation, the integral represented the work done by a charged particle moving along one component of the link under the influence of the magnetic field generated by an electric current along the other component. Maxwell also continued the study of smoke rings by considering three interacting rings.
When the luminiferous æther was not detected in the Michelson–Morley experiment, vortex theory became completely obsolete, and knot theory ceased to be of great scientific interest. Modern physics demonstrates that the discrete wavelengths depend on quantum energy levels.
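For reference, the Gauss linking integral that Maxwell re-interpreted can be written (a standard form, up to sign convention, with $\gamma_1, \gamma_2$ the two components of the link) as

```latex
\operatorname{Lk}(\gamma_1,\gamma_2)
  = \frac{1}{4\pi} \oint_{\gamma_1} \oint_{\gamma_2}
    \frac{\mathbf{r}_1 - \mathbf{r}_2}{\lvert \mathbf{r}_1 - \mathbf{r}_2 \rvert^{3}}
    \cdot \left( d\mathbf{r}_1 \times d\mathbf{r}_2 \right),
```

an integer counting the signed number of times one curve winds around the other.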
$endgroup$
add a comment
|
edited Sep 19 at 12:46
community wiki
Gerry Myerson
$begingroup$
"The modern study of knots grew out of an attempt by three 19th-century Scottish physicists to apply knot theory to fundamental questions about the universe".
$endgroup$
3
$begingroup$
You beat me by 16 seconds.
$endgroup$
– Gerry Myerson
Sep 19 at 4:15
3
$begingroup$
@GerryMyerson well, there was a bit more typing in yours...
$endgroup$
– Nik Weaver
Sep 19 at 11:04
5
$begingroup$
Not really, it was all cut'n'paste from Wikipedia.
$endgroup$
– Gerry Myerson
Sep 19 at 11:51
2
$begingroup$
Perhaps the same thing will happen with string theory; in the future it may be remembered as something which gave birth to a lot of interesting mathematics, but without having direct physical relevance.
$endgroup$
– Tom
Sep 20 at 12:42
1
$begingroup$
Links tend to break over time, so best to give the title and author of any linked paper: "Knot Theory’s Odd Origins" by Silver.
$endgroup$
– Tom Copeland
Oct 5 at 21:10
|
show 2 more comments
answered Sep 18 at 22:17
community wiki
Nik Weaver
$begingroup$
Motives and the standard conjectures were developed by Grothendieck to prove the last of the Weil conjectures. They failed at this goal: none of the standard conjectures has been proven (despite some progress, I would say we are no closer to proving the Weil conjectures via the standard conjectures today than when they were first formulated), and Deligne showed that Grothendieck's earlier invention of étale cohomology was perfectly sufficient to prove the Weil conjectures.
However, since that time various notions of motive, with different useful properties, have been constructed in addition to Grothendieck's, and many of them have found applications in algebraic geometry and number theory, the first really big one being Voevodsky's Fields-medal-winning proof of the Milnor conjecture.
$endgroup$
add a comment
|
answered Sep 20 at 0:01
community wiki
Will Sawin
$begingroup$
String Theory!
String Theory was born in the late 1960s and early 1970s in the context of the strong interactions inside atomic nuclei. The theory turned out to be unsuited to describing the strong force, and was supplanted around 1973 by the rising Quantum Chromodynamics (our current best model of the strong interactions). Among the reasons for the failure was the mandatory presence of unwanted spin-2 particles...
Those particles are now interpreted as gravitons! And String Theory is now seen as a theory of quantum gravity, describing all the known forces (electromagnetic, weak, strong) and gravity at the same time! That's a pretty big afterlife!
$endgroup$
24
$begingroup$
The "is describing" in your final paragraph is a trifle optimistic. Nobody has yet produced a string theory that describes the standard model in 4 dimensions as a low-energy limit.
$endgroup$
– Robert Furber
Sep 19 at 15:32
5
$begingroup$
Oh yes, those are the hidden terms and conditions of those words.
$endgroup$
– Rexcirus
Sep 19 at 15:47
$begingroup$
@RobertFurber That's not true. The problem is rather that there are too many ways to produce the Standard Model in string theory. As an example, take a look at arxiv.org/abs/1903.00009
$endgroup$
– Vigod
Sep 28 at 2:08
add a comment
|
edited Sep 20 at 9:47
community wiki
3 revs, 2 users 92%
Rexcirus
$begingroup$
(Converted from a comment to an answer as requested.)
Non-Euclidean geometry was initially developed in hopes of deriving the parallel postulate from the other axioms of Euclidean geometry, as can be seen in particular through the pioneering work of Saccheri in this area, who tried in vain to prove the parallel postulate by contradiction and ended up proving a large number of foundational results in what we would now call elliptic and hyperbolic geometry as a consequence. (See for instance this article of Fitzpatrick, or this MacTutor article on non-Euclidean geometry.)
Nowadays, the classical non-Euclidean geometries (the elliptic geometry of the sphere, and the hyperbolic geometry of hyperbolic space) play the important role of describing two of the basic model geometries in Riemannian geometry, namely the simply connected geometries of constant and isotropic positive or negative curvature respectively. (In two dimensions, where Riemann curvature is effectively a scalar quantity, these two geometries, together with Euclidean geometry, are the only models needed; in higher dimensions there are however other model geometries of interest, such as the remaining five Thurston geometries of the geometrisation conjecture in three dimensions.)
$endgroup$
$begingroup$
Great example of how a beautiful theory can be in the air until it finally sharply crystallizes in the minds of innovators looking at it from a radical angle.
$endgroup$
– Tom Copeland
Sep 25 at 13:40
add a comment
|
$begingroup$
(Converted from a comment to an answer as requested.)
Non-Euclidean geometry was initially developed in hopes of deriving the parallel postulate from the other axioms of Euclidean geometry, as can be seen in particular through the pioneering work of Saccheri in this area, who tried in vain to prove the parallel postulate by contradiction and ended up proving a large number of foundational results in what we would now call elliptic and hyperbolic geometry as a consequence. (See for instance this article of Fitzpatrick, or this McTutor article on Non-Euclidean geometry.)
Nowadays, the classical non-Euclidean geometries (the elliptic geometry of the sphere, and the hyperbolic geometry of hyperbolic space) play the important role of describing two of the basic model geometries in Riemannian geometry, namely the simply connected geometries of constant and isotropic positive or negative curvature respectively. (In two dimensions, where Riemann curvature is effectively a scalar quantity, these two geometries, together with Euclidean geometry, are the only models needed; in higher dimensions there are however other model geometries of interest, such as the remaining five Thurston geometries of the geometrisation conjecture in three dimensions.)
$endgroup$
$begingroup$
Great ecample of how a beautiful theory can be in the air until it finally sharply crystalizes in the minds of innovators looking at it from a radical angle.
$endgroup$
– Tom Copeland
Sep 25 at 13:40
add a comment
|
$begingroup$
(Converted from a comment to an answer as requested.)
Non-Euclidean geometry was initially developed in hopes of deriving the parallel postulate from the other axioms of Euclidean geometry, as can be seen in particular through the pioneering work of Saccheri in this area, who tried in vain to prove the parallel postulate by contradiction and ended up proving a large number of foundational results in what we would now call elliptic and hyperbolic geometry as a consequence. (See for instance this article of Fitzpatrick, or this McTutor article on Non-Euclidean geometry.)
Nowadays, the classical non-Euclidean geometries (the elliptic geometry of the sphere, and the hyperbolic geometry of hyperbolic space) play the important role of describing two of the basic model geometries in Riemannian geometry, namely the simply connected geometries of constant and isotropic positive or negative curvature respectively. (In two dimensions, where Riemann curvature is effectively a scalar quantity, these two geometries, together with Euclidean geometry, are the only models needed; in higher dimensions there are however other model geometries of interest, such as the remaining five Thurston geometries of the geometrisation conjecture in three dimensions.)
$endgroup$
(Converted from a comment to an answer as requested.)
Non-Euclidean geometry was initially developed in hopes of deriving the parallel postulate from the other axioms of Euclidean geometry, as can be seen in particular through the pioneering work of Saccheri in this area, who tried in vain to prove the parallel postulate by contradiction and ended up proving a large number of foundational results in what we would now call elliptic and hyperbolic geometry as a consequence. (See for instance this article of Fitzpatrick, or this McTutor article on Non-Euclidean geometry.)
Nowadays, the classical non-Euclidean geometries (the elliptic geometry of the sphere, and the hyperbolic geometry of hyperbolic space) play the important role of describing two of the basic model geometries in Riemannian geometry, namely the simply connected geometries of constant and isotropic positive or negative curvature respectively. (In two dimensions, where Riemann curvature is effectively a scalar quantity, these two geometries, together with Euclidean geometry, are the only models needed; in higher dimensions there are however other model geometries of interest, such as the remaining five Thurston geometries of the geometrisation conjecture in three dimensions.)
answered Sep 20 at 19:59
community wiki
Terry Tao
Great example of how a beautiful theory can be in the air until it finally sharply crystallizes in the minds of innovators looking at it from a radical angle.
– Tom Copeland
Sep 25 at 13:40
This is a copy of a copy of some history of the origins of free probability by Dan Voiculescu, extracted from a response by Roland Speicher, a developer of the field, to an MO-Q. It is from Voiculescu's article "Background and Outlook" in the Lecture Notes "Free Probability and Operator Algebras"; see http://www.ems-ph.org/books/book.php?proj_nr=208

Just before starting in this new direction, I had worked with Mihai Pimsner, computing the K-theory of the reduced $C^*$-algebras of free groups. From the K-theory work I had acquired a taste for operator algebras associated with free groups and I became interested in a famous problem about the von Neumann algebras $L(\mathbb{F}_n)$ generated by the left regular representations of free groups, which appears in Kadison's Baton Rouge problem list. The problem, which may have already been known to Murray and von Neumann, is: are $L(\mathbb{F}_m)$ and $L(\mathbb{F}_n)$ non-isomorphic if $m \neq n$?

This is still an open problem. Fortunately, after trying in vain to solve it, I realized it was time to be more humble and to ask: is there anything I can do, which may be useful in connection with this problem? Since I had come across computations of norms and spectra of certain convolution operators on free groups (i.e., elements of $L(\mathbb{F}_n)$), I thought of finding ways to streamline some of these computations and perhaps be able to compute more complicated examples. This, of course, meant computing expectations of powers of such operators with respect to the von Neumann trace-state $\tau(T) = \langle T e_e, e_e \rangle$, $e_g$ being the canonical basis of the $\ell^2$-space.

The key remark I made was that if $T_1$, $T_2$ are convolution operators on $\mathbb{F}_m$ and $\mathbb{F}_n$, then the operator on $\mathbb{F}_{m+n} = \mathbb{F}_m \ast \mathbb{F}_n$ which is $T_1 + T_2$ has moments $\tau((T_1 + T_2)^p)$ which depend only on the moments $\tau(T_j^k)$, $j = 1, 2$, but not on the actual $T_1$ and $T_2$. This was like the addition of independent random variables, only classical independence had to be replaced by a notion of free independence, which led to a free central limit theorem, a free analogue of the Gaussian functor, free convolution, an abstract existence theorem for one-variable free cumulants, etc.
answered Sep 18 at 23:51
community wiki
Tom Copeland
Good intro to the topic: "Three lectures on free probability" by Jonathan Novak and Michael LaCroix, arxiv.org/abs/1205.2097
– Tom Copeland
Sep 18 at 23:55
More here: bcc.impan.pl/15TQG/uploads/pdf/Speicher-Post.pdf
– JP McCarthy
Sep 19 at 12:53
That is, "Compact Quantum Groups and Free Combinatorics", a presentation by Roland Speicher.
– Tom Copeland
Sep 19 at 14:58
Quaternion multiplication was introduced for use in physics, for purposes for which cross products of vectors came to be used and have been used ever since.
But today quaternions are used in computer graphics. I suspect they also have other applications.
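The computer-graphics use can be made concrete. Below is a minimal sketch in plain Python (the function names are illustrative, not from any particular library) of rotating a 3-vector $v$ by a unit quaternion $q$ via the sandwich product $v' = q\,v\,q^*$:

```python
import math

def qmul(a, b):
    """Hamilton product of quaternions given as (w, x, y, z) tuples."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def rotate(v, axis, angle):
    """Rotate 3-vector v by `angle` radians about `axis` using q v q*."""
    norm = math.sqrt(sum(c * c for c in axis))
    s = math.sin(angle / 2) / norm
    # Unit rotation quaternion: q = (cos(angle/2), sin(angle/2) * axis_hat).
    q = (math.cos(angle / 2), *(s * c for c in axis))
    q_conj = (q[0], -q[1], -q[2], -q[3])
    # Embed v as the pure quaternion (0, v), conjugate, read off (x, y, z).
    return qmul(qmul(q, (0.0, *v)), q_conj)[1:]

# A quarter turn about the z-axis sends (1, 0, 0) to (0, 1, 0).
print(rotate((1.0, 0.0, 0.0), (0.0, 0.0, 1.0), math.pi / 2))
```

Graphics code favors this representation because a rotation is four numbers that compose by `qmul` and interpolate smoothly, avoiding the gimbal-lock problems of Euler angles.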
answered Sep 19 at 5:11
community wiki
Michael Hardy
I don't think quaternions were unsuccessful---they successfully provide an algebraic realization of rotations in 3-space. It just happens that this approach was to a large extent superseded by vectors and linear algebra.
– Kimball
Sep 19 at 9:36
Quaternions are still extremely important in pure mathematics, but Hamilton's original purpose of using them in mechanics was taken over by the algebra of finite-dimensional vectors, which is extremely simple and streamlined.
– Tom
Sep 20 at 12:46
Hypercomplex number systems in general can be included here: the dual quaternions are used to represent rigid-body motions in computer graphics; the dual numbers are used to implement auto-diff (in forward mode only, and largely for pedagogical purposes); Clifford algebras are used to construct spinors in quantum physics.
– jkabrg
Sep 20 at 20:13
@Tom Is there any published expository summary of the most important or most interesting uses of quaternions in pure mathematics?
– Michael Hardy
Sep 23 at 4:13
The unit quaternions form a (Lie) group isomorphic to $SU(2)$ and $Spin(3)$, which is ubiquitous in physics.
– W. Edwin Clark
Sep 24 at 22:46
The chromatic polynomial of a graph was originally introduced as part of an attempt to prove the four-color conjecture (now a theorem), but was unsuccessful in that goal. However, the chromatic polynomial continues to be studied to this day as an interesting algebraic invariant of a graph.
answered Sep 19 at 15:57
community wiki
Timothy Chow
Logic and set theory were developed by Frege, Russell and Whitehead, Hilbert and others in the late 19th and early 20th centuries with the goal of providing a firm foundation for all of mathematics. In this they failed miserably, but nevertheless they have continued to develop and to be studied for other reasons.
answered Sep 19 at 4:19
community wiki
Gerry Myerson
The current wording sounds like you’re suggesting “logic and set theory” failed miserably as a firm foundation for mathematics, which would be a pretty extreme claim (and I say that as a big proponent of non-set-theoretic foundations). Do you mean just that the specific systems Frege and Russell–Whitehead used failed as foundations? If so then that’s certainly true, but as far as I know they’re not studied much today except for historical interest.
– Peter LeFanu Lumsdaine
Sep 19 at 12:53
I would say rather "they failed in their project of reducing mathematics to logic." I'd also add that their work led directly to the development of modern computer science.
– Chris Sunami
Sep 19 at 15:32
Maybe the word "firm" is ambiguous. If we interpret "firm" as "absolutely certain, unassailable, and indubitable" then they indeed failed, but with a weaker notion of "firm" (and also a suitable notion of what a "foundation" is supposed to be) then I think they succeeded.
– Timothy Chow
Sep 19 at 16:02
I had a feeling that this answer would be a bit more controversial than some others, and perhaps I worded it in an unnecessarily inflammatory way. But I reckon that the idea was to show that arithmetic and other mathematical systems were complete and consistent, and that project was derailed by the incompleteness theorems.
– Gerry Myerson
Sep 19 at 22:45
@GerryMyerson: I guess this comes down to an interesting historical question: did most early logicians really view proving completeness+consistency of foundations (i.e. the “failed” goals) as an essential or major goal of the foundational project, or were their central motivations more in line with the aspects that succeeded (i.e. a formal system able to encode all mathematics, giving a clear consensus standard for proof correctness in principle)? My impression is more the latter, but I’m not enough of a historian to be certain.
– Peter LeFanu Lumsdaine
Sep 20 at 21:47
16
$begingroup$
The current wording sounds like you’re suggesting “logic and set theory” failed miserably as a firm foundation for mathematics, which would be a pretty extreme claim (and I say that as a big proponent of non-set-theoretic foundations). Do you mean just that the specific systems Frege and Russell–Whitehead used failed as foundations? If so then that’s certainly true, but as far as I know they’re not studied much today except for historical interest.
$endgroup$
– Peter LeFanu Lumsdaine
Sep 19 at 12:53
5
$begingroup$
I would say rather "they failed in their project of reducing mathematics to logic" I'd also add that their work led directly to the development of modern computer science.
$endgroup$
– Chris Sunami
Sep 19 at 15:32
5
$begingroup$
Maybe the word "firm" is ambiguous. If we interpret "firm" as "absolutely certain, unassailable, and indubitable" then they indeed failed, but with a weaker notion of "firm" (and also a suitable notion of what a "foundation" is supposed to be) then I think they succeeded.
$endgroup$
– Timothy Chow
Sep 19 at 16:02
3
$begingroup$
I had a feeling that this answer would be a bit more controversial than some others, and perhaps I worded it in an unnecessarily inflammatory way. But I reckon that the idea was to show that arithmetic and other mathematical systems were complete and consistent, and that project was derailed by the incompleteness theorems.
$endgroup$
– Gerry Myerson
Sep 19 at 22:45
3
$begingroup$
@GerryMyerson: I guess this comes down to an interesting historical question: did most early logicians really view proving completeness+consistency of foundations (i.e. the “failed” goals) as an essential or major goal of the foundational project, or were their central motivations more in line with the aspects that succeeded (i.e. a formal system able to encode all mathematics, giving a clear consensus standard for proof correctness in principle)? My impression is more the latter, but I’m not enough of a historian to be certain.
$endgroup$
– Peter LeFanu Lumsdaine
Sep 20 at 21:47
$begingroup$
Ronald Fisher's theory of fiducial inference was introduced around 1930 or so (I think?), for the purpose of solving the Behrens–Fisher problem. It turned out that fiducial intervals for that problem did not have constant coverage rates, or in what then came to be standard terminology, they are not confidence intervals. That's not necessarily fatal in some contexts, since Bayesian credible intervals don't have constant coverage rates, but everyone understands that there are good reasons for that. Fisher wrote a paper saying that that criticism is unconvincing, and I wonder if anyone understands what Fisher was trying to say. Fisher was brilliant but irascible. (He was a very prolific author of research papers in statistical theory and in population genetics, a science of which he was one of the three major founders. I think he may have single-handedly founded the theory of design of experiments, but I'm not sure about that.)
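As an aside, the "constant coverage" criterion the answer invokes can be illustrated by simulation. The sketch below (our illustration, not Fisher's) estimates the coverage rate of the ordinary one-sample t-interval, a procedure that *does* have constant coverage and hence is a confidence interval; fiducial intervals for the Behrens–Fisher problem fail this test.

```python
# Toy sketch: estimate by Monte Carlo the coverage rate of the standard
# t-interval for a normal mean. A procedure whose coverage is ~95% regardless
# of the true (mu, sigma) is a confidence interval in the standard sense.
import random, statistics, math

def t_interval_covers(mu, sigma, n=20, t_crit=2.093):  # t_crit ~ t_{0.975, df=19}
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    m, s = statistics.mean(xs), statistics.stdev(xs)
    half = t_crit * s / math.sqrt(n)
    return m - half <= mu <= m + half

random.seed(0)
coverage = sum(t_interval_covers(3.0, 2.0) for _ in range(5000)) / 5000
# coverage should come out close to 0.95, independently of mu and sigma
```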
However, fiducial methods seem to be undergoing some sort of revival:
https://statistics.fas.harvard.edu/event/4th-bayesian-fiducial-and-frequentist-conference-bff4
$endgroup$
answered Sep 19 at 5:20
community wiki
Michael Hardy
1
$begingroup$
Here is a book-length treatment!
$endgroup$
– kjetil b halvorsen
Sep 25 at 7:44
$begingroup$
Gauge theory might be another example, at the border with physics. The original idea of deriving physics from gauge symmetries, and indeed the use of the term/prefix "gauge" (in German "Eich-") itself, goes back to a 1919 paper by Hermann Weyl ("Eine neue Erweiterung der Relativitätstheorie"). In this paper he tried to unify electrodynamics and general relativity using this approach, by postulating that the notion of scale (or "gauge") might be a local symmetry. This was a total failure, as it contradicted several experiments.
It was only about a decade later that he and others picked up the idea again, applied it to electromagnetism and quantum physics (this time with phase as the gauge) and made it work. Then in 1954 came Yang and Mills, and now Weyl's "failed idea" is at the core of the Standard Model of particle physics. However, the original goal of adding general relativity to the mix still hasn't been achieved.
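To make the successful "phase as gauge" version concrete (our sketch, not part of the original answer; sign conventions vary): electromagnetism coupled to a charged field is invariant under the local transformation

```latex
\psi(x) \;\to\; e^{\,i q \lambda(x)}\,\psi(x),
\qquad
A_\mu(x) \;\to\; A_\mu(x) - \partial_\mu \lambda(x),
```

with the two changes compensating each other inside the covariant derivative $D_\mu = \partial_\mu + i q A_\mu$. Weyl's 1919 proposal had instead used a local *rescaling* of lengths in place of the phase rotation.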
$endgroup$
answered Sep 20 at 8:40
community wiki
mlk
$begingroup$
Continuing what was said by @GerryMyerson, the project of providing foundations for mathematics started by Frege was presented in a treatise called Grundgesetze der Arithmetik (Basic laws of arithmetic). The axioms of this treatise were proven inconsistent by Bertrand Russell in what we know today as Russell's paradox.
This paradox also affects naive set theory, understood as the theory comprising the following two axioms:
Axiom of extensionality: $(x \in a \leftrightarrow x \in b) \rightarrow a = b$. That is, if two sets $a$ and $b$ have the same elements, then they're the same set.
Axiom (scheme) of unrestricted comprehension: $x \in a \leftrightarrow P(x)$, for each formula $P(x)$. That is, to each property $P$ uniquely corresponds one set $a$.
Naive set theory, thus understood, follows from Frege's axioms and seems to capture very well the notion of set. But since $x \notin x$ is a formula, the axiom scheme of unrestricted comprehension guarantees that the following is an axiom:
- $x \in x \leftrightarrow x \notin x$
Now, when we ask whether $x \in x$ or $x \notin x$, we obtain contradictory situations in both cases.
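The contradiction can be phrased as a fixed-point failure: writing $b$ for the truth value of "$R \in R$" (where $R$ is the set given by comprehension for the property $x \notin x$), the biconditional demands $b = \neg b$, which no Boolean value satisfies. A toy check (our illustration, not part of the original answer):

```python
# Unrestricted comprehension applied to P(x) = "x not in x" yields a set R with
# R in R  iff  R not in R.  Writing b for the truth value of "R in R", the
# biconditional demands b == (not b), which no Boolean value satisfies:
def russell_condition(b: bool) -> bool:
    # membership iff non-membership
    return b == (not b)

# neither truth value is consistent, so no such set R can exist
assert not any(russell_condition(b) for b in (True, False))
```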
This paradox was solved by discarding this axiomatisation of set theory and, hence, Frege's axiomatics. But some logicians, mathematicians and philosophers have considered that perhaps this wasn't the right way to solve it. Instead of rejecting naive set theory or Frege's theory, they propose to reject the principle of explosion, or ex contradictione sequitur quodlibet:
$P \wedge \neg P \rightarrow Q$. That is, from a contradiction follows any formula or statement.
This research programme is often known as the paraconsistent programme, because it works with paraconsistent logics. A logic is said to be paraconsistent iff the principle of explosion is not valid in it in general. Hence, even if a theory is inconsistent, it doesn't follow that every statement is derivable from it (which means that it may still be useful). You can find out more about this programme in:
- SEP: Paraconsistent Logic
- SEP: Inconsistent Mathematics
- SEP: Dialetheism
You will find there (especially in the second link) a whole programme for researching inconsistent mathematical theories, which are generally considered of no mathematical interest. (You will also find that it is mainly philosophers who are working in this programme.)
Whether this programme is of any scientific value, that's for you to judge. But I accept this probably wasn't the kind of answer you were looking for. There is a chance, however, that you find it very interesting. I hope it helps in any case.
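As a concrete illustration of how a paraconsistent logic blocks explosion (our sketch, not part of the answer), here is a truth-table check in Priest's three-valued Logic of Paradox (LP), one standard paraconsistent system:

```python
# Toy sketch: in Priest's Logic of Paradox (LP), the explosion principle
# P ∧ ¬P ⊨ Q fails.  Truth values are ordered F < B < T, where B ("both
# true and false") and T are the designated values (those counting as true).
F, B, T = 0, 1, 2
DESIGNATED = {B, T}
neg = {F: T, B: B, T: F}   # negation fixes B

def conj(a, b):
    return min(a, b)       # conjunction is the minimum in the order F < B < T

# Explosion would require: whenever P ∧ ¬P is designated, Q is designated too.
counterexamples = [(p, q) for p in (F, B, T) for q in (F, B, T)
                   if conj(p, neg[p]) in DESIGNATED and q not in DESIGNATED]
# (B, F) is the counterexample: B ∧ ¬B = B is designated while Q = F is not,
# so from a "contradictory" P one cannot derive an arbitrary Q in LP.
```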
$endgroup$
answered Sep 20 at 6:47
community wiki
lfba
$begingroup$
It's perhaps slightly exaggerated (if at all), but the development of algebraic number theory (particularly the study of cyclotomic fields) was strongly motivated by attempts to prove Fermat's last theorem.
And everyone knows the end of that story ...
To quote Wiki:
Fermat's last theorem:
The unsolved problem stimulated the development of algebraic number theory in the 19th century and the proof of the modularity theorem in the 20th century.
Cyclotomic field:
The cyclotomic fields played a crucial role in the development of modern algebra and number theory because of their relation with Fermat's last theorem. It was in the process of his deep investigations of the arithmetic of these fields (for prime n) – and more precisely, because of the failure of unique factorization in their rings of integers – that Ernst Kummer first introduced the concept of an ideal number and proved his celebrated congruences.
$endgroup$
edited Sep 20 at 20:23
community wiki
2 revs
WhatsUp
$begingroup$
I think Dirac's equation is a good example. Dirac was looking for a special-relativistic version of Schrödinger's equation. For the probabilistic interpretation to work, it had to have only first-order time derivatives, unlike the field equations known at the time.
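For reference (an addition of ours, not in the original answer): Schrödinger's equation is first order in time but not Lorentz invariant, while the equation Dirac found is first order in all spacetime derivatives:

```latex
i\hbar\,\partial_t \psi \;=\; -\frac{\hbar^2}{2m}\nabla^2 \psi + V\psi
\quad\text{(Schrödinger)},
\qquad
\left(i\gamma^\mu \partial_\mu - m\right)\psi \;=\; 0
\quad\text{(Dirac, natural units)}.
```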
He found a Lorentz invariant field equation with first-order derivatives, and it turned out to have enormous theoretical importance since it kicked off the study of relativistic field theories and Lie group representations in physics.
But the Dirac equation isn't a relativistic version of Schrödinger's equation. It can't describe multiparticle entanglement, it doesn't violate Bell's inequality, you can't build a quantum computer in it, etc. From a modern perspective it's just the massive, spin-½ counterpart to Maxwell's equations.
A version of the Dirac equation appears in the Lagrangian of quantum electrodynamics and the Standard Model. But it's right alongside a version of Maxwell's equations, complete with second-order derivatives, which turned out not to be a problem after all.
It's often still taught in introductory courses that Dirac's equation explained the electron's spin and magnetic moment, but both of those retrodictions were essentially accidental. Dirac's argument for spin ½ would imply that all fundamental particles must have half-integer spin, which doesn't appear to be the case; and Weinberg says "there is really nothing in Dirac's line of argument that leads unequivocally to this particular value for the magnetic moment" (The Quantum Theory of Fields, Vol. 1, p. 14).
$endgroup$
add a comment
|
$begingroup$
I think Dirac's equation is a good example. Dirac was looking for a special-relativistic version of Schrödinger's equation. For the probabilistic interpretation to work, it had to have only first-order time derivatives, unlike the field equations known at the time.
He found a Lorentz invariant field equation with first-order derivatives, and it turned out to have enormous theoretical importance since it kicked off the study of relativistic field theories and Lie group representations in physics.
But the Dirac equation isn't a relativistic version of Schrödinger's equation. It can't describe multiparticle entanglement, it doesn't violate Bell's inequality, you can't build a quantum computer in it, etc. From a modern perspective it's just the massive, spin-½ counterpart to Maxwell's equations.
A version of the Dirac equation appears in the Lagrangian of quantum electrodynamics and the Standard Model. But it's right alongside a version of Maxwell's equations, complete with second-order derivatives, which turned out not to be a problem after all.
It's often still taught in introductory courses that Dirac's equation explained the electron's spin and magnetic moment, but both of those retrodictions were essentially accidental. Dirac's argument for spin ½ would imply that all fundamental particles must have half-integer spin, which doesn't appear to be the case; and Weinberg says "there is really nothing in Dirac's line of argument that leads unequivocally to this particular value for the magnetic moment" (The Quantum Theory of Fields, Vol. 1, p. 14).
$endgroup$
add a comment
|
$begingroup$
I think Dirac's equation is a good example. Dirac was looking for a special-relativistic version of Schrödinger's equation. For the probabilistic interpretation to work, it had to have only first-order time derivatives, unlike the field equations known at the time.
He found a Lorentz invariant field equation with first-order derivatives, and it turned out to have enormous theoretical importance since it kicked off the study of relativistic field theories and Lie group representations in physics.
But the Dirac equation isn't a relativistic version of Schrödinger's equation. It can't describe multiparticle entanglement, it doesn't violate Bell's inequality, you can't build a quantum computer in it, etc. From a modern perspective it's just the massive, spin-½ counterpart to Maxwell's equations.
A version of the Dirac equation appears in the Lagrangian of quantum electrodynamics and the Standard Model. But it's right alongside a version of Maxwell's equations, complete with second-order derivatives, which turned out not to be a problem after all.
It's often still taught in introductory courses that Dirac's equation explained the electron's spin and magnetic moment, but both of those retrodictions were essentially accidental. Dirac's argument for spin ½ would imply that all fundamental particles must have half-integer spin, which doesn't appear to be the case; and Weinberg says "there is really nothing in Dirac's line of argument that leads unequivocally to this particular value for the magnetic moment" (The Quantum Theory of Fields, Vol. 1, p. 14).
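For reference, the contrast drawn above can be made concrete; in standard textbook notation (not taken from the answer itself), the three equations are, schematically,

$$i\hbar\,\partial_t \psi = -\frac{\hbar^2}{2m}\nabla^2 \psi \quad \text{(Schrödinger: first order in time, second order in space)},$$

$$\left(\frac{1}{c^2}\,\partial_t^2 - \nabla^2 + \frac{m^2 c^2}{\hbar^2}\right)\phi = 0 \quad \text{(Klein–Gordon: second order in all derivatives)},$$

$$i\hbar\,\gamma^\mu \partial_\mu \psi = mc\,\psi \quad \text{(Dirac: first order in all derivatives)}.$$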
$endgroup$
answered Sep 24 at 18:49
community wiki
benrg
$begingroup$
The typical oracle methods of computability theory (a.k.a. recursion theory) were shown by Baker, Gill, and Solovay (1975) to be insufficient to settle the P vs. NP problem.
Thus recursion theory became divorced from the problems of efficient computation and experienced a bit of a setback (not as many papers in Ann. Math. anymore, etc.).
Nevertheless, it continued as the study of in-principle computability.
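Concretely, the Baker–Gill–Solovay theorem exhibits oracles $A$ and $B$ with

$$\mathrm{P}^A = \mathrm{NP}^A \qquad \text{and} \qquad \mathrm{P}^B \neq \mathrm{NP}^B,$$

so no proof technique that relativizes (i.e., works uniformly in the presence of any oracle) can resolve P vs. NP.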
$endgroup$
answered Sep 18 at 22:36
community wiki
Bjørn Kjos-Hanssen
8
$begingroup$
Oracle methods predate the interest in or even formulation of P=NP. The failure of oracle methods for that problem surely highlighted the distance between computability and efficient computation, but I don't think the two subjects were ever very married.
$endgroup$
– Matt F.
Sep 18 at 23:28
$begingroup$
@MattF. Fair enough but people became more interested in efficient computability than in-principle computability, because of practical applications.
$endgroup$
– Bjørn Kjos-Hanssen
Sep 19 at 6:49
6
$begingroup$
Does the proof of Feit-Thompson really eschew representation theory? I don't know much about the area, but the linked wiki page gives quite the opposite impression.
$endgroup$
– lambda
Sep 18 at 22:34
29
$begingroup$
Non-Euclidean geometry was initially developed in hopes of deriving the parallel postulate from the other axioms of Euclidean geometry. Nowadays it describes two of the basic model geometries in Riemannian geometry - the sphere and hyperbolic space.
$endgroup$
– Terry Tao
Sep 18 at 22:38
16
$begingroup$
Many parts of graph theory originated or were stimulated in attempts to tackle the 4-colour conjecture. Perhaps the chromatic polynomial is the principal example of something explicitly introduced for the 4CC, which played no role in its solution, but which then went on to be significant in the study of phase transitions in the Potts model.
$endgroup$
– Gordon Royle
Sep 18 at 23:45
5
$begingroup$
I think there are plenty of results in differential equations which use Lie theory in a non-trivial way. Ask a physicist about the Schrödinger equation, the hydrogen atom and the representation theory of SO(3).
$endgroup$
– Thomas Rot
Sep 19 at 10:27
2
$begingroup$
Particularly relevant: "String Theory and Math: Why This Marriage May Last" by Mina Aganagic, arxiv.org/abs/1508.06642. Camps have been formed by supporters and detractors, fusillades fired, but no doubt string theory has invigorated large branches of math and given rise to offshoots.
$endgroup$
– Tom Copeland
Sep 19 at 17:20