Talk:Generality of algebra

Odd refs

Happily, S accidentally pasted in his Google Books search [1] so we know what that ref actually says: "The analysts of the 18th century shared certain convictions... and the standards of proof. When he referred to them later, Cauchy spoke aptly of the Generality of algebra." If that is therefore somewhat akin to analyticity then the connection is too weak for me to see it: please explain. It looks to me like someone has misunderstood: the "generality of algebra" is (as far as I can interpret that book) a statement about standards of proof, and about how (informally) one may generalise results and formulae. William M. Connolley (talk) 22:19, 30 April 2011 (UTC)[reply]

I agree that the analogy with analyticity is somewhat obscure. My understanding is that "generality of algebra" is not just a general lack of rigor, but that it refers to a particular class of proof techniques in which expressions would be manipulated past values that, if taken literally, would be infinite (or impossible to interpret), in spite of the conclusion being finite (and correct). Euler was particularly known for this style of "proof", and there are some well-known series and product identities (which naturally I can't remember at the moment) where he employed precisely this sort of argument. Sławomir Biały (talk) 22:40, 30 April 2011 (UTC)[reply]
One example (not really the best illustration of the concept, though) is this. Sławomir Biały (talk) 22:42, 30 April 2011 (UTC)[reply]
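(For concreteness, here is an editor's sketch of the kind of manipulation being described above; it is an illustration only, not a quotation from any of the sources under discussion. Under the "generality of algebra", an identity such as

    \[ \frac{1}{1-x} = 1 + x + x^2 + x^3 + \cdots \]

would be treated as valid without regard to convergence, so that substituting x = 2 gives

    \[ -1 = 1 + 2 + 4 + 8 + \cdots, \]

a finite "value" assigned to a series well outside the region |x| < 1 where the expansion actually converges.)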
I think you are likely correct. OK, so the best guess here is that the original article author has somewhat misunderstood the subject. That leaves open the question of whether this is a good title to discuss / describe these ideas under. William M. Connolley (talk) 23:02, 30 April 2011 (UTC)[reply]
Another example could be Euler's computation of the value of the Riemann zeta function at negative integers. RobHar (talk) 05:52, 1 May 2011 (UTC)[reply]
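(To make this concrete, here is one common reconstruction of the kind of argument meant - an editor's sketch, not a quotation of Euler. Differentiating the geometric series for 1/(1+x) and then substituting x = 1 outside its interval of convergence gives

    \[ \frac{1}{(1+x)^2} = 1 - 2x + 3x^2 - 4x^3 + \cdots \qquad\Longrightarrow\qquad 1 - 2 + 3 - 4 + \cdots = \tfrac{1}{4}, \]

and combining this with the formal relation \( \eta(s) = (1 - 2^{1-s})\,\zeta(s) \) at s = -1 yields

    \[ \zeta(-1) = \frac{1/4}{1 - 2^{2}} = -\tfrac{1}{12}. \])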
Could you please add this in? We are all reluctant to act as historians of mathematics but until we can get some professional historians interested in editing wiki pages (which does not seem to be the case on a large scale), it will have to be done by mathematicians. Tkuvho (talk) 06:43, 1 May 2011 (UTC)[reply]
Yes, the article needs some examples of what the "generality of algebra" principle actually is - together, of course, with sources that demonstrate that these examples are actually referred to in the mathematical literature as demonstrating the "generality of algebra", and are not guesses and interpretations made by Wikipedia editors. At the moment, the article is hopelessly vague and says nothing useful about its subject. Of course, if the phrase has no specific meaning and was simply a throw-away description used once or twice by Cauchy, then it is probably not sufficiently notable to merit an article. Gandalf61 (talk) 08:28, 1 May 2011 (UTC)[reply]
I agree with that with the exception that I would replace the word "demonstrate" by "illustrate": generality of algebra being a heuristic principle, it is not something one can demonstrate. Tkuvho (talk) 08:33, 1 May 2011 (UTC)[reply]

In my example above, I do mean only that this could be an example; I don't actually know, though from what follows it seems likely. A quick search turns up the book Lakatos' Philosophy of Mathematics, Volume 3, by Koetsier, which actually contains the phrase "Examples 1 through 4 illustrate the principle of generality of algebra" preceded by 4 examples. There's no preview on Google Books, but there is on amazon.com. Starting at page 206, it seems to be a nice source on the "generality of algebra". RobHar (talk) 16:13, 1 May 2011 (UTC)[reply]

Rewrite

Since no-one else had got round to adding an actual definition of the term "generality of algebra" to the article, I have finally done this myself, using Jahnke's A History of Analysis as a source. As far as I can see, the phrase was coined and mainly used by Cauchy; it is mentioned in our article on Cauchy, in the last sentence of the section Augustin-Louis Cauchy#Cours d'Analyse. I am still not convinced that it is sufficiently notable to merit its own article. Gandalf61 (talk) 08:08, 2 May 2011 (UTC)[reply]

Thanks for your input. As I mentioned above, unfortunately historians of science seem to be less interested in participating in wiki, so we are all dealing in guesswork. I find it surprising that the term is attributed to Cauchy; I had the impression it was already used earlier. If anyone has access to Koetsier's book, could they please check his 4 examples? Another idea is to look for the French expression "généralité de l'algèbre", which might be more common than the English term. Tkuvho (talk) 08:21, 2 May 2011 (UTC)[reply]
The rewrite makes sense, so that is great. I agree with you re notability, but OTOH it is now harmless, so I don't care much. Q: we now have "Euler's vision of a general partial differential calculus for a generalized kind of function" hanging on in the further reading section; the relevance of that is unclear. William M. Connolley (talk) 08:23, 2 May 2011 (UTC)[reply]
Smithies reports that many authors used the phrase before Cauchy:
F. Smithies, "Cauchy's conception of rigour in analysis", Archive for History of Exact Sciences, 1986 (Springer): "... An author appealing to this principle would usually use some such phrase as 'the generality of algebra' or 'the generality of analysis'; many such appeals can be found in the writings of Cauchy's immediate predecessors."
Tkuvho (talk) 08:35, 2 May 2011 (UTC)[reply]

I think I may have also seen "generality of algebra" used to refer to more than just infinite series: also to the justification that things that aren't regular numbers could be treated like numbers and manipulated according to the same rules, even where the operations and functions had not been defined for them. For example, that you can treat i or something else like a regular number when taking derivatives, or that you can just throw a negative into a logarithm and manipulate it algebraically to get an answer that makes sense, etc. No references at the moment, but I think I saw it mentioned in multiple places when I was reading about the history of Euler's formula (maybe a JSTOR article or a PDF from somewhere)...
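(An editor's sketch of the sort of manipulation the comment above appears to have in mind, offered only as an illustration: treating i as an ordinary number in the exponential and trigonometric series leads to

    \[ e^{i\pi} = \cos\pi + i\sin\pi = -1, \]

and then applying the usual rules for logarithms to the "impossible" argument -1 gives

    \[ \log(-1) = i\pi, \]

a finite, sensible answer obtained by pushing familiar algebraic rules past the domain where they had been justified.)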

analytic continuation

Hi G, thanks for your edits. I am not sure why you deleted my references to analytic continuation. Obviously, if you want to use the formula beyond the radius of convergence, you have to extend it first. How do you propose to extend it if not by analytic continuation? Tkuvho (talk) 16:18, 2 May 2011 (UTC)[reply]

I removed the references to analytic continuation because I believe you are misusing the term. Analytic continuation does not magically extend the domain of convergence of a power series - instead, it says that an analytic function may be represented by different power series in different parts of the complex plane. Analytic continuation forms part of the more rigorous approach to analysis introduced by Cauchy and Weierstrass as a reaction against the less watertight arguments of earlier mathematicians. When you imply that Grandi used analytic continuation, you are rewriting mathematical history. Gandalf61 (talk) 16:21, 2 May 2011 (UTC)[reply]

Not really. I am just explaining what he did in terms that a modern reader can understand. Obviously these techniques anticipate analytic continuation to a certain extent, but there is no need to emphasize this point. Tkuvho (talk) 03:41, 3 May 2011 (UTC)[reply]

There is a detailed description of Grandi's methods at History of Grandi's series. He did not use analytic continuation or anything like it. A more fundamental problem with your claims is that you just cannot use analytic continuation to assign a value to the power series 1 − x + x^2 − x^3 + ⋯ at x = 1 - the power series plainly does not converge at that point, therefore it has no limit value there. Grandi's arguments were ingenious but wrong because he and his contemporaries did not have a rigorous understanding of limits or convergence. This was exactly Cauchy's point. Gandalf61 (talk) 08:04, 3 May 2011 (UTC)[reply]
OK, thanks, I was not familiar with Grandi's method in detail. In other cases, I think, generality of algebra does "work" by assigning to a series a value provided by the analytic continuation of the sum, but I don't have examples right now. Perhaps we could follow up the suggestion above to examine Euler's arguments related to values of the zeta function at negative integers? At any rate, this is not a major point, and one of the historians (Smithies, if I recall correctly) specifically warns against taking the analogy with analytic continuation too far. Tkuvho (talk) 08:19, 3 May 2011 (UTC)[reply]
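(To make the distinction in this exchange concrete - an editor's sketch, assuming the series at issue is the geometric series underlying Grandi's series: for |x| < 1 one has

    \[ \frac{1}{1+x} = 1 - x + x^2 - x^3 + \cdots. \]

The function 1/(1+x) is analytic at x = 1, where it takes the value 1/2, and it is this extension of the function beyond the disc of convergence that attaches the familiar value 1/2 to Grandi's series; the power series itself still diverges at x = 1. Analytic continuation extends the function, not the region of convergence of this particular series.)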

Dubious

Contrary to what is currently written in the article, the term was in use well before Cauchy, as the references above testify. Tkuvho (talk) 16:26, 2 May 2011 (UTC)[reply]

The only specific quotation that I can find for the phrase is a passage from Cours d'Analyse that says "As for methods I have sought to give them all the rigour that one requires in geometry, so as never to have recourse to the reasons drawn from the generality of algebra" - [2]. If you wish to demonstrate earlier use, just provide one specific example, with source, of a mathematical author who used the term before Cauchy. Assertions without sources are worthless. Gandalf61 (talk) 16:39, 2 May 2011 (UTC)[reply]
At wiki the test of an assertion is not its truth but whether it is sourced. If a reliable historian says the phrase was used before Cauchy, we can rely on that. On the contrary, we generally avoid relying on primary sources. Tkuvho (talk) 17:04, 2 May 2011 (UTC)[reply]
Dead wrong. Interpretation of primary sources is discouraged on Wikipedia. However, WP:PRIMARY says that a primary source may be used to support "straightforward, descriptive statements that any educated person, with access to the source but without specialist knowledge, will be able to verify are supported by the source" - citing a source as an instance of the use of a phrase is an example of such a straightforward, descriptive statement. However, I am bored with this pointless debate, so I have changed "coined" to "used" to placate you. Gandalf61 (talk) 17:50, 2 May 2011 (UTC)[reply]
"However, I am bored with this pointless debate, so I have changed "coined" to "used" to placate you." - Gandalf61 : Hilarious! If only such temperance were more common in Wikipedia. KlappCK (talk) 19:38, 3 August 2011 (UTC)[reply]

Lutzen

An editor insists on deleting the following item:

  • J. Lützen, "Euler's vision of a general partial differential calculus for a generalized kind of function", in Sherlock Holmes in Babylon and Other Tales of Mathematical History, 2004, p. 354.

Lützen is a distinguished historian, and if there is anyone competent to speak about the generality of algebra, it would be Lützen. This deletion is odd. Tkuvho (talk) 09:00, 3 May 2011 (UTC)[reply]

This deletion was explained in an edit comment. If you want to look neutral, you should acknowledge that. If L says something interesting and relevant, then add it to the article, or put it on the talk page for discussion. The article is not, currently, over-long. William M. Connolley (talk) 10:29, 3 May 2011 (UTC)[reply]
I understand that you left an "edit comment", and I look forward to a productive collaboration on this page, but I don't accept the "edit comment". Tkuvho (talk) 14:48, 3 May 2011 (UTC)[reply]
More specifically, I feel that the paper is important and should be visible to all editors working on generality of algebra without having to go to the talk page. Tkuvho (talk) 14:49, 3 May 2011 (UTC)[reply]
Well, if the paper is indeed important, could you perhaps quote some of the important observations made in it? William M. Connolley (talk) 15:58, 3 May 2011 (UTC)[reply]

Ghosts of departed quantities

If you liked this page, you might also like Ghosts of departed quantities. William M. Connolley (talk) 16:03, 3 May 2011 (UTC)[reply]