If you want to learn abstract algebra for the first time and you're anything like me, don't just read a book about it. It'll start to sound like a bunch of abstract nonsense. Eventually you'll have the skills needed to "bring it alive" for yourself with a set of concrete examples that you'll learn to refer to again and again. I would recommend starting by watching some lectures about it (you can find some here [1] under the lecture schedule).
He has a very engaging style. I'm about 1/3 of the way through, but had to pause for other commitments. It's mostly possible to track down the homework questions he sets, and solutions for them.
I don't see how a lecture would improve things over a book (assuming you still can't ask questions), though it does of course depend on the book. I'd just pick up Pinter's 'A Book of Abstract Algebra'—lots of good concrete examples, one of the most readable math texts (in any subject) I've come across.
Edit: also, there's a Dover edition you can get for < $15.
Experienced lecturers might have insights into what kinds of difficulties students typically experience with the material, leading them to organize their lecture in a way that is different from a book. Then again, experienced lecturers might also write books in which they consider the same things.
It's the difference between someone mixing up a 30,000-piece jigsaw puzzle and throwing it on the floor, versus that person laying out a framework, with key pieces where they need to be.
Having a set of concrete examples in mathematics is like unit testing in software. The second you get stuck on a theorem, you fall back on your examples. The second you run out of examples that explain a behavior or new theorem, you write down new examples.
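The analogy can be made literal. Here's a quick Python sketch that "unit tests" the group axioms on a concrete example, the integers mod 6 under addition (the example and the checks are my own illustration, not from any particular textbook):

```python
# "Unit test" the group axioms on Z_6, the integers mod 6 under addition.

n = 6
elements = range(n)
op = lambda a, b: (a + b) % n  # the group operation

# Closure: combining two elements stays inside the set
assert all(op(a, b) in elements for a in elements for b in elements)

# Associativity: (a + b) + c == a + (b + c)
assert all(op(op(a, b), c) == op(a, op(b, c))
           for a in elements for b in elements for c in elements)

# Identity: 0 leaves every element unchanged
assert all(op(a, 0) == a and op(0, a) == a for a in elements)

# Inverses: each a is cancelled by (n - a) % n
assert all(op(a, (n - a) % n) == 0 for a in elements)

print("Z_6 passes the group axioms")
```

The moment a conjecture fails on a concrete structure like this, you've learned something, exactly as with a failing test.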
I would diversify my sources. i.e. Sometimes it takes seeing the same thing reworked in another book or in lecture instead of a book to 'get it'. The general rule of thumb is you need at least 3 different solutions or approaches to a problem to really be able to say you know "what's going on."
For a counterpoint, read Shafarevich's "Basic Notions of Algebra". (Beware: the author's meaning of "basic" may differ from your own.)
It's a book that explains all those abstract algebra concepts by introducing motivating examples for each one of them. It's the next best thing to finding the natural setting for them yourself.
Agreed. Google images is a surprisingly decent source of visual examples for intuition building/testing, although the signal-noise ratio is a crapshoot.
The linked PDF also has an online version which is superior in many ways because it has inline executable Sage examples. It can be found at: http://abstract.pugetsound.edu/aata/
Also, this textbook is one of several open source textbooks developed using MathBook XML which allows authors to create multiple output formats such as PDF, HTML and ePub from one canonical source document written in XML. If you are interested in learning about MathBook XML you should check out: http://mathbook.pugetsound.edu/
Evariste Galois always intrigued me. He died aged 20 in a duel, supposedly over some love story, yet he apparently had "time" to lay the foundations of some pretty serious math. To quote the intro to the last chapter of this book, which introduces Galois theory:
"[...] attempts to solve the general fifth-degree, or quintic, polynomial were repulsed for the next three hundred years [...] no solution like the quadratic formula was found for the general quintic [...] Finally, at the beginning of the nineteenth century, Ruffini and Abel both found quintics that could not be solved with any formula. It was Galois, however, who provided the full explanation by showing which polynomials could and could not be solved by formulas. He discovered the connection between groups and field extensions. Galois theory demonstrates the strong interdependence of group and field theory, and has had far-reaching implications beyond its original purpose."
My mom was actually reading a novel about him (A Beautiful Mind kinda style) last summer. I wonder how he would have turned out if he had not passed away so young.
Surprisingly, the only other mathematician who died really young (aka younger than Jim Morrison and co) is Niels Henrik Abel, also mentioned in the quote. Makes you wonder how healthy algebra is, doesn't it ;p
"The tragedy of Galois is that he could have contributed so much more to mathematics if he'd only spent more time on his marksmanship" - Olin Shivers, supposedly quoting his PhD advisor
When I took Abstract Algebra as part of my Math degree, we used Gallian, Herstein, and Judson. Gallian had a lot of examples and felt very much like the textbooks I used in high school (colorful, lots of prose, etc.). It is incredibly easy to get into, but sometimes you've understood something and don't need 25 more examples.
Herstein and Judson were both less verbose (sometimes to the point of being unhelpful), but I'd still recommend them. I think I learned a lot from being exposed to different books. I could always look at another book if one book's explanations, examples, proofs, or exercises were an issue. Math Stack Exchange was also invaluable because many of the proofs I saw there were very different from the ones I saw in class or in textbooks.
If it is like other edition changes, you can expect about four typo-fixes, a new sticker on the shrink wrap, a 7-10% price increase, and a single-use code for online tools that make the professor's life easier. Every third edition, a "new and expanded" final chapter and an intro change, and a 20% price bump.
I used to date a textbook marketing manager. It was a joke internally, too.
You forgot the part where you rearrange the questions so that students have to get the latest edition to be able to get the correct question 2 on their assignment.
I used to read textbooks for Reading For the Blind and Dyslexic, and in a number of cases I found that the solutions in the back of the book were not for the same exercises in the front. Somebody had rearranged the exercises, but not the solutions.
For self study of abstract algebra, I recommend Dummit & Foote. I used it as a supplement to my undergrad algebra classes and found it to be very useful. This is in contrast to books like Artin that leave a lot to the reader; while not necessarily bad, they can be difficult for self-study.
I've learned from both in various undergrad and grad courses and I have to say I like Artin better. Dummit & Foote is a little too dry. I recommend Artin and Dummit & Foote as a supplement.
> For self study of abstract algebra, I recommend Dummit & Foote. I used it as a supplement to my undergrad algebra classes and found it to be very useful.
I've almost-exclusively seen "Commutative Groups" referred to as "Abelian Groups". Is there a reason you use "Commutative Groups"? I'll admit my algebra is quite limited and I may be missing a subtlety.
There is no subtlety here. That groups which are commutative are referred to as "abelian" is purely historical. Referring to them as commutative helps understanding for people who are not familiar with them, and helps highlight the structural similarity that the diagram is attempting to show.
Sadly, I find it unlikely that we will get mathematicians to agree to stop calling them Abelian groups, so learners will have to learn that name eventually.
When I was a physics major at UCI the abstract algebra course ended up being my favorite course. The prof was young, and a real hard ass. Apart from the usual, he forced us to memorize proofs and regurgitate them for tests, as well as come up with original proofs. That might sound draconian (it certainly did to me; none of my real analysis classes asked that of us) but it turned out to be really hard to do unless you understood the entire proof. And it also turns out that if you understand a proof, or a set of them, you can produce new ones.
Of course, almost everyone failed the course. One guy got an A, a couple of us got C's (I was one) and the rest got F's. Never been prouder of a C in my life.
Your analysis exams didn't require bookwork? For someone who has done four years of maths at Cambridge, this idea is dumbfounding: about a third of each exam was bookwork. (Except, in my year, Category Theory, in which there was no bookwork at all. Of course I don't hold a grudge about how much time I spent learning the proofs of the monadicity- and adjoint-functor theorems.)
I'd be appalled if a third of an exam was "reproduce this proof of this theorem". However, if a third of the exam is "Produce a proof of this theorem" and people don't care about whether the proof is original or reproduced, that seems fine.
Certainly, being able to prove any theorem you rely on is important. But being able to recall how any given theorem was proven in a book isn't.
Not sure what you mean by "bookwork". Is that what memorizing proofs is called in the UK? Most of the exams and homework were more typical math exercises, e.g. little proofs and calculations within the context of special cases.
I mean, I remember getting tested on parts of the "epsilon delta" definition of a limit in analysis, for example, but I don't recall ever being asked to regurgitate the entire thing correctly. It's funny but I don't even associate important tools like Taylor series with "proofs", only applications. The lecturer would sometimes prove things, but we were never tested on the proof, only the ability to apply the result. Mind you, I was physics not math so perhaps that's the difference. In fairness there is a LOT of math tools to learn, and I'm not sure there's really time to get into proving everything as an undergrad, but perhaps that's a mistake.
Wiktionary's definition of "bookwork" is accurate: "The act of memorising information; used attributively to describe or denote questions that test information learned rather than requiring additional thought." It notes that the definition is "chiefly University of Cambridge", which I didn't realise. Perhaps it is indeed a physics/maths thing.
I would tend to say: It's the basis of everything. Here are some examples that were not given in other comments:
- Graph theory is pretty algebra-heavy, and graphs are everywhere.
- Static analysis, in particular abstract interpretation, relies heavily on lattices and various other algebraic structures.
- All the "highly functional" structures (you know, monads and the like) are an off-shot branch of algebra.
- Patch theory (git, darcs and other versioning systems that rely on patches). You get nice stuff if you use algebraic properties (such as having your patches commute, and things like that).
I don't know about very specific applications, but I think the field itself is very complementary to CS, especially when considering finite or discrete structures. As its name suggests, abstract algebra abstracts "nice" properties of, e.g., the integers and formalizes them in a very concise and general manner. Modern abstract algebra is deeply tied to category theory, and so now these "nice" properties get abstracted even further, to maps between objects and to maps between categories (i.e., functors). As such, abstract algebra is tied to functional programming on some level (I know nothing about this connection though).
Linear algebra is a subfield of abstract algebra, and lots of general theorems about what classes of matrices are diagonalizable, or what their eigenvalues look like, etc. are within the purview of abstract algebra. These types of results are relevant to many algorithms, e.g., page rank.
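As a toy illustration of that connection, here's a minimal PageRank sketch in pure Python: power iteration converges to the dominant eigenvector of the damped link matrix. The three-page web graph and the damping factor are invented purely for the example:

```python
# Toy PageRank via power iteration. Repeatedly applying the (damped)
# link matrix to a rank vector converges to its dominant eigenvector,
# which is the stationary distribution of the random-surfer process.

links = {0: [1, 2], 1: [2], 2: [0]}  # page -> pages it links to
n = len(links)
d = 0.85                             # damping factor
rank = [1.0 / n] * n                 # start uniform

for _ in range(100):
    new = [(1 - d) / n] * n
    for page, outs in links.items():
        for target in outs:
            new[target] += d * rank[page] / len(outs)
    rank = new

print([round(r, 3) for r in rank])   # converged rank vector, sums to 1
```

The algebraic fact doing the work is that the iteration matrix has a unique dominant eigenvalue, so the iteration converges regardless of the starting vector.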
Aside from that, I think abstract algebra is quite a beautiful field in its own right. Two books I would recommend are Artin's Abstract Algebra (as an intro) and Lang's Algebra (more advanced, good bridge into the category theory perspective).
I second the recommendation for Artin's Algebra. I'd also recommend Paolo Aluffi's Algebra: Chapter 0, which is a nice alternative to Lang and also uses category theory right from the beginning and doesn't require many prerequisites.
Monoids, groups, rings, fields, etc. are all very relevant to cryptography. In general algebraic constructions come up more often than you might think. The idea of a homomorphism though not specific to algebra is prevalent in almost any formal domain. Compilers by and large can be considered homomorphisms that preserve certain semantic properties which if you dig deep enough can be expressed as algebraic structures: http://www.logicmatters.net/resources/pdfs/Galois.pdf.
Developing an intuition for all those things is best accomplished by studying abstract algebra.
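To make the homomorphism idea concrete, here's a tiny Python check of the textbook example: len is a structure-preserving map from lists under concatenation to integers under addition (my own illustration):

```python
# len is a monoid homomorphism: it turns concatenation of lists
# into addition of integers, and maps the identity to the identity.

xs, ys = [1, 2, 3], [4, 5]

# Structure preservation: len(xs ++ ys) == len(xs) + len(ys)
assert len(xs + ys) == len(xs) + len(ys)

# Identity preservation: the empty list maps to 0
assert len([]) == 0
```

A compiler "preserving semantics" is the same shape of statement, just with a much richer structure than list length.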
Monoids aren't just relevant to cryptography. Any structure with an associative "+" operator forms a monoid (technically semi-group, but whatevs.) Lists are a prime example.
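For instance, a quick Python check that lists satisfy the monoid laws, with the empty list as the identity element (illustrative only, not library code):

```python
# Lists under concatenation form a monoid: "+" is associative
# and the empty list is the identity element.

xs, ys, zs = [1, 2], [3], [4, 5]

# Associativity: grouping doesn't matter
assert (xs + ys) + zs == xs + (ys + zs)

# Identity: concatenating with [] changes nothing, on either side
assert xs + [] == xs and [] + xs == xs
```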
Smith is a great expositor. I had the joy of participating in a category theory reading group with him; he really knows his stuff. It's refreshing to be taught by a mathematician/philosopher rather than simply a mathematician.
Most of the modern theory of error-correcting codes has its roots in abstract algebra. It's also important in theory - Babai's recent celebrated quasipolynomial algorithm for graph isomorphism relies on group theory. Most of the recent results in cryptographic obfuscation also rely on group theory because they encode circuits as permutation matrices using Barrington's theorem.
Guy Steele [1] sometimes mentions it. He gave an interesting Google Tech Talk called Four Solutions to a Trivial Problem [2], and at 1:59 [3] he said:
Also, algebraic properties are important. I've got a background in applied algebra, and I think that has informed my programming. And I think that making programmers aware of algebraic properties of their code, and communicating some of those properties to the compiler, may be worthwhile.
The language Fortress [4], which he was one of the language designers of, allowed one to explicitly provide the compiler with such information - you could say that a certain operation was distributive or associative for instance, and the compiler could then do some refactorings and optimizations taking this knowledge into account.
I think that Abstract Algebra has the same relationship with CS as Linear Algebra has with the theory of most engineering disciplines. That is to say that in computer science, Abstract Algebra is the natural setting to define and decompose problems and design their solutions. For a specific example, CRDTs are a fundamentally algebraic approach to problem solving in computer science. They were discussed recently on HN here. [0] If you want to go further down that rabbit hole, Joseph Goguen spent a large portion of his career working on applications of Abstract Algebra to computer science. He produced a category-theory-focused introduction here. [1]
As Stepanov said: “[Generic] algorithms are defined on algebraic structures”. He even gives an example: “I realized that a parallel reduction algorithm is associated with a semigroup structure type”. If you care about algorithms being maximally reusable (as in “just reuse”, rather than “tweak and reuse”), you definitely want to adopt an abstract-algebraic mindset.
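Here's a minimal Python sketch of that point: because the operation is associative, a reduction can be split into chunks, the chunks reduced independently (in parallel, in principle), and the partial results combined. The chunking scheme is my own toy illustration:

```python
# Associativity is what licenses parallel reduction: the input can be
# split into chunks, each chunk reduced independently, and the partial
# results combined, with the same answer as a left-to-right fold.
from functools import reduce

def chunked_reduce(op, xs, chunks=4):
    size = max(1, len(xs) // chunks)
    partials = [reduce(op, xs[i:i + size])
                for i in range(0, len(xs), size)]
    return reduce(op, partials)  # valid only because op is associative

data = list(range(1, 101))
assert chunked_reduce(lambda a, b: a + b, data) == sum(data)  # 5050
```

Swap in a non-associative operation (say, subtraction) and the chunked result diverges from the sequential one, which is exactly why the semigroup structure matters.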
Crypto is probably a better way to learn abstract algebra than the other way around, for whatever that's worth. You don't need more than a surface level understanding of abstract algebra to do fairly serious crypto work, but a lot of abstract algebra made more sense to me after a few years of crypto.
It has a lot of application in functional programming languages, thinking of my programs algebraically has had a meaningful effect on what feels like "clearer expression of thought" in my code.
Do you mean to say a finite ring? "Pure" integer arithmetic in math (without any notion of overflow or wrapping) is a ring also, so I'm not sure what distinction you're making by saying "in hardware".
There are languages that overflow from hardware integers into some kind of BigNum automatically. In those languages the integer type is not a finite ring. (Well, OK, it's finite in the sense that eventually the memory space will be exhausted...)
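For what it's worth, the wrap-around behavior of fixed-width integers really does satisfy the ring laws. Here's a Python sketch simulating 32-bit arithmetic with an explicit mod (Python's own ints are unbounded, so the wrapping is made explicit):

```python
# Fixed-width machine integers behave like the finite ring Z/2^32:
# addition and multiplication wrap around modulo 2^32.

M = 2 ** 32

def add32(a, b):
    return (a + b) % M

def mul32(a, b):
    return (a * b) % M

# Overflow wraps: the maximum 32-bit value plus one is zero
assert add32(M - 1, 1) == 0

# The ring laws still hold, e.g. distributivity
a, b, c = 0xDEADBEEF, 0xCAFEBABE, 12345
assert mul32(a, add32(b, c)) == add32(mul32(a, b), mul32(a, c))
```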
One way to view it, is that all other fields with the word "algebra" in them, are applications of Abstract Algebra. That includes Linear Algebra, Relational Algebra, and Boolean Algebra, which you might already be familiar with and know their relevance to CS.
Category theory is considered a field that is associated with algebraic skillsets.
Edit: Cryptography is specifically associated with algebraic number theory.
Abstract algebra is also used in algebraic geometry, which also comes into play with facial recognition software, and maybe with fingerprinting as well.
Every reply to my parent question has been excellent. What beginner resources do you recommend for learning category theory and its applications to programming?
I'd suggest learning it via Haskell or maybe Scala if you've found programming helps your understanding of mathematics as I have. Learning via doing, internalizing by example, then diving deeper into the then-easier theory is how I like to go about things.
I took an Abstract Algebra course at a local university a couple of years ago. I first learnt how to do proofs from Spivak's Calculus; learning to do proofs felt very similar to writing code. I really enjoyed doing Abstract Algebra proofs. Got the top score in the course and promptly forgot everything about it after a couple of months. Back to the day job fixing garbage collection on poorly written Java apps.
[1] https://www.math.upenn.edu/~ted/371F14/math371.html