
Rigor and mathematical terminology make it simpler and easier, because you can say more things with fewer words and more precision. Mathematicians are not masochists. Compared with truly understanding the theorems and the concepts, the effort to understand the terminology is basically zero.


I would say that the main advantage of mathematical notation is the opposite of rigor. You get to avoid writing everything out, which is good for both the lazy reader and the lazy writer.

It is just not very good for the newcomer who needs to learn the implicit assumptions that are never written out.


> because you can say more things with fewer words

You say it as if it were a good thing. It's not. APL, J, and K would reign supreme over all programming if brevity and conciseness were really that helpful for people actually understanding what the heck is happening.

Math notation is a peculiar, loosely defined, context-dependent, ambiguous syntax that requires a lot of memorization, a special keyboard when writing, and a lot of focus when reading. It only benefits people forced to write equations on blackboards all day long. I mean, I feel for them, it's a tough job, but I'm not going to be doing that.

> the effort to understand the terminology is basically zero

You say it as if it were a fact. I don't believe it's anything other than a gut feeling you have. It's trivial to find people for whom the effort needed to understand the terminology or syntax was too big a barrier to entry. If you could show a proper, replicated study from scientists working on cognition and memory that proves this statement (the zero cost of alien terminology), that would be great. Otherwise, I see this as a gut feeling coupled with survivorship bias.


> Math notation is a peculiar, loosely defined, context-dependent, ambiguous syntax

That's the point. This is what going all-in on formality looks like:

    /-- If n is divisible by m, then the set of divisors of m is a subset of the set of divisors of n. -/
    lemma divisors_subset_of_dvd {n m : ℕ} (hzero : n ≠ 0) (h : m ∣ n) : divisors m ⊆ divisors n :=
    finset.subset_iff.2 $ λ x hx, nat.mem_divisors.mpr ⟨(nat.mem_divisors.mp hx).1.trans h, hzero⟩
where each of those names is a reference to another proof - the full call tree would be far, far worse.

Compare that to a handwritten proof:

Let x be a divisor of m. Then there exists some y such that m = x * y. n is divisible by m, so there exists some k such that n = m * k. Thus n = (x * y) * k = x * (y * k) and x is a divisor of n.


> You say it as if it was a good thing. It's not. APL, J, and K would reign supreme over all programming if brevity and conciseness were all that good for people actually understanding what the heck is happening.

On the flip side, try reading all your programs in assembly.

Verbosity is nice up to a point. When I look at the math I did for physics, solving a problem could take 3 pages. With the GGP's approach, it would take perhaps 15-20 pages. Almost everyone would grok the 3 pages more quickly than a more verbose 15.

It's actually why some prefer functional programming. Which is easier to understand:

"Take these student papers, and split them into two groups based on whether the student's name begins in a vowel or not."

OR

"Create two groups. Call them vowels and consonants. Take the first paper. If it begins with a vowel, put it in the vowel group. Otherwise put it in the consonant group. Once done with that paper, repeat the steps with the next paper. Keep repeating till there are no papers left."

And I won't even bother describing how one would do it in C (have a counter variable, at the end of each iteration explicitly check for termination, etc).

The difference between math formalism and verbosity is analogous to the difference between the two descriptions above. At some point, more verbosity lets you see the fine details at the expense of the big picture.
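The two paper-sorting descriptions above can be sketched in code. A minimal Python sketch (the function names and the vowel test are my own, purely illustrative):

```python
VOWELS = set("aeiou")

# Declarative style: say *what* you want -- split the names into two
# groups based on whether they begin with a vowel.
def split_by_vowel(names):
    vowels = [n for n in names if n[0].lower() in VOWELS]
    consonants = [n for n in names if n[0].lower() not in VOWELS]
    return vowels, consonants

# Imperative style: spell out *how* -- create two groups, take each
# paper in turn, place it, and repeat until no papers are left.
def split_by_vowel_imperative(names):
    vowels, consonants = [], []
    for n in names:
        if n[0].lower() in VOWELS:
            vowels.append(n)
        else:
            consonants.append(n)
    return vowels, consonants
```

Both produce the same result; the first reads like the one-sentence description, the second like the step-by-step one.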

> It's trivial to find people for whom the effort needed to understand the terminology or syntax was too big of a barrier to entry.

It's almost impossible to find someone who can do, say, Griffiths-level electromagnetics or quantum mechanics without that formalism. Your refrain pops up all the time on HN, but I have yet to see anyone do even undergrad-level physics in any of the alternatives suggested.


> Verbosity is nice up to a point.

Yes, agreed. But so is brevity. Up to a point, it's good. Beyond that point, it's a needless burden that could be eliminated.

> It's almost impossible to find someone who can do, say, Griffiths-level electromagnetics or quantum mechanics without that formalism.

I'm not saying that the formalism is useless. I'm saying it's not zero-cost. I'm against handwaving away the difficulty of working with the specific syntax because "concepts!"

Again, show me that learning to use the notation is not a problem, objectively, and then we can talk. Otherwise, you're just saying that "you just have to learn it, I did it and it wasn't that hard". OK, but that's not a proof that it isn't hard or it isn't a barrier to entry that could be lowered.


> Again, show me that learning to use the notation is not a problem, objectively, and then we can talk. Otherwise, you're just saying that "you just have to learn it, I did it and it wasn't that hard". OK, but that's not a proof that it isn't hard or it isn't a barrier to entry that could be lowered.

It's always believable that the barrier can be lowered. However, consider that on the one hand, you have millions of people who are quite comfortable with the current notation. On the other hand, there is ... nothing.

As I said, show me the alternative notation in which people are comfortable solving Griffiths-level EM/QM problems.

I've heard this complaint for years, particularly on HN: an insistence that a superior notation must exist, that decades of software experience show this level of brevity makes working in a field harder, and so on. Yet no one has come up with an alternative in which one can solve higher-level physics problems while maintaining sanity.

The status quo is that we have a widely used system that works. The burden is on those who claim it can be better to come up with something better.


> The burden is on those who claim it can be better to come up with something better.

We need to agree to disagree: in my mind, the burden is on those who say it's the best it can be to show that it indeed cannot be better, because otherwise their insistence on not even looking for improvements looks drastically different. If you can show me that math notation is aligned as closely with how cognition works as possible without sacrificing its usability, great: you're right, I concede. OTOH, if the only thing you say is that it worked for a long time, worked for you, and therefore you're not interested in doing anything to make it work better, that strikes me as simply elitist.

The other problem is that nobody who is not deeply involved with math cares enough to take a closer look. How many linguists, psychologists, cognitive scientists invested their time into researching ways of making math notation better? I bet even fewer than the ones who tried researching programming. On the other hand, mathematicians are simply not equipped with knowledge and skills required to objectively assess the notation they use (neither are programmers, BTW.)


> In my mind, it's on those who say it's the best it can be to show that it indeed, cannot be better.

Indeed. The issue is that neither I nor most people are claiming it to be the best. I explicitly pointed this out in another comment.

> OTOH, if the only thing you say is that it worked for a long time, worked for you, and therefore you're not interested in doing anything for it to work better - that strikes me as simply elitist.

How is that elitist? If it works for me, why should I spend time making it better? What do I gain from it?

And this comment doesn't even make sense. Mathematicians invent notations for their own convenience all the time. There's no committee that says, "Yes, this is the official accepted notation." A mathematician uses whatever notation works for them, and if others find it useful, they adopt it.

> The other problem is that nobody who is not deeply involved with math cares enough to take a closer look. How many linguists, psychologists, cognitive scientists invested their time into researching ways of making math notation better? I bet even fewer than the ones who tried researching programming. On the other hand, mathematicians are simply not equipped with knowledge and skills required to objectively assess the notation they use (neither are programmers, BTW.)

You're not wrong, but you're also not helping. This is basically saying "Look, someone should do this!" If you think it's worthwhile, go for it. Most of the professionals you mention (linguists and so on) do not see it as worthwhile. Put yourself in their shoes: are they really going to invest a lot of effort to unseat a notation that has evolved over so many centuries, and then fight a battle to convince people to use the replacement? That may well be a career killer.

And here is where the two of us will have to disagree: any improvement, although it may be great for newcomers and amateurs, will barely affect the productivity of a professional mathematician. As people have repeatedly pointed out, notation is among the least challenging parts of math. Sure, it is a barrier to entry, but at best you're simply lowering that barrier; a better notation won't benefit people who are already good at mathematics, and it will not enable them to suddenly grasp concepts they couldn't. That's why mathematicians don't bother.

To be frank (and I say it in all seriousness), the English language has more problems than the mathematical one, and if we could fix those, it would have a much larger impact.



