## A Mathematical Notation Suggestion

At the risk of offending thousands of university professors, I have a suggestion for the IEEE journal/magazine editors regarding signal processing manuscripts submitted for publication. My suggestion is: any manuscript author who insists on using symbolic-logic or set-theory notation such as

be forced to watch 30 continuous hours of infomercials on American cable television.

I agree with mathematical notation expert Knuth (and his coauthors)[1]: "Don't use the symbols ......; replace them by corresponding words. Except in works on logic, of course."

[1] D. Knuth, T. Larrabee, and P. Roberts, *Mathematical Writing*, Washington, DC: The Mathematical Association of America, 1989.

I think many of these kinds of abbreviations are left over from the days when saving time during handwriting, or saving money on typesetting, or saving money on paper or print space, was important. Those limitations are very rare these days, so I think many such abbreviations no longer serve a purpose and are more likely to cause confusion.

**Except:** Rick may want to get input from academics whose first language is not English. On the one hand, I'm certain there are folks out there who can barely read the text but can follow the math just fine -- on the other, I have no idea what proportion of the total readership that is.

There's no need to discuss this topic with today's academics. Modern academics are the cause of this confusion in mathematical notation! How many classic papers from the past on science, mathematics, or signal processing (or papers written by Nobel Prize winners) used epsilons and upside down A's? Can you show us any?

Your confrontational replies to my posts tickle me. If I post "Batman is a superhero." you're likely to reply with "Batman is a wealthy white guy who beats up mentally ill people."

Well, he was! Those poor murderous psycho-killers!

I didn't mean to be confrontational. Correcting laziness in writing is good, and I'm all for it -- unless there's some unintended consequence.

What triggered my original post was the last sentence below (prior to the figure), from an article in the most recent issue of the IEEE Signal Processing Magazine titled "Analog-to-Digital Compression".

I shudder to imagine a mathematical exposition containing something like

"x element of domain of Real numbers greater than zero "

D. Knuth's TeX language provides easy-to-read abbreviations that translate into those mathematical signs; LaTeX improves on this somewhat.

I personally like to use LibreOffice Math to create such formulas; the source is fast to type and still pretty readable as text.

Therefore I'm used to reading formula source like: x in setR^+

and even the more precise form, {x} in {setR}^{+}, is still readable.
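For comparison, here is a sketch of the same expression in LaTeX notation (assuming the LibreOffice `setR` corresponds to LaTeX's `\mathbb{R}` from the `amssymb` package):

```latex
% "x is a positive real number"
% LibreOffice Math source:  x in setR^+
% LaTeX source:
$x \in \mathbb{R}^{+}$
```

Either way, the source reads almost like the spoken sentence, while the rendered output shows the traditional symbols.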

So one might agree on any such formula system. But as far as I know, the big browser vendors already have such an arrangement... so why reinvent it?

But don't underestimate the burn-in effect in our brains.

If I read a term over and over again, I have an ever-present picture of it in my head that I automatically link to its meaning, just as I automatically map a photograph to the person it shows.

This helps so much with understanding that we would lose a lot by replacing these symbols. For example, try to memorize this simple formula from its source form:

f(x)=sum from {{i=0}}to{infinity}{{f^{(i)}(0)} over {i!} x^i}
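Rendered by a math translator, that source becomes the familiar Maclaurin series, which most readers recognize at a glance:

```latex
f(x) = \sum_{i=0}^{\infty} \frac{f^{(i)}(0)}{i!}\, x^i
```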

So I would vote for something like this LibreOffice syntax, or TeX/LaTeX, for writing; the codes are learned pretty quickly. But for viewing, I vote for the original mathematical symbols.

At least, as long as they are not as complicated as the strange expression in your discouraging example, Rick.

And throughout this forum, we could have the server enhanced with a math translator (maybe LaTeX-based), just as a playground, to find out whether mathematical formulas are easy to write, whether the source code remains roughly readable, and whether it can be automatically translated into pretty rendered formulas.