Simplicity is the ultimate sophistication. --Leonardo da Vinci

Rule #6 - Adapt and Adopt

The roguish Richard Feynman after winning his Nobel Prize in Physics in 1965.

In 1965, Dr. Richard Feynman won the Nobel Prize in Physics. He was brash, rebellious and rarely passed up an opportunity to point out hypocrisy and irrationality.

For example, when he was a young man working on the Manhattan Project at Los Alamos National Lab, he noticed a big wash under the perimeter fence.

While the gated entrances were heavily guarded, this unprotected gully was deep enough for a man to pass under without bending over.

Feynman walked up to the guarded entrance one day, said hello, then turned, walked down the fence line, walked under the fence and went to work.

Yet another case of the front door being heavily padlocked while the back door swings wide open.

Sounds like a lot of our technology security efforts today. Isn't it comforting to know this stuff has been going on forever?

Anyway, that's not what I want to talk about today.

Feynman's New Trig Notation

When Feynman was much younger, he became annoyed with the standard mathematical notation used for trigonometry and so invented his own. Here is how he describes it in his autobiography:

"While I was doing all this trigonometry, I didn't like the symbols for sine, cosine, tangent, and so on. To me, "sin f" looked like s times i times n times f! So I invented another symbol, like a square root sign, that was a sigma with a long arm sticking out of it, and I put the f underneath. For the tangent it was a tau with the top of the tau extended, and for the cosine I made a kind of gamma, but it looked a little bit like the square root sign. Now the inverse sine was the same sigma, but left -to-right reflected so that it started with the horizontal line with the value underneath, and then the sigma. That was the inverse sine, NOT sink f--that was crazy! They had that in books! To me, sin_i meant i/sine, the reciprocal. So my symbols were better."

— "Surely You're Joking, Mr. Feynman!"

Reference: https://www.physicsforums.com/threads/feynmans-trig-notations.78087/

He goes on to acknowledge that while this notation was better for him, no one else understood it, and therefore it was not very useful.

"I thought my symbols were just as good, if not better, than the regular symbols - it doesn't make any difference what symbols you use - but I discovered later that it does make a difference. Once when I was explaining something to another kid in high school, without thinking I started to make these symbols, and he said, "What the hell are those?" I realized then that if I'm going to talk to anybody else, I'll have to use the standard symbols, so I eventually gave up my own symbols."

— "Surely You're Joking, Mr. Feynman"

Standard ways of doing things make communication and interaction possible. A standard is often not the best way, but it is the common way.

And that makes it the best way.

The reasons one approach gets adopted over another are usually the stuff of history and politics.

Why do Britain and the U.S. still cling to imperial-style units for distance and weight while most of the rest of the world uses the metric system?

I'm sure it has more to do with the powerful influence of the British Empire during a certain period in world history than with any rejection of the greater logical consistency of the metric system.

When something becomes a standard, there is a high cost to changing that standard.

For better or worse we have certain standard notations in math and physics.

Yet notation is completely arbitrary.

You could come up with a thousand different ways to represent written mathematics.

Right or Wrong, Programming Standards Matter

Well, that brings us to programming.

Just as in math, computer programming has its standards...and its battles over standards.

One coder likes the Kernighan & Ritchie (K&R) style for opening braces, while the Microsoft standard for C# is to put the opening brace on its own line.

// K&R
for (int i=0; i<10; i++) {

}


// Microsoft C#
for (int i=0; i<10; i++) 
{

}

Another C# coder prefers to use the var keyword instead of the full, explicit name of the type.

// ---------------------
// Var shortcut
// ---------------------

// This is a bad use of var because the type is not obvious from the line of code
var v = MethodThatReturnsSomething();

// This is a better use of var
var v = new SomeClassType();


// -------------------------------------------
// Explicit specification of the variable type
// -------------------------------------------

// Redundant, but without ambiguity
SomeClassType v = new SomeClassType();



Still another coder uses an "m_" prefix to denote private member variables, while someone else uses just a single underscore and camel casing.

// ----------------------------------
// Private variables prefixed by "m_"
// ----------------------------------

public class MyClass
{
   private int m_MyInt;

   public MyClass()
   {
      m_MyInt = 0;
   }

}

// -------------------------------------------------------
// Private variable prefixed by "_" and using camel casing
// -------------------------------------------------------

public class MyClass
{
   private int _myInt;

   public MyClass()
   {
      _myInt = 0;
   }

}

This rule, Adapt and Adopt, is about the importance of speaking the same language as other developers.

A coding style guide is absolutely essential for any organization. If you don't have one, get one. You can find several good starting points with a Google search.

Then abide by the style guide.

By all means, debate the merits of one approach over another, but in the end make a decision and document it.

Now, having said that, most of your style guide should also reflect the common conventions for your language and coding platform.
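
For instance, suppose the guide documents the common .NET conventions illustrated above: opening braces on their own line, a single leading underscore with camel casing for private fields, and var only when the type is apparent from the right-hand side. Here is a minimal sketch of what code written to that guide might look like; the class and member names are invented purely for illustration.

// --------------------------------------------------------
// Sketch: code following one documented set of decisions
// --------------------------------------------------------

using System.Collections.Generic;

public class OrderProcessor
{
   // Private field: single underscore, camel casing
   private int _processedCount;

   public OrderProcessor()
   {
      _processedCount = 0;
   }

   public void ProcessAll(List<string> orders)
   {
      // var only where the type is obvious from the right-hand side
      var pending = new Queue<string>(orders);

      // Opening brace on its own line, per the documented style
      while (pending.Count > 0)
      {
         pending.Dequeue();
         _processedCount++;
      }
   }
}
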

As much as I loathe certain coding conventions, some of them have become standards and I can't go around changing existing code to look the way I want it to.

When a developer looks at a piece of code, it's many times easier to digest if the style of that code is familiar and consistent (Remember Rule #1?!). It reduces bugs, too.
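
To make the "reduces bugs" point concrete, here is a generic illustration (not from any particular codebase; the type and method names are invented). When a team's guide says braces go on every block and everyone follows it, misleading indentation like the first if below tends to get caught immediately.

// ----------------------------------------------------------
// Misleading indentation that a consistent brace style avoids
// ----------------------------------------------------------

public class AccessDemo
{
   public bool IsValid { get; set; }

   private static void LogAccess(AccessDemo user) { }
   private static void GrantAccess(AccessDemo user) { }

   public static void Check(AccessDemo user)
   {
      // The indentation suggests both calls are guarded by the if,
      // but only the first one is; GrantAccess always runs.
      if (user.IsValid)
         LogAccess(user);
         GrantAccess(user);

      // With a braces-always convention the intent is explicit:
      if (user.IsValid)
      {
         LogAccess(user);
         GrantAccess(user);
      }
   }
}
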

So, if you start in a new organization, don't go all vigilante and start coding in whatever style you want because you think it's better.

It's your job, as painful as it might be, to change, to adapt and to adopt their way of doing things.

However, it's just as likely when you enter an organization that they have no consistent coding standards.

If they don't, that is worth complaining about. Create a style guide and ask them to approve it and enforce it.

Feynman was right. His notation for trig was better. But if something is a common convention used by everyone else, then you'll just alienate yourself by going your own way.

Bottom line, adapt and adopt the standards and conventions until you can get your team or the community to agree on new standards.

It may hurt a little in the beginning, but you'll save everyone, including yourself, a lot of pain down the line.

 
