
John Sonmez is a Pluralsight author of over 25 courses spanning a wide range of topics, from mobile development to IoC containers. He is a frequent guest on podcasts such as DotNetRocks and Hanselminutes. John has created applications for iOS, Android, and Windows Phone 7 using native tools, HTML5, and just about every cross-platform solution available today. He has a passion for Agile development and is on a personal crusade to make the complex simple. John is a DZone MVB (not an employee of DZone) and has posted 88 posts at DZone.

So You Think You Can Polymorph?

04.16.2013
In the true spirit of this blog I am going to take the complex idea of polymorphism and make it as simple as possible.


Now you may already think you understand polymorphism—and perhaps you do—but I’ve found that most software developers don’t actually understand exactly what polymorphism is.

What is polymorphism?

How many times have you been asked this question during a job interview?

Do you actually know confidently what the right answer is?

Don’t worry: if you are like most developers out there in the world, you probably have the feeling that you know what polymorphism is, but are unable to give a clear and concise definition of it.

Most developers understand examples of polymorphism or one particular type of polymorphism, but don’t understand the concept itself.

Allow me to clarify a bit.

What I mean by this is that many times when I ask about polymorphism in an interview, I get a response in the form of an example:

Most commonly, a developer will describe how a shape base class can have a circle derived class and a square derived class, and how calling the draw method through a reference to the shape base class invokes the correct derived implementation without you having to know the specific type.

While this is technically a correct example of runtime polymorphism, it is not in any way concise, nor is it a definition of the actual term.

I myself have described polymorphism in a similar fashion in plenty of job interviews.

True understanding

The problem with offering only that example as an explanation is that it doesn’t demonstrate true understanding of the concept.

It is like being able to read by memorizing words while never learning the phonetics that underlie reading itself.


A good test for understanding a concept is the ability to create a good analogy for that concept.

Oftentimes, if a person cannot come up with an analogy to describe a concept, it is because they lack a true understanding of what the concept is.

Analogies are also an excellent way to teach concepts by relating things to another thing that is already understood.

If right now you can’t come up with a real-world analogy for polymorphism, don’t worry: you are not alone.

A basic definition

Now that we understand why most of us don’t truly understand polymorphism, let’s start with a very basic concise definition.

Polymorphism is sharing a common interface for multiple types, but having different implementations for different types.

This basically means that in any situation where you have the same interface for something but can have different behavior based on the type, you have polymorphism.

Think about a Blu-ray player.

When you put a regular DVD in the player what happens?

How about when you put a Blu-ray disc in the player?

The interface of the player is the same for both types of media, but the behavior is different.  Internally, there is a different implementation of the action of playing a disc depending on what the type is.

How about a vending machine?

Have you ever put change into a vending machine?

You probably put coins of various denominations or types in the same slot in the machine, but the behavior of the machine was different depending on the type.

If you put a quarter in the machine it registers 25 cents.  If you put in a dime it registers 10 cents.
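The vending machine analogy translates directly into code. Here is a minimal sketch in Java (all names are hypothetical, chosen just for this illustration): one Coin interface, with a different implementation per coin type.

```java
// One shared interface for multiple types of coin.
interface Coin {
    int valueInCents();
}

// Each type supplies its own implementation of the same interface.
class Quarter implements Coin {
    public int valueInCents() { return 25; }
}

class Dime implements Coin {
    public int valueInCents() { return 10; }
}

public class VendingMachine {
    // The machine's "slot" accepts any Coin; the behavior that runs
    // depends on the actual type dropped in.
    static int register(Coin coin) {
        return coin.valueInCents();
    }

    public static void main(String[] args) {
        System.out.println(register(new Quarter())); // 25
        System.out.println(register(new Dime()));    // 10
    }
}
```

The `register` method never checks what kind of coin it received; the type itself carries the behavior.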

And that is it, you now understand the actual concept of polymorphism.

Want to make sure you don’t forget it?  Try coming up with a few of your own real world analogies or examples of polymorphism.

Bringing it back to code

In code polymorphism can be exhibited in many different ways.

Most developers are familiar with runtime polymorphism that is common in many OO languages like C#, Java and C++, but many other kinds of polymorphism exist.

Consider method overloading.

If I create two methods with the same name that differ only in their parameter types, I have polymorphic behavior.

The interface for calling the method will be the same, but the type will determine which method actually gets called.

Add(int a, int b)
Add(decimal a, decimal b)
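In Java (which has no `decimal` type, so this sketch substitutes `BigDecimal`; the class name is made up for illustration), the overloaded pair looks like this:

```java
import java.math.BigDecimal;

public class Calculator {
    // Two methods share one name; they differ only in parameter types.
    static String add(int a, int b) {
        return "int add: " + (a + b);
    }

    static String add(BigDecimal a, BigDecimal b) {
        return "BigDecimal add: " + a.add(b);
    }

    public static void main(String[] args) {
        // The call syntax is identical; the argument types select
        // which implementation runs.
        System.out.println(add(2, 3));
        System.out.println(add(new BigDecimal("2"), new BigDecimal("3")));
    }
}
```

Both calls read the same at the call site; only the types decide which body executes.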

You might be shaking your head “no” thinking that this is not polymorphism, but give me the benefit of the doubt for a moment.

The most common argument against this example as polymorphism is that when you write this code the method that is going to be called is known at compile time.

While this is indeed true for statically typed and compiled languages, it is not true for all languages.

Consider Add being a message instead of a method.

What I mean by this is that if the actual determination of which method is called could be deferred until runtime (late binding), we would have a situation very similar to the common shape example.

In many languages this is what happens.  In Objective-C or Smalltalk for example, messages are actually passed between objects and the receiver of the message determines what to do at runtime.
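Even in a statically typed language like Java, this kind of deferred dispatch can be simulated with reflection. The sketch below (hypothetical names, a rough stand-in for Smalltalk-style message passing, not how those languages actually implement it) resolves which `add` to call from the runtime types of the arguments:

```java
import java.lang.reflect.Method;

public class LateBinding {
    public static String add(Integer a, Integer b) { return "ints: " + (a + b); }
    public static String add(String a, String b)   { return "strings: " + a + b; }

    // "Send the add message": look up the receiver method at runtime
    // based on the actual argument types, not the declared ones.
    static String send(Object a, Object b) throws Exception {
        Method m = LateBinding.class.getMethod("add", a.getClass(), b.getClass());
        return (String) m.invoke(null, a, b);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(send(1, 2));         // ints: 3
        System.out.println(send("foo", "bar")); // strings: foobar
    }
}
```

Here the compiler no longer knows which `add` will run; the decision happens during execution, just as in the classic shape example.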

The point here is that polymorphism can be done at compile time or during execution, it doesn’t really matter.

Other polymorphic examples in code

Since the intent of this post is not to classify and explain each type of polymorphism that exists in code, but rather to provide a simplified understanding of the general concept, I won’t go into a detailed explanation of all the kinds of polymorphism we see in code today.  Instead I’ll give you a list of some common examples that you may not have realized were actually polymorphic.

  • Operator overloading (similar to method overloading)
  • Generics and template programming (you reuse the source code, but the actual machine code executed by the computer is different for different types)
  • Preprocessing (macros in C and C++)
  • Type conversions
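To make the generics item concrete, here is a small Java sketch (names are illustrative): one source definition of `largest`, where the comparison behavior varies with each element type's own `compareTo` implementation.

```java
import java.util.Arrays;
import java.util.List;

public class Largest {
    // One generic definition works for any Comparable type;
    // the behavior of compareTo differs per type.
    static <T extends Comparable<T>> T largest(List<T> items) {
        T best = items.get(0);
        for (T item : items) {
            if (item.compareTo(best) > 0) best = item;
        }
        return best;
    }

    public static void main(String[] args) {
        System.out.println(largest(Arrays.asList(3, 9, 4)));          // 9
        System.out.println(largest(Arrays.asList("pear", "apple")));  // pear
    }
}
```

The same source handles numbers and strings; the type parameter is what makes it polymorphic.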

Why understanding polymorphism is important

I may be wrong, but I predict that more and more development will move away from traditional OO as we find other ways of modularizing code that are not so rooted in class hierarchies.

Part of making the transition requires understanding polymorphism as a general purpose and useful computer science concept rather than a very situational OO technique.

Regardless, I think you’ll agree that it is nice to be able to describe polymorphism itself rather than having to cite the commonly overused example of shapes.



Published at DZone with permission of John Sonmez, author and DZone MVB. (source)

(Note: Opinions expressed in this article and its replies are the opinions of their respective authors and not those of DZone, Inc.)

Comments

Barry Smith replied on Tue, 2013/04/16 - 8:45am

Why not make it even simpler? Polymorphism is just a neat way of doing repeated switch statements on a type variable:

e.g., in Java-ish:

public class Animal {

    private static final int DOG = 0;
    private static final int CAT = 1;

    private int type;

    public void speak() {
        switch (type) {
            case DOG: System.out.println("woof"); break;
            case CAT: System.out.println("meow"); break;
        }
    }
}

...etc. Polymorphism is basically just transforming the switches into implementations.

John J. Franey replied on Tue, 2013/04/16 - 10:12am

You didn't really help clear up understanding.

You munged together the concepts of overriding and polymorphism.  You also attempted to distinguish polymorphism in weakly typed languages which is difficult because every method call is polymorphic by definition.

Quite simply, 'polymorph' is 'many-form'. I mean, if you go back to the Greek, 'morph' is a noun, not a verb, meaning 'form' or 'shape', and 'poly' means 'many'.

In OO programming, it means a method can have many (poly) definitions (morphs),  where the definition invoked at runtime is determined when the method is called.  How confusing is that?

John Sonmez replied on Tue, 2013/04/16 - 2:49pm in response to: John J. Franey

This is a common misconception about polymorphism.  

Overriding is a form of polymorphism, as is generic programming.

You are talking only about subtype polymorphism.

Here is a wikipedia article that explains some of this: http://en.wikipedia.org/wiki/Polymorphism_(computer_science)

Thanks for bringing this up though, it is certainly a point of confusion.

John J. Franey replied on Tue, 2013/04/16 - 3:47pm in response to: John Sonmez

A cat is form of animal, it is not a misconception to differentiate cats from animals.  Overriding is a form of polymorphism, it is not a misconception to differentiate overriding from polymorphism.  The differentiation clarifies.


Stephen Lindsey replied on Wed, 2013/04/17 - 5:34am

Just what, exactly, is the point of the large image at the top of this article? I would suggest that most people read these articles at work, and it's not good when one's screen is dominated by such an image.

It's unnecessary, childish and unprofessional. You're by far not the only one who does this; you're just the one I picked to make the point.


Lund Wolfe replied on Sun, 2013/04/21 - 7:13pm

Polymorphism (in programming) does imply differences in behavior (methods) of derived classes.  Otherwise, there is no reason to have derived classes.

The simple explanation of writing the code once for the usage of the super class is all that really matters from the developer's point of view.  Technically, the derived types are created at run time and dynamically bound (and accessed accordingly).

Along with predicting the end of OO, I think the brain is overrated ;-)

Brad Appleton replied on Wed, 2013/04/24 - 6:02pm

I like the way I learned it in Smalltalk better -- "Polymorphism is the ability to send any message to any object capable of understanding it". This covers inclusion polymorphism, parametric polymorphism, and ad-hoc polymorphism. It even covers the seeming exception (though not really) to subtype polymorphism known as "delegates" in C# (and which was known as "Signatures" in g++ back in the early 90s).

I often like to describe polymorphism (and its different types) based upon what varies and what stays the same for each kind of polymorphism:

1. using the same method name for the same type (interface) to invoke differing implementations ==> subtype polymorphism

2. using the same method name for different types to invoke the same implementation ==> parametric polymorphism (generics/templates)

3. using the same method name for (same or different) types to invoke different method signatures ==> ad-hoc polymorphism
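A compact Java sketch of these three kinds (with hypothetical names, purely for illustration) might look like:

```java
import java.util.Arrays;
import java.util.List;

public class Kinds {
    // 1. Subtype: same method name on the same interface,
    //    differing implementations.
    interface Animal { String speak(); }
    static class Dog implements Animal { public String speak() { return "woof"; } }
    static class Cat implements Animal { public String speak() { return "meow"; } }

    // 2. Parametric: same method name for different types,
    //    one shared implementation (generics).
    static <T> T first(List<T> items) { return items.get(0); }

    // 3. Ad-hoc: same method name, different signatures,
    //    different implementations (overloading).
    static String describe(int n)    { return "int " + n; }
    static String describe(String s) { return "string " + s; }

    public static void main(String[] args) {
        Animal a = new Dog();
        System.out.println(a.speak());                  // woof
        System.out.println(first(Arrays.asList(1, 2))); // 1
        System.out.println(describe(42));               // int 42
        System.out.println(describe("x"));              // string x
    }
}
```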

 
