Thursday, April 18, 2013

Why singletons are "Bad Patterns"

I made a New Year's resolution for 2013 to use more design patterns. I wanted to start making more conscious and judicious design decisions and to stop "slinging code".

So I started using the singleton pattern for things like caches and input in my projects.

Of course, I then found out that these convenient things are "bad practice" and that I should stop using them. I didn't really get why they were bad, though, until I crawled around some Stack Overflow posts on the topic.

What it boils down to is...

  1. Singletons hide dependencies
  2. Singletons make it hard to test
  3. Singletons aren't automatically cleaned up and are thus liabilities
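
To make the first two points concrete, here's a minimal sketch of the kind of singleton I was writing. The Cache class and loadLevel function are just made-up examples, but they show the problem: nothing in loadLevel's signature admits that it touches the cache, and a test has no way to swap in a fake one.

    #include <string>
    #include <unordered_map>

    // A classic singleton: one global instance, reachable from anywhere.
    class Cache {
    public:
        static Cache& instance() {
            static Cache cache;  // constructed on first use
            return cache;
        }
        void put(const std::string& key, const std::string& value) {
            data_[key] = value;
        }
    private:
        Cache() = default;  // nobody else can construct one
        std::unordered_map<std::string, std::string> data_;
    };

    // The hidden dependency: nothing in this signature says "uses Cache",
    // and a test can't hand it a fake cache instead.
    void loadLevel(const std::string& name) {
        Cache::instance().put("current_level", name);
    }
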
I really do enjoy the idea of keeping things scoped so that they can just be destructed and neatly go away.

That being said, a decent "solution" to the "problem" seems to be to create your would-be singleton high up in the design hierarchy as a scoped object. Then have every object below it in the hierarchy that needs it receive a reference to it at construction.

Doing that, you have a clear dependency, a scoped object, and a defined initialization order (I believe that's the right way to say it?).
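
A minimal sketch of what that might look like (CacheManager, InputManager, and Level are hypothetical names, not anything from a real library):

    class CacheManager { /* caching stuff */ };
    class InputManager { /* input stuff */ };

    // The dependency is stated right in the constructor signature.
    class Level {
    public:
        Level(CacheManager& cache, InputManager& input)
            : cache_(cache), input_(input) {}
    private:
        CacheManager& cache_;
        InputManager& input_;
    };

    int main() {
        // The would-be singletons are now just scoped objects near the top.
        CacheManager cache;  // constructed first...
        InputManager input;
        Level level(cache, input);
        // ...and destructed in reverse order when main() returns.
    }

In a test you can construct a Level against a throwaway CacheManager, which is exactly what the singleton version wouldn't let you do.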

Maybe you could even define a base class whose constructor takes references to the shared resources, like a cache manager or an input manager. Then the dependency would just be a normal part of your design.
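
Something like this, maybe (again, the names are hypothetical):

    class CacheManager {};
    class InputManager {};

    // A base class that takes the shared resources by reference, so every
    // subsystem states the dependency the same way.
    class Subsystem {
    protected:
        Subsystem(CacheManager& cache, InputManager& input)
            : cache_(cache), input_(input) {}
        CacheManager& cache_;
        InputManager& input_;
    };

    class Renderer : public Subsystem {
    public:
        Renderer(CacheManager& cache, InputManager& input)
            : Subsystem(cache, input) {}
        // cache_ and input_ are available like any other members
    };
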
