Steve Yegge: Singleton Considered Stupid  (googlepages.com)
33 points by nickb on May 27, 2008 | hide | past | favorite | 37 comments


Ironically, my experience was that everybody seemed to only understand the factory pattern. Wherever I worked, people loved using factory, whether it made sense or not.


I can't stand it when people use Factory in Python. In Python, classes are objects -- you can just pass a class into a function as a parameter!


You can't in Java!?


Of course you can.

I don't see how passing around classes could replace the factory pattern?

In Java these days usually Spring is the factory...


Usually you use the factory pattern because you want to customize the concrete type of object that's instantiated, correct? In Python, you'd just pass the class itself in, and since every class is a callable that constructs an instance of itself when called, you can use that to control instantiation:

    class Foo: ...
    class Bar: ...

    def factory_client(cls):
        my_instance = cls()
        ...
        return my_instance

    foo_instance = factory_client(Foo)
    bar_instance = factory_client(Bar)
You can do the same thing in Java with Class objects and reflection (Class.newInstance() etc.), but reflection is a.) slow and b.) clunky, so most Java programmers create factories instead. Parameterized classes are much more a part of Python culture than Java culture.


Yeah, that's exactly what I meant. I sometimes see people create an entire class wrapping a single factory method, and then override that method in various subclasses to construct different types of objects. The class, the subclasses, and the factory methods are all redundant in Python. The Twisted framework is an offender that comes to mind.

I haven't worked in Java very much, so I'm not sure what the trade-offs are with using reflection vs. factory methods. But GoF gave examples in C++, and was published in 1994, before the reflection API was released with JDK 1.1 in I think 1997. It's entirely possible that using factory methods to customize the concrete types of objects doesn't make a lot of sense in Java, either.


Well thinking about Spring, it does a lot of wiring behind the scenes. How would you do that with Python? What does the factory_client method do?

I don't think Java Reflection is a/the problem.


factory_client does whatever you want it to do - it's just the method that would call your Abstract Factory or Factory Method or whatever.

I've never used Spring, but I assume that the wiring it does is similar to JSF managed beans, e.g. you give it an XML config file and it takes care of creating the appropriate Java objects from that. As for how you'd do it in Python - you wouldn't. To quote Phillip Eby: "XML is not the answer. It is not even the question."

http://dirtsimple.org/2004/12/python-is-not-java.html

That's because XML is usually easier to write than Java but harder to write than Python, so you're better off just writing the Python.

You'd write your classes, and then stick them in a data structure somewhere. You could even use the package structure of your app as that data structure - have the client code receive a module (modules and packages are first-class in Python), then use getattr to pull out the class object. Then when you have a class object, you call it to create a new instance.
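A minimal sketch of the module-as-registry idea described above (the class names here are hypothetical; the point is that modules are first-class, so getattr plus a call replaces the factory):

```python
import sys

# Classes defined in this module act as the "registry".
class Button:
    def __init__(self):
        self.label = "OK"

class Slider:
    def __init__(self):
        self.value = 0

def make(module, class_name):
    """Pull a class out of a module with getattr, then call it."""
    cls = getattr(module, class_name)  # classes are plain module attributes
    return cls()                       # calling the class constructs an instance

this_module = sys.modules[__name__]    # modules are first-class objects
button = make(this_module, "Button")
slider = make(this_module, "Slider")
```

In a real app you'd pass in a package or submodule instead of the current module, but the mechanism is the same.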

I suppose if you want some security guarantees, you'd explicitly specify the classes that may be passed in to the method. For example, in my own app, I have a bunch of classes that essentially represent language constructs in a visual domain-specific language, each of which has a "default" value so that when the user's clicking away they don't have to type in code from scratch. I wanted to expose these defaults to JavaScript, the language of the UI. So I put together a dictionary of the classes that I want exposed, then loop through it, instantiate each class as the default, encode each object as a plain old dictionary (they all have a 'props' method that returns the object represented as a JSON-suitable dictionary), and JSON-encode the whole mess. It totals like 5 lines for the whole thing, not counting the list of classes to be exposed.
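The five-or-so lines described above might look something like this (the class names are made up; only the overall shape - a dict of exposed classes, a 'props' method per class, one JSON encode at the end - follows the comment):

```python
import json

# Hypothetical language-construct classes; each has a default state and a
# props() method returning the object as a JSON-suitable dict.
class IfBlock:
    def __init__(self):
        self.condition = "true"
    def props(self):
        return {"type": "if", "condition": self.condition}

class Loop:
    def __init__(self):
        self.count = 1
    def props(self):
        return {"type": "loop", "count": self.count}

# The explicit list of classes to expose to the JavaScript UI.
EXPOSED = {"if": IfBlock, "loop": Loop}

# Instantiate each class as its default, encode each as a plain dict,
# and JSON-encode the whole mess.
defaults_json = json.dumps({name: cls().props() for name, cls in EXPOSED.items()})
```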


I once saw code that had factories that generated factories that generated factories. Yuck...

Like the ancient world view, "it's turtles all the way down"


But think about the generalizations it affords... ;-)


Pages 243 to 256 are, in my copy at least, the Interpreter pattern, in case anyone else was wondering.


Pages 243 to 256 are, in my copy at least, the Interpreter pattern

http://en.wikipedia.org/wiki/Interpreter_pattern

From what I can tell, using this is a pretty clear sign you should be using a language with macros...


I wasn't, but my Observer was...


Curiously, only in OOP is there a real need for teaching/learning "patterns"; outside the everything-is-an-object world, anything you do is a "pattern".


Indeed, I found that this statement sounded alarms in my critical thinking cortex:

"the Singleton "pattern" encourages you to forget everything you know about OO design, since OO is hard, and procedural is easy."

Why, oh why, are we encouraging people to use OO 100% of the time, when it's admittedly harder than the alternatives? (I'm looking at you, Java.) Methinks the bandwagon continues to play on, even after two decades...


One thing Alex Martelli said in one of his talks, when I was starting off with Python, was along the lines of: "Ignore OOP!"

I had never heard anyone suggest anything like that before; I think some developers consider OOP "the way forward." I spent many months trying to force my mind to think in an OOP manner; now I don't even bother. Occasionally I find a place where it does work, but it's definitely not a magic bullet.


I've just noticed that there are two "impedance mismatches" in a lot of applications.

UI -> Domain Model -> Persistence

Most people know of the 2nd one. The 1st one is just as big an impedance mismatch!

Most of the ways of doing UIs are not OO. If UIs were OO in a given environment, then naive developers following the path of least resistance starting from the GUI "hello world" would naturally build an OO application.

Are there any environments like that? Is Rails one such environment?


Most UI code is intrinsically event-based. e.g., "on click, do this", "on resize, do that"

Erlang comes to mind as a good event-based language. Strangely, it is rarely (if ever) used for UI.
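The "on click, do this" style mentioned above can be sketched in a few lines of Python (a toy dispatcher, not any particular UI toolkit's API):

```python
# A minimal event-based skeleton: handlers are registered per event name,
# and firing an event calls every handler registered for it.
handlers = {}

def on(event, fn):
    handlers.setdefault(event, []).append(fn)

def fire(event, *args):
    for fn in handlers.get(event, []):
        fn(*args)

events = []
on("click", lambda x, y: events.append(("click", x, y)))
on("resize", lambda w, h: events.append(("resize", w, h)))

fire("click", 10, 20)
fire("resize", 800, 600)
```

Note there's not a class in sight - which is the commenter's point about UI code not being intrinsically OO.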


I guess this is why Rob Pike can spend 6 months writing a concurrency-friendly language like Newsqueak, then use it to write a GUI windowing system in a couple of hours.

http://video.google.com/videoplay?docid=810232012617965344


"The problem is, about 1/3 to 1/2 of them were basically cover-ups for deficiencies in C++ that don't exist in other languages. Although I'm not a huge Perl fan anymore, I have to admit the Perl community caught on to this first (or at least funniest). They pointed out that many of these so-called patterns were actually an implementation of Functional Programming in C++."

Compare Norvig (1998): "Design Patterns in Dylan or Lisp: 16 of 23 patterns are either invisible or simpler". http://norvig.com/design-patterns/


When writing a game, I tend to have a singleton "Game" or "World" class (well, actually a static class; I don't bother with all the getInstance() crap), and from there a bunch of proper classes all "owned" by Game. Now, I'm basically using static classes as a namespace here, but is this really that harmful? If so, what options do I have?


I guess it will blow up in your face once you try to create multiplayer games and serve several games at the same time?

What do you gain from making a game a singleton?


Well in that case, I'd have a Server class and a Client class. One handles logic and the other handles display and input.

And actually, for my last little prototype (now that I think about it), I didn't make it a singleton, I just had a single public instance of it.

Question retracted.
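The "single public instance" approach amounts to making the game an ordinary class and constructing exactly one of it at module or main scope, which leaves the door open for a server holding several. A hypothetical sketch:

```python
class World:
    """An ordinary class: no static state, no getInstance()."""
    def __init__(self, name):
        self.name = name
        self.entities = []

    def spawn(self, entity):
        self.entities.append(entity)

# One public instance covers the single-player case...
world = World("overworld")
world.spawn("player")

# ...but nothing stops a server from hosting several games at once.
matches = [World("match-%d" % i) for i in range(3)]
```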


OK, so reading this article is enough to thoroughly convince one to rethink the usage of Singletons. But what other design patterns would you guys recommend? Most of the common design patterns are one form of Singleton implementation or another, and they suffer many of the same drawbacks Steve points out in the article - what are the alternatives?

(I've been making do with custom interactor classes that share objects via a linked list of class instances with a custom "mapper", and though it works great it's really hard to explain to anyone.)

Also - any books you guys would recommend as an alternative to Design Patterns?


I never liked the patterns books because I always thought they gave you solutions without knowing what the problem was.

Ever learn MVC? Might be a good place to start.


When you write your own MVC you'll come right back to the original patterns question. MVC is just the thing programmers interact with so they don't have to deal with inter-class interactions directly. But the MVC framework itself needs to manage those interactions somehow, using one pattern or the other.


Sure enough.

So your question is "How do I write good controllers?"

It's kinda hard for me to give you specific advice without a model and a set of behaviors to traverse. I guess "do good things" and "don't do stupid things" is about it. Singletons are not evil, but, like gotos, I'd try to avoid them if at all possible.

In my experience, once you've matured the model enough for it to be code, you're automatically thinking in terms that lead to good design. Singletons pop out when you start with a small piece of the problem and try to bludgeon whatever you've got into a solution. These problems arise when you start to design from the bottom up and mix functional paradigms with OO paradigms. Nothing wrong with either, but for good code you should really pick one and stick with it.


Looking back at my post I see my mistake. I understand OOP and class interactions very well, and as I mentioned I have my own "pattern" that I'm using to great satisfaction; built, like you say, from the ground up to serve that particular purpose. Getting there wasn't easy and certainly not reproducible - it was a mess.

So, my problem is that my solution is specific to my case and it's terribly complicated to explain to someone asking for advice on design patterns. So my question isn't actually how to write a good controller but what resources to recommend for intermediate-level coders asking for advice on the matter.


Not sure if I'm being helpful to you or not, but I'll reply again hoping I hit on something that you find useful.

I think the question is: at what point when designing from the bottom-up do you realize it was shorter to come at it from the top down? And at that point do you do a "52-pickup" and refactor, make the best of what you have, or continue working the way you were?

I used to do this, but haven't for a long time. I was really surprised when I got into this same mess a year or two ago writing a system that had multiple datasources and multiple valid object graphs.

Looking at this from the distance of a year or more, my mistake was in paradigms. I really should have gone totally functional with the project instead of trying a classic OOP-tiering structure. Because my head was in the wrong place, I was like a guy with a square peg trying to get it into a round hole -- I just kept thinking I needed a bigger hammer.

Now I could have still kept an OO mentality and solved the problem -- no doubt about it. But that would have required more design up front than I gave it. So in this case lack of sufficient thinking up front combined with my head being in the wrong space caused the code base to be a lot more jumbled than it should have been.


Do I get bonus points for not really seeing much point to the Singleton pattern when I read the GoF book? In fact, I can generalize that to say that all the patterns seemed either obvious or pointless.

I still haven't decided what that means about me as a programmer.


My reaction was similar. Later I found they were useful as a communication medium when you worked on a team that was doomed to use Java or C++; walking may be so natural to you that it needs no explanation, but knowing that the action is called "walking" helps when you explain to someone how to get somewhere.

Of course, if you're using more succinct programming languages, it may be much faster and more accurate to write a line or two of the actual code to show what you mean, rather than recalling the names of all those patterns.


> Do I get bonus points for not really seeing much point to the Singleton pattern when I read the GoF book?

This comment entitles the bearer to one (1) Hacker News Bonus Point.

Don't spend it all in one place.


"In my Data Structures course in college, when we got to AVL trees, my prof turned and wrote on the board, in huge, clear letters:

AVL Trees are EVIL

...and that's all we had to learn about them. He had us implement red/black trees and splay trees instead. To this day, I have no idea how threaded AVL trees work. But if that's OK with Dan Weld, it's OK with me. "

Thank goodness it wasn't: "LISP is EVIL". Or replace lisp by anything useful and generally misunderstood.


I think the professor's point is that using red-black trees is a dominant strategy over using AVL trees.


There are no stupid design patterns, only stupid programmers who don't understand why they are using the design patterns.


only stupid programmers

the thing is, you can't just read a book about design patterns (any book, no matter how well written) and know when to apply them. You'll have to try it, make mistakes, learn from them, make mistakes in the opposite direction and so forth.

People should build some personal throwaway projects and cram in every design pattern they know. That should at least get them started on the learn/mistake cycle. At least it did so for me. Though in my last project I was totally abusing the factory pattern while all the while chanting "I shall not use patterns when they are not necessary" :blush:


That's right. The programmer may not be stupid at all. He or she may just be in the middle of learning patterns.

One might say that only stupid speakers make simple grammatical errors. Or they might be beginners in a second language. I hope and expect to always be a beginner in something in my life, whether it's a new computer or human language, a new craft, or some other new way of thinking.

By the way, Peter Norvig says that in Dylan or Lisp, 16 of 23 patterns are either invisible or simpler.

That's from the "Design Patterns in Dylan or Lisp" slide from his Design Patterns presentation at:

http://www.norvig.com/design-patterns/



