Although it has some overly verbose constructs, IMO most of the verbosity people complain about in Java is due to what is considered idiomatic Java.
You can indeed write much more concise and clear code in it, but for some reason the "proper" way to write Java is to use a ton of factories, define a new class (and immediately extend it with more subclasses) for every little thing, wrappers, interfaces, and so on.
Idiomatic Java is so verbose because if you don't write it that way from the start, you wind up painting yourself into a corner: Java doesn't have any way to add flexibility later without breaking things.
Yeah, what drives me crazy is the insistence that Java "properties" be hidden behind set/get methods even for simple ones where there is realistically no need to ever intercept their accesses. I think C# and Python at least allow transparent control of properties while still allowing the caller's code to look natural.
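To make the complaint concrete, here is a minimal sketch of the accessor boilerplate in question (the `Point` class is made up for illustration): the methods do nothing but forward to the fields, yet idiomatic Java demands them anyway.

```java
// Hypothetical example: idiomatic Java insists on accessor methods
// even when they add nothing over direct field access.
class Point {
    private int x;
    private int y;

    public int getX() { return x; }
    public void setX(int x) { this.x = x; }

    public int getY() { return y; }
    public void setY(int y) { this.y = y; }
}
```

In C#, by contrast, `public int X { get; set; }` gives the same encapsulation, callers just write `p.X = 3`, and the accessor can later be replaced with real logic without touching call sites.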
"We may see a shake-up in the mobile market, with at least 18 new Android handsets being released this year. Until that happens, iPhone will remain a market leader and developers will have to put up with XCode and Objective-C."
Although the author says this, the fact that 18 new Android handsets can be released in a year is one of the problems with both Android and, to some degree, Windows Mobile.
The iPhone OS isn't really revolutionary. Even the interface on the home screen is pretty simple; it's just a more rigid desktop metaphor, Apple dock and all. It's the hardware that truly stands out. The hardware features Apple's carefully thought-out design and style, along with an incredibly well done multi-touch screen. The awesome hardware, paired with that polished and functional screen, is really the biggest selling point of the iPhone, and no one has matched it yet.
It's the same win that Apple's laptops provide over the multitude of OEMs that produce other computers. But it's obvious that hardware design is more important to consumers than the development platform, thus developers are going to have to cope.
Apple makes nice things, but market segmentation is good. It means people can choose between the basic 25 Euro Nokia that doesn't do that much, or the 500 Euro model with all the bells and whistles, or a variety of phones in between.
My guess is that Apple doesn't really care: they'll eventually end up with some reasonable, but minority stake in the market... more or less like the computer market.
This is not what happened in the music player market - in 2007 the iPod commanded over 70% of sales, helped by modest segmentation (flash vs HDD).
While the mobile phone market is much larger and a phone is less of a luxury item for many, I expect they could eventually have the majority of the smartphone market.
I think they'd lose some of their cachet if that happened.
In any case, I hope they stay in a smallish chunk of the market like the computers - they're nice phones, but I am simply not interested in a hacker-unfriendly platform, and I'd hate to see it become the major one, or the only one. Android wins by a mile (or two) from that point of view.
Indeed. The simple lack of a hardware keyboard means the iPhone isn't really an option for me. If there were no phones available with hardware keyboards I would be very disappointed.
In fact, I'm disappointed that there are no upcoming Android phones with decent keyboards, so I won't be getting an upgrade any time soon.
I would say that you're correct. Apple's polish and attention to detail only seems to actually matter to a small minority of people - about 5% of the global marketplace, historically. Most (normal) people don't pay sufficient attention to technology to notice the difference and the hefty price premium simply doesn't seem worth it to them.
As in life, I would expect the Android ecosystem to outcompete the Apple monoculture in the long run.
I would have agreed with you a couple of months ago, but a $99 iPhone 3G KILLS the $99 offerings of every other manufacturer. Now, if Apple sticks with AT&T in the U.S., that's a serious handicap, but if they begin working with other networks, the iPhone should take a lot of market share and eventually be the dominant player.
While you are correct, the upfront amount is what most people (not everyone) think about when buying most things. Housing bubble? (We can debate who is to blame all day, but people simply looked at their monthly payment and down payment when they signed their deeds.)
The number of applications in the App Store shows that developers cope just fine.
Also, it is very tough to compare a technology and toolchain you are familiar with to new ones without any bias.
I'm not sure that's entirely fair - I read it more like "I'm used to A, which means I find A much easier to use."
And in this context, I think that's fair enough. There are far more developers out there with more than a passing familiarity with Java & the Java toolchain, than there are for Objective-C/XCode.
By the sounds of it, it will be easier for those developers to develop for Android phones than the iPhone. Regardless of the relative technical merits of the platforms viewed in isolation, the ability to re-use your existing Java chops is going to be valuable.
(I speak as someone who has done equally small amounts of development in both environments - personally, I prefer Objective-C development)
Exactly. Being a hardcore Android fan, I was really looking forward to some high-quality developer-perspective anti-iPhone rant, but the whole post can really be summed up as "I know Java programming better than OS X programming, so I had an easier time developing for Android than for the iPhone".
Yeah, but he does have a valid point. Java's IDEs are just so much better than what Objective-C has to offer, and having a garbage collector makes your code much less error-prone and increases productivity. So it does kind of seem like a stone-age environment coming from Java.
Not having GC on the iPhone is unfortunate, and I think there's a good chance we'll see it in OS 4.0. But Apple developers have a culture of not trusting garbage collection. It's irrational, but it runs deep through most of the experienced Apple devs I know. Thankfully, of all the manual memory management environments I've used, Objective-C's is the best.
Saying Java IDEs are better than XCode is not a valid point. It's an opinion (and an arguable one at that).
As to memory management: given that Cocoa's retain/release/autorelease is not painful, and my most agonizing debugging sessions with Java all seem to revolve around theorizing WTF the garbage collector is doing and why... Well, I think the productivity opinion is arguable as well.
This guy sounds like he has been programming Java way too long without doing any other programming to expand his mind, and in his first attempt at something new and different he complained the entire time.
LOL, I was thinking the same thing ;). Java is OK for web dev, but jeez, frontend GUI development ;/ He didn't touch on the pure speed of an iPhone app/game.
"You also can’t see or install apps that cost money on a developer phone. Actually you can, but not if the app has copy protection — which is almost every non-free app. On the other hand when you upload your app to the app store it’s available within minutes, so you don’t have to worry about an approval process."
That seems unfortunate, as with some of the Android phones you essentially want the developer version (or something similarly unlocked) so that you can, e.g., kill background processes when you don't need them to improve battery life.
I don't like Java at all, and I'm a long time C programmer, but... in the article there is at least one thing that's simply true: Objective-C is an 80s language, from syntax to behavior. What is particularly annoying about it is that it's neither low-level enough, like C, where you manage memory by hand but it's pretty clear what's happening, nor high-level (automatic memory management of some kind). It's in the middle. Another problem is verbosity: it's just too verbose to create a trivial class, and the syntax is hard to remember, in my opinion.
C has no conventions for memory management. You've got malloc, you've got free, and the rest is up to you.
Objective-C has an easy-to-follow pattern, reference counting, which is used consistently throughout the language and libraries.
Creating a class is no more (or less) typing than Java or C++. The syntax is different, but harder to remember? Not really convinced of that. There are basically six things to remember:
@interface for declaring a class header
@implementation for declaring a class implementation
@end for ending either
-whatever for an instance method
+whatever for a class method
and [object message] for message passing syntax
This is exactly the problem. In every reference counting system you need to know whether a function will increment the refcount of the passed argument or not, whether you need to increment it yourself before passing a new object, and things like this. One can try to have conventions, but you actually need to know the behavior of the other code in order to know if objects are reused, if you must pass objects that will be retained, or if the object is expected to already be "safe" and will simply be used and dropped without retaining/releasing the reference.
For instance, if you try to write a C extension for Python or Tcl or other systems using reference counting, you'll notice that if you don't know the internals very well, it is often not obvious when to increment the refcount and when not to.
I use refcounting myself in Redis, and again, for people not used to the Redis internals, implementing a new command may not be trivial exactly because of this reference counting problem. Reference counting is not something to expose in a language that aims to be that high-level. In short, every language with a complex OOP system should have some kind of automatic memory management, IMHO.
Reference counting really isn't that hard to follow. Cocoa's conventions are pretty straightforward. You need to release (or autorelease -- which by the way is great) for every call to alloc, retain, copy (inc. fooCopy), or new (inc. newFoo). Everything else is autoreleased. Exceptions are documented.
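The ownership rule being described can be sketched in Java terms, just to make the convention explicit (a hypothetical `RefCounted` class; real Java of course has a GC and no such pattern): every act of taking ownership must be balanced by exactly one release, and the object is freed when the count reaches zero.

```java
// Hypothetical sketch of the Cocoa ownership convention: construction
// (the analogue of alloc/new/copy) and retain() each take ownership;
// each must be balanced by exactly one release().
class RefCounted {
    private int refCount = 1;   // construction implies ownership
    private boolean freed = false;

    RefCounted retain() {       // take ownership; balance with release()
        refCount++;
        return this;
    }

    void release() {            // give up ownership
        if (--refCount == 0) {
            freed = true;       // stand-in for dealloc
        }
    }

    boolean isFreed() { return freed; }
}
```

The point of the convention is that a caller never needs to inspect the count: it only has to balance its own retains and releases, and the object disappears exactly when the last owner lets go.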
Heh, this is probably true of Objective-C, it's just funny to hear a Java developer say this about any other technology...