I couldn't disagree more. Objective-C is a product of the 1980s, when it kind of made sense that your program would crash if you did this:
[NSArray arrayWithObjects:@"Hello", @"World"];
Of course it crashes! You have to add a nil sentinel value to the end of your list of objects, silly. And of course it compiles with only a warning, just like when you leave out the @ that makes the difference between a C string literal and an NSString. Those errors will crash your program as soon as the code is run, but they compile as valid Objective-C. Things like that are just a nuisance if you tell the compiler to treat warnings as errors, though. If you really want to know why Objective-C is hard, why not trust the authorities on Objective-C, namely Apple?
Where Apple tells you you will screw this up is memory management. To start with, there are four different memory management schemes: C memory management for C objects, manual reference counting, automatic reference counting, and garbage collection. You get to choose two, of which C will be one. Objective-C originated as enhancements on top of C, and Objective-C programmers writing Cocoa apps still have to rely on C APIs for some functionality, so you'd think by now they would have provided a better way of managing, say, the arrays of structs you sometimes have to pass to the low-level drawing functions. Nope; everyone still uses malloc and free. Failure to make malloc and free obsolete is hard to forgive.
From the other three memory management methods, pick one. (Different OS versions support different ones.) Automatic reference counting (ARC) is the latest and apparently the new standard, though manual reference counting is still supported, and GC is still supported on Mac OS. Reference counting requires a little bit more thinking than garbage collection. For example, since the Cocoa APIs were written with reference counting in mind, some objects, notably UI delegates, are held as weak references to avoid reference cycles. You basically have to manage those objects manually: create a strong reference to keep the object alive and then delete the strong reference when you decide it's okay for the object to be collected. (I'm not sure, but I think this is true even if you turn on GC, because delegate references remain weak.)
All reference-counting systems have that problem, but at least they have the benefit of determinism, right? When you pay that much attention to object lifetimes, you get to piggyback other resource management on top of memory management and kill two birds with one stone. (In C++ it's called RAII, and it's the saving grace of C++ that almost completely makes up for C++'s other warts.) However, according to Apple, this technique should not be used with Objective-C:
You should typically not manage scarce resources such as file descriptors, network connections, and buffers or caches in a dealloc method. In particular, you should not design classes so that dealloc will be invoked when you think it will be invoked.
Why not? Application tear-down is one issue, but that doesn't matter for resources that are recovered by the OS when a process terminates. "Bugs" are given as a reason, but I think they mean bugs in application code, not in the Objective-C runtime. The main reason, then, is that if your Objective-C programs leaked file descriptors and network connections as often as they leaked memory, the world would be in a sorry state:
Memory leaks are bugs that should be fixed, but....
Remember the "I don't mean to be a ___, but..." discussion?
Memory leaks are bugs that should be fixed, but they are generally not immediately fatal. If scarce resources are not released when you expect them to be released, however, you may run into more serious problems.
In other words, if you really need something to work reliably, you had better use a different mechanism, because you don't want your management of other resources to be as unreliable as your management of memory. That's a pretty strong statement that you will screw up memory management despite your best efforts.
So apparently Objective-C memory management is hard. That's what Apple thinks, anyway.
> You should typically not manage scarce resources such as file descriptors, network connections, and buffers or caches in a dealloc method. In particular, you should not design classes so that dealloc will be invoked when you think it will be invoked.
Do they propose an alternative mechanism for handling resources, other than reference counting? As you state, RAII breathes life into C++, given how well it works for all types of resources.
It sounds like the above statement may have been made in anticipation of the introduction of garbage collection, which would make piggybacked resource destruction non-deterministic. On the other hand, it could also be read as a very strong reason to favor manual (or maybe automatic) reference counting and eschew GC entirely. I don't know Objective-C very well, but I wonder if the use of GC has generated these arguments against it from within the OS X developer community.
That's an interesting hypothesis, but I can't find any source to confirm or contradict it offhand. The part I took the quotes from only mentions that the order of dealloc'ing objects in a collectable object tree is undefined, as is the thread on which dealloc is called. Both of those are easy to keep in mind while implementing dealloc, though. If a resource has to be freed from a particular thread, then dealloc can schedule it to be released on the right thread using GCD. The non-deterministic order of dealloc'ing would rarely be a problem for releasing resources. After all, if a resource is only used via a particular object, and that object is dealloc'ed, then clearly it's okay to release that resource! Perhaps there are complicated cases where resources have to be released in a particular order, but that's no reason to give up RAII for simple cases.
Apparently it's a feature in Xcode 4.4 in the beta release of the Mountain Lion SDK. There's no developer preview for Lion, though. Fingers crossed that Xcode 4.4 will be released for Lion and not just for Mountain Lion....
Best I can tell, all of the problems you cite are fixed by MacRuby. It shows how surprisingly well Ruby semantics map onto the message-passing semantics of Objective-C. They also found ways to wrap up the C stuff without making you manage your own memory.
Not sure why Apple hasn't been more aggressive in pushing it for Cocoa development. Maybe they don't trust it to perform well enough yet on iOS devices and don't want to promote it until it can be used anywhere as a replacement for Objective-C.
What in the parent post do you disagree with? It's probably obvious to you, but it's not obvious to me.
I understood his point to be mostly that syntax melts away after time, and you will just see the concepts. It seems that you are objecting to the notion that "Programming in Objective-C is easy," but I don't see that in his post.
" And of course it compiles with only a warning, just like when you leave out the @ that makes the difference between a C string literal and an NSString."
How is the compiler supposed to know whether you meant an NSString or a C string?
The quotes are from the "Practical Memory Management" section of Apple's memory management programming guide: https://developer.apple.com/library/mac/#documentation/cocoa...