
I'm interested in discussing string literals in Objective-C and would like to know (1) whether there's a preferred paradigm or style, and (2) what (if any) the performance implications might be.

There are parts of the Cocoa / Cocoa Touch frameworks that use strings as identifiers. Some examples:

-[NSNotificationCenter addObserver:selector:name:object:]
-[UITableView dequeueReusableCellWithIdentifier:]
-[UIViewController performSegueWithIdentifier:sender:]

I most often find myself declaring a global constant in the implementation file, like so:

NSString * const kMySegueIdentifier = @"Awesome Segueueueueue";

For segue identifiers, I will often expose the variable in the header file (extern NSString * const kMySegueIdentifier;) so that other modules can reuse it.
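For concreteness, the declaration/definition split described above might look like this (file names here are illustrative, not from the original question):

```objc
// MyViewController.h — declaration visible to any importer
extern NSString * const kMySegueIdentifier;

// MyViewController.m — the single definition
NSString * const kMySegueIdentifier = @"Awesome Segueueueueue";

// Any other module that imports the header can then reuse it:
// [self performSegueWithIdentifier:kMySegueIdentifier sender:self];
```

A misspelled constant name produces a compiler error, whereas a misspelled string literal would fail silently at runtime.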

The same behavior can be accomplished with a preprocessor macro: #define kMySegueIdentifier @"Awesome Segueueueueue". I believe this would also prevent the app from consuming memory to hold these globals. I cringe a little at this syntax, however, because it exposes the "implementation details" of my string literal constants.

Both approaches accomplish the same end goal: they abstract the string into something easy to remember, hard to mistype, and able to produce compile-time warnings or errors when misused. Is one actually better than the other? In what situations would one be preferred over the other?


closed as off topic by Glenn Rogers, Corbin, Brian Reichle, Jeff Vanzella, palacsint Oct 17 '12 at 22:28


1 Answer


Actually, they are effectively equivalent (aside from the extern keyword on the constant). When you write a literal string with @"", the compiler turns it into a compile-time constant string expression — essentially the familiar (static) NSString * const, albeit with a lot more compiler machinery behind it.

Nearly the same process occurs with macros, but with one extra step: textual replacement. A macro is a placeholder that the preprocessor substitutes with the value you #define before compilation proper, which is why Clang can still show you errors and warnings at each site where the macro is expanded.

The difference lies in how much work the compiler does to resolve your abstraction, not in any memory overhead it incurs — which means there is no speed or performance to be squeezed out here. Besides, NSString is a mature class cluster that has been optimized over the years; with literal constants in particular, a form of uniquing occurs in the binary itself, so a literal used over and over again is not reallocated each time. Though, to make one thing perfectly clear: #define'd literals do NOT reduce memory overhead!
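A quick way to observe the literal uniquing described above is to compare the pointers of two identical literals. (Note this is a compiler implementation detail you can observe in practice, not a behavior the language guarantees.)

```objc
#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        NSString *a = @"reuse-id";
        NSString *b = @"reuse-id";
        // Identical literals are typically uniqued at compile time,
        // so both variables point at the same constant-string object
        // baked into the binary — no allocation happens at runtime.
        NSLog(@"pointer-equal: %d", a == b);
        NSLog(@"value-equal:   %d", [a isEqualToString:b]);
    }
    return 0;
}
```

The value comparison via isEqualToString: is always true; the pointer comparison merely illustrates the uniquing, and code should never rely on it.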

