Was Objective-C really a hindrance to Apple software development?
I have heard stories from some of the greybeards I have met on the Internet that Objective-C was by all accounts a nightmare to work with. Was that just typical of all the low-level languages of its era, or is there some feature that made working with it hard?
Did Objective-C hamper software development for Apple software, or is this just the random experience of someone on the Internet?
programming apple
Whatever various people's anecdotes may claim, it's hard to argue with the timing evidence. Objective-C was introduced at Apple when Steve Jobs came back and started stuffing it down everyone's throats, and when it was announced that he had terminal cancer, Apple didn't even wait for him to be dead before they started working on a replacement for it! It's difficult to draw any other conclusion than that Obj-C was something that Jobs personally loved and most of the rest of the company hated.
– Mason Wheeler
9 hours ago
3
Probably because Jobs wasn't a programmer :)
– dashnick
9 hours ago
4
@MasonWheeler Hmm, looking at Swift, it is noteworthy that the Objective-C style of message-based dynamic linking was kept, while the C-style parts were dropped. Seems like Objective-C's merits did outlast Jobs's time on the planet.
– Raffzahn
9 hours ago
3
@Raffzahn Yeah, they kind of had to keep support for the infrastructure that all the OS APIs were built on...
– Mason Wheeler
9 hours ago
2
Just as a counter argument, the fact that development was not hindered by Obj-C is self-evident just by looking at the massive success of the App Store. After all, the great mass of Obj-C developers are NOT Apple employees; they are 3rd party iOS developers.
– Brian H
9 hours ago
edited 3 hours ago by Warren Young
asked 13 hours ago by Neil Meyer
11 Answers
Objective-C was by all accounts a nightmare to work with
I loved it. Loved it.
Some background: in the 90s I worked for a developer here in Toronto with a Mac and Win app. I wanted to work on the dev side but I had no formal training, and I found the barrier to entry to be too high for my interest level. To do anything useful, you had to learn the OS, the IDE, the language and the library, each of which was some level of dismal. For instance, the text editor widget on the Mac couldn't handle more than 32k of text, and the various libraries just called it. If you wanted to edit more text, well, have fun!
In 1998 Apple sent me a copy of OpenStep, or as they called it, Rhapsody Preview. After some install issues (lack of drivers, had to replace the CDROM drive with one it knew) I had my first real program running in a day. Real program.
Because unlike the Mac or Win of that era, the OS was the library, and the library was f'ng amazing. Text editor? How about one that fully supported Unicode, was limited in length only by a 32-bit int, automatically paged data as needed (because that's how the whole system worked), did complex layout like columns, flowed around graphics and such, and had a built-in spell checker. The entire library was like this: the base objects were super-powerful out of the box and tightly integrated with each other and with the entire OS as a whole. I hate to use this word, but it had a synergy that has to be experienced to be understood.
Contrast with, say, Win + MFC... gebus. It was like Lisp Machine vs. PDP-8. .Net helped, and C# is better than Obj-C (I'd say it's my favorite language), but it was decades before .Net got close to the OpenStep of the 90s, and even today its base objects still suck - why can't they get an array type right after 20 f'in years?! Every time I use it I end up wondering why some totally basic object is missing some totally obvious feature, or why they have five objects to do the same thing, each with their own set of dumbness.
Obj-C was no worse than other languages, except perhaps in syntax (perhaps). It had two super-amazing advantages though. Extensions let you add code to existing compiled objects, so you could add spell checking to someone else's text editor for instance, and the handling of nil was wonderful.
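Both features fit in a few lines. This is a minimal sketch, not production code: the `Shouting` category and its `shouted` method are made-up names for illustration, and what the answer calls "extensions" are "categories" in Objective-C parlance.

```objectivec
#import <Foundation/Foundation.h>

// A category adds a method to an existing, already-compiled class --
// here the stock NSString, without subclassing or recompiling it.
@interface NSString (Shouting)
- (NSString *)shouted;
@end

@implementation NSString (Shouting)
- (NSString *)shouted {
    return [[self uppercaseString] stringByAppendingString:@"!"];
}
@end

int main(void) {
    @autoreleasepool {
        NSLog(@"%@", [@"hello" shouted]);    // HELLO!

        // Messaging nil is legal and simply yields nil/zero, so chains
        // of calls don't need a null check at every step.
        NSString *nothing = nil;
        NSString *still = [nothing shouted]; // no crash; still is nil
        NSLog(@"%@", still);                 // (null)
    }
    return 0;
}
```

The same category trick is what lets you bolt spell checking onto someone else's text editor, as the answer describes: the added methods are visible to every user of the class at runtime.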
Swift... well, I like some things and don't like others. The whole class/struct thing they boast about is, to me, a crock. Yes, I know it's more efficient etc., but it really is much less flexible than just declaring a class and using it. I also hate hate hate putting the type after the declaration: `int c = 0` is simply easier to read than `var c: Int = 0`, and `int doSomething()` is light-years better than `func doSomething() -> Int`. Bah! Swift also lost the wonderful nil handling, and I can't for the life of me see an upside - everyone just puts `!` on everything.
Overall, yes, Swift is an improvement. But Obj-C was pretty great too. At least in the 90s. It collected a LOT of cruft when it moved to Mac/iOS, and much of that was ugly and just totally bolted-on. So early Obj-C and Swift were pretty similar in ease-of-use IMHO, while late Obj-C was indeed getting to be a real downer.
5
"So early Obj-C and Swift were pretty similar in ease-of-use IMHO, while late Obj-C was indeed getting to be a real downer." - I think this is the crux of the issue. You'll get wildly varying answers depending on what timeframe people think of.
– Ruther Rendommeleigh
9 hours ago
1
It's a little sad that NeXTStep was already 10+ years old by the time it started to go mainstream via OS X. I think a little longer and that Window would have been closed forever [pun intended].
– Brian H
9 hours ago
3
I loved Obj-C, but to be fair, I love Swift too, and anyone who "puts `!` on everything" is doing it completely wrong.
– par
7 hours ago
1
In defence of Swift: you mention stating the type as `var c: Int = 0`; in this case you can do without the type completely and infer it by doing `var c = 0`. This is always the case when you declare & assign a value to a variable in a single line. Also, as @par mentioned, abuse of `!` is a sign of poor coding standards, and it is definitely not the case that "everyone puts `!` on everything", especially in enterprise code
– Ferdz
5 hours ago
See softwareengineering.stackexchange.com/questions/316217/… for reasoning behind putting the type last in languages like Swift. Go's reasoning is also very compelling, especially when talking about function types: blog.golang.org/gos-declaration-syntax
– Logan Pickup
3 hours ago
Swift was introduced only in mid-2014 so I think perhaps some of those people's beards have greyed out very rapidly! That aside, Objective-C attempts to fuse two different languages: Smalltalk and C. So it's a compiled language, like C, that for object types also supports dynamic dispatch and introspection, like Smalltalk.
It's actually a strict superset of C: all standard C libraries are directly callable from Objective-C, and many of the very central parts of Apple's libraries are written directly in C.
Object types are dynamic enough that you can look up available methods and their types, and declared properties and their types, and access either by name at runtime if desired. This functionality is central to Apple's UI libraries: e.g. to tell a button which action to perform when pressed, you tell it the identity of the object it should call plus the name of the method, and the runtime does the necessary method routing. So things like the UI designer don't generate any code. There's no hidden mapping file full of comments warning that it was automatically generated and please don't edit it.
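That target/action wiring can be sketched in a few lines. Note this is a toy illustration: `Button` and `Controller` are invented stand-ins here, not Apple's real UI classes, but the runtime calls (`NSSelectorFromString`, `respondsToSelector:`, `performSelector:`) are the genuine mechanism.

```objectivec
#import <Foundation/Foundation.h>

// Toy stand-in for a UI control: it stores "whom to call" and "which
// method", both supplied at runtime -- no generated glue code needed.
@interface Button : NSObject
@property (weak) id target;
@property SEL action;
- (void)press;
@end

@implementation Button
- (void)press {
    // Ask at runtime whether the target answers to the named method,
    // then route the message to it by name.
    if ([self.target respondsToSelector:self.action]) {
        [self.target performSelector:self.action withObject:self];
    }
}
@end

@interface Controller : NSObject
- (void)buttonPressed:(id)sender;
@end

@implementation Controller
- (void)buttonPressed:(id)sender {
    NSLog(@"pressed!");
}
@end

int main(void) {
    @autoreleasepool {
        Controller *c = [Controller new];
        Button *b = [Button new];
        b.target = c;
        // The method is identified purely by its string name -- this is
        // all a UI designer file needs to record.
        b.action = NSSelectorFromString(@"buttonPressed:");
        [b press]; // logs "pressed!"
    }
    return 0;
}
```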
There's at least one problem that stopped being a problem a long time ago: into the OS X era, memory was managed manually — you were responsible for remembering to take a small number of idiomatic steps to ensure proper memory allocation and deallocation. But they were so idiomatic that no thought was really required, and indeed that the compiler was able to assume responsibility for them circa 2010.
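Those idiomatic steps boiled down to a couple of ownership rules. A sketch of the pre-ARC style follows (assuming a build with ARC disabled, e.g. `-fno-objc-arc`; under ARC the compiler now inserts the equivalent calls for you, which is the "circa 2010" change the answer refers to):

```objectivec
#import <Foundation/Foundation.h>

int main(void) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // Rule 1: if you obtain an object via alloc/new/copy, you own it,
    // and you must balance that with a release when you're done.
    NSMutableArray *owned = [[NSMutableArray alloc] init];
    [owned addObject:@"hello"];
    [owned release];

    // Rule 2: convenience constructors return autoreleased objects;
    // you don't own them, so you must NOT release them yourself.
    NSArray *borrowed = [NSArray arrayWithObject:@"world"];
    NSLog(@"%@", borrowed);

    [pool drain]; // autoreleased objects are cleaned up here
    return 0;
}
```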
There were also problems of style: object syntax is almost LISP-esque in requiring matched pairs of outer brackets - square brackets rather than round, but it still used to mean a lot of hopping back and forth on a line. This also improved a lot towards the end of Objective-C's primacy, as Apple started using Clang itself directly for in-IDE code processing, including predicting where to insert these opening brackets automatically.
But the main problem was this: at runtime, Objective-C provides duck typing. That is, you're free to pass any object type to any method of any other object type and things will just work if the code has been written to expect it. So e.g. there's only one array type, which can hold any mixed list of objects.
When the first versions of the framework were built for NextStep machines with low quantities of megabytes of RAM, that was a huge bonus for many of the complicated data types: e.g. there's also just one version of a dictionary, so having it be entirely typeless means having only one set of code in memory for all applications. Compare and contrast with a generics-based language like C++: each instance of a std::map has distinct code generated at compile time that is specific to the types of the keys and values. Which is faster and safer but a greater burden on memory footprint.
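The duck-typed, single-collection-class design looks like this in practice (a small sketch; the element values are arbitrary):

```objectivec
#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        // One untyped array class serves every element type -- even a
        // mixed list -- so only one copy of its code exists in memory.
        NSArray *mixed = @[@"a string", @42, [NSDate date]];

        for (id item in mixed) {
            // Duck typing: ask each object at runtime whether it
            // answers a given message, rather than checking its type.
            if ([item respondsToSelector:@selector(uppercaseString)]) {
                NSLog(@"%@", [item uppercaseString]); // only the string responds
            }
        }
    }
    return 0;
}
```

Contrast with `std::map<std::string, int>` in C++, where the compiler stamps out fresh, type-specialized code for each key/value combination used.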
There are 'lightweight generics' now in Objective-C that declare an intended type for each collection variable so that the compiler can look to provide probable-misuse warnings, but they're post-Swift and, honestly, primarily for its benefit — they help at the boundaries between the languages because the newer language prefers the safety of types.
Trying to draw this ramble back to a concrete point: I'd say that no, Objective-C was never much of a hindrance. It offers all of C plus a bunch of reflection that is useful for UI programming. There's also empirical evidence to back this up: the officially-supported languages for building OS X applications from day one were Objective-C and Java. The latter was deprecated only a few years later, after market forces selected the former.
I think the language's major crime is oddball syntax; it is also unfortunate that some of the problems that being typeless solved are no longer problems, making it undesirable as an axiomatic feature.
7
Oh, also from the anecdata pile: check out the feelings of Carmack et al. toward early-'90s-era Objective-C: overwhelmingly positive. The original Doom toolset, and the first version of the engine itself, were written within NextStep. The engine was then ported to DOS manually, rather than cross-compiled — I have no direct knowledge but I'll wager it acquired some assembly sections.
– Tommy
11 hours ago
4
... and further to the great-environment-that-history-moved-beyond meme: WorldWideWeb, Tim Berners-Lee's original browser/editor was also a NextStep original.
– Tommy
10 hours ago
3
NeXTStep offered an amazing development environment at the time. Anecdote: The "grey beards" at my Uni insisted on buying a lab full of Sun's, but also allowed a solitary NeXT Cube. Guess which lab station students ended up competing for time on... I thought Obj-C was akin to most other "good" languages - easy to learn, hard to master.
– Brian H
10 hours ago
2
"all standard C libraries are directly callable from Objective-C" - super useful, although the C++ side lacks
– Maury Markowitz
10 hours ago
7
"1986 - Brad Cox and Tom Love create Objective-C, announcing "this language has all the memory safety of C combined with all the blazing speed of Smalltalk." Modern historians suspect the two were dyslexic." -- James Iry, A Brief, Incomplete, and Mostly Wrong History of Programming Languages
– Mason Wheeler
9 hours ago
A nightmare? Seriously?
I never worked for Apple, and I don't know what kind of attitude those people have, but I wrote desktop application software for NeXTStep. I recall Objective-C and the NeXTStep development tools to be a quite reasonable and easy-to-use. As far as I can remember, none of my co-workers had any complaints.
Sounds like somebody was just p***ed off about being asked to do something new and different.
7
the tone of this post is needlessly aggressive
– Neil Meyer
10 hours ago
10
@NeilMeyer, Sorry. I guess I don't know the P.C. formula for describing the stress that some people feel, and the ways in which they might express their feelings when their employer tells them that, "we no longer want you to do this thing for which you've spent the last N years becoming an expert, but you can stay on with us if you're willing to learn this new thing..." My guess is though, that if somebody called the switch to Objective-C a "nightmare," then that may in fact have been what was happening.
– Solomon Slow
10 hours ago
7
Doesn't seem aggressive to me. If anything it validly points out the needless aggression of the original source of the claim (whoever that may be)
– Lightness Races in Orbit
9 hours ago
It's not the best answer, but this is an answer.
– wizzwizz4♦
9 hours ago
1
"Being asked to do something new and different"? More like "annoyed at being asked to re-write an existing codebase to keep up with Apple's latest development fad, again".
– Mark
4 hours ago
Did Objective-C hamper software development for Apple software, or is this just the random experiences of someone on the Internet?
Do you really expect an objective answer here? Languages are a matter of heart and opinion, not really anything factual. Even more so when asking about the truth of an opinion like the one mentioned.
A short comparison might be useful
The main difference is the goal for which C got extended. Both C++ and Objective-C are meant to speed up execution compared with prior, more 'pure' OOP languages by using the rather simple static compile structure of C and extending it with OOP features.
Objective-C focuses on being a language extension, implementing ways for dynamic object and code handling (reflection) while keeping it a compiled language with a minimal runtime. It's generally geared toward making run-time decisions about linkage and message passing.
C++ replaces C by moving toward the use of standard classes. C++ targets completely static code: all decisions about linkage and message passing are made at compile time. Metaprogramming with templates tries to overcome this to some degree.
It can be said that Objective-C is a more basic and thought-through attempt on the language side, while C++ adds many features in less than coherent ways, inviting feature creep through the standard library, often in incompatible manner.
In general, Objective-C may be preferable for larger and more dynamic projects, especially if they are in use over a long time and in many instances. C++ has its merits when it's about closed projects and a small footprint.
So what about the 'hampering'?
Not really. C++ allows for much code mangling to get along with prior C knowledge, plus accepting some aspects as helpful, whereas Objective-C requires truly switching to a cleaner OOP design.
Preferences may come down to programmers' willingness to learn new ways or just muddle through - the latter are of course less expensive to hire and more readily available.
It's a very subjective matter. Programming languages and programmers need to pair up: some programming languages are more suited to the way a programmer is thinking than others. So if a developer is working with a language that seems to get in their way, they surely do not like it.
I for one liked Objective-C when I started working with it back in 2007 (I already had almost 20 years of programming experience in various languages at that time). I still like it. Even back then, it had a lot of nice features and pretty consistent APIs, but its syntax is unusual in the C family of languages.
It was a hindrance insofar that Objective-C is almost solely used for iOS and macOS development, so you are unlikely to come across it when working with other OSs. This limits the people that have experience with it and thus the available resources like documentation and source code when compared to, say, Java which is available everywhere. At the same time this also leads to the advantage of providing a consistent experience for all developers who worked with Objective-C.
Almost equally important are the available APIs (building blocks) provided to the programming language. The ones provided by Apple were pretty consistent even back then (with a few dark, dirty corners here and there) and have (mostly) improved; the need to coexist with Swift has helped in this regard. And like the programming language itself, if an API gets in the way of what a programmer is doing, they don't enjoy it. The APIs provided by Apple are very verbose; some names can become very long. Some people love that, some people hate it.
It should be fairly evident that Objective-C has not hindered the growth of software in Apple's "ecosystem". For this, you only need to look at the success of the App Store.
Recall that iOS (originally, just OS X for the iPhone) started off as a closed development environment. The only official apps for iPhone were those internally developed by Apple. Of course, they were developed using the Cocoa Framework and Objective-C language brought over from OS X. A full year after the iPhone release, the App Store opened a floodgate of new developers adopting Cocoa and Objective-C. From Wikipedia:
The App Store was opened on July 10, 2008, with an initial 500 applications available. As of 2017, the store features over 2.1 million apps.
So, regardless of any developer's personal feelings on whether it is a nice development experience, or whether the language has serious shortcomings, the objective evidence proves that software was produced on a grand scale using this platform.
Seems like a very opinion-oriented question, but as someone who's programmed in a lot of different environments (including Objective-C)... IMO, Objective-C could indeed qualify as a nightmare when compared to, well, virtually anything else. Personally, to me it's like the worst parts of C and the worst parts of LISP combined, and I truly wish they had gone with something else, really anything else... :-)
3
I would be really interested in what exactly you thought were the worst parts of c and LISP that they combined.
– Neil Meyer
11 hours ago
How much time did you spend in Objective-C?
– Ed Plunkett
10 hours ago
3
I've only done a little Objective-C (and all of it within the past few months) and I've found it to be decent enough.
– Lightness Races in Orbit
9 hours ago
I'm not sure how it could be "the worst parts of C" since it's technically all the parts of C (i.e., a strict superset, and some Smalltalk style messaging). So, an equally valid way to put it is that it's "like the best parts of C." As best as I can tell, you don't care for some aspects of the syntax. If you truly feel like "anything" else would've been better, it seems like either your experience with Objective-C (and C) isn't particularly deep or your experience with other languages isn't particularly broad. (I sincerely don't mean that to be an insult).
– D. Patrick
8 hours ago
It's an excellent and extremely powerful language. The syntax needs a bit of time to get used to, but after a week or so you should have no problems whatsoever.
Named arguments are the best innovation of Objective-C. Lots of things that are bad in C++ because a function call is not self-documenting go away in Objective-C. There are observers built into the language: any property can be observed, that is, arbitrary code can say "I want to be notified when this property changes" - great for having the weakest possible coupling between code. There are interfaces, so you are not restricted to subclassing. There are closures. There are class extensions - if you ever wished you could extend std::string (add methods to it, not subclass it), you can do that in Objective-C.
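Two of those features, named arguments and property observation (key-value observing), can be sketched together. The `Account` and `Watcher` classes below are invented for illustration; the observer registration API is Foundation's real one.

```objectivec
#import <Foundation/Foundation.h>

@interface Account : NSObject
@property NSInteger balance;
@end
@implementation Account
@end

@interface Watcher : NSObject
@end
@implementation Watcher
// Called for any key path this object has registered interest in.
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    NSLog(@"%@ changed to %@", keyPath, change[NSKeyValueChangeNewKey]);
}
@end

int main(void) {
    @autoreleasepool {
        Account *a = [Account new];
        Watcher *w = [Watcher new];
        // Named-argument style: each piece of the selector labels the
        // parameter that follows it, so the call reads like a sentence.
        [a addObserver:w
            forKeyPath:@"balance"
               options:NSKeyValueObservingOptionNew
               context:NULL];
        a.balance = 100; // Watcher logs: balance changed to 100
        [a removeObserver:w forKeyPath:@"balance"];
    }
    return 0;
}
```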
It's an excellent language. Swift is better - after a non-trivial learning curve, but that's with 20-30 years more experience.
Another thing worth noting (as a developer but also a programming instructor at the college and high school levels): Objective-C is INCREDIBLY simple to learn for people new to programming. I have also taught Python and more recently Swift, and despite my increased experience, Objective-C seems to be what new programmers pick up most rapidly. My guess is that this is because the language is quite verbose and intersperses arguments with the function name, so that function calls become more like sentences and are more relatable. By the same token it is very different from the canonical languages - much more so than, say, going from Java to C# - so people who already know one of them can struggle, because it doesn't look the way that it "should".
As for ease of use, I would say that it is relatively easy; in most cases, when you make a language simpler you also make it easier to write very poor quality software, which can be a nightmare. This is best known with JavaScript and other weakly-typed languages, but Objective-C does provide more freedom than Java (and exposes abilities that are present but complicated to use in C), so it is possible for code to be of lower quality.
For my personal bias I really like Objective-C, but I understand how some people can hate it.
I've been programming professionally for 30 years and have worked with plenty of languages, and I hated, HATED, HATED Objective-C. It never made any sense to me. I tried to figure it out, but whenever I thought I had it, I didn't. Finally I gave up and moved on to something else. So, was Objective-C really a hindrance to Apple software development? Yes, it was. It certainly was for me.
A hindrance is not necessarily a barrier, however. The availability of other tools for doing iOS development, particularly with C++, has made learning Objective-C unnecessary. But I do believe that plenty of developers were scared off by Objective-C and never even investigated the alternatives.
In my experience, the language itself is no more difficult to learn, than any other language. Yes, it has a quirky syntax that many find unfamiliar but it is not difficult to understand.
The system libraries for OSX and iOS, on the other hand, are like the menu at Cheesecake Factory, very large and full of lots of things you will never consume.
11 Answers
Objective-C was by all accounts a nightmare to work with
I loved it. Loved it.
Some background: in the 90s I worked for a developer here in Toronto with a Mac and Win app. I wanted to work on the dev side, but I had no formal training, and I found the barrier to entry to be too high for my interest level. To do anything useful, you had to learn the OS, the IDE, the language and the library, each of which was some level of dismal. For instance, the text editor widget on the Mac couldn't handle more than 32k of text, and the various libraries just called into it. If you wanted to edit more text, well, have fun!
In 1998 Apple sent me a copy of OpenStep, or as they called it, Rhapsody Preview. After some install issues (lack of drivers; I had to replace the CD-ROM drive with one it knew about), I had my first real program running in a day. A real program.
Because unlike the Mac or Win of that era, the OS was the library, and the library was f'ng amazing. Text editor? How about one that fully supported Unicode, was limited in length only by a 32-bit int, automatically paged data as needed (because that's how the whole system worked), did complex layout like columns, flowed text around graphics and such, and had a built-in spell checker. The entire library was like this: the base objects were super-powerful out of the box and tightly integrated with each other and the entire OS as a whole. I hate to use this word, but it had a synergy that had to be used to be understood.
Contrast with, say, Win + MFC... gebus. It was like Lisp Machine vs. PDP-8. .Net helped, and C# is better than Obj-C (I'd say it's my favorite language), but it was decades before .Net got close to the OpenStep of the 90s, and even today its base objects still suck - why can't they get an array type right after 20 f'in years?! Every time I use it I end up wondering why some totally base object is missing some totally obvious feature, or why they have five objects to do the same thing, each with their own set of dumbness.
Obj-C was no worse than other languages, except perhaps in syntax (perhaps). It had two super-amazing advantages though. Extensions let you add code to existing compiled objects, so you could add spell checking to someone else's text editor for instance, and the handling of nil was wonderful.
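A rough sketch of those two advantages in Swift terms, since Swift kept one and dropped the other: extensions are the descendant of Obj-C categories and still let you bolt methods onto types you didn't write, while Obj-C's silent nil-messaging became explicit optional chaining. The `wordCount` method here is a made-up example, not a real API.

```swift
// Extensions add methods to an existing, already-compiled type --
// the same trick Obj-C categories used to add behaviour (like spell
// checking) to someone else's classes. `wordCount` is hypothetical.
extension String {
    func wordCount() -> Int {
        return self.split(separator: " ").count
    }
}

// Obj-C let you message nil and quietly get back nil/zero.
// Swift's nearest equivalent is explicit optional chaining:
let missing: String? = nil
let count = missing?.wordCount() ?? 0   // 0, no crash
```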
Swift... well I like some things and don't like others. The whole class/struct thing they boast about is, to me, a crock. Yes, I know it's more efficient etc, but it really is much less flexible than just declaring a class and using it. I also hate hate hate putting the type after the declaration: int c=0 is simply easier to read than var c:Int=0, and int doSomething() is lightyears better than func doSomething() -> Int. Bah! Swift also lost the wonderful nil handling, and I can't for the life of me see an upside - everyone just puts ! on everything.
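For readers who don't write Swift, a minimal sketch of the declaration style being complained about, plus the inference shortcut that makes the annotation optional (the function body is a trivial placeholder):

```swift
// The Swift forms discussed above: name first, type after.
var c: Int = 0
func doSomething() -> Int {
    return 42   // placeholder body
}

// The annotation can be dropped whenever a value is assigned on the
// same line -- the compiler infers the type:
var d = 0            // inferred as Int
let label = "ready"  // inferred as String
```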
Overall, yes, Swift is an improvement. But Obj-C was pretty great too. At least in the 90s. It collected a LOT of cruft when it moved to Mac/iOS, and much of that was ugly and just totally bolted-on. So early Obj-C and Swift were pretty similar in ease-of-use IMHO, while late Obj-C was indeed getting to be a real downer.
5
"So early Obj-C and Swift were pretty similar in ease-of-use IMHO, while late Obj-C was indeed getting to be a real downer." - I think this is the crux of the issue. You'll get wildly varying answers depending on what timeframe people think of.
– Ruther Rendommeleigh
9 hours ago
1
It's a little sad that NeXTStep was already 10+ years old by the time it started to go mainstream via OS X. I think a little longer and that Window would have been closed forever [pun intended].
– Brian H
9 hours ago
3
I loved Obj-C, but to be fair, I love Swift too, and anyone who "puts ! on everything" is doing it completely wrong.
– par
7 hours ago
1
In defence of Swift: you mention stating the type as var c:Int=0; in this case, you can do without the type completely and have it inferred by doing var c = 0. This is always the case when you declare & assign a value to a variable in a single line. Also, as @par mentioned, abuse of ! is a sign of poor coding standards, and it is definitely not the case that "everyone puts ! on everything", especially in enterprise code.
– Ferdz
5 hours ago
See softwareengineering.stackexchange.com/questions/316217/… for reasoning behind putting the type last in languages like Swift. Go's reasoning is also very compelling, especially when talking about function types: blog.golang.org/gos-declaration-syntax
– Logan Pickup
3 hours ago
answered 10 hours ago
Maury Markowitz
Swift was introduced only in mid-2014, so I think perhaps some of those people's beards must have greyed very rapidly! That aside, Objective-C attempts to fuse two different languages: Smalltalk and C. So it's a compiled language, like C, that for object types also supports dynamic dispatch and introspection, like Smalltalk.
It's actually a strict superset of C: all standard C libraries are directly callable from Objective-C, and many of the very central parts of Apple's libraries are written directly in C.
Object types are dynamic enough that you can look up available methods and their types, and declared properties and their types, and access either by name at runtime if desired. This functionality is central to Apple's UI libraries: e.g. to tell a button which action to perform when pressed, you tell it the identity of the object it should call plus the name of the method, and the runtime does the necessary method routing. So things like the UI designer don't generate any code. There's no hidden mapping file full of comments warning that it was automatically generated and please don't edit it.
There's at least one problem that stopped being a problem a long time ago: into the OS X era, memory was managed manually — you were responsible for remembering to take a small number of idiomatic steps to ensure proper memory allocation and deallocation. But they were so idiomatic that no thought was really required, and indeed the compiler was able to assume responsibility for them circa 2010.
There were also problems of style: object syntax is almost LISP-esque in requiring matched pairs of outer brackets (square brackets rather than round), and it still used to mean a lot of hopping back and forth on a line. This also improved a lot towards the end of Objective-C's primacy, as Apple started using Clang itself directly for in-IDE code processing, including automatically predicting where to insert those opening brackets.
But the main problem was this: at runtime, Objective-C provides duck typing. That is, you're free to pass any object type to any method of any other object type and things will just work if the code has been written to expect it. So e.g. there's only one array type, which can hold any mixed list of objects.
When the first versions of the framework were built for NextStep machines with only a few megabytes of RAM, that was a huge bonus for many of the complicated data types: e.g. there's also just one version of a dictionary, so having it be entirely typeless means having only one set of code in memory for all applications. Compare and contrast with a generics-based language like C++: each instance of a std::map has distinct code generated at compile time that is specific to the types of the keys and values. That is faster and safer, but a greater burden on memory footprint.
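The trade-off can be sketched in Swift terms, where both styles exist side by side; nothing here is Apple-framework-specific, just a small illustration:

```swift
// One typeless collection for everything, as with Obj-C's NSArray:
// a single implementation in memory, element types checked at runtime.
let mixed: [Any] = [1, "two", 3.0]

// A generic collection, as with C++'s std::map: checked per element
// type at compile time -- faster and safer, though the compiler may
// emit separate code for each instantiation.
let ints: [Int] = [1, 2, 3]
let total = ints.reduce(0, +)
```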
There are 'lightweight generics' now in Objective-C that declare an intended type for each collection variable so that the compiler can provide probable-misuse warnings, but they're post-Swift and, honestly, primarily for its benefit — they help at the boundaries between the languages because the newer language prefers the safety of types.
Trying to draw this ramble back to a concrete point: I'd say that no, Objective-C was never much of a hindrance. It offers all of C plus a bunch of reflection that is useful for UI programming. There's also empirical evidence to back this up: the officially-supported languages for building OS X applications from day one were Objective-C and Java. The latter was deprecated only a few years later, after market forces selected the former.
I think the language's major crime is oddball syntax; it is also unfortunate that some of the problems that being typeless solved are no longer problems, making it undesirable as an axiomatic feature.
7
Oh, also from the anecdata pile: check out the feelings of Carmack et al about early-'90s era Objective-C: overwhelmingly positive. The original Doom toolset, and the first version of the engine itself, were written within NextStep. The engine itself was then ported to DOS manually, rather than cross-compiled — I have no direct knowledge, but I'll wager it acquired some assembly sections.
– Tommy
11 hours ago
4
... and further to the great-environment-that-history-moved-beyond meme: WorldWideWeb, Tim Berners-Lee's original browser/editor was also a NextStep original.
– Tommy
10 hours ago
3
NeXTStep offered an amazing development environment at the time. Anecdote: the "grey beards" at my Uni insisted on buying a lab full of Suns, but also allowed a solitary NeXT Cube. Guess which lab station students ended up competing for time on... I thought Obj-C was akin to most other "good" languages - easy to learn, hard to master.
– Brian H
10 hours ago
2
"all standard C libraries are directly callable from Objective-C" - super useful, although the C++ side lacks
– Maury Markowitz
10 hours ago
7
"1986 - Brad Cox and Tom Love create Objective-C, announcing "this language has all the memory safety of C combined with all the blazing speed of Smalltalk." Modern historians suspect the two were dyslexic." -- James Iry, A Brief, Incomplete, and Mostly Wrong History of Programming Languages
– Mason Wheeler
9 hours ago
Swift was introduced only in mid-2014 so I think perhaps some of those people's beards have greyed out very rapidly! That aside, Objective-C attempts to fuse two different languages: Smalltalk and C. So it's a compiled language, like C, that for object types also supports dynamic dispatch and introspection, like Smalltalk.
It's actually a strict superset of C: all standard C libraries are directly callable from Objective-C, and many of the very central parts of Apple's libraries are written directly in C.
Object types are dynamic enough that you can look up available methods and their types, and declared properties and their types, and access either by name at runtime if desired. This functionality is central to Apple's UI libraries: e.g. to tell a button which action to perform when pressed, you give it the identity of the object it should call plus the name of the method, and the runtime does the necessary method routing. So things like the UI designer don't generate any code. There's no hidden mapping file full of comments warning that it was automatically generated and shouldn't be edited.
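A minimal sketch of that by-name routing, using Foundation's real selector machinery; the `Controller` class and `buttonPressed:` method are made up for illustration, standing in for what a button's target/action wiring does internally:

```objc
#import <Foundation/Foundation.h>

@interface Controller : NSObject
- (void)buttonPressed:(id)sender;
@end

@implementation Controller
- (void)buttonPressed:(id)sender { NSLog(@"pressed"); }
@end

int main(void) {
    @autoreleasepool {
        Controller *c = [[Controller alloc] init];
        // A button stores only a target (an object) and an action (a method
        // name). At runtime it sends the message by name; no code was
        // generated to connect the two.
        SEL action = NSSelectorFromString(@"buttonPressed:");
        if ([c respondsToSelector:action]) {
            [c performSelector:action withObject:nil];
        }
    }
    return 0;
}
```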
There's at least one problem that stopped being a problem a long time ago: into the OS X era, memory was managed manually, meaning you were responsible for remembering to take a small number of idiomatic steps to ensure proper memory allocation and deallocation. But they were so idiomatic that little thought was really required, and indeed the compiler was able to assume responsibility for them circa 2010, with the introduction of Automatic Reference Counting.
There were also problems of style: object syntax is almost LISP-esque in requiring matched pairs of outer brackets. Square brackets rather than round, but it still used to mean a lot of hopping back and forth on a line. This also improved a lot towards the end of Objective-C's primacy, as Apple started using Clang directly for in-IDE code processing, including predicting where to insert the opening bracket automatically.
But the main problem was this: at runtime, Objective-C provides duck typing. That is, you're free to pass any object type to any method of any other object type and things will just work if the code has been written to expect it. So e.g. there's only one array type, which can hold any mixed list of objects.
When the first versions of the framework were built for NextStep machines with low quantities of megabytes of RAM, that was a huge bonus for many of the complicated data types: e.g. there's also just one version of a dictionary, so having it be entirely typeless means having only one set of code in memory for all applications. Compare and contrast with a generics-based language like C++: each instance of a std::map has distinct code generated at compile time that is specific to the types of the keys and values. Which is faster and safer but a greater burden on memory footprint.
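To make that trade-off concrete: the two dictionaries below share one compiled implementation regardless of what they hold, whereas two differently-typed `std::map`s in C++ would each get their own generated code. (A minimal sketch; the contents are arbitrary.)

```objc
#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        // One NSDictionary implementation serves every key/value combination,
        // so a single copy of its code lives in the shared framework.
        NSDictionary *a = @{ @"answer" : @42 };
        NSDictionary *b = @{ @7 : [NSDate date] };
        // Both objects run exactly the same framework code paths.
        NSLog(@"%@ / %@", a, b);
    }
    return 0;
}
```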
There are 'lightweight generics' now in Objective-C that declare an intended type for each collection variable so that the compiler can look to provide probable-misuse warnings, but they're post-Swift and, honestly, primarily for its benefit: they help at the boundaries between the languages because the newer language prefers the safety of types.
Trying to draw this ramble back to a concrete point: I'd say that no, Objective-C was never much of a hindrance. It offers all of C plus a bunch of reflection that is useful for UI programming. There's also empirical evidence to back this up: the officially-supported languages for building OS X applications from day one were Objective-C and Java. The latter was deprecated only a few years later, after market forces selected the former.
I think the language's major crime is oddball syntax; it is also unfortunate that some of the problems that being typeless solved are no longer problems, making it undesirable as an axiomatic feature.
answered 11 hours ago
– Tommy
Oh, also from the anecdata pile: check out the feelings of Carmack et al. about early-'90s-era Objective-C: overwhelmingly positive. The original Doom toolset, and the first version of the engine itself, were written within NextStep. The engine itself was then ported to DOS manually, rather than cross-compiled; I have no direct knowledge but I'll wager it acquired some assembly sections.
– Tommy
11 hours ago
... and further to the great-environment-that-history-moved-beyond meme: WorldWideWeb, Tim Berners-Lee's original browser/editor was also a NextStep original.
– Tommy
10 hours ago
NeXTStep offered an amazing development environment at the time. Anecdote: the "grey beards" at my Uni insisted on buying a lab full of Suns, but also allowed a solitary NeXT Cube. Guess which lab station students ended up competing for time on... I thought Obj-C was akin to most other "good" languages: easy to learn, hard to master.
– Brian H
10 hours ago
"all standard C libraries are directly callable from Objective-C" - super useful, although the C++ side lacks
– Maury Markowitz
10 hours ago
"1986 - Brad Cox and Tom Love create Objective-C, announcing "this language has all the memory safety of C combined with all the blazing speed of Smalltalk." Modern historians suspect the two were dyslexic." -- James Iry, A Brief, Incomplete, and Mostly Wrong History of Programming Languages
– Mason Wheeler
9 hours ago
A nightmare? Seriously?
I never worked for Apple, and I don't know what kind of attitude those people have, but I wrote desktop application software for NeXTStep. I recall Objective-C and the NeXTStep development tools being quite reasonable and easy to use. As far as I can remember, none of my co-workers had any complaints.
Sounds like somebody was just p***ed off about being asked to do something new and different.
answered 11 hours ago
– Solomon Slow
the tone of this post is needlessly aggressive
– Neil Meyer
10 hours ago
@NeilMeyer, Sorry. I guess I don't know the P.C. formula for describing the stress that some people feel, and the ways in which they might express their feelings when their employer tells them that, "we no longer want you to do this thing for which you've spent the last N years becoming an expert, but you can stay on with us if you're willing to learn this new thing..." My guess is though, that if somebody called the switch to Objective-C a "nightmare," then that may in fact have been what was happening.
– Solomon Slow
10 hours ago
Doesn't seem aggressive to me. If anything it validly points out the needless aggression of the original source of the claim (whoever that may be)
– Lightness Races in Orbit
9 hours ago
It's not the best answer, but this is an answer.
– wizzwizz4♦
9 hours ago
"Being asked to do something new and different"? More like "annoyed at being asked to re-write an existing codebase to keep up with Apple's latest development fad, again".
– Mark
4 hours ago
Did objective-C hamper software development for Apple software or is this just the random experiences of someone on the internet?
Do you really expect an objective answer here? Languages are a matter of heart and opinion, not really anything factual. Even more so when asking about the truth of an opinion like the one mentioned.
A short comparison might be useful
The main difference lies in the goal for which C got extended. Both C++ and Objective-C are meant to speed up execution compared with prior, more 'pure' OOP languages by building on the rather simple static compile structure of C and extending it with OOP features.
Objective-C focuses on being a language extension, implementing ways for dynamic object and code handling (reflection) while remaining a compiled language with a minimal runtime. It is generally geared toward making run-time decisions about linkage and message passing.
C++ replaces C by moving toward the use of standard classes. C++'s target is completely static code: all decisions about linkage and message passing are made at compile time. Metaprogramming with templates tries to overcome this to some extent.
It can be said that Objective-C is a more basic and thought-through attempt on the language side, while C++ adds many features in less than coherent ways, inviting feature creep through the standard library in an often incompatible manner.
In general, Objective-C may be preferable for larger and more dynamic projects, especially if they are in use over a long time and in many instances. C++ has its merits when it's about closed projects and a small footprint.
So what about the 'hampering'?
Not really. C++ allows for much code mangling to get along with prior C knowledge, plus accepting some aspects as helpful, whereas Objective-C requires truly switching to a cleaner OOP design.
Preferences may come down to the willingness of programmers to learn new ways or just muddle through; the latter are of course less expensive to hire and more readily available.
answered 9 hours ago
– Raffzahn
It's a very subjective matter. Programming languages and programmers need to pair up: some programming languages are more suited to the way a programmer is thinking than others. So if a developer is working with a language that seems to get in their way, they surely do not like it.
I for one liked Objective-C when I started working with it back in 2007 (I already had almost 20 years of programming experience in various languages at that time). I still like it. Even back then it had a lot of nice features and pretty consistent APIs, but its syntax is unusual in the C family of languages.
It was a hindrance insofar that Objective-C is almost solely used for iOS and macOS development, so you are unlikely to come across it when working with other OSs. This limits the people that have experience with it and thus the available resources like documentation and source code when compared to, say, Java which is available everywhere. At the same time this also leads to the advantage of providing a consistent experience for all developers who worked with Objective-C.
Almost equally important are the APIs (building blocks) available to the programming language. The ones provided by Apple were pretty consistent even back then (with a few dark, dirty corners here and there) and have mostly improved since; the need to coexist with Swift has helped in this regard. And like the programming language itself, if an API gets in the way of what a programmer is doing, they don't enjoy it. The APIs provided by Apple are very verbose, and some names can become very long. Some people love that, some people hate it.
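For a taste of that verbosity: a typical Foundation call spells out every argument in the method name, which reads almost like a sentence but gets long quickly (this uses NSString's real `stringByReplacingOccurrencesOfString:withString:` method):

```objc
#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        // Each argument gets a named slot in the selector itself.
        NSString *s = [@"Objective-C is verbose"
            stringByReplacingOccurrencesOfString:@"verbose"
                                      withString:@"expressive"];
        NSLog(@"%@", s);
    }
    return 0;
}
```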
edited 9 hours ago
answered 10 hours ago
DarkDust
22016
It should be fairly evident that Objective-C has not hindered the growth of software in Apple's "ecosystem". For that, you only need to look at the success of the App Store.
Recall that iOS (originally just "OS X for the iPhone") started off as a closed development environment: the only official apps for the iPhone were those developed internally by Apple, using the Cocoa frameworks and the Objective-C language brought over from OS X. A full year after the iPhone's release, the App Store opened a floodgate of new developers adopting Cocoa and Objective-C. From Wikipedia:
The App Store was opened on July 10, 2008, with an initial 500 applications available. As of 2017, the store features over 2.1 million apps.
So, regardless of any developer's personal feelings about whether it is a pleasant development experience, or whether the language has serious shortcomings, the objective evidence shows that software was produced on a grand scale with this platform.
answered 8 hours ago
Brian H
18.1k68156
This seems like a very opinion-oriented question, but as someone who has programmed in a lot of different environments (including Objective-C): IMO, Objective-C could indeed qualify as a nightmare compared to, well, virtually anything else. To me it feels like the worst parts of C and the worst parts of LISP combined, and I truly wish they had gone with something else, really anything else... :-)
I would be really interested in what exactly you thought were the worst parts of C and LISP that they combined.
– Neil Meyer
11 hours ago
How much time did you spend in Objective-C?
– Ed Plunkett
10 hours ago
I've only done a little Objective-C (and all of it within the past few months) and I've found it to be decent enough.
– Lightness Races in Orbit
9 hours ago
I'm not sure how it could be "the worst parts of C" since it's technically all the parts of C (i.e., a strict superset, and some Smalltalk style messaging). So, an equally valid way to put it is that it's "like the best parts of C." As best as I can tell, you don't care for some aspects of the syntax. If you truly feel like "anything" else would've been better, it seems like either your experience with Objective-C (and C) isn't particularly deep or your experience with other languages isn't particularly broad. (I sincerely don't mean that to be an insult).
– D. Patrick
8 hours ago
answered 12 hours ago
Brian Knoblauch
345210
It's an excellent and extremely powerful language. The syntax takes a bit of time to get used to, but after a week or so you should have no problems whatsoever.
Named arguments are Objective-C's best feature. Many things that are bad in C++ because a function call is not self-documenting go away in Objective-C. There are observers built into the language: any property can be observed, that is, arbitrary code can say "I want to be notified when this property changes", which is great for achieving the weakest possible coupling between components. There are protocols (interfaces), so you are not restricted to subclassing. There are closures (blocks). And there are class extensions (categories): if you ever wished you could extend std::string (add methods to it without subclassing), you can do exactly that in Objective-C.
It's an excellent language. Swift is better, after a non-trivial learning curve, but Swift also had 20-30 years more experience to draw on.
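To make two of these features concrete, here is a minimal sketch (the `reversedString` category method is invented for illustration; the rest uses standard Foundation API) showing interleaved named arguments and a category that adds a method to an existing class without subclassing:

```objectivec
#import <Foundation/Foundation.h>

// A category: add a method to NSString itself, no subclass needed.
@interface NSString (Reversing)
- (NSString *)reversedString;
@end

@implementation NSString (Reversing)
- (NSString *)reversedString {
    NSMutableString *result = [NSMutableString stringWithCapacity:self.length];
    for (NSInteger i = (NSInteger)self.length - 1; i >= 0; i--) {
        [result appendFormat:@"%C", [self characterAtIndex:(NSUInteger)i]];
    }
    return result;
}
@end

int main(void) {
    @autoreleasepool {
        // Interleaved named arguments: the call reads almost like a sentence.
        NSString *s = [@"a hindrance" stringByReplacingOccurrencesOfString:@"hindrance"
                                                                withString:@"help"];
        NSLog(@"%@", s);
        // The category method is available on every NSString instance.
        NSLog(@"%@", [@"stressed" reversedString]);
    }
    return 0;
}
```

Compare the selector `stringByReplacingOccurrencesOfString:withString:` with an equivalent C++ call such as `replace(a, b)`: in the Objective-C version each argument is labeled at the call site, which is the self-documenting quality the answer describes.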
answered 5 hours ago
gnasher729
1112
Another thing worth noting (as a developer, but also a programming instructor at the college and high-school levels): Objective-C is remarkably easy to learn for people new to programming. I have also taught Python and more recently Swift, and despite my increased experience, Objective-C seems to be what new programmers pick up most rapidly. My guess is that this is because the language is quite verbose and intersperses arguments with the method name, so that calls read more like sentences and are more relatable. By the same token, it is far more different than, say, going from Java to C#, so people who already know one of the canonical languages can struggle, because it doesn't look the way it "should".
As for ease of use, I would say it is relatively easy; but in most cases, when you make a language simpler you also make it easier to write very poor-quality software, which can be a nightmare. This is best known from JavaScript and other weakly-typed languages, but Objective-C does provide more freedom than Java (and exposes abilities that are present but complicated to use in C), so it is possible for code to be of lower quality.
As for my personal bias: I really like Objective-C, but I understand how some people can hate it.
answered 2 hours ago
Christophe
111
I've been programming professionally for 30 years and have worked with plenty of languages, and I hated, HATED, HATED Objective-C. It never made any sense to me. I tried to figure it out, but whenever I thought I had it, I didn't. Finally I gave up and moved on to something else. So, was Objective-C really a hindrance to Apple software development? Yes, it was. It certainly was for me.
A hindrance is not necessarily a barrier, however. The availability of other tools for iOS development, particularly ones based on C++, has made learning Objective-C unnecessary. But I do believe that plenty of developers were scared off by Objective-C and never even investigated the alternatives.
answered 3 hours ago
Mohair
1192
In my experience, the language itself is no more difficult to learn than any other language. Yes, it has a quirky syntax that many find unfamiliar, but it is not difficult to understand.
The system libraries for OS X and iOS, on the other hand, are like the menu at the Cheesecake Factory: very large and full of things you will never consume.
answered 1 hour ago
Justin Ohms
1012
Thanks for contributing an answer to Retrocomputing Stack Exchange!
Whatever various people's anecdotes may claim, it's hard to argue with the timing evidence. Objective-C was introduced at Apple when Steve Jobs came back and started stuffing it down everyone's throats, and when it was announced that he had terminal cancer, Apple didn't even wait for him to be dead before they started working on a replacement for it! It's difficult to draw any other conclusion than that Obj-C was something that Jobs personally loved and most of the rest of the company hated.
– Mason Wheeler
9 hours ago
Probably because Jobs wasn't a programmer :)
– dashnick
9 hours ago
@MasonWheeler Hmm, looking at Swift, it is noteworthy that the Objective-C style of message-based dynamic linking was kept, while the C-style parts were dropped. Seems like Objective-C's merits did outlast Jobs' time on the planet.
– Raffzahn
9 hours ago
@Raffzahn Yeah, they kind of had to keep support for the infrastructure that all the OS APIs were built on...
– Mason Wheeler
9 hours ago
Just as a counter argument, the fact that development was not hindered by Obj-C is self-evident just by looking at the massive success of the App Store. After all, the great mass of Obj-C developers are NOT Apple employees; they are 3rd party iOS developers.
– Brian H
9 hours ago