
A decade is a pretty short time for an investment of that magnitude. Learn C (yes, I know), COBOL, Java, Erlang, Python or any other non-niche language and you'll hopefully be employable for decades, or at least until general AI rolls around.


The languages themselves actually aren't the big investment. It's really two things:

1) Learning new concepts in computer science. For example, Swift uses a lot of modern concepts around type safety, closures, parallelism, synchronization, etc. These have a steep learning curve, but they aren't specific to Swift, and you'll notice other modern languages adopting the same concepts.

2) The UI frameworks are fundamentally different. If you're an expert at Xcode constraints, storyboards, etc., none of that applies to Android, and vice versa.

I can tell you this: if you do bite the bullet and learn both very well, you'll be surprised how much it helps you learn other systems more quickly, because you've seen all the concepts before.


For example Swift uses a lot of modern concepts around type safety, closures, parallelism, synchronization, etc.

None of those things seem particularly modern to me. Does Swift actually have any concepts that weren't already implemented in other languages by, say, 1980?

It might seem like I'm being pedantic, but the flip side is - why not learn older, simpler, more mature languages that already have those concepts?


>None of those things seem particularly modern to me. Does swift actually have any concepts that weren't already implemented in other languages by say, 1980?

Modern for mainstream languages. Non-mainstream languages are irrelevant to the discussion, since nobody cares about them except niche industries, hobbyists and academics...

>but the flip side is - why not learn older, simpler, more mature languages that already have those concepts?

Because those languages aren't tied to a $50 billion app industry and don't have major adoption and growing support.


It has options instead of null
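To make that concrete, here's a minimal sketch of Swift optionals replacing null: the type system marks the value as possibly absent, and the compiler forces you to handle the nil case before use.

```swift
// Parsing can fail, so Int(String) returns Int? ("optional Int"),
// not an Int that might secretly be null.
let maybeNumber: Int? = Int("42")

// You must unwrap before using the value.
if let n = maybeNumber {
    print(n + 1)            // n is a plain Int inside this branch
} else {
    print("not a number")
}

// Nil-coalescing supplies a default when the value is absent.
let unwrapped = maybeNumber ?? 0
print(unwrapped)            // 42
```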


To which the same argument applies.


Which languages before 1980 had an option/maybe type?


https://en.wikipedia.org/wiki/Tagged_union#1960s (and so on in the rest of the article; you could make the argument that being able to express a sum type and using option/result pervasively are different things; I don't know much about early ML but it was from '73...)


No, a sum type is the essence.

TIL that ML is ooold.
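To illustrate the "sum type is the essence" point: Swift's Optional is conceptually just a two-case enum, i.e. a tagged union in the 1960s sense. A sketch using a hypothetical Maybe type (so it doesn't clash with the standard library's Optional):

```swift
// A hypothetical Maybe<T>, mirroring how Swift's Optional<Wrapped>
// is defined: a sum type with exactly two cases, one carrying a payload.
enum Maybe<T> {
    case none
    case some(T)
}

func describe(_ m: Maybe<Int>) -> String {
    switch m {                      // the compiler checks exhaustiveness
    case .none:        return "nothing"
    case .some(let x): return "value \(x)"
    }
}

print(describe(.some(7)))   // value 7
print(describe(.none))      // nothing
```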


ML has roots that go back a long way, but it wasn't developed as a general-purpose programming language (for use outside theorem provers) until the '80s, and I'd consider it properly "released" to the general public as something intended for real use only in the 1990s, with the publication of the Standard ML definition (1990) and the release of OCaml (1996).


Protocol extensions.
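For context, protocol extensions let you attach default method implementations to a protocol, so every conforming type gets the behavior for free. A minimal sketch:

```swift
protocol Greeter {
    var name: String { get }
}

// Protocol extension: conforming types inherit greet()
// without writing any code of their own.
extension Greeter {
    func greet() -> String {
        return "Hello, \(name)!"
    }
}

struct Robot: Greeter {
    let name: String
}

print(Robot(name: "R2").greet())   // Hello, R2!
```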


It's never just the language. The libraries are where the time goes and a language without libraries is fairly useless these days.


This, and package managers.

For example, node.js: many people say it's popular because lots of webdevs knew JS already, but that's not even close to true. Most people went to node.js because they found a couple of usable libraries on npm that gave them the trust necessary to start the project.


You got it reversed: when Node started getting adoption, npm wasn't even a thing, or was still small. Back in Node's early days it was all "we can run the JS we know on the server now" (plus some cargo-cult hype about it being "fast because async").


> For example Swift uses a lot of modern concepts around type safety, closures, parallelism, synchronization, etc.

Ada. [0]

It's been around since before '83 (when it became an ANSI standard) and was developed for the DOD to replace the hodgepodge of languages they were using, with an emphasis on safety.

It has all of the above, whilst focusing on reading like plain English.

Ada also has a few things that are considered to be fairly modern, and has had them for a long time. Such as:

* No Primitive Data Types

* Type Contracts

* Subtyping, operator overloading

The more languages change... The more they stay the same.

These concepts aren't new.

[0] http://www.adacore.com/adaanswers/about/ada


@Coldtea had it right - I meant it has many concepts that are new to the most popular platforms. For example C# is very popular, but what it has of these features has trickled in slowly over the years and many .NET programmers have had no need to master them, same with Java.

Ada was a beautiful design for its time. Maybe its most fundamental flaw was government oversight. So much of designing a widely successful language is non-technical: choosing the right features, trends and hardware to build enough momentum for a self-sustaining ecosystem. Doing so often requires an agility that the DOD just doesn't have. Ada is one of many examples reminding us that technical superiority commonly loses out to pragmatism.

I do like the name, it would have been fascinating to know Ada Lovelace.


Ada does not have parametric polymorphism, algebraic data types, Objective-C interoperability, etc.


I wasn't criticising Swift. I criticised certain features of Swift being regarded as modern.

But, if you insist:

> Ada does not have parametric polymorphism.

Yes, it does. [1][2] It's supported through the use of generic units.

> Ada does not have ... algebraic data types

Ada does have tagged records and other variant types, and has had them for quite some time. [0] They aren't quite sum types, but they are incredibly close.

> Ada does not have... Objective-C interoperability

GNAT does. It's part of the GNU Compiler Collection, and as such, can be linked against other languages supported by the toolchain. GCC also supports Objective-C.

[0] http://archive.adaic.com/standards/83rat/html/ratl-04-07.htm...

[1] https://rosettacode.org/wiki/Parametric_polymorphism#Ada

[2] https://en.wikibooks.org/wiki/Ada_Programming/Generics#Param...


> GNAT does. It's part of the GNU Compiler Collection

Yay! :-)

One thing that's probably almost unknown these days is that Objective-<X> was always supposed to be something you can easily add to any <X>, and in fact there were quite a few of these, including Objective-Assembler.


Turns out I was quite ignorant of Ada - thanks for the pointers.


Ada surely has generics since 1983.

Ada 2012 probably even has more generic data containers in its standard library than Swift.

ADTs can be done via tagged records.


> Ada surely has generics since 1983.

That's interesting, I did not know that.

From a quick glance it appears they do not have bounded polymorphism ("protocol-constrained generic parameters"), associated types, or existential types ("values of protocol type"). So for example if you have a generic Set data type you would have to pass in an equality and hash function to each operation instead of saying that the element type is Hashable, etc.

The explicit instantiation looks quaint, but it reminds me of ML functors for some reason:

    procedure Instance_Swap is new Swap (Float);
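For comparison, here's a sketch of the Swift side of that point: a protocol-constrained generic parameter ("bounded polymorphism") means the caller never passes equality or hash functions explicitly; the `Hashable` bound carries them.

```swift
// T: Hashable is a bounded generic parameter. The function can build
// a Set<T> without being handed hash/equality functions, because the
// Hashable conformance supplies them.
func uniqueCount<T: Hashable>(_ items: [T]) -> Int {
    return Set(items).count
}

print(uniqueCount([1, 2, 2, 3]))        // 3
print(uniqueCount(["a", "b", "a"]))     // 2
```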


> From a quick glance it appears they do not have bounded polymorphism

You can use abstract classes and interfaces for that.


Do those first two strike you as extremely limiting if they are not present in a programming language?


Yes. Having done a fair bit of C programming in the kernel, where generic data structures are simulated with preprocessor macros and unsafe casts, I much prefer either static languages with generics, or dynamically typed languages.


Ada was designed for the DOD by the High Order Language Working Group for this.

They wanted something safe for embedded, and the original specification in '83 included generics, so you could handle data structures in a nice, safe, performant manner.


That's a fair point. Personally I don't like that style either, so I avoid it at all costs, but I've seen it practiced. It usually revolves around creative use of void pointers and terribly hard-to-isolate bugs.


>Swift uses a lot of modern concepts around [...] parallelism, synchronization

What does Swift have in terms of parallelism/concurrency?
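As far as I know, at the time of this thread the answer is mostly Grand Central Dispatch (the Dispatch module) rather than language-level constructs. A hedged sketch showing data parallelism plus a serial queue for synchronization:

```swift
import Dispatch

// Parallelism: concurrentPerform runs the iterations across threads
// and blocks until every iteration has finished.
var results = [Int](repeating: 0, count: 8)

// Synchronization: a serial queue protects the shared array from
// concurrent writes (each index is distinct here, but the pattern
// generalizes to real shared state).
let writeQueue = DispatchQueue(label: "results.writes")

DispatchQueue.concurrentPerform(iterations: results.count) { i in
    let value = i * i
    writeQueue.sync { results[i] = value }
}

print(results)   // [0, 1, 4, 9, 16, 25, 36, 49]
```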


If you're in your fifties, a single decade is about all you need to take you through to retirement age.

To be honest, I would've said going deep on JavaScript right now (React, React Native, etc) would give you a shelf life of at least five years.

Kotlin seems like a bit of a random recommendation - it's not in the TIOBE top 50, and a quick search for Kotlin jobs on indeed.com (the first job search site I found from Google) reveals several orders of magnitude fewer jobs than for anything mainstream (hell, even Haskell beats it comprehensively). Perhaps it's the next big thing though!


Javascript frameworks change on an annual basis if not faster.


This is a trendy sentiment that is oft repeated but of dubious veracity. It seems to me that Angular and React have remained the preeminent JavaScript frameworks for quite a few years without much sign of a shakeup, with the exception of Vue.js making some modest gains in popularity, but still decidedly in the shadow of Angular and React.


I feel like Node, Angular, React have all been around for a few years each


If you find Erlang a bit weird at first, try out Elixir. Its syntax is more familiar, so it's a bit more approachable. One nice perk is modules can be shared between both languages.



