Swift for Programmers

Swift is Apple's friendlier replacement for Objective-C. It's meant to look a little like the "cool" languages, Python or C#, and each new version adds more of their shortcuts.

Running it

On a laptop, Swift through Xcode is ridiculously slow. You can create a new Swift project and get lovely pop-up help and warnings. "Run" tries to run on your device; instead, press the "play" triangle to run a simulator. But don't. It's ridiculously slow, every time, to bring up even the smallest amount of code from Xcode.

Instead, run it from the terminal. Even though you already have Swift if you have Xcode, you'll need to download the command-line toolchain from the Swift download page. The %swift REPL takes 30 seconds to start. There's nothing like import mySampleClass. You'll have to fake it by pasting.


General things to know: 1) Semi-colons are optional. 2) Parens around if and loop conditions are optional. But curly-braces for bodies are non-optional. 3) An underscore, all by itself, is the "don't care" symbol (used in various contexts).

The preferred format for mixing strings and variables is "x is \(x) and sum is \(3+6)".

It's funny about spacing. Assignment operators must have the same spacing on either side (n=5 or n = 5, but not n= 5). There are also funny times where lack of a space can confuse it. For example, the ?: operator requires spaces (b?1:2 is an error, but b ? 1 : 2 is fine) because ! and ? have special meanings immediately after a variable.


The type goes on the end, after a colon, where you can easily leave it off. The required keyword is var or let (for a constant):

var a : Int
var a = 4 // inferred to be Int
let w="cow" // constant

All types start with caps: Int, Float, Double, String, Bool, Character.

Character literals use double-quotes. var ch : Character ="x". This is a stupid idea and does nothing useful. Chars and strings are still different and still don't mix. "x"+ch is an error, since double-quotes make a string unless you force it.

Numeric literals don't have types. Their type is guessed from context. That's just nuts. 3/4 is 0, but 3/4+0.0 is 0.75. The parser looks ahead and assumes you meant 3.0 and 4.0.
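A small runnable sketch of that guessing:

```swift
// Literals have no type; the whole expression is type-checked at once.
let a = 3/4        // everything here fits Int, so integer division: 0
let b = 3/4 + 0.0  // the 0.0 drags 3 and 4 up to Double: 0.75
let n : Double = 3 // the declared type makes "3" a Double, no cast needed

print(a, b, n)
```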

Replacing var with let declares constants, which is strongly encouraged. For pointers, let is a const pointer, to non-const data:

let MAX_AGE = 30 // constant int 
let c1 : Cat = Cat(7)
c1.age=12; // this is fine
c1 = c2; // error - let is const pointer

variable/operator misc

Functions, pt-1

The return type goes at the end. Parens are optional:

func max(n1:Int, n2:Int)->(Int) { if n1>n2 { return n1 }; return n2  }

func doStuff(n:Int, name:String)->() { ... }  // no return value

Keyword func is in front. The parms use the same name:type style. The return type is at the back after a ->. Use () for void.

Function calls must include the parameter names (but there's a way around it):

doStuff(n:5, name:"earl") // need to add actual parm names in front
doStuff(5,"earl") // error, no names

Because of this, function prototypes are written with the names, like doStuff(n:name:) or doStuff(_:_:) (which we'll see later). This is just terrible - you need to know the types, but it won't tell you.

Unlike most systems with named parameters, you can't flip the order: doStuff(name:"ted",n:3) is an error.

Required names can be used to distinguish overloads:

// both of these are legal:
func cost(dollars:Int)->(Double) { return Double(dollars)*5 }
func cost(spaceBucks:Int)->(Double) { return Double(spaceBucks)*2.3 }

var c = cost(spaceBucks:6) // pick out the second one

There are two opposite ways to modify call-with-name. An extra underscore lets you leave it out:

func doStuff(_ n:Int, _ name:String)->() {...} // underscore = no names
doStuff(5,"earl") // now this is legal

The second one -- you know that trick where you use a nice descriptive name in the parm list, where everyone can read it, but then copy it into a shorter name to use inside the function? Adding an extra name in front does that in Swift:

func doStuff(num n:Int, buddy name:String)->() {
  // inside we use n and name:
  if n==5 || name==   ...

// in the call we use the extra names:
doStuff(num:5, buddy:"earl")

All parameters are automatically constants. If you try to change one, the error complains about changing a "let" - remember that's the word for a constant. I gotta say, this is an OK rule. Functions tweaking their input parms cause more errors than they save time.

Call-by-reference uses inout before the type, and & in the call:

func add5(_ n : inout Int) { n+=5 }
var x=7; add5(&x);

In a function or loop, defer { ... } runs its body just before leaving the function (and after the return value is computed):

// prints ABC, then "doing some clean-up"
func abc()->Int {
  defer { print("doing some clean-up"); ... } // runs just after any exit

You can use a defer to make a while loop fake a for loop:

while i<10 {
  defer { i+=1 } // runs after body end, or after any break or continue
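A complete, runnable version of that fake for loop (the even-number collecting is my own filler):

```swift
// Collect the even numbers 0..9, incrementing via defer.
var i = 0
var evens = [Int]()
while i < 10 {
    defer { i += 1 }           // the "increment step" - runs even on continue
    if i % 2 != 0 { continue } // skips the append, but not the defer
    evens.append(i)
}
print(evens) // [0, 2, 4, 6, 8]
```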

nullable types ("optional" types)

Swift has the semi-common trick of making int's nullable. Int? is an int which can also be nil. Then it adds non-nullable reference types, and an option to require an explicit dereference, like cat!.friendCat!.name.

Adding an ? after the type makes it nullable - it can be set to nil. This applies to value and reference types. Value types are simple. Here smallest could return nil (on an empty array), so the return value is Int?:

func smallest(_ A:[Int]) -> Int? { ... } // returns an int, or nil

We need a nullable int to catch it:

var n1 : Int? = smallest(A)
var n2 = smallest(A) // n2 inferred to be Int?

var n3 : Int = smallest(A) // error. n3 can't hold the possible nil

The funny change in Swift is that reference types follow the same rules. They can't use nil unless you allow it with the ?:

var c1 : Cat = Cat("fluffy")
c1=nil // error
c1 = c2 // this is fine. c1 is still a pointer

var bestCat : Cat? = nil // this is fine

This kind of makes sense. Most reference types are meant to act as value types anyway: c1 will never move away from that Cat. Cats that could be nil are much rarer. For fun, Crew : [Cat?]? is an array that could be nil, of elements that could be nil.

Looking up a nullable type requires an explicit dereference, which is an exclamation point on the end. Not using one on a nullable type is an error:

var n : Int? = 8
a = n! + 1 // <- the ! is required, warns us of possible nil+1

var c : Cat?
num = c!.age // <- the ! is required

var N : [Int?]? // <- nullable array of nullable ints
N![0]!+1 // <- dereference N, dereference [0]

Note that the pointer-ness isn't changing. n1 : Int? is still a value type, which you can only read from with an ! after it. This isn't java-style boxing. Likewise, for Cats, c1=c2 is a pointer re-assign in either case.

The idea is to clearly mark possible problems. c!.bestFriend!.name is letting you know there are two ways this could crash by following a nil.

Swift also allows Java/C#-style nullables for these. Change c1: Int? to c1: Int!, with an exclamation point at the end. It's still a nullable, but simply c1.name gets a field. It will still crash on nil. You just don't need the ! to remind you. Officially, ! is only for variables that must be nil briefly before initting:

var cowButton : Button! = nil // logically can't be nil, but for real it can

// this will absolutely set cowButton:
cowButton = getNewButton() // we know this will never be nil

cowButton.text // legal, and presumably always safe

? and ! are the same type, and are interchangeable. getNewButton probably returns a Button?, assigned to a Button!, which is fine.
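A minimal sketch of the ! declaration (Button here is a made-up stand-in class, not the real UIKit one):

```swift
class Button { var text = "OK" } // hypothetical stand-in

var cowButton : Button! = nil // nil for now, but we promise to set it
cowButton = Button()          // the promised assignment

print(cowButton.text)         // no ! needed at the use site

let b : Button? = cowButton   // ? and ! assign freely in both directions
cowButton = b
```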

If's and if shortcuts

If's have optional ()'s around the test, and require {}'s around the body, but are otherwise normal:

if n>=0 && n<=10 { print("\(n) is in range") }

Nil-fixing special ifs

The binary n1 ?? n2 operator is a shortcut for "n1, unless it's nil, then n2". It can be used as a cast from an Int? to an Int, since it always returns a value (the result is a non-nullable):

var nn : Int? // can be nil
var n : Int // cannot be nil

n = nn ?? -1 // n=nn, or -1 if nil

Notice how there's no required ! after nn. Using ?? implies you're looking it up. In fact nn! ?? -1 is useless - a nil for nn will crash.

Obviously, the last item should be a non-nullable, with everything else a nullable, but the system doesn't enforce it. n ?? -1 is useless but legal (it will always be n). nn1 ?? nn2 ?? nn3 is legal, but its result is still a nullable - it's nil if all three are.
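A quick sketch of chaining, with a non-nullable on the end so the result is a plain Int:

```swift
let nn1 : Int? = nil
let nn2 : Int? = 7

let n : Int = nn1 ?? nn2 ?? -1 // first non-nil wins; -1 is the backstop
print(n) // 7
```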

Swift also has the fashionable cleanly-abort-on-nil trick. Using ? for a field instead of ! turns crash-on-nil into answer-is-nil. Obviously, it returns a nullable. It only works on reference types (for value types, there's no need):

var age1 : Int? // nullable int
age1 = c1?.age // if c1 is nil, abort and age1 is nil
age1 = c1!.age // crash if c1 is nil

age1 = c1?.bestFriend?.age // if either is nil, no crash - age1 is nil

It can be used with the ?? trick. This gives -99 on any nil:

// age can be a non-nullable
age1 = c1?.bestFriend?.age ?? -99

The third silly nil-testing shortcut is a combined assign and nil-check. if let x = EXPR { BODY } only runs the body on a non-nil assign (it assumes a nil was an error). As you'd guess, the variable has scope only in the body:

if let a = c1?.age { total += a }
// same as: if c1 != nil { total += c1!.age }

It allows multiple assigns, and only runs if all work:

if let a1=c1?.age, let a2=c2?.age { total += a1+a2 }

In a burst of silliness, a real if-test is allowed after a comma:

if let a = c1?.age, a>=0 { total += a } // add if age exists and is >=0

The fourth nil-tester attempts to declare and assign a variable, and quits the function or loop on a nil. The syntax is meant to imply "if this assignment works, keep going, else quit":

guard let ca = c1?.age else { return }
// if we reach this point, ca is a declared Int (non-nullable)

// same as:
if c1==nil { return }
let ca = c1!.age

The body is required to be a return or loop-break/continue. For extra fun, it also allows the same extra if-test as an if-let. The test is for when to continue:

guard let ca = c1?.age,  ca>=0 else { return }

// same as:
if c1==nil { return }
let ca=c1!.age
if ca<0 { return }

To sum up, Swift has 4 trivial shortcuts to avoid writing if c1==nil.

Reference modifiers for garbage collection

Swift uses reference counting for garbage collection. When the last pointer to an object is moved away, the count goes to 0 and the object instantly deletes itself. As a review, this method has a problem with pointer-loops - if 2 Cats are each other's best friends, they keep each other from being deleted after you lose all other pointers to them. The fix for this is a weak pointer - a real pointer in every way, except it won't keep an object from being deleted. We can make sure any possible pointer-loops have at least one weak pointer.

A weak pointer is normally left dangling, still pointing to the now invalid location. That's fine since you know your code will never use it. In a linked list, the forward pointers are normal, the backwards are weak, and the list is thrown away at the correct time.

Swift has 2 types: unowned is the normal one, which is left dangling, with no way to know it's aimed at an illegal spot. weak is automatically set to nil, which is obviously more work for the machine. Both can only be used inside classes and so on (which should make sense - there's no reason to want a declared variable to be weak).

Weak is simpler to use. Here next and prev are regular pointers, used the normal way, despite prev being weak:

class Node {
  var val : Int = 0
  var next : Node? = nil
  weak var prev : Node? = nil

unowned is funnier. It must have a value (it's not a nullable) which means it must be assigned in the constructor:

class Cat {
  var name = "Kitty"
  unowned var c1 : Collar
  init(myCollar:Collar) { c1=myCollar } // <- c1 must be inited here

For fun, { c1=Collar() } in the constructor is also legal. But since c1 is unowned, the collar is immediately deleted.

The obvious rule is to use unowned where-ever you can, but you probably can't. Use weak everywhere else you need to break pointer cycles.
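A runnable sketch of weak doing its job (the Cat and Owner classes are invented for the example):

```swift
class Cat { var name = "mittens" }
class Owner {
    weak var favorite : Cat? // weak: won't keep the Cat alive
}

let o = Owner()
var c : Cat? = Cat()
o.favorite = c

c = nil                  // last strong pointer gone; the Cat is deleted
print(o.favorite == nil) // true - the weak pointer was auto-set to nil
```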


Tuples

The type of a tuple is other types in parens: (Int, String). Tuple literals are anything in parens: var nn = (5,"cow").

Tuples aren't as flexible as in other languages. You can't convert them to arrays or slices, or to longer or shorter tuples. The one shortcut is tuple L-values:

 var a=0, b=0, c=0
 (a,b,c) = (4,9,12) // triple assign

Functions return tuples with (type, type). You can catch them, with an underscore for a don't-care:

// returns min, max and mean of an array:
func minMaxMean(_ A:[Double])->(Double, Double, Double) {
  return (m1, m2, sum/total) // create a tuple

// we only want min and mean:
var low, avg : Double
(low, _ , avg) = minMaxMean(Nums)

Access values of a tuple using dot-zero-based-index: d.0 and so on. Or, you can add field names when created:

 var d = (4,"yaw",6.8) // d is a tuple
 d.0; d.1; d.2 // 4 yaw 6.8
 var d2 = (feet:2, hands:4) // add names to fields
 d2.feet // 2
 var d3 : (feet:Int, hands:Int) // these names are now locked to d3
 d3 = (2,4); d3.hands // 4

A very nice rule says tuples with different field names can't be assigned, unless you force it with an as. In other words, they act like the structs they are being used to fake:

 var p1 = (feet:7, hands:8)
 var d2 = (hat:0, coat:0)
 d2=p1 // ERROR
 d2=p1 as (Int,Int)
 d2.hat // 7 - d2 retains its field names

Tuple fields are L-values. d2.hat=3 is legal and fine. And of course the types are fixed. (3,"rat",5.6) is locked as (Int,String,Double).


Classes

Like C#, classes are reference types, structs are value types. They work the usual way. Except: this is renamed self, and access is per-file: a fileprivate field can be read from anywhere else in the same file (plain private is limited to the enclosing declaration). The constructor is named init:

class Cat {
  var age : Int
  var name : String
  // 2 constructors, named init:
  init() { age=1; name="mittens" }
  init(_ a:Int, _ nm:String) { age=a; name=nm } 
  func isKitten()->Bool { return self.age<3 }

There's no new keyword. Construct and/or allocate with just var c1=Cat().
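Putting the Cat class to use - a runnable sketch:

```swift
class Cat {
    var age : Int
    var name : String
    init() { age=1; name="mittens" }
    init(_ a:Int, _ nm:String) { age=a; name=nm }
    func isKitten()->Bool { return self.age<3 }
}

let c1 = Cat()          // no "new" - this allocates and constructs
let c2 = Cat(7, "earl") // picks the 2-argument init
print(c1.isKitten(), c2.name) // true earl
```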


Properties

A quick aside: properties are silly, but C# people think they're cool, so Swift needs them. Then they 1-up C# with another version.

Swift calls normal class fields "properties". C#-style get-sets are called computed properties. They work in the usual way:

class Angle { 
  var degs=0.0 
  var rads : Double { 
    get { return degs/10 } 
    set(r) { degs=r*10 } // choose any var-name

a1.degs=6; a1.rads // 0.6
a1.rads=4; a1.degs // 40

They have another version of a set called a property observer, running whenever a normal field is changed. willSet runs just before an assign, didSet runs after. They each have access to the incoming/previous value. A typical "can't be less than 0" written with didSet:

class Cow {
  public var weight : Double = 0 { // start of observer for weight
    didSet { if weight<0 { weight=0 } } // runs after the =

It might feel better to use willSet(nNew) { if nNew<0 { nNew=0 } } except you can't change the incoming value.
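The Cow example filled out and runnable (the starting value of 0 is my addition - a stored property needs one):

```swift
class Cow {
    public var weight : Double = 0 {
        didSet { if weight < 0 { weight = 0 } } // re-assigning here won't re-trigger the observer
    }
}

let bessie = Cow()
bessie.weight = 450
bessie.weight = -5   // the observer cleans this up
print(bessie.weight) // 0.0
```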


Inheritance

All functions are virtual, but require a decorative override keyword. It's a required hint:

class Animal {
  public var age:Int = 1
  public func cost()->Int {return 5}

class Cow : Animal {
  // override is required:
  public override func cost()->Int {return age*2}

The dynamic downcast to a subtype is as?. The ? is to show it can return nil (it returns the ?-version of the type):

var aa : Animal = Cow()
var cc = aa as? Cow // cc is a nullable Cow?
if cc != nil { ... }

The base class is used with super. Ex: return super.cost()*2. Funny rule: super.init() must come last (actual rule: you have to init all of your variables before calling base init.)
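A sketch of that ordering rule (the milk field is invented for the example):

```swift
class Animal {
    var age : Int
    init(_ a:Int) { age=a }
    func cost()->Int { return 5 }
}

class Cow : Animal {
    var milk : Double
    init(_ a:Int, _ m:Double) {
        milk = m      // our own fields first...
        super.init(a) // ...then the base init is legal
    }
    override func cost()->Int { return super.cost()*2 }
}

print(Cow(3, 2.5).cost()) // 10
```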

Formal virtual base-class interfaces are called Protocols. As normal, they define only functions (and properties) which users must implement:

// straight-forward interface:
protocol heldItem {
  func handsRequired()->Int  

Classes use the usual syntax. hotDog inherits from Food, and satisfies the protocol heldItem:

class hotDog : Food, heldItem {
  // required to implement this:
  func handsRequired()->Int { return bunLen > 5 ? 2 : 1 }

As usual, interfaces are declarable types:

var hh : heldItem = hotDog()
var n = hh.handsRequired()

Container classes

There is NO linked-list class.


Arrays

Instead of a crude built-in array, then a nicer array class, Swift's built-in array is a fully functional array class. For example, arrays can use A.append(8), and x=A.popLast().

Type of an array is [Int]. Official version is Array<Int>. Array literals are written inside brackets: var A:[String] = ["cat","duck","goat"].

var A = [3,6,12] // guess the type is [Int]
var A2 : [Double] = [5,7,8,9]
var W : Array<String> = ["cat","dog","goat"] // declare using long type

Other fun ways to init are by casting from a range, or the constructor:

var A2 = Array(4...7) // [4,5,6,7] cast from range
var A3 = Array(repeating:4, count:3) // [4,4,4] A constructor

Arrays are value types. They copy on assignment. This is very strange. A=B; A[0]=5 will not change B[0]. They use copy-on-write, so A=B is cheap if you only plan to read from A.
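Seeing the copy happen:

```swift
var B = [1, 2, 3]
var A = B    // value semantics: conceptually a full copy
A[0] = 99    // copy-on-write does the real copy here

print(A) // [99, 2, 3]
print(B) // [1, 2, 3] - untouched
```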


Dictionaries

Swift has the standard Map/Hash-table: B["red"]=5. Look-ups return nullables, which is very cute. Instead of checking ContainsKey, just do the look-up - nil means not found.

var CatCute = ["siamese":9, "persian":8, "calico":3] // a dictionary
CatCute["tabby"]=7 // add an item

var c1 = CatCute["persian"] // c1 is an Int?
if c1==nil { print("no persian") }

var c2=CatCute["liger"] ?? 0 // c2 is an Int where 0=not found

The type is [ keyType : valType]. The formal type is Dictionary<key, val>:

var Spellings : [Int : String] = [:] // empty dictionary

Each entry in a Dictionary is a tuple, with names set as key and value.
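So walking a dictionary hands you those tuples - a quick sketch:

```swift
let CatCute = ["siamese":9, "persian":8]

// unpack each (key, value) tuple in place:
for (breed, score) in CatCute {
    print("\(breed) scores \(score)")
}

// or keep the tuple whole and use the built-in names:
for entry in CatCute {
    print(entry.key, entry.value)
}
```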


Slices

This is standard Perl-style stuff: A[2...5] grabs those 4 items from A. It includes both indexes. Note there are 3 dots. Expressions for the ends are allowed: A[n*2...A.count-1]. You can also leave out the start, or use < on the end:

var A = [10,11,12,13,14,15,16,17,18,19,20] // array of Int

var a2 = A[4...6] // {14,15,16} This is an ArraySlice<Int>

var a3 = A[...4] // missing start means from index 0: [10,11,12,13,14]

var a4 = A[..<commaPos] // shortcut for [0...commaPos-1]

Slices are copies. You can't write to an array through a slice, and slices won't track changes to the array.

The weirdest thing: slices use indexes from where they came. They are not 0-based. A[2...5] has indexes from 2 to 5:

var a2 = A[2...5] // {12,13,14,15}
a2[0] // ERROR - out of range
a2[2]; a2[3]; a2[4] // 12, 13, 14

// double-checking: finding the index of 13 also gives 3, not 1:
a2.index(of:13) // 3

Can use startIndex and endIndex (which is 1 past the last index - A[2...5] has an endIndex of 6).

Slices and tuples don't mix (tuples don't mix with anything). Slices aren't arrays, but you can cast them into arrays, using [Int](a2). The array indexes will jump back to 0-based (they have to).


Loops

While loops work as normal (but parens around the condition are optional, and the body requires {}'s). There's also a "repeat { } while" loop.

while i<3 { print(i); i+=1 }

repeat { i*=2 } while i<100

There's no C-style for(;;) loop. There are the usual foreach-style shortcuts: give a range with dot-dot-dot, or a special range-making function, or a collection to walk through:

for a in 3...9 { print(a) } // 3...9 is a Range
for _ in 1...8 { print("x") } // underscore means to hide the index variable

for a in stride(from:2, to:20, by:2) // range-creating function

for a in B // B is any collection 
for a in ["cat","duck","owl"] // ditto

For a dictionary, each item is a tuple of the two types, with built-in names key and value. For fun, the type of a stride looks like: var ss : StrideTo<Int> = stride(from: 5, to:33, by:5).

Functions, part 2: anonymous functions, lambdas and function pointers

Function-pointers are declared using the signature, in the obvious way:

var f1 : (Int, Int)->Int; // function-type variable
f1 = intMax; // assign a function
f1(8,12) // 12

Anonymous functions look like: { signature in body }:

f1 =  { (n1:Int, n2:Int)->Int in return n1+n2 } // assign anonymous function
f1(6,2) // 8

// Of course we can make these bodies as long as we need:
f1 = { (x:Int, y:Int)->Int in
    let sum = x+y
    if sum<0 { return 0 }

The parm names aren't used in a call. That's why f1(6,2) is legal. Requiring names would make them nearly useless. Also note the {} is around the entire thing, but no special markers are around the body.

As you'd expect, there are lots of shortcuts. Function variables can omit the function signature if it can be guessed, can leave out the parameter types if they can be guessed, and leave out the return for just a computed answer. Finally, can use $0, $1 ... for parameter names and leave out the signature (assuming we can guess all types):

var f2 = intMax // intMax tells us f2's type

f2 = { (x, y) in return x+y } // omit types in parms, since f2 tells us
f2 = { (x, y) in x+y } // also omit "return" keyword

var chooser : (Int,Int)->Int
chooser = { // omit signature, since chooser tells us:
  if $0>$1 { return $0 }
  return $1

var adder : (Int,Int)->Int = { $0+$1 } // yes, {$0+$1} defines an anon "add" function

Then one more insane shortcut: if you're calling a function which takes a function as the last input, you can supply it anonymously after the parens:

// apply takes an integer, and a function:
func apply(_ x:Int,  _ f:(Int,Int)->Int)->Int {
  let n = f(x,7) + 1000 // uses the 2-int function, with 7 locked in as the 2nd input
  return n

apply( 5, { $0*$1 } ) // the normal way to call apply

apply(5) { $0*$1 } // alternate, same call

This next one looks even more bizarre. A function with a single function as the input can be called with no parens and an anonymous function. So you know, useDuck is not terribly useful. The important thing is it takes a String->String as input, and nothing else:

func useDuck(_ f:(String)->String)->Int { return f("duck").count }

useDuck { $0+"y" } // anon function is the input, no parens at all

This trick is more clearly useful with member functions.
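For example, the library's map and sorted each take one function as their only real input, so the trick shows up everywhere - a sketch:

```swift
let A = [5, 2, 9, 1]

let ascending = A.sorted { $0 < $1 } // no parens on the call at all
let doubled   = A.map { $0 * 2 }

print(ascending) // [1, 2, 5, 9]
print(doubled)   // [10, 4, 18, 2]
```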

First-class, Capture

Functions can return functions, and it looks pretty nice. Here's a standard curry. This function takes 1 int, and returns an Int->Int function (give it any number and it adds the original value):

func curriedAdd(_ n1:Int)->(Int)->Int {
  return { (n2:Int)->Int in return n1+n2 } // returns an anon function, using n1

var add6 = curriedAdd(6)
var add10 : (Int)->Int = curriedAdd(10) // same idea, but explicit signature

add6(4) // 10
add10(4) // 14

The next thing here is how the returned function brings a context with it - add6 has a 6 saved as n1, while add10 saves the 10.

Classes can return functions which do this, and can also capture class variables. Here addAge returns a function which adds an amount to our age (and returns the value, for fun):

class Frog { 
  var age : Int = 0 
  // returns an "increase this frog's age" function:
  // "captures" our age
  func addAge(_ n:Int)->()->Int {
    return { () -> Int in self.age += n; return self.age } 

var frog1 = Frog()

var f1seesAGhost = frog1.addAge(3) // f1seesAGhost is a ()->Int function

f1seesAGhost() // 3, and frog1 is now 3
var n = f1seesAGhost() // 6, and frog1 is now 6

Template<T> functions/classes

These use the strict "must establish operations are legal" rules. Basic template functions are only allowed to use =. Otherwise can restrict to a subtype using doStuff<T : baseClass> (similar to how C# does it). This searches any list A for value n:

func contains<T:Equatable>(_ A:[T], _ n:T)->Bool { 
  for a in A { 
    if a==n { return true } 
  }
  return false 
}

In this case we were forced to say T must use the pre-defined Equatable protocol, otherwise we can't use ==. Ick.
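Same trick with Comparable, which buys us < (the function name smallest is mine):

```swift
// find the minimum of any array whose elements support <
func smallest<T:Comparable>(_ A:[T]) -> T? {
    if A.isEmpty { return nil }
    var best = A[0]
    for a in A {
        if a < best { best = a }
    }
    return best
}

print(smallest([4, 1, 7]) ?? -1)       // 1
print(smallest(["cow", "ant"]) ?? "?") // ant
```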


Enums

They don't cast to int's by default, but you can force them to:

enum waterStates { case ice, water, steam } // these have no values

// add int values:
enum waterState : Int { case ice, water=10, steam } // ice is 0, steam is 11

// use dot-rawValue (casts never work):
var n = waterState.ice.rawValue

The most common surprise is being able to omit the type-name in the constants, like dot-ice, when we can guess it. For example: var ws : waterState = .ice.

You're also allowed to use any type as the fixed value. This enum uses doubles:

enum waterState : Double {
  case ice = -99.9, // need the spaces here
  steam = 212.0
// waterState.steam.rawValue is 212


Union-style enums

This is where it gets really weird. A union is an old C trick to save space by re-using memory. Suppose some types of monkeys need 1 double, while others need a pair of ints. We could make them share those 8 bytes with something like union { struct { int a, b; } p; double d; }. The only advantage is saving space - we don't have to allocate for the vars we will never use. In Swift, a version of an enum does that. It's not an enum at all - it's a class which holds data.

This union enum can hold 1 int (case cat), or a pair of strings (case dog):

// each SS variable is an int, or 2 strings:
enum SS {
  case cat(Int)
  case dog(String,String) 

When you create it, you decide which case it is, and give the permanent values for the fields:

var s1 = SS.cat(6) // type "cat", so needs 1 int
var s2=SS.dog("arf","ruff") // type "dog", which needs 2 strings

You can only read them with various special syntaxes:

// checks s2 (at the end) for "dog" and assigns w1, w2 in order:
if case .dog(let w1, let w2) = s2 {
  print("dog. Values are "+w1+" and "+w2)

There's also a special way using switch and another with guard. Swift is a very silly language.
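The switch version, for completeness - it pattern-matches the case and unpacks the fields (a sketch of the standard syntax):

```swift
enum SS {
    case cat(Int)
    case dog(String, String)
}

let s2 = SS.dog("arf", "ruff")

switch s2 {
case .cat(let n):          print("cat: \(n)")
case .dog(let w1, let w2): print("dog: \(w1) \(w2)")
}
```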