I have an enum named ProgrammingLanguage:
enum ProgrammingLanguage {
case Swift, Haskell, Scala
}
Now I have a class named Programmer with the following property:
let favouriteLanguages: ProgrammingLanguage = .Swift
Seeing how a programmer could have several favourite languages, I thought it'd be nice to write something like this:
let favouriteLanguages: ProgrammingLanguage = [.Swift, .Haskell]
After a bit of research, I realized that I need to conform to OptionSetType, but in doing so, I've raised the following 3 errors:
ProgrammingLanguage does not conform to SetAlgebraType
ProgrammingLanguage does not conform to OptionSetType
ProgrammingLanguage does not conform to RawRepresentable
When I saw the RawRepresentable error, I immediately thought of raw values for enums. I wanted to be able to print the enum value anyway, so I changed my enum declaration to the following:
enum ProgrammingLanguage: String, OptionSetType {
case Swift, Haskell, Scala
}
This silenced 2 of the errors. But I'm still left with one, which is that I don't conform to protocol SetAlgebraType.
After a bit of trial and error, I found out that making the raw value type of the enum Int fixed it (which makes sense, since the RawRepresentable conformance requires you to implement an initializer with the signature init(rawValue: Int)). However, I'm unsatisfied with that; I want to be able to get the String representation of the enum easily.
Could someone advise me how I can do this easily, and why OptionSetType requires an Int raw value type?
Edit:
The following declaration compiles correctly, but errors at runtime:
enum ProgrammingLanguage: Int, OptionSetType {
case Swift, Scala, Haskell
}
extension ProgrammingLanguage {
init(rawValue: Int) {
self.init(rawValue: rawValue)
}
}
let programmingLanguages: ProgrammingLanguage = [.Swift, .Scala]
Edit: I'm surprised at my former self for not saying this upfront at the time, but... instead of trying to force other value types into the OptionSet protocol (Swift 3 removed Type from the name), it's probably better to consider the API where you use those types and use Set collections where appropriate.

OptionSet types are weird. They are both collections and not collections: you can construct one from multiple flags, but the result is still a single value. (You can do some work to figure out a collection-of-single-flags equivalent to such a value, but depending on the possible values in the type it might not be unique.)

On the other hand, being able to have one something, or more than one unique somethings, can be important to the design of an API. Do you want users to say they have more than one favorite, or enforce that there's only one? Just how many "favorites" do you want to allow? If a user claims multiple favorites, should they be ranked in user-specific order? These are all questions that are hard to answer in an OptionSet-style type, but much easier if you use a Set type or other actual collection.

The rest of this answer a) is old, using Swift 2 names, and b) assumes that you're trying to implement
OptionSet anyway, even if it's a bad choice for your API...

See the docs for OptionSetType. In short, you can declare OptionSetType conformance for any type that also adopts RawRepresentable. However, you gain the magic set-algebra syntax support (via operators and ArrayLiteralConvertible conformance) if and only if your associated raw value type is one that conforms to BitwiseOperationsType.

So, if your raw value type is String, you're out of luck: you don't gain the set algebra stuff, because String doesn't support bitwise operations. (The "fun" thing here, if you can call it that, is that you can extend String to support BitwiseOperationsType, and if your implementation satisfies the axioms, you can use strings as raw values for an option set.)

Your second syntax errors at runtime because you've created an infinite recursion: calling
self.init(rawValue:) from init(rawValue:) keeps going until it blows the stack.

It's arguably a bug (please file it) that you can even try that without a compile-time error. Enums shouldn't be able to declare OptionSetType conformance, because:

The semantic contract of an enum is that it's a closed set. By declaring your ProgrammingLanguage enum you're saying that a value of type ProgrammingLanguage must be one of Swift, Scala, or Haskell, and not anything else. A value of "Swift and Scala" isn't in that set.

The underlying implementation of an
OptionSetType is based on integer bitfields. A "Swift and Haskell" value ([.Swift, .Haskell]) is really just .Swift.rawValue | .Haskell.rawValue. This causes trouble if your set of raw values isn't bit-aligned. That is, if .Swift.rawValue == 1 == 0b01, and .Haskell.rawValue == 2 == 0b10, the bitwise-or of those is 0b11 == 3, which is the same as .Scala.rawValue.

TLDR: if you want OptionSetType conformance, declare a struct. And use static let to declare members of your type. And pick your raw values such that members you want to be distinct from possible (bitwise-or) combinations of other members actually are.
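A minimal sketch of what that struct might look like, with member names taken from the question (written with the Swift 3 spelling OptionSet so it compiles on current toolchains; on Swift 2, conform to OptionSetType instead):

```swift
// Sketch: an option-set struct whose members each own a single bit,
// written out as binary literals so the distinctness is visible.
struct ProgrammingLanguages: OptionSet {
    let rawValue: Int

    static let Swift   = ProgrammingLanguages(rawValue: 0b001)
    static let Scala   = ProgrammingLanguages(rawValue: 0b010)
    static let Haskell = ProgrammingLanguages(rawValue: 0b100)
}

// The array-literal syntax from the question now works:
let favourites: ProgrammingLanguages = [.Swift, .Haskell]
favourites.contains(.Swift)  // true
favourites.contains(.Scala)  // false
```

Because each member owns a dedicated bit, [.Swift, .Haskell] has raw value 0b101, which can never collide with any single member.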
Good ways to keep your values distinct: use binary-literal syntax as above, or declare your values with bit shifts of one, as below:
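Something like this (again a sketch, using the Swift 3 OptionSet spelling; the distinct shift amounts are what keep the bits unique):

```swift
// Sketch: each raw value is 1 shifted left by a distinct amount,
// so every member gets its own bit no matter how many you add.
struct ProgrammingLanguages: OptionSet {
    let rawValue: Int

    static let Swift   = ProgrammingLanguages(rawValue: 1 << 0)  // 0b001
    static let Scala   = ProgrammingLanguages(rawValue: 1 << 1)  // 0b010
    static let Haskell = ProgrammingLanguages(rawValue: 1 << 2)  // 0b100
}

// The bitwise-or of distinct bits never equals another single member.
let favourites: ProgrammingLanguages = [.Swift, .Scala]
favourites.rawValue  // 0b011 == 3
```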