CGContext in Swift

A lot of the code I've seen on StackOverflow for getting the CGContext from an NSGraphicsContext in Swift doesn't seem to work on OSX 10.9. The following does work as of Xcode 6 beta 4 running on OSX 10.9:

var context: CGContextRef = reinterpretCast(NSGraphicsContext.currentContext().graphicsPort)

In OSX 10.10 a new CGContext property is available on NSGraphicsContext that returns the context directly; however, I haven't had this working correctly with Quartz.
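For reference, the 10.10 version should be a one-liner like this (a sketch only, given the issues I just mentioned):

let context: CGContextRef = NSGraphicsContext.currentContext().CGContext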

SQLite in Swift frameworks

I have an unusually specific use case for Swift: I want to use it to replace the model layer in one of my apps, and to do so I wanted it in a separate framework (given that these are so easy to create in Xcode 6). My model layer is built on top of FMDB + SQLite, so that was a must-have. However, the latest beta of Xcode 6 (b4) removed bridging headers from frameworks; instead you have to add Objective-C imports to the 'umbrella header' of the framework. Unfortunately sqlite3 is a 'non-modular header import', which meant that I couldn't import FMDB into the Swift framework at all:

[Screenshot: Xcode's 'non-modular header import' error]

This was very frustrating, because the Objective-C version of the framework would build perfectly! The solution, however, is to use module maps. Module maps are an LLVM/Clang feature that lets you wrap non-modular libraries and frameworks, such as sqlite3, in modules, which means that they can be used from Swift.

Here's what I did to set up FMDB:

  1. Created a new Objective-C framework for FMDB in an Xcode workspace, then added all of the FMDB headers and source files
  2. Ensured that all of the FMDB headers were public and that they were included in FMDB.h
  3. Linked libsqlite3.dylib
  4. Ensured that 'Defines Module' was set to 'Yes' in the FMDB build settings

Then I created a Swift framework that uses FMDB:

  1. Created a new Swift framework (I called it ModelSwift)
  2. Linked it with FMDB
  3. Added #import <FMDB/FMDB.h> to the umbrella header (ModelSwift.h)
  4. Created a module map (sqlite3.modulemap) and added it to the project (you could place it anywhere, however):
module sqlite3 [system] {
    header "/Applications/Xcode6-Beta4.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS8.0.sdk/usr/include/sqlite3.h"
    link "sqlite3"
    export *
}

module sqlite3simulator [system] {
    header "/Applications/Xcode6-Beta4.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator8.0.sdk/usr/include/sqlite3.h"
    link "sqlite3"
    export *
}

You then have to add the path of the directory that the module map is stored in to the 'Import Paths' build setting of the Swift framework:
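For example, if the module map lives in a ModuleMaps directory at the root of the project (that directory name is just my assumption; use wherever you actually put it), the underlying build setting is:

SWIFT_INCLUDE_PATHS = $(SRCROOT)/ModuleMaps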

Once you've done this you'll be able to use FMDB freely in your Swift framework. When you build the framework and import it into an app, you will also need to add the import path to the app's build settings to ensure that it picks up the module map as well (I'm not quite sure why this is). I found that I also needed to add an empty Swift file to my Objective-C app before Xcode would let me set the import paths. You may also need to enable non-modular header includes in the app's build settings.
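To give an idea of the end result, here's roughly what querying the database through FMDB from inside the Swift framework looks like (a minimal sketch: the database path, table and column names are all hypothetical):

public func allNames(databasePath: String) -> [String] {
    var names = [String]()
    let db = FMDatabase(path: databasePath)
    if db.open() {
        // executeQuery returns an FMResultSet that is stepped through row by row
        if let results = db.executeQuery("SELECT name FROM people", withArgumentsInArray: nil) {
            while results.next() {
                names.append(results.stringForColumn("name"))
            }
        }
        db.close()
    }
    return names
}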

Hopefully a future release of the beta will fix all of this, but this definitely works for now.

Replicating Overcast's show notes

Earlier this week Marco Arment released Overcast, a really elegant new podcast app for iOS. The show notes aren't displayed by default in the player; instead, you swipe up on the show artwork to view them:

[Screenshot: Overcast's player, with the show notes revealed beneath the artwork]

This is an effect that I quite like, so I thought I would take a look at how it could be implemented. Firstly, the show notes are probably presented using a UIWebView, because most podcasts use (relatively simple) HTML in their show notes. Secondly, a UIWebView is backed by a UIScrollView (exposed through its scrollView property), so it is possible to give the web view a content offset and display the artwork in an image view behind the web view.

Therefore, all you really need to do is resize the image view as the web view, which is in front, is scrolled. This can be done with some simple code in scrollViewDidScroll:, the UIScrollViewDelegate method:

- (void)scrollViewDidScroll:(UIScrollView *)scrollView {
    // The artwork is square and full-width (320pt here) at its largest,
    // and a third of the view's width at its smallest
    CGFloat miniSize = CGRectGetWidth(self.view.frame) / 3;

    if (scrollView.contentOffset.y < 0) {
        // The web view has been pulled down, exposing the artwork
        CGFloat size = miniSize;
        if (scrollView.contentOffset.y < -miniSize) {
            // Interpolate between the full 320pt artwork and the mini size
            CGFloat offset = scrollView.contentOffset.y + 320;
            CGFloat fraction = 1 - offset / (320 - miniSize);
            size = fraction * (320 - miniSize) + miniSize;
        }
        self.artworkImageView.frame = CGRectMake(CGRectGetMaxX(self.view.frame) - size, 0, size, size);
        self.artworkScrollView.contentOffset = CGPointZero;
    }
    else {
        // Scrolling the show notes drags the artwork up and away with them
        self.artworkScrollView.contentOffset = scrollView.contentOffset;
    }
}


If the user has scrolled between the artwork and the 'mini size' then the show notes will be displayed directly underneath. When the show note title is between the bottom of the artwork and the top of the scroll view the artwork stays fixed, but it will zoom when the title is below the artwork. The interaction itself is pretty simple, but I really like the way it works. You can find my full implementation on GitHub.

iOS Developer FAQ

For the last few weeks I've been working on an extensive list of FAQs for new iOS developers, because they commonly need answers to questions that they may not know how to find. In order to write the FAQ, which is available on GitHub, I drew on my own experiences, StackOverflow and /r/iOSProgramming.

I don't want this to be a static document, so I'm actively looking for new questions and answers through issues and pull requests.

OpenCL fractal generation

I've been meaning to play around with OpenCL for a while (like a couple of years), so I decided to experiment with some of the basics. In this post I'm going to be focussing on using OpenCL on OSX to create some Mandelbrot fractals, so I'll assume you've already read the first few chapters of Apple's documentation (don't worry, it doesn't take long). If you want to skip the post and get straight to the code, please check it out on GitHub.

Start out by creating a new command line tool (Foundation) in Xcode, linking it with AppKit.framework, Foundation.framework and OpenCL.framework (you're going to want to do this because we'll need to write a tiny bit of Objective-C to save the images). Import these frameworks in main.m:

#import <Foundation/Foundation.h>
#import <AppKit/AppKit.h>
#import <OpenCL/opencl.h>

The next step is to actually write the kernel. OpenCL kernels are basically programs written in a C-like language that execute on the stream processors of the GPU, a little like OpenGL shaders (but way more powerful). The kernel is based on this GLSL shader (so I won't go into detail on complex numbers).
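Here's a sketch of what such a kernel can look like (this is a reconstruction matching the parameters described below, not the exact code, which is on GitHub):

__kernel void mandelbrot(__write_only image2d_t output,
                         int width, int height, int iterations)
{
    int x = get_global_id(0);
    int y = get_global_id(1);

    // Normalise the pixel coordinates into a region of the complex plane
    float2 c = (float2)((float)x / width, (float)y / height) * 3.0f
             - (float2)(2.0f, 1.5f);

    // Iterate z = z^2 + c until it escapes or we run out of iterations
    float2 z = (float2)(0.0f, 0.0f);
    int i;
    for (i = 0; i < iterations && dot(z, z) < 4.0f; i++) {
        z = (float2)(z.x * z.x - z.y * z.y, 2.0f * z.x * z.y) + c;
    }

    // Shade the pixel by how quickly the point escaped
    float shade = (float)i / iterations;
    write_imagef(output, (int2)(x, y), (float4)(shade, shade, shade, 1.0f));
}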

The kernel itself takes several arguments: the output image to write to, the width and height of the image (which are used to normalise the coordinates) and the number of iterations to run. This is fairly similar to the original GLSL shader, and it acts in a similar way because it is executed per pixel. Now we need the Objective-C/C host code to run the kernel.

The host code does the following (there's a sketch of it after the list):

  1. Creates a dispatch queue for OpenCL. On OSX Apple has made it super easy to run OpenCL kernels by integrating them with GCD; on other platforms a lot more boilerplate code is required
  2. Allocates some bytes for the image (notice that we allocate 4 bytes - 1 unsigned integer - per pixel for the RGBA channels)
  3. Creates a struct describing the image format (RGBA, 1 byte per component) for OpenCL
  4. Allocates OpenCL memory for the image
  5. Creates a range on the OpenCL queue to describe the image (this should be familiar once you've read through Apple's docs)
  6. Executes the kernel
  7. Copies the image data back to main memory from OpenCL's memory
  8. Creates an NSBitmapImageRep for the data, encodes that as a PNG and writes it to disk
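Here's a sketch of those steps (the mandelbrot_kernel block and the mandelbrot.cl.h header are generated by Xcode from the .cl file, so their names depend on your kernel; the image size and output path are my assumptions):

#import "mandelbrot.cl.h" // generated by Xcode from the .cl file

int main(int argc, const char *argv[]) {
    @autoreleasepool {
        // 1. A GCD queue that submits work to the GPU
        dispatch_queue_t queue = gcl_create_dispatch_queue(CL_DEVICE_TYPE_GPU, NULL);

        // 2. Host memory for the image: 4 bytes (RGBA) per pixel
        size_t width = 1024, height = 1024;
        void *pixels = malloc(width * height * 4);

        // 3. The image format: RGBA, 1 byte per component
        cl_image_format format = {
            .image_channel_order = CL_RGBA,
            .image_channel_data_type = CL_UNORM_INT8,
        };

        // 4. OpenCL memory for the image
        cl_mem image = gcl_create_image(&format, width, height, 1, NULL);

        dispatch_sync(queue, ^{
            // 5. A 2D range covering every pixel of the image
            cl_ndrange range = {
                .work_dim = 2,
                .global_work_offset = {0, 0, 0},
                .global_work_size = {width, height, 0},
                .local_work_size = {0, 0, 0}, // let OpenCL pick a workgroup size
            };

            // 6. Execute the kernel
            mandelbrot_kernel(&range, image, (cl_int)width, (cl_int)height, 1000);

            // 7. Copy the image data back into host memory
            const size_t origin[3] = {0, 0, 0};
            const size_t region[3] = {width, height, 1};
            gcl_copy_image_to_ptr(pixels, image, origin, region);
        });

        // 8. Wrap the bytes in an NSBitmapImageRep, encode as PNG and save
        unsigned char *planes[] = { pixels };
        NSBitmapImageRep *rep =
            [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:planes
                                                    pixelsWide:width
                                                    pixelsHigh:height
                                                 bitsPerSample:8
                                               samplesPerPixel:4
                                                      hasAlpha:YES
                                                      isPlanar:NO
                                                colorSpaceName:NSDeviceRGBColorSpace
                                                   bytesPerRow:width * 4
                                                  bitsPerPixel:32];
        NSData *png = [rep representationUsingType:NSPNGFileType properties:nil];
        [png writeToFile:[@"~/mandelbrot.png" stringByExpandingTildeInPath]
              atomically:YES];
    }
    return 0;
}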

Voila! You'll find the rendered fractal in your home directory.

As a bonus, I also stuck this in a loop and generated a video for the first 1000 iterations.

OpenCL is really powerful, and Apple has done an awesome job at integrating it into OSX and Xcode. This project doesn't even begin to scratch the surface of what you can do with it. At some point soon I'm going to take a look at some more advanced topics such as image processing and integrating with OpenGL.

OpenGL Template for iOS

Generally when I'm creating new projects in Xcode I use the 'Single-view' template, because it tends to be the one that allows for the most customization; however, when I do OpenGL work I always use the GLKit ('OpenGL Game') template. This is an OK template, but I keep finding myself deleting all of the GLKit effect-based code and replacing it with my own shader class.

I've therefore created a new template for Xcode that only uses OpenGL shaders, hides the status bar, uses anti-aliasing, has a 60fps frame rate and follows some best practices recommended by Apple that the default template doesn't.

To get started with the template, check it out on GitHub.

Reading OSX Reviews

Over the course of the last week I've reread all of John Siracusa's OSX reviews for Ars Technica, starting with OSX Developer Preview 2 and finishing with OSX Mavericks. This page handily lists all of the reviews (the earlier reviews linked back to their predecessors, but this pattern ended at 10.5).

My motivation was to learn a little more about the history of OSX, as I've only been using Macs since Snow Leopard. These reviews are a great place to get a sense of the history of the OS because they were written at the time of each version's release, rather than having been updated regularly with new information, like the Wikipedia articles.

The pattern of the early reviews (the first seven are all about 10.0 and its betas) was to focus on the differences between OSX, raw UNIX and Mac OS 9. Whilst 10.0 was very different, there are striking similarities to the OSX I use every day: it already had Aqua, HFS+, Cocoa, etc. A lot of the key differences, for Siracusa at least, were the user interface and experience, the Finder and the file system (there was also a font kerning issue in the Terminal for several versions that was his pet peeve).

As OSX progressed, the reviews did too. Up until 10.4 Tiger the reviews often discussed the performance changes from the previous version (especially the fact that releases got faster on the same hardware), whereas after this the performance improvements were much smaller (Siracusa, whilst expressing admiration, did point out that the cynic's response is that 10.0 was so slow that there was plenty of room for improvement).

Most early reviews also advised the reader on whether or not they should upgrade, especially given that each new release was a $129 upgrade. Later reviews do not even consider this (at one point Siracusa regarded it as a compulsory 'Mac tax'), but with Mavericks being free, price is given a final consideration in its review. Modern reviews seem to follow this pattern:

  • Introduction and summary of expectations from the last review
  • The new features
  • Performance changes, if any
  • Kernel changes and enhancements
  • New technologies and APIs, if any
  • File system rant (which, having read the earlier reviews and got some context, now seems justified)
  • Grab bag (summary of minor apps and changes)
  • Conclusion and looking ahead to the future

The evolution of OSX is fascinating from a user and developer perspective. As new developer technologies, like Quartz 2D Extreme/QuartzGL, Core Image/Audio/Video, OpenCL, Objective-C 2.0, Blocks, GCD and ARC, emerged, each was carefully explained. Whilst I am familiar with these technologies today, it was awesome to see when, why and how they were released. Transitions in other areas (32-bit to 64-bit, PowerPC to Intel, 'lickable' Aqua UI to a flatter UI) were also interesting to read about, especially given a modern perspective.

Reading the reviews in 2014 has been fun, especially considering that some of them are almost 15 years old. Often Siracusa made predictions of varying accuracy:

  • Tags were predicted in the Tiger review
  • High DPI/retina displays were predicted when support was first added in OSX (although retina displays weren't considered until the Lion review)
  • HFS+ would be replaced
  • Macs with 96GB of RAM (this was in the Snow Leopard review, but the current Mac Pro can be configured to ship with up to 64GB and supposedly supports up to 128GB)

Another interesting series of articles was 'Avoiding Copland 2010', which was written in 2005 and discussed the various enhancements that Apple would have to make to Objective-C in order for it to remain competitive with other high-level languages. I also recommend listening to the associated 2011 episode of Hypercritical, A Dark Age of Objective-C. With the recent debate over whether Objective-C/Cocoa should be replaced (they shouldn't), these articles are surprisingly relevant.

I highly recommend reading the OSX reviews if you're a developer and haven't read them before, or are just interested in the history of the OS. I ended up reading all of the reviews in Safari's reading mode, because it was able to take the multi-page reviews and stick them on a single page (albeit with some images missing) - in the past I had used Instapaper for this, but it occasionally seems to miss multi-page articles.