In the first installment of this series, I told you that I had done the design for the first version of Well Tempered myself. But the design didn’t look great, and the logo was not selling the tuner either.

For this version, I wanted a better design. Not having learned my lesson, I thought I could become a better designer and went to work, but after spending enough time with my sketches I realized I should call in the professionals. I looked at a lot of designs people had made and asked for quotes, providing my initial design, my wishes, a history of the project and its finances, and App Store promo codes for my old version.

Of the eight designers who got back to me, I chose to work with two. I really wanted multiple inputs, and was even contemplating doing two distinct versions of the app. Oh boy, what a load of extra work that would have been! One of the two quickly went silent, though, so I ended up working only with Aleksandar from Serbia.

Working with Aleksandar was really a lot of fun. We did all of the work through 99designs, and I can really recommend it as a collaboration tool for design projects. 99designs’ concept is to solicit many designs and have you pay only for the one you really like. I opted not to go with that model, trusting my own ability to find designers and suspecting I would feel bad for all the designs I would choose not to pay for. But 99designs still got their cut by providing the collaboration tools and holding the payment until the end of the project.

Aleksandar isn’t a musician, but we had fun iterating through the designs. Whenever I found something hard to explain to a non-musician, I would just shoot an iPhone video, put it on Dropbox and send him the link. I’m sure he knows a lot about how to tune an instrument now. He also ping-ponged with his musician friends, and delivered a design that I thought looked really good. I’ll admit it was in many ways very close to my initial sketches, and that worried me, as I’m not a skilled designer. But when I ran it by other musicians it got good feedback, so I went ahead with it.

But here was the first stumbling block: I got so caught up in ping-ponging about the design that I put off implementing it. Yes, implementing something before it is mature makes for code that is best thrown away, and risks locking you into that design, since it is easy to start resisting changes once you know the extra work they will bring. But in retrospect I really should have implemented parts of it, because as I started implementing the final design, I noticed lots of usability issues that we hadn’t worked through. I was very happy with Aleksandar’s responsiveness, even after we thought we were done, but I also reworked parts of the design myself, asking for his thoughts on it.

Come back for the fourth installment of the Well Tempered behind-the-scenes story.

Audio Frameworks

I ended the last installment of the Well Tempered behind-the-scenes story by saying that the 1.x versions were live from early 2009 to mid 2015. I never intended for version 2.0 to take so long. Well Tempered was my first released app as an iOS developer, and I’ve lost count of how many applications, in different versions and variations, I released before version 2.0 of Well Tempered was finally in the App Store, so it wasn’t because I had stopped developing for the platform. Far from it. If anything, it was probably that I wanted too much - the second-system syndrome was strong for this project.

I really wanted to deliver a tuner that would listen to what people played and tell them how flat or sharp they were. I had already done the calculations to know where the target was with the pitch pipe, and I had implemented my own spectral tuner back in 2006, so how hard could it be?
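The flat-or-sharp readout itself is just arithmetic: the deviation between a measured pitch and a target pitch is conventionally expressed in cents, where 100 cents make one equal-tempered semitone. A minimal sketch in Python (not the app’s actual code; the function name is my own):

```python
import math

def cents_off(measured_hz, target_hz):
    """Deviation of a measured pitch from a target, in cents.
    Positive means sharp, negative means flat;
    100 cents = one equal-tempered semitone."""
    return 1200.0 * math.log2(measured_hz / target_hz)

print(cents_off(446.0, 440.0))  # ≈ 23.4 cents sharp
```

The hard part, of course, is not this formula but getting a reliable `measured_hz` out of a noisy microphone signal in real time.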

Well, the original iPhone wasn’t a great place to calculate FFTs in real time. I believe the Accelerate framework (which I seem to remember was called vecLib at the time) was available with iPhoneOS 2.0, but I could not get it to perform well on my iPhone 3G, which still ran the original iPhone hardware but with a 3G modem.

When the iPhone 3GS came around, I did have a rudimentary implementation that performed, but AudioQueue wasn’t a nice framework to play ball with, and I was never able to get a precision I could accept. The code I wrote for this was hard to maintain, and debugging and experimenting with it to try to increase the precision drained my energy and interest so much that I abandoned the code, not wanting to build a new release upon it.

The reason AudioQueue was written in C was to avoid the overhead of Objective-C. The overhead wasn’t much, but there was little room to spare. Since there was really no alternative to AudioQueue, I built some Objective-C wrapping of my own around it, but was never happy with the performance. That is why I experimented with every new open source audio framework I saw for the iOS platform. Three in particular drew my attention:

libpd, a Pure Data engine for iOS, got a lot of my attention. My original work had been done with Pure Data and Max/MSP, so this was a world I knew, and I was hoping I could bring over my earlier work and base a version 2.0 upon it. While I still find the project very interesting, I must admit I have yet to ship anything built on it. I have used it for demos, by myself, with friends and with co-workers, but never shipped anything with it.

The second project I found amazing was Novocaine, which provided a nice block interface on top of Core Audio, striking a nice compromise between performance and convenience. But alas, I could never compute the frequencies just right with it, and after a bit of testing I discovered the examples wouldn’t play a clean sine tone. In the end, I abandoned it.
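A clean sine tone is a good smoke test for any audio pipeline, because the most common failure is audible clicks from phase discontinuities at buffer boundaries. Here is a sketch of the core requirement in Python (illustrative only - Novocaine is Objective-C, and these names are my own): the generator must carry its phase from one buffer to the next.

```python
import math

def sine_buffers(freq_hz, sample_rate, buffer_size, num_buffers):
    """Generate consecutive audio buffers of a sine tone,
    carrying the phase across buffer boundaries so the
    waveform stays continuous (no clicks between buffers)."""
    phase = 0.0
    step = 2.0 * math.pi * freq_hz / sample_rate
    for _ in range(num_buffers):
        buf = []
        for _ in range(buffer_size):
            buf.append(math.sin(phase))
            phase = (phase + step) % (2.0 * math.pi)
        yield buf

# Two adjacent 256-sample buffers of a 440 Hz tone at 44.1 kHz;
# the last sample of the first buffer flows smoothly into the second:
bufs = list(sine_buffers(440.0, 44100.0, 256, 2))
```

If each buffer instead restarted its phase at zero, every buffer boundary would produce a waveform jump and an audible click.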

The third project that got me excited was AudioKit. AudioKit looked like a really nice framework on top of Csound. I was surprised to find I could use it in real time, as I had always seen people use Csound to render sounds and music to be played back afterwards. And with the nice examples that came along with it, I could easily prototype enough of what I wanted to give me confidence that I could build my version 2.0 upon it.

AudioKit has a great maintainer and a very inclusive community with an active mailing list. If you want to write audio software for iOS, you really should check it out. I hope to use it in many later projects, and plan to continue using it as the basis for Well Tempered’s audio and audio analysis.

Come back for installment three of the Well Tempered behind-the-scenes story.

This is the Well Tempered Chronicles, a behind-the-scenes look at the development and continued life of my tuner for the iPhone and iPad, Well Tempered.

Well Tempered 1.0

I wrote the first version of my chromatic tuner, Well Tempered, back in January 2009, when I was between jobs because of a non-compete clause in the contract from my old job. At that job I had gotten to spend some time with the iPhoneOS betas, and when my boss decided there was no market for it, I continued in my spare time.

The choice of working on a tuner specialized in different temperaments came from my research on the topic while I was preparing to do a PhD in Medialogy at Aalborg University back in 2005. The funding for the PhD never came through, and I left to become an IT consultant.

My background for wanting to do a PhD in Medialogy was that even while I studied for my Cand.Scient. in computer science at the University of Oslo, I devoted a whole lot of time to my music. I had started playing around the same time I started programming, around the age of 7, and the two had always competed for my attention. I always wanted to become an engineer, except for when I wanted to become a composer. So in the end I did one year of composition, a five-year Cand.Scient. in computer science, and a five-year diploma in recorder playing.

But let’s fast-forward to 2009 again. iPhoneOS 2.0 had been out for a little while, and I wanted to apply my skills and the data collection I had done during my research in 2005-2006. The OS contained the C-oriented audio framework AudioQueue, and I would use it to make a pitch pipe that could play different sounds at the precise frequencies needed for tuning scales in different temperaments.
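The frequency math behind such a pitch pipe is compact. As an illustration (Python, with my own function names - not the app’s actual code), here is equal temperament plus the cent measure that makes temperaments comparable: a pure major third (ratio 5/4) is about 14 cents flatter than the equal-tempered one, and historical temperaments distribute such differences across the scale in different ways.

```python
import math

A4 = 440.0  # reference pitch in Hz

def equal_tempered(semitones_from_a4):
    """Equal temperament: every semitone is the ratio 2^(1/12)."""
    return A4 * 2.0 ** (semitones_from_a4 / 12.0)

def cents(ratio):
    """Size of an interval ratio in cents (100 cents = 1 ET semitone)."""
    return 1200.0 * math.log2(ratio)

# The gap between a pure and an equal-tempered major third is the
# kind of difference temperaments trade off against each other:
print(round(cents(5 / 4), 2))          # pure third ≈ 386.31 cents
print(round(cents(2 ** (4 / 12)), 2))  # ET third = 400.0 cents
```

A pitch pipe for a given temperament then amounts to a table of such ratios, one per scale degree, multiplied by the chosen reference pitch.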

Although many users later asked for a tuner that would listen to a tone and tell how sharp or flat it was, I really felt that having a reference tone to tune against was far more precise, so this is where I chose to focus. Working with musicians who played in other settings than the chamber music I was used to, or who played instruments with other tone qualities than the harpsichord and recorder at my disposal, I explored different sounds to include with the pitch pipe that would better suit tuning different instruments.

User interface wasn’t something I knew much about, and honestly, the App Store was a mishmash of different UI attempts. I was very fascinated by Core Animation, and was delighted that I could make it look like I rolled and unrolled a parchment scroll, which I used to display a description of the chosen temperament. Temperaments were chosen in a picker view; by name it sounded like the obvious choice, but alas, the experience of choosing a temperament with it was not great. The rest were buttons, labels and text fields, on the background of a baroque painting that I blurred a bit here and there to match the background better.

All in all, development was done in less than a week, and I was excited to see what life in the App Store would be like.

Back in 2009, not many people had an iPhone. I soon discovered that my target audience, musicians playing early music instruments, didn’t have an iPhone, seeing it as an interesting but oh so expensive device that was outside their budget. So sales were always slow.

While working on bug fixes and adding temperaments, I had two main requests: adding a tuner that would tune by sound and display whether the played tone was too flat or too sharp, and keeping compatibility. When I updated to require iPhoneOS 2.2.1, people who had not upgraded would complain. I don’t remember exactly why, but there was a lot of resistance to upgrading back then. So it was radical for me to scrap compatibility with iPhoneOS 2.x when I released Well Tempered version 1.3, the last version of the 1.x series. Nothing too radical, though: I would still keep compatibility with the original iPhone.

Marketing-wise, I had totally misunderstood the deal with Apple. It was my impression that the 30% cut of sales that Apple took included marketing. I clearly remember Scott Forstall saying so in the videos, though I could be mistaken; I will admit I have not gone and looked it up to have a reference for you. So I did very little marketing outside the App Store, and my quirky-looking, under-marketed tuner intended for early music players who couldn’t afford an iPhone did rather poorly, probably selling around 300-400 copies from early 2009 to mid 2015.

While the first version of Well Tempered didn’t sell much, I still consider it a success, since the people who did buy it spent an average of almost 15 minutes each time they used it. And while I have bought many tuners since, I always ended up using my own. So in terms of scratching my own itch, it was absolutely a success.

Come back for installment two of the Well Tempered behind-the-scenes story.

Dear Neil,

Thanks for episode #8 of the Above Avalon podcast (blog article here). I really enjoy the show, and I’d like to add some comments to this episode in particular. It’s really a follow-up to my previous article about the iPad.

Here on the west coast of Denmark, I have been surprised to see a lot of iPad Air 2s. The model I usually see is the silver 64 GB with 4G, which has surprised me, as this is not an inexpensive model. The other one I see a lot is the iPad 2 32 GB Wi-Fi model. Apart from those I see very few of the other models, and remarkably, I haven’t seen anyone actually owning the current-generation iPad mini.

The fact that the iPad Air 2 shows up so strongly has surprised me, as it didn’t seem like such a big deal. The reason, I’ve concluded, is that a lot of people are upgrading from the iPad 2, which must have been a great success, seeing how many are out there. But also, about a third of the people I’ve spoken to who got an iPad Air 2 were new to the iPad.

So I’m more optimistic about the numbers. Yes, the west coast of Denmark probably isn’t very representative of Apple’s market, but it’s the data point I see, and I thought I should share it with you.

I think someone at Apple said “Wouldn’t it be cool if iOS apps could run on OS X?” And I think what we will get is the MacBook Air 12” (2015 model). Here is why I think so:

Apple is often very good at leaving clues about the future. The clue I find the most interesting is what Tim Cook said when introducing the A7 processor, and which Apple has repeated many times since: “it’s really a desktop-class CPU”. And with the iPad Air 2 I must admit I’m using it much like I would a desktop - I’ve even added a keyboard for it:

My iPad setup

Of course, with Apple you have to be selective about which comments to give weight to. The last reference I’ve found to a touchscreen Mac is from October 2014, where Craig Federighi said “We don’t think it’s the right interface, honestly, […] Mac is sort of a sit-down experience.” But since I kept unconsciously touching my desktop screen and my laptop screen at the same time, my own reactions tell me that this is no longer true.

Right now, the rumour mills are full of speculation about devices with a 12” screen; both a 12” MacBook Air and a 12” iPad Pro have been predicted. I really think these are the same device. Apple has at times made two identically sized but still different screens, but usually they don’t. The iPad version of iOS is honestly not as well adapted to its form factor as the iPhone version, so I think making a 12” version of iOS that isn’t just a scaled-up version would stretch already thinly stretched resources too much.

I also don’t think Apple would simply scale the interface up: Apple has already hinted at this by giving developers an iPad simulator with a resizable width and height. Resizing would make sense if it were used for having two apps run side by side on iOS. It would also make sense if iOS apps were running in a window on OS X.

By going the route of a 12” touchscreen display for the next MacBook Air, and adding an A-series chip such as the A8 into the MacBook in addition to the expected Intel chip, Apple could let any iOS app run on the Mac, without emulation. It’s not like they haven’t done similar things before: when Apple was switching to the PowerPC, I was told (by my neighbour at the time) that it would be able to run both Windows (3.x) and Motorola 68k Mac applications. We’ve also had the two-OSes-in-one-Mac situation with the Classic environment, which let Mac OS 9 applications run inside OS X, and again with Rosetta, which let PowerPC applications run on Intel Macs.

In material costs, TechInsights estimated that the A8 processor cost $37 at the time of the iPhone 6 launch - a small extra hardware price for a spectacular integration point.

The motivation for doing this, apart from great integration opportunities, is to make the Mac a more interesting platform. All of a sudden it would have everything you have on iOS! Even more important, it would instantly make the many iOS developers Mac developers!

When I began developing for iOS back in the beta days of iPhoneOS 2.0, my plan was to use it as a stepping stone to build Mac apps. I still haven’t released a single Mac app, having only made a few to help me automate my build processes. I am sure I am not the only developer in this boat, and I know of many developers that have complained about the AppKit framework missing features UIKit developers take for granted.

To add to the argument, Apple released size classes for iOS, which, together with Auto Layout, help us span the range from 3.5” to 10” devices. With the Apple Watch I expect they’ll bring us down to 1.25”; could that be a key to spanning up to 27”? At least beginning with 12” as an incremental step would not seem unreasonable.

An argument against, of course, is that examples of using Auto Layout and size classes to span the existing range in a manner that feels optimized for each screen size are few and far between. How do you design a responsive UI anyway? The web folks tell us they have been doing it for ages, yet I have not seen anything that meets my criterion above: “feels optimized for each screen size”.

The second argument against is Android. It has also supported responsive UIs for a long time, and there too I can’t think of an example that meets my criterion. And while I’ve been able to run Android and its apps on my Mac for a while now, I can’t say I’ve really made good use of that.

The scary part of the scenario is, of course, the consequence for Mac apps. The iOS apps would probably all be vended through the iOS App Store, with its 30% cut to Apple and low prices. If the Mac App Store were joined together with it, it would probably impact the pricing of Mac apps - the opposite of what many developers are trying for: bringing Mac-style pricing over to iOS.

So that is my speculation for the 12” MacBook Air / 12” iPad Pro/Plus. I’d love to hear your thoughts, and I’m looking forward to seeing how well this matches what Apple actually brings this spring.