Thursday, June 21, 2012

Mobile app vs. mobile site?

The mobile app vs. site debate started out being slanted in a single direction: an app is native, can do more, and does not require an internet connection. Also, having a mobile app downloaded means a significantly more captive audience than a website. Really, does anyone use bookmarks on a smartphone?

Recently, the balance has been shifting. According to ABI Research, "the mobile web is getting more and more sophisticated... so that more subscribers will use the functionality on mobile websites themselves rather than dedicated apps..." In fact, the download rate is predicted to begin its decline after a peak in 2013, as "many applications (increasingly built on web standards) will migrate from app stores to regular websites, and for some sites you won’t need an app at all."

Several factors, I feel, will shape the outcome:
  • Bandwidth availability. Say what you will, but many of your apps already fail to work offline, demonstrating that we rely on a near-constant internet connection.

    Plop! There goes the BIG reason for native apps: connectivity.
  • Usage patterns and failure rate. More and more, apps rely on in-app purchases to generate revenue. According to Localytics, over a quarter of downloaded apps are abandoned after a single use. Still, this is far better than typical website abandonment rates.
  • Proliferation of web technologies for mobile app development. A year or so after the iPhone release, I looked into iOS development and learned that payday lay behind the many mountains of a steep learning curve.

    As technologies such as PhoneGap and AppMobi build tools to make mobile development a snap, and JavaScript libraries mature to create miracles (how's this for a 3D miracle?), we are required to know less, our skills become more portable, and we no longer need to fight with each phone platform's OS, libraries, and the variety of programming languages required to support multiple devices.

    Wait, am I making the argument that there will be more "native" apps because they are easier to build? Well, yes - but here is the flip side: apps are looking more and more like mobile websites. As the trend continues, consumers will abandon the high hurdle of downloads and navigate via the web - and app developers will maintain a website with functionality identical to the app's, eventually making a separate app deployment unnecessary.
Will the concept of a mobile app go away? Of course not. Most of us do not use the web to check our email. Many have desktop task managers and office products. But let's face it: there was a time when the web was mostly content and stores, while "real work" was done locally. That paradigm is gone. Our life is online, and there are a few key applications we like to keep close.  

What does this mean for mobile developers? In my opinion, it pushes the debate of native development vs. web technologies strongly in the direction of the web. It's where we live. It's what we are. It's what the future looks like.  So close that C# text book and get to know jQuery Mobile. You'll get far more bang for your buck.  Enjoy!

Friday, June 15, 2012

Audio Sprites

Creating audio that works across browsers and mobile platforms is a challenging task. After spending several days attempting to make it work, I ran into an impasse on iOS. So I turned to a new concept: audio sprites. It works!

Before I start posting code samples, some bottom-line conclusions about getting the audio tag to work in HTML5 across platforms:

  • You will need multiple audio file formats to support different platforms. W3Schools has a good list of audio formats supported by different browsers.
    • A combination of mp3 and wav will do the job, and that is what I chose.
  • When using audio for sound effects in an app, do not attempt to break them into individual files. Instead, create a single file with well-defined break points you can seek to in order to find the sound you need.
    • Be sure to add nice-sized periods of silence between sounds to ensure you have some leeway in arriving at the right place within the audio file. 
    • Audacity is a great tool to do the necessary sound editing, despite the painful UI (really painful; you've been warned).
  • Allow a couple of weeks in your project plan for a sound prototype. Things don't always work the way you hope, but they do work out in the end. Knowing this, perhaps you won't blow the timeframe.
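
To make the single-file layout concrete, here is a sketch of the kind of timing map I mean. The slot length, padding, and sound names here are invented for illustration; the idea is that every clip gets a fixed-length slot with silence padding at both ends, so imprecise seeking still lands inside the right slot.

```javascript
// Hypothetical sprite layout: each sound occupies a fixed-length slot,
// padded with silence so that imprecise seeking stays inside the slot.
var SLOT_LENGTH = 2.0;   // seconds per slot, including padding
var PADDING = 0.3;       // silence at each end of a slot

var sprite = { b: 0, a: 1, t: 2 };  // sound name -> slot index

// Where to start playing a given sound, skipping the leading silence.
function startTime(name) {
  return sprite[name] * SLOT_LENGTH + PADDING;
}

// Where the sound must stop, before the next slot's audio begins.
function endTime(name) {
  return (sprite[name] + 1) * SLOT_LENGTH - PADDING;
}
```

With this layout, startTime('a') is 2.3 (slot 1 starts at 2.0, plus 0.3 of padding) and endTime('a') is 3.7, well clear of the 't' slot that begins at 4.0.
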
So why HTML5 audio instead of Flash? Flash is an awesome tool of the past. HTML5 is young, but it'll get there. Invest in the right solution early. And like it or not, Apple has made its intentions known, and it will win. Flash has already been dealt its fatal blow.

Now to the fun part. Audio sprites parallel CSS sprites: a single file from which you display (or, in our case, play) just the portion needed for a specific item. The file can be reused many times and may therefore contain data for different, even unrelated, purposes.

A sprite example from the CSS sprites tutorial at W3Schools
I discovered the concept of audio sprites on Remy Sharp's blog, where he posted the results of a monumental effort to get HTML audio working on iOS. Two and a half years later, his post is not a day out of date.

I have made some changes to his code and will describe the results here.  

My biggest discovery was that one needs to put some silence between individual sound bites and plan to "trim" the edges.

Snapshot of an audio file with
silent edges shown in dark grey

I modified the code to take the size of an edge as a parameter and to account for it both when starting and when stopping audio. Different browsers handle timing ever so slightly differently, and you will lose that battle unless you play it safe. We are talking fractions of a tenth of a second - but letting a bit of the next sound bleed into the previous one will spoil the effect.

Second, audio is non-blocking, so multiple calls to audio.play() result in the browser trying to play bits and pieces simultaneously... badly. So I created a simple queue that starts the next sound once the previous one has ended.

Finally, I realized that we may need a pause when playing a series of sounds. I added a pause parameter to the queue-management code to wait between sounds when required.

So here it is. The basic usage I was building for: first, we set up a constructor of Track, the class that will be playing the audio. (The comments about the included iOS magic are Remy's.) Then the fun part: play, pause, and manage the queue.

I hope this basically makes sense. Note the use of this.edge both when playing and when pausing.
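
For readers whose feed reader strips the embedded gists, here is a condensed sketch of the approach described above. It is simplified and uses my own names (playOne, pauseBetween); it is not the exact code from the gists. Instead of a real audio element, it accepts anything with play(), pause(), and currentTime, which also makes it easy to exercise outside a browser:

```javascript
// Condensed sketch of the edge-trimming, queueing Track (illustrative).
// `audioLike` is anything with play(), pause(), and currentTime --
// in the browser, a real <audio> element carrying the sprite file.
function Track(audioLike, spriteLength, edge) {
  this.audio = audioLike;
  this.spriteLength = spriteLength; // seconds per sprite slot
  this.edge = edge;                 // silence to trim at each end
  this.queue = [];                  // positions waiting to be played
}

// Start one clip, trimming the silent edges on both sides.
Track.prototype.playOne = function (position) {
  var track = this,
      audio = this.audio,
      start = position * this.spriteLength + this.edge,
      stop  = (position + 1) * this.spriteLength - this.edge;

  audio.pause();
  audio.currentTime = start;
  audio.play();

  // poll for the stop point; timing differs slightly across browsers
  clearInterval(track.timer);
  track.timer = setInterval(function () {
    if (audio.currentTime >= stop) {
      audio.pause();
      clearInterval(track.timer);
      track.playNext(); // audio is non-blocking: queue, don't overlap
    }
  }, 10);
};

// Public entry point: queue positions instead of overlapping play() calls.
Track.prototype.play = function (position) {
  this.queue.push(position);
  if (this.queue.length === 1) this.playOne(this.queue[0]);
};

Track.prototype.playNext = function () {
  var track = this;
  track.queue.shift();
  if (track.queue.length) {
    // optional gap between queued sounds (the "pause" parameter)
    setTimeout(function () { track.playOne(track.queue[0]); },
               (track.pauseBetween || 0) * 1000);
  }
};
```

In the browser you would construct it with a real element, e.g. new Track(document.getElementById('sprite'), 2.0, 0.25), and call track.play(slotIndex) for each sound.
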

Questions? Bugs? Comments? Hate mail?

Monday, June 11, 2012

Posting code on your blog

If I want to have a technical blog, I'd better figure out an easy way to post code. I looked around and found what appears to be the golden goose: quick, easy, very pretty.


Check out Gist - a simple tool that lets you create a code snippet on GitHub and embed it using something as simple as:

<script src="https://gist.github.com/2914590.js"></script>

and have the post looking as pretty as this:



There is one caveat: RSS readers do not get to process JavaScript! Sad as that is, you need to account for those readers and display the standard ugliness:

<noscript><pre>
code here
</pre></noscript>

This is what the beautiful snippet above will look like:

function Track(src, spriteLength) {
  var audio = document.createElement('audio');

  audio.src = src;
  audio.autobuffer = true;
  audio.load(); // force the audio to start loading...doesn't work in iOS

  this.audio = audio;
  this.spriteLength = spriteLength;
}

Track.prototype.play = function (position) {
  var track = this,
      audio = this.audio, // the audio element with our sprite loaded
      length = this.spriteLength, // the length of the individual audio clip
      time = position * length,
      nextTime = time + length;

  audio.pause();
  audio.currentTime = time;
  audio.play();

  // clear any stop monitoring that was in place already
  clearInterval(track.timer);
  track.timer = setInterval(function () {
    if (audio.currentTime >= nextTime) {
      audio.pause();
      clearInterval(track.timer);
    }
  }, 10);
};
Not great, but better than nothing.  :-)  Note that you will need to paste the code while in Compose mode, so that it properly escapes all HTML brackets. To simplify the process, I do something like this:


  • Add temporary markers first:
    START
    END
  • Paste code in-between
    START
    my $var = 'hello';
    END
  • Switch to HTML mode
  • Add script tags above START :
    <script src="..."></script>
    START
    my $var = 'hello';
    END
  • Add noscript tags, replacing START and END
    <script src="..."></script>
    <noscript><pre>
    my $var = 'hello';
    </pre></noscript>
  • All done!
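
If you'd rather not escape the brackets by hand before pasting into the HTML view, a trivial helper (my own sketch; nothing Blogger-specific) produces the noscript-safe version of a snippet:

```javascript
// Escape the characters that matter inside a <pre> block, so the raw
// snippet can be pasted into the HTML view without being parsed as markup.
function escapeForPre(code) {
  return code
    .replace(/&/g, '&amp;')   // ampersands first, or we'd double-escape
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;');
}
```

For example, escapeForPre("if (a < b) { }") yields "if (a &lt; b) { }", ready to drop between the noscript/pre tags.
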

From now on, only pretty code snippets in this blog. Enjoy!


Audio in iOS

If you have considered putting sounds together using HTML5 and JavaScript, as tutorials all over the web suggest, you have likely run into trouble everywhere you turned.

Apple has made some undocumented (or undocumented-where-I-looked) decisions, which make development painful.  First of all, each audio file you wish to play must be triggered by a user action.  This means: click a button => play 'a'; click another button => play 'b'.  Forget playing 'a', 'b', 'c' in a row consistently.  This also means no sound on page load!

Turns out, there are ways to get around it. This is the purpose of my life for the next few days.

But first, I would like to show what DOES NOT WORK in iOS.


    $('a#play-all').bind('vclick', function(){
           // here is our touch/click handler that starts everything up
           var audio = $('audio#a').get(0);
           // so far so good - here is the audio element
           audio.play();  // great! You'll hear your sound
    });


So what am I complaining about?  Oh, right!  I want to play multiple sounds in a row.  Let's say I am sounding out the word B-A-T, with a sound file associated with each letter.


    var sounds = ['b', 'a', 't'];
    var i = 0; // start at the beginning
    // somewhere in our html file we have the <audio> elements, one for each sound
    $('a#play-all').bind('vclick', function(){
           // here is our touch/click handler that starts everything up
           var audio = $('audio#' + sounds[i]).get(0);  // get the b sound first

           // event handler goes here!
           $(audio).bind('ended', function(){
              ++i;
              if (i >= sounds.length) return; // no more sounds to chain
              var next = $('audio#' + sounds[i]).get(0);
              next.load(); next.play();
           });
           audio.play();
    });

This code and its variants work perfectly across all major browsers and are a great solution if your web app is not intended for an iOS browser.  On iOS, however, you will not hear the second sound come through!  Each additional user click only buys you the loading of one more audio file.

After days of struggling with this problem (which I documented on Stack Overflow with no success), I found something new to experiment with: audio sprites.  Remember CSS sprites?  The idea that you load up a single image full of buttons, rounded corners, etc., then find the right portion of the image to display for each part of the page?  Well, why not apply the same concept to audio?

Here you create a single audio file containing all the sounds you will need, then seek to the appropriate position for each one.  Results in the upcoming post.



   

PhoneGap Getting Started

I have now developed a few prototypes with HTML5 and jQuery Mobile and was ready to make a real mobile app.

I was a little confused. PhoneGap advertises itself as a cross-platform phone development platform, but its Getting Started guide has you declaring your mobile affiliation before you read the first word of documentation.  Looking at the individual guides, I gained an understanding: one can think of PhoneGap as a publishing platform. (There is much more to PhoneGap by way of its powerful JavaScript APIs, but that is outside the scope here.)  You set up a project, develop the code, then publish it to the individual platforms as needed, letting the PhoneGap software compile the proper binaries for distribution.


This discovery enabled me to answer the next question I had:

How do I organize the files?

Because my web project is the platform-independent portion of the solution, its folder should reside outside of the project hierarchy.

Here is what PhoneGap's getting started guide has to say:
IMPORTANT: Drag the www folder into Xcode 4. Don't drag the www folder into your app's folder. It needs to be dragged into Xcode 4. For example, you would drag and drop it on the highlighted red section of the HelloWorld project shown below.



This means: create the folder somewhere on your hard drive (via Finder, in my case), then grab it and drag it directly into the Xcode UI.

A dialog will pop up with a question:
This was an important discovery for me: we are simply creating a reference to the folder containing our web project.  The other resources - images, CSS, JS, HTML pages - will all reside there.  The project tree will be used for platform-specific components only.  What those are is still to be determined...

In the meantime, I am setting up a folder to sync with my web app. This will let me test and demo the app on the web while publishing it to the mobile platforms with PhoneGap.

