August 16, 2013

iOS: Force audio output to speakers while headphones are plugged in

After much searching through Apple documentation and the scarce examples of what I wanted to do, I came up with the following code. A client wanted to play audio through the iPhone/iPad speakers while a microphone was plugged in. While this solution can't do both at the same time, it lets you switch back and forth between playing sounds through the speakers and recording through a microphone or headset, without unplugging anything. It also defaults to the internal microphone and speakers if nothing is plugged in. Note that calling the setup method initially forces audio output through the speakers, rather than the headphones, if any are plugged in. Hopefully this code helps someone facing similar issues.

AudioRouter.h
@interface AudioRouter : NSObject

+ (void) initAudioSessionRouting;
+ (void) switchToDefaultHardware;
+ (void) forceOutputToBuiltInSpeakers;

@end
AudioRouter.m
#import "AudioRouter.h"
#import <AudioToolbox/AudioToolbox.h>
#import <AVFoundation/AVFoundation.h>

@implementation AudioRouter

#define IS_DEBUGGING NO
#define IS_DEBUGGING_EXTRA_INFO NO

+ (void) initAudioSessionRouting {
    
    // Called once to route all audio through speakers, even if something's plugged into the headphone jack
    static BOOL audioSessionSetup = NO;
    if (audioSessionSetup == NO) {
        
        // set category to accept properties assigned below
        NSError *sessionError = nil;
        [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker error: &sessionError];
        
        // Doubly force audio to come out of speaker
        UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
        AudioSessionSetProperty (kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);
        
        // fix issue with audio interrupting video recording - allow audio to mix on top of other media
        UInt32 doSetProperty = 1;
        AudioSessionSetProperty (kAudioSessionProperty_OverrideCategoryMixWithOthers, sizeof(doSetProperty), &doSetProperty);
        
        // set active
        [[AVAudioSession sharedInstance] setDelegate:self];
        [[AVAudioSession sharedInstance] setActive: YES error: nil];
        
        // add listener for audio input changes
        AudioSessionAddPropertyListener (kAudioSessionProperty_AudioRouteChange, onAudioRouteChange, nil );
        AudioSessionAddPropertyListener (kAudioSessionProperty_AudioInputAvailable, onAudioRouteChange, nil );
        
    }
    
    // Force audio to come out of speaker
    [[AVAudioSession sharedInstance] overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:nil];
    
    
    // set flag
    audioSessionSetup = YES;
}

+ (void) switchToDefaultHardware {
    // Remove forcing to built-in speaker
    UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_None;
    AudioSessionSetProperty (kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);
}

+ (void) forceOutputToBuiltInSpeakers {
    // Re-force audio to come out of speaker
    UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
    AudioSessionSetProperty (kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);
}

void onAudioRouteChange (void* clientData, AudioSessionPropertyID inID, UInt32 dataSize, const void* inData) {
    
    if( IS_DEBUGGING == YES ) {
        NSLog(@"==== Audio Hardware Status ====");
        NSLog(@"Current Input:  %@", [AudioRouter getAudioSessionInput]);
        NSLog(@"Current Output: %@", [AudioRouter getAudioSessionOutput]);
        NSLog(@"Current hardware route: %@", [AudioRouter getAudioSessionRoute]);
        NSLog(@"==============================");
    }
    
    if( IS_DEBUGGING_EXTRA_INFO == YES ) {
        NSLog(@"==== Audio Hardware Status (EXTENDED) ====");
        CFDictionaryRef dict = (CFDictionaryRef)inData;
        CFNumberRef reason = CFDictionaryGetValue(dict, kAudioSession_RouteChangeKey_Reason);
        CFDictionaryRef oldRoute = CFDictionaryGetValue(dict, kAudioSession_AudioRouteChangeKey_PreviousRouteDescription);
        CFDictionaryRef newRoute = CFDictionaryGetValue(dict, kAudioSession_AudioRouteChangeKey_CurrentRouteDescription);
        NSLog(@"Audio route change reason: %@", reason);
        NSLog(@"Audio old route: %@", oldRoute);
        NSLog(@"Audio new route: %@", newRoute);
        NSLog(@"=========================================");
    }
}

+ (NSString*) getAudioSessionInput {
    UInt32 routeSize;
    AudioSessionGetPropertySize(kAudioSessionProperty_AudioRouteDescription, &routeSize);
    CFDictionaryRef desc; // this is the dictionary to contain descriptions
    
    // make the call to get the audio description and populate the desc dictionary
    AudioSessionGetProperty (kAudioSessionProperty_AudioRouteDescription, &routeSize, &desc);
    
    // the dictionary contains 2 keys, for input and output. Get the input array
    CFArrayRef inputs = CFDictionaryGetValue(desc, kAudioSession_AudioRouteKey_Inputs);
    
    // guard against no input hardware being available
    if (inputs == NULL || CFArrayGetCount(inputs) == 0) return @"None";
    
    // the input array contains 1 element - a dictionary
    CFDictionaryRef diction = CFArrayGetValueAtIndex(inputs, 0);
    
    // get the input description from the dictionary
    CFStringRef input = CFDictionaryGetValue(diction, kAudioSession_AudioRouteKey_Type);
    return [NSString stringWithFormat:@"%@", input];
}

+ (NSString*) getAudioSessionOutput {
    UInt32 routeSize;
    AudioSessionGetPropertySize(kAudioSessionProperty_AudioRouteDescription, &routeSize);
    CFDictionaryRef desc; // this is the dictionary to contain descriptions
    
    // make the call to get the audio description and populate the desc dictionary
    AudioSessionGetProperty (kAudioSessionProperty_AudioRouteDescription, &routeSize, &desc);
    
    // the dictionary contains 2 keys, for input and output. Get output array
    CFArrayRef outputs = CFDictionaryGetValue(desc, kAudioSession_AudioRouteKey_Outputs);
    
    // the output array contains 1 element - a dictionary
    CFDictionaryRef diction = CFArrayGetValueAtIndex(outputs, 0);
    
    // get the output description from the dictionary
    CFStringRef output = CFDictionaryGetValue(diction, kAudioSession_AudioRouteKey_Type);
    return [NSString stringWithFormat:@"%@", output];
}

+ (NSString*) getAudioSessionRoute {
    /*
     returns the current session route:
     * ReceiverAndMicrophone
     * HeadsetInOut
     * Headset
     * HeadphonesAndMicrophone
     * Headphone
     * SpeakerAndMicrophone
     * Speaker
     * HeadsetBT
     * LineInOut
     * Lineout
     * Default
    */
    
    UInt32 rSize = sizeof (CFStringRef);
    CFStringRef route;
    AudioSessionGetProperty (kAudioSessionProperty_AudioRoute, &rSize, &route);
    
    if (route == NULL) {
        NSLog(@"Silent switch is currently on");
        return @"None";
    }
    return [NSString stringWithFormat:@"%@", route];
}

@end

July 7, 2013

Inspiration: Yoichiro Kawaguchi

If you get into any of the many facets of creative coding or graphical programming, you inevitably run into the history of the demoscene. I don't know too much about the specific history, but I enjoy watching old and new demos and learning more about the building blocks of modern generative graphics techniques. My good friend and veteran graphical programmer Kris just introduced me to one of the original masters and pioneers, Yoichiro Kawaguchi. He's gone into some really interesting territory with his work, including the creation of toys and physical sculptures based on his algorithmic pieces. I've included some videos and visuals to show off some of what I found:

283 Useful Ideas from Japan - 1990 - The Techno Deep

Origin (1985)

Embryo (1988)

Mutation (1992)

Gigalopolis (1995)

Cyloton (2002)

Yoichiro Kawaguchi Exhibition at Yushima Seido (2009)

Virtual creature simulations (2011)

Gross Tendril (2012)

May 27, 2013

JavaScript: Throttle requestAnimationFrame to maintain 30fps

One problem with using requestAnimationFrame is that rendering will take place as quickly as the computer can process the per-frame calculations and screen redraw. If you only want to run at 30fps, your computer might be running a lot faster than you want. To work around this problem, simply check the elapsed time before running the next frame update. Check out the example:
var frameLength = 33; // this is ~1/30th of a second, in milliseconds (1000/30)
var lastFrame = 0;

var render = function() {
  if(Date.now() - lastFrame > frameLength) {
    lastFrame = Date.now();

    // run your 30fps code here...
  }
  requestAnimationFrame(render);
};
requestAnimationFrame(render);
You'll notice that I'm using Date.now(), which requires a polyfill for old versions of IE. requestAnimationFrame also requires a polyfill for some browsers. Another solution is to use time-based calculations, but that's not always easy to implement.
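As a rough sketch of that time-based approach (the makeAnimator helper and the pixels-per-second value are my own illustration, not part of the snippet above), you can derive animation state from elapsed time, so motion stays consistent no matter how fast frames arrive:

```javascript
// Framerate-independent animation: position is computed from elapsed time,
// so a fast or slow machine produces the same motion per second.
// The clock parameter is injectable (for testing); it defaults to Date.now.
function makeAnimator(pixelsPerSecond, clock) {
  clock = clock || Date.now;
  var startTime = clock();
  return function position() {
    var elapsedSeconds = (clock() - startTime) / 1000;
    return elapsedSeconds * pixelsPerSecond;
  };
}

// Inside a requestAnimationFrame callback you'd call position() every frame
// and draw at that coordinate - no throttling needed.
```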

May 25, 2013

Bookmarklet: Scrub through a Vine video

I was watching a friend's Vine video on the web, and I got the idea that it would be cool to control the playback of the video. I wrote this little bookmarklet to scrub through the video as you move your mouse over it. Here's the original JavaScript:
// grab video element and pause it
var vid = document.getElementById('post_html5_api'); 
vid.pause(); 
// get x offset of video
var vidX = 0; 
var el = vid;
while (el && !isNaN( el.offsetLeft ) && !isNaN( el.offsetTop ) ) {
  vidX += el.offsetLeft;
  el = el.offsetParent;
}
// scrub the video based on mouse x
var vidTime = vid.seekable.end(0); 
vid.addEventListener('mousemove', function(e) {
  var x = e.clientX - vidX;
  var percent = x / vid.offsetWidth;
  vid.currentTime = percent * vidTime;
}, false);
And the bookmarklet (Vine Scrubber):
javascript:(function()%7Bvar%20vid=document.getElementById('post_html5_api');vid.pause();var%20vidX=0;var%20el=vid;while(el&&!isNaN(el.offsetLeft)&&!isNaN(el.offsetTop))%7BvidX+=el.offsetLeft;el=el.offsetParent;%7Dvar%20vidTime=vid.seekable.end(0);vid.addEventListener('mousemove',function(e)%7Bvar%20x=e.clientX-vidX;var%20percent=x/vid.offsetWidth;vid.currentTime=percent*vidTime;%7D,false)%7D)();
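The offsetParent walk in the middle of that snippet is a handy technique on its own. Pulled out as a standalone helper (my naming, not from the bookmarklet), it computes an element's x position on the page:

```javascript
// Walk up the offsetParent chain, summing offsetLeft values, to find an
// element's absolute x position - the same technique the bookmarklet uses
// to find the video's left edge.
function getPageX(el) {
  var x = 0;
  while (el && !isNaN(el.offsetLeft)) {
    x += el.offsetLeft;
    el = el.offsetParent;
  }
  return x;
}
```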

April 28, 2013

Bookmarklet: Select & invite all friends on Facebook

This may be an evil thing, as I hate getting unwanted invites and spam on Facebook... But if you're throwing an event or have created a Facebook "Page", you might want to invite a bunch of people. You probably don't want to have to click each person's name/picture to add them to the invite, so I wrote a little bookmarklet to select them all at once. Simply scroll down to the bottom of your list of friends (it will load more in as you scroll). Once your (no-longer) friends have all loaded, click the bookmarklet to check them all. Here's the original code:
var checks = document.getElementsByClassName('checkableListItem');
for(var i=0; i<checks.length; i++){ $(checks[i]).click(); }
And the same code, reformatted for a bookmarklet:
javascript:(function()%7Bvar checks%3Ddocument.getElementsByClassName(%27checkableListItem%27)%3Bfor(i%3D0%3Bi<checks.length%3Bi%2B%2B)%7B%24(checks%5Bi%5D).click()%3B%7D%7D)()%3B
I shall pay for this with spam karma.

April 6, 2013

JavaScript: Use the goo.gl link shortener from your own site

Here's a quick, stripped-down version of a JavaScript implementation of the goo.gl link-shortener service. It asynchronously loads the Google client API, then uses another callback when the link shortener service is loaded. After the service loads, you can call shortenUrl() as many times as you'd like. For simplicity, I've only shortened one URL here. It doesn't appear that you need an API key to simply shorten URLs, but certain calls to this service would require one. Here's the basic version, which should work in modern browsers.
var shortenUrl = function() {
  var request = gapi.client.urlshortener.url.insert({
    resource: {
      longUrl: 'http://plasticsoundsupply.com'
    }
  });
  request.execute(function(response) {
    var shortUrl = response.id;
    console.log('short url:', shortUrl);
  });
};

var googleApiLoaded = function() {
  // gapi.client.setApiKey("YOUR API KEY")
  gapi.client.load("urlshortener", "v1", shortenUrl);
};

window.googleApiLoaded = googleApiLoaded;
$(document.body).append('<script src="https://apis.google.com/js/client.js?onload=googleApiLoaded"></script>');

March 2, 2013

JavaScript: Antialias post-processing with THREE.js on a (non) retina screen

When drawing a basic Mesh object in THREE.js, the edges can be particularly jagged if your browser doesn't properly support antialiasing in WebGL (most don't seem to at the moment). In my current project this became a sticking point, and I set out to fix the aliased edges of my 3D models and Mesh objects.

I found the FXAA post-processing shader effect in the THREE.js library, and it worked like a charm to smooth the rough edges. However, the THREE.EffectComposer utility doesn't automatically handle different pixel densities, and by default, the aliasing actually became twice as bad on the Retina screen of my Mac. After some fiddling, I found that you simply have to adjust the uniforms for the shader effect if it depends on knowing your screen size, as well as set the screen size for the EffectComposer object.

See below, where I detect the pixel density, and use that to multiply your screen dimensions in the shader and EffectComposer:
var composer, dpr, effectFXAA, renderScene;

dpr = 1;
if (window.devicePixelRatio !== undefined) {
  dpr = window.devicePixelRatio;
}

renderScene = new THREE.RenderPass(scene, camera);
effectFXAA = new THREE.ShaderPass(THREE.FXAAShader);
effectFXAA.uniforms['resolution'].value.set(1 / (window.innerWidth * dpr), 1 / (window.innerHeight * dpr));
effectFXAA.renderToScreen = true;

composer = new THREE.EffectComposer(renderer);
composer.setSize(window.innerWidth * dpr, window.innerHeight * dpr);
composer.addPass(renderScene);
composer.addPass(effectFXAA);
You'll also probably want to update these settings if the window size changes, like so:
$(window).on('resize', onWindowResize);

function onWindowResize(e) {
  effectFXAA.uniforms['resolution'].value.set(1 / (window.innerWidth * dpr), 1 / (window.innerHeight * dpr));
  composer.setSize(window.innerWidth * dpr, window.innerHeight * dpr);
}
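Since the same resolution math appears in both the setup and the resize handler, it can help to factor it into a small helper (my own convenience function, not part of THREE.js):

```javascript
// Compute the FXAA shader's resolution uniform for a given CSS pixel size
// and device pixel ratio. The shader expects the *inverse* of the physical
// pixel dimensions.
function fxaaResolution(width, height, dpr) {
  return {
    x: 1 / (width * dpr),
    y: 1 / (height * dpr)
  };
}

// e.g. on resize:
// var res = fxaaResolution(window.innerWidth, window.innerHeight, dpr);
// effectFXAA.uniforms['resolution'].value.set(res.x, res.y);
// composer.setSize(window.innerWidth * dpr, window.innerHeight * dpr);
```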

February 5, 2013

Base64 encode an image from the command line in OS X

I frequently use base64 encoding to include small images inline in my CSS. This helps me avoid loading lots of small images or managing an image sprite. Luckily, there's a super easy native command line tool in Mac OS X to do this. Use it like so:
openssl base64 -A -in your-image.png
If your image file is valid, your Terminal will generate a base64 string. You can then drop this into your CSS as a background-image or as the src of an img tag. You simply need to prepend the base64 string with the following header string:
data:image/png;base64,
In both cases, your code would look something like this:
/* CSS */
#container {
  background-image: url("data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAB4AAAAbCAYAAABr/T8RAAAB8qpOejw03OsRMxMR8RDujjC14YwEWEg/bF/6glXHxYm2JTCa4xRxoT/gW5/67s0Hu88AAAAABJRU5ErkJggg==");
}
<!-- HTML -->
<img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAB4AAAAbCAYAAABr/T8RAAAB8qpOejw03OsRMxMR8RDujjC14YwEWEg/bF/6glXHxYm2JTCa4xRxoT/gW5/67s0Hu88AAAAABJRU5ErkJggg==" />
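The two steps can also be combined into a single command that prints the complete data URI, ready to paste (using the same placeholder your-image.png as above):

```shell
# Encode the image and prepend the data URI header in one go.
printf 'data:image/png;base64,%s\n' "$(openssl base64 -A -in your-image.png)"
```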