JavaScript Mobile Application Development

Developing the Sound Recorder application

After generating the initial application code, it's time to understand what to do next.

Sound Recorder functionality

The following screenshot shows our Sound Recorder page:

When the user clicks on the Record Sound button, they can start recording their voice; they can stop the recording by clicking on the Stop Recording button. You can see this in the following screenshot:

As shown in the following screenshot, when the user clicks on the Playback button, the recorded voice will be played back:

Sound Recorder preparation

In order to implement this functionality using Apache Cordova, we need to add the following plugins using the indicated commands, which should be executed from the application directory:

  • media: This plugin is used to record and play back sound files:
     > cordova plugin add https://git-wip-us.apache.org/repos/asf/cordova-plugin-media.git
    
  • device: This plugin is required to access the device information:
     > cordova plugin add https://git-wip-us.apache.org/repos/asf/cordova-plugin-device.git
    
  • file: This plugin is used to access the device's filesystem:
     > cordova plugin add https://git-wip-us.apache.org/repos/asf/cordova-plugin-file.git
    

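To confirm that the plugins have been added correctly, you can list the plugins currently installed in the project using the Cordova CLI (the exact output format depends on your CLI version):

> cordova plugin ls
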
In order to apply these plugins to our Apache Cordova project, we need to run the cordova build command again from the project directory, as follows:

> cordova build
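
If you prefer, you can also build a single platform at a time; for example, assuming the Android platform has already been added to the project, the following command builds only the Android application:

> cordova build android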

Sound Recorder details

Now we are done with the preparation of our Sound Recorder application. Before moving to the code details, let's see the hierarchy of our Sound Recorder application, as shown in the following screenshot:

The application's www directory contains the following directories:

  • css: This directory contains the custom application CSS file(s)
  • img: This directory contains the custom application image file(s)
  • js: This directory contains the custom application JavaScript code
  • jqueryMobile: This newly added directory contains the jQuery Mobile framework files

Finally, the index.html file contains the application's single page whose functionality was illustrated earlier in this section.

Tip

It is important to note that Apache Cordova does not require you to use a JavaScript mobile User Interface (UI) framework. However, it is recommended that you use a JavaScript mobile UI framework in addition to Apache Cordova. This is in order to facilitate building the application UI and speed up the application development process. The jQuery Mobile framework is one of the best mobile UI frameworks, and as such will be used in all the Apache Cordova applications developed in this book.

Let's see the details of the index.html page of our Sound Recorder application. The following code snippet shows the included files in the page:

<link rel="stylesheet" type="text/css" href="css/app.css" />
<link rel="stylesheet" href="jqueryMobile/jquery.mobile-1.4.0.min.css">
<script src="jqueryMobile/jquery-1.10.2.min.js"></script>
<script src="jqueryMobile/jquery.mobile-1.4.0.min.js"></script>
...
<script type="text/javascript" src="cordova.js"></script>
<script type="text/javascript" src="js/app.js"></script>

In the preceding code, the following files are included:

  • app.css: This is the custom style file of our Sound Recorder application
  • The files required by the jQuery Mobile framework, which are:
    • jquery.mobile-1.4.0.min.css
    • jquery-1.10.2.min.js
    • jquery.mobile-1.4.0.min.js
  • cordova.js: This is the Apache Cordova JavaScript API's file
  • app.js: This is the custom JavaScript file of our Sound Recorder application

It is important to know that you can download the jQuery Mobile framework files from http://jquerymobile.com/download/.

The following code snippet shows the HTML content of our application's single page, whose id is "main":

<div data-role="page" id="main">
    <div data-role="header">
        <h1>Sound Recorder</h1>
    </div>
    <div data-role="content">
    <div data-role="fieldcontain">
        <h1>Welcome to the Sound Recorder Application</h1>
        <p>Click 'Record Sound' button in order to start recording. You will be able to see 
           the playback button once the sound recording finishes.<br/><br/></p>
        <input type="hidden" id="location"/>
        <div class="center-wrapper">
            <input type="button" id="recordSound" data-icon="audio" value="Record Sound" class="center-button" data-inline="true"/>
            <input type="button" id="playSound" data-icon="refresh" value="Playback" class="center-button" data-inline="true"/><br/>
        </div>
            
        <div data-role="popup" id="recordSoundDialog" data-dismissible="false" style="width:250px">
            <div data-role="header">
                <h1>Recording</h1>
            </div>
            
            <div data-role="content">
                <div class="center-wrapper">
                    <div id="soundDuration"></div>
                    <input type="button" id="stopRecordingSound" value="Stop Recording" 
                              class="center-button" data-inline="true"/>
                </div>
            </div>
        </div>
    </div>
    </div>
    
    <div data-role="footer" data-position="fixed">
        <h1>Powered by Apache Cordova</h1>
    </div>
</div>

Looking at the preceding code, our Sound Recorder page ("main") is defined by setting a div's data-role attribute to "page". The page has a header, defined by setting a div's data-role to "header", and content, defined by setting a div's data-role to "content"; the content contains the recording and playback buttons.

The content also contains a "recordSoundDialog" pop up, which is defined by setting a div's data-role to "popup". The "recordSoundDialog" pop up has a header and content. The pop-up content displays the recorded audio duration in the "soundDuration" div, and it has a "stopRecordingSound" button that stops recording the sound.

Finally, the page has a footer defined by setting a div's data-role to "footer", which contains a statement about the application.

Now, it's time to learn how we can define event handlers on page HTML elements and use the Apache Cordova API inside our defined event handlers to implement the application's functionality.

The following code snippet shows the page initialization code:

(function() {

   $(document).on("pageinit", "#main", function(e) {
          e.preventDefault();
    
          function onDeviceReady() { 
                $("#recordSound").on("tap", function(e) {
                      // Action is defined here ...
                });       

                $("#recordSoundDialog").on("popupafterclose", function(event, ui) {
                      // Action is defined here ... 
                });        

                $("#stopRecordingSound").on("tap", function(e) {
                     // Action is defined here ... 
                });

                $("#playSound").on("tap", function(e) {
                     // Action is defined here ... 
                });    
            }

          $(document).on('deviceready', onDeviceReady);
          
          initPage();
   });

    // Code is omitted here for simplicity

    function initPage() {
        $("#playSound").closest('.ui-btn').hide();  
    }
})();

In jQuery Mobile, the "pageinit" event fires once per page, during page initialization. In its handler, the page is initialized and the application's event handlers are defined. Note that all of the event handlers are registered only after the 'deviceready' event fires, which ensures that the Apache Cordova APIs are available before they are used. Event handlers are defined for the following:

  • Tapping the "recordSound" button
  • Closing the "recordSoundDialog" dialog
  • Tapping the "stopRecordingSound" button
  • Tapping the "playSound" button

In initPage(), the "playSound" button is hidden, as no sound has been recorded yet. Note that jQuery Mobile wraps an enhanced input button in a container element that has the ui-btn class; this is why the code calls hide() on the button's closest '.ui-btn' ancestor rather than on the input element itself. We can now see the details of each event handler; the next code snippet shows the "recordSound" tap event handler:

var recInterval;
$("#recordSound").on("tap", function(e) {
    e.preventDefault();

    var recordingCallback = {};
    
    recordingCallback.recordSuccess = handleRecordSuccess;
    recordingCallback.recordError = handleRecordError;
    
    startRecordingSound(recordingCallback);
    
    var recTime = 0;
    
    $("#soundDuration").html("Duration: " + recTime + " seconds");
    
    $("#recordSoundDialog").popup("open");
    
    recInterval = setInterval(function() {
                                 recTime = recTime + 1;
                                 $("#soundDuration").html("Duration: " + recTime + " seconds");
                              }, 1000);
});  

The following actions are performed in the "recordSound" tap event handler:

  1. A call to the startRecordingSound(recordingCallback) function is performed. The startRecordingSound(recordingCallback) function is a helper function that starts the sound recording process using the Apache Cordova Media API. Its recordingCallback parameter represents a JSON object, which has the recordSuccess and recordError callback attributes. The recordSuccess callback will be called if the recording operation is a success, and the recordError callback will be called if the recording operation is a failure.
  2. Then, the "recordSoundDialog" dialog is opened and its "soundDuration" div is updated every second with the duration of the recorded sound.

The following code snippet shows the startRecordingSound(recordingCallback), stopRecordingSound(), and requestApplicationDirectory(callback) functions:

var BASE_DIRECTORY = "CS_Recorder";
var recordingMedia;     

function startRecordingSound(recordingCallback) {
    var recordVoice = function(dirPath) {
        var basePath = "";

        if (dirPath) {
            basePath = dirPath + "/";
        }

        var mediaFilePath = basePath + (new Date()).getTime() + ".wav";
        
        var recordingSuccess = function() {
            recordingCallback.recordSuccess(mediaFilePath);
        };            
        recordingMedia = new Media(mediaFilePath, recordingSuccess, recordingCallback.recordError);

        // Record audio
        recordingMedia.startRecord(); 
    };
    
    if (device.platform === "Android") {
        var callback = {};
    
        callback.requestSuccess = recordVoice;
        callback.requestError = recordingCallback.recordError;

        requestApplicationDirectory(callback);
    } else {

        recordVoice();
    }
}

function stopRecordingSound() {
    recordingMedia.stopRecord();
    recordingMedia.release();
}                 

function requestApplicationDirectory(callback) {
    var directoryReady = function (dirEntry) { 
        callback.requestSuccess(dirEntry.toURL());
    };
  
    var fileSystemReady = function(fileSystem) {
        fileSystem.root.getDirectory(BASE_DIRECTORY, {create: true}, directoryReady);                    
    };
  
    window.requestFileSystem(LocalFileSystem.PERSISTENT, 0, fileSystemReady, callback.requestError);
}

The next section illustrates the preceding code snippet.

Recording and playing the audio files back

In order to record the audio files using Apache Cordova, we need to create a Media object, as follows:

recordingMedia = new Media(src, mediaSuccess, mediaError);

The Media object constructor has the following parameters:

  • src: This refers to the URI of the media file
  • mediaSuccess: This refers to the callback that will be invoked if the media operation (play/record or stop function) succeeds
  • mediaError: This refers to the callback that will be invoked if the media operation (again a play/record or stop function) fails

In order to start recording an audio file, a call to the startRecord() method of the Media object must be performed. When the recording is over, a call to the stopRecord() method of the Media object must be performed.
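
To make this lifecycle concrete, the following is a minimal recording sketch. It assumes that the media plugin is installed and that the 'deviceready' event has already fired; the file name myRecording.wav and the five-second duration are arbitrary values used purely for illustration:

// Create a Media object for a new audio file (hypothetical file name).
var clip = new Media("myRecording.wav",
    function() { console.log("Recording operation succeeded"); },         // mediaSuccess
    function(err) { console.log("Recording error code: " + err.code); }); // mediaError

// Start recording.
clip.startRecord();

// Stop recording after five seconds and free the underlying audio resources.
setTimeout(function() {
    clip.stopRecord();
    clip.release();
}, 5000);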

In startRecordingSound(recordingCallback), the function checks the current device platform using device.platform and proceeds as follows:

  • If the current platform is Android, then a call to requestApplicationDirectory(callback) is performed in order to create an application directory (if it is not already created) called "CS_Recorder" under the device's SD card root directory using the Apache Cordova File API. If the directory creation operation succeeds, recordVoice() will be called by passing the application directory path as a parameter. The recordVoice() function starts recording the sound and saves the resulting audio file under the application directory. Note that if there is no SD card in your Android device, then the application directory will be created under the app's private data directory (/data/data/[app_directory]), and the audio file will be saved under it.
  • In the else block, which handles the other supported platforms (Windows Phone 8 and iOS, which we will add using the Cordova CLI in the next chapter), recordVoice() is called without creating an application-specific directory. At the time of writing this book, on iOS and Windows Phone 8, every application has a private directory, and applications cannot use the Apache Cordova APIs to store their files anywhere other than this directory. On iOS, the audio files will be stored under the tmp directory of the application's sandbox directory (the application's private directory); on Windows Phone 8, the audio files will be stored under the application's local directory.

    Tip

    Note that using the native Windows Phone 8 API (Windows.Storage), you can read and write files on an SD card with some restrictions. However, at the time of writing, you cannot do this using Apache Cordova; hopefully, this capability will be supported by Cordova soon (http://msdn.microsoft.com/en-us/library/windows/apps/xaml/dn611857.aspx).

  • The recordVoice() function creates a Media object for a new audio file and starts recording by calling the object's startRecord() method. When stopRecord() is later called and the recording operation succeeds, the recordingSuccess callback invokes recordingCallback.recordSuccess, which in turn calls handleRecordSuccess, passing the audio file's full path (mediaFilePath) as a parameter.
  • The following code snippet shows the handleRecordSuccess function:
    function handleRecordSuccess(currentFilePath) {
       
        $("#location").val(currentFilePath);
        $("#playSound").closest('.ui-btn').show();
    }
  • The handleRecordSuccess function stores the recorded audio filepath in the "location" hidden field, which is used later by the playback button, and shows the "playSound" button.
  • In requestApplicationDirectory(callback), which is called only on Android, the following happens:
    • window.requestFileSystem is called in order to request the device filesystem before performing any file operation(s)
    • fileSystem.root.getDirectory is called, once the filesystem is ready, in order to create our custom application directory
    • When our custom application directory is created successfully (or already exists), its path is passed to recordVoice(), which was illustrated earlier
  • Moving on to the other application actions, the following code snippet shows the "stopRecordingSound" tap and "recordSoundDialog" close event handlers:
    $("#recordSoundDialog").on("popupafterclose", function(event, ui) {
        clearInterval(recInterval);
        stopRecordingSound();
    });        
    $("#stopRecordingSound").on("tap", function(e) {
        $("#recordSoundDialog").popup("close");
    });
    
    function stopRecordingSound() {
        recordingMedia.stopRecord();   
        recordingMedia.release();   
    }

In the "stopRecordingSound" tapping event handler, it closes the open "recordSoundDialog" pop up. Generally, if "recordSoundDialog" is closed by the "stopRecordingSound" button's tapping action or by pressing special device keys, such as the back button in Android devices, then the recording timer stops as a result of calling clearInterval(recInterval), and then it calls the stopRecordingSound() function to stop recording the sound.

The stopRecordingSound() function calls the Media object's stopRecord() method, and then releases it by calling the Media object's release() method. The following code snippet shows the "playSound" tap event handler:

var audioMedia;
var recordingMedia;

$("#playSound").on("tap", function(e) {
    e.preventDefault();

    var playCallback = {};
    
    playCallback.playSuccess = handlePlaySuccess;
    playCallback.playError = handlePlayError;
    
    playSound($("#location").val(), playCallback);
});    

function playSound(filePath, playCallback) {
    if (filePath) {                  
        cleanUpResources();
    
        audioMedia = new Media(filePath, playCallback.playSuccess, playCallback.playError);

        // Play audio
        audioMedia.play();
    }            
}
function cleanUpResources() {
    if (audioMedia) {
        audioMedia.stop();
        audioMedia.release();
        audioMedia = null;
    } 

    if (recordingMedia) {
        recordingMedia.stop();
        recordingMedia.release();
        recordingMedia = null;
    } 
}

In the "playSound" tap event handler, it calls the playSound(filePath, playCallback) function by passing the audio file location, which is stored in the "location" hidden field and playCallback.

The playSound(filePath, playCallback) function first releases any previously used Media objects by calling cleanUpResources(), and then uses a new Media object's play() method to play back the saved audio file. Releasing Media objects that are no longer needed is required to avoid running out of system audio resources.
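
Although this application does not use it, the Media constructor also accepts an optional fourth argument: a status callback that is invoked whenever the media state changes. The following minimal sketch (with a hypothetical file name) shows how it could be used to detect when playback stops:

var statusMedia = new Media("myRecording.wav",
    function() { console.log("Playback succeeded"); },                  // mediaSuccess
    function(err) { console.log("Playback error code: " + err.code); }, // mediaError
    function(status) {                                                   // mediaStatus
        // Media.MEDIA_STOPPED is reported when playback finishes or stop() is called.
        if (status === Media.MEDIA_STOPPED) {
            console.log("Playback has stopped");
        }
    });

statusMedia.play();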

This is all you need to know about our Sound Recorder application. In order to see the complete application's source code, you can download it from the book page or get it from GitHub at https://github.com/hazems/soundRecorder.