Audio in HTML5

I have been looking into HTML5 and its capabilities for playing audio and video content on the web without requiring any browser plug-in. In this post I will explain how one can embed an audio file into a web page and add controls to perform various operations on it.

Embedding an audio file in a web page is as easy as this:
 <audio src="testFile.mp3">  
 </audio>  

Attributes that make life easier:

The audio tag has a few attributes, and one of them is the autoplay attribute. This is a boolean attribute: when specified, the audio file will start playing automatically whenever a user visits the web page. One can just specify the attribute and provide no value for it; the mere presence of the attribute implies that its value is true:
 <audio autoplay src="testFile.mp3">  
 </audio>  
Specifying autoplay="true" or autoplay="false" wouldn't make any difference at all; it is the same as writing just autoplay, since the attribute's presence alone enables it. IMO, one would not like to hear an audio file every time he/she visits the web page.

The other attribute is loop. As its name suggests, once the audio file has finished playing it will start playing again from the beginning of the file.
 <audio autoplay loop src="testFile.mp3">  
 </audio>  
The next attribute is controls. This attribute gives the user control over the playback of the audio file mentioned in the src attribute. The user can play, pause and increase/decrease the volume using the native controls provided by the browser.
 <audio controls src="testFile.mp3">  
 </audio>  
The audio element also provides an API that gives control over playback using JavaScript. It provides the methods play and pause to control the playback, and a volume property to increase/decrease the volume. In the example below I'm placing four buttons that can be used to control the playback:
 <audio id="myPlayer" src="testFile.mp3">  
 </audio>  
 <button onclick="document.getElementById('myPlayer').play()"> Play </button>  
 <button onclick="document.getElementById('myPlayer').pause()"> Pause </button>  
 <button onclick="document.getElementById('myPlayer').volume +=0.1"> Volume Up </button>  
 <button onclick="document.getElementById('myPlayer').volume -=0.1"> Volume Down </button>  
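One thing to keep in mind with the buttons above is that the volume property only accepts values between 0 and 1, and browsers throw an IndexSizeError if you assign a value outside that range. Below is a minimal sketch of a clamping helper (clampVolume is a hypothetical name, not part of the Audio API) that keeps the new value within bounds:

```javascript
// The volume property must stay within [0, 1]; assigning a value
// outside that range throws an IndexSizeError in browsers.
// This helper clamps the adjusted value before it is assigned.
function clampVolume(current, delta) {
  return Math.min(1, Math.max(0, current + delta));
}

// Usage in a page (assumes the <audio id="myPlayer"> element above):
// var player = document.getElementById('myPlayer');
// player.volume = clampVolume(player.volume, 0.1);
```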
There is also an attribute by the name preload that lets the browser start buffering the audio file when the page is loaded. Unlike the other attributes, the preload attribute takes three values: none, auto and metadata. 'none' tells the browser not to preload the audio, 'auto' tells the browser to preload the entire audio file, and 'metadata' tells the browser to load only the metadata, such as the file's duration.
 <audio controls preload="auto" src="testFile.mp3">  
 </audio>  
If there are many audio tags with preload set to auto, then bandwidth consumption could suffer from immoderate preloading.

Formats:

Although most of us use the mp3 audio format, one should note that some browsers may support this format while others may not. For example, the Safari browser supports the mp3 format but Firefox does not. The reason behind this is that the mp3 format is patented, but there are other formats at your disposal which are open to use. One such format is the Ogg format, which is supported in Firefox; however, Safari doesn't support it. Fortunately, there is a way to use the audio element without having to choose between the file formats. One can specify multiple formats using source elements:
 <audio controls>  
 <source src="testFile.ogg">  
 <source src="testFile.mp3">  
 </audio>  
A browser that can play the Ogg format will use the first source element and skip the ones mentioned after it. If the browser is not capable of playing the format mentioned in the first source element, it will check whether the format mentioned in the next source element can be played. One can also specify the MIME type for the source elements:
 <audio controls>  
 <source src="testFile.ogg" type="audio/ogg">  
 <source src="testFile.mp3" type="audio/mpeg">  
 </audio>  
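The same fallback decision can also be made from JavaScript using the canPlayType method, which returns "probably", "maybe" or an empty string for a given MIME type. Below is a sketch with a hypothetical helper, pickPlayableSource; the support check is passed in as a function so the selection logic can run outside a browser, but in a real page you would pass in the audio element's own canPlayType:

```javascript
// canPlayType() returns "probably", "maybe" or "" (empty string).
// This helper returns the src of the first candidate whose MIME
// type the player claims it can play, or null if none match.
function pickPlayableSource(candidates, canPlayType) {
  for (var i = 0; i < candidates.length; i++) {
    if (canPlayType(candidates[i].type) !== '') {
      return candidates[i].src;
    }
  }
  return null; // no playable format found
}

// In a browser:
// var audio = document.createElement('audio');
// var src = pickPlayableSource(
//   [{ src: 'testFile.ogg', type: 'audio/ogg' },
//    { src: 'testFile.mp3', type: 'audio/mpeg' }],
//   audio.canPlayType.bind(audio)
// );
```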