
Tuesday, June 8, 2010

What's new in iPhone OS 4 (iOS 4)

Multitasking

Applications built using iPhone SDK 4.0 or later (and running in iPhone OS 4.0 and later) are no longer terminated when the user presses the Home button; instead, they now shift to a background execution context. For many applications, this means that the application enters a suspended state of execution shortly after entering the background. Keeping the application in memory avoids the subsequent launch cycle and allows an application to simply reactivate itself, which improves the overall user experience. And suspending the application improves overall system performance by minimizing power usage and giving more execution time to the foreground application.

Although most applications are suspended shortly after moving to the background, applications that need to continue working in the background may do so using one of the following techniques:

  • An application can request a finite amount of time to complete some important task.

  • An application can declare itself as supporting specific services that require regular background execution time.

  • An application can use local notifications to generate user alerts at designated times, whether or not the application is running.

Regardless of whether your application is suspended or continues running in the background, supporting multitasking does require some additional work on your part. Background applications can still be terminated under certain conditions (such as during low-memory conditions), and so applications must be ready to exit at any time. This means that many of the tasks you used to perform at quit time must now be performed when your application moves to the background. This requires implementing some new methods in your application delegate to respond to application state transitions.


Integration Technologies

The following sections describe the technologies you can use to enhance your application’s user experience.

Local Notifications

Local notifications complement the existing push notifications by giving applications an avenue for generating the notifications locally instead of relying on an external server. Background applications can use local notifications as a way to get a user’s attention when important events happen. For example, a navigation application running in the background can use local notifications to alert the user when it is time to make a turn. Applications can also schedule the delivery of local notifications for a future date and time and have those notifications delivered even if the application is not running.

The advantage of local notifications is that they are independent of your application. Once a notification is scheduled, the system manages the delivery of it. Your application does not even have to be running when the notification is delivered.

Event Kit

The Event Kit framework (EventKit.framework) provides an interface for accessing calendar events on a user’s device. You can use this framework to get existing events and add new events to the user’s calendar. Calendar events can include alarms that you can configure with rules for when they should be delivered. In addition to using Event Kit for creating new events, you can use the view controllers of the Event Kit UI framework (EventKitUI.framework) to present standard system interfaces for viewing and editing events.

Core Motion

The Core Motion framework (CoreMotion.framework) provides a single set of interfaces for accessing all motion-based data available on a device. The framework supports accessing both raw and processed accelerometer data using a new set of block-based interfaces. For devices with a built-in gyroscope, you can retrieve the raw gyro data as well as processed data reflecting the attitude and rotation rates of the device. You can use both the accelerometer and gyro-based data for games or other applications that use motion as input or as a way to enhance the overall user experience.

Data Protection

Applications that work with sensitive user data can now take advantage of the built-in encryption available on some devices to protect that data. When your application designates a particular file as protected, the system stores that file on-disk in an encrypted format. While the device is locked, the contents of the file are inaccessible to both your application and to any potential intruders. However, when the device is unlocked by the user, a decryption key is created to allow your application to access the file.

Implementing data protection requires you to be considerate in how you create and manage the data you want to protect. Applications must themselves be designed to secure the data at creation time and to be prepared for changes in access to that data when the user locks and unlocks the device.


Core Telephony

The Core Telephony framework (CoreTelephony.framework) provides interfaces for interacting with phone-based information on devices that have a cellular radio. Applications can use this framework to get information about a user’s cellular service provider. Applications interested in cellular call events can also be notified when those events occur.

iAd

You can use iAd (iAd.framework) to deliver banner-based advertisements from your application. Advertisements are incorporated into standard views that you integrate into your user interface and present when you want. The views themselves work with Apple’s ad service to automatically handle all the work associated with loading and presenting the ad content and responding to taps in those ads. 


Graphics and Multimedia

High-Resolution Screen Support

For the most part, applications running on devices with high-resolution screens should work with little or no modifications. The coordinate values you specify during drawing or when manipulating views are all mapped to a logical coordinate system, which is decoupled from the underlying screen resolution. Any content you draw is automatically scaled as needed to support high-resolution screens. For vector-based drawing code, the system frameworks automatically use any extra pixels to improve the crispness of your content. And if you use images in your application, UIKit provides support for loading high-resolution variants of your existing images automatically.

Quick Look Framework

The Quick Look framework (QuickLook.framework) provides a direct interface for previewing the contents of files your application does not support directly. This framework is intended primarily for applications that download files from the network or that otherwise work with files from unknown sources. After obtaining the file, you use the view controller provided by this framework to display the contents of that file directly in your user interface.

AV Foundation

The AV Foundation framework (AVFoundation.framework) is for applications that need to go beyond the music and movie playback features found in the Media Player framework. Originally introduced in iPhone OS 3.0, this framework has been expanded in iPhone OS 4.0 to include significant new capabilities, substantially broadening its usage beyond basic audio playback and recording capabilities. Specifically, this framework now includes support for the following features:

  • Media asset management

  • Media editing

  • Movie capture

  • Movie playback

  • Track management

  • Metadata management for media items

  • Stereophonic panning

  • Precise synchronization between sounds

  • An Objective-C interface for determining details about sound files, such as the data format, sample rate, and number of channels

The AV Foundation framework is a single source for recording and playing back audio and video in iPhone OS. This framework also provides much more sophisticated support for handling and managing media items.

Assets Library

The Assets Library framework (AssetsLibrary.framework) provides a query-based interface for retrieving a user’s photos and videos. Using this framework, you can access the same assets that are nominally managed by the Photos application, including items in the user’s saved photos album and any photos and videos that were imported onto the device. You can also save new photos and videos back to the user’s saved photos album.

Image I/O

The Image I/O framework (ImageIO.framework) provides interfaces for importing and exporting image data and image metadata. This framework is built on top of the Core Graphics data types and functions and supports all of the standard image types available in iPhone OS.

Core Media

The Core Media framework (CoreMedia.framework) provides the low-level media types used by AV Foundation. Most applications should never need to use this framework, but it is provided for those few developers who need more precise control over the creation and presentation of audio and video content.

Core Video

The Core Video framework (CoreVideo.framework) provides buffer and buffer pool support for Core Media. Most applications should never need to use this framework directly.


Core Services

Block Objects

Block objects are a C-level language construct that you can incorporate into your C, C++, and Objective-C code. A block object is a mechanism for creating an ad hoc function body, something which in other languages is sometimes called a closure or lambda. You use block objects in places where you need to create a reusable segment of code but where creating a function or method might be too heavyweight or inflexible.

In iPhone OS, blocks are commonly used in the following scenarios:

  • As a replacement for delegates and delegate methods

  • As a replacement for callback functions

  • To implement completion handlers for one-time operations

  • To facilitate performing a task on all the items in a collection

  • Together with dispatch queues, to perform asynchronous tasks

Grand Central Dispatch

Grand Central Dispatch (GCD) is a BSD-level technology that you use to manage the execution of tasks in your application. GCD combines an asynchronous programming model with a highly optimized core to provide a convenient (and more efficient) alternative to threading. GCD also provides convenient alternatives for many types of low-level tasks, such as reading and writing file descriptors, implementing timers, monitoring signals and process events, and more.

Accelerate Framework

The Accelerate framework (Accelerate.framework) contains interfaces for performing math, big-number, and DSP calculations, among others. The advantage of using this framework over writing your own versions of these libraries is that it is optimized for the different hardware configurations present in iPhone OS–based devices. Therefore, you can write your code once and be assured that it runs efficiently on all devices.


Framework Enhancements

UIKit Framework Enhancements

The UIKit framework includes the following enhancements:

  • The UIApplication class and UIApplicationDelegate protocol include new methods for scheduling local notifications and for supporting multitasking.

  • Drawing to a graphics context in UIKit is now thread-safe. Specifically:

    • The routines used to access and manipulate the graphics context can now correctly handle contexts residing on different threads.

    • String and image drawing is now thread-safe.

    • Using color and font objects in multiple threads is now safe to do.

  • The UIImagePickerController class includes methods for programmatically starting and stopping video capture. It also includes options for selecting which camera you want to use on a device and for enabling a built-in flash.

  • The UILocalNotification class supports the configuration of local notifications; see “Local Notifications.”

  • The UIView class includes new block-based methods for implementing animations.

  • The UIWindow class has a new rootViewController property that you can use to change the contents of the window.

  • Media applications can now receive events related to the controls on an attached set of headphones. You can use these events to control the playback of media-related items.

  • Several new accessibility interfaces help you make some UI elements more accessible and allow you to customize your application experience specifically for VoiceOver users:

    • The UIAccessibilityAction protocol makes it easy for VoiceOver users to adjust the value of UI elements, such as pickers and sliders.

    • The UIPickerViewAccessibilityDelegate protocol enables access to the individual components of a picker.

    • The UIAccessibilityFocus protocol allows you to find out when VoiceOver is focused on an element, so you can help users avoid making unnecessary taps.

    • The UIAccessibilityTraitStartsMediaSession trait allows you to prevent VoiceOver from speaking during a media session that should not be interrupted.

    • New interfaces in the UIAccessibility protocol allow you to specify the language in which labels and hints are spoken, and to provide announcements that describe events that don't update the application UI in a way that would be perceptible to VoiceOver users.

  • The UINib class provides a way to instantiate multiple sets of objects efficiently from the same nib file.

Foundation Framework Enhancements

The Foundation framework includes the following enhancements:

  • Most delegate methods are now declared in formal protocols instead of as categories on NSObject.

  • Block-based variants are now available for many types of operations.

  • There is new support for creating and formatting date information in NSDate and NSDateFormatter.

  • The NSDateComponents class added support for specifying time zone and quarter information.

  • There is support for regular-expression matching using the NSRegularExpression, NSDataDetector, and NSTextCheckingResult classes.

  • The NSBlockOperation class allows you to add blocks to operation queues.

  • You can use the NSFileManager class to mark files as protected; see “Data Protection.”

  • The NSFileWrapper class allows you to work with package-based document types.

  • The NSOrthography class describes the linguistic content of a piece of text.

  • The NSCache class provides support for storing and managing temporary data.

  • The URL-related classes have been updated so that you can now pipeline URL requests and set request priorities.

OpenGL ES Enhancements

The OpenGL ES framework includes the following enhancements:

Game Kit Enhancements

The Game Kit framework includes a beta implementation of a centralized service called Game Center. This service provides game developers with a standard way to implement the following features:

  • Aliases allow users to create their own online persona. Users log in to Game Center and interact with other players anonymously through their alias. Players can set status messages as well as mark specific people as their friends.

  • Leader boards allow your application to post scores to Game Center and retrieve them later.

  • Matchmaking allows players to connect with other players with Game Center accounts.

Important: Game Center is available to developers only in iPhone OS 4.0. It is introduced as a developer-only feature so that you can provide feedback as you implement and test Game Center features in your applications. However, Game Center is not a user feature in iPhone OS 4.0, and you should not deploy applications that use it to the App Store.

Core Location Enhancements

The Core Location framework now supports the following features:

  • A location monitoring service that tracks significant changes using only cellular information. This solution offers a lower-power alternative for determining the user’s location.

  • The ability to define arbitrary regions and detect boundary crossings into or out of those regions. This feature can be used for proximity detection regardless of whether the application is running.

Map Kit Enhancements

The Map Kit framework includes the following enhancements:

  • Support for draggable map annotations

  • Support for map overlays

Draggable map annotations make it much easier to reposition those annotations after they have been added to a map. The Map Kit framework handles most of the touch events associated with initiating, tracking, and ending a drag operation. However, the annotation view must work in conjunction with the map view delegate to ensure that dragging of the annotation view is supported.

Map overlays provide a way to create more complex types of annotations. Instead of being pinned to a single point, an overlay can represent a path or shape that spans a wider region. You can use overlays to layer information such as bus routes, election maps, park boundaries, and weather maps on top of the map.

Message UI Enhancements

The Message UI framework includes a new MFMessageComposeViewController class for composing SMS messages. This class manages a standard system interface for composing and sending SMS messages. In contrast with sending SMS messages using a specially formatted URL, this class allows you to create and send the message entirely from within your application.

Core Graphics Enhancements

The Core Graphics framework includes the following enhancements:

  • The ability to embed metadata into PDF files using the CGPDFContextAddDocumentMetadata function

  • Support for creating color spaces using an ICC profile

  • Graphics context support for font smoothing and fine-grained pixel manipulation

ICU Enhancements

The International Components for Unicode (ICU) libraries were updated to version 4.4. ICU is an open-source project for Unicode support and software internationalization. The installed version of ICU includes only a subset of the header files that are part of the broader ICU library. Specifically, iPhone OS includes only the headers used to support regular expressions. 


Inherited Improvements

Although iPhone OS 3.2 does not run on iPhone and iPod touch devices, many of the features introduced in that version of the operating system are also supported in iPhone OS 4.0. Specifically, iPhone OS 4.0 supports:

  • Custom input views

  • Connecting external displays

  • File-sharing support

  • Gesture recognizers

  • Core Text for text layout and rendering

  • Text input through integration with the keyboard

  • Custom fonts

  • ICU Regular Expressions

  • Document types

  • PDF generation

  • Xcode Tools changes

  • UIKit framework changes

  • Media Player framework changes

  • Core Animation changes

  • Foundation framework changes





Friday, May 21, 2010

Android 2.2 Platform Highlights


The Android 2.2 platform introduces many new and exciting features for users and developers. This document provides a glimpse at some of the new user features and technologies in Android 2.2. For more information about the new developer APIs, see the Android 2.2 version notes.


New User Features


Home


  • A new Home screen tips widget shows new users how to configure the home screen with shortcuts and widgets and how to make use of multiple home screens.

  • The Phone, applications Launcher, and Browser now have dedicated shortcuts on the Home screen, making it easy to access them from any of the 5 home screen panels.


Exchange support


  • Improved security with the addition of numeric PIN or alphanumeric password options to unlock the device. Exchange administrators can enforce password policy across devices.

  • Remote wipe: Exchange administrators can remotely reset the device to factory defaults to secure data in case the device is lost or stolen.

  • Exchange calendars are now supported in the Calendar application.

  • Auto-discovery: you just need to know your user name and password to easily set up and sync an Exchange account (available for Exchange 2007 and higher).

  • Global Address List look-up is now available in the Email application, enabling users to auto-complete recipient names from the directory.


Camera and Gallery


  • Gallery allows you to peek into picture stacks using a zoom gesture.

  • Camera onscreen buttons provide easy access to a new UI for controlling zoom, flash, white balance, geo-tagging, focus and exposure. Camcorder also provides an easy way to set video size/quality for MMS and YouTube.

  • With the LED flash now enabled for the Camcorder, videos can be shot at night or in low-light settings.


Portable hotspot


  • Certain devices like the Nexus One can be turned into a portable Wi-Fi hotspot that can be shared with up to 8 devices.

  • You can use your Android-powered phone as a 3G connection for a Windows or Linux laptop by connecting the phone to the computer with a USB cable. The connection is then shared between the two devices.



Multiple keyboard languages


Multi-lingual users can add multiple languages to the keyboard and switch between multiple Latin-based input languages by swiping across the space bar. This changes the keys as well as the auto-suggest dictionary.



Improved performance


  • Browser performance has been enhanced with the V8 engine, which enables faster loading of JavaScript-heavy pages.

  • Dalvik performance boost: 2x-5x performance speedup for CPU-heavy code over Android 2.1 with the Dalvik JIT compiler. Benchmark tests show the speedup from Android 2.1 to Android 2.2; LinPack, for example, is now more than 5 times faster.

  • Kernel memory management boost: improved memory reclaim by up to 20x, which results in faster app switching and smoother performance on memory-constrained devices.


New Platform Technologies


Media framework


  • New media framework (Stagefright) that supports local file playback and HTTP progressive streaming
  • Continued support for OpenCore in Android 2.2



Bluetooth

  • Voice dialing over Bluetooth

  • Ability to share contacts with other phones

  • Support for Bluetooth enabled car and desk docks

  • Improved compatibility matrix with car kits and headsets


2.6.32 kernel upgrade

  • HIGHMEM support for RAM >256MB

  • SDIO scheduling and BT improvements



New Developer Services



Android Cloud to Device Messaging


Apps can utilize Android Cloud to Device Messaging to enable mobile alert, send to phone, and two-way push sync functionality.
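
To give a rough idea of the client side, here is a sketch of the registration step based on the C2DM documentation; the sender account is a placeholder, and the manifest permissions and the broadcast receiver that picks up the registration ID are not shown:

  import android.app.PendingIntent;
  import android.content.Context;
  import android.content.Intent;

  public class C2dmRegistration {
      // Sketch only: the sender account is a placeholder, and the manifest
      // permissions and the REGISTRATION broadcast receiver are not shown.
      public static void register(Context context) {
          Intent intent = new Intent("com.google.android.c2dm.intent.REGISTER");
          // Identify this application to the C2DM servers.
          intent.putExtra("app", PendingIntent.getBroadcast(context, 0, new Intent(), 0));
          // Role account that your server uses to send messages (placeholder).
          intent.putExtra("sender", "your-sender-account@gmail.com");
          context.startService(intent);
          // The registration ID comes back asynchronously via a
          // com.google.android.c2dm.intent.REGISTRATION broadcast.
      }
  }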

Android Application Error Reports

New bug reporting feature for Android Market apps enables developers to receive crash and freeze reports from their users. The reports will be available when they log into their publisher account.


New Developer APIs


Apps on external storage

Applications can now request installation on the shared external storage (such as an SD card).

Media framework

Provides new APIs for audio focus, routing audio to SCO, and auto-scan of files to media database. Also provides APIs to let applications detect completion of sound loading and auto-pause and auto-resume audio playback.
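
For example, a playback app on 2.2 can request the audio focus before it starts playing and react when another app takes it away. A minimal sketch (the class and method names are my own):

  import android.content.Context;
  import android.media.AudioManager;

  public class AudioFocusHelper {
      // Called when another application takes or returns the audio focus.
      private final AudioManager.OnAudioFocusChangeListener listener =
              new AudioManager.OnAudioFocusChangeListener() {
                  public void onAudioFocusChange(int focusChange) {
                      if (focusChange == AudioManager.AUDIOFOCUS_LOSS) {
                          // Pause or stop your MediaPlayer here.
                      } else if (focusChange == AudioManager.AUDIOFOCUS_GAIN) {
                          // Resume playback here.
                      }
                  }
              };

      // Returns true if we now hold the focus and may start playing.
      public boolean requestFocus(Context context) {
          AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
          int result = am.requestAudioFocus(listener,
                  AudioManager.STREAM_MUSIC, AudioManager.AUDIOFOCUS_GAIN);
          return result == AudioManager.AUDIOFOCUS_REQUEST_GRANTED;
      }
  }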


Camera and Camcorder

New preview API doubles the frame rate from ~10FPS to ~20FPS. Camera now supports portrait orientation, zoom controls, access to exposure data, and a thumbnail utility. A new camcorder profile enables apps to determine device hardware capabilities.
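
For instance, instead of guessing at resolutions, a recording app can pull the device's high-quality profile and hand it to MediaRecorder. A small sketch (prepare() and start() would follow, and the output path is up to you):

  import android.media.CamcorderProfile;
  import android.media.MediaRecorder;

  public class RecorderSetup {
      // Configure a MediaRecorder from whatever the hardware reports as its
      // high-quality camcorder profile.
      public static void configure(MediaRecorder recorder, String outputPath) {
          recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
          recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
          CamcorderProfile profile = CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH);
          // Applies output format, codecs, frame rate and resolution in one call.
          recorder.setProfile(profile);
          recorder.setOutputFile(outputPath);
      }
  }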

Graphics

New APIs for OpenGL ES 2.0, working with YUV image format, and ETC1 for texture compression.
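
A quick sketch of how an app might take advantage of these on 2.2 (the renderer implementation is omitted):

  import android.opengl.ETC1Util;
  import android.opengl.GLSurfaceView;

  public class GlesSetup {
      // True when the GPU accepts ETC1-compressed textures.
      public static boolean canUseEtc1() {
          return ETC1Util.isETC1Supported();
      }

      // Request an OpenGL ES 2.0 context before installing the renderer.
      public static void useGles2(GLSurfaceView view, GLSurfaceView.Renderer renderer) {
          view.setEGLContextClientVersion(2);
          view.setRenderer(renderer);
      }
  }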

Data backup

Apps can participate in data backup and restore, to ensure that users maintain their data after performing a factory reset or when switching devices.
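
Backing up a SharedPreferences file, for example, can be as small as the sketch below; the preferences file name "settings" is a placeholder, and the agent still has to be declared in the manifest:

  import android.app.backup.BackupAgentHelper;
  import android.app.backup.SharedPreferencesBackupHelper;

  public class SettingsBackupAgent extends BackupAgentHelper {
      @Override
      public void onCreate() {
          // Back up the "settings" preferences file under the key "prefs".
          addHelper("prefs", new SharedPreferencesBackupHelper(this, "settings"));
      }
  }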

Device policy manager

New device policy management APIs allow developers to write "device administrator" applications that can control security features on the device, such as the minimum password strength, data wipe, and so on. Users can select the administrators that are enabled on their devices.
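
A minimal sketch of what such an administrator application might do, assuming the user has already activated its DeviceAdminReceiver (passed in as a ComponentName):

  import android.app.admin.DevicePolicyManager;
  import android.content.ComponentName;
  import android.content.Context;

  public class PolicyEnforcer {
      // Require passwords of at least 8 characters once our admin is active.
      public static void enforceMinimumPassword(Context context, ComponentName admin) {
          DevicePolicyManager dpm = (DevicePolicyManager)
                  context.getSystemService(Context.DEVICE_POLICY_SERVICE);
          if (dpm.isAdminActive(admin)) {
              dpm.setPasswordMinimumLength(admin, 8);
          }
      }
  }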

UI framework

New "car mode" and "night mode" controls and configurations allow applications to adjust their UI for these situations. A scale gesture detector API provides improved definition of multi-touch events. Applications can now customize the bottom strip of a TabWidget.
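
As a rough sketch, the new UiModeManager service lets an app switch these modes; note that the night setting only takes effect while a car or desk mode is active:

  import android.app.UiModeManager;
  import android.content.Context;

  public class UiModeExample {
      // Put the device into car mode and ask for the night color scheme.
      public static void carModeAtNight(Context context) {
          UiModeManager ui = (UiModeManager)
                  context.getSystemService(Context.UI_MODE_SERVICE);
          ui.enableCarMode(0);
          ui.setNightMode(UiModeManager.MODE_NIGHT_YES);
      }
  }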

For more information about the new developer APIs, see the Android 2.2 version notes and the API Differences Report.







Wednesday, May 12, 2010

How to play video and audio on Android

There is more than one way to play media files on an Android phone; let me show you two of them.

Audio:
MediaPlayer is the easier way if you just want to play an audio file in the background somewhere in an application. There are no UI controls here, but of course you can call MediaPlayer's stop(), start(), seekTo(), etc. Just bind the needed calls to a button, gesture, or event. As you can see, the setup also throws a lot of exceptions, which you need to catch.


  // Requires imports for android.media.MediaPlayer and java.io.IOException.
  public void audioPlayer(String path, String fileName) {
      // Set up the MediaPlayer and point it at the file.
      MediaPlayer mp = new MediaPlayer();
      try {
          mp.setDataSource(path + "/" + fileName);
      } catch (IllegalArgumentException e) {
          e.printStackTrace();
      } catch (IllegalStateException e) {
          e.printStackTrace();
      } catch (IOException e) {
          e.printStackTrace();
      }
      try {
          // Prepare the player synchronously; use prepareAsync() for streams.
          mp.prepare();
      } catch (IllegalStateException e) {
          e.printStackTrace();
      } catch (IOException e) {
          e.printStackTrace();
      }
      mp.start();
  }
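
For example, to play a file from the SD card (the file name here is just a placeholder):

  audioPlayer(Environment.getExternalStorageDirectory().toString(), "song.mp3");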

Video:
In this example, I open a video file and decide whether I want it to autoplay after it is loaded.
You probably want to store your video files on the SD card. You can get the path to the SD card via Environment.getExternalStorageDirectory().


  // Requires imports for android.graphics.PixelFormat, android.net.Uri,
  // android.widget.MediaController, and android.widget.VideoView; assumes
  // this method lives in an Activity.
  public void videoPlayer(String path, String fileName, boolean autoplay) {
      // Get current window information and set the format; set it up
      // differently if you need some special effects.
      getWindow().setFormat(PixelFormat.TRANSLUCENT);
      // The VideoView will hold the video.
      VideoView videoHolder = new VideoView(this);
      // MediaController is the UI control hovering above the video
      // (just like in the default YouTube player).
      videoHolder.setMediaController(new MediaController(this));
      // Assign a video file to the video holder.
      videoHolder.setVideoURI(Uri.parse(path + "/" + fileName));
      // Show the VideoView in the activity and give it focus before playing.
      setContentView(videoHolder);
      videoHolder.requestFocus();
      if (autoplay) {
          videoHolder.start();
      }
  }
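
Calling it is just as simple; for example, to autoplay a clip from the SD card (again, the file name is a placeholder):

  videoPlayer(Environment.getExternalStorageDirectory().toString(), "myclip.mp4", true);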





Friday, May 7, 2010

Android vs. iPhone : Developer's view

There are a variety of areas to consider regarding the relative restrictiveness of the two platforms.

The Market

Android Market is undoubtedly less restrictive than the App Store when it comes to the submission process. The upside is that you can get pretty much anything you want into the Android Market. The downside is that you can get pretty much anything you want into the Android Market (a market flooded with spam "applications" is in some ways a restriction).

A big negative on the iPhone side is the fact that your options for installing applications that are not in the App Store are limited -- you can either distribute the application as a beta (limited to 100 users) or jailbreak your iPhone. Android, however, allows you to install apps from anywhere, including a web page.

The Applications

One of the core design philosophies of the Android platform is "All Applications are Created Equal", which is supposed to mean that you can freely replace applications on the phone with a third-party version. In practice this is not really the case, as many of the Google applications either have special capabilities not available to most applications (see: Android Market) or use undocumented/native APIs (see: Calendar). In addition, you are out of luck if you want to run slightly modified versions of the built-in applications like the calendar because of code signing issues.

The iPhone, on the other hand, makes no such claims about equality, and the Apple stance in general is "There is only one way to do it, and that is the Apple way." New for iPhone OS 4.0 is the ability for apps to run in the background. One thing iPhone applications can currently do that Android applications cannot is receive push notifications.

The Source

Android is open source - mostly (some firmware components are closed source). Even so, there is some rocket science involved just in getting the Android codebase to compile. In addition Google has sent cease & desist orders for redistributing custom images that include the Android Market and Google Maps application.

The iPhone is completely closed source, and recent changes to the developer agreement have been controversial as they mandate that all apps submitted to the app store be originally written in "Objective-C, C, C++, or JavaScript as executed by the iPhone OS WebKit engine, and only code written in C, C++, and Objective-C may compile and directly link against the Documented APIs"

The SDK

The Android SDK can be freely downloaded; the iPhone SDK requires free user registration to download. Android development can be done under Mac OS X, Windows or Linux, while iPhone development is only possible under Mac OS X. You'll also need to pay for the $99 iPhone developer account if you want to test your software on an actual device (rather than the simulator).

The Userbase

And last but certainly not least, the userbase. When this answer was originally written the iPhone had a much larger userbase and was growing much faster than Android. This is changing as Android begins to support multiple carriers and hardware platforms (see: the Open Handset Alliance). The list of devices running Android is now quite long, although none yet match the popularity of the iPhone. It is safe to say that the iPhone will still be the dominant smartphone platform for at least the next year or two.