Friday, July 22, 2005

Nokia Developer's Suite for J2ME™, Version 2.0

The key features of the Nokia Developer's Suite for J2ME™, Version 2.0 are:

Full development cycle when integrated into Borland JBuilder and Sun ONE Mobile Edition
Also works in stand-alone mode with additional external tools
Support for MIDP 1.0 and MIDP 2.0
Series 60 MIDP Concept SDK Beta 0.3, Nokia edition
Deploys on devices using IrDA, USB, and RS-232 (Windows)
FTP-upload capability, including WML deck
Supports application signing with public/private key model
Audio converter
Support for Series 30, Series 40, and Series 60 devices

Taking Pictures with MMAPI


The Mobile Media API (MMAPI) enables MIDlets to play and record audio and video data. The exact capabilities depend on the device implementation. Currently the only device that supports MMAPI is the Nokia 3650 mobile phone. This article describes how to build a simple camera application using MMAPI.

Tools for MMAPI Development

The first challenge of building an MMAPI application is finding the right tools. The J2ME Wireless Toolkit 2.0 includes support for MIDP 2.0, MMAPI 1.0, and WMA 1.1. At first glance, it looks like this is all you need — but the Nokia 3650 runs MIDP 1.0, not MIDP 2.0. To develop MIDP 1.0 applications, you'll need the 1.0.4 version of the J2ME Wireless Toolkit and an emulator that supports MMAPI.

Sun offers such an emulator, available from the MMAPI home page. While this emulator will allow you to build MMAPI applications, it does not support image capture, so it can't be used to test a camera application.

Nokia has an emulator that does support image capture, part of the Nokia Series 60 MIDP Concept SDK Beta 0.2, available from Forum Nokia, Nokia's developer web site. The file name is nS60_jme_concept_sdk_b0_2_014.zip. Linux developers should be able to use the Nokia Developer's Suite for J2ME, Version 2.0. The following installation instructions apply to Windows only.

Install the Nokia SDK in the \wtklib\devices directory (or copy the Nokia_Series_60_MIDP_Concept_SDK_Beta_0_2 directory there after installation is complete). Next time you run the J2ME Wireless Toolkit, you'll have an additional emulator available, Nokia_Series_60_MIDP_Concept_SDK_Beta_0_2. When this emulator is selected, you can build MMAPI applications and test them as well. The emulator itself looks very much like the Nokia 3650.

MMAPI Device Testing

There's no substitute for testing on real devices. In the long term, many devices will support MMAPI, even though the Nokia 3650 is the only MMAPI device available just now.

There are several ways to deploy a MIDlet on the Nokia 3650. I transfer files via infrared between my laptop and the device. You can also use Bluetooth or do an OTA installation.

Getting a Video Capture Player

The first step in taking pictures (officially called video capture) in a MIDlet is obtaining a Player from the Manager. A special locator, capture://video, indicates that pictures will be captured from the camera using a default image size.

mPlayer = Manager.createPlayer("capture://video");

If the device does not support video capture, a MediaException will be thrown. You can check ahead of time to see whether video capture is supported; if it is, the system property supports.video.capture will be true.
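Because the property's value is the string "true", the check is easy to get wrong with a raw equality test. A minimal sketch of the check in plain Java SE (with the property value passed in as a parameter so it can run off-device; on a real phone you would pass in System.getProperty("supports.video.capture")):

```java
public class CaptureCheck {
    // MMAPI reports capture support as the string "true"; comparing
    // against the constant first makes the check null-safe, since the
    // property may be absent entirely on non-MMAPI devices.
    static boolean isCaptureSupported(String propertyValue) {
        return "true".equals(propertyValue);
    }

    public static void main(String[] args) {
        // On a device: isCaptureSupported(System.getProperty("supports.video.capture"))
        System.out.println(isCaptureSupported("true"));  // prints true
        System.out.println(isCaptureSupported(null));    // prints false
    }
}
```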

The Player needs to be realized to obtain the resources that are needed to take pictures. If you haven't learned or don't remember what "realized" means, read through Mobile Media API Overview or The J2ME Mobile Media API.

mPlayer.realize();

Showing the Camera Video

The video coming from the camera can be displayed on the screen either as an Item in a Form or as part of a Canvas. A VideoControl makes this possible. To get a VideoControl, just ask the Player for it:

mVideoControl = (VideoControl)
mPlayer.getControl("VideoControl");

If you wish to show the video coming from the camera in a Canvas, initialize the VideoControl, then set the size and location of the video in the Canvas, then make the video visible. The following example (the constructor of a Canvas subclass) shows how to place the video two pixels in from the sides of the Canvas. If the incoming video cannot be placed that way, the constructor tries using the full screen. Finally, it calls setVisible() to make the camera video visible.

public CameraCanvas(SnapperMIDlet midlet,
        VideoControl videoControl) {
    int width = getWidth();
    int height = getHeight();

    mSnapperMIDlet = midlet;

    videoControl.initDisplayMode(
        VideoControl.USE_DIRECT_VIDEO, this);

    try {
        videoControl.setDisplayLocation(2, 2);
        videoControl.setDisplaySize(width - 4, height - 4);
    }
    catch (MediaException me) {
        // Fall back to full screen if the 2-pixel inset is not supported.
        try { videoControl.setDisplayFullScreen(true); }
        catch (MediaException me2) {}
    }
    videoControl.setVisible(true);
}

Showing the camera video in a Form is slightly different. Instead of calling VideoControl's initDisplayMode() method with USE_DIRECT_VIDEO, use the USE_GUI_PRIMITIVE value instead. On MIDP devices, you'll get back an Item you can place in a Form to be displayed.

Form form = new Form("Camera form");
Item item = (Item) mVideoControl.initDisplayMode(
    GUIControl.USE_GUI_PRIMITIVE, null);
form.append(item);
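The article stops short of the capture step itself. As a device-only sketch (this runs only in a MIDP/MMAPI environment such as the Nokia emulator, not on a desktop JVM), VideoControl's getSnapshot() method returns the captured picture as a byte array in a device-dependent encoding, which can then be turned into a displayable Image:

```java
// Device-only sketch: capture a frame from the running camera Player.
// getSnapshot(null) asks for the device's default image encoding.
try {
    byte[] raw = mVideoControl.getSnapshot(null);
    Image image = Image.createImage(raw, 0, raw.length);
    // ... display the image, e.g. append it to a Form
    //     or draw it on a Canvas
}
catch (MediaException me) {
    // The device may refuse a snapshot, e.g. if the Player is not started.
}
```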

Progress Report

Video Streaming: A Preliminary Report
Jenny Cheng, Jeffrey Louie Quiambao, Lily Joy Regalado

Since it has already been agreed by J2L, JSP, and our thesis adviser that the media format to be used is 3gp, our group has been researching technologies that we might possibly use, along with other related issues.

Introduction to Streaming

We have found the steps in processing a video and some related literature:

1. Acquiring a video is basically shooting a scene with a camera or gathering pre-recorded content.
We have found that MMAPI (the Mobile Media API) can control the camera, audio, and so on. It also has a method for setting the location where the video is saved; the video is fetched from this location and uploaded to the server.

2. Capturing the video includes digitizing the video into a standard format.

3. Editing the video means avoiding certain types of transitions and keeping scene changes to a minimum, which improves video encoding. In live streaming, this stage is skipped and the streaming video is encoded directly from the source.

4. Encoding the clip means encoding the source into a streaming format, targeting a network bandwidth or set of bandwidths and choosing a codec.
Compressing or encoding the video is the process of altering the file to reduce its size; the digitized file produced from recorded video is usually far too big to transmit over the Internet. Our target media format is 3gp, which most mobile phones support. Because 3gp is already a streaming format and produces smaller files than other media formats, no further encoding operations will be needed.

5. Mounting to the server means transferring all the relevant files to the streaming server.
We are looking at the Darwin server, an open source server for streaming via RTP or HTTP.

6. Delivering the clip. Once the presentation is ready, this is the stage in which the clip is made available or broadcast to different devices.

Of these six stages, we are focusing on the first five; delivering the clip from the server to different devices is not much of our concern.

Setting-up Video Streaming

The following is an option for our group:

1. Download & install Darwin Streaming Server (DSS) from http://developer.apple.com/darwin/projects/streaming/. It is an open source application.
2. Convert the current video and audio files to 3GP.
3. After the video file has been converted to 3gp, we can upload it to the Darwin Streaming Server.
4. To view it on a phone, the phone must have Internet access. Enter the address of the server, e.g. rtsp://my-server/my-file.3gp

We had to address two issues: platform and media format. We chose Linux as our platform.

The file format is the wrapper for the encoded tracks within the file, so the file format and the codecs must be coherent with each other. The format matters to the server, which must understand how to handle the data in the file and how to packetize it properly according to RTP standards. For example, Darwin needs hinted files, while other servers don't. The formats are MP4/3GP, the codecs are MPEG-4 or H.263, and the protocols are SDP/RTSP/RTP. For file-based streaming, then, the file format is important because the streaming server must understand it; for live streams there is no file, and the server must instead know how to handle the streams arriving live from the encoder. As mentioned above, we have agreed on 3GP.
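As a rough illustration of the SDP side of this stack (hand-written, not generated from a real hinted file; the payload numbers 96/97 are arbitrary dynamic types and my-file.3gp is a placeholder), a session description for a 3GP stream with H.263 video and AMR audio might look like:

```
v=0
o=- 0 0 IN IP4 127.0.0.1
s=my-file.3gp
c=IN IP4 0.0.0.0
t=0 0
m=video 0 RTP/AVP 96
a=rtpmap:96 H263-1998/90000
m=audio 0 RTP/AVP 97
a=rtpmap:97 AMR/8000
```

The server hands a description like this to the client over RTSP (via DESCRIBE) so the client knows which codecs and payload types to expect on the RTP streams.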

Outsourcing streaming services

We are looking at the differences between Live and On Demand. To stream live content in real time, we’ll need an application powerful enough to encode in real time. This is more expensive than encoding for archival purposes, wherein the media content is stored on a streaming server and is delivered on demand when an end user makes a request, and hence real time encoding is not required. Content that is streamed live is usually also recorded for archival purposes and can actually be enhanced with good editing tools for future playback.



References:
Tips For Capturing. Retrieved July 15, 2005 from http://www.clickandgovideo.ac.uk/capturing2.htm

Streaming Video: Theory, Processes and Applications. Retrieved July 13, 2005 from http://aaaprod.gsfc.nasa.gov/teas/TEASStreamingVideo.ppt

Popwire. Retrieved July 13, 2005 from http://www.pats.no/events/workshop1/presentasjoner/ARTS%20pre%20March%200319.pdf

http://www.mobiledia.com/forum/topic30163.html

http://www.via.ecp.fr/via/ml/streaming/2005-05/msg00009.html

Tuesday, July 19, 2005

Suggested outline for Research Plan

from http://www.cau.edu/acad_prog/comp_sci/comp_sci_grad.html

1. Statement of Research Question
2. Why is this important or interesting?
3. What are some of the issues that need to be addressed?
4. What is the approach to the problem? (General Approach)
5. Detail related or background work
6. Detail the work done to date
7. Detailed Research Plan
8. Research Schedule

Next Deliverable

on July 23
Submit article summaries (literature search) and research plan
(e-mail to spancho@acm.org)

Saturday, July 16, 2005

Links

TEASStreamingVideo.ppt
pats

mobiledia
via.ecp.fr

Capturing Process

from http://www.clickandgovideo.ac.uk/capturing2.htm

This capturing process, put simply, takes you through the stages necessary to deliver the film that is on your camera in a streaming format.

Definition

If the source footage is analogue, "capture" refers to the act of digitisation (conversion to a digital format) to make the video usable on a computer and, usually, the simultaneous application of compression to reduce the video to a manageable data rate for processing and storage.

If the source video is digital, "capture" typically refers to the simple transfer of video from an external device, such as a digital camcorder or tape deck, to a computer hard drive.

The term digitising is often used interchangeably with capturing.

Acquire video. Shooting a scene with a camera or gathering pre-recorded content

Capture video. Digitise the video to a standard file format, such as AVI or QuickTime.

Edit the on-demand video if necessary with your video editing software. If you are broadcasting live, however, you encode the streaming video directly from the source.

Encode clip. Encode your source into a streaming format, e.g. RealVideo, targeting a network bandwidth or set of bandwidths and choosing a codec.

Mount to server. You FTP (File Transfer Protocol) all the relevant files to the streaming server.

Deliver clip. With your presentation ready, you make your clip or broadcast available through your website with links. If you are combining video with another streaming clip, you write the code, e.g. SMIL, that assembles the pieces.

Wednesday, July 13, 2005

Server/Client Streaming Application Using JMF

from http://java.sun.com

Q: Where do I begin to develop my server/client streaming application using JMF?

You can start looking at the solutions on our website at: http://java.sun.com/products/java-media/jmf/2.1.1/solutions/index.html

There you'll find various sample programs that highlight different aspects of JMF. For example, for RTP transmission, check out AVTransmit.

Tuesday, July 12, 2005

JMF 2.1.1 - Supported Formats

I couldn't find an alternative... but here's this for now... we might need to note it:

This page lists the media formats supported in the JMF 2.1.1 FCS implementation, the RTP formats this implementation can receive and transmit, and the capture devices that it supports.

Here's just a summary:

Media types: AIFF, AVI, GSM, HotMedia, MIDI, MPEG-1 video, MPEG Layer 2 audio, QuickTime, Sun Audio, WAVE

The JMF 2.1.1 Reference Implementation can receive and transmit the following RTP formats:

Video: JPEG (420, 422, 444)
Video: H.261 (RTP payload type 31)
Video: H.263 (RTP payload type 34, Mode A only)
Video: MPEG-I

Saturday, July 09, 2005

LIVE.COM Streaming Media

from http://live.sourceforge.net/

This code forms a set of C++ libraries for multimedia streaming, using open standard protocols (RTP/RTCP and RTSP). These libraries - which can be compiled for Unix (including Linux and Mac OS X), Windows, and QNX (and other POSIX-compliant systems) - can be used to build streaming applications. The libraries are already being used to implement applications such as "liveCaster" and "playRTPMPEG" (for streaming MP3 audio using RTP/RTCP). The libraries can also be used to stream, receive, and process MPEG video, and can easily be extended to support additional (audio and/or video) codecs. They can also be used to build basic RTSP clients and servers, and have been used to add streaming support to existing media player applications, such as "mplayer". (For some specific examples of how these libraries can be used, see the test programs.)

This code is "open source", and is released under the LGPL. This allows you to use these libraries (via linking) inside closed-source products. It also allows you to make closed-source binary extensions to these libraries - for instance, to support proprietary media codecs that subclass the existing "liveMedia" class hierarchy. Nonetheless, we hope that subclass extensions of these libraries will also be developed under the LGPL, and contributed for inclusion here. We encourage developers to contribute to the development and enhancement of these libraries.

The project source files, and additional documentation, are online at http://www.live.com/liveMedia.

Based on my research...

1) Presently the JMF doesn't support 3gp content types.
2) There's a JMF MP3 plugin, meaning it supports MP3.
3) I found a library called MP4File (C++) that is used in streaming.
4) I found another library called LIVE.COM Streaming Media, again a set of C++ libraries for multimedia streaming.
5) So streaming is actually possible on the K500i. (Just saying... :D)

Currently, we decided to use 3gp and JMF.

But based on my research, it seems rather hard to combine the two. We would have to start from scratch if we go with 3gp and JMF.

What I am trying to say is, these are the possible choices:
a) If we just go with JMF and MP3, since there is a Java MP3 plugin with a manual installation, it should be easy. But I haven't studied this plugin yet, so I don't know whether it's okay.
b) If we go with C++ and 3gp instead, we still need to study the MP4File library. Again, I don't know yet whether this library is okay.
c) The last choice, and I think the best of all: C++ and MP3, because there is already a library called LIVE.COM Streaming Media, which is open source and released under the LGPL. I feel this would be a big help even though I haven't studied it much yet.

Wednesday, July 06, 2005

MP4File

The MP4File library handles an MP4 file.

The following is a list of the main features of the MP4File library:
* read the MP4 file (ISMA and 3GPP file)
* write the MP4 file (ISMA and 3GPP file)
* selection of the standard to apply according to the file extension: 3GPP for the ‘.3gp’ extension, ISMA for the ‘.mp4’ extension
* create and add the hint tracks
* add and delete atoms
* change the properties of an atom
* generate the Session Description Protocol (SDP)
* management of the cache
* management of a MP4 File Factory

The steps to use the library are the following:
* instantiate an MP4File object (MP4File_t mp4File)
* initialize the MP4 file (mp4File.init(pPathName, bUse64Bits, bUseMP4ConsistencyCheck, bToBeModified, sStandard, pTracer))
* use the object (mp4File.…())
* finish with the object (mp4File.finish())

Similar project: Catra Streaming Server

http://www.catrasoftware.it/Streaming/CatraStreamingPlatform.htm

The Catra Streaming Server can deliver stored content to any open standards RTSP/RTP clients.

Catra Streaming Server accepts the following audio codecs: AAC, GSMAMR, AMR-WB.

The video codecs accepted are: MPEG-4 and H263.

Catra Streaming Server is compliant with the following standards:
· 3GPP TS 26.234 – PSS Protocols and codecs (Release 5)
· rfc2326: Real-Time Streaming Protocol (RTSP)
· rfc2327: Session Description Protocol (SDP)
· rfc1889: RTP: A Transport Protocol for Real-Time Applications
· rfc2429: RTP Payload Format for the 1998 Version of ITU-T Rec. H.263 Video (H.263+)
· rfc3016: RTP Payload Format for MPEG-4 Audio/Visual Streams
· rfc3267: Real-Time Transport Protocol (RTP) Payload Format and File Storage Format for the Adaptive Multi-Rate (AMR) and Adaptive Multi-Rate Wideband (AMR-WB) Audio Codecs
· rfcisma

Specific technical features of Catra Streaming Server are:
· Streaming of both off-line and live requests
· Use of a configurable cache to prefetch the RTP packets.
· Use of a configurable cache for the content.
· Handling of three timeouts
o An RTSP/RTCP timeout to avoid keeping an RTSP session active when no RTSP commands or RTCP packets are received
o A PAUSE timeout to avoid keeping an RTSP session active when it remains in PAUSE for a long time.
o A session timeout to drop the streaming session after a specified period of time
· Ability to configure the interval between two RTCP sender report packets.
· Ability to distribute the RTSP, RTP and RTCP traffic on different network cards.
· Generation of log files ensuring proper sizing, rotation and trace levels through configurable parameters.

General technical features of the Catra Streaming Platform are:
· Independence of the operating system (currently tested on Linux, HP-UX, Sun Solaris, and Windows)
· C++ is the language used to implement the server
· CORBA is used for the communication between the Server and the GUI. MICO (MICO Is CORBA: a freely available and fully compliant implementation of the CORBA standard) is used by Catra Streaming Server.
· Use of the catralibraries (including the MP4File library)

About 3GPP

from http://www.3gpp.org/About/about.htm

The 3rd Generation Partnership Project (3GPP) is a collaboration agreement that was established in December 1998. The collaboration agreement brings together a number of telecommunications standards bodies which are known as “Organizational Partners”. The current Organizational Partners are ARIB, CCSA, ETSI, ATIS, TTA, and TTC.

The establishment of 3GPP was formalized in December 1998 by the signing of the “The 3rd Generation Partnership Project Agreement”.

The original scope of 3GPP was to produce globally applicable Technical Specifications and Technical Reports for a 3rd Generation Mobile System based on evolved GSM core networks and the radio access technologies that they support (i.e., Universal Terrestrial Radio Access (UTRA) both Frequency Division Duplex (FDD) and Time Division Duplex (TDD) modes). The scope was subsequently amended to include the maintenance and development of the Global System for Mobile communication (GSM) Technical Specifications and Technical Reports including evolved radio access technologies (e.g. General Packet Radio Service (GPRS) and Enhanced Data rates for GSM Evolution (EDGE)).

The discussions that led to the signing of the 3GPP Agreement were recorded in a series of slides called the “Partnership Project Description”, which describes the basic principles and ideas on which the project is based. The Partnership Project Description has not been maintained since its first creation, but the principles of operation of the project still remain valid.

In order to obtain a consolidated view of market requirements a second category of partnership was created within the project called “Market Representation Partners”.

“Observer” status is also possible within 3GPP for those telecommunication standards bodies which have the potential to become Organizational Partners but which, for various reasons, have not yet done so.

A permanent project support group called the “Mobile Competence Centre (MCC)” has been established to ensure the efficient day-to-day running of 3GPP. The MCC is based at the ETSI headquarters in Sophia Antipolis, France.

3GPP

From Wikipedia, the free encyclopedia.

The 3rd Generation Partnership Project (3GPP) is a collaboration agreement that was established in December 1998. It's a co-operation between ETSI (Europe), ARIB/TTC (Japan), CCSA (China), ATIS (North America) and TTA (South Korea).

The scope of 3GPP was to make a globally applicable third generation (3G) mobile phone system specification within the scope of the ITU's IMT-2000 project. 3GPP specifications are based on evolved GSM specifications, now generally known as the UMTS system.

Note that 3GPP should not be confused with 3GPP2, which specifies standards for another 3G technology based on IS-95 (CDMA) networks, commonly known as CDMA2000.

3gp

From Wikipedia, the free encyclopedia.

3gp is a file format used in mobile phones to store media (audio/video). It is a simpler version of the "ISO 14496-1 Media Format"; MOV (used by QuickTime) is another format that follows a similar design. This format can only carry video encoded as MPEG-4 or H.263. Audio is stored in AMR-NB or AAC-LC format.

In this format, values are stored as big-endian.
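A small, testable illustration of that byte order (plain Java SE, not phone code; the header bytes are made up): each box in a 3gp/MP4 file starts with a 4-byte big-endian size followed by a 4-byte type code, and DataInputStream happens to read big-endian by default.

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;

public class BoxHeader {
    // Parse the 8-byte header of an ISO media "box":
    // a big-endian 32-bit size, then a 4-character type code.
    static Object[] readHeader(byte[] bytes) throws IOException {
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(bytes));
        int size = in.readInt();          // readInt() is big-endian
        byte[] type = new byte[4];
        in.readFully(type);               // e.g. "ftyp"
        return new Object[] { Integer.valueOf(size), new String(type, "US-ASCII") };
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical first 8 bytes of a 3gp file: a 24-byte "ftyp" box.
        byte[] header = { 0, 0, 0, 24, 'f', 't', 'y', 'p' };
        Object[] h = readHeader(header);
        System.out.println(h[0] + " " + h[1]);   // prints: 24 ftyp
    }
}
```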

Friday, July 01, 2005

Phone Prices

As of July 1, 2005

Nokia 6680 - Php 27,800.00 (Smart Postpaid Plan 500)
Sony Ericsson V800 - $560.00 (Amazon)
Sony Ericsson P910i - Php 33,500.00 (Smart Postpaid Plan 500)
Nokia 6630 - Php 26,900.00 (Smart Postpaid Plan 500)
Samsung D500 - $430.00 (Amazon)
Sony Ericsson K700i - Php 13,000.00 (Smart Postpaid Plan 500)
Sony Ericsson S700i - Php 25,000.00 (Smart Postpaid Plan 500)
Motorola V3 RAZR - Php 21,500.00 (Smart Postpaid Plan 500)
Samsung E720 - $325-500 (Froogle)
Nokia 6230i - Php 14,500.00 (Smart Prepaid Phonekit)

Video Formats

Click here to view a comparison of the support, uses, advantages, disadvantages, bitrates, and standards of the following video formats: AVI, DV and Mini-DV, MPEG, RealMedia and RealVideo, and Flash and Shockwave.
Click here for a comparison of the variations of the MPEG formats.

Rates

GLOBE
MMS/GPRS Rates and Charges

Good news! You can enjoy the lower rate of P0.25 per kilobyte when you use your GPRS-capable phone as a modem to surf the Internet on your laptop or PDA. To get Internet browsing settings in place for your phone, contact our 24-hour Call-in Service at 730-1000 from your landline or 211 from your Handyphone.

To know more about WAP/Internet browsing charges over GPRS, check out this section

CONSOLIDATED RATE TABLE AS OF MARCH 2004
Globe Services - Rate Per Sending - Rate Per Download
MMS - P 3.00* - N/A
International MMS - P20.00 - N/A
Video MMS - P15.00 - N/A
International Video MMS - P40.00 - N/A
Colored Wallpaper - P 5.00 - P20.00

Globe Services - Rate Per Download
Polytones - P 20.00
Handygames - P30.00-regular/P50.00-premium
Colored Icons - P20.00
Traffic Cam - P5.00
Globe XTM - P1.00/send
Message True Tones - P25.00
WAP Browsing - P0.15/kb
Internet Browsing (postpaid only) - P0.25/kb
Streaming Services (postpaid only) - P0.10/kb or about P20/min

*Effective Nov. 28, 2004, sending of local Person-to-Person MMS pictures and videos will be reduced from P5.00 to P3.00. This applies to the following:

-sending of local P2P MMS photo to Globe and other network
-sending of local P2P MMS video to Globe and other network

As a privilege, this rate will be enjoyed permanently by Gentxt or Gizmo subscribers. Regular subscribers (non-Gentxt or non-Gizmo) will be able to enjoy this promotional rate until further notice!

SUN
Access charge is P0.25 per kilobyte. Applicable charges shall be applied for information and content downloads.

SMART
Access to WAP over GPRS:
1. When browsing inside the SMART WAP site - P2.50 per transaction (a transaction is counted when the requested info is retrieved; browsing the menus within the site carries no charge)
2. When browsing outside the SMART WAP site - P0.50 per kilobyte (one kilobyte is approximately a page and a half of info, depending on the content of the page)