Android :: Difference Between StageFright And Opencore
May 24, 2010
What are the differences between StageFright and OpenCore? What does this mean to app developers? Please point me to any links or tutorials online.
View 6 Replies
May 22, 2010
I'm very sad that the new Stagefright broke my application's ability to stream AAC+ content :( Any pointers on this library? Does it still support AAC? Where can I find the Android Stagefright source code? Is anyone else having issues with the new media library?
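For anyone trying to reproduce this, a minimal app-level check is to point MediaPlayer at an HTTP AAC+ stream and see whether the new engine plays it. This is only a sketch, not a known-good test case; the URL is a placeholder, and whether it plays depends on the Stagefright build on the device.

CODE:
import android.media.MediaPlayer;
import java.io.IOException;

public class AacStreamCheck {
    // Placeholder URL; substitute a real AAC+/HE-AAC stream to test.
    private static final String STREAM_URL = "http://example.com/stream.aac";

    public static MediaPlayer play() throws IOException {
        MediaPlayer mp = new MediaPlayer();
        mp.setDataSource(STREAM_URL); // HTTP source handled by the platform media engine
        mp.prepare();                 // blocks until buffered, or throws/fails if the codec is unsupported
        mp.start();
        return mp;
    }
}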
View 12 Replies
Sep 7, 2010
Is it possible to detect the scan type (progressive or interlaced) of a video file in Stagefright?
I see the fields listed below are queryable from a file's metadata through Stagefright, but not the scan type, which I guess is available in the MP4 file header. Any hint on how to detect the scan type of a video file in the Stagefright player driver? Or is it available in OpenCore?
CODE:.............................
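For comparison, the SDK-level MediaMetadataRetriever (a Java wrapper over the native retriever, public in later releases) only exposes a fixed set of keys, and scan type is not among them, so this particular check would indeed have to happen in native code. A small sketch of what is reachable from the Java side, assuming a local file path:

CODE:
import android.media.MediaMetadataRetriever;

public class MetadataDump {
    public static void dump(String path) {
        MediaMetadataRetriever mmr = new MediaMetadataRetriever();
        try {
            mmr.setDataSource(path);
            // Public keys cover mime type, duration, width, height, etc.
            // There is no key for scan type (progressive vs. interlaced).
            System.out.println("mime:     " + mmr.extractMetadata(MediaMetadataRetriever.METADATA_KEY_MIMETYPE));
            System.out.println("duration: " + mmr.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION));
            System.out.println("width:    " + mmr.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_WIDTH));
            System.out.println("height:   " + mmr.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_HEIGHT));
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            try { mmr.release(); } catch (Exception ignored) { }
        }
    }
}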
View 3 Replies
Jun 28, 2010
I am interested in adding support, in a video player application, for some video containers that are not supported by default. Since the Flash Player plugin is able to play video on Froyo, I suppose there is some API for extending the media framework. I've heard of Stagefright but didn't find many details. Where can I find details on how the Flash Player works? Does it use a publicly available API? Public or not, how can I access it?
View 4 Replies
Aug 2, 2010
I am trying to integrate the VP8 video codec into the PV OpenCore framework in Froyo and test it at the PV test application level. I have done the following steps:
1. Made a new folder for VP8 at external/opencore/codecs_v2/omx/omx_vp8.
2. Made the appropriate modifications in the omx_baseclass, omx_common, etc.
3. Modified the PV test app (external/opencore/codecs_v2/omx/omx_testapp) for testing VP8.
4. Included the VP8 source code in Froyo.
5. Included a new OsclUuid for VP8.
With these modifications, the build is fine. The shared libraries and binaries are created properly, and OMX_MasterInit passes. But the application fails at OMX_GetHandle(): *pHandle comes back as zero. I am using the PV OMX core; I am just adding a new codec which is not supported by PV OpenCore and testing it at the PV test app level. Can you please answer my queries below:
1. Do we need to change the pvplayer.cfg file? Because I am using the PV omx_core itself.
2. Do any other changes need to be made to integrate a new component into OpenCore?
Please provide your valuable input, and let me know if you need any additional information.
View 4 Replies
Sep 27, 2009
Does anybody know if it is possible to use OpenCore directly from Java, or from native code with the NDK?
View 2 Replies
Mar 22, 2010
Here is what I got while running a test case of OpenCore:

# pvplayer_engine_test -test 1 1
SDK Labeled: PVDEV_CORE_RELEASE_6.506.4.1 built on 20090312
Test Program for pvPlayer engine class.
Input file name 'test.mp4'
Test case range 1 to 1
Compressed output Video(No) Audio(No)
Log level 8; Log node 0 Log Text 0 Log Mem 0
Starting Test 1: Open-Play-Stop-Reset
Results for Test Case 1: Successes 1, Failures 1
Total Execution time for file test.mp4 is : 2.712000 seconds
#

I want to ask why it shows 1 success and 1 failure; there should be only 1 pass or 1 fail. Does each test case have multiple sub-tests within it? Which class in the source code finally decides whether it is a pass or a fail?
View 2 Replies
Jul 14, 2009
I am trying to add Flash (FLV) support to OpenCore. I found some hints in this forum, but many things are still unclear. To begin with, I would like to play a local .flv file from the SD card, without streaming. As I understand it, I need to implement a parser within fileformats, plus a "parser node" and a "recognizer", register them with the player engine, and of course add the Sorenson H.263 codec. Can someone point me to information on the data flow from the main player engine to the "parser node", and which API is used? How will the "parser node" send the data to the codec engine? Which parts will I have to add to enable streaming later? I also looked at the documentation within OpenCore but did not find relevant information; perhaps I missed something and someone can point me to where I should look.
View 2 Replies
May 12, 2010
I'm experiencing an intermittent and very frustrating issue with MediaPlayer on Android when streaming MP3 over HTTP. I've done as much research and debugging as I possibly can to try and find a solution, but am coming up short. I can't find reference to this issue anywhere, including b.android.com. Essentially, what's happening is that MediaPlayer will not come out of the preparing state. The device is receiving audio data during this time, as indicated by network sniffs and the MediaPlayer buffering callback being invoked repeatedly with increasing percentage values, but eventually the buffer appears to become full and stops receiving. Because prepare() never returns (or, more accurately, when prepareAsync() is used, the onPrepared callback is never invoked), and neither the onError() nor onInfo() callbacks are invoked, the application can neither call MediaPlayer.start() nor make any sort of recovery. Other notable observations:
- The problem is intermittent, meaning that subsequent attempts to play the exact same MP3 stream from the exact same server may or may not exhibit this behavior.
- The problem seems to be exacerbated when the device is connected via 3G rather than wifi.
- The problem occurs on different MP3s encoded by different software.
- The problem occurs when streaming from different HTTP servers.
- The problem cannot be reproduced on any of the Android SDK emulators.
- The problem can be reproduced on the HTC Incredible, Verizon Motorola DROID, and HTC Touch.
- When MediaPlayer successfully returns from the preparing state, there is an info message (what=1, extra=44) received. This info message is not received in the problematic case.
Has anyone ever experienced this, or even found a solution?
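One way to at least recover from a hang like this is to register the prepared and error callbacks and add your own timeout around prepareAsync(), since the platform never signals an error in the stuck case. A rough sketch follows; the 30-second timeout and the reset-and-retry policy are arbitrary choices for illustration, not anything MediaPlayer prescribes.

CODE:
import android.media.MediaPlayer;
import android.os.Handler;
import android.os.Looper;
import java.io.IOException;

public class StreamPlayer {
    private static final long PREPARE_TIMEOUT_MS = 30000; // arbitrary watchdog value

    private final Handler handler = new Handler(Looper.getMainLooper());
    private MediaPlayer player;
    private boolean prepared;

    public void play(final String url) throws IOException {
        player = new MediaPlayer();
        player.setDataSource(url);

        player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            public void onPrepared(MediaPlayer mp) {
                prepared = true;
                mp.start();
            }
        });
        player.setOnErrorListener(new MediaPlayer.OnErrorListener() {
            public boolean onError(MediaPlayer mp, int what, int extra) {
                retry(url);          // never fired in the stuck case described above
                return true;
            }
        });

        prepared = false;
        player.prepareAsync();

        // Watchdog: if onPrepared never arrives, tear down and retry ourselves.
        handler.postDelayed(new Runnable() {
            public void run() {
                if (!prepared) {
                    retry(url);
                }
            }
        }, PREPARE_TIMEOUT_MS);
    }

    private void retry(String url) {
        player.reset();
        player.release();
        try {
            play(url);               // naive retry; real code should cap attempts
        } catch (IOException ignored) {
        }
    }
}

This does not fix the underlying stall, but it keeps the application from sitting in the preparing state forever when no callback ever arrives.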
View 10 Replies
Feb 24, 2010
I guess you are doing it correctly. While running the test cases already provided by PV OpenCore, one can only see the logs, not any output on the screen or the speaker. That is what my experience says; other people who have worked on this can confirm.
View 2 Replies
Nov 20, 2009
When a video is played over HTTP (progressive download), in some cases the application stops the current HTTP connection and continues the download using a new HTTP request with a Range header. Closing the connection is the phone's choice, not the server's. Does somebody know in which cases the phone behaves this way?
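For reference, from the server's point of view the re-request is just a plain GET carrying a Range header that asks for the remainder of the file. A minimal sketch of the same request issued by hand (the URL and offset are placeholders):

CODE:
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class RangeRequest {
    public static InputStream resumeFrom(String url, long offset) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        // Ask the server for everything from 'offset' to the end of the file.
        conn.setRequestProperty("Range", "bytes=" + offset + "-");
        conn.connect();
        // A server that honors the header answers 206 Partial Content.
        if (conn.getResponseCode() != HttpURLConnection.HTTP_PARTIAL) {
            throw new IllegalStateException("Server ignored the Range header: " + conn.getResponseCode());
        }
        return conn.getInputStream();
    }
}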
View 3 Replies
Aug 5, 2009
I am not able to get RTSP working in Android with MediaPlayer/VideoView. So I have created a client to interact with the RTSP server, and I have succeeded in doing this: I am able to get the video/audio frames from the RTSP server (MySpace) in Android. Now I want to play the frames. I have searched the OpenCore APIs for a way to play the frames, but didn't find any. My investigation so far: there is a class, PlayerDriver, that creates two sinks, one audio and one video (handleSetVideoSurface and handleSetAudioSink), and two objects of type PVPlayerDataSinkPVMFNode are created. I suspect this class has a way to take the stream as input, but I am not finding the definition of this class. Can you suggest whether there is another class I need to look into?
View 1 Replies
Jul 17, 2010
I created an aviparsernode folder by copying the MP4 parser node folder under /external/opencore/nodes/. But when I type the "make" command under /external/opencore/build_config/opencore_dynamic/, the files under the aviparsernode folder are not compiled. Does anyone know how to get OpenCore to build the files under the aviparsernode folder?
View 4 Replies
Jan 29, 2009
What are the video resolutions supported by the PV codecs in the OpenCore module? Is there a manual or guide describing this?
View 2 Replies
Oct 8, 2009
The following was written in the Android 1.6 Platform Highlights: "Android 1.6 includes the updated OpenCore 2 media engine, which has:
* Support for OpenMAX encoders
* Support for additional audio codecs in AuthorEngine
* Improved buffering model supports shared buffers allocated in the decoder"
Could someone explain to me how this affects us developers? Does this mean we can now play other media types? What, in reality, is new to us on Android 1.6 in regards to OpenCore?
View 2 Replies
Feb 26, 2009
I am trying to integrate my codec on the ARM side using my own OMX core. Now I would like to test it on the Android emulator using Eclipse and the Android SDK. I would like to know:
* Which toolchain is used to compile OpenCore?
* How do I set up the Android emulator to test my latest OpenCore (after integrating the codec)?
View 2 Replies
Jul 15, 2010
I want to use the codecs in Android from my application. For now I just want to use the H.264 codec for testing, unless the MP3 or AAC codecs provide functions for sending the audio to the device's speaker, in which case I would prefer one of those. I have the NDK installed along with Cygwin, GNU Make, and GNU Awk. I can't figure out what I need to do from here, though. I'm downloading the entire OpenCORE tree right now, but I don't even know how to build it or make the Eclipse plugin know it needs to include the files. An example or a tutorial would be much appreciated. EDIT: It looks like I can use JNI like P/Invoke, which would mean I don't have to build the OpenCORE libraries myself. However, I can't find any documentation on the names of the libraries I need to load. I'm also confused as to how to do it. I'm looking at http://www.koushikdutta.com/2009/01/jni-in-android-and-foreword-of-why-jni.html and I don't understand the purpose of writing a library to access a library. Couldn't you just use something like System.loadLibrary("opencore.so")?
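On the last point: System.loadLibrary only loads a library into the process; it does not make its C/C++ functions callable from Java. To call into native code you still need JNI entry points that follow the Java naming convention, which is why the articles write a thin wrapper library. A sketch of the Java side, where the package, the library name "codecwrapper" and decodeFrame() are made-up names for illustration, not part of any real Android or OpenCORE API:

CODE:
package com.example;

// Declares the native entry point and loads the hypothetical wrapper .so.
public class CodecWrapper {
    static {
        // Note: the bare name, no "lib" prefix and no ".so" suffix.
        System.loadLibrary("codecwrapper");
    }

    // Implemented in C/C++ inside libcodecwrapper.so.
    public static native byte[] decodeFrame(byte[] encoded);
}

The wrapper exists because JNI can only dispatch to exported functions named and typed for the VM (here Java_com_example_CodecWrapper_decodeFrame, taking a JNIEnv*); a codec engine does not export symbols in that form, so a small C/C++ shim has to translate between the two. That also answers the loadLibrary question: it takes "codecwrapper", not a file name like "opencore.so".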
View 2 Replies
Mar 3, 2010
I am new to Android. What is the difference between Android SDK 1.5 and Android SDK 2.0?
View 3 Replies
Oct 22, 2010
If so, what is it, and which one is made accessible by Android being open source? (Yup, I'm a newbie.)
View 7 Replies
Jan 8, 2010
Can anyone tell me what the difference is between px, dip, dp, and sp in Android?
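As a concrete illustration of how the units relate: px is a raw hardware pixel, dp (dip is the older spelling) is scaled by the screen density, and sp behaves like dp but is additionally scaled by the user's font-size preference. The framework does the conversion from the display metrics; a small sketch:

CODE:
import android.util.DisplayMetrics;
import android.util.TypedValue;

public class UnitConversion {
    // Converts a dp value to raw pixels for the given display.
    public static float dpToPx(float dp, DisplayMetrics metrics) {
        return TypedValue.applyDimension(TypedValue.COMPLEX_UNIT_DIP, dp, metrics);
    }

    // Converts an sp value to raw pixels; also honors the user's font scale.
    public static float spToPx(float sp, DisplayMetrics metrics) {
        return TypedValue.applyDimension(TypedValue.COMPLEX_UNIT_SP, sp, metrics);
    }
}

For example, on a 160 dpi (mdpi) screen 1 dp equals 1 px, while on a 240 dpi (hdpi) screen the same 1 dp becomes 1.5 px; px always stays a raw pixel.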
View 2 Replies
Oct 15, 2009
In the Android build environment, what is the difference between the 'mmm' and 'mm' commands?
View 2 Replies
Jul 24, 2010
Is there a difference between the one that came loaded on the phone and the app in Android Market?
View 1 Replies
Mar 3, 2009
What is the scenario that would make me pick between startActivity and startSubActivity?
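For context, startSubActivity was the name used in the very early SDK previews; as far as I know, in released SDKs the equivalent call is startActivityForResult(). You use it when the launched activity should hand a result back to the caller, and plain startActivity() when it should not. A minimal sketch (REQUEST_PICK is an arbitrary request code chosen for illustration):

CODE:
import android.app.Activity;
import android.content.Intent;
import android.net.Uri;

public class CallerActivity extends Activity {
    private static final int REQUEST_PICK = 1; // arbitrary request code

    private void launchPicker() {
        // Fire-and-forget navigation would just call startActivity(intent).
        // startActivityForResult() makes this a "sub-activity": its result
        // comes back through onActivityResult() below.
        Intent intent = new Intent(Intent.ACTION_GET_CONTENT);
        intent.setType("*/*");
        startActivityForResult(intent, REQUEST_PICK);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUEST_PICK && resultCode == RESULT_OK && data != null) {
            Uri picked = data.getData(); // whatever the child activity handed back via setResult()
            // ... use the result ...
        }
    }
}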
View 2 Replies
May 3, 2010
Between rooted and unrooted? I see it being brought up all the time in these forums. And when I get my EVO, what should I do?
View 3 Replies
Aug 18, 2010
What is the difference between these two backup solutions? I am rooted now with Clockwork Recovery installed via ROM Manager, and I have successfully created a "nandroid" backup with Clockwork.
Now, many who have Clockwork also use Titanium Backup for root users. What is the advantage of using Titanium, or does each backup method serve a different purpose?
Is it that nandroid backups can only be used on the same device on which they were created, while Titanium can be used when migrating to a new and different device in order to preserve app data?
View 4 Replies
Oct 5, 2010
I was looking at an Android-powered tablet made in China. They are confirming that it is running version 2.2, but when looking at the About Phone UI, it shows the version as "2.2-update1". Is this real? Or is it 2.1 with some 2.2 features? Any advice would be appreciated, since I want to buy these ASAP.
View 2 Replies
May 26, 2009
I want to download some data from the internet, and it should not block the main UI thread. I know both a thread and a service can handle the work. What is the difference between them?
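The short version: a thread only gives you off-main-thread execution inside your current component, while a service is a component with its own lifecycle that keeps running independently of any Activity (and still runs on the main thread unless you start a worker thread inside it). A sketch using IntentService, which does the worker-thread part for you; the class name and the "url" extra are made-up for this example:

CODE:
import android.app.IntentService;
import android.content.Intent;

// A service gives the download a lifecycle independent of any Activity;
// IntentService additionally runs onHandleIntent() on its own worker thread,
// so the main UI thread is never blocked.
public class DownloadService extends IntentService {

    public DownloadService() {
        super("DownloadService"); // name of the worker thread
    }

    @Override
    protected void onHandleIntent(Intent intent) {
        String url = intent.getStringExtra("url");
        // ... perform the blocking network I/O here ...
    }
}

If the work only matters while the current Activity is on screen, a plain Thread (or AsyncTask) inside the Activity is enough; the service is worth the extra ceremony when the download should survive the Activity being destroyed.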
View 4 Replies
Dec 30, 2009
In methods like onItemClick(AdapterView<?> parent, View view, int position, long id), what's the difference between position and id?
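In short: position is the row's index in the adapter as displayed, while id is whatever the adapter's getItemId(position) returns. A sketch of how the two are typically used inside the callback:

CODE:
import android.view.View;
import android.widget.AdapterView;

public class ClickExample {
    // In a plain ArrayAdapter, getItemId(position) simply returns the position,
    // so the two values coincide. With a CursorAdapter, id is the row's _ID
    // column, which stays stable even when the list is filtered or re-sorted.
    public static final AdapterView.OnItemClickListener LISTENER =
            new AdapterView.OnItemClickListener() {
                public void onItemClick(AdapterView<?> parent, View view, int position, long id) {
                    Object item = parent.getItemAtPosition(position); // position: index in the adapter
                    long stableId = id;                               // id: adapter.getItemId(position)
                }
            };
}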
View 2 Replies
Sep 24, 2010
I'm using Debug.startMethodTracing() and Debug.stopMethodTracing() to get trace data to use in Traceview. Everything seems to work fine. I test my app for 50 (real-world) seconds, with two important events occurring 15 and 45 seconds after tracing starts. However, when analyzing Traceview's timeline I found the total trace time is only 1.068,964 msec. My two events are displayed as if they occurred at times 257,743 msec and 642,654 msec. That is just about 1 second of total execution time. Why is there this huge difference between real-world time and Traceview time? Maybe because I have a lot of idle time? Is there a way to relate the times displayed in the timeline to real-world time?
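For reference, the calls in question live in android.os.Debug; a minimal sketch of bracketing the interesting section (the trace file name "myapp" is arbitrary):

CODE:
import android.os.Debug;

public class TraceSection {
    public static void runTraced(Runnable work) {
        // Writes a <name>.trace file for later inspection in Traceview.
        Debug.startMethodTracing("myapp");
        try {
            work.run();
        } finally {
            Debug.stopMethodTracing();
        }
    }
}

The idle-time suspicion sounds plausible: the profiler accounts for time spent executing your methods rather than wall-clock time spent blocked or waiting between events, which would be consistent with 50 real seconds collapsing to roughly one second of traced time.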
View 2 Replies
Apr 1, 2010
There are several different ways of defining the ID. What is the difference between these?
android:id="@id/android:list"
android:id="@+id/android:list"
android:id="@+id/confirm"
View 3 Replies