Android :: Android 1.6 New OpenCore - How Do Shared Buffers Affect Us?
Oct 8, 2009
The following was written in the Android 1.6 Platform Highlights: Android 1.6 includes the updated OpenCore 2 media engine, which has:
* Support for OpenMAX encoders
* Support for additional audio codecs in AuthorEngine
* Improved buffering model that supports shared buffers allocated in the decoder
Could someone explain to me how this affects us, the developers? Does this mean we can now play other media types? What, in reality, is new to us in Android 1.6 with regard to OpenCore?
View 2 Replies
Jun 16, 2009
In the process of switching from indirect to direct buffers for OpenGL geometry specification, I've found a problem with the following function in the java.nio.IntBuffer class: IntBuffer put(int[] src, int off, int len). This function is supposed to do a bulk transfer of int data from the given array, starting at the specified offset. It should be an optimized version of the following code: for (int i = 0; i < len; i++) { buffer.put(src[off + i]); }
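For reference, a minimal sketch of the usual pattern for building such a buffer: allocate it with ByteBuffer.allocateDirect in native byte order and fill it with the bulk put. The vertex values and class name here are made up for illustration.

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.IntBuffer;

public class GeometryBuffers {
    // Hypothetical fixed-point (16.16) triangle vertices, for illustration only.
    static final int[] VERTICES = { 0, 65536, -65536, -65536, 65536, -65536 };

    static IntBuffer createVertexBuffer() {
        // Direct buffer in native byte order, as the GL bindings expect.
        ByteBuffer bb = ByteBuffer.allocateDirect(VERTICES.length * 4);
        bb.order(ByteOrder.nativeOrder());
        IntBuffer ib = bb.asIntBuffer();
        ib.put(VERTICES, 0, VERTICES.length); // bulk put: whole array from offset 0
        ib.position(0);                       // rewind before handing to glVertexPointer
        return ib;
    }
}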
View 2 Replies
Aug 30, 2010
I'm using MediaRecorder.setOutputFile(FileDescriptor fd) over a UNIX socket, so that local code on the device can see the encoded stream in real time (for video streaming out).
What I observed is that the encoder doesn't write encoded frames continuously as they are produced, but in "bursts", exactly one second apart. I tried both containers (THREE_GPP and MPEG_4) and all three available codecs (MPEG_4_SP, H263, H264); the behavior is always the same. This one-second period is notably unrelated to framerate or bitrate, so it is not flushing by size; rather, it seems to be flushing by time (possibly to mask SD card write overheads? Bummer, I'm not using the SD card here!). Is it a known limitation? Is there a known workaround?
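For context, a minimal sketch of the setup described above, assuming a LocalSocket/LocalServerSocket pair whose FileDescriptor is handed to MediaRecorder; the socket name is arbitrary and camera/preview setup is omitted.

import android.media.MediaRecorder;
import android.net.LocalServerSocket;
import android.net.LocalSocket;
import android.net.LocalSocketAddress;

public class SocketRecorder {
    private static final String SOCKET_NAME = "video_stream"; // arbitrary, illustrative name

    public MediaRecorder startStreaming() throws Exception {
        LocalServerSocket server = new LocalServerSocket(SOCKET_NAME);
        LocalSocket sender = new LocalSocket();
        sender.connect(new LocalSocketAddress(SOCKET_NAME));
        LocalSocket receiver = server.accept(); // local code reads the encoded stream from here

        MediaRecorder recorder = new MediaRecorder();
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H263);
        // Hand the encoder the socket's file descriptor instead of a file path.
        recorder.setOutputFile(sender.getFileDescriptor());
        recorder.prepare();
        recorder.start();
        return recorder;
    }
}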
View 2 Replies
Jul 21, 2009
When using Protocol Buffers with Android (running in the Windows VM) I'm noticing that the first time I instantiate a protocol buffer based class, Android lags for some time. When I step through, it looks like it's spending a lot of time dynamically loading the protocol buffer classes. Is there something I can do to speed up this process? I haven't tested yet on a physical device, so the delay might not be as bad, but in the VM it takes about 30-45 seconds to load all the classes and get past that first message instantiation. What I did was: downloaded the Protocol Buffers source code, compiled the .java files, stuffed them into a .jar, added the .jar as an external library to my Eclipse project, generated the .java classes from the .proto files, added the generated .java files to my project, and ran my project. I'm not sure if I'm doing something wrong in terms of how to reference external libraries with an Android project, but 30-45 seconds seems like a very long time to wait on a 2.6 GHz machine. Also, I recently learned that Android uses Protocol Buffers internally. Will this cause any sort of conflict with my included .jar that could be the problem? I *think* the namespace is different, but I'm not entirely sure. Lastly, why did Google use Protocol Buffers internally and not make it available to SDK developers? This makes me very sad, knowing that it's all right there and I can't use it.
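One common mitigation - sketched below under the assumption that the delay really is the one-time class loading - is to warm up the generated classes on a background thread at application start; the class name used here is a placeholder. Building the generated code against protobuf's lite runtime (optimize_for = LITE_RUNTIME in the .proto) may also help on Android, though neither is guaranteed to remove the delay.

import android.app.Application;

public class MyApplication extends Application {
    @Override
    public void onCreate() {
        super.onCreate();
        // Warm up the generated protobuf classes off the UI thread so the
        // one-time class-loading cost is not paid at the first real message use.
        new Thread(new Runnable() {
            public void run() {
                try {
                    // Placeholder class name: substitute one of your generated message classes.
                    Class.forName("com.example.proto.MyMessage");
                } catch (ClassNotFoundException ignored) {
                    // Only a warm-up; nothing to do if the class is absent.
                }
            }
        }).start();
    }
}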
View 20 Replies
Dec 8, 2009
For all the gl*Pointer functions you need to create byte buffers through ByteBuffer.allocateDirect. When you want to load a texture through glTexImage2D you need to supply a ByteBuffer too. Can this be a "normal" ByteBuffer, or do I have to create this one via allocateDirect as well?
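A direct buffer in native order works in either case, so it is the conservative choice; a small sketch, with the pixel data assumed to be RGBA bytes purely for illustration:

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import javax.microedition.khronos.opengles.GL10;

public class TextureLoader {
    static void uploadTexture(GL10 gl, byte[] rgbaPixels, int width, int height) {
        // Direct buffer in native byte order; the safe choice for the GL bindings.
        ByteBuffer pixels = ByteBuffer.allocateDirect(rgbaPixels.length);
        pixels.order(ByteOrder.nativeOrder());
        pixels.put(rgbaPixels).position(0);
        gl.glTexImage2D(GL10.GL_TEXTURE_2D, 0, GL10.GL_RGBA, width, height,
                0, GL10.GL_RGBA, GL10.GL_UNSIGNED_BYTE, pixels);
    }
}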
View 2 Replies
Nov 24, 2010
I have used AbsoluteLayout in order to display image buttons in my application's main.xml.
How exactly does this affect using my app on different density screens?
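For what it's worth, one partial mitigation when positions and sizes are set in code is to express them in dp and convert at runtime, so they at least scale with density; a small sketch (the 48 dp usage value is illustrative):

import android.content.Context;
import android.util.TypedValue;

public class DensityUtil {
    // Convert a dp value to raw pixels for the current screen density.
    static int dpToPx(Context context, float dp) {
        return Math.round(TypedValue.applyDimension(
                TypedValue.COMPLEX_UNIT_DIP, dp,
                context.getResources().getDisplayMetrics()));
    }
}
// Usage (illustrative): int buttonSize = DensityUtil.dpToPx(this, 48);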
View 2 Replies
Apr 8, 2010
I can't seem to find the proper audio source for recording/analyzing/receiving the currently played music track (or just any playing media). I'm not talking about the Mic.
The spectrum live wallpaper does this on the Nexus One AFAIK.
How can I keep receiving wave buffers of the currently playing media?
(I would like to support everything from 1.5, but 2.1 specific solutions are also welcome)
View 1 Replies
Feb 4, 2009
I'd like to have some fancy effects on my Activities when entering or exiting. So I tried to use theming. But it seems that it's completely ignored?
themes.xml (activated in the Manifest):
<item name="android:windowAnimationStyle">@style/Animation.Activity</item>

styles.xml:
<style name="Animation" parent="android:Animation"/>
<style name="Animation.Activity" parent="android:Animation.Activity">
    <item name="android:activityOpenEnterAnimation">@anim/fade_in</item>
    <item name="android:activityOpenExitAnimation">@anim/fade_in</item>
    <item name="android:activityCloseEnterAnimation">@anim/fade_in</item>
    <item name="android:activityCloseExitAnimation">@anim/fade_in</item>
</style>

fade_in.xml:
<alpha xmlns:android="http://schemas.android.com/apk/res/android"
    android:fromAlpha="0.0" android:toAlpha="1.0"
    android:duration="1000" />
View 4 Replies
Aug 16, 2010
Can someone please explain Adobe AIR and how it affects Android? All I see is that it is like Flash or something, but not popular.
View 7 Replies
Feb 15, 2010
I have list views with image views and text views etc. in them, and I'm trying to fix up a situation where the background image of a list element changes in the "pressed" state. If there's a straightforward way of doing that, please let me know! Right now, I override onTouchEvent() in the topmost view, and also in those child views that don't just delegate to their parents, which would be the image views. I have to check for long clicks myself, but basically this scheme works. The list views are potentially different sizes depending on how much text is in the views etc., so I thought I would use a 9-patch as the background. However, when the background changes from flat white to the 9-patch, this seems to indent the view a bit - i.e. give it margins. Eh? If I change the image to a non-9-patch, the margins remain as they are. I've tried different sizes of 9-patch, from 320x150 or so down to 100x30 etc., no change. What's going on?
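On the "straightforward way" part: a state-list drawable is the usual mechanism for swapping a background in the pressed state without overriding onTouchEvent(). A minimal sketch building one in code with placeholder colors (in a real list row it would more likely use two drawables, or an XML <selector> in res/drawable):

import android.graphics.Color;
import android.graphics.drawable.ColorDrawable;
import android.graphics.drawable.StateListDrawable;
import android.view.View;

public class PressedBackground {
    // The colors are placeholders; 9-patch or bitmap drawables work the same way.
    static void apply(View row) {
        StateListDrawable states = new StateListDrawable();
        states.addState(new int[] { android.R.attr.state_pressed },
                new ColorDrawable(Color.LTGRAY));           // shown while pressed
        states.addState(new int[] {}, new ColorDrawable(Color.WHITE)); // default state
        row.setBackgroundDrawable(states);
    }
}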
View 3 Replies
Aug 2, 2010
I am trying to integrate the VP8 video codec into the PV OpenCore framework in Froyo and test it at the PV test application level. I have done the following steps:
1. Made a new folder for VP8 at external/opencore/codecs_v2/omx/omx_vp8.
2. Made the appropriate modifications in the omx_baseclass, omx_common, etc.
3. Modified the PV test app (external/opencore/codecs_v2/omx/omx_testapp) for testing VP8.
4. Included the VP8 source code in Froyo.
5. Included a new OSCL UUID for VP8.
With these modifications, the build is fine. The shared libraries and binaries are created properly. Also, OMX_MasterInit is passing. But the application fails at OMX_GetHandle(): *pHandle is coming back as zero. I am using the PV OMX core. I am just adding a new codec which is not supported by PV OpenCore and testing at the PV test_app level. Can you please answer my queries below:
1. Do we need to change the pvplayer.cfg file? Because I am using the PV omx_core itself.
2. Do any other changes need to be made to integrate a new component into OpenCore?
Please provide your valuable inputs. Let me know if you need any additional information.
View 4 Replies
Sep 27, 2009
Does anybody know if it is possible to use OpenCore directly from Java, or from native code with the NDK?
View 2 Replies
Mar 22, 2010
Here is what I got while running a test case of OpenCore:
***************
# pvplayer_engine_test -test 1 1
SDK Labeled: PVDEV_CORE_RELEASE_6.506.4.1 built on 20090312
Test Program for pvPlayer engine class.
Input file name 'test.mp4'
Test case range 1 to 1
Compressed output Video(No) Audio(No)
Log level 8; Log node 0 Log Text 0 Log Mem 0
Starting Test 1: Open-Play-Stop-Reset
Results for Test Case 1: Successes 1, Failures 1
Total Execution time for file test.mp4 is : 2.712000 seconds
#
***************
I want to ask why it shows 1 success and 1 failure - there should be only 1 pass or 1 fail. Does each test case have multiple sub-tests within it? Which class in the source code finally decides whether it is a pass or a fail?
View 2 Replies
Jul 14, 2009
I am trying to add Flash (FLV) support to OpenCore. I found some hints in this forum, but there are still many things that are unclear. To begin with, I would like to play a local .flv file from the SD card, without streaming. As I understand it, I need to implement a parser within fileformats, a "parser node" and a "recognizer", register them with the player engine, and of course add the H.263 Sorenson codec. Can someone point me to information on the data flow from the main player engine to the "parser node" - what API is used? How will the "parser node" send the data to the codec engine? Which parts will I have to add to enable streaming later? I also looked inside the documentation within OpenCore but did not find relevant information; perhaps I missed something and someone can point me to where I should look.
View 2 Replies
May 24, 2010
What are the differences between Stagefright and OpenCore? What does this mean to app developers? Please point me to any links or tutorials online.
View 6 Replies
May 12, 2010
I'm experiencing an intermittent and very frustrating issue with MediaPlayer on Android when streaming MP3 over HTTP. I've done as much research and debugging as I possibly can to try and find a solution, but am coming up short. I can't find reference to this issue anywhere, including b.android.com. Essentially, what's happening is that MediaPlayer will not come out of the preparing state. The device is receiving audio data during this time, as indicated by network sniffs and the MediaPlayer buffering callback being invoked repeatedly with increasing percentage values, but eventually the buffer appears to become full and stops receiving. Because prepare() never returns - or, more accurately, when prepareAsync() is used the onPrepared callback is never invoked - and neither the onError() nor onInfo() callbacks are invoked, the application can neither call MediaPlayer.start() nor make any sort of recovery. Other notable observations:
- The problem is intermittent, meaning that subsequent attempts to play the exact same MP3 stream from the exact same server may or may not exhibit this behavior
- The problem seems to be exacerbated when the device is connected via 3G versus wifi
- The problem occurs with different MP3s encoded by different software
- The problem occurs when streaming from different HTTP servers
- The problem cannot be reproduced on any of the Android SDK emulators
- The problem can be reproduced on the HTC Incredible, Verizon Motorola DROID, and HTC Touch
- When MediaPlayer successfully returns from the preparing state, there is an info message (what=1, extra=44) received; this info message is not received in the problematic case
Has anyone ever experienced this, or even found a solution?
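For reference, a sketch of the setup being described, with the listeners wired up and a hypothetical watchdog that resets the player if onPrepared never arrives; the URL and timeout value are placeholders, and this works around the symptom rather than fixing the underlying buffering issue.

import android.media.MediaPlayer;
import android.os.Handler;

public class StreamPlayer {
    private static final String STREAM_URL = "http://example.com/stream.mp3"; // placeholder
    private static final long PREPARE_TIMEOUT_MS = 30000; // arbitrary watchdog value

    private final Handler handler = new Handler();
    private MediaPlayer player;
    private boolean prepared;

    public void play() throws Exception {
        player = new MediaPlayer();
        player.setDataSource(STREAM_URL);
        player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            public void onPrepared(MediaPlayer mp) {
                prepared = true;
                mp.start();
            }
        });
        player.setOnErrorListener(new MediaPlayer.OnErrorListener() {
            public boolean onError(MediaPlayer mp, int what, int extra) {
                reset();
                return true;
            }
        });
        player.prepareAsync();
        // Watchdog: if we are still stuck in "preparing" after the timeout, give up and reset.
        handler.postDelayed(new Runnable() {
            public void run() {
                if (!prepared) reset();
            }
        }, PREPARE_TIMEOUT_MS);
    }

    private void reset() {
        if (player != null) {
            player.reset();
            player.release();
            player = null;
        }
    }
}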
View 10 Replies
Feb 24, 2010
I guess you're doing it correctly. While running the test cases already provided by the PV core, one can only see the logs, not output on the screen or through the speaker. This is what my experience says; other people who have worked on this can confirm.
View 2 Replies
Nov 20, 2009
When a video is played over HTTP (progressive download), in some cases the application stops the current HTTP connection and continues the download using a new HTTP request with a Range header. Closing the connection is the phone's choice, not the server's. Does somebody know in which cases the phone behaves this way?
View 3 Replies
Aug 5, 2009
I am not able to use MediaPlayer/VideoView to make RTSP work in Android. So I have created a client to interact with an RTSP server, and I have succeeded in doing this: I am able to get the video/audio frames from the RTSP server (MySpace) in Android. Now I want to play the frames. I have searched the OpenCore APIs for a way to play the frames, but didn't find any. My investigation: there is a class, PlayerDriver.c. It creates two sinks, one audio and the other video: handleSetVideoSurface and handleSetAudioSink. Two objects of type PVPlayerDataSinkPVMFNode are created. I suspect this class has a way to take the stream as input, but I am not finding the definition of this class. Can you suggest which class I need to look into?
View 1 Replies
Dec 27, 2013
We are new to Android development. We have written a small piece of code which reads /dev/graphics/fb0 and sends frames to another application over the network. We are successful when /dev/graphics/fb0 is available. But some Android devices have fb0 and fb1 (e.g. Samsung Galaxy Grand GT-I9082, Android version 4.2.2); luckily, sending the fb0 frames is sufficient to get the required frames. We are not able to understand why fb1 exists and what the significance of fb1 is. We also found that a few phones have fb0, fb1, ..., fb13. How do we find which one is the main framebuffer?
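For context, a minimal sketch of the kind of read described above, assuming a fixed resolution and 4 bytes per pixel purely for illustration; real code would query the geometry and format natively (e.g. via the FBIOGET_VSCREENINFO ioctl), and reading fb0 normally requires root.

import java.io.FileInputStream;
import java.io.IOException;

public class FramebufferReader {
    // Assumed values for illustration; the real size and pixel format must be queried.
    static final int WIDTH = 480, HEIGHT = 800, BYTES_PER_PIXEL = 4;

    static byte[] grabFrame() throws IOException {
        byte[] frame = new byte[WIDTH * HEIGHT * BYTES_PER_PIXEL];
        FileInputStream fb = new FileInputStream("/dev/graphics/fb0");
        try {
            int read = 0;
            while (read < frame.length) {
                int n = fb.read(frame, read, frame.length - read);
                if (n < 0) break; // end of device data
                read += n;
            }
        } finally {
            fb.close();
        }
        return frame; // raw pixels, ready to send over the network
    }
}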
View 2 Replies
Jul 17, 2010
I created an aviparsernode folder by copying from the MP4 parser node folder under /external/opencore/nodes/. But when I type the "make" command under /external/opencore/build_config/opencore_dynamic/, the files under the aviparsernode folder are not compiled. Does anyone know how to get OpenCore to build the files under the aviparsernode folder?
View 4 Replies
Jan 29, 2009
What are the video resolutions supported by the PV codecs in the OpenCore module? Is there a manual or guide describing this?
View 2 Replies
Jun 29, 2010
I realize how much faster the Hero can be when its wimpy processor is overclocked. I would imagine that overclocking would be bad for it, though, to push the CPU faster than it's made to go, but most people don't mind it. I'm wondering, realistically, if it is hard on the CPU to be overclocked, and also how it affects battery life. Keep in mind, I'm talking about overclocking with the ROMs made for it, like zen, aloysius, purehero, busted, etc.
View 4 Replies
Jan 11, 2013
Does moving apps to the system/app folder affect battery life? I've moved Dolphin and removed the default browser. It opens a bit faster now. But I'm just curious whether it uses up battery in the same way in exchange for that performance.
ST25i
View 9 Replies
Sep 26, 2010
It is copying the reference to the object, not the value of the object. Meaning, when I modify pos.top or pos.bottom, the original object gets modified. I'm guessing I am missing a concept of passing an object by reference vs. by value here, which I thought I understood. What is the fix here? Is it a problem with how I defined my custom class?
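In Java, a variable of object type holds a reference, and assignment copies that reference rather than the object, so both names end up pointing at the same instance; the fix is to create an explicit copy. A small sketch, with Pos standing in for the custom class in question:

// Pos is a stand-in for the custom class in question.
class Pos {
    int top, bottom;

    Pos(int top, int bottom) {
        this.top = top;
        this.bottom = bottom;
    }

    // Copy constructor: makes a defensive copy instead of sharing the reference.
    Pos(Pos other) {
        this(other.top, other.bottom);
    }
}

class Demo {
    public static void main(String[] args) {
        Pos original = new Pos(0, 100);

        Pos alias = original;         // copies the reference: both names point at one object
        alias.top = 50;               // original.top is now 50 as well

        Pos copy = new Pos(original); // copies the values: an independent object
        copy.bottom = 10;             // original.bottom is still 100
    }
}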
View 2 Replies
Feb 26, 2009
I am trying to integrate my codec on the ARM side using my own OMX core. Now I would like to test it on the Android emulator using Eclipse and the Android SDK. I would like to know:
* Which tool chain is used to compile OpenCore?
* How do I set up the Android emulator to test my latest OpenCore build (after integrating the codec)?
View 2 Replies
Jul 15, 2010
I want to use the codecs in Android from my application. For now I just want to use the H.264 codec for testing, unless the MP3 or AAC codecs provide functions for sending the audio to the device's speaker, in which case I would prefer one of those. I have the NDK installed along with Cygwin, GNU Make, and GNU Awk. I can't figure out what I need to do from here, though. I'm downloading the entire OpenCORE tree right now, but I don't even know how to build it or make the Eclipse plugin know it needs to include the files. An example or a tutorial would be much appreciated. EDIT: It looks like I can use JNI like P/Invoke, which would mean I don't have to build the OpenCORE libraries myself. However, I can't find any documentation on the names of the libraries I need to load. I'm also confused as to how to do it. I'm looking at http://www.koushikdutta.com/2009/01/jni-in-android-and-foreword-of-why-jni.html and I don't understand what the purpose of writing a library to access a library is. Couldn't you just use something like System.loadLibrary("opencore.so")?
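On the last question: System.loadLibrary only loads the .so into the process; Java can then call only functions exported with JNI-conforming signatures, which is why a thin native "bridge" library wrapping the underlying C/C++ API is needed. A sketch of the Java side, with the library and method names purely illustrative (note that loadLibrary takes the bare name, not "libfoo.so"):

public class CodecBridge {
    static {
        // Loads libcodecbridge.so from the app's native library directory.
        // Note: pass the bare name, not "libcodecbridge.so".
        System.loadLibrary("codecbridge");
    }

    // Implemented in C/C++ inside libcodecbridge.so with a JNI-conforming
    // entry point (Java_..._decodeFrame); it forwards to the real codec API.
    public static native byte[] decodeFrame(byte[] encodedFrame);
}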
View 2 Replies
Sep 17, 2010
I am using the "Mock location" apps in there. That is interesting and I would like to share it here. Propagation Systems Limited - Android Apps
View 1 Replies
Oct 14, 2010
I need to add a few shared libraries to Android as part of my application's installation. Can you please suggest the right information resources regarding this?
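If the libraries ship inside the APK, they typically go under libs/<abi>/ (for example libs/armeabi/) so the installer unpacks them, and on older Android versions dependent libraries must be loaded explicitly in dependency order. A tiny sketch with illustrative names:

public class NativeLibs {
    static {
        // Older Android linkers do not resolve dependencies between app-supplied
        // .so files automatically, so load them explicitly in dependency order:
        System.loadLibrary("foo");  // libfoo.so, no dependencies
        System.loadLibrary("bar");  // libbar.so, depends on libfoo.so
    }
}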
View 2 Replies
Oct 10, 2010
My wife and I both have Android phones (EVO and Samsung Transform) and have our Google Calendars shared. All of a sudden, her calendar is completely empty and she can't see mine anymore. But on mine, I still see all of her events and see her calendar listed as one of my calendars, both in Google Calendar and on my phone. Everything looks perfect on my end, but everything is back to square one on hers. Any ideas how to restore her calendar, or what may have caused this? I am afraid that just starting hers over is going to start duplicating events in mine.
View 1 Replies