Android :: Motion Detection Using Camera?
Jun 6, 2010: Is this possible? How do I do it? I want to learn but don't know where to begin.
We have just published our smart camera application. Ever wished your pictures taken at night could be better? SmartCam helps you take clear (non-blurry) pictures, especially at night. It monitors your hand stability using the accelerometer and automatically triggers the shutter when the camera is steady. So forget about pressing that small button while trying hard to keep your hand steady; just buy this app and let it do the clicking for you.
On my Google dev phone there is a problem with the new 1.5 version in combination with an official HTC adapter cable YC A300 and 3.5mm earphones. Android doesn't recognize the earphones (it worked on 1.1!). Android still recognizes the HTC earphones with ExtUSB.
I want to know how to interpret accelerometer readings. The values change depending on whether the phone is lying flat (horizontal) and idle, or held vertically in the hand. How do I tell those two states apart? Any sample code would be useful.
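One common way to tell the two states apart is to look at which axis carries most of gravity (~9.81 m/s²): lying flat puts it on Z, held upright puts it on Y. A minimal sketch, assuming a listener registered for the accelerometer with SensorManager; the class name and 8.0f cut-off are illustrative, not framework values:

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

// Hypothetical listener: decides whether the phone is lying flat or held upright
// by checking which axis carries most of gravity (~9.81 m/s^2).
public class OrientationChecker implements SensorEventListener {

    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;
        float x = event.values[0];
        float y = event.values[1];
        float z = event.values[2];

        if (Math.abs(z) > 8.0f) {
            // Gravity is mostly on the Z axis: device is lying flat (horizontal).
        } else if (Math.abs(y) > 8.0f) {
            // Gravity is mostly on the Y axis: device is held upright (vertical).
        }
        // A hand-held phone also shows small, rapidly changing components on
        // all three axes, which is another clue that it is not resting idle.
    }

    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}

Register it with SensorManager.registerListener() for Sensor.TYPE_ACCELEROMETER at SENSOR_DELAY_NORMAL.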
I am pretty new to Android. I have a use case where I need to detect a shake and then show some images in my application. Can I test shake functionality on the Android emulator? What are the other alternatives apart from testing on a real phone?
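The stock emulator has no real accelerometer, so on-device testing (or a third-party sensor simulator) is the practical route. For the detection itself, a rough sketch is to watch the acceleration magnitude for spikes; the threshold, the 1-second debounce, and the onShake() hook are all illustrative:

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

// Rough shake detector: fires when the acceleration magnitude deviates
// strongly from gravity. Tune the threshold per device.
public class ShakeListener implements SensorEventListener {

    private static final float SHAKE_THRESHOLD = 12.0f; // m/s^2
    private long lastShakeTime;

    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;
        float x = event.values[0], y = event.values[1], z = event.values[2];
        double magnitude = Math.sqrt(x * x + y * y + z * z);

        long now = System.currentTimeMillis();
        if (magnitude > SHAKE_THRESHOLD && now - lastShakeTime > 1000) {
            lastShakeTime = now;
            onShake(); // hypothetical hook: show the images here
        }
    }

    protected void onShake() { }

    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}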
As part of an application that I'm developing for Android, I'd like to show the user an edge-detected version of an image they have taken. To achieve this I've been looking at the Sobel operator and how to implement it in Java. However, many of the examples I've found make use of objects and methods from AWT, which isn't part of Android. My question, then: does Android provide alternatives to the AWT features used in those examples? If we were to rewrite such an example using only the libraries built into Android, how would we go about it?
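One route that avoids AWT entirely is android.graphics.Bitmap, whose getPixel()/setPixel() calls are enough to port the Sobel operator directly. A sketch (per-pixel access is slow; copying the pixels into an int[] with getPixels() is the usual optimisation, omitted here for clarity):

import android.graphics.Bitmap;
import android.graphics.Color;

// Sobel edge detection on an android.graphics.Bitmap, replacing the AWT
// BufferedImage/Raster calls used in typical desktop Java examples.
public final class Sobel {

    public static Bitmap detectEdges(Bitmap src) {
        int w = src.getWidth(), h = src.getHeight();
        Bitmap out = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);

        for (int y = 1; y < h - 1; y++) {
            for (int x = 1; x < w - 1; x++) {
                // 3x3 Sobel kernels applied to the grayscale neighbourhood.
                int gx = -gray(src, x - 1, y - 1) + gray(src, x + 1, y - 1)
                       - 2 * gray(src, x - 1, y) + 2 * gray(src, x + 1, y)
                       - gray(src, x - 1, y + 1) + gray(src, x + 1, y + 1);
                int gy = -gray(src, x - 1, y - 1) - 2 * gray(src, x, y - 1) - gray(src, x + 1, y - 1)
                       + gray(src, x - 1, y + 1) + 2 * gray(src, x, y + 1) + gray(src, x + 1, y + 1);
                int mag = Math.min(255, (int) Math.sqrt(gx * gx + gy * gy));
                out.setPixel(x, y, Color.rgb(mag, mag, mag));
            }
        }
        return out;
    }

    private static int gray(Bitmap b, int x, int y) {
        int p = b.getPixel(x, y);
        return (Color.red(p) + Color.green(p) + Color.blue(p)) / 3;
    }
}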
I have a code snippet to detect accelerometer movements. It sometimes works, properly detecting slight movements, but sometimes it also reports movement while the device is sitting idle. Are there any known problems with the built-in accelerometer on Android?
I use an HTC G1 device. My code snippet is below. How do I fix it so that small device movements are detected, but nothing is reported while the device is idle?
CODE:..............
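The original snippet is collapsed above, so this is only a hedged sketch of one common fix: low-pass filter the readings to track gravity, compare the residual against a small threshold, and require several consecutive readings above it before reporting movement, so sensor jitter on an idle phone is ignored. All constants and the onMovement() hook are illustrative:

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

// Noise-tolerant movement detection: gravity is estimated with a low-pass
// filter, and movement is reported only after several readings whose
// residual (linear acceleration) exceeds a small threshold.
public class MovementDetector implements SensorEventListener {

    private static final float ALPHA = 0.8f;       // low-pass smoothing factor
    private static final float THRESHOLD = 0.5f;   // m/s^2, tune per device
    private static final int REQUIRED_HITS = 3;

    private final float[] gravity = new float[3];
    private int hits;

    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;

        float residual = 0;
        for (int i = 0; i < 3; i++) {
            gravity[i] = ALPHA * gravity[i] + (1 - ALPHA) * event.values[i];
            float linear = event.values[i] - gravity[i];
            residual += linear * linear;
        }

        if (Math.sqrt(residual) > THRESHOLD) {
            if (++hits >= REQUIRED_HITS) {
                hits = 0;
                onMovement(); // hypothetical hook
            }
        } else {
            hits = 0; // idle jitter resets the counter
        }
    }

    protected void onMovement() { }

    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}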
I have extended ItemizedOverlay and I want to increase the hit-detection area on the OverlayItems, if that is possible.
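In the old Google Maps add-on library (com.google.android.maps), ItemizedOverlay exposes a protected hitTest() that can be overridden. A hedged sketch: grow the marker's bounding rectangle by a few pixels before testing. The SLOP value, class name, and the list backing are all illustrative:

import android.graphics.Rect;
import android.graphics.drawable.Drawable;
import java.util.ArrayList;
import java.util.List;

import com.google.android.maps.ItemizedOverlay;
import com.google.android.maps.OverlayItem;

// Enlarges the touch target of each OverlayItem by padding the marker bounds.
public class PaddedHitOverlay extends ItemizedOverlay<OverlayItem> {

    private static final int SLOP = 20; // extra pixels around the marker
    private final List<OverlayItem> items = new ArrayList<OverlayItem>();

    public PaddedHitOverlay(Drawable defaultMarker) {
        super(boundCenterBottom(defaultMarker));
    }

    @Override
    protected boolean hitTest(OverlayItem item, Drawable marker, int hitX, int hitY) {
        Rect bounds = new Rect(marker.getBounds());
        bounds.inset(-SLOP, -SLOP); // grow the hit rectangle in every direction
        return bounds.contains(hitX, hitY);
    }

    @Override
    protected OverlayItem createItem(int i) { return items.get(i); }

    @Override
    public int size() { return items.size(); }
}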
View 1 Replies View RelatedI'm trying to product a binary that is compatible with 1.1 and 1.5, but i'm having trouble with the ViewSwitcher class behaving different. I'd like to create a wrapper for it, but I need a way to programatically detect the api level at runtime.
I am currently working on an experimental camera app. I'm looking into implementing face detection and am currently weighing up my options. I have considered the OpenCV port available for Android and its face detection functions, but in demos of previous implementations the camera seems to lag a lot. Considering that the camera on the HTC Desire has face detection, I know it must be possible to get at least a semi-decent face detection system in place. I was wondering if anyone has an opinion on how I could get the best results: using an available library, or implementing a particular algorithm myself?
Has anyone used the face-detection API in an Android program? More simply stated: can I have a Java class that pulls up the camera, scans a person's face, and returns parameters back into the program?
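The framework does ship android.media.FaceDetector, which works on still Bitmaps rather than driving the camera itself: take the picture, decode it, then scan it. A sketch (the maxFaces limit is illustrative; note the bitmap must be converted to RGB_565 before FaceDetector will accept it):

import android.graphics.Bitmap;
import android.graphics.PointF;
import android.media.FaceDetector;

// Scans a decoded photo and reports how many faces were found, plus the
// per-face parameters (mid-point, eye distance, confidence).
public final class FaceScan {

    public static int countFaces(Bitmap source) {
        Bitmap rgb565 = source.copy(Bitmap.Config.RGB_565, false);

        int maxFaces = 5; // illustrative upper bound
        FaceDetector detector =
                new FaceDetector(rgb565.getWidth(), rgb565.getHeight(), maxFaces);
        FaceDetector.Face[] faces = new FaceDetector.Face[maxFaces];

        int found = detector.findFaces(rgb565, faces);
        for (int i = 0; i < found; i++) {
            PointF mid = new PointF();
            faces[i].getMidPoint(mid);            // centre point between the eyes
            float eyeDistance = faces[i].eyesDistance();
            // mid, eyeDistance and faces[i].confidence() are the values that
            // can be handed back into the rest of the program.
        }
        return found;
    }
}

This is much lighter than the OpenCV port, but it only finds face positions; it does not do recognition, and it does not run on the live preview by itself.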
On Froyo, we found that some new "task manager" apps use ActivityManager.killBackgroundProcesses() to kill apps. When this happens, Intent.ACTION_PACKAGE_RESTARTED is no longer fired. How can I find out that my application has been killed? I tried starting a service, and I do see this message printed in logcat:
W/ActivityManager( 2426): Scheduling restart of crashed service com.example.android.apis/.app.RemoteService in 20000ms
However, the service is never restarted as advertised if the app is killed via the killBackgroundProcesses API. (If I go into adb shell and kill the service process directly, the service is indeed restarted.) This looks like a bug anyway, because the notification created by the app is no longer removed as it was in Eclair (the StatusBarService, and a bunch of other system services, depend on the Intent.ACTION_PACKAGE_RESTARTED broadcast).
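A workaround sketch, not a guaranteed detection mechanism: have the service record a "running" flag and clear it in onDestroy(). If the process is killed via killBackgroundProcesses(), onDestroy() never runs, so a stale flag found on the next start means the previous instance died uncleanly. The preference file and key names are illustrative:

import android.app.Service;
import android.content.Intent;
import android.content.SharedPreferences;
import android.os.IBinder;

// Detects (after the fact) that the previous instance was killed without a
// clean shutdown, and asks the system to restart the service when possible.
public class KillAwareService extends Service {

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        SharedPreferences prefs = getSharedPreferences("kill_tracker", MODE_PRIVATE);
        if (prefs.getBoolean("was_running", false)) {
            onPreviousInstanceKilled(); // previous run never reached onDestroy()
        }
        prefs.edit().putBoolean("was_running", true).commit();
        return START_STICKY;
    }

    @Override
    public void onDestroy() {
        getSharedPreferences("kill_tracker", MODE_PRIVATE)
                .edit().putBoolean("was_running", false).commit();
        super.onDestroy();
    }

    protected void onPreviousInstanceKilled() { }

    @Override
    public IBinder onBind(Intent intent) { return null; }
}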
I'm working on a soundboard, but I have a problem when it comes to dragging a finger over the screen to play the sounds of the buttons it passes over. Does anyone know how I can detect when a finger enters a button without clicking the button?
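One approach is to listen for touches on the parent layout instead of on each button, and hit-test the moving finger against every child's bounds. A sketch; playSoundFor() and the class name are hypothetical hooks for the soundboard:

import android.graphics.Rect;
import android.view.MotionEvent;
import android.view.View;
import android.view.ViewGroup;

// Fires playSoundFor() each time the finger enters a different child view
// while being dragged across the parent layout.
public class DragOverDetector implements View.OnTouchListener {

    private View lastHit;

    public boolean onTouch(View parentView, MotionEvent event) {
        int action = event.getAction();
        if (action == MotionEvent.ACTION_UP || action == MotionEvent.ACTION_CANCEL) {
            lastHit = null;
            return true;
        }

        ViewGroup parent = (ViewGroup) parentView;
        Rect bounds = new Rect();
        int x = (int) event.getX();
        int y = (int) event.getY();

        for (int i = 0; i < parent.getChildCount(); i++) {
            View child = parent.getChildAt(i);
            child.getHitRect(bounds); // bounds relative to the parent
            if (bounds.contains(x, y)) {
                if (child != lastHit) { // finger just entered this button
                    lastHit = child;
                    playSoundFor(child);
                }
                return true;
            }
        }
        lastHit = null; // finger is between buttons
        return true;
    }

    protected void playSoundFor(View button) { }
}

For the parent to see the whole touch stream, the buttons themselves should not be clickable (or the parent should intercept the events); otherwise a child consumes the ACTION_DOWN before this listener ever runs.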
I've been trying out Google Sky Map and it seems a bit off. I hold my phone in landscape mode and point it at the moon as if I were going to take a picture of it. The application shows the moon about 5-10 degrees off from where it actually is. Is this just the accuracy of the compass and orientation detection of the phone?
This is really general; I'm very new to Bluetooth. Can one Android device broadcast a simple 'hello' Bluetooth presence (maybe with a radius of just a few feet), so that when other Android devices come into that area they can reply with a 'hello' back? The client devices moving through the 'hello' radius wouldn't need to constantly check for presence; I'd let the user open an app and check for Bluetooth devices nearby. If they find one of these hubs, they can then choose to broadcast a hello back. Is that at all possible? Any general info would be great.
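In broad strokes this is possible with classic Bluetooth discovery (API level 5+, BLUETOOTH and BLUETOOTH_ADMIN permissions). A hedged sketch of the passing device's side; the "HELLO-HUB" name prefix is an illustrative convention, not a protocol:

import android.bluetooth.BluetoothAdapter;
import android.bluetooth.BluetoothDevice;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;

// Runs a discovery scan and reacts when a device with a recognisable name
// shows up nearby.
public class HelloScanner extends BroadcastReceiver {

    public static void startScan(Context context, HelloScanner scanner) {
        context.registerReceiver(scanner, new IntentFilter(BluetoothDevice.ACTION_FOUND));
        BluetoothAdapter.getDefaultAdapter().startDiscovery();
    }

    @Override
    public void onReceive(Context context, Intent intent) {
        BluetoothDevice device = intent.getParcelableExtra(BluetoothDevice.EXTRA_DEVICE);
        String name = device.getName();
        if (name != null && name.startsWith("HELLO-HUB")) {
            // A hub is in range; the app could now offer to "say hello back",
            // e.g. by opening an RFCOMM socket to the hub.
        }
    }
}

On the hub side, the device would set a recognisable Bluetooth name and request discoverability (BluetoothAdapter.ACTION_REQUEST_DISCOVERABLE). Note that a typical class 2 radio reaches roughly 10 metres, so a radius of "a few feet" cannot really be enforced.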
I am currently working on a collision-detection routine for an Android-based game that must handle complex concave curves in 2D. I use "complex" here to describe any non-trivial, arbitrary shape (beyond circles, squares, etc.). My problem is that all of the methods I have come across are either too simplistic to be realistic or too heavy for a phone. At the moment I am favoring a tile-based scheme, but I am having trouble figuring out how to handle concave curves that span several tiles and may have several line segments per tile. I gave some thought to representing the curves mathematically, with each tile being an interval of the function, but there will likely be points where the curve doubles back on itself (think of the big loop at the top of a pinball table that brings the ball all the way around and back in the direction it came from). My questions boil down to:
1. Does anyone know of a way of hit-testing a given simple shape (e.g. a circle) against a concave series of line segments that returns more than just a Boolean result? (Shapes beyond circles can be figured out from there; see the sketch after this list.)
2. What is the best way to represent large, complex shapes programmatically? I am currently favoring an XML file describing my levels and the objects in them, with shape/position/physics data parsed in at load time.
3. Is there an altogether better way of doing this beyond line segments?
The one saving grace in this conundrum is that I know the vast majority of the objects are stationary, with at most three or four (usually one) dynamic objects moving around.
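For question 1, a circle against a chain of segments can return more than a Boolean: project the circle's centre onto each segment, and the penetration depth and contact normal fall out of the same computation. A sketch in plain Java, with no Android dependencies; a concave chain is just this test run over every segment (or only the segments in nearby tiles, if a tile grid is used for broad-phase culling):

// Circle-versus-line-segment test that returns penetration depth and the
// contact normal instead of a plain Boolean.
public final class CircleSegment {

    /** Returns penetration depth (> 0 on overlap, <= 0 means no hit). */
    public static float hit(float cx, float cy, float radius,
                            float ax, float ay, float bx, float by,
                            float[] outNormal) {
        float abx = bx - ax, aby = by - ay;
        float acx = cx - ax, acy = cy - ay;

        // Project the circle centre onto the segment and clamp to [0, 1].
        float lenSq = abx * abx + aby * aby;
        float t = lenSq == 0 ? 0 : (acx * abx + acy * aby) / lenSq;
        t = Math.max(0, Math.min(1, t));

        // Closest point on the segment to the circle centre.
        float px = ax + t * abx, py = ay + t * aby;
        float dx = cx - px, dy = cy - py;
        float dist = (float) Math.sqrt(dx * dx + dy * dy);

        if (dist > 0 && outNormal != null) {
            outNormal[0] = dx / dist; // direction to push the circle out
            outNormal[1] = dy / dist;
        }
        return radius - dist; // positive value = overlap depth
    }
}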
I am writing an application that will behave similarly to the existing voice recognition, but it will send the sound data to a proprietary web service to perform the speech-recognition part. I am using the standard MediaRecorder (AMR-NB encoded), which seems well suited to speech. The only data it provides is the amplitude, via the getMaxAmplitude() method. I am trying to detect when the person starts to talk, so that when the person stops talking for about 2 seconds I can proceed to send the sound data to the web service. Right now I use a threshold on the amplitude: if it goes over a value (e.g. 1500) I assume the person is speaking. My concern is that amplitude levels may vary by device (e.g. Nexus One vs. Droid), so I am looking for a more standard approach that can be derived from the amplitude values.
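One device-independent refinement is to calibrate against the device's own noise floor for the first second or so, then treat "speaking" as some multiple of that floor rather than an absolute number. A sketch; the poll interval, factor of 3, and hook name are all illustrative starting points:

import android.media.MediaRecorder;
import android.os.Handler;

// Polls getMaxAmplitude(), learns the background noise level first, then
// flags end-of-utterance after 2 seconds below (noise floor * SPEECH_FACTOR).
public class SpeechEndpointer {

    private static final int POLL_MS = 100;
    private static final int CALIBRATION_POLLS = 10;  // ~1 s of ambient noise
    private static final float SPEECH_FACTOR = 3.0f;  // speech = 3x noise floor
    private static final long SILENCE_MS = 2000;      // 2 s of silence ends it

    private final MediaRecorder recorder;
    private final Handler handler = new Handler();
    private double noiseFloor = 1;
    private long lastLoudTime;
    private boolean speaking;
    private int polls;

    public SpeechEndpointer(MediaRecorder recorder) {
        this.recorder = recorder;
    }

    public void start() {
        handler.post(poll);
    }

    private final Runnable poll = new Runnable() {
        public void run() {
            int amp = recorder.getMaxAmplitude(); // max since the previous call
            long now = System.currentTimeMillis();

            if (polls++ < CALIBRATION_POLLS) {
                // Calibration: assume the first second or so is background noise.
                noiseFloor = Math.max(noiseFloor, amp);
                lastLoudTime = now;
                handler.postDelayed(this, POLL_MS);
                return;
            }

            if (amp > noiseFloor * SPEECH_FACTOR) {
                speaking = true;
                lastLoudTime = now;
            } else if (speaking && now - lastLoudTime > SILENCE_MS) {
                onUtteranceFinished(); // hypothetical hook: send data to the web service
                return;
            }
            handler.postDelayed(this, POLL_MS);
        }
    };

    protected void onUtteranceFinished() { }
}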
According to the MS website, Windows Media Player queries an MTP device to see what formats it supports. My Razr Maxx HD supports OGG and M4A, but when attempting to sync, WMP refuses those files, stating they might not be supported on the device. Given how much trouble I've had in the past getting Android devices properly detected as MTP devices, I'm worried this is a driver issue. I should at least be able to get WMP to convert them.
Is this the device mis-reporting the formats, Windows not querying them properly, or an actual WMP issue? I would like to get sync working without any conversions. Maybe there is a way to hardcode formats within the MTP .inf file?
I just got an Android phone, and I love it. I was wondering if there is an app that records video but automatically starts recording when there is motion and stops when the motion goes away. I believe it's possible; I'm just wondering if it has been done.
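For anyone wanting to build this rather than find it: the usual building block is comparing successive camera preview frames. Preview data arrives in NV21 format, whose first width*height bytes are luminance, so a crude motion score is the summed luma difference between frames. A hedged sketch; the thresholds and the onMotion() hook are illustrative, and a real app would start a MediaRecorder from that hook:

import android.hardware.Camera;

// Crude frame-differencing motion detector driven by the camera preview.
public class MotionPreviewCallback implements Camera.PreviewCallback {

    private static final int PIXEL_DELTA = 25;        // per-pixel change that counts
    private static final double TRIGGER_RATIO = 0.02; // 2% of sampled pixels changed

    private byte[] previous;

    public void onPreviewFrame(byte[] data, Camera camera) {
        Camera.Size size = camera.getParameters().getPreviewSize();
        int lumaLength = size.width * size.height;

        if (previous != null) {
            int changed = 0;
            for (int i = 0; i < lumaLength; i += 4) { // sample every 4th pixel for speed
                if (Math.abs((data[i] & 0xFF) - (previous[i] & 0xFF)) > PIXEL_DELTA) {
                    changed++;
                }
            }
            if (changed > (lumaLength / 4) * TRIGGER_RATIO) {
                onMotion(); // hypothetical hook: start recording here
            }
        }
        previous = data.clone();
    }

    protected void onMotion() { }
}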
I am looking for an app for my Incredible that records video at a higher frame rate and then plays it back at regular speed so that the video plays in slow motion. My old LG Dare had this, and I used it quite often. Does anyone know of an app out there that will do this? I have searched the forum and the Market and cannot find what I am after.
In my app, users paint a path with their finger. An image should then move along this path through the view (it's not a straight path; it has waves). In Flash this is called a motion tween. Is there an equivalent for Android? How can I implement this?
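A rough equivalent of a motion tween is android.graphics.PathMeasure: it can walk any Path (including the wavy, finger-painted one) and hand back the x/y position at a given distance along it, so redrawing the image at that position each frame produces the animation. A sketch inside a custom View; the speed and frame delay are illustrative:

import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Path;
import android.graphics.PathMeasure;
import android.view.View;

// Moves a bitmap along an arbitrary finger-painted Path.
public class PathTweenView extends View {

    private final PathMeasure measure;
    private final Bitmap image;
    private final float[] pos = new float[2];
    private float distance; // how far along the path we are

    public PathTweenView(Context context, Path fingerPath, Bitmap image) {
        super(context);
        this.measure = new PathMeasure(fingerPath, false);
        this.image = image;
    }

    @Override
    protected void onDraw(Canvas canvas) {
        measure.getPosTan(distance, pos, null); // position at 'distance' along the path
        canvas.drawBitmap(image, pos[0] - image.getWidth() / 2f,
                pos[1] - image.getHeight() / 2f, null);

        distance += 5; // pixels per frame; tune for the desired speed
        if (distance < measure.getLength()) {
            postInvalidateDelayed(16); // roughly 60 frames per second
        }
    }
}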
Suppose I have two "main" subtrees of Views in my Activity's view hierarchy. Under some circumstances I want only one subtree to receive touch/key events. Is this possible with one or two calls, or do I have to recursively disable every View in the subtree that should stop receiving touch/key events?
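One way to avoid the recursive walk is to wrap each subtree in a container that can swallow input with a single call: touch events are intercepted at the parent, and key events are blocked by refusing focus for the descendants. A sketch; the class name is illustrative:

import android.content.Context;
import android.util.AttributeSet;
import android.view.MotionEvent;
import android.widget.FrameLayout;

// Wrapper layout that can cut a whole subtree off from touch and key input.
public class BlockableLayout extends FrameLayout {

    private boolean blocked;

    public BlockableLayout(Context context) {
        super(context);
    }

    public BlockableLayout(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    /** One call to enable or disable input for everything inside this layout. */
    public void setBlocked(boolean blocked) {
        this.blocked = blocked;
        setDescendantFocusability(blocked
                ? FOCUS_BLOCK_DESCENDANTS   // key events can no longer reach children
                : FOCUS_AFTER_DESCENDANTS);
    }

    @Override
    public boolean onInterceptTouchEvent(MotionEvent ev) {
        return blocked; // steal the touch stream before children see it
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        return blocked; // consume it so nothing else handles it
    }
}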
I would like to detect when the phone is in motion, but not every kind of motion: for example, picking up or waving the phone should not trigger it. Ideally I would like to run some code when the phone/person is in a "walking" state. What options do I have?
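A heuristic sketch, not a robust classifier: walking shows up as roughly periodic peaks (about 1-2 Hz) in the acceleration magnitude, whereas picking up or waving the phone produces isolated or irregular spikes. All constants below are rough starting points that would need tuning:

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

// Counts rhythmic acceleration peaks and treats a run of them as "walking".
public class WalkDetector implements SensorEventListener {

    private static final float PEAK_THRESHOLD = 11.5f; // m/s^2, a bit above gravity
    private static final int STEPS_REQUIRED = 4;       // consecutive rhythmic peaks

    private long lastPeakTime;
    private int rhythmicPeaks;

    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;
        float x = event.values[0], y = event.values[1], z = event.values[2];
        double magnitude = Math.sqrt(x * x + y * y + z * z);
        long now = System.currentTimeMillis();

        if (magnitude > PEAK_THRESHOLD && now - lastPeakTime > 250) {
            long gap = now - lastPeakTime;
            lastPeakTime = now;
            // A step cadence of ~1-2 Hz means 400-1000 ms between peaks.
            if (gap > 400 && gap < 1000) {
                if (++rhythmicPeaks >= STEPS_REQUIRED) {
                    rhythmicPeaks = 0;
                    onWalking(); // hypothetical hook
                }
            } else {
                rhythmicPeaks = 0; // irregular spike: waving, picking up, etc.
            }
        }
    }

    protected void onWalking() { }

    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}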
Is it possible to detect that the viewport is being dragged by a touch event?
Using the following code I am able to get the position where the finger touched the screen, which node started the event, and where it was dragged to. That solves part of the problem, but I would really like to detect when the user has dragged the page/window/viewport down.
To be clearer about what I am trying to do: I would like to simulate the pull-to-refresh behavior of Tweetie 2/Twitter for iPhone, but in HTML5 and JavaScript. Code....
I need left and right motion events plus a click event. The problem is that if I have an onClick handler, onTouchEvent doesn't fire; onTouchEvent works only if I disable the onClick handler, and even then it doesn't do what I expect (see the sketch after this list):
1. Only ACTION_MOVE fires, and it fires twice every time I move on the screen.
2. I can't get left and right motion.
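One common fix is to keep the OnClickListener for taps and let a GestureDetector watch the raw touch stream for horizontal flings, instead of doing both jobs inside onTouchEvent. Returning false from the touch listener lets the click still be delivered. A sketch; the distance and velocity thresholds are illustrative:

import android.content.Context;
import android.view.GestureDetector;
import android.view.MotionEvent;
import android.view.View;

// Detects left/right flings without breaking the view's normal click handling.
public class SwipeAndClick implements View.OnTouchListener {

    private final GestureDetector detector;

    public SwipeAndClick(Context context) {
        detector = new GestureDetector(context, new GestureDetector.SimpleOnGestureListener() {
            @Override
            public boolean onFling(MotionEvent e1, MotionEvent e2,
                                   float velocityX, float velocityY) {
                float dx = e2.getX() - e1.getX();
                if (Math.abs(dx) > 100 && Math.abs(velocityX) > 200) {
                    if (dx > 0) onSwipeRight(); else onSwipeLeft();
                    return true;
                }
                return false;
            }
        });
    }

    public boolean onTouch(View v, MotionEvent event) {
        detector.onTouchEvent(event);
        return false; // don't consume, so the OnClickListener still fires on taps
    }

    protected void onSwipeLeft() { }
    protected void onSwipeRight() { }
}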
I've been reading through the docs, and I cannot figure out how to cancel a "drag" on a SeekBar once the bar has been completed.
For example, when I drag the bar to the end, I want an event to happen. However, if I keep my finger on the screen and drag past the SeekBar, it keeps firing that event even after I reset the progress of the SeekBar. How can I prevent this from happening?
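A simple guard flag in the OnSeekBarChangeListener handles this: fire once when the bar reaches its maximum, then ignore everything until the finger is lifted (onStopTrackingTouch), so continuing to drag past the SeekBar cannot re-trigger the event. A sketch; the class and hook names are illustrative:

import android.widget.SeekBar;

// Fires onBarCompleted() at most once per touch, even if the finger keeps
// moving past the end of the SeekBar.
public class CompleteOnceListener implements SeekBar.OnSeekBarChangeListener {

    private boolean firedThisTouch;

    public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
        if (fromUser && !firedThisTouch && progress >= seekBar.getMax()) {
            firedThisTouch = true;
            onBarCompleted(seekBar);  // hypothetical hook
            seekBar.setProgress(0);   // reset without re-arming the trigger
        }
    }

    public void onStartTrackingTouch(SeekBar seekBar) {
        firedThisTouch = false; // new touch, allow one trigger again
    }

    public void onStopTrackingTouch(SeekBar seekBar) {
        firedThisTouch = false;
    }

    protected void onBarCompleted(SeekBar seekBar) { }
}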
I am developing an input method with full-screen views. How can I send a grabbed motion event on to the client application?
View 2 Replies View RelatedMy Activity using transparent Theme in AndroidManifest.xml *android:theme="@android:style/Theme.Translucent"* Which works fine. When it's started from the Home screen, we can see both the content of my activity and the home screen below it. It gives the feeling of "floating". Furthermore, to make it real "floating", the motionEvent has to go down to the home screen to let the home screen response some motion touch. But I have no idea of 2 things : 1. Whether the event can be send down to any thing behind the current activity. 2. If so, how. The solution don't need to be restricted in API level.I mean if any solution need us to rewrite home screen or something else, It's also ok for me.
I would like a program that takes pictures with the camera on my Samsung Galaxy S. I do some long walks, and I'm hoping someone can make a program that takes a picture at preset time intervals so I can join all the pictures into a movie. I was thinking of just hanging the phone from a strap around my neck; the camera would take a picture every, say, 10 seconds and save the image to the SD card.
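For anyone who wants to build this, a sketch of just the capture loop (a live SurfaceView preview, the CAMERA/WRITE_EXTERNAL_STORAGE permissions, and file naming are assumed to be handled elsewhere; the interval and path are illustrative):

import android.hardware.Camera;
import android.os.Handler;

import java.io.FileOutputStream;

// Takes a JPEG every INTERVAL_MS, saves it, restarts the preview, and
// schedules the next shot.
public class TimelapseLoop {

    private static final long INTERVAL_MS = 10000; // every 10 seconds

    private final Camera camera;
    private final Handler handler = new Handler();
    private int shotNumber;

    public TimelapseLoop(Camera openedCamera) {
        this.camera = openedCamera; // preview must already be running
    }

    public void start() {
        handler.post(shoot);
    }

    private final Runnable shoot = new Runnable() {
        public void run() {
            camera.takePicture(null, null, new Camera.PictureCallback() {
                public void onPictureTaken(byte[] jpeg, Camera cam) {
                    try {
                        // Illustrative path; a real app should use the media directories.
                        FileOutputStream out = new FileOutputStream(
                                "/sdcard/timelapse_" + (shotNumber++) + ".jpg");
                        out.write(jpeg);
                        out.close();
                    } catch (Exception e) {
                        // ignored in this sketch
                    }
                    cam.startPreview(); // preview stops after takePicture()
                    handler.postDelayed(shoot, INTERVAL_MS);
                }
            });
        }
    };
}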
This is the code for motion events in my instrumentation testing:
long downTime = SystemClock.uptimeMillis();
long eventTime = SystemClock.uptimeMillis();
MotionEvent event = MotionEvent.obtain(downTime, eventTime,
        MotionEvent.ACTION_DOWN, 100, 100, 0);
MotionEvent event2 = MotionEvent.obtain(downTime, eventTime,
        MotionEvent.ACTION_UP, 100, 100, 0);
...
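The snippet is cut off above; a common way to finish it inside an InstrumentationTestCase (a hedged sketch, not necessarily what the original code continued with) is to inject the two events with Instrumentation.sendPointerSync():

// Inject the synthesized tap into the focused window. sendPointerSync() must
// be called from the instrumentation thread, which is the case inside an
// InstrumentationTestCase test method.
Instrumentation inst = getInstrumentation();
inst.sendPointerSync(event);   // delivers the ACTION_DOWN
inst.sendPointerSync(event2);  // delivers the ACTION_UP, completing the tap
event.recycle();
event2.recycle();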