Android Camera2 API

If you are building an app with the Android Camera2 API, here are my thoughts after fighting with it. (Full code is at the bottom for the copy-paste junkies.)

This API was a little verbose for me, weighing in at about 1,200 lines of code. It could probably be done more simply, but if you want something custom, here is what you might end up with. I copied this code from the official GitHub example, with the full-blown example here.

 

Here is my code all folded up, with some clear descriptions of what everything does. The code below is essentially just the basic camera example twisted and reorganized so it makes sense to me. Notice it is all contained in a fragment.

 

android camera2 api

There are three things someone using my version, or the original GitHub version, would need to change. If you are tackling this project, don't hesitate to copy the code on this page and focus on the changes you need instead of trying to wrap your head around the whole project.

The first is the button setup. I'm not really interested in diving into this; check my code's "Camera Still Picture Chain" region and you can see how the events are initiated.

The second is the save method (see "Inner Classes" in my code folds). The example gives you a runnable ImageSaver, which will probably need to be reworked to fit your file storage system, or to handle the image for further processing. When working with large image files it's best to save the file, pass a URI around, and load smaller samples of the image to reduce heap size.
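
To illustrate that last point, here is a minimal sketch (my own, not part of the Google sample) of loading a downsampled bitmap back from the saved file's URI; the helper name, the requested size parameters, and the two-pass decode are just one way to do it.

 private Bitmap decodeSampledBitmap(Uri uri, int reqWidth, int reqHeight) throws IOException
    {
        // Hypothetical helper: decode a smaller version of a saved image so the
        // full-resolution JPEG never has to sit on the heap.
        ContentResolver resolver = getActivity().getContentResolver();

        // First pass: read only the image bounds.
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inJustDecodeBounds = true;
        InputStream in = resolver.openInputStream(uri);
        BitmapFactory.decodeStream(in, null, options);
        in.close();

        // Pick the largest power-of-two sample size that keeps both dimensions above the request.
        int inSampleSize = 1;
        while ((options.outWidth / (inSampleSize * 2)) >= reqWidth
                && (options.outHeight / (inSampleSize * 2)) >= reqHeight) {
            inSampleSize *= 2;
        }

        // Second pass: decode the real bitmap at the reduced resolution.
        options.inJustDecodeBounds = false;
        options.inSampleSize = inSampleSize;
        in = resolver.openInputStream(uri);
        Bitmap sampled = BitmapFactory.decodeStream(in, null, options);
        in.close();
        return sampled;
    }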

Third, why does Samsung spin the dang images? This took me a while to figure out, and I was super upset about it. Here is the code my "Image Review" fragment used to rotate and display the image the right way. I believe this was pulled together from several sources, and I have no idea who to give credit to.

 private void rotateImage(int degree)
    {
        Log.d(TAG, "rotateImage: ");

        Matrix mat = new Matrix();
        mat.postRotate(degree);
        bitmapToReview = Bitmap.createBitmap(bitmapToReview, 0,0,bitmapToReview.getWidth(), bitmapToReview.getHeight(), mat, true);

    }

    private void createPreviewImage()
    {
        //get exif data and make bitmap
        int orientation = 0;
        try {
            ExifInterface exifInterface = new ExifInterface(uriOfImage.getPath());
            bitmapToReview = MediaStore.Images.Media.getBitmap(getActivity().getContentResolver(), uriOfImage);
            orientation =  exifInterface.getAttributeInt(ExifInterface.TAG_ORIENTATION, ExifInterface.ORIENTATION_NORMAL);
        }catch (Exception e){
            Log.e(TAG, "createPreviewImage: ", e);
            Crashlytics.log(TAG + " " + e);
            Toast.makeText(getActivity(), "Error loading image", Toast.LENGTH_SHORT).show();
        }

        //check rotation and rotate if needed
        switch (orientation){

            case ExifInterface.ORIENTATION_ROTATE_90:
                Log.d(TAG, "createPreviewImage: 90");
                rotateImage(90);
                break;

            case ExifInterface.ORIENTATION_ROTATE_180:
                Log.d(TAG, "createPreviewImage: 180");
                rotateImage(180);
                break;

            case ExifInterface.ORIENTATION_ROTATE_270:
                Log.d(TAG, "createPreviewImage: 270");
                rotateImage(270);
                break;



        }

        //display on screen
        imageView_Preview.setImageBitmap(bitmapToReview);


    }

 

So that's it. This post was basically to complain that I spent a week retyping this entire thing out to prove that I could tame it. In reality, I licked my wounds and moved on with my life, because sometimes there are more important things to do than fight the system.

 

 

For the full code, see below, and try not to be frightened.

import android.Manifest;
import android.app.Activity;
import android.app.AlertDialog;
import android.app.Dialog;
import android.app.DialogFragment;
import android.app.Fragment;
import android.content.Context;
import android.content.DialogInterface;
import android.content.pm.PackageManager;
import android.content.res.Configuration;
import android.graphics.ImageFormat;
import android.graphics.Matrix;
import android.graphics.Point;
import android.graphics.RectF;
import android.graphics.SurfaceTexture;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.CaptureResult;
import android.hardware.camera2.TotalCaptureResult;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.Image;
import android.media.ImageReader;
import android.net.Uri;
import android.os.Bundle;
import android.os.Handler;
import android.os.HandlerThread;
import android.support.annotation.NonNull;
import android.support.annotation.Nullable;
import android.support.design.widget.FloatingActionButton;
import android.support.design.widget.Snackbar;
import android.support.v4.app.ActivityCompat;
import android.support.v4.content.ContextCompat;
import android.util.Log;
import android.util.SparseIntArray;
import android.view.LayoutInflater;
import android.view.Surface;
import android.view.TextureView;
import android.view.View;
import android.view.ViewGroup;
import android.widget.Toast;

import com.crashlytics.android.Crashlytics;
import com.signal.cagney.easyreceipt.AutoFitTextureView;
import com.signal.cagney.easyreceipt.EasyReceipt;
import com.signal.cagney.easyreceipt.MainActivity;
import com.signal.cagney.easyreceipt.R;
import com.signal.cagney.easyreceipt.Util.FileManager;
import com.squareup.leakcanary.RefWatcher;

import java.io.File;
import java.io.FileOutputStream;
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

public class Main_Fragment extends Fragment implements ActivityCompat.OnRequestPermissionsResultCallback{

private static final String TAG = "MAIN_FRAGMENT";
View myFragmentView;
private AutoFitTextureView mTextureView;

boolean currentlyCapturing;

public final static int GALLERY_CHOOSE = 12;

FileManager fileManager;

//region------------------------camera states

private static final int STATE_PREVIEW = 0;

/**
* Camera state: Waiting for the focus to be locked.
*/
private static final int STATE_WAITING_LOCK = 1;

/**
* Camera state: Waiting for the exposure to be precapture state.
*/
private static final int STATE_WAITING_PRECAPTURE = 2;

/**
* Camera state: Waiting for the exposure state to be something other than precapture.
*/
private static final int STATE_WAITING_NON_PRECAPTURE = 3;

/**
* Camera state: Picture was taken.
*/
private static final int STATE_PICTURE_TAKEN = 4;

/**
* Max preview width that is guaranteed by Camera2 API
*/
private static final int MAX_PREVIEW_WIDTH = 1920;

/**
* Max preview height that is guaranteed by Camera2 API
*/
private static final int MAX_PREVIEW_HEIGHT = 1080;

//endregion

//region------------------------------------------------------- camera fields

private CameraDevice mCameraDevice;
private CaptureRequest.Builder previewBuilder;
private CaptureRequest mPreviewRequest;
private CameraCaptureSession mCameraCaptureSession;

private static final SparseIntArray ORIENTATIONS = new SparseIntArray();
private static final int REQUEST_CAMERA_PERMISSIONS = 1;
private static final String FRAGMENT_DIALOG = "dialog";

private ImageReader imageReader;
private int mSensorOrientation;
private Handler mBackgroundHandler;
private int mState = STATE_PREVIEW;
private Semaphore mCameraOpenCloseLock = new Semaphore(1);
private String mCameraId;
private HandlerThread mBackgroundThread;
private boolean mFlashSupported;

private android.util.Size mPreviewSize;

static {

ORIENTATIONS.append(Surface.ROTATION_0, 90);
ORIENTATIONS.append(Surface.ROTATION_90, 0);
ORIENTATIONS.append(Surface.ROTATION_180, 270);
ORIENTATIONS.append(Surface.ROTATION_270, 180);
}

private final CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(@NonNull CameraDevice cameraDevice) {
Log.d(TAG, "onOpened: ");
mCameraOpenCloseLock.release();
mCameraDevice = cameraDevice;
creatCameraPreviewSession();

}

@Override
public void onDisconnected(@NonNull CameraDevice cameraDevice) {
Log.d(TAG, "onDisconnected: ");
mCameraOpenCloseLock.release();
cameraDevice.close();
mCameraDevice =null;

}

@Override
public void onError(@NonNull CameraDevice cameraDevice, int i) {
Log.d(TAG, "onError: ");
mCameraOpenCloseLock.release();
cameraDevice.close();
mCameraDevice = null;
Activity activity = getActivity();
if (null != activity){
activity.finish();
}

}
};

private final ImageReader.OnImageAvailableListener mOnImageAvailableListener
= new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader imageReader) {
Log.d(TAG, "onImageAvailable: ");
Image image = imageReader.acquireNextImage();

mBackgroundHandler.post(new ImageSaver(image, fileManager ));

}
};

private CameraCaptureSession.CaptureCallback mCaptureCallback = new CameraCaptureSession.CaptureCallback() {

private void process(CaptureResult result)
{

switch (mState){

case STATE_PREVIEW: {
//working normal. do nothing
//Log.d(TAG, "process: " + result.toString());
break;
}

case STATE_WAITING_LOCK: {
Integer afState = result.get(CaptureResult.CONTROL_AF_STATE);
Log.d(TAG, "process: state awaiting afstate = " + String.valueOf(afState) + " Captureresult = " + result.toString());

if (afState == null || afState == CaptureResult.CONTROL_MODE_OFF ) {
Log.d(TAG, "process: null");
captureStillPicture();
} else if (CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED == afState ||
CaptureResult.CONTROL_AF_STATE_NOT_FOCUSED_LOCKED == afState) {
Log.d(TAG, "process: something else");
Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
if (aeState == null ||
aeState == CaptureResult.CONTROL_AE_STATE_CONVERGED) {
Log.d(TAG, "process: something even more");
mState = STATE_PICTURE_TAKEN;
captureStillPicture();
} else {

runPreCaptureSequence();
}

}

break;
}

case STATE_WAITING_PRECAPTURE: {

Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
Log.d(TAG, "process: precapture " + String.valueOf(aeState) + " Captureresult = " + result.toString());
if (aeState == null ||
aeState == CaptureResult.CONTROL_AE_STATE_PRECAPTURE ||
aeState == CaptureRequest.CONTROL_AE_STATE_FLASH_REQUIRED) {
mState = STATE_WAITING_NON_PRECAPTURE;
}

break;
}

case STATE_WAITING_NON_PRECAPTURE: {

Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);

Log.d(TAG, "process: non-precapture" + String.valueOf(aeState) + " Captureresult = " + result.toString());

if (aeState == null || aeState != CaptureResult.CONTROL_AE_STATE_PRECAPTURE){
mState =STATE_PICTURE_TAKEN;
captureStillPicture();
}

break;
}

}
}

@Override
public void onCaptureProgressed(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull CaptureResult partialResult) {
//Log.d(TAG, "onCaptureProgressed: ");
process(partialResult);

}

@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
//Log.d(TAG, "onCaptureCompleted: callback");
process(result);
}
};

private final TextureView.SurfaceTextureListener mSurfaceTextureListener
= new TextureView.SurfaceTextureListener() {
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int i, int i1) {
Log.d(TAG, "onSurfaceTextureAvailable: ");
openCamera(i, i1);

}

@Override
public void onSurfaceTextureSizeChanged(SurfaceTexture surfaceTexture, int i, int i1) {
Log.d(TAG, "onSurfaceTextureSizeChanged: ");
configureTransform(i, i1);
}

@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture surfaceTexture) {
Log.d(TAG, "onSurfaceTextureDestroyed: ");
return false;
}

@Override
public void onSurfaceTextureUpdated(SurfaceTexture surfaceTexture) {
//Log.d(TAG, "onSurfaceTextureUpdated: ");
}
};

//endregion

//region------------------------------------------------------------------- Fragment Setup

@Override
public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) {
myFragmentView = inflater.inflate(R.layout.main_frag_layout, container,false);

setupUI();

return myFragmentView;

}

@Override
public void onViewCreated(View view, @Nullable Bundle savedInstanceState) {
super.onViewCreated(view, savedInstanceState);
}

@Override
public void onActivityCreated(@Nullable Bundle savedInstanceState) {
super.onActivityCreated(savedInstanceState);

fileManager = ((MainActivity)getActivity()).getFileManager();

//mFile = newPictureFileName();

}

private void setupUI()
{
mTextureView = (AutoFitTextureView) myFragmentView.findViewById(R.id.texture);

FloatingActionButton fabGall = (FloatingActionButton) myFragmentView.findViewById(R.id.fabGallery);
fabGall.setImageResource(R.drawable.folder);
fabGall.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {

if (notToBusyToComply()){
((MainActivity)getActivity()).openGallery();
}

/*
Snackbar.make(view, "Replace with your own action", Snackbar.LENGTH_LONG)
.setAction("Action", null).show();
*/
}
});

FloatingActionButton fabPic = (FloatingActionButton) myFragmentView.findViewById(R.id.fabTakePicture);
fabPic.setImageResource(R.drawable.camera);
fabPic.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {

if (notToBusyToComply()){
takePicture();
}

}
});

}

//endregion

//region------------------------------------------------------------------- Camera Main Methods

private void openCamera(int width, int height)
{
Log.d(TAG, "openCamera: ");
if (ContextCompat.checkSelfPermission(getActivity(), android.Manifest.permission.CAMERA)
!= PackageManager.PERMISSION_GRANTED){
requestCameraPermission();
return;
}

Log.d(TAG, "openCamera: setup");
setUpCameraOutputs(width, height);
Log.d(TAG, "openCamera: configure");
configureTransform(width, height);
Activity activity = getActivity();
CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);

try{

if ( !mCameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)){
throw new RuntimeException("Time out waiting to lock camera opening");
}

manager.openCamera(mCameraId, mStateCallback, mBackgroundHandler);

}catch (CameraAccessException e){
e.printStackTrace();
}catch (InterruptedException e){
throw new RuntimeException("Interupted while trying to lock camera opening", e);
}

}

private void closeCamera()
{
Log.d(TAG, "closeCamera: ");
try {
mCameraOpenCloseLock.acquire();
if (null != mCameraCaptureSession) {
mCameraCaptureSession.close();
mCameraCaptureSession = null;
}
if (null != mCameraDevice) {
mCameraDevice.close();
mCameraDevice = null;
}
if (null != imageReader) {
imageReader.close();
imageReader = null;
}
} catch (InterruptedException e) {
throw new RuntimeException("Interrupted while trying to lock camera closing.", e);
} finally {
mCameraOpenCloseLock.release();
}

}

private void creatCameraPreviewSession()
{
Log.d(TAG, "creatCameraPreviewSession: ");
try {

SurfaceTexture texture = mTextureView.getSurfaceTexture();
assert texture != null;

texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());

Surface surface = new Surface(texture);

previewBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
previewBuilder.addTarget(surface);

mCameraDevice.createCaptureSession(Arrays.asList(surface, imageReader.getSurface()),
new CameraCaptureSession.StateCallback() {

@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
Log.d(TAG, "onConfigured: ");
if (null == mCameraDevice){
return;
}

mCameraCaptureSession = cameraCaptureSession;

try{

previewBuilder.set(CaptureRequest.CONTROL_AF_MODE,
CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);

mPreviewRequest = previewBuilder.build();
mCameraCaptureSession.setRepeatingRequest(mPreviewRequest,
mCaptureCallback, mBackgroundHandler);

}catch (CameraAccessException e){
Log.e(TAG, "onConfigured: ", e);
Crashlytics.log(TAG + " " + e);
}

}

@Override
public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
showToast("Failed Preview");
}
}, null);

}catch (CameraAccessException e){
Log.e(TAG, "creatCameraPreviewSession: ", e);
Crashlytics.log(TAG + " " + e);
}

}

private void takePicture()
{
Log.d(TAG, "takePicture: capture chain 1");
//mFile = newPictureFileName();

lockFocus();

}

//endregion

//region------------------------------------------------------------------- Camera Still Picture Chain

private void lockFocus()
{
Log.d(TAG, "lockFocus: capture chain 2");

try{

previewBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER,
CameraMetadata.CONTROL_AF_TRIGGER_START);

mState = STATE_WAITING_LOCK;

mCameraCaptureSession.capture(previewBuilder.build(), mCaptureCallback,
mBackgroundHandler);

} catch (CameraAccessException e){
Log.e(TAG, "lockFocus: ", e);
Crashlytics.log(TAG + " " + e);
}

}

private void runPreCaptureSequence()
{
Log.d(TAG, "runPreCaptureSequence: capture chain 3");

try{

previewBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER,
CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_START);

mState = STATE_WAITING_PRECAPTURE;
mCameraCaptureSession.capture(previewBuilder.build(), mCaptureCallback,
mBackgroundHandler);

}catch (CameraAccessException e){
Log.e(TAG, "runPreCaptureSequence: ", e);
Crashlytics.log(TAG + " " + e);
}

}

private void captureStillPicture()
{

if (currentlyCapturing){
Log.d(TAG, "captureStillPicture: returning");
return;
}
Log.d(TAG, "captureStillPicture: capture chain 4");
//currentlyCapturing = true;

try{
final Activity activity = getActivity();
if (null == activity || null == mCameraDevice){
Log.d(TAG, "captureStillPicture: null checks");
return;
}

final CaptureRequest.Builder captureBuilder =
mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(imageReader.getSurface());

captureBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);

int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, getOrientation(rotation));

CameraCaptureSession.CaptureCallback captureCallback = new CameraCaptureSession.CaptureCallback() {

@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);

Log.d(TAG, "onCaptureCompleted: from chain 4");
unlockFocus();
//currentlyCapturing = false;

}
};

mCameraCaptureSession.stopRepeating();
mCameraCaptureSession.abortCaptures();
mCameraCaptureSession.capture(captureBuilder.build(), captureCallback, null);

}catch (CameraAccessException cae){
Log.e(TAG, "captureStillPicture: ", cae);
Crashlytics.log(TAG + " " + cae);
}

}

//endregion

//region------------------------------------------------------------------- Camera Supporting Methods

private int getOrientation(int rotation)
{
int returnValue = (ORIENTATIONS.get(rotation) + mSensorOrientation + 270) % 360;

Log.d(TAG, "getOrientation: in " + String.valueOf(rotation) + " out " + String.valueOf(returnValue));

return returnValue;
}

private void unlockFocus()
{

try {

previewBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER,
CameraMetadata.CONTROL_AF_TRIGGER_CANCEL);
mCameraCaptureSession.capture(previewBuilder.build(), mCaptureCallback,
mBackgroundHandler);

mState = STATE_PREVIEW;

mCameraCaptureSession.setRepeatingRequest(mPreviewRequest, mCaptureCallback, mBackgroundHandler);

} catch (CameraAccessException cae){
Log.e(TAG, "unlockFocus: ", cae );
Crashlytics.log(TAG + " " + cae);
}

}

private void requestCameraPermission()
{

if (ContextCompat.checkSelfPermission(getActivity(), Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED){

new ConfirmationDialog().show(getChildFragmentManager(), FRAGMENT_DIALOG);
}else {

Snackbar.make(myFragmentView, "Camera Permissions Already Granted", Snackbar.LENGTH_SHORT).setAction("action", null).show();
}

}

@SuppressWarnings("SuspiciousNameCombination")
private void setUpCameraOutputs(int width, int height)
{
Log.d(TAG, "setUpCameraOutputs: ");

Activity activity = getActivity();
CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
try{

for (String cameraID :
manager.getCameraIdList()) {

CameraCharacteristics characteristics
= manager.getCameraCharacteristics(cameraID);

Integer frontFacing = characteristics.get(CameraCharacteristics.LENS_FACING);
if (frontFacing != null && frontFacing == CameraCharacteristics.LENS_FACING_FRONT){
continue;
}

StreamConfigurationMap map = characteristics.get(
CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
if (map== null){
continue;
}

android.util.Size largest = Collections.max(
Arrays.asList(map.getOutputSizes(ImageFormat.JPEG)), new CompareSizesByArea());
imageReader = ImageReader.newInstance(largest.getWidth(), largest.getHeight(),
ImageFormat.JPEG, 2);
imageReader.setOnImageAvailableListener(mOnImageAvailableListener, mBackgroundHandler);

// Find out if we need to swap dimension to get the preview size relative to sensor
// coordinate.
int displayRotation = activity.getWindowManager().getDefaultDisplay().getRotation();
//noinspection ConstantConditions
mSensorOrientation = characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
boolean swappedDimensions = false;
switch (displayRotation) {
case Surface.ROTATION_0:
case Surface.ROTATION_180:
if (mSensorOrientation == 90 || mSensorOrientation == 270) {
swappedDimensions = true;
}
break;
case Surface.ROTATION_90:
case Surface.ROTATION_270:
if (mSensorOrientation == 0 || mSensorOrientation == 180) {
swappedDimensions = true;
}
break;
default:
Log.e(TAG, "Display rotation is invalid: " + displayRotation);
Crashlytics.log(TAG + " " + displayRotation);
}

Point displaySize = new Point();
activity.getWindowManager().getDefaultDisplay().getSize(displaySize);
int rotatedPreviewWidth = width;
int rotatedPreviewHeight = height;
int maxPreviewWidth = displaySize.x;
int maxPreviewHeight = displaySize.y;

if (swappedDimensions) {
rotatedPreviewWidth = height;
rotatedPreviewHeight = width;
maxPreviewWidth = displaySize.y;
maxPreviewHeight = displaySize.x;
}

if (maxPreviewWidth > MAX_PREVIEW_WIDTH) {
maxPreviewWidth = MAX_PREVIEW_WIDTH;
}

if (maxPreviewHeight > MAX_PREVIEW_HEIGHT) {
maxPreviewHeight = MAX_PREVIEW_HEIGHT;
}

mPreviewSize = chooseOptimalSize(map.getOutputSizes(SurfaceTexture.class),
rotatedPreviewWidth, rotatedPreviewHeight, maxPreviewWidth,
maxPreviewHeight, largest);

// We fit the aspect ratio of TextureView to the size of preview we picked.
int orientation = getResources().getConfiguration().orientation;
if (orientation == Configuration.ORIENTATION_LANDSCAPE) {
mTextureView.setAspectRatio(
mPreviewSize.getWidth(), mPreviewSize.getHeight());
} else {
mTextureView.setAspectRatio(
mPreviewSize.getHeight(), mPreviewSize.getWidth());
}

// Check if the flash is supported.
Boolean available = characteristics.get(CameraCharacteristics.FLASH_INFO_AVAILABLE);
mFlashSupported = available == null ? false : available;

mCameraId = cameraID;
return;

}

} catch (CameraAccessException e){
e.printStackTrace();
}catch (NullPointerException e){
ErrorDialog.newInstance(getString(R.string.camera_error))
.show(getChildFragmentManager(), FRAGMENT_DIALOG);
}

}

private void configureTransform(int viewWidth, int viewHeight)
{
Log.d(TAG, "configureTransform: ");

Activity activity = getActivity();

if (null == mTextureView || null == mPreviewSize || null == activity){

return;
}

int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
Matrix matrix = new Matrix();
RectF viewRect = new RectF(0,0, viewWidth, viewHeight);
RectF bufferRect = new RectF(0,0, mPreviewSize.getHeight(), mPreviewSize.getWidth());
float centerX = viewRect.centerX();
float centerY = viewRect.centerY();

if (Surface.ROTATION_90 == rotation || Surface.ROTATION_270 == rotation){

bufferRect.offset(centerX - bufferRect.centerX(), centerY - bufferRect.centerY());
matrix.setRectToRect(viewRect, bufferRect, Matrix.ScaleToFit.FILL);
float scale = Math.max(
(float) viewHeight / mPreviewSize.getHeight(),
(float) viewWidth / mPreviewSize.getWidth());

matrix.postScale(scale, scale, centerX, centerY);
matrix.postRotate(90 * (rotation - 2), centerX, centerY);
} else if (Surface.ROTATION_180 == rotation){
matrix.postRotate(180, centerX ,centerY);
}

mTextureView.setTransform(matrix);

}

private static android.util.Size chooseOptimalSize(android.util.Size[] choices, int textureViewWidth,
int textureViewHeight, int maxWidth, int maxHeight,
android.util.Size aspectRatio)
{

Log.d(TAG, "chooseOptimalSize: ");
List<android.util.Size> bigEnough = new ArrayList<>();

List<android.util.Size> notBigEnough = new ArrayList<>();

int w = aspectRatio.getWidth();
int h = aspectRatio.getHeight();

for (android.util.Size option : choices){

if (option.getWidth() <= maxWidth && option.getHeight() <= maxHeight &&
option.getHeight() == option.getWidth() * h / w){
if (option.getWidth() >= textureViewWidth &&
option.getHeight() >= textureViewHeight){
bigEnough.add(option);
}else {
notBigEnough.add(option);
}

}

}

if (bigEnough.size() > 0){

return Collections.min(bigEnough, new CompareSizesByArea());

} else if (notBigEnough.size() > 0 ){

return Collections.max(notBigEnough, new CompareSizesByArea());

}else {
Log.e(TAG, "chooseOptimalSize: couldnt find suitable preview size");
Crashlytics.log(TAG + " " + "chooseOptimalSize: couldnt find suitable preview size");
return choices[0];
}

}

// TODO: 6/1/2018 set auto flash

//endregion

//region------------------------------------------------------------------- Lesser Methods

private void showToast(final String text)
{
Log.d(TAG, "showToast: ");
final Activity activity = getActivity();
if (activity != null){

activity.runOnUiThread(new Runnable() {
@Override
public void run() {
Toast.makeText(activity, text, Toast.LENGTH_SHORT).show();
}
});

}

}

@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
super.onRequestPermissionsResult(requestCode, permissions, grantResults);
}

private void startBackgroundThread()
{
Log.d(TAG, "startBackgroundThread: ");

mBackgroundThread = new HandlerThread("CameraBackground");
mBackgroundThread.start();
mBackgroundHandler = new Handler(mBackgroundThread.getLooper());

}

private void stopBackgroundThread()
{
Log.d(TAG, "stopBackgroundThread: ");
mBackgroundThread.quitSafely();
try {
mBackgroundThread.join();
mBackgroundThread = null;
mBackgroundHandler = null;
} catch (InterruptedException e) {
e.printStackTrace();
}

}

private boolean notToBusyToComply()
{
Log.d(TAG, "notToBusyToComply: ");
return ((MainActivity)getActivity()).notToBusyToComply();

}

//endregion

//region------------------------------------------------------------------- LifeCycle

@Override
public void onResume() {
super.onResume();

startBackgroundThread();

if (mTextureView.isAvailable()) {
openCamera(mTextureView.getWidth(), mTextureView.getHeight());
} else {
mTextureView.setSurfaceTextureListener(mSurfaceTextureListener);
}
}

@Override
public void onPause() {
closeCamera();
stopBackgroundThread();
super.onPause();
}

@Override
public void onDestroyView() {
super.onDestroyView();
//RefWatcher refWatcher = EasyReceipt.getRefwatcher(getActivity());
//refWatcher.watch(this);
}

//endregion

//region------------------------------------------------------------------- Inner Classes

private class ImageSaver implements Runnable{

private final Image mImage;
private final FileManager mFileManager;

//private final File mFile;

ImageSaver(Image image, FileManager fileManager){

mImage = image;

mFileManager = fileManager;

//mFile = file;

}

@Override
public void run() {

File outputFile = null;
try {
outputFile = File.createTempFile(String.valueOf(System.currentTimeMillis()), ".jpg", getActivity().getCacheDir());
}catch (Exception e){
Log.e(TAG, "run: ", e);
Crashlytics.log(TAG + " " + e);
}

ByteBuffer buffer = mImage.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.remaining()];
buffer.get(bytes);

FileOutputStream fos = null;

try{
fos = new FileOutputStream(outputFile);
fos.write(bytes);

}catch (Exception e){
Log.e(TAG, "run: ", e);
Crashlytics.log(TAG + " " + e);
}finally {
mImage.close();

if (fos != null){
try{
fos.close();
}catch (Exception e){
Log.e(TAG, "run: ", e);
Crashlytics.log(TAG + " " + e);
}
}

}

((MainActivity)getActivity()).setUriofImageTOReview(Uri.fromFile(outputFile));
((MainActivity)getActivity()).loadCameraPreviewApprovalFrag();

/*
((MainActivity)getActivity()).loadCameraPreviewApprovalFrag(bytes);

mImage.close();

*/

}

}

static class CompareSizesByArea implements Comparator<android.util.Size> {

@Override
public int compare(android.util.Size lhs, android.util.Size rhs) {
// We cast here to ensure the multiplications won't overflow
return Long.signum((long) lhs.getWidth() * lhs.getHeight() -
(long) rhs.getWidth() * rhs.getHeight());
}

}

public static class ErrorDialog extends DialogFragment {

private static final String ARG_MESSAGE = "message";

public static ErrorDialog newInstance(String message){

ErrorDialog dialog = new ErrorDialog();
Bundle args = new Bundle();
args.putString(ARG_MESSAGE, message);
dialog.setArguments(args);
return dialog;

}

@NonNull
@Override
public Dialog onCreateDialog(Bundle savedInstanceState) {

final Activity activity = getActivity();
return new AlertDialog.Builder(activity)
.setMessage(getArguments().getString(ARG_MESSAGE))
.setPositiveButton(android.R.string.ok, new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialogInterface, int i) {
activity.finish();
}
}).create();

}

}

public static class ConfirmationDialog extends DialogFragment{

@NonNull
@Override
public Dialog onCreateDialog(Bundle savedInstanceState)
{

final Fragment parent = getParentFragment();
return new AlertDialog.Builder(getActivity())
.setMessage(R.string.request_permission)
.setPositiveButton(android.R.string.ok, new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialogInterface, int i) {

ActivityCompat.requestPermissions(getActivity(), new String[]{Manifest.permission.CAMERA}, REQUEST_CAMERA_PERMISSIONS);
}
})
.setNegativeButton(android.R.string.cancel, new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialogInterface, int i) {
Activity activity = parent.getActivity();
if (activity != null){
activity.finish();
}
}
}).create();

}

}

//endregion

}

 

 

My recent attempts at painting

Recently I started painting. As an unexpected perk, this leisure activity has improved my ability to look critically at the UI of my apps. I didn't have the introspection to figure this out myself; painting was suggested to me as a way to relax after work without staring at a television screen.

Besides being relaxing, it has had the added bonus of exercising my mind in the area of art, and I have been buying books of paintings and books on how to paint. Getting set up was inexpensive enough, and I find myself excited to relax after work and continue on with the current project. Hopefully after a year I will achieve some measure of success.

For now it has changed how I think. When I'm painting I tend to ask how I can make this more interesting and more pleasing to the viewer. If I don't completely blend the colors or fill in the details, the viewer's mind finishes the job for me. The artwork is interactive and engaging.

When I used to build software I thought about making it easy to use and fulfilling a function, nothing more. Now I tend to consider how a person will quickly glance at the UI and let the colors and symbols blend together into a meaningful collage of useful information. I have only begun the process, but hopefully I will find the time to continue.

Working with Google Vision API on Android

Thanks to Google's text recognition API, I was able to effortlessly get text from the receipts I wanted to parse.

Using google firebase vision text on android app easy receipt

 

In the picture above you can see I created a method I could pass images into, which returned results through a callback. The strategy was simple.

Here is the link I used to set it up.

 

1) Add the library through gradle.

 

2) Create a method where you can pass in an image and, probably, a CountDownLatch.

 

3) Then, using the Firebase API, create an image input object, a detector object, and an on-success listener.

 

That's it! But you may notice two things:

A) I used a CountDownLatch. My text recognition ran in an IntentService completely separate from the UI, and the latch allowed me to halt the overall parsing strategy while waiting for the Vision results.
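
Here is a minimal sketch of that pattern, assuming the firebase-ml-vision on-device text recognizer; the method name, the result-holder array, and the 30-second timeout are my own illustration, not the app's actual code.

    private FirebaseVisionText detectTextBlocking(Bitmap receiptBitmap) throws InterruptedException
    {
        // Runs on a worker thread (e.g. inside an IntentService) and blocks until the
        // recognizer calls back, so the rest of the parsing can continue in order.
        final CountDownLatch latch = new CountDownLatch(1);
        final FirebaseVisionText[] resultHolder = new FirebaseVisionText[1];

        FirebaseVisionImage image = FirebaseVisionImage.fromBitmap(receiptBitmap);
        FirebaseVisionTextRecognizer detector =
                FirebaseVision.getInstance().getOnDeviceTextRecognizer();

        detector.processImage(image)
                .addOnSuccessListener(new OnSuccessListener<FirebaseVisionText>() {
                    @Override
                    public void onSuccess(FirebaseVisionText visionText) {
                        resultHolder[0] = visionText;  // hand the result back to the waiting thread
                        latch.countDown();
                    }
                })
                .addOnFailureListener(new OnFailureListener() {
                    @Override
                    public void onFailure(@NonNull Exception e) {
                        latch.countDown();             // release the latch even on failure
                    }
                });

        latch.await(30, TimeUnit.SECONDS);             // don't hang the service forever
        return resultHolder[0];                        // may be null if recognition failed
    }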

B) Google's API returns a vision object, but this object was not serializable, so I made a serializable version and did the conversion myself. The vision result gives you a bounding box for each piece of text, so you can easily detect its location and size on the page and therefore make critical decisions about what information you are looking at.

For example, the vendor's name is often larger and directly adjacent to the phone number or address. These two factors alone will give you the correct vendor with startling accuracy.

Or, for example, the price is usually in the lower right half (localization excluded).
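
To make that concrete, here is a rough sketch of the kind of serializable holder and layout heuristic I mean; the class, its fields, and the "tallest block near the top" rule are illustrative only, not the app's real logic.

    // A serializable stand-in for one recognized text block (android.graphics.Rect itself
    // is not Serializable, so the box coordinates are copied into plain ints).
    public static class OcrBlock implements Serializable {
        public final String text;
        public final int left, top, right, bottom;

        public OcrBlock(String text, Rect box) {
            this.text = text;
            this.left = box.left;
            this.top = box.top;
            this.right = box.right;
            this.bottom = box.bottom;
        }

        public int height() { return bottom - top; }
    }

    // Crude vendor guess: the tallest block that sits in the top third of the receipt.
    public static OcrBlock guessVendor(List<OcrBlock> blocks, int pageHeight) {
        OcrBlock best = null;
        for (OcrBlock b : blocks) {
            boolean nearTop = b.top < pageHeight / 3;
            if (nearTop && (best == null || b.height() > best.height())) {
                best = b;
            }
        }
        return best;
    }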

 

Anyway, I enjoyed using it, and it's always fun to stand on somebody else's shoulders when building your project. Thanks, Google.


 

Easy Receipt

Easy Receipt is an app I made that allows you to easily organize your receipts. Anyone who has had to track their receipts knows the labor of carrying a folder around, then manually entering them into a computer, and probably scanning them as well.

 

This app is CONVENIENT.

Just take a picture of the receipt as soon as someone gives it to you and store the receipt in a box to never be opened again.

easy receipt receipt tracking app

Once Easy Receipt is on your phone you can casually swipe through your receipts and edit them at your leisure. Easy Receipt even guesses what should be entered, so most of the time it's already done for you.

easy receipt receipt tracking app

As you can see, it's not perfectly accurate on the first try. But it stores the names of locations, so this receipt will likely scan perfectly the next time it's scanned.

 

There are also custom editing and cropping features.

easy receipt receipt tracking app

 

When it's done, you can send yourself an email with the receipt images and a CSV file.

easy receipt receipt tracking app

Electric Skate Board

electric skateboard

 

Here I describe my electric skateboard build. I have only ridden this board seven miles, so I can't say how far it will go. But I can say that it's scary fast and very fun!

The electrical work was a cinch!

Below you can see the layout of the parts needed for the electronics. I screwed and glued a waterproof case to the bottom of the board to contain all of this.

electric skateboard parts

My big mistake

This board is fast, but I wish it sat higher off the ground with larger inflatable tires. Secondly, because an electric motor's low-RPM torque is terrible, I wish I had a kit that reduced the gear ratio even more than my current kit, which was purchased off Amazon. See below and notice how this wheel had a bunch of small holes. Making sure your gear fits your wheel is probably the most difficult part of the build if you order the wrong stuff. Also, my belt was at maximum tension, and I ended up adding idler bearings on each side to tension the belt.
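
For a rough sense of why the gear ratio matters (hypothetical figures, not measured from this build): with a 16-tooth motor pulley and a 36-tooth wheel pulley the reduction is 36 / 16 = 2.25:1, so the wheel only sees about 2.25 times the motor's torque; swapping to a 44-tooth wheel pulley would raise that to 44 / 16 = 2.75:1, trading top speed for easier starts at low RPM.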

electric skateboard wheel

 

electric skateboard pulley idler

 

Controlling software

So I had to code the Arduino to accept Bluetooth commands. The code below loops on the Bluetooth signal and gradually increases or decreases an int value that the ESC is programmed to recognize.

 

 

#include <Servo.h>
#include <SoftwareSerial.h>
#include <Arduino.h>


SoftwareSerial mySerial(5,6);
Servo myServo;
int input = 0;
int inChar = 90;
int current = 90;
int waiter = 0;

void setup() {

Serial.begin(9600);
mySerial.begin(9600);
myServo.attach(9);

Serial.println("Arduino Ready!!");
}

void loop() {

waiter ++;

if (mySerial.available() > 0){

input = mySerial.read();

if (input != 0)
{
inChar = input;
}

//Serial.println(inChar);


}

delay(15);
//Serial.println("Set motor");
//Serial.println(inChar);
// Ramp toward the requested value: step up only every 10th pass through the loop
// so acceleration is gradual, but step down immediately so slowing is quick.
if (current < inChar)
{
if (waiter > 10){
current ++;
waiter = 0;
Serial.println(current);
}
}else if (current > inChar){

current--;
}
myServo.write(current);  // write the throttle value to the ESC (initialised to 90 above)

Serial.println(current);

}

 

The Android code was lengthy. Below is a view from inside Android Studio, where you'll see my constraint layout.

skateboard control android app

 

 

Following that are my two classes composing the skateboard controller.

 

public class MainActivity extends AppCompatActivity {

    String TAG = "MainActivity";
    TextView textViewSkateBoardInfo, textViewFeedback;
    EditText editTextInputToBoard;
    Button buttonSubmit, buttonBrake, buttonCoast, buttonCruise, buttonMinus, buttonPlus;
    Spinner spinnerDevices;
    SeekBar seekBarSpeed;

    int desiredSpeed = 0;


    String[] deviceNames;
    BluetoothAdapter bluetoothAdapter;
    ArrayAdapter<String> spinnerAdapter;
    int chosenDevice = 0;
    UUID MY_UUID;
    boolean threadControl = false;
    boolean bluetoothConnectionActive = false;
    Set<BluetoothDevice> pairedDevices;
    MyBlueToothService myBlueToothService;
     Handler mHandler;

    ConnectThread connectThread;

    //region----------------------------------------    Overrides & Permissions




    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
        this.setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);


            setUpUI();

            checkPermissions();







    }



    private void checkPermissions()
    {
        String[] permissions = new String[] {

                Manifest.permission.ACCESS_FINE_LOCATION,
                Manifest.permission.ACCESS_COARSE_LOCATION,
                Manifest.permission.BLUETOOTH,
                Manifest.permission.BLUETOOTH_ADMIN
        };


        ActivityCompat.requestPermissions(MainActivity.this, permissions, 10);


    }



    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);


        setupBlueTooth();

    }


    //endregion


    //region----------------------------------------    UI



    private void setUpUI()
    {
        MY_UUID = UUID.fromString("ca805ff2-39d8-47ca-a9a4-ca6227358943");
        textViewFeedback = findViewById(R.id.textview_FeedBack);
        textViewSkateBoardInfo = findViewById(R.id.textView_SkateBoard);
        editTextInputToBoard = findViewById(R.id.editText_InputToSkateBoard);
        buttonSubmit = findViewById(R.id.button_Submit);
        spinnerDevices = findViewById(R.id.spinner_Devices);
        buttonBrake = findViewById(R.id.button_Brake);
        seekBarSpeed = findViewById(R.id.seekBar_speed);
        buttonCoast = findViewById(R.id.button_Coast);
        buttonCruise = findViewById(R.id.button_Push);
        buttonMinus = findViewById(R.id.button_minusFive);
        buttonPlus = findViewById(R.id.button_plusFive);

        seekBarSpeed.setProgress(90);

        buttonSubmit.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                buttonClick();
            }
        });

        buttonCoast.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                desiredSpeed = 98;
                seekBarSpeed.setProgress(98);
                sendSpeed(desiredSpeed);
                String s = "Speed set at " + String.valueOf(desiredSpeed);
                writeToInternal(s);

            }
        });

        buttonCruise.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                desiredSpeed = 105;
                seekBarSpeed.setProgress(105);
                sendSpeed(desiredSpeed);
                String s = "Speed set at " + String.valueOf(desiredSpeed);
                writeToInternal(s);

            }
        });

        buttonMinus.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                desiredSpeed = desiredSpeed -5;
                seekBarSpeed.setProgress(desiredSpeed);
                sendSpeed(desiredSpeed);
                String s = "Speed set at " + String.valueOf(desiredSpeed);
                writeToInternal(s);

            }
        });

        buttonPlus.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                desiredSpeed = desiredSpeed +5;
                seekBarSpeed.setProgress(desiredSpeed);
                sendSpeed(desiredSpeed);
                String s = "Speed set at " + String.valueOf(desiredSpeed);
                writeToInternal(s);

            }
        });




        seekBarSpeed.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
            @Override
            public void onProgressChanged(SeekBar seekBar, int i, boolean b) {
                desiredSpeed = i;

            }

            @Override
            public void onStartTrackingTouch(SeekBar seekBar) {

            }

            @Override
            public void onStopTrackingTouch(SeekBar seekBar) {
                sendSpeed(desiredSpeed);
                String s = "Speed set at " + String.valueOf(desiredSpeed);
                writeToInternal(s);
            }
        });


        buttonBrake.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                desiredSpeed = 90;
                seekBarSpeed.setProgress(90);
                sendSpeed(desiredSpeed);
                String s = "Speed set at " + String.valueOf(desiredSpeed);
                writeToInternal(s);

            }
        });



        mHandler = new  Handler(){

            @Override
            public void handleMessage(Message msg) {

                if (msg.what == 0){

                byte[] b = (byte[]) msg.obj;
                    // TODO: 5/12/2018 heres the response
                    String s = new String(b);

                    textViewFeedback.setText(s);

                }



            }
        };

    }




    private void buttonClick()
    {
        String s = editTextInputToBoard.getText().toString();

        if (!bluetoothConnectionActive ){
            runConnectThread();
            Log.d(TAG, "buttonClick: connecting");
        }else if(s.equals(""))
        {
            connectThread.cancel();
            myBlueToothService.cancel();
            Log.d(TAG, "buttonClick: dissconnecting");
            setupBlueTooth();
        }


            if (!s.equals(""))
            {
            sendString(s);
                Log.i(TAG, "buttonClick: sending ext input");
            }




    }

    public void writeToInternal(String input)
    {
        final String s = input;

        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                textViewSkateBoardInfo.setText(s);

            }
        });



    }

    public void writeToBlueToothResponse(String input)
    {
        final String s = input;
        Log.d(TAG, "writeToBlueToothResponse: " + input);

        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                textViewFeedback.setText(s);
            }
        });


    }


    public void sendString(String s)
    {
        try {
            byte[] bytes = s.getBytes("UTF-8");
            myBlueToothService.writeToDevice(bytes);
            writeToInternal(s);
        }catch (UnsupportedEncodingException e){
            Log.e(TAG, "sendString: ", e);
            writeToInternal("Failed to send");
        }

    }


    public void sendSpeed(int toSend)
    {


        byte[] b = ByteBuffer.allocate(4).putInt(toSend).array();

        if (bluetoothConnectionActive) {

            myBlueToothService.writeToDevice(b);

            Log.i(TAG, "sendSpeed: sending...");
        }
    }


    //endregion


    //region----------------------------------------    Bluetooth

    private void setupBlueTooth()
    {


        bluetoothAdapter = BluetoothAdapter.getDefaultAdapter();
        if (bluetoothAdapter == null){
            Toast.makeText(this, "device doesnt support bluetooth", Toast.LENGTH_SHORT).show();
        }

        if (!bluetoothAdapter.isEnabled()){
            Intent enableBTIntent = new Intent(BluetoothAdapter.ACTION_REQUEST_ENABLE);
            startActivityForResult(enableBTIntent, 20);
        }

        pairedDevices = bluetoothAdapter.getBondedDevices();


        if (pairedDevices.size() > 0){
            deviceNames = new String[pairedDevices.size()];


            int tick = 0;
            for (BluetoothDevice device :
                    pairedDevices) {

                deviceNames[tick] = device.getName();



                tick++;
            }

            spinnerAdapter = new ArrayAdapter<String>(this,android.R.layout.simple_spinner_item, deviceNames);

            spinnerAdapter.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item);

            spinnerDevices.setAdapter(spinnerAdapter);

            spinnerDevices.setOnItemSelectedListener(new AdapterView.OnItemSelectedListener() {
                @Override
                public void onItemSelected(AdapterView<?> adapterView, View view, int i, long l) {

                    chosenDevice = i;

                }

                @Override
                public void onNothingSelected(AdapterView<?> adapterView) {

                }
            });

        }



    }



    public void runConnectThread()
    {
    BluetoothDevice deviceToInsert = null;

    for (BluetoothDevice d :
            pairedDevices) {
        if (d.getName().equals(deviceNames[chosenDevice])){
            deviceToInsert = d;
        }
    }
    Log.i(TAG, "runConnectThread: try to connect");

   connectThread = new ConnectThread(deviceToInsert);
    connectThread.begin();

}


    private class ConnectThread implements Runnable
    {
    private  BluetoothSocket mmSocket;
    private final BluetoothDevice mmDevice;
    Thread thread;


    public ConnectThread(BluetoothDevice device)
    {
        if (device == null){
            Toast.makeText(MainActivity.this, "BT Device Null", Toast.LENGTH_SHORT).show();
        }

        BluetoothSocket tmp = null;

        mmDevice = device;


        try{

            tmp = device.createRfcommSocketToServiceRecord(MY_UUID);
        }catch (Exception e){
            Log.e(TAG, "ConnectThread: ",e );
        }


        mmSocket = tmp;

    }




    @Override
    public void run() {

        bluetoothAdapter.cancelDiscovery();

        try {
            mmSocket = (BluetoothSocket) mmDevice.getClass().getMethod("createRfcommSocket", new Class[]{int.class}).invoke(mmDevice, 1);
            mmSocket.connect();

        }catch (Exception e){
            Log.e(TAG, "run: ", e);
            writeToInternal( "failed connection to " + deviceNames[chosenDevice]);

                try {

                    mmSocket.close();
                }catch (Exception e2){
                    Log.e(TAG, "run: ", e2);
                }
                return;



        }

            writeToInternal( "connected to " + deviceNames[chosenDevice]);
            //do something with mmSOcket here
            myBlueToothService = new MyBlueToothService(mmSocket);
            Log.i(TAG, "Connection Success");
            bluetoothConnectionActive = true;
            passSTart();


    }




    public void begin()
    {
        thread = new Thread(this);
        threadControl = true;
        thread.start();
    }


    public void cancel()
    {
        bluetoothConnectionActive = false;
        threadControl = false;
        try {
            mmSocket.close();
        }catch (Exception e){
            Log.e(TAG, "cancel: ", e);
        }

    }


}

    public void passSTart()
    {
        myBlueToothService.passMain(this);
    }




    //endregion



    //region----------------------------------------    lifecycle

    @Override
    protected void onDestroy() {
        super.onDestroy();
        myBlueToothService.cancel();

    }


    //endregion


}

 

public class MyBlueToothService {

    // NOTE: the original listing starts mid-class; this header and these fields are
    // reconstructed from how they are used in the methods below.
    private static final String TAG = "MyBlueToothService";
    private ConnectedThread connectedThread;
    private boolean threadBool = false;
    private byte[] byteHolder;

    private interface MessageConstants {
        public static final int MESSAGE_READ = 0;
        public static final int MESSAGE_WRITE = 1;
        public static final int MESSAGE_TOAST = 2;


    }


    public  MyBlueToothService(BluetoothSocket socket)
    {
        connectedThread = new ConnectedThread(socket);
        connectedThread.startListeneing();


    }


    public void writeToDevice(byte[] b)
    {
        connectedThread.write(b);
    }


    public void cancel()
    {
        connectedThread.cancel();
    }

    public void passMain(MainActivity a)
    {
        connectedThread.setMainActivity(a);
    }



    private class ConnectedThread implements Runnable
    {
        Thread thread;
        MainActivity mainActivity;

        private final BluetoothSocket mmSocket;
        private  DataInputStream mmInputStream;
        private DataOutputStream mmOutputStream;
        private byte[] mmBuffer;

        public ConnectedThread(BluetoothSocket socket){

            mmSocket = socket;
            InputStream tmpIn = null;
            OutputStream tmpOut = null;


            try
            {
              tmpIn = socket.getInputStream();
                Log.i(TAG, "ConnectedThread: input stream success");
            }catch (Exception e){
                Log.e(TAG, "ConnectedThread: failed", e);
            }
            try
            {
              tmpOut = socket.getOutputStream();
                Log.i(TAG, "ConnectedThread: output stream success");
            }catch (Exception e){
                Log.e(TAG, "ConnectedThread: failed", e);
            }

            mmInputStream = new DataInputStream(tmpIn);

            mmOutputStream = new DataOutputStream(tmpOut);





        }


        public void run()
        {
            mmBuffer = new byte[1024];
            int numbytes = 0;


            while (threadBool)
            {
                try
                {
                  numbytes = mmInputStream.read(mmBuffer);
                    //Log.i(TAG, "mybluetooth read bufer bypassed");

                    if (numbytes > 0){
                        //Log.d(TAG, "read numbytes > 0");

                        String s = new String(mmBuffer, "UTF-8");
                        mainActivity.writeToBlueToothResponse(s);

                    }

                }catch (Exception e){
                    Log.e(TAG, "run: ", e);
                    break;
                }



            }




        }





        public void write(byte[] bytes)
        {
            byteHolder = bytes;

            Thread t = new Thread(new Runnable() {
                @Override
                public void run() {

                            WriteToBT();

                }
            });
            t.start();

        }





        private void WriteToBT()
        {
            //Log.d(TAG, "WriteToBT: " + String.valueOf(byteHolder.length));


            try{
                mmOutputStream.write(byteHolder);
                //Log.i(TAG, "WriteToBT: passed");
                int i = new BigInteger(byteHolder).intValue();
                //Log.d(TAG, "WriteToBT: " + String.valueOf(i));

            }catch (Exception e){
                Log.e(TAG, "write: ", e);

            }
        }



        public void startListeneing()
        {threadBool = true;
        thread = new Thread(this);

        thread.start();

        }



        public void cancel()
        {
            threadBool = false;
            try{
                mmSocket.close();
            }catch (Exception e){
                Log.e(TAG, "cancel: ", e);
            }


        }



        public void setMainActivity(MainActivity a)
        {
            mainActivity = a;
        }



    }



}

and that’s it!

Updating the easy time card app

easy time card app android

 

Updated and revised

 

After releasing the Easy Time Card app, I realized that it looked very dull and relied heavily on Android's basic graphics. It was a utility app I made for myself, so early on this was OK. But after releasing it I wanted it to look great while keeping the same super-simplistic functionality. As you can see in the image above, the display really looks like a time card. I'm very pleased that I was also able to add CSV and PDF attachments to the emails as well. Here is the little YouTube video promo I made for it. You can check it out on the Google Play Store.


Git Hub Tutorial / Cheat Sheet

git tutorial / cheat sheet

 

 

Using Git was a bit of a headache the first time around.

I remember day 1 complaining about it every 5 minutes. After a few days it becomes a trusted friend.

Here is the tutorial I wish I had found when learning to use it.

Enjoy… and download the PDF version for the yellow highlighting, which helps a lot.

 

Download PDF Here

 

GIT CHEAT SHEET

Highlighted in yellow (in the PDF) are the steps to load a new project for the first time.

git version

Make sure git is installed

git config --global user.name "yourname"

git config --global user.email "youremail"

add your contact and signature to commits automatically

git config --list

See your current setup.

git help <verb>

git <verb> --help

Get help and a list of commands for a given subject.

dir cd “name of folder” cd ..

dir shows a list of files, cd "name of folder" takes you into that folder, and cd .. takes you back.

git init

This creates a hidden folder in your local project which will keep track of the code and communicate with the remote code source. Nothing is automatic: your working folder's contents must be added in future steps for the repository created by git init to track them.

git status

(See .gitignore at this point)

This will show files that are not being tracked by your git init object.

This will show files that are being tracked and are in the staging area

git add <somefile>

git add -A

This will add files to the staging area. They are still in the working folder, but Git understands you want to work with them in future Git commands.

git reset <filename>

git reset

Removes the file from the staging area. Git still sees it but is not working with it anymore.

git commit -m “your message”

The commit command creates a data point saying someone updated the project at this point. The -m flag is required so you can attach a message letting future readers know what this commit was about. (Try git status now. It will say that your working directory is clean. You haven't pushed yet, but there is nothing more to stage besides the files bundled into your commit.)

git log

You can see the log of commits

git remote add “somename” “somelocation”

This command will link your git init repository to the remote repository. "somename" might be "origin", and "somelocation" might be "C:\projects\files\thisproject" or "https://bitbucket.org/yourprofile/yourproject/project.git".

git push -u origin master

Remember when you made a commit? This takes that commit and pushes it to the remote repository. The -u flag sets the remote branch as the upstream for your local branch, so later you can run plain git push and git pull without arguments. origin is the repository name you added in the previous instruction, and master means the commit goes to the master branch.

git pull origin master

Same as above but in reverse: bring remote changes down to local.

git clone <url>

Navigate your console to the desired download location and clone a repository there with this command. Remember, this does everything for you, so there is no need to run git init; just reference the project folder.

git remote -v

view remote repo info

git branch -a

Lists branches. Unless you're a commit-straight-to-master kind of person (you're not), you'll want to know which branch you are on.

git diff

see code changes between your working files and what has been staged
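
To tie the highlighted first-time steps together, here is roughly what a brand-new project looks like going up to a remote for the first time (the Bitbucket URL is just a placeholder; substitute your own repository):

# one-time identity setup
git config --global user.name "yourname"
git config --global user.email "youremail"

# inside your project folder: create the repository and make the first commit
git init
git add -A
git commit -m "initial commit"

# link the local repository to a remote and push the master branch
git remote add origin https://bitbucket.org/yourprofile/yourproject/project.git
git push -u origin master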


Adding a splash video player on Android

In my app I wanted to include an introductory video loaded directly into the app instead of pulled from an internet source.

Once the app was created I used a tool called HandBrake, which compressed my video down to just a few megabytes.

 

Then I applied the following code…

 

My main activity has two class fields.

FrameLayout frameLayout;
ImageView im;

 

My method calls lead to splashPlayer(), which loads the video. The sequence is onCreate() -> createStart() (this loads all my startup stuff) -> splashPlayer() (this loads the video).
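
In outline the chain looks something like this (the layout and view ids here are placeholders, and createStart() does more in the real app):

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);                       //placeholder layout id

    createStart();                                                 //all the other startup work
}

private void createStart() {
    frameLayout = (FrameLayout) findViewById(R.id.frameLayout);   //placeholder view id
    //...other startup stuff...
    splashPlayer();                                                //load the intro video last
}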

The frameLayout below is linked to my XML layout as usual. Here we set up our video, then call another method, addPlayButton().

public void splashPlayer()
{
    try {
        final VideoView videoHolder = new VideoView(this);

        frameLayout.addView(videoHolder);
        Uri video = Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.sound);
        videoHolder.setVideoURI(video);

        addPlayButton(true);

The addPlayButton() method shows a play button over your black square so people understand it's a movie.

public void addPlayButton(boolean show)
{
    if (show){
        im = new ImageView(this);
        im.setImageResource(R.drawable.playicon1);
        final WindowManager.LayoutParams params = new WindowManager.LayoutParams();
        params.width = 50;
        params.height = 50;

        frameLayout.addView(im, params);
    }else
    {
        frameLayout.removeView(im);
    }
}
Then we're back in splashPlayer() and setting our controls.

        videoHolder.setOnTouchListener(new View.OnTouchListener() {
            @Override
            public boolean onTouch(View view, MotionEvent motionEvent) {

                switch ( motionEvent.getAction()) {

                    case MotionEvent.ACTION_DOWN:

                        Log.i(TAG, "onTouch: touched");
                        if (videoHolder.isPlaying()) {
                            Log.i(TAG, "onTouch: 1");
                            videoHolder.pause();
                            addPlayButton(true);

                        } else {
                            Log.i(TAG, "onTouch: 2");
                            videoHolder.start();
                            addPlayButton(false);
                        }
                        break;

                }

                return true;
            }
        });

    }catch (Exception e){
        Log.i(TAG, "splashPlayer: try/catch fail");
    }

}

The total code is as follows:

public void splashPlayer()
{

    try {

        Log.i(TAG, "splashPlayer: trying...");

        final VideoView videoHolder = new VideoView(this);

        frameLayout.addView(videoHolder);
        Uri video = Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.sound);
        videoHolder.setVideoURI(video);

        addPlayButton(true);

        //videoholder.setoncompletelistener
        videoHolder.setOnTouchListener(new View.OnTouchListener() {
            @Override
            public boolean onTouch(View view, MotionEvent motionEvent) {

                switch ( motionEvent.getAction()) {

                    case MotionEvent.ACTION_DOWN:

                        Log.i(TAG, "onTouch: touched");
                        if (videoHolder.isPlaying()) {
                            Log.i(TAG, "onTouch: 1");
                            videoHolder.pause();
                            addPlayButton(true);

                        } else {
                            Log.i(TAG, "onTouch: 2");
                            videoHolder.start();
                            addPlayButton(false);
                        }
                        break;

                }

                return true;
            }
        });

    }catch (Exception e){
        Log.i(TAG, "splashPlayer: try/catch fail");
    }

}
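
One thing the commented-out setOnCompletionListener line hints at: when the clip ends you probably want the play icon back. A minimal sketch of that listener (it would sit right after setVideoURI() and needs the android.media.MediaPlayer import):

//Sketch only: bring the play button back when the clip finishes playing.
videoHolder.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
    @Override
    public void onCompletion(MediaPlayer mediaPlayer) {
        Log.i(TAG, "onCompletion: video finished");
        videoHolder.seekTo(0);   //rewind so the next tap replays from the start
        addPlayButton(true);     //show the play icon over the video again
    }
});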

Building a pdf programmatically on Android

In my recent project I needed to build a pdf and email it to a user. Android has a great library for this and I found it only took a few hours to get a reasonably readable pdf onto my screen.

 

The basic outline goes as follows

//create the pdf
PdfDocument reportDocument = new PdfDocument();

//create the page
PdfDocument.Page page = reportDocument.startPage(pageInfo);

/*
 * Do some stuff here. See further down.
 */

//end the page
reportDocument.finishPage(page);

//send it away to a file for storage, or in my case an email for sending
pdfPath = getPDFFile(openProject.getProjectNme());
FileOutputStream fos = new FileOutputStream(pdfPath);
reportDocument.writeTo(fos);
reportDocument.close();
success = true;

Now for writing stuff on the pdf itself

 

//I had made a chart earlier and wanted it displayed
page.getCanvas().drawBitmap(charts[0], 5, graphy, paint);

//drawing some text on the page by setting up a paint object
Paint paint = new Paint();
paint.setColor(Color.BLACK);
paint.setTextAlign(Paint.Align.CENTER);
paint.setTextSize(25);

//then writing
page.getCanvas().drawText("Source of Sound", width/2, titley, paint);
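
Putting that outline together, a minimal stand-alone version looks roughly like this (the page size, title text, helper name, and output file are placeholders rather than the values from my project):

//imports needed for the sketch
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.pdf.PdfDocument;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

//Sketch only: writes a one-page PDF with a centered title into the given file.
private void writeSimplePdf(File outputFile) throws IOException {
    PdfDocument document = new PdfDocument();

    //page size is in points; 612 x 792 is US Letter
    PdfDocument.PageInfo pageInfo = new PdfDocument.PageInfo.Builder(612, 792, 1).create();
    PdfDocument.Page page = document.startPage(pageInfo);

    Paint paint = new Paint();
    paint.setColor(Color.BLACK);
    paint.setTextAlign(Paint.Align.CENTER);
    paint.setTextSize(25);
    page.getCanvas().drawText("Hello PDF", 612 / 2f, 60, paint);

    document.finishPage(page);

    FileOutputStream fos = new FileOutputStream(outputFile);
    document.writeTo(fos);
    document.close();
    fos.close();
}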

 

Really this is a super hacky way of doing it. There must be a cleaner API, or if I ever need to do this again I will write one. But it works, so I guess I'll come back to that at a later time.

 

 

 

Complete code here.

private void makePDF()
{
    Log.i(TAG, "makePDF: started");

    String sHighestDec;
    String sAverageHighestDecibel;

    //region
    String weighting = "For the purpose of soundproofing a nuisance noise must be 10 decibel lower than the ambient decibels of ";
    String weighting2 = "the rooms future use. So an office/library of 40 decibels would need its soundproofing to bring exterior";
    String weighting3 = "nuisance noise to 30 decibels to not be noticed by end users. Frequently due to uncontrollable ambient ";
    String weighting4 = "factors it will be impossible to measure the new low of your target sound if it drop below 40 db";

    String outerLimits = "Remember this is app is a crude measurement and educational tool.";
    String outerlimits2 = "Your  device may not produce or record some sounds due to hardware limitations";
    String outerlimits3 = "Secondly, you may not be able to hear all range of sounds. Wear ear protection!";


    //endregion
    int width = 1224;
    int height = 1584;
    int left = 5;
    float titley = ((float)(height * .04));
    float titleOney = ((float)(height * .06));
    float titleTwoy = ((float)(height * .08));
    float graphy = ((float)(height * .1));
    float lineOney = ((float)(height * .55));
    float lineTwoy = ((float)(height * .57));
    float lineThreey = ((float)(height * .59));
    float lineFoury = ((float)(height * .61));
    float lineFivey = ((float)(height * .63));
    float lineSixy = ((float)(height * .65));
    float lineSeveny = ((float)(height * .67));
    float lineEighty = ((float)(height * .69));
    float lineNiney = ((float)(height * .71));

    PdfDocument reportDocument = new PdfDocument();

    //page 1
    PdfDocument.PageInfo pageInfo = new PdfDocument.PageInfo.Builder(width,height,1).create();

    PdfDocument.Page page = reportDocument.startPage(pageInfo);

    Paint paint = new Paint();
    paint.setColor(Color.BLACK);

    //title of page
    Log.i(TAG, "makePDF: page 1");
    paint.setTextAlign(Paint.Align.CENTER);
    paint.setTextSize(25);

    //title
    String info1 = openProject.getProjectNme() + " " + openProject.getProjectLocation();
    page.getCanvas().drawText("Source of Sound",width/2, titley, paint);
    page.getCanvas().drawText(info1,width/2, titleOney , paint);
    info1 = openProject.getDate();
    page.getCanvas().drawText(info1,width/2, titleTwoy, paint);


    //chart
    page.getCanvas().drawBitmap(charts[0], 5, graphy, paint);

    paint.setTextAlign(Paint.Align.LEFT);
    //highest decibel and at what frequency recordedHighestAverage
    sHighestDec = "The highest decibel recorded was : " + String.valueOf(valuesChartOne[0]) + "db at approximately " + String.valueOf(valuesChartOne[1] + " hz");
    page.getCanvas().drawText(sHighestDec, left, lineOney, paint);

    //average decibel recorded during total running average
    sAverageHighestDecibel = "The average decibel recorded was : " + String.valueOf(valuesChartOne[2]);
    page.getCanvas().drawText(sAverageHighestDecibel, left, lineTwoy, paint);

    //outerlimits
    page.getCanvas().drawText(outerLimits, left, lineThreey, paint);
    page.getCanvas().drawText(outerlimits2, left, lineFoury, paint);
    page.getCanvas().drawText(outerlimits3, left, lineFivey, paint);


    reportDocument.finishPage(page);


    //page 2
    Log.i(TAG, "makePDF: page 2");
    if  (charts[1] != null) {
        pageInfo = new PdfDocument.PageInfo.Builder(width, height, 2).create();

        page = reportDocument.startPage(pageInfo);

        //title of page
        paint.setTextAlign(Paint.Align.CENTER);
        paint.setTextSize(25);
        page.getCanvas().drawText("Receiving Area", width / 2, titley, paint);

        //chart of recording
        page.getCanvas().drawBitmap(charts[1], 5, graphy, null);

        paint.setTextAlign(Paint.Align.LEFT);
        //highest decibel and at what frequency
        sHighestDec = "The highest decibel recorded was : " + String.valueOf(valuesChartTwo[0]) + "db at approximately " + String.valueOf(valuesChartTwo[1] + " hz");
        page.getCanvas().drawText(sHighestDec, left, lineOney, paint);

        //average decibel and at what frequency
        sAverageHighestDecibel = "The average decibel recorded was : " + String.valueOf(valuesChartTwo[2]);
        page.getCanvas().drawText(sAverageHighestDecibel, left, lineTwoy, paint);

        generateSTC();

        //if barrier what stl or not calculatable
        if (stcSuccess){

            String values = "Db lost : ";

            for (double d :
                    diff) {
                //Log.i(TAG, "makePDF: " + String.valueOf(d));
                int t = (int) Math.round(d);

                values += String.valueOf(t) + " ";

            }



            page.getCanvas().drawText(values , left, lineThreey, paint);
            page.getCanvas().drawText("Stc rating of this barrier is " + String.valueOf(stcRating), left, lineFoury, paint);

        }else{

            page.getCanvas().drawText("Stc rating was not successfully calculated", left, lineThreey, paint);

        }



        // weighting a sound
        page.getCanvas().drawText(weighting, left, lineFivey, paint);
        page.getCanvas().drawText(weighting2, left, lineSixy, paint);
        page.getCanvas().drawText(weighting3, left, lineSeveny, paint);
        page.getCanvas().drawText(weighting4, left, lineEighty, paint);

        reportDocument.finishPage(page);

    }


    //page 3
        if (charts[2] != null) {

            Log.i(TAG, "makePDF: page 3");
            pageInfo = new PdfDocument.PageInfo.Builder(width, height, 3).create();
            page = reportDocument.startPage(pageInfo);

            Material m = chosenMaterial.get(0);


            //title of page
            paint.setTextAlign(Paint.Align.CENTER);
            paint.setTextSize(25);
            page.getCanvas().drawText("Simulation", width / 2, titley, paint);
            info1 = m.getMaterialName() + " - " + m.getMaterialDescription();
            if (info1.length() < 100)
            {
                page.getCanvas().drawText(info1, width / 2, titleOney, paint);
            } else  {

                String one = info1.substring(0, info1.length()/2);
                String two = info1.substring(info1.length()/2 +1, info1.length());

                page.getCanvas().drawText(one, width / 2, titleOney, paint);
                page.getCanvas().drawText(two, width / 2, titleTwoy, paint);
            }


            //chart of recording
            page.getCanvas().drawBitmap(charts[2], left, graphy, null);

            paint.setTextAlign(Paint.Align.LEFT);
            //highest decibel and at what frequency
            sHighestDec = "The highest decibel recorded was : " + String.valueOf(valuesChartThree[0]) + " at approximately " + String.valueOf(valuesChartThree[1] + " hz");
            page.getCanvas().drawText(sHighestDec, left, lineOney, paint);

            //average decibel not being calculated


            //describe green and red marks
            info1 = "The green bars are "A" weighted to human perception based on frequency.";
            page.getCanvas().drawText(info1,left,lineFoury , paint);

            info1 = "The red dots are NC25 rating. (Allowable sound penetration for many use types)";
            page.getCanvas().drawText(info1,left,lineFivey , paint);

            reportDocument.finishPage(page);
        }



    boolean success = false;

    try {


       pdfPath = getPDFFile(openProject.getProjectNme());
       FileOutputStream fos = new FileOutputStream(pdfPath);
        reportDocument.writeTo(fos);
        reportDocument.close();
        success = true;

}catch (IOException f) {
        Log.e(TAG, "makePDF: ", f);
    }finally {

        if (success) {
            notifyObserver(1);
        }else   {
            notifyObserver(2);
        }


    }





}


Draw an animated bar chart on Android

How I drew a live bar chart in my soundproofing and STC app, available on Google Play.

Fast Fourier transform FFT android

 

The above picture is a screenshot of my Android app, where I needed to live-display over 100 data points on my screen. Here is how I did just that.

Important to know is that these data points came in the form of a double array of length 156 from my fast Fourier transform. With these 156 data points I needed to do the following:

  1. Draw a background and figure out how much of this data was needed to fill in my chart
  2. Grab that data asynchronously from a separate thread doing the calculations
  3. Take my data points and figure out how wide and tall they need to be.
  4. Draw the previous record and then draw the new total over the top of it.
  5. Do it 30 times per second and post back to the UI

 

So let me start with item #2… how the data was getting there. The background thread that was giving me the 156 data points was refreshing very fast. I didn't need all of that data for the purpose of the user display, so I chose to simply post it to an array repeatedly. This way it could post as frequently as it wanted and I could grab the data as frequently as I wanted, without interruption to either. Loosely coupled might be the appropriate terminology.

 

This is the class-wide field I posted to.

private double[][] uiChartBuffer;
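
As a rough sketch of that loose coupling (the method name here is made up, and in the real app the magnitudes arrive from the FFT worker), the worker thread simply overwrites the buffer and the draw loop reads whatever is there on its next frame:

//Sketch only: called from the FFT worker each time a new frame of magnitudes is ready.
//The draw loop reads uiChartBuffer on its own schedule, so neither side waits on the other.
private void postFftResults(double[] magnitudes) {
    if (uiChartBuffer == null) {
        uiChartBuffer = new double[1][magnitudes.length];
    }
    //overwrite the previous frame; the display missing a frame now and then is fine
    System.arraycopy(magnitudes, 0, uiChartBuffer[0], 0, magnitudes.length);
}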

This is what I call a controlling method. I have several inner classes within my AudioControl class which do the work on different worker threads. My controlling methods let the classes using the AudioControl object make its inner classes perform their functions without actually touching them. Here you can see that I start recording, instantiate the inner class DrawChart, and call its resume method. These inner classes are only ever accessed through their own controlling methods as well. It may also be important to mention that AudioControl's controlling methods are tied to the Android lifecycle, killing the inner classes automatically.

So if the user gets a phone call and moves out of the app there are no leaked resources:
Fragment.onPause() -> audioControl.onPause() (this calls a bunch of methods; see the teardown sketch after the code below) -> drawChart.pause() and null -> recordAudio.pause() and null
public void startRecording(File file)
{
    Log.i(TAG, "startRecording: ");
    running = !running;

    drawChart = new DrawChart(context);

    fileToWork = file;

    recordAudio = new RecordAudio();
    recordAudio.execute(getFile());

    drawChart.resume();
}
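
The matching teardown direction might look something like this (a sketch only; the exact cleanup inside RecordAudio isn't shown, and cancel(true) assumes it is an AsyncTask, as execute() above suggests):

//Sketch only: called from the Fragment's onPause() so nothing leaks when the user leaves the app.
public void onPause() {
    Log.i(TAG, "onPause: releasing worker threads");
    running = false;

    if (drawChart != null) {
        drawChart.destroy();       //interrupts the draw thread (see DrawChart below)
        drawChart = null;
    }

    if (recordAudio != null) {
        recordAudio.cancel(true);  //stop the background recording task
        recordAudio = null;
    }
}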

 

At the very bottom is my entire inner DrawChart class. There is a resume and a destroy method, a method to check if a screen update is due, and a run method that just loops over and over trying to update the screen.
Anything that doesn't have to do with user interaction should be in a background thread. This is no exception; however, Android doesn't allow background threads to touch the UI, so you have to use a workaround.
In this case I used a SurfaceView, which is a very typical way of doing this.






private class DrawChart extends SurfaceView implements Runnable
{
    SurfaceView surfaceView;
    private SurfaceHolder surfaceHolder;
    private Canvas canvas;
    private Paint paint;
    Thread thread = null;
    TextView textViewDecibelShow;
    int highestDeciebelFromSample;


    private int mBarColorF;
    private int mBarColorS;
    private Bitmap bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.newhzdb);
    BitmapDrawable bitmapDrawable = new BitmapDrawable(getResources(),bitmap);



    private long nextFrameDue;
    private long frameLength = 100;
    int width, height;
    int barWidth, seqS, seqF,  mod;
    double eWidth, reverse, bHeight;
    float bh;


    public DrawChart(Context context) {
        super(context);

        surfaceView = myFragmentView.findViewById(R.id.surfaceView);
        textViewDecibelShow = myFragmentView.findViewById(R.id.textViewliveDB);
        surfaceHolder = surfaceView.getHolder();

        nextFrameDue = System.currentTimeMillis();



        mBarColorF = ResourcesCompat.getColor(getResources(),R.color.colorBarChartFluctuate,null);
        mBarColorS = ResourcesCompat.getColor(getResources(),R.color.colorBarChartStay, null);
        paint = new Paint();

        width = 0;



    }


    @Override
    public void run() {


        while  (running || playing)
        {

            if (thread.isInterrupted())
            {
                return;
            }


            if (updateRequired())
            {

                highestDeciebelFromSample = 0;

                if (surfaceHolder.getSurface().isValid() && uiChartBuffer != null) {



                    canvas = surfaceHolder.lockCanvas();
                    // Log.i(TAG, "run: in midst of draws while loop");

                    if (width == 0){

                        width =  surfaceView.getWidth();    //screen dimensions 540
                        height = surfaceView.getHeight();   //screen dimensions 600

                        int almostwidth = (int) Math.round( width *.89) ;
                        barWidth = (int) almostwidth/reqBars;                       //4?
                        int netwidth = reqBars * barWidth;                         //464?
                        int grossWidth = (int) Math.round(netwidth/.90);            //515

                        width = grossWidth;



                        mod = getMod(width);
                        bHeight = height * .965;             // random number i found to keep max reading within chart
                        bh = (float) bHeight;

                        setDbChartScaleFactor(bHeight/myscale);

                        bitmapDrawable.setBounds(0,0,width,height);
                        //scaledBitmap = Bitmap.createScaledBitmap(bitmap,width,height,true);
                    }

                    bitmapDrawable.draw(canvas);





                    for (int i = 1; i < reqBars; i++) {

                        seqF = i * barWidth + mod; //right
                        seqS = seqF - barWidth; //left



                        int bottom = reverseandCalc(uiChartBuffer[0][i], false);

                        //Log.i(TAG, "run: " + String.valueOf(bottom));

                        if (bottom > highestDeciebelFromSample){
                            highestDeciebelFromSample = bottom;
                        }




                        if (bottom > highestDecibelByBar[i])
                        {// if this data is higher then add to record
                            highestDecibelByBar[i] = bottom;
                        }else {
                            //if not then draw prev record behind it

                            int top = (int) bh - highestDecibelByBar[i];

                            paint.setColor(mBarColorS);
                            canvas.drawRect(seqS, top, seqF, bh, paint);      //draw prev record redbar underneath
                        }


                        int top = (int) bh - bottom;


                        paint.setColor(mBarColorF);
                        canvas.drawRect(seqS, top, seqF, bh, paint); //draw green bar on top
                    }



                    surfaceHolder.unlockCanvasAndPost(canvas);


                }

                //averageDecibelForView = average(highestDeciebelFromSample);

                notifyObserver(3);

            }

        }

    }


    public void resume()
    {
        highestDecibelByBar = new int[reqBars];
        thread = new Thread(this);
        thread.start();
        Log.i(TAG, "resume: draw thread");

    }


    public void destroy()
    {
        thread.interrupt();
    }

    // TODO: 5/1/2018 update based not on time but on fft method calls
    public boolean updateRequired()
    {

        if (nextFrameDue <= System.currentTimeMillis()){

            nextFrameDue = System.currentTimeMillis() + frameLength;


            return true;
        }
        return false;





    }





}

 


 

 
