Stream Video From Android Part 4 – Parse Boxes and SPS or PPS

Alright, you have the taste of blood in your mouth and you like it. Let's dive deeper and actually try parsing some data.

->source code here <-

In our Android app we have saved the file and now we are passing it to our SDP maker. Ironically I do not actually use the session description protocol. I just didn't know it was not needed, so I named this class incorrectly.

Our SDPMaker class uses a RandomAccessFile to parse through the data, reading byte by byte and looking for box headers. Here is the start of the method, where you can get an idea of how the whole class works.


public static byte[][] retreiveSPSPPS(File file) throws IOException, FileNotFoundException
{
    byte[] sps = new byte[0];                       //we will find the bytes, read them into these arrays, then convert to string values
    byte[] pps = new byte[0];
    byte[] prefix = new byte[6];
    byte[][] spspps;
    String[] spsppsString = new String[2];

    RandomAccessFile randomAccessFile;              //file type that lets us search the file byte by byte
    long fileLength = 0;
    long position = 0;
    long moovPos = 0;

    byte[] holder = new byte[8];

    //get the file we saved our little video to
    randomAccessFile = new RandomAccessFile(file, "r");
    fileLength = randomAccessFile.length();

    // here we find the moov box within the mp4 file
    while(position < fileLength) {
        //read our current position and then advance to next position
        randomAccessFile.read(holder, 0, 8);
        position += 8;

        if (checkForBox(holder)) {
            String name = new String(holder, 4, 4);

            if (name.equals("moov")) {
                moovPos = position;
                Log.d(TAG, "retreiveSPSPPS: found moov box = " + name);
                break;
            }
        }

    }
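
The checkForBox helper isn't shown in the snippet above. Here is a minimal sketch of one way it could work (an assumption on my part, not the post's exact code): an mp4 box header is a 4-byte big-endian size followed by a 4-byte ASCII type, so we can sanity-check that bytes 4 through 7 look like a box name.

private static boolean checkForBox(byte[] holder) {
    //bytes 4-7 of a box header hold the ASCII type ("moov", "mdat", "ftyp"...)
    for (int i = 4; i < 8; i++) {
        char c = (char) (holder[i] & 0xff);
        if (!Character.isLetterOrDigit(c) && c != ' ') {
            return false;
        }
    }
    return true;
}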

Check out this picture again. Notice how the boxes are nested? My code above isn't optimized for any particular use case, but as you can see I start by finding the moov box and then search my way through each nested box to find the data I need.

Here you can see where I am extracting my SPS. Read through the full code for all the details.

if (read[0] == 'g') {
                                        //ascii 'g' = hex 0x67 <- we found the sps
    int length = bLength & 0xff;        //bLength is the length of the sps

    remaining = new byte[length];

    randomAccessFile.read(remaining, 0, length - 1);    //minus 1 because we already read the 'g'

                                        //scoot everything down and add our 'g' at the beginning
    for (int i = length - 1; i > 0; i--) {
        remaining[i] = remaining[i - 1];
    }

    remaining[0] = read[0];
    sps = remaining;

    String s = bytesToHex(remaining);
    Log.d(TAG, "retreiveSPSPPS: found sps: " + s + " length used: " + String.valueOf(length));

 

Once this is done we need to save this data in our app. As long as we don't change the media recorder settings when we stream, this SPS and PPS data will allow a decoder to decode the stream correctly.
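
For context, here is a hedged sketch (my addition, not the post's code) of how the receiving side can eventually hand these to a decoder. Android's MediaCodec takes the H.264 SPS and PPS as the "csd-0" and "csd-1" buffers, each prefixed with an Annex-B start code; the 1280x720 size is a placeholder, and the snippet assumes java.nio.ByteBuffer and android.media.MediaFormat.

byte[] startCode = {0x00, 0x00, 0x00, 0x01};

ByteBuffer csd0 = ByteBuffer.allocate(startCode.length + sps.length);
csd0.put(startCode).put(sps);
csd0.flip();

ByteBuffer csd1 = ByteBuffer.allocate(startCode.length + pps.length);
csd1.put(startCode).put(pps);
csd1.flip();

MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);   //placeholder size
format.setByteBuffer("csd-0", csd0);
format.setByteBuffer("csd-1", csd1);
//a MediaCodec configured with this format is then ready to decode the stream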

On a side note, I also saved a prefix byte[], but this turned out to be unnecessary. Stick with the SPS and PPS.

 

Stream Video From Android Part 3 – Understanding h264 in mp4

The last section was tough; it only gets tougher.

As I said, the mp4 file is streamed and then the data to decode the file is written afterward. Like most file types, mp4 is constructed in parts. Frequently you hear the term file header, a section that explains the file's contents. With mp4, it's full of boxes. These boxes might be at the beginning or they might be at the end. We don't know, and we have to find out. Below is a piece of software that lets you open up the contents of an mp4 file.

Some key parts…

ftyp -> describes basic contents

mdat -> the actual video data

avcC -> the stuff we need to decode the data

 

Parsing a video file

We are examining the h264 codec. h264 is software that takes images and encodes them to reduce file size. These images are then wrapped up in the above boxes inside an mp4 container.

Let's think about this: your camera takes a 2MB picture. A video plays 30 frames per second, or 30fps. A CD can hold 700MB. So if a video were simply a series of pictures, it would take 60MB per second, and a CD would hold only about 12 seconds of video.

Instead, the h264 codec compresses a single image and then records a series of "changes" that happen to that image. So 30 frames of a single second of video might be one actual complete image and 29 "changes" to that image. The pattern below is repeated numerous times as the video plays.

Image->change->change->change->change->Image->change->change->change->change

Of course there is a lot more to know, but each of the above is called a slice. These slices are saved in that mdat box one after another. A slice is not the same as a frame, but sometimes it can be.

These slices are saved as NALUs, or network abstraction layer units. The NALUs are saved one after another, separated by headers. There are two main types of headers we will be dealing with. Below they are written in hex.

AnnexB -> 0x00 0x00 0x00 0x01 0x65. The first four bytes are just a start code carrying no data; the last byte tells you the NALU type.

These headers are simply a string of zeros and a one, plus the NALU type. The video codec makes sure there are no other places in its output where this byte pattern can occur.

Avcc -> 0x00 0x02 0x4A 0x8F 0x65. The first four bytes are the length; the last describes the NALU type. Obviously the first four change with the data they represent.

If you make the accidental mistake of padding some data, say by copying a half-filled buffer, you will destroy your data's readability for any decoder, because the zero padding emulates the Annex-B style start code. This goes for either type. Working with this data is unforgiving. (Sound like the voice of experience here!)

Here are the different types.

0       Unspecified                                                      non-VCL
1       Coded slice of a non-IDR picture                                 VCL
2       Coded slice data partition A                                     VCL
3       Coded slice data partition B                                     VCL
4       Coded slice data partition C                                     VCL
5       Coded slice of an IDR picture                                    VCL
6       Supplemental enhancement information (SEI)                       non-VCL
7       Sequence parameter set                                           non-VCL
8       Picture parameter set                                            non-VCL
9       Access unit delimiter                                            non-VCL
10      End of sequence                                                  non-VCL
11      End of stream                                                    non-VCL
12      Filler data                                                      non-VCL
13      Sequence parameter set extension                                 non-VCL
14      Prefix NAL unit                                                  non-VCL
15      Subset sequence parameter set                                    non-VCL
16      Depth parameter set                                              non-VCL
17..18  Reserved                                                         non-VCL
19      Coded slice of an auxiliary coded picture without partitioning   non-VCL
20      Coded slice extension                                            non-VCL
21      Coded slice extension for depth view components                  non-VCL
22..23  Reserved                                                         non-VCL
24..31  Unspecified                                                      non-VCL

Based on this information expect to see files like this.

[size or start code][type][data payload]  repeated x infinity…might as well be
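
To make the AVCC layout concrete, here is a minimal sketch (my addition, assuming 4-byte length prefixes) that walks a buffer of [length][NALU] records and prints each NALU type, which lives in the low 5 bits of the first payload byte:

static void walkNalus(byte[] data) {
    java.nio.ByteBuffer buf = java.nio.ByteBuffer.wrap(data);   //big-endian by default
    while (buf.remaining() > 4) {
        int length = buf.getInt();                      //the 4-byte size prefix
        if (length <= 0 || length > buf.remaining()) break;
        int nalType = buf.get(buf.position()) & 0x1f;   //e.g. 5 = IDR slice, 7 = SPS
        System.out.println("NALU type " + nalType + ", length " + length);
        buf.position(buf.position() + length);          //skip the payload
    }
}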

Parsing SPS & PPS

Data in each box can also be found if you know where to look. Check out our avcC box here. I have it labeled for you, and you can see it in hex and ascii.

Here you can find the data necessary to parse your video file. According to this chart… source is stackoverflow:

bits
8        version ( always 0x01 )
8        avc profile ( sps[0][1] )
8        avc compatibility ( sps[0][2] )
8        avc level ( sps[0][3] )
6        reserved ( all bits on )
2        NALULengthSizeMinusOne
3        reserved ( all bits on )
5        number of SPS NALUs (usually 1)
repeated once per SPS:
  16       SPS size
  variable SPS NALU data
8        number of PPS NALUs (usually 1)
repeated once per PPS:
  16       PPS size
  variable PPS NALU data

Remember the four AVCC header bytes that gave you the length? Those are described by NALULengthSizeMinusOne. They could also be two bytes, for example. It's "minus one" because the two bits of space allowed can only count from 0 to 3, so the field stores the length minus one: binary 11 = 3 means 4 bytes, and 01 = 1 means 2 bytes… a bit quirky.
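
Putting the chart to work, here is a hedged sketch (my addition) of parsing an avcC box body. It assumes one SPS and one PPS, which is the usual case:

static byte[][] parseAvcC(byte[] body) {
    java.nio.ByteBuffer buf = java.nio.ByteBuffer.wrap(body);
    buf.get();                                  //version, always 0x01
    buf.get(); buf.get(); buf.get();            //profile, compatibility, level
    int lengthSizeMinusOne = buf.get() & 0x03;  //low 2 bits; 3 means 4-byte NALU lengths
    int numSps = buf.get() & 0x1f;              //low 5 bits, usually 1
    byte[] sps = new byte[buf.getShort() & 0xffff];
    buf.get(sps);
    int numPps = buf.get() & 0xff;              //usually 1
    byte[] pps = new byte[buf.getShort() & 0xffff];
    buf.get(pps);
    return new byte[][]{sps, pps};
}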

Now that we have an understanding of the basic makeup of an mp4 file, let's parse it in the next section, where we go deeper.

Stream Video From Android Part 2 – Getting Camera Data

 

1. First we need a camera2 object to record our video

-> Here is the code myvideo.java <-  (It's saved as a .txt; just change the suffix if you want to cut and paste, but I don't recommend doing that)

 

Android's camera2 api is a beast in itself. Using the official example will take most of the work out. Here is the example I copied.

The above example is buggy (at the time of writing) even though it's provided by Google.

Here's the thing: when you record a video, the camera starts sending the data to a file right away. Only after the recording is complete does the api write the data needed to play that file, in the form of file headers. We need to do 2 things with this api:

  1. We need to record short videos and extract that header data before we even start streaming.
  2. When we are streaming, we have to direct Android's media recorder to send the file instead of writing the file.

Here is my code all folded up

MyVideo

The top fold is boilerplate stuff, not super important.

 

The third fold holds the normal camera methods called to operate the camera. Let's focus on the ones that are tough and critical to your success.

The android camera method setUpMediaRecorder needs to tell Android where to send your actual video data. Below you will see that I have created a bool flag that either saves the data to a file or calls a method.

if (collectSDP){
    Log.d(TAG, "setUpMediaRecorder: collect");
    mMediaRecorder.setOutputFile(getVideoFilePath(mActivity));
}else{
    Log.d(TAG, "setUpMediaRecorder: dontcollect");
    mMediaRecorder.setOutputFile(getStreamFd());
}

In order to get some critical data we need later, we need to save a small video file. In the method getSDP you will see I go through the motions of recording a short video. The video is then parsed with the SDPMaker class, which is the next part of this tutorial.
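
To illustrate the idea (a hypothetical sketch, not the post's exact getSDP code): record a throwaway clip to a path, stop, then hand the file to part 4's parser. The recorder settings must match the ones used later for streaming, or the extracted SPS/PPS won't describe the real stream; with a SURFACE video source the camera2 session must also be wired to recorder.getSurface().

private byte[][] recordShortClipAndParse(String path) throws IOException, InterruptedException {
    MediaRecorder recorder = new MediaRecorder();
    recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);     //fed by the camera2 session
    recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
    recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
    recorder.setOutputFile(path);
    recorder.prepare();
    recorder.start();

    Thread.sleep(1000);     //a second of throwaway video is plenty

    recorder.stop();
    recorder.release();

    return SDPMaker.retreiveSPSPPS(new File(path));     //the parser from part 4
}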

You will also see that the other branch of the above code calls getStreamFd. This returns a file descriptor, which allows you to connect an OutputStream to an InputStream. This tricks your camera api into writing into a buffer which is immediately read by another class that will package and send the data to wherever you are sending it. Notice all the abandoned code.


private FileDescriptor getStreamFd() {



    ParcelFileDescriptor[] pipe = null;

    try {
        pipe = ParcelFileDescriptor.createPipe();

        /*
        new TransferThread(new ParcelFileDescriptor.AutoCloseInputStream(pipe[0]),
                new Socket(), mRobotPoint).start();
                */

         transferH264 = new TransferH264(new ParcelFileDescriptor.AutoCloseInputStream(pipe[0]),
                new Socket(), mRobotPoint, this, ssrc);

        transferH264.start();

        /*
        transH264 = new TransH264(new ParcelFileDescriptor.AutoCloseInputStream(pipe[0]),
                new Socket(), mRobotPoint, this, ssrc);

        transH264.start();
        */

        /*
       new TransferH264(new ParcelFileDescriptor.AutoCloseInputStream(pipe[0]),
                new Socket(), mRobotPoint, this).start();
                */

    } catch (IOException e) {
        Log.e(getClass().getSimpleName(), "Exception opening pipe", e);
    }

    return (pipe[1].getFileDescriptor());
}
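
For completeness, here is a conceptual sketch (my addition) of the reading side of that pipe. The real TransferH264 also packetizes and sends the data; packetize() below is a stand-in for that:

private static class PipeReader extends Thread {
    private final InputStream in;

    PipeReader(InputStream in) { this.in = in; }

    @Override
    public void run() {
        byte[] buffer = new byte[4096];
        try {
            int read;
            while ((read = in.read(buffer)) != -1) {
                //only 'read' bytes are valid; never forward the whole buffer,
                //or the zero padding can emulate a start code (see part 3)
                packetize(buffer, read);
            }
        } catch (IOException e) {
            Log.e("PipeReader", "pipe closed", e);
        }
    }

    private void packetize(byte[] data, int length) {
        //hand the bytes off to the RTP packetizer (part 6)
    }
}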

 

This is a multipart post…keep going

 

Stream Video From Android Part 1 – Why & How

I’m not sure how you got here, but if you need to stream or dissect video this might be your lucky day. Whenever I write a new article my biggest joy is choosing an image to represent the experience a user should expect when tackling this project. I couldn’t make up my mind and both are appropriate.

 

Over the last several weeks I have been falling down the rabbit hole of streaming from an Android device to another computer. Despite the ubiquity of streaming apps right now, there are not a lot of easy solutions for streaming from Android, and I found more unanswered questions than answers.

In this post I will tell you step by step how I streamed from Android to a PC. But be warned: this is for average to ninja developers. If it's your first Android, desktop, or web app, you might not make it through this explanation.

Secondly, I am no expert in video or streaming services. I'm posting this article as fresh as I can from learning, in order to maintain my beginner point of view. This article is a great stepping stone for someone like me trying to understand the basics. The code works on my device. It's not production code; video artifacts remain, and I have a lot of optimization and polishing still to do. But I think it's much easier to understand my rough code, with all its notes, than to see a bunch of polished methods.

Posts

Part 1 – Why & How

Part 2 – Getting Camera Data

Part 3 – Understanding h264 in mp4

Part 4 – Parse Boxes and SPS or PPS

Part 5 – Parse NALUs

Part 6 – Packetize RTP

Part 7 – Depacketize and Display

Part 8 – Tips, Tricks and Tests

Prologue –

There are several ways to come at streaming. Let's consider them all, and I'll tell you why I did what I did.

— Using Someone Else's Framework —

The first is relying on an outside streaming service. There are several services that will gladly provide a simple api and charge you to stream data from the phone to a server somewhere.

The second is a free library that gives you a framework. WebRTC is an example that can allow you to video chat and provides an api for establishing communications. Another is libstream, which was one of my greatest learning tools in this project.

Third are headless apis such as FFmpeg and GStreamer. FFmpeg is a little tough to get started with, but it provided some great data for me to examine.

If you need to video chat or stream entertainment content these are great solutions.

— Doing The Dirty Work Yourself —

There are two pitfalls with the above methods: either money, or constraints imposed by the api. I wanted to really understand how the video data was being routed. I wanted to route everything myself and control almost every detail.

My intention was to eventually use this Android video and audio data for AI processing at a remote location. I didn't want to let someone else's API decide the bandwidth usage, or whether FPS or image compression would be used to compensate for a reduced connection. I also wanted to control how and when connections are established. Truthfully, I had no idea how the above things would be handled in an api, but I didn't want to find out after I had committed. I decided I needed to learn a bit, so the journey was worth it for me.

Inside Android there are two methods for getting video data: MediaRecorder and MediaCodec. I chose MediaRecorder out of simplicity, although I suspect MediaCodec has further advantages. If you follow this guide I suspect you can switch later, when you are a bit smarter and have had some time to breathe.

In this example I'm using Android Studio and sending the data to a JavaFX app on a Windows desktop.

Below is the basic outline of the steps:

This is the very very broad strokes.

Camera2 -> Basic video class for recording with camera2, with a few tweaks

Data pipe -> This gets your data out of the camera2 class

Packetizer -> Bundle that data and send it somewhere

Depacketizer -> Get that video back out and into a decoder

Why is it so hard?

Looking at the steps above you may wonder, if you have already been trying a bit, why is it so tough? Think about it: the first successful streaming service was Skype in 2003. Google Duo only came out in 2016, and Apple's video chat a few years before that. Even more remarkable is that these video services share many of the same software libraries, and video streaming in any quality relies on the heavily patented h264 codec. So don't stress out too much. It's tough and there's a lot to know.

This is a multi part post so please keep moving forward.

Android Swipe Functionality

I needed to allow my Easy Receipt app to do a few simple things:

- A way for the user to review the receipts that was swipe-able.

- This view needed to show data and an image.

- It needed left/right swipe detection as well.

 

The basic components for this are as follows:

Layouts-

I used two main layout files plus others: a linear layout to hold our swipe-able layouts, a coordinator layout (required for the swipe dismiss behavior to be detected and the view to get swiped away), plus various other layouts that go inside the coordinator layout.

Here is my linear layout container

<?xml version="1.0" encoding="utf-8"?>

<RelativeLayout
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@+id/coordinator_container"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <com.signal.cagney.easyreceipt.Util.LinearInterceptLayout
        android:id="@+id/linearContainer"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:orientation="vertical">

    </com.signal.cagney.easyreceipt.Util.LinearInterceptLayout>



</RelativeLayout>

Here is my coordinator layout. Notice I used a frame layout on top for my textviews and buttons, and a zoomable image view on the bottom, which is not a standard Android class.

<?xml version="1.0" encoding="utf-8"?>
<android.support.design.widget.CoordinatorLayout
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">


       <android.support.design.widget.CoordinatorLayout
           android:id="@+id/coordinator_forswipe"
           android:layout_width="match_parent"
           android:layout_height="match_parent"
           xmlns:app="http://schemas.android.com/apk/res-auto"       >


        <android.support.design.widget.AppBarLayout
            android:id="@+id/appbar_Main"
            android:layout_width="match_parent"
            android:layout_height="340dp"
            android:theme="@style/ThemeOverlay.AppCompat.Dark.ActionBar">

            <android.support.design.widget.CollapsingToolbarLayout
                android:id="@+id/collapsingToolbar"
                android:layout_height="match_parent"
                android:layout_width="match_parent"
                app:layout_scrollFlags="scroll|exitUntilCollapsed"
                app:contentScrim="@color/colorAccent"
                app:expandedTitleMargin="48dp"
                app:expandedTitleMarginEnd="64dp">


                <FrameLayout
                    android:background="@color/colorBackground"
                    android:id="@+id/framelayout_Conatiner_Details"
                    android:layout_width="match_parent"
                    android:layout_height="match_parent"
                    />



                <android.support.v7.widget.Toolbar
                    android:id="@+id/toolbar_Main"
                    android:layout_height="?android:attr/actionBarSize"
                    android:layout_width="match_parent"
                    app:popupTheme="@style/ThemeOverlay.AppCompat.Light"
                    app:layout_collapseMode="pin"

                    />


            </android.support.design.widget.CollapsingToolbarLayout>


        </android.support.design.widget.AppBarLayout>


                  <android.support.v4.widget.NestedScrollView
                      android:id="@+id/nestedScrollView_Image"
                      android:layout_width="match_parent"
                      android:layout_height="match_parent"
                      app:layout_behavior="@string/appbar_scrolling_view_behavior">


                      <com.signal.cagney.easyreceipt.Util.ZoomableImageView
                          android:adjustViewBounds="true"
                          android:scaleType="fitCenter"
                          android:id="@+id/imageView_UpperImage"
                          android:layout_width="match_parent"
                          android:layout_height="match_parent"
                          app:layout_collapseMode="parallax"
                          />





                </android.support.v4.widget.NestedScrollView>






            </android.support.design.widget.CoordinatorLayout>



        </android.support.design.widget.CoordinatorLayout>

The java –

In order to get the swipe direction I had to create a custom swipe dismiss behavior class to get the callback direction.

In my fragment I used this code to set up the custom swipe dismiss behavior onto my coordinator layout.

private void setUPUI()
{
    Log.d(TAG, "setUPUI: ");
    final CustomSwipDismissBehavior mSwipe = new CustomSwipDismissBehavior();


    mSwipe.setSwipeDirection(SwipeDismissBehavior.SWIPE_DIRECTION_ANY);
    mSwipe.setSensitivity(.1f);
    mSwipe.setDragDismissDistance(.9f);
    swiper = mSwipe;

    mSwipe.setListener(new SwipeDismissBehavior.OnDismissListener() {

        @Override
        public void onDismiss(View view) {

           int i = mSwipe.getDirection();
            Log.d(TAG, "onDismiss: ---------------- " + String.valueOf(i));

            if (i == 2){


                   //left swipe


            }else {

                 //right swipe

            }



        }

        @Override
        public void onDragStateChanged(int state) {

        }
    });

}

 

But in order to get the direction, which I could not find a way to do with Android's standard setup, I had to make a custom swipe dismiss behavior, like so.

 

public class CustomSwipDismissBehavior extends SwipeDismissBehavior{

    private final String TAG = "CustomSwipeBehavior";

    public final static int IDLE = 0;
    public final static int LEFT = 1;
    public final static int RIGHT = 2;

    float x1, x2;
    float minimum = 0;

    //1 = left, 2 = right
    int direction = 1;

    boolean acceptswipe = true;

    @Override
    public void setListener(OnDismissListener listener) {
        super.setListener(listener);
    }


    @Override
    public boolean onInterceptTouchEvent(CoordinatorLayout parent, View child, MotionEvent event) {
        //Log.d(TAG, "onInterceptTouchEvent: " + event);
        setDirection(event);
        return super.onInterceptTouchEvent(parent, child, event);
    }

    @Override
    public boolean onTouchEvent(CoordinatorLayout parent, View child, MotionEvent event) {
        setDirection(event);

            return super.onTouchEvent(parent, child, event);


    }


    @Override
    public boolean canSwipeDismissView(@NonNull View view) {


        if (acceptswipe) {
            Log.d(TAG, "onTouchEvent: 1");
        return super.canSwipeDismissView(view);
        }else {
            Log.d(TAG, "onTouchEvent: 2");
            return false;
        }

    }

    @Override
    public void setSensitivity(float sensitivity) {
        super.setSensitivity(sensitivity);
        if (sensitivity == 0.0f){
            acceptswipe = false;
        }else{
            acceptswipe = true;
        }

    }

    private void setDirection(MotionEvent event)
    {
        //Log.d(TAG, "setDirection: motion event =  " + event);
        switch (event.getAction()){

            case MotionEvent.ACTION_DOWN:
                if (x1 == 0) {
                    x1 = event.getX();
                    //Log.d(TAG, "calculate: x1: " + String.valueOf(x1));
                }
                break;

            case MotionEvent.ACTION_MOVE:
                if (x1 == 0) {
                    x1 = event.getX();
                    //Log.d(TAG, "calculate: x1: " + String.valueOf(x1));
                }
                break;

            case MotionEvent.ACTION_UP:
                x2 = event.getX();
               //Log.d(TAG, "calculate: x2 : " + String.valueOf(x2));
               calculate(x1, x2);
                break;
        }

    }


    private void calculate(float x1, float x2)
    {

        float delta = x1 - x2;

        //Log.d(TAG, "calculate: x1: " + String.valueOf(x1) + "  x2: " + String.valueOf(x2) + "  delta: " + String.valueOf(delta));

        if (delta > minimum){
            direction = LEFT;
        }
        else if (delta < -minimum){
            direction = RIGHT;
        }
        else{
            direction = IDLE;
        }

        //reset the fields (not the shadowing parameters), otherwise the next
        //ACTION_DOWN would never record a fresh starting x
        this.x1 = 0;
        this.x2 = 0;

        //Log.d(TAG, "calculate: " + direction);

    }




    public int getDirection()
    {
        int temp = direction;
        direction = IDLE;

        return temp;

    }




}


Android Camera2 Api

If you are building an app with the Android camera2 api, here are my thoughts after fighting with it. (Full code at the bottom for copy-paste junkies.)

This api was a little verbose with me, taking about 1200 lines of code. It could probably be done more simply, but if you want something custom, here is what you might end up with. I copied this code from the github example; the full blown example is here.

 

Here is my code all folded up, with some clear descriptions of what everything does. The code below is essentially just the basic camera example, twisted and reorganized so it makes sense to me. Notice this is all contained in a fragment.

 

android camera2 api

There are three things someone using my version, or the original github version, would need to change. If you are tackling this project, don't hesitate to copy the code on this page and focus on the changes you need instead of trying to wrap your head around the whole project.

The first is the button setup. I'm not really interested in diving into this. Check my code's "camera still picture chain" and you can see how the events are initiated.

The second is the save method (see "Inner Classes" in my code folds). The example gives you a runnable image saver which will probably need to be reworked according to your file storage system, or if you need to handle the image for further processing. When working with large image files, it's best to save the file, pass a URI around, and take smaller samples of the image to reduce heap size.

Third: why does Samsung spin the dang images? This took me a while to figure out and I was super upset about it. Here is the code my "Image Review" fragment used to flip and save the image the right way. I believe this was pieced together from several sources and I have no idea who to give credit to.

 private void rotateImage(int degree)
    {
        Log.d(TAG, "rotateImage: ");

        Matrix mat = new Matrix();
        mat.postRotate(degree);
        bitmapToReview = Bitmap.createBitmap(bitmapToReview, 0,0,bitmapToReview.getWidth(), bitmapToReview.getHeight(), mat, true);

    }

    private void createPreviewImage()
    {
        //get exif data and make bitmap
        int orientation = 0;
        try {
            ExifInterface exifInterface = new ExifInterface(uriOfImage.getPath());
            bitmapToReview = MediaStore.Images.Media.getBitmap(getActivity().getContentResolver(), uriOfImage);
            orientation =  exifInterface.getAttributeInt(ExifInterface.TAG_ORIENTATION, ExifInterface.ORIENTATION_NORMAL);
        }catch (Exception e){
            Log.e(TAG, "createPreviewImage: ", e);
            Crashlytics.log(TAG + " " + e);
            Toast.makeText(getActivity(), "Error loading image", Toast.LENGTH_SHORT).show();
        }

        //check rotation and rotate if needed
        switch (orientation){

            case ExifInterface.ORIENTATION_ROTATE_90:
                Log.d(TAG, "createPreviewImage: 90");
                rotateImage(90);
                break;

            case ExifInterface.ORIENTATION_ROTATE_180:
                Log.d(TAG, "createPreviewImage: 180");
                rotateImage(180);
                break;

                case ExifInterface.ORIENTATION_ROTATE_270:
                    Log.d(TAG, "createPreviewImage: 270");
                    rotateImage(270);
                    break;



        }

        //display on screen
        imageView_Preview.setImageBitmap(bitmapToReview);


    }

 

So that's it. This post was basically to complain that I spent a week retyping this entire thing out to prove that I could tame it. In reality I licked my wounds and moved on with my life, because sometimes there are more important things to do than fight the system.

 

 

For the full code see below, and try not to be frightened.

import android.Manifest;
import android.app.Activity;
import android.app.AlertDialog;
import android.app.Dialog;
import android.app.DialogFragment;
import android.app.Fragment;
import android.content.Context;
import android.content.DialogInterface;
import android.content.pm.PackageManager;
import android.content.res.Configuration;
import android.graphics.ImageFormat;
import android.graphics.Matrix;
import android.graphics.Point;
import android.graphics.RectF;
import android.graphics.SurfaceTexture;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.CaptureResult;
import android.hardware.camera2.TotalCaptureResult;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.Image;
import android.media.ImageReader;
import android.net.Uri;
import android.os.Bundle;
import android.os.Handler;
import android.os.HandlerThread;
import android.support.annotation.NonNull;
import android.support.annotation.Nullable;
import android.support.design.widget.FloatingActionButton;
import android.support.design.widget.Snackbar;
import android.support.v4.app.ActivityCompat;
import android.support.v4.content.ContextCompat;
import android.util.Log;
import android.util.SparseIntArray;
import android.view.LayoutInflater;
import android.view.Surface;
import android.view.TextureView;
import android.view.View;
import android.view.ViewGroup;
import android.widget.Toast;

import com.crashlytics.android.Crashlytics;
import com.signal.cagney.easyreceipt.AutoFitTextureView;
import com.signal.cagney.easyreceipt.EasyReceipt;
import com.signal.cagney.easyreceipt.MainActivity;
import com.signal.cagney.easyreceipt.R;
import com.signal.cagney.easyreceipt.Util.FileManager;
import com.squareup.leakcanary.RefWatcher;

import java.io.File;
import java.io.FileOutputStream;
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

public class Main_Fragment extends Fragment implements ActivityCompat.OnRequestPermissionsResultCallback{

private static final String TAG = "MAIN_FRAGMENT";
View myFragmentView;
private AutoFitTextureView mTextureView;

boolean currentlyCapturing;

public final static int GALLERY_CHOOSE = 12;

FileManager fileManager;

//region------------------------camera states

private static final int STATE_PREVIEW = 0;

/**
* Camera state: Waiting for the focus to be locked.
*/
private static final int STATE_WAITING_LOCK = 1;

/**
* Camera state: Waiting for the exposure to be precapture state.
*/
private static final int STATE_WAITING_PRECAPTURE = 2;

/**
* Camera state: Waiting for the exposure state to be something other than precapture.
*/
private static final int STATE_WAITING_NON_PRECAPTURE = 3;

/**
* Camera state: Picture was taken.
*/
private static final int STATE_PICTURE_TAKEN = 4;

/**
* Max preview width that is guaranteed by Camera2 API
*/
private static final int MAX_PREVIEW_WIDTH = 1920;

/**
* Max preview height that is guaranteed by Camera2 API
*/
private static final int MAX_PREVIEW_HEIGHT = 1080;

//endregion

//region------------------------------------------------------- camera fields

private CameraDevice mCameraDevice;
private CaptureRequest.Builder previewBuilder;
private CaptureRequest mPreviewRequest;
private CameraCaptureSession mCameraCaptureSession;

private static final SparseIntArray ORIENTATIONS = new SparseIntArray();
private static final int REQUEST_CAMERA_PERMISSIONS = 1;
private static final String FRAGMENT_DIALOG = "dialog";

private ImageReader imageReader;
private int mSensorOrientation;
private Handler mBackgroundHandler;
private int mState = STATE_PREVIEW;
private Semaphore mCameraOpenCloseLock = new Semaphore(1);
private String mCameraId;
private HandlerThread mBackgroundThread;
private boolean mFlashSupported;

private android.util.Size mPreviewSize;

static {

ORIENTATIONS.append(Surface.ROTATION_0, 90);
ORIENTATIONS.append(Surface.ROTATION_90, 0);
ORIENTATIONS.append(Surface.ROTATION_180, 270);
ORIENTATIONS.append(Surface.ROTATION_270, 180);
}

private final CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(@NonNull CameraDevice cameraDevice) {
Log.d(TAG, "onOpened: ");
mCameraOpenCloseLock.release();
mCameraDevice = cameraDevice;
creatCameraPreviewSession();

}

@Override
public void onDisconnected(@NonNull CameraDevice cameraDevice) {
Log.d(TAG, "onDisconnected: ");
mCameraOpenCloseLock.release();
cameraDevice.close();
mCameraDevice =null;

}

@Override
public void onError(@NonNull CameraDevice cameraDevice, int i) {
Log.d(TAG, "onError: ");
mCameraOpenCloseLock.release();
cameraDevice.close();
mCameraDevice = null;
Activity activity = getActivity();
if (null != activity){
activity.finish();
}

}
};

private final ImageReader.OnImageAvailableListener mOnImageAvailableListener
= new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader imageReader) {
Log.d(TAG, "onImageAvailable: ");
Image image = imageReader.acquireNextImage();

mBackgroundHandler.post(new ImageSaver(image, fileManager ));

}
};

private CameraCaptureSession.CaptureCallback mCaptureCallback = new CameraCaptureSession.CaptureCallback() {

private void process(CaptureResult result)
{

switch (mState){

case STATE_PREVIEW: {
//working normal. do nothing
//Log.d(TAG, "process: " + result.toString());
break;
}

case STATE_WAITING_LOCK: {
Integer afState = result.get(CaptureResult.CONTROL_AF_STATE);
Log.d(TAG, "process: state awaiting afstate = " + String.valueOf(afState) + " Captureresult = " + result.toString());

if (afState == null || afState == CaptureResult.CONTROL_MODE_OFF ) {
Log.d(TAG, "process: null");
captureStillPicture();
} else if (CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED == afState ||
CaptureResult.CONTROL_AF_STATE_NOT_FOCUSED_LOCKED == afState) {
Log.d(TAG, "process: something else");
Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
if (aeState == null ||
aeState == CaptureResult.CONTROL_AE_STATE_CONVERGED) {
Log.d(TAG, "process: something even more");
mState = STATE_PICTURE_TAKEN;
captureStillPicture();
} else {

runPreCaptureSequence();
}

}

break;
}

case STATE_WAITING_PRECAPTURE: {

Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
Log.d(TAG, "process: precapture " + String.valueOf(aeState) + " Captureresult = " + result.toString());
if (aeState == null ||
aeState == CaptureResult.CONTROL_AE_STATE_PRECAPTURE ||
aeState == CaptureRequest.CONTROL_AE_STATE_FLASH_REQUIRED) {
mState = STATE_WAITING_NON_PRECAPTURE;
}

break;
}

case STATE_WAITING_NON_PRECAPTURE: {

Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);

Log.d(TAG, "process: non-precapture" + String.valueOf(aeState) + " Captureresult = " + result.toString());

if (aeState == null || aeState != CaptureResult.CONTROL_AE_STATE_PRECAPTURE){
mState =STATE_PICTURE_TAKEN;
captureStillPicture();
}

break;
}

}
}

@Override
public void onCaptureProgressed(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull CaptureResult partialResult) {
//Log.d(TAG, "onCaptureProgressed: ");
process(partialResult);

}

@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
//Log.d(TAG, "onCaptureCompleted: callback");
process(result);
}
};

private final TextureView.SurfaceTextureListener mSurfaceTextureListener
= new TextureView.SurfaceTextureListener() {
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int i, int i1) {
Log.d(TAG, "onSurfaceTextureAvailable: ");
openCamera(i, i1);

}

@Override
public void onSurfaceTextureSizeChanged(SurfaceTexture surfaceTexture, int i, int i1) {
Log.d(TAG, "onSurfaceTextureSizeChanged: ");
configureTransform(i, i1);
}

@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture surfaceTexture) {
Log.d(TAG, "onSurfaceTextureDestroyed: ");
return false;
}

@Override
public void onSurfaceTextureUpdated(SurfaceTexture surfaceTexture) {
//Log.d(TAG, "onSurfaceTextureUpdated: ");
}
};

//endregion

//region------------------------------------------------------------------- Fragment Setup

@Override
public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) {
myFragmentView = inflater.inflate(R.layout.main_frag_layout, container,false);

setupUI();

return myFragmentView;

}

@Override
public void onViewCreated(View view, @Nullable Bundle savedInstanceState) {
super.onViewCreated(view, savedInstanceState);
}

@Override
public void onActivityCreated(@Nullable Bundle savedInstanceState) {
super.onActivityCreated(savedInstanceState);

fileManager = ((MainActivity)getActivity()).getFileManager();

//mFile = newPictureFileName();

}

private void setupUI()
{
mTextureView = (AutoFitTextureView) myFragmentView.findViewById(R.id.texture);

FloatingActionButton fabGall = (FloatingActionButton) myFragmentView.findViewById(R.id.fabGallery);
fabGall.setImageResource(R.drawable.folder);
fabGall.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {

if (notToBusyToComply()){
((MainActivity)getActivity()).openGallery();
}

/*
Snackbar.make(view, "Replace with your own action", Snackbar.LENGTH_LONG)
.setAction("Action", null).show();
*/
}
});

FloatingActionButton fabPic = (FloatingActionButton) myFragmentView.findViewById(R.id.fabTakePicture);
fabPic.setImageResource(R.drawable.camera);
fabPic.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {

if (notToBusyToComply()){
takePicture();
}

}
});

}

//endregion

//region------------------------------------------------------------------- Camera Main Methods

private void openCamera(int width, int height)
{
Log.d(TAG, "openCamera: ");
if (ContextCompat.checkSelfPermission(getActivity(), android.Manifest.permission.CAMERA)
!= PackageManager.PERMISSION_GRANTED){
requestCameraPermission();
return;
}

Log.d(TAG, "openCamera: setup");
setUpCameraOutputs(width, height);
Log.d(TAG, "openCamera: configure");
configureTransform(width, height);
Activity activity = getActivity();
CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);

try{

if ( !mCameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)){
throw new RuntimeException("Time out waiting to lock camera opening");
}

manager.openCamera(mCameraId, mStateCallback, mBackgroundHandler);

}catch (CameraAccessException e){
e.printStackTrace();
}catch (InterruptedException e){
throw new RuntimeException("Interupted while trying to lock camera opening", e);
}

}

private void closeCamera()
{
Log.d(TAG, "closeCamera: ");
try {
mCameraOpenCloseLock.acquire();
if (null != mCameraCaptureSession) {
mCameraCaptureSession.close();
mCameraCaptureSession = null;
}
if (null != mCameraDevice) {
mCameraDevice.close();
mCameraDevice = null;
}
if (null != imageReader) {
imageReader.close();
imageReader = null;
}
} catch (InterruptedException e) {
throw new RuntimeException("Interrupted while trying to lock camera closing.", e);
} finally {
mCameraOpenCloseLock.release();
}

}

private void creatCameraPreviewSession()
{
Log.d(TAG, "creatCameraPreviewSession: ");
try {

SurfaceTexture texture = mTextureView.getSurfaceTexture();
assert texture != null;

texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());

Surface surface = new Surface(texture);

previewBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
previewBuilder.addTarget(surface);

mCameraDevice.createCaptureSession(Arrays.asList(surface, imageReader.getSurface()),
new CameraCaptureSession.StateCallback() {

@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
Log.d(TAG, "onConfigured: ");
if (null == mCameraDevice){
return;
}

mCameraCaptureSession = cameraCaptureSession;

try{

previewBuilder.set(CaptureRequest.CONTROL_AF_MODE,
CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);

mPreviewRequest = previewBuilder.build();
mCameraCaptureSession.setRepeatingRequest(mPreviewRequest,
mCaptureCallback, mBackgroundHandler);

}catch (CameraAccessException e){
Log.e(TAG, "onConfigured: ", e);
Crashlytics.log(TAG + " " + e);
}

}

@Override
public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
showToast("Failed Preview");
}
}, null);

}catch (CameraAccessException e){
Log.e(TAG, "creatCameraPreviewSession: ", e);
Crashlytics.log(TAG + " " + e);
}

}

private void takePicture()
{
Log.d(TAG, "takePicture: capture chain 1");
//mFile = newPictureFileName();

lockFocus();

}

//endregion

//region------------------------------------------------------------------- Camera Still Picture Chain

private void lockFocus()
{
Log.d(TAG, "lockFocus: capture chain 2");

try{

previewBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER,
CameraMetadata.CONTROL_AF_TRIGGER_START);

mState = STATE_WAITING_LOCK;

mCameraCaptureSession.capture(previewBuilder.build(), mCaptureCallback,
mBackgroundHandler);

} catch (CameraAccessException e){
Log.e(TAG, "lockFocus: ", e);
Crashlytics.log(TAG + " " + e);
}

}

private void runPreCaptureSequence()
{
Log.d(TAG, "runPreCaptureSequence: capture chain 3");

try{

previewBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER,
CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_START);

mState = STATE_WAITING_PRECAPTURE;
mCameraCaptureSession.capture(previewBuilder.build(), mCaptureCallback,
mBackgroundHandler);

}catch (CameraAccessException e){
Log.e(TAG, "runPreCaptureSequence: ", e);
Crashlytics.log(TAG + " " + e);
}

}

private void captureStillPicture()
{

if (currentlyCapturing){
Log.d(TAG, "captureStillPicture: returning");
return;
}
Log.d(TAG, "captureStillPicture: capture chain 4");
//currentlyCapturing = true;

try{
final Activity activity = getActivity();
if (null == activity || null == mCameraDevice){
Log.d(TAG, "captureStillPicture: null checks");
return;
}

final CaptureRequest.Builder captureBuilder =
mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(imageReader.getSurface());

captureBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);

int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, getOrientation(rotation));

CameraCaptureSession.CaptureCallback captureCallback = new CameraCaptureSession.CaptureCallback() {

@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);

Log.d(TAG, "onCaptureCompleted: from chain 4");
unlockFocus();
//currentlyCapturing = false;

}
};

mCameraCaptureSession.stopRepeating();
mCameraCaptureSession.abortCaptures();
mCameraCaptureSession.capture(captureBuilder.build(), captureCallback, null);

}catch (CameraAccessException cae){
Log.e(TAG, "captureStillPicture: ", cae);
Crashlytics.log(TAG + " " + cae);
}

}

//endregion

//region------------------------------------------------------------------- Camera Supporting Methods

private int getOrientation(int rotation)
{
int returnValue = (ORIENTATIONS.get(rotation) + mSensorOrientation + 270) % 360;

Log.d(TAG, "getOrientation: in " + String.valueOf(rotation) + " out " + String.valueOf(returnValue));

return returnValue;
}

private void unlockFocus()
{

try {

previewBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER,
CameraMetadata.CONTROL_AF_TRIGGER_CANCEL);
mCameraCaptureSession.capture(previewBuilder.build(), mCaptureCallback,
mBackgroundHandler);

mState = STATE_PREVIEW;

mCameraCaptureSession.setRepeatingRequest(mPreviewRequest, mCaptureCallback, mBackgroundHandler);

} catch (CameraAccessException cae){
Log.e(TAG, "unlockFocus: ", cae );
Crashlytics.log(TAG + " " + cae);
}

}

private void requestCameraPermission()
{

if (ContextCompat.checkSelfPermission(getActivity(), Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED){

new ConfirmationDialog().show(getChildFragmentManager(), FRAGMENT_DIALOG);
}else {

Snackbar.make(myFragmentView, "Camera Permissions Already Granted", Snackbar.LENGTH_SHORT).setAction("action", null).show();
}

}

@SuppressWarnings("SuspiciousNameCombination")
private void setUpCameraOutputs(int width, int height)
{
Log.d(TAG, "setUpCameraOutputs: ");

Activity activity = getActivity();
CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
try{

for (String cameraID :
manager.getCameraIdList()) {

CameraCharacteristics characteristics
= manager.getCameraCharacteristics(cameraID);

Integer frontFacing = characteristics.get(CameraCharacteristics.LENS_FACING);
if (frontFacing != null && frontFacing == CameraCharacteristics.LENS_FACING_FRONT){
continue;
}

StreamConfigurationMap map = characteristics.get(
CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
if (map== null){
continue;
}

android.util.Size largest = Collections.max(
Arrays.asList(map.getOutputSizes(ImageFormat.JPEG)), new CompareSizesByArea());
imageReader = ImageReader.newInstance(largest.getWidth(), largest.getHeight(),
ImageFormat.JPEG, 2);
imageReader.setOnImageAvailableListener(mOnImageAvailableListener, mBackgroundHandler);

// Find out if we need to swap dimension to get the preview size relative to sensor
// coordinate.
int displayRotation = activity.getWindowManager().getDefaultDisplay().getRotation();
//noinspection ConstantConditions
mSensorOrientation = characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
boolean swappedDimensions = false;
switch (displayRotation) {
case Surface.ROTATION_0:
case Surface.ROTATION_180:
if (mSensorOrientation == 90 || mSensorOrientation == 270) {
swappedDimensions = true;
}
break;
case Surface.ROTATION_90:
case Surface.ROTATION_270:
if (mSensorOrientation == 0 || mSensorOrientation == 180) {
swappedDimensions = true;
}
break;
default:
Log.e(TAG, "Display rotation is invalid: " + displayRotation);
Crashlytics.log(TAG + " " + displayRotation);
}

Point displaySize = new Point();
activity.getWindowManager().getDefaultDisplay().getSize(displaySize);
int rotatedPreviewWidth = width;
int rotatedPreviewHeight = height;
int maxPreviewWidth = displaySize.x;
int maxPreviewHeight = displaySize.y;

if (swappedDimensions) {
rotatedPreviewWidth = height;
rotatedPreviewHeight = width;
maxPreviewWidth = displaySize.y;
maxPreviewHeight = displaySize.x;
}

if (maxPreviewWidth > MAX_PREVIEW_WIDTH) {
maxPreviewWidth = MAX_PREVIEW_WIDTH;
}

if (maxPreviewHeight > MAX_PREVIEW_HEIGHT) {
maxPreviewHeight = MAX_PREVIEW_HEIGHT;
}

mPreviewSize = chooseOptimalSize(map.getOutputSizes(SurfaceTexture.class),
rotatedPreviewWidth, rotatedPreviewHeight, maxPreviewWidth,
maxPreviewHeight, largest);

// We fit the aspect ratio of TextureView to the size of preview we picked.
int orientation = getResources().getConfiguration().orientation;
if (orientation == Configuration.ORIENTATION_LANDSCAPE) {
mTextureView.setAspectRatio(
mPreviewSize.getWidth(), mPreviewSize.getHeight());
} else {
mTextureView.setAspectRatio(
mPreviewSize.getHeight(), mPreviewSize.getWidth());
}

// Check if the flash is supported.
Boolean available = characteristics.get(CameraCharacteristics.FLASH_INFO_AVAILABLE);
mFlashSupported = available == null ? false : available;

mCameraId = cameraID;
return;

}

} catch (CameraAccessException e){
e.printStackTrace();
}catch (NullPointerException e){
ErrorDialog.newInstance(getString(R.string.camera_error))
.show(getChildFragmentManager(), FRAGMENT_DIALOG);
}

}

private void configureTransform(int viewWidth, int viewHeight)
{
Log.d(TAG, "configureTransform: ");

Activity activity = getActivity();

if (null == mTextureView || null == mPreviewSize || null == activity){

return;
}

int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
Matrix matrix = new Matrix();
RectF viewRect = new RectF(0,0, viewWidth, viewHeight);
RectF bufferRect = new RectF(0,0, mPreviewSize.getHeight(), mPreviewSize.getWidth());
float centerX = viewRect.centerX();
float centerY = viewRect.centerY();

if (Surface.ROTATION_90 == rotation || Surface.ROTATION_270 == rotation){

bufferRect.offset(centerX - bufferRect.centerX(), centerY - bufferRect.centerY());
matrix.setRectToRect(viewRect, bufferRect, Matrix.ScaleToFit.FILL);
float scale = Math.max(
(float) viewHeight / mPreviewSize.getHeight(),
(float) viewWidth / mPreviewSize.getWidth());

matrix.postScale(scale, scale, centerX, centerY);
matrix.postRotate(90 * (rotation - 2), centerX, centerY);
} else if (Surface.ROTATION_180 == rotation){
matrix.postRotate(180, centerX ,centerY);
}

mTextureView.setTransform(matrix);

}

private static android.util.Size chooseOptimalSize(android.util.Size[] choices, int textureViewWidth,
int textureViewHeight, int maxWidth, int maxHeight,
android.util.Size aspectRatio)
{

Log.d(TAG, "chooseOptimalSize: ");
List<android.util.Size> bigEnough = new ArrayList<>();

List<android.util.Size> notBigEnough = new ArrayList<>();

int w = aspectRatio.getWidth();
int h = aspectRatio.getHeight();

for (android.util.Size option : choices){

if (option.getWidth() <= maxWidth && option.getHeight() <= maxHeight &&
option.getHeight() == option.getWidth() * h / w){
if (option.getWidth() >= textureViewWidth &&
option.getHeight() >= textureViewHeight){
bigEnough.add(option);
}else {
notBigEnough.add(option);
}

}

}

if (bigEnough.size() > 0){

return Collections.min(bigEnough, new CompareSizesByArea());

} else if (notBigEnough.size() > 0 ){

return Collections.max(notBigEnough, new CompareSizesByArea());

}else {
Log.e(TAG, "chooseOptimalSize: couldn't find suitable preview size");
Crashlytics.log(TAG + " " + "chooseOptimalSize: couldn't find suitable preview size");
return choices[0];
}

}

// TODO: 6/1/2018 set auto flash

//endregion

//region------------------------------------------------------------------- Lesser Methods

private void showToast(final String text)
{
Log.d(TAG, "showToast: ");
final Activity activity = getActivity();
if (activity != null){

activity.runOnUiThread(new Runnable() {
@Override
public void run() {
Toast.makeText(activity, text, Toast.LENGTH_SHORT).show();
}
});

}

}

@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
super.onRequestPermissionsResult(requestCode, permissions, grantResults);
}

private void startBackgroundThread()
{
Log.d(TAG, "startBackgroundThread: ");

mBackgroundThread = new HandlerThread("CameraBackground");
mBackgroundThread.start();
mBackgroundHandler = new Handler(mBackgroundThread.getLooper());

}

private void stopBackgroundThread()
{
Log.d(TAG, "stopBackgroundThread: ");
mBackgroundThread.quitSafely();
try {
mBackgroundThread.join();
mBackgroundThread = null;
mBackgroundHandler = null;
} catch (InterruptedException e) {
e.printStackTrace();
}

}

private boolean notToBusyToComply()
{
Log.d(TAG, "notToBusyToComply: ");
return ((MainActivity)getActivity()).notToBusyToComply();

}

//endregion

//region------------------------------------------------------------------- LifeCycle

@Override
public void onResume() {
super.onResume();

startBackgroundThread();

if (mTextureView.isAvailable()) {
openCamera(mTextureView.getWidth(), mTextureView.getHeight());
} else {
mTextureView.setSurfaceTextureListener(mSurfaceTextureListener);
}
}

@Override
public void onPause() {
closeCamera();
stopBackgroundThread();
super.onPause();
}

@Override
public void onDestroyView() {
super.onDestroyView();
//RefWatcher refWatcher = EasyReceipt.getRefwatcher(getActivity());
//refWatcher.watch(this);
}

//endregion

//region------------------------------------------------------------------- Inner Classes

private class ImageSaver implements Runnable{

private final Image mImage;
private final FileManager mFileManager;

//private final File mFile;

ImageSaver(Image image, FileManager fileManager){

mImage = image;

mFileManager = fileManager;

//mFile = file;

}

@Override
public void run() {

File outputFile = null;
try {
outputFile = File.createTempFile(String.valueOf(System.currentTimeMillis()), ".jpg", getActivity().getCacheDir());
}catch (Exception e){
Log.e(TAG, "run: ", e);
Crashlytics.log(TAG + " " + e);
}

ByteBuffer buffer = mImage.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.remaining()];
buffer.get(bytes);

FileOutputStream fos = null;

try{
fos = new FileOutputStream(outputFile);
fos.write(bytes);

}catch (Exception e){
Log.e(TAG, "run: ", e);
Crashlytics.log(TAG + " " + e);
}finally {
mImage.close();

if (fos != null){
try{
fos.close();
}catch (Exception e){
Log.e(TAG, "run: ", e);
Crashlytics.log(TAG + " " + e);
}
}

}

((MainActivity)getActivity()).setUriofImageTOReview(Uri.fromFile(outputFile));
((MainActivity)getActivity()).loadCameraPreviewApprovalFrag();

/*
((MainActivity)getActivity()).loadCameraPreviewApprovalFrag(bytes);

mImage.close();

*/

}

}

static class CompareSizesByArea implements Comparator<android.util.Size> {

@Override
public int compare(android.util.Size lhs, android.util.Size rhs) {
// We cast here to ensure the multiplications won't overflow
return Long.signum((long) lhs.getWidth() * lhs.getHeight() -
(long) rhs.getWidth() * rhs.getHeight());
}

}

public static class ErrorDialog extends DialogFragment {

private static final String ARG_MESSAGE = "message";

public static ErrorDialog newInstance(String message){

ErrorDialog dialog = new ErrorDialog();
Bundle args = new Bundle();
args.putString(ARG_MESSAGE, message);
dialog.setArguments(args);
return dialog;

}

@NonNull
@Override
public Dialog onCreateDialog(Bundle savedInstanceState) {

final Activity activity = getActivity();
return new AlertDialog.Builder(activity)
.setMessage(getArguments().getString(ARG_MESSAGE))
.setPositiveButton(android.R.string.ok, new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialogInterface, int i) {
activity.finish();
}
}).create();

}

}

public static class ConfirmationDialog extends DialogFragment{

@NonNull
@Override
public Dialog onCreateDialog(Bundle savedInstanceState)
{

final Fragment parent = getParentFragment();
return new AlertDialog.Builder(getActivity())
.setMessage(R.string.request_permission)
.setPositiveButton(android.R.string.ok, new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialogInterface, int i) {

ActivityCompat.requestPermissions(getActivity(), new String[]{Manifest.permission.CAMERA}, REQUEST_CAMERA_PERMISSIONS);
}
})
.setNegativeButton(android.R.string.cancel, new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialogInterface, int i) {
Activity activity = parent.getActivity();
if (activity != null){
activity.finish();
}
}
}).create();

}

}

//endregion

}

 

 

My recent attempts at painting

Recently I started painting. As an unexpected perk, this leisure activity is improving my ability to examine the UI for my apps. I didn't have the introspection to figure this out myself; painting was suggested to me as a way to relax after work without staring at a television screen.

Besides offering a relaxing activity, it has the added bonus of exercising my mind in areas of art, and I have been purchasing books with paintings and on how to paint. Getting set up was inexpensive enough, and I find myself excited to relax after work and continue on with the current project. Hopefully after a year I will achieve some measure of success.

For now it has changed how I think. When I'm painting I tend to think: how can I make this more interesting and more pleasing to the viewer? If I don't completely blend the colors or fill in the details, then the viewer's mind finishes the job for me. The artwork is interactive and engaging.

When I used to build software I would think about making it easy to use and fulfilling a function, nothing more. Now I tend to consider how the person will quickly glance at the UI and let the colors and symbols blend together in a meaningful collage of useful information. I have only begun the process, but hopefully I will find the time to continue.

Working with Google Vision API on Android

Thanks to Google's text recognition API, I was able to effortlessly extract text from the receipts I wanted to parse.

Using google firebase vision text on android app easy receipt

 

In the picture above you can see I created a method I could pass images into, which returned results through a callback. The strategy was simple.

Here is the link I used to set it up.

 

1) Add the library through Gradle.

2) Create a method you can pass an image into, plus (if a background worker needs to wait on the result) a countdown latch.

3) Then, using the Firebase API, create an image input object, a detector object, and an on-success listener. A sketch of all three steps follows.
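
Here is a minimal sketch of those three steps, assuming the legacy firebase-ml-vision dependency; ReceiptTextReader, readText, and myCallback are illustrative names I picked for this post, not the app's actual code.

// build.gradle (app module) - the version number here is an assumption:
// implementation 'com.google.firebase:firebase-ml-vision:24.0.3'

import android.graphics.Bitmap;
import androidx.annotation.NonNull;
import com.google.android.gms.tasks.OnFailureListener;
import com.google.android.gms.tasks.OnSuccessListener;
import com.google.firebase.ml.vision.FirebaseVision;
import com.google.firebase.ml.vision.common.FirebaseVisionImage;
import com.google.firebase.ml.vision.text.FirebaseVisionText;
import com.google.firebase.ml.vision.text.FirebaseVisionTextRecognizer;
import java.util.concurrent.CountDownLatch;

public class ReceiptTextReader {

    public interface TextCallback {
        void onText(FirebaseVisionText result);
    }

    // Step 2: take an image plus a latch so a background worker can wait for the result.
    public static void readText(Bitmap bitmap, final CountDownLatch latch, final TextCallback callback) {

        // Step 3a: the image input object.
        FirebaseVisionImage image = FirebaseVisionImage.fromBitmap(bitmap);

        // Step 3b: the detector object.
        FirebaseVisionTextRecognizer detector =
                FirebaseVision.getInstance().getOnDeviceTextRecognizer();

        // Step 3c: the on-success listener; processImage() is asynchronous.
        detector.processImage(image)
                .addOnSuccessListener(new OnSuccessListener<FirebaseVisionText>() {
                    @Override
                    public void onSuccess(FirebaseVisionText result) {
                        callback.onText(result);
                        latch.countDown();   // release the waiting worker
                    }
                })
                .addOnFailureListener(new OnFailureListener() {
                    @Override
                    public void onFailure(@NonNull Exception e) {
                        latch.countDown();   // don't leave the worker hanging on errors
                    }
                });
    }
}

And the worker side (for example, inside an IntentService's onHandleIntent) simply blocks on the latch:

CountDownLatch latch = new CountDownLatch(1);
ReceiptTextReader.readText(bitmap, latch, myCallback);   // returns immediately
try {
    latch.await();   // parsing resumes once the listener has fired
} catch (InterruptedException e) {
    Thread.currentThread().interrupt();
}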

 

That's it! But you may notice two things:

A) I used a countdown latch (as in the worker-side sketch above). My text recognition ran in an IntentService, completely separate from the UI, and the latch let me halt the overall parsing strategy while waiting for the vision results.

B) Google's API returns a vision object, but this object was not serializable, so I made a serializable version and did the conversion myself. The vision result gives you a bounding box, so you can easily determine each piece of text's location and size on the screen and therefore make critical decisions about what information you are looking at.

For example, the vendor's name is often larger and directly adjacent to the phone number or address. These two factors alone will give you the correct vendor with startling accuracy.

Or, for example, the price is usually in the lower right half (localization excluded).
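
To make that concrete, here is a hedged sketch of one such heuristic, not the app's actual logic: guess the vendor by taking the tallest text block in the upper half of the image, using the bounding boxes the vision result provides.

// Illustrative heuristic only: the tallest block near the top is probably the vendor.
static String guessVendor(FirebaseVisionText result, int imageHeight) {
    String vendor = null;
    int tallest = 0;
    for (FirebaseVisionText.TextBlock block : result.getTextBlocks()) {
        android.graphics.Rect box = block.getBoundingBox();
        if (box == null) continue;   // the bounding box is nullable per the API docs
        if (box.centerY() < imageHeight / 2 && box.height() > tallest) {
            tallest = box.height();
            vendor = block.getText();
        }
    }
    return vendor;
}

In practice you would combine several signals (text height, adjacency to a phone-number pattern, position on the page) and score candidates rather than trust a single rule.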

 

Anyways, I enjoyed using it, and it's always fun to stand on somebody else's shoulders when building your project. Thanks, Google!


 

Easy Receipt

Easy Receipt is an app I made that lets you easily organize your receipts. Anyone who has had to track receipts knows the labor of carrying a folder around, then manually entering them into a computer, and probably scanning them as well.

 

This app is CONVENIENT.

Just take a picture of the receipt as soon as someone gives it to you, then store the paper in a box never to be opened again.

easy receipt receipt tracking app

Once Easy Receipt is on your phone, you can casually swipe through your receipts and edit them at your leisure. Easy Receipt even guesses what should be entered, so most of the time it's already done for you.

easy receipt receipt tracking app

As you can see, it's not perfectly accurate on the first try. But it stores the names of locations, so this receipt will likely scan perfectly the next time it's scanned.

 

There are also custom editing and cropping features.

easy receipt receipt tracking app

 

When you're done, you can send yourself an email with the receipt images and a CSV file.

easy receipt receipt tracking app

Electric Skate Board

electric skateboard

 

This post describes my electric skateboard build. I have only ridden this board seven miles, so I can't say how far it will go. But I can say that it's scary fast and very fun!

Electrical was a cinch!

Below you can see the layout of the parts needed for the electronics. I screwed and glued a waterproof case to the bottom of the board to contain all of this.

electric skateboard parts

My big mistake

This board is fast, but I wish it were higher off the ground with larger inflatable tires. Secondly, because an electric motor's low-RPM torque is terrible, I wish I had a kit that reduced the gear ratio even more than my current kit, which was purchased off Amazon (see the rough numbers below). Look at the pictures that follow and notice how the wheel has a bunch of small holes; making sure your gear fits your wheel is probably the most difficult part of the build if you order the wrong stuff. Also, my belt was at maximum tension, so I ended up adding bearings on each side to tension the belt.
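
As a rough worked example of what I mean (assumed tooth counts, not my actual parts): with a 16-tooth motor pulley and a 36-tooth wheel pulley, the reduction is 36 / 16 = 2.25, so the wheel sees about 2.25 times the motor's torque while spinning at 1 / 2.25 of its speed. Swapping to a 44-tooth wheel pulley raises the reduction to 44 / 16 = 2.75, trading some top speed for the low-RPM pull the motor lacks.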

electric skateboard wheel

 

electric skateboard pulley idler

 

Controlling software

So I had to code the Arduino to accept Bluetooth commands. The code below polls the Bluetooth serial connection and then gradually increases or reduces an int value that the ESC is programmed to recognize (for a standard RC ESC driven through the Servo library, 90 is neutral and larger values mean more throttle).

 

 

#include <Servo.h>
#include <SoftwareSerial.h>
#include <Arduino.h>

SoftwareSerial mySerial(5, 6);   // RX, TX pins wired to the Bluetooth module
Servo myServo;                   // the ESC is driven like a servo: 90 = neutral
int input = 0;                   // raw byte read from Bluetooth
int inChar = 90;                 // target speed requested by the phone
int current = 90;                // speed actually being sent to the ESC
int waiter = 0;                  // loop counter used to slow the ramp-up

void setup() {

  Serial.begin(9600);
  mySerial.begin(9600);
  myServo.attach(9);             // ESC signal wire on pin 9

  Serial.println("Arduino Ready!!");
}

void loop() {

  waiter++;

  if (mySerial.available() > 0) {

    input = mySerial.read();

    // The app sends a 4-byte int, so the three zero padding bytes are
    // ignored and the last non-zero byte becomes the new target speed.
    if (input != 0) {
      inChar = input;
    }
  }

  delay(15);

  if (current < inChar) {
    // Accelerate gently: one step roughly every 10 passes through the loop.
    if (waiter > 10) {
      current++;
      waiter = 0;
    }
  } else if (current > inChar) {
    // Slow down faster: one step every pass.
    current--;
  }

  myServo.write(current);

  Serial.println(current);
}

 

The Android code was lengthy. Below is a view inside Android Studio where you'll see my ConstraintLayout.

skateboard control android app

 

 

Following that are the two classes composing the skateboard controller.

 

<pre>public class MainActivity extends AppCompatActivity {

    String TAG = "MainActivity";
    TextView textViewSkateBoardInfo, textViewFeedback;
    EditText editTextInputToBoard;
    Button buttonSubmit, buttonBrake, buttonCoast, buttonCruise, buttonMinus, buttonPlus;
    Spinner spinnerDevices;
    SeekBar seekBarSpeed;

    int desiredSpeed = 0;


    String[] deviceNames;
    BluetoothAdapter bluetoothAdapter;
    ArrayAdapter<String> spinnerAdapter;
    int chosenDevice = 0;
    UUID MY_UUID;
    boolean threadControl = false;
    boolean bluetoothConnectionActive = false;
    Set<BluetoothDevice> pairedDevices;
    MyBlueToothService myBlueToothService;
     Handler mHandler;

    ConnectThread connectThread;

    //region----------------------------------------    Overrides & Permissions




    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
        this.setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);

        setUpUI();

        checkPermissions();

    }



    private void checkPermissions()
    {
        String[] permissions = new String[] {

                Manifest.permission.ACCESS_FINE_LOCATION,
                Manifest.permission.ACCESS_COARSE_LOCATION,
                Manifest.permission.BLUETOOTH,
                Manifest.permission.BLUETOOTH_ADMIN
        };


        ActivityCompat.requestPermissions(MainActivity.this, permissions, 10);


    }



    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);


        setupBlueTooth();

    }


    //endregion


    //region----------------------------------------    UI



    private void setUpUI()
    {
        MY_UUID = UUID.fromString("ca805ff2-39d8-47ca-a9a4-ca6227358943");
        textViewFeedback = findViewById(R.id.textview_FeedBack);
        textViewSkateBoardInfo = findViewById(R.id.textView_SkateBoard);
        editTextInputToBoard = findViewById(R.id.editText_InputToSkateBoard);
        buttonSubmit = findViewById(R.id.button_Submit);
        spinnerDevices = findViewById(R.id.spinner_Devices);
        buttonBrake = findViewById(R.id.button_Brake);
        seekBarSpeed = findViewById(R.id.seekBar_speed);
        buttonCoast = findViewById(R.id.button_Coast);
        buttonCruise = findViewById(R.id.button_Push);
        buttonMinus = findViewById(R.id.button_minusFive);
        buttonPlus = findViewById(R.id.button_plusFive);

        seekBarSpeed.setProgress(90);

        buttonSubmit.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                buttonClick();
            }
        });

        buttonCoast.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                desiredSpeed = 98;
                seekBarSpeed.setProgress(98);
                sendSpeed(desiredSpeed);
                String s = "Speed set at " + String.valueOf(desiredSpeed);
                writeToInternal(s);

            }
        });

        buttonCruise.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                desiredSpeed = 105;
                seekBarSpeed.setProgress(105);
                sendSpeed(desiredSpeed);
                String s = "Speed set at " + String.valueOf(desiredSpeed);
                writeToInternal(s);

            }
        });

        buttonMinus.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                desiredSpeed = desiredSpeed -5;
                seekBarSpeed.setProgress(desiredSpeed);
                sendSpeed(desiredSpeed);
                String s = "Speed set at " + String.valueOf(desiredSpeed);
                writeToInternal(s);

            }
        });

        buttonPlus.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                desiredSpeed = desiredSpeed +5;
                seekBarSpeed.setProgress(desiredSpeed);
                sendSpeed(desiredSpeed);
                String s = "Speed set at " + String.valueOf(desiredSpeed);
                writeToInternal(s);

            }
        });




        seekBarSpeed.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
            @Override
            public void onProgressChanged(SeekBar seekBar, int i, boolean b) {
                desiredSpeed = i;

            }

            @Override
            public void onStartTrackingTouch(SeekBar seekBar) {

            }

            @Override
            public void onStopTrackingTouch(SeekBar seekBar) {
                sendSpeed(desiredSpeed);
                String s = "Speed set at " + String.valueOf(desiredSpeed);
                writeToInternal(s);
            }
        });


        buttonBrake.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                desiredSpeed = 90;
                seekBarSpeed.setProgress(90);
                sendSpeed(desiredSpeed);
                String s = "Speed set at " + String.valueOf(desiredSpeed);
                writeToInternal(s);

            }
        });



        mHandler = new Handler() {

            @Override
            public void handleMessage(Message msg) {

                if (msg.what == 0) {

                    byte[] b = (byte[]) msg.obj;
                    // TODO: 5/12/2018 heres the response
                    String s = new String(b);

                    textViewFeedback.setText(s);

                }

            }
        };

    }




    private void buttonClick()
    {
        String s = editTextInputToBoard.getText().toString();

        if (!bluetoothConnectionActive) {
            runConnectThread();
            Log.d(TAG, "buttonClick: connecting");
        } else if (s.equals("")) {
            // An empty submit while connected acts as a disconnect.
            connectThread.cancel();
            myBlueToothService.cancel();
            Log.d(TAG, "buttonClick: disconnecting");
            setupBlueTooth();
        }

        if (!s.equals("")) {
            sendString(s);
            Log.i(TAG, "buttonClick: sending text input");
        }

    }

    public void writeToInternal(String input)
    {
        final String s = input;

        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                textViewSkateBoardInfo.setText(s);

            }
        });



    }

    public void writeToBlueToothResponse(String input)
    {
        final String s = input;
        Log.d(TAG, "writeToBlueToothResponse: " + input);

        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                textViewFeedback.setText(s);
            }
        });


    }


    public void sendString(String s)
    {
        try {
            byte[] bytes = s.getBytes("UTF-8");
            myBlueToothService.writeToDevice(bytes);
            writeToInternal(s);
        }catch (UnsupportedEncodingException e){
            Log.e(TAG, "sendString: ", e);
            writeToInternal("Failed to send");
        }

    }


    public void sendSpeed(int toSend)
    {
        // Pack the speed as a 4-byte big-endian int (see the note after this listing).
        byte[] b = ByteBuffer.allocate(4).putInt(toSend).array();

        if (bluetoothConnectionActive) {

            myBlueToothService.writeToDevice(b);

            Log.i(TAG, "sendSpeed: sending...");
        }
    }


    //endregion


    //region----------------------------------------    Bluetooth

    private void setupBlueTooth()
    {
        bluetoothAdapter = BluetoothAdapter.getDefaultAdapter();
        if (bluetoothAdapter == null) {
            Toast.makeText(this, "device doesn't support bluetooth", Toast.LENGTH_SHORT).show();
            return;   // nothing more to set up without an adapter
        }

        if (!bluetoothAdapter.isEnabled()) {
            Intent enableBTIntent = new Intent(BluetoothAdapter.ACTION_REQUEST_ENABLE);
            startActivityForResult(enableBTIntent, 20);
        }

        pairedDevices = bluetoothAdapter.getBondedDevices();


        if (pairedDevices.size() > 0){
            deviceNames = new String[pairedDevices.size()];


            int tick = 0;
            for (BluetoothDevice device :
                    pairedDevices) {

                deviceNames[tick] = device.getName();



                tick++;
            }

            spinnerAdapter = new ArrayAdapter<String>(this,android.R.layout.simple_spinner_item, deviceNames);

            spinnerAdapter.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item);

            spinnerDevices.setAdapter(spinnerAdapter);

            spinnerDevices.setOnItemSelectedListener(new AdapterView.OnItemSelectedListener() {
                @Override
                public void onItemSelected(AdapterView<?> adapterView, View view, int i, long l) {

                    chosenDevice = i;

                }

                @Override
                public void onNothingSelected(AdapterView<?> adapterView) {

                }
            });

        }



    }



    public void runConnectThread()
    {
        BluetoothDevice deviceToInsert = null;

        for (BluetoothDevice d : pairedDevices) {
            if (d.getName().equals(deviceNames[chosenDevice])) {
                deviceToInsert = d;
            }
        }
        Log.i(TAG, "runConnectThread: try to connect");

        connectThread = new ConnectThread(deviceToInsert);
        connectThread.begin();

    }


    private class ConnectThread implements Runnable
    {
        private BluetoothSocket mmSocket;
        private final BluetoothDevice mmDevice;
        Thread thread;

        public ConnectThread(BluetoothDevice device)
        {
            if (device == null) {
                Toast.makeText(MainActivity.this, "BT Device Null", Toast.LENGTH_SHORT).show();
            }

            BluetoothSocket tmp = null;

            mmDevice = device;

            try {
                tmp = device.createRfcommSocketToServiceRecord(MY_UUID);
            } catch (Exception e) {
                Log.e(TAG, "ConnectThread: ", e);
            }

            mmSocket = tmp;

        }




        @Override
        public void run() {

            bluetoothAdapter.cancelDiscovery();

            try {
                // Skip the UUID socket from the constructor and use the common
                // reflection workaround of opening RFCOMM channel 1 directly.
                mmSocket = (BluetoothSocket) mmDevice.getClass()
                        .getMethod("createRfcommSocket", new Class[]{int.class})
                        .invoke(mmDevice, 1);
                mmSocket.connect();

            } catch (Exception e) {
                Log.e(TAG, "run: ", e);
                writeToInternal("failed connection to " + deviceNames[chosenDevice]);

                try {
                    mmSocket.close();
                } catch (Exception e2) {
                    Log.e(TAG, "run: ", e2);
                }
                return;

            }

            writeToInternal("connected to " + deviceNames[chosenDevice]);
            // hand the connected socket to the service that does the reading and writing
            myBlueToothService = new MyBlueToothService(mmSocket);
            Log.i(TAG, "Connection Success");
            bluetoothConnectionActive = true;
            passStart();

        }




        public void begin()
        {
            thread = new Thread(this);
            threadControl = true;
            thread.start();
        }

        public void cancel()
        {
            bluetoothConnectionActive = false;
            threadControl = false;
            try {
                mmSocket.close();
            } catch (Exception e) {
                Log.e(TAG, "cancel: ", e);
            }

        }

    }

    public void passStart()
    {
        myBlueToothService.passMain(this);
    }




    //endregion



    //region----------------------------------------    lifecycle

    @Override
    protected void onDestroy() {
        super.onDestroy();
        if (myBlueToothService != null) {
            myBlueToothService.cancel();
        }
    }


    //endregion


}</pre>
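
One subtlety worth calling out before the next class: sendSpeed() above packs the speed into a 4-byte big-endian int, while the Arduino sketch earlier reads one byte at a time and skips zeros. Here is a small self-contained illustration (the value 105 is arbitrary) of why that round-trips cleanly for speeds from 1 to 255:

<pre>import java.nio.ByteBuffer;

public class WireFormatDemo {
    public static void main(String[] args) {
        byte[] b = ByteBuffer.allocate(4).putInt(105).array();
        // Prints "0 0 0 105": the Arduino's  if (input != 0)  check discards
        // the three zero padding bytes, and 105 becomes the new target speed.
        for (byte x : b) {
            System.out.print((x & 0xFF) + " ");
        }
    }
}</pre>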

 

<pre>public class MyBlueToothService {

    private static final String TAG = "MyBlueToothService";   // assumed tag value

    // Shared state used by the connected thread below.
    private ConnectedThread connectedThread;
    private boolean threadBool = false;
    private byte[] byteHolder;

    private interface MessageConstants {
        public static final int MESSAGE_READ = 0;
        public static final int MESSAGE_WRITE = 1;
        public static final int MESSAGE_TOAST = 2;
    }


    public MyBlueToothService(BluetoothSocket socket)
    {
        connectedThread = new ConnectedThread(socket);
        connectedThread.startListening();
    }


    public void writeToDevice(byte[] b)
    {
        connectedThread.write(b);
    }


    public void cancel()
    {
        connectedThread.cancel();
    }

    public void passMain(MainActivity a)
    {
        connectedThread.setMainActivity(a);
    }



    private class ConnectedThread implements Runnable
    {
        Thread thread;
        MainActivity mainActivity;

        private final BluetoothSocket mmSocket;
        private  DataInputStream mmInputStream;
        private DataOutputStream mmOutputStream;
        private byte[] mmBuffer;

        public ConnectedThread(BluetoothSocket socket){

            mmSocket = socket;
            InputStream tmpIn = null;
            OutputStream tmpOut = null;


            try
            {
              tmpIn = socket.getInputStream();
                Log.i(TAG, "ConnectedThread: input stream success");
            }catch (Exception e){
                Log.e(TAG, "ConnectedThread: failed", e);
            }
            try
            {
              tmpOut = socket.getOutputStream();
                Log.i(TAG, "ConnectedThread: output stream success");
            }catch (Exception e){
                Log.e(TAG, "ConnectedThread: failed", e);
            }

            mmInputStream = new DataInputStream(tmpIn);

            mmOutputStream = new DataOutputStream(tmpOut);





        }


        public void run()
        {
            mmBuffer = new byte[1024];
            int numbytes = 0;


            while (threadBool)
            {
                try
                {
                    numbytes = mmInputStream.read(mmBuffer);

                    if (numbytes > 0 && mainActivity != null) {
                        // Only convert the bytes actually read, not the whole 1024-byte buffer.
                        String s = new String(mmBuffer, 0, numbytes, "UTF-8");
                        mainActivity.writeToBlueToothResponse(s);
                    }

                }catch (Exception e){
                    Log.e(TAG, "run: ", e);
                    break;
                }



            }




        }





        public void write(byte[] bytes)
        {
            byteHolder = bytes;

            Thread t = new Thread(new Runnable() {
                @Override
                public void run() {

                            WriteToBT();

                }
            });
            t.start();

        }





        private void WriteToBT()
        {
            //Log.d(TAG, "WriteToBT: " + String.valueOf(byteHolder.length));


            try{
                mmOutputStream.write(byteHolder);
                //Log.i(TAG, "WriteToBT: passed");
                int i = new BigInteger(byteHolder).intValue();
                //Log.d(TAG, "WriteToBT: " + String.valueOf(i));

            }catch (Exception e){
                Log.e(TAG, "write: ", e);

            }
        }



        public void startListening()
        {
            threadBool = true;
            thread = new Thread(this);
            thread.start();
        }



        public void cancel()
        {
            threadBool = false;
            try{
                mmSocket.close();
            }catch (Exception e){
                Log.e(TAG, "cancel: ", e);
            }


        }



        public void setMainActivity(MainActivity a)
        {
            mainActivity = a;
        }



    }



}</pre>

and that’s it!