If you are building an app with the Android Camera2 API, here are my thoughts after fighting with it. (Full code at the bottom for copy-paste junkies.)
This API was a little verbose with me, coming in at about 1,200 lines of code. It could probably be done more simply, but if you want something custom, this is what you might end up with. I copied this code from the GitHub example; the full-blown example is here.
Here is my code, all folded up, with some clear descriptions of what everything does. The code below is essentially just the basic camera example, twisted and reorganized so it makes sense to me. Notice this is all contained in a fragment.

There are three things someone using my version, or the original GitHub version, would need to change. If you are tackling this project, don’t hesitate to copy the code on this page and focus on the changes you need instead of trying to wrap your head around the whole project.
The first is the button setup. I’m not really interested in diving into this. Check my code’s “Camera Still Picture Chain” region and you can see how the events are initiated; the trace just below follows the flow.
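For orientation, here is the rough flow of that chain in the code below. These are the actual method and state names from my fragment, just traced out as comments; there is nothing new here to add.

// fabTakePicture click
//   -> takePicture()              // "capture chain 1"
//   -> lockFocus()                // CONTROL_AF_TRIGGER_START, mState = STATE_WAITING_LOCK ("capture chain 2")
//   -> mCaptureCallback.process() // watches CONTROL_AF_STATE / CONTROL_AE_STATE on every result
//   -> runPreCaptureSequence()    // CONTROL_AE_PRECAPTURE_TRIGGER_START if exposure isn't converged ("capture chain 3")
//   -> captureStillPicture()      // TEMPLATE_STILL_CAPTURE request targeting the ImageReader ("capture chain 4")
//   -> onImageAvailable()         // posts an ImageSaver to the background handler
//   -> unlockFocus()              // CONTROL_AF_TRIGGER_CANCEL, back to the repeating preview request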
The second is the save method (see “Inner Classes” in my code folds). The example gives you a runnable ImageSaver, which will probably need to be reworked for your file storage system, or reworked if you need to hand the image off for further processing. When working with large image files, it’s best to save the file, pass a URI around, and decode smaller samples of the image to reduce heap size; see the sketch after this paragraph.
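To make “take smaller samples” concrete, here is a minimal sketch of decoding a downsampled bitmap from the saved file’s Uri using BitmapFactory’s inSampleSize. The method name and the target dimensions are mine, not part of the example project, so treat it as a starting point rather than the way my app does it.

// Decode a bitmap from a Uri at (roughly) the requested size instead of full resolution.
// Illustrative helper, not from the project above; adjust names and sizes to your own setup.
private Bitmap decodeSampledBitmap(Context context, Uri uri, int reqWidth, int reqHeight) throws IOException {
    BitmapFactory.Options options = new BitmapFactory.Options();
    options.inJustDecodeBounds = true;                       // first pass: read dimensions only
    try (InputStream in = context.getContentResolver().openInputStream(uri)) {
        BitmapFactory.decodeStream(in, null, options);
    }
    int inSampleSize = 1;                                    // power-of-two shrink factor
    while (options.outWidth / (inSampleSize * 2) >= reqWidth
            && options.outHeight / (inSampleSize * 2) >= reqHeight) {
        inSampleSize *= 2;
    }
    options.inJustDecodeBounds = false;
    options.inSampleSize = inSampleSize;                     // second pass: decode the shrunken bitmap
    try (InputStream in = context.getContentResolver().openInputStream(uri)) {
        return BitmapFactory.decodeStream(in, null, options);
    }
}
// Needs: android.content.Context, android.graphics.Bitmap, android.graphics.BitmapFactory,
// android.net.Uri, java.io.IOException, java.io.InputStream.

Halving both dimensions cuts the bitmap’s memory to a quarter, which is usually the difference between a smooth review screen and an OutOfMemoryError.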
Third: why does Samsung spin the dang images? This took me a while to figure out, and I was super upset about it. Here is the code my “Image Review” fragment uses to rotate the image the right way before displaying it. I believe this was pieced together from several sources, and I have no idea who to give credit to.
private void rotateImage(int degree) {
    Log.d(TAG, "rotateImage: ");
    Matrix mat = new Matrix();
    mat.postRotate(degree);
    bitmapToReview = Bitmap.createBitmap(bitmapToReview, 0, 0,
            bitmapToReview.getWidth(), bitmapToReview.getHeight(), mat, true);
}

private void createPreviewImage() {
    //get exif data and make bitmap
    int orientation = 0;
    try {
        ExifInterface exifInterface = new ExifInterface(uriOfImage.getPath());
        bitmapToReview = MediaStore.Images.Media.getBitmap(getActivity().getContentResolver(), uriOfImage);
        orientation = exifInterface.getAttributeInt(ExifInterface.TAG_ORIENTATION,
                ExifInterface.ORIENTATION_NORMAL);
    } catch (Exception e) {
        Log.e(TAG, "createPreviewImage: ", e);
        Crashlytics.log(TAG + " " + e);
        Toast.makeText(getActivity(), "Error loading image", Toast.LENGTH_SHORT).show();
    }

    //check rotation and rotate if needed
    switch (orientation) {
        case ExifInterface.ORIENTATION_ROTATE_90:
            Log.d(TAG, "createPreviewImage: 90");
            rotateImage(90);
            break;
        case ExifInterface.ORIENTATION_ROTATE_180:
            Log.d(TAG, "createPreviewImage: 180");
            rotateImage(180);
            break;
        case ExifInterface.ORIENTATION_ROTATE_270:
            Log.d(TAG, "createPreviewImage: 270");
            rotateImage(270);
            break;
    }

    //display on screen
    imageView_Preview.setImageBitmap(bitmapToReview);
}
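As an aside, the snippet above doesn’t show its imports. If you are using (or willing to pull in) the androidx ExifInterface library, I believe the whole switch collapses into one call. Treat this as an untested sketch rather than what my fragment actually does.

// Sketch assuming androidx.exifinterface.media.ExifInterface, whose getRotationDegrees()
// maps ORIENTATION_ROTATE_90/180/270 to 90/180/270 (and returns 0 when there is no rotation tag).
ExifInterface exifInterface = new ExifInterface(uriOfImage.getPath());
bitmapToReview = MediaStore.Images.Media.getBitmap(getActivity().getContentResolver(), uriOfImage);
int degrees = exifInterface.getRotationDegrees();
if (degrees != 0) {
    rotateImage(degrees);
}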
So that’s it. This post was basically to complain that I spent a week retyping this entire thing out to prove that I could tame it. In reality, I licked my wounds and moved on with my life, because sometimes there are more important things to do than fight the system.
For the full code, see below, and try not to be frightened.
import android.Manifest;
import android.app.Activity;
import android.app.AlertDialog;
import android.app.Dialog;
import android.app.DialogFragment;
import android.app.Fragment;
import android.content.Context;
import android.content.DialogInterface;
import android.content.pm.PackageManager;
import android.content.res.Configuration;
import android.graphics.ImageFormat;
import android.graphics.Matrix;
import android.graphics.Point;
import android.graphics.RectF;
import android.graphics.SurfaceTexture;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.CaptureResult;
import android.hardware.camera2.TotalCaptureResult;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.Image;
import android.media.ImageReader;
import android.net.Uri;
import android.os.Bundle;
import android.os.Handler;
import android.os.HandlerThread;
import android.support.annotation.NonNull;
import android.support.annotation.Nullable;
import android.support.design.widget.FloatingActionButton;
import android.support.design.widget.Snackbar;
import android.support.v4.app.ActivityCompat;
import android.support.v4.content.ContextCompat;
import android.util.Log;
import android.util.SparseIntArray;
import android.view.LayoutInflater;
import android.view.Surface;
import android.view.TextureView;
import android.view.View;
import android.view.ViewGroup;
import android.widget.Toast;

import com.crashlytics.android.Crashlytics;
import com.signal.cagney.easyreceipt.AutoFitTextureView;
import com.signal.cagney.easyreceipt.EasyReceipt;
import com.signal.cagney.easyreceipt.MainActivity;
import com.signal.cagney.easyreceipt.R;
import com.signal.cagney.easyreceipt.Util.FileManager;
import com.squareup.leakcanary.RefWatcher;

import java.io.File;
import java.io.FileOutputStream;
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

public class Main_Fragment extends Fragment implements ActivityCompat.OnRequestPermissionsResultCallback {

    private static final String TAG = "MAIN_FRAGMENT";

    View myFragmentView;
    private AutoFitTextureView mTextureView;
    boolean currentlyCapturing;
    public final static int GALLERY_CHOOSE = 12;
    FileManager fileManager;

    //region------------------------camera states
    private static final int STATE_PREVIEW = 0;

    /**
     * Camera state: Waiting for the focus to be locked.
     */
    private static final int STATE_WAITING_LOCK = 1;

    /**
     * Camera state: Waiting for the exposure to be precapture state.
     */
    private static final int STATE_WAITING_PRECAPTURE = 2;

    /**
     * Camera state: Waiting for the exposure state to be something other than precapture.
     */
    private static final int STATE_WAITING_NON_PRECAPTURE = 3;

    /**
     * Camera state: Picture was taken.
     */
    private static final int STATE_PICTURE_TAKEN = 4;

    /**
     * Max preview width that is guaranteed by Camera2 API
     */
    private static final int MAX_PREVIEW_WIDTH = 1920;

    /**
     * Max preview height that is guaranteed by Camera2 API
     */
    private static final int MAX_PREVIEW_HEIGHT = 1080;
    //endregion

    //region------------------------------------------------------- camera fields
    private CameraDevice mCameraDevice;
    private CaptureRequest.Builder previewBuilder;
    private CaptureRequest mPreviewRequest;
    private CameraCaptureSession mCameraCaptureSession;
    private static final SparseIntArray ORIENTATIONS = new SparseIntArray();
    private static final int REQUEST_CAMERA_PERMISSIONS = 1;
    private static final String FRAGMENT_DIALOG = "dialog";
    private ImageReader imageReader;
    private int mSensorOrientation;
    private Handler mBackgroundHandler;
    private int mState = STATE_PREVIEW;
    private Semaphore mCameraOpenCloseLock = new Semaphore(1);
    private String mCameraId;
    private HandlerThread mBackgroundThread;
    private boolean mFlashSupported;
    private android.util.Size mPreviewSize;

    static {
        ORIENTATIONS.append(Surface.ROTATION_0, 90);
        ORIENTATIONS.append(Surface.ROTATION_90, 0);
        ORIENTATIONS.append(Surface.ROTATION_180, 270);
        ORIENTATIONS.append(Surface.ROTATION_270, 180);
    }

    private final CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {

        @Override
        public void onOpened(@NonNull CameraDevice cameraDevice) {
            Log.d(TAG, "onOpened: ");
            mCameraOpenCloseLock.release();
            mCameraDevice = cameraDevice;
            creatCameraPreviewSession();
        }

        @Override
        public void onDisconnected(@NonNull CameraDevice cameraDevice) {
            Log.d(TAG, "onDisconnected: ");
            mCameraOpenCloseLock.release();
            cameraDevice.close();
            mCameraDevice = null;
        }

        @Override
        public void onError(@NonNull CameraDevice cameraDevice, int i) {
            Log.d(TAG, "onError: ");
            mCameraOpenCloseLock.release();
            cameraDevice.close();
            mCameraDevice = null;
            Activity activity = getActivity();
            if (null != activity) {
                activity.finish();
            }
        }
    };

    private final ImageReader.OnImageAvailableListener mOnImageAvailableListener =
            new ImageReader.OnImageAvailableListener() {
        @Override
        public void onImageAvailable(ImageReader imageReader) {
            Log.d(TAG, "onImageAvailable: ");
            Image image = imageReader.acquireNextImage();
            mBackgroundHandler.post(new ImageSaver(image, fileManager));
        }
    };

    private CameraCaptureSession.CaptureCallback mCaptureCallback = new CameraCaptureSession.CaptureCallback() {

        private void process(CaptureResult result) {
            switch (mState) {
                case STATE_PREVIEW: {
                    // working normal. do nothing
                    //Log.d(TAG, "process: " + result.toString());
                    break;
                }
                case STATE_WAITING_LOCK: {
                    Integer afState = result.get(CaptureResult.CONTROL_AF_STATE);
                    Log.d(TAG, "process: state awaiting afstate = " + String.valueOf(afState) + " Captureresult = " + result.toString());
                    if (afState == null || afState == CaptureResult.CONTROL_MODE_OFF) {
                        Log.d(TAG, "process: null");
                        captureStillPicture();
                    } else if (CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED == afState
                            || CaptureResult.CONTROL_AF_STATE_NOT_FOCUSED_LOCKED == afState) {
                        Log.d(TAG, "process: something else");
                        Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
                        if (aeState == null || aeState == CaptureResult.CONTROL_AE_STATE_CONVERGED) {
                            Log.d(TAG, "process: something even more");
                            mState = STATE_PICTURE_TAKEN;
                            captureStillPicture();
                        } else {
                            runPreCaptureSequence();
                        }
                    }
                    break;
                }
                case STATE_WAITING_PRECAPTURE: {
                    Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
                    Log.d(TAG, "process: precapture " + String.valueOf(aeState) + " Captureresult = " + result.toString());
                    if (aeState == null || aeState == CaptureResult.CONTROL_AE_STATE_PRECAPTURE
                            || aeState == CaptureRequest.CONTROL_AE_STATE_FLASH_REQUIRED) {
                        mState = STATE_WAITING_NON_PRECAPTURE;
                    }
                    break;
                }
                case STATE_WAITING_NON_PRECAPTURE: {
                    Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
                    Log.d(TAG, "process: non-precapture" + String.valueOf(aeState) + " Captureresult = " + result.toString());
                    if (aeState == null || aeState != CaptureResult.CONTROL_AE_STATE_PRECAPTURE) {
                        mState = STATE_PICTURE_TAKEN;
                        captureStillPicture();
                    }
                    break;
                }
            }
        }

        @Override
        public void onCaptureProgressed(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull CaptureResult partialResult) {
            //Log.d(TAG, "onCaptureProgressed: ");
            process(partialResult);
        }

        @Override
        public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
            //Log.d(TAG, "onCaptureCompleted: callback");
            process(result);
        }
    };

    private final TextureView.SurfaceTextureListener mSurfaceTextureListener = new TextureView.SurfaceTextureListener() {

        @Override
        public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int i, int i1) {
            Log.d(TAG, "onSurfaceTextureAvailable: ");
            openCamera(i, i1);
        }

        @Override
        public void onSurfaceTextureSizeChanged(SurfaceTexture surfaceTexture, int i, int i1) {
            Log.d(TAG, "onSurfaceTextureSizeChanged: ");
            configureTransform(i, i1);
        }

        @Override
        public boolean onSurfaceTextureDestroyed(SurfaceTexture surfaceTexture) {
            Log.d(TAG, "onSurfaceTextureDestroyed: ");
            return false;
        }

        @Override
        public void onSurfaceTextureUpdated(SurfaceTexture surfaceTexture) {
            //Log.d(TAG, "onSurfaceTextureUpdated: ");
        }
    };
    //endregion

    //region------------------------------------------------------------------- Fragment Setup
    @Override
    public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) {
        myFragmentView = inflater.inflate(R.layout.main_frag_layout, container, false);
        setupUI();
        return myFragmentView;
    }

    @Override
    public void onViewCreated(View view, @Nullable Bundle savedInstanceState) {
        super.onViewCreated(view, savedInstanceState);
    }

    @Override
    public void onActivityCreated(@Nullable Bundle savedInstanceState) {
        super.onActivityCreated(savedInstanceState);
        fileManager = ((MainActivity) getActivity()).getFileManager();
        //mFile = newPictureFileName();
    }

    private void setupUI() {
        mTextureView = (AutoFitTextureView) myFragmentView.findViewById(R.id.texture);

        FloatingActionButton fabGall = (FloatingActionButton) myFragmentView.findViewById(R.id.fabGallery);
        fabGall.setImageResource(R.drawable.folder);
        fabGall.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                if (notToBusyToComply()) {
                    ((MainActivity) getActivity()).openGallery();
                }
                /*
                Snackbar.make(view, "Replace with your own action", Snackbar.LENGTH_LONG)
                        .setAction("Action", null).show();
                */
            }
        });

        FloatingActionButton fabPic = (FloatingActionButton) myFragmentView.findViewById(R.id.fabTakePicture);
        fabPic.setImageResource(R.drawable.camera);
        fabPic.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                if (notToBusyToComply()) {
                    takePicture();
                }
            }
        });
    }
    //endregion

    //region------------------------------------------------------------------- Camera Main Methods
    private void openCamera(int width, int height) {
        Log.d(TAG, "openCamera: ");
        if (ContextCompat.checkSelfPermission(getActivity(), android.Manifest.permission.CAMERA)
                != PackageManager.PERMISSION_GRANTED) {
            requestCameraPermission();
            return;
        }
        Log.d(TAG, "openCamera: setup");
        setUpCameraOutputs(width, height);
        Log.d(TAG, "openCamera: configure");
        configureTransform(width, height);
        Activity activity = getActivity();
        CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
        try {
            if (!mCameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)) {
                throw new RuntimeException("Time out waiting to lock camera opening");
            }
            manager.openCamera(mCameraId, mStateCallback, mBackgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        } catch (InterruptedException e) {
            throw new RuntimeException("Interrupted while trying to lock camera opening", e);
        }
    }

    private void closeCamera() {
        Log.d(TAG, "closeCamera: ");
        try {
            mCameraOpenCloseLock.acquire();
            if (null != mCameraCaptureSession) {
                mCameraCaptureSession.close();
                mCameraCaptureSession = null;
            }
            if (null != mCameraDevice) {
                mCameraDevice.close();
                mCameraDevice = null;
            }
            if (null != imageReader) {
                imageReader.close();
                imageReader = null;
            }
        } catch (InterruptedException e) {
            throw new RuntimeException("Interrupted while trying to lock camera closing.", e);
        } finally {
            mCameraOpenCloseLock.release();
        }
    }

    private void creatCameraPreviewSession() {
        Log.d(TAG, "creatCameraPreviewSession: ");
        try {
            SurfaceTexture texture = mTextureView.getSurfaceTexture();
            assert texture != null;
            texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
            Surface surface = new Surface(texture);
            previewBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            previewBuilder.addTarget(surface);
            mCameraDevice.createCaptureSession(Arrays.asList(surface, imageReader.getSurface()),
                    new CameraCaptureSession.StateCallback() {
                        @Override
                        public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
                            Log.d(TAG, "onConfigured: ");
                            if (null == mCameraDevice) {
                                return;
                            }
                            mCameraCaptureSession = cameraCaptureSession;
                            try {
                                previewBuilder.set(CaptureRequest.CONTROL_AF_MODE,
                                        CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
                                mPreviewRequest = previewBuilder.build();
                                mCameraCaptureSession.setRepeatingRequest(mPreviewRequest, mCaptureCallback, mBackgroundHandler);
                            } catch (CameraAccessException e) {
                                Log.e(TAG, "onConfigured: ", e);
                                Crashlytics.log(TAG + " " + e);
                            }
                        }

                        @Override
                        public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
                            showToast("Failed Preview");
                        }
                    }, null);
        } catch (CameraAccessException e) {
            Log.e(TAG, "creatCameraPreviewSession: ", e);
            Crashlytics.log(TAG + " " + e);
        }
    }

    private void takePicture() {
        Log.d(TAG, "takePicture: capture chain 1");
        //mFile = newPictureFileName();
        lockFocus();
    }
    //endregion

    //region------------------------------------------------------------------- Camera Still Picture Chain
    private void lockFocus() {
        Log.d(TAG, "lockFocus: capture chain 2");
        try {
            previewBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_START);
            mState = STATE_WAITING_LOCK;
            mCameraCaptureSession.capture(previewBuilder.build(), mCaptureCallback, mBackgroundHandler);
        } catch (CameraAccessException e) {
            Log.e(TAG, "lockFocus: ", e);
            Crashlytics.log(TAG + " " + e);
        }
    }

    private void runPreCaptureSequence() {
        Log.d(TAG, "runPreCaptureSequence: capture chain 3");
        try {
            previewBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER,
                    CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_START);
            mState = STATE_WAITING_PRECAPTURE;
            mCameraCaptureSession.capture(previewBuilder.build(), mCaptureCallback, mBackgroundHandler);
        } catch (CameraAccessException e) {
            Log.e(TAG, "runPreCaptureSequence: ", e);
            Crashlytics.log(TAG + " " + e);
        }
    }

    private void captureStillPicture() {
        if (currentlyCapturing) {
            Log.d(TAG, "captureStillPicture: returning");
            return;
        }
        Log.d(TAG, "captureStillPicture: capture chain 4");
        //currentlyCapturing = true;
        try {
            final Activity activity = getActivity();
            if (null == activity || null == mCameraDevice) {
                Log.d(TAG, "captureStillPicture: null checks");
                return;
            }
            final CaptureRequest.Builder captureBuilder =
                    mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
            captureBuilder.addTarget(imageReader.getSurface());
            captureBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
            int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
            captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, getOrientation(rotation));

            CameraCaptureSession.CaptureCallback captureCallback = new CameraCaptureSession.CaptureCallback() {
                @Override
                public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
                    super.onCaptureCompleted(session, request, result);
                    Log.d(TAG, "onCaptureCompleted: from chain 4");
                    unlockFocus();
                    //currentlyCapturing = false;
                }
            };

            mCameraCaptureSession.stopRepeating();
            mCameraCaptureSession.abortCaptures();
            mCameraCaptureSession.capture(captureBuilder.build(), captureCallback, null);
        } catch (CameraAccessException cae) {
            Log.e(TAG, "captureStillPicture: ", cae);
            Crashlytics.log(TAG + " " + cae);
        }
    }
    //endregion

    //region------------------------------------------------------------------- Camera Supporting Methods
    private int getOrientation(int rotation) {
        int returnValue = (ORIENTATIONS.get(rotation) + mSensorOrientation + 270) % 360;
        Log.d(TAG, "getOrientation: in " + String.valueOf(rotation) + " out " + String.valueOf(returnValue));
        return returnValue;
    }

    private void unlockFocus() {
        try {
            previewBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_CANCEL);
            mCameraCaptureSession.capture(previewBuilder.build(), mCaptureCallback, mBackgroundHandler);
            mState = STATE_PREVIEW;
            mCameraCaptureSession.setRepeatingRequest(mPreviewRequest, mCaptureCallback, mBackgroundHandler);
        } catch (CameraAccessException cae) {
            Log.e(TAG, "unlockFocus: ", cae);
            Crashlytics.log(TAG + " " + cae);
        }
    }

    private void requestCameraPermission() {
        if (ContextCompat.checkSelfPermission(getActivity(), Manifest.permission.CAMERA)
                != PackageManager.PERMISSION_GRANTED) {
            new ConfirmationDialog().show(getChildFragmentManager(), FRAGMENT_DIALOG);
        } else {
            Snackbar.make(myFragmentView, "Camera Permissions Already Granted", Snackbar.LENGTH_SHORT)
                    .setAction("action", null).show();
        }
    }

    @SuppressWarnings("SuspiciousNameCombination")
    private void setUpCameraOutputs(int width, int height) {
        Log.d(TAG, "setUpCameraOutputs: ");
        Activity activity = getActivity();
        CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
        try {
            for (String cameraID : manager.getCameraIdList()) {
                CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraID);

                Integer frontFacing = characteristics.get(CameraCharacteristics.LENS_FACING);
                if (frontFacing != null && frontFacing == CameraCharacteristics.LENS_FACING_FRONT) {
                    continue;
                }

                StreamConfigurationMap map = characteristics.get(
                        CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
                if (map == null) {
                    continue;
                }

                android.util.Size largest = Collections.max(
                        Arrays.asList(map.getOutputSizes(ImageFormat.JPEG)),
                        new CompareSizesByArea());
                imageReader = ImageReader.newInstance(largest.getWidth(), largest.getHeight(),
                        ImageFormat.JPEG, 2);
                imageReader.setOnImageAvailableListener(mOnImageAvailableListener, mBackgroundHandler);

                // Find out if we need to swap dimension to get the preview size relative to sensor
                // coordinate.
                int displayRotation = activity.getWindowManager().getDefaultDisplay().getRotation();
                //noinspection ConstantConditions
                mSensorOrientation = characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
                boolean swappedDimensions = false;
                switch (displayRotation) {
                    case Surface.ROTATION_0:
                    case Surface.ROTATION_180:
                        if (mSensorOrientation == 90 || mSensorOrientation == 270) {
                            swappedDimensions = true;
                        }
                        break;
                    case Surface.ROTATION_90:
                    case Surface.ROTATION_270:
                        if (mSensorOrientation == 0 || mSensorOrientation == 180) {
                            swappedDimensions = true;
                        }
                        break;
                    default:
                        Log.e(TAG, "Display rotation is invalid: " + displayRotation);
                        Crashlytics.log(TAG + " " + displayRotation);
                }

                Point displaySize = new Point();
                activity.getWindowManager().getDefaultDisplay().getSize(displaySize);
                int rotatedPreviewWidth = width;
                int rotatedPreviewHeight = height;
                int maxPreviewWidth = displaySize.x;
                int maxPreviewHeight = displaySize.y;

                if (swappedDimensions) {
                    rotatedPreviewWidth = height;
                    rotatedPreviewHeight = width;
                    maxPreviewWidth = displaySize.y;
                    maxPreviewHeight = displaySize.x;
                }

                if (maxPreviewWidth > MAX_PREVIEW_WIDTH) {
                    maxPreviewWidth = MAX_PREVIEW_WIDTH;
                }
                if (maxPreviewHeight > MAX_PREVIEW_HEIGHT) {
                    maxPreviewHeight = MAX_PREVIEW_HEIGHT;
                }

                mPreviewSize = chooseOptimalSize(map.getOutputSizes(SurfaceTexture.class),
                        rotatedPreviewWidth, rotatedPreviewHeight, maxPreviewWidth, maxPreviewHeight, largest);

                // We fit the aspect ratio of TextureView to the size of preview we picked.
                int orientation = getResources().getConfiguration().orientation;
                if (orientation == Configuration.ORIENTATION_LANDSCAPE) {
                    mTextureView.setAspectRatio(mPreviewSize.getWidth(), mPreviewSize.getHeight());
                } else {
                    mTextureView.setAspectRatio(mPreviewSize.getHeight(), mPreviewSize.getWidth());
                }

                // Check if the flash is supported.
                Boolean available = characteristics.get(CameraCharacteristics.FLASH_INFO_AVAILABLE);
                mFlashSupported = available == null ? false : available;

                mCameraId = cameraID;
                return;
            }
        } catch (CameraAccessException e) {
            e.printStackTrace();
        } catch (NullPointerException e) {
            ErrorDialog.newInstance(getString(R.string.camera_error))
                    .show(getChildFragmentManager(), FRAGMENT_DIALOG);
        }
    }

    private void configureTransform(int viewWidth, int viewHeight) {
        Log.d(TAG, "configureTransform: ");
        Activity activity = getActivity();
        if (null == mTextureView || null == mPreviewSize || null == activity) {
            return;
        }
        int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
        Matrix matrix = new Matrix();
        RectF viewRect = new RectF(0, 0, viewWidth, viewHeight);
        RectF bufferRect = new RectF(0, 0, mPreviewSize.getHeight(), mPreviewSize.getWidth());
        float centerX = viewRect.centerX();
        float centerY = viewRect.centerY();
        if (Surface.ROTATION_90 == rotation || Surface.ROTATION_270 == rotation) {
            bufferRect.offset(centerX - bufferRect.centerX(), centerY - bufferRect.centerY());
            matrix.setRectToRect(viewRect, bufferRect, Matrix.ScaleToFit.FILL);
            float scale = Math.max(
                    (float) viewHeight / mPreviewSize.getHeight(),
                    (float) viewWidth / mPreviewSize.getWidth());
            matrix.postScale(scale, scale, centerX, centerY);
            matrix.postRotate(90 * (rotation - 2), centerX, centerY);
        } else if (Surface.ROTATION_180 == rotation) {
            matrix.postRotate(180, centerX, centerY);
        }
        mTextureView.setTransform(matrix);
    }

    private static android.util.Size chooseOptimalSize(android.util.Size[] choices, int textureViewWidth,
            int textureViewHeight, int maxWidth, int maxHeight, android.util.Size aspectRatio) {
        Log.d(TAG, "chooseOptimalSize: ");
        List<android.util.Size> bigEnough = new ArrayList<>();
        List<android.util.Size> notBigEnough = new ArrayList<>();
        int w = aspectRatio.getWidth();
        int h = aspectRatio.getHeight();
        for (android.util.Size option : choices) {
            if (option.getWidth() <= maxWidth && option.getHeight() <= maxHeight
                    && option.getHeight() == option.getWidth() * h / w) {
                if (option.getWidth() >= textureViewWidth && option.getHeight() >= textureViewHeight) {
                    bigEnough.add(option);
                } else {
                    notBigEnough.add(option);
                }
            }
        }
        if (bigEnough.size() > 0) {
            return Collections.min(bigEnough, new CompareSizesByArea());
        } else if (notBigEnough.size() > 0) {
            return Collections.max(notBigEnough, new CompareSizesByArea());
        } else {
            Log.e(TAG, "chooseOptimalSize: couldnt find suitable preview size");
            Crashlytics.log(TAG + " " + "chooseOptimalSize: couldnt find suitable preview size");
            return choices[0];
        }
    }

    // TODO: 6/1/2018 set auto flash
    //endregion

    //region------------------------------------------------------------------- Lesser Methods
    private void showToast(final String text) {
        Log.d(TAG, "showToast: ");
        final Activity activity = getActivity();
        if (activity != null) {
            activity.runOnUiThread(new Runnable() {
                @Override
                public void run() {
                    Toast.makeText(activity, text, Toast.LENGTH_SHORT).show();
                }
            });
        }
    }

    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    }

    private void startBackgroundThread() {
        Log.d(TAG, "startBackgroundThread: ");
        mBackgroundThread = new HandlerThread("CameraBackground");
        mBackgroundThread.start();
        mBackgroundHandler = new Handler(mBackgroundThread.getLooper());
    }

    private void stopBackgroundThread() {
        Log.d(TAG, "stopBackgroundThread: ");
        mBackgroundThread.quitSafely();
        try {
            mBackgroundThread.join();
            mBackgroundThread = null;
            mBackgroundHandler = null;
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }

    private boolean notToBusyToComply() {
        Log.d(TAG, "notToBusyToComply: ");
        return ((MainActivity) getActivity()).notToBusyToComply();
    }
    //endregion

    //region------------------------------------------------------------------- LifeCycle
    @Override
    public void onResume() {
        super.onResume();
        startBackgroundThread();
        if (mTextureView.isAvailable()) {
            openCamera(mTextureView.getWidth(), mTextureView.getHeight());
        } else {
            mTextureView.setSurfaceTextureListener(mSurfaceTextureListener);
        }
    }

    @Override
    public void onPause() {
        closeCamera();
        stopBackgroundThread();
        super.onPause();
    }

    @Override
    public void onDestroyView() {
        super.onDestroyView();
        //RefWatcher refWatcher = EasyReceipt.getRefwatcher(getActivity());
        //refWatcher.watch(this);
    }
    //endregion

    //region------------------------------------------------------------------- Inner Classes
    private class ImageSaver implements Runnable {

        private final Image mImage;
        private final FileManager mFileManager;
        //private final File mFile;

        ImageSaver(Image image, FileManager fileManager) {
            mImage = image;
            mFileManager = fileManager;
            //mFile = file;
        }

        @Override
        public void run() {
            File outputFile = null;
            try {
                outputFile = File.createTempFile(String.valueOf(System.currentTimeMillis()), ".jpg",
                        getActivity().getCacheDir());
            } catch (Exception e) {
                Log.e(TAG, "run: ", e);
                Crashlytics.log(TAG + " " + e);
            }

            ByteBuffer buffer = mImage.getPlanes()[0].getBuffer();
            byte[] bytes = new byte[buffer.remaining()];
            buffer.get(bytes);
            FileOutputStream fos = null;
            try {
                fos = new FileOutputStream(outputFile);
                fos.write(bytes);
            } catch (Exception e) {
                Log.e(TAG, "run: ", e);
                Crashlytics.log(TAG + " " + e);
            } finally {
                mImage.close();
                if (fos != null) {
                    try {
                        fos.close();
                    } catch (Exception e) {
                        Log.e(TAG, "run: ", e);
                        Crashlytics.log(TAG + " " + e);
                    }
                }
            }

            ((MainActivity) getActivity()).setUriofImageTOReview(Uri.fromFile(outputFile));
            ((MainActivity) getActivity()).loadCameraPreviewApprovalFrag();
            /*
            ((MainActivity)getActivity()).loadCameraPreviewApprovalFrag(bytes);
            mImage.close();
            */
        }
    }

    static class CompareSizesByArea implements Comparator<android.util.Size> {
        @Override
        public int compare(android.util.Size lhs, android.util.Size rhs) {
            // We cast here to ensure the multiplications won't overflow
            return Long.signum((long) lhs.getWidth() * lhs.getHeight()
                    - (long) rhs.getWidth() * rhs.getHeight());
        }
    }

    public static class ErrorDialog extends DialogFragment {

        private static final String ARG_MESSAGE = "message";

        public static ErrorDialog newInstance(String message) {
            ErrorDialog dialog = new ErrorDialog();
            Bundle args = new Bundle();
            args.putString(ARG_MESSAGE, message);
            dialog.setArguments(args);
            return dialog;
        }

        @NonNull
        @Override
        public Dialog onCreateDialog(Bundle savedInstanceState) {
            final Activity activity = getActivity();
            return new AlertDialog.Builder(activity)
                    .setMessage(getArguments().getString(ARG_MESSAGE))
                    .setPositiveButton(android.R.string.ok, new DialogInterface.OnClickListener() {
                        @Override
                        public void onClick(DialogInterface dialogInterface, int i) {
                            activity.finish();
                        }
                    }).create();
        }
    }

    public static class ConfirmationDialog extends DialogFragment {

        @NonNull
        @Override
        public Dialog onCreateDialog(Bundle savedInstanceState) {
            final Fragment parent = getParentFragment();
            return new AlertDialog.Builder(getActivity())
                    .setMessage(R.string.request_permission)
                    .setPositiveButton(android.R.string.ok, new DialogInterface.OnClickListener() {
                        @Override
                        public void onClick(DialogInterface dialogInterface, int i) {
                            ActivityCompat.requestPermissions(getActivity(),
                                    new String[]{Manifest.permission.CAMERA},
                                    REQUEST_CAMERA_PERMISSIONS);
                        }
                    })
                    .setNegativeButton(android.R.string.cancel, new DialogInterface.OnClickListener() {
                        @Override
                        public void onClick(DialogInterface dialogInterface, int i) {
                            Activity activity = parent.getActivity();
                            if (activity != null) {
                                activity.finish();
                            }
                        }
                    }).create();
        }
    }
    //endregion
}