init project

main
AlexWang 2 years ago
commit 97f11eda01

.gitignore vendored

@ -0,0 +1,10 @@
build
.gradle
.idea
*.iml
local.properties
gradle.properties
app/release/
*.apk
app/release/output.json

@ -0,0 +1,65 @@
# EasyPusher_Android
A simple, robust, low-latency RTSP audio/video/screen stream pusher and recorder for Android. A lean, stable, and efficient tool for capturing the Android front/rear camera or the device screen, encoding it, and pushing it as a live RTSP stream. It fully exploits RTP's strengths in real-time communication: under adequate network conditions, latency stays within 300ms~500ms, which makes it a good fit for emergency command, 4G law enforcement, remote control, live broadcasting, and similar fields.
EasyPusher is an RTSP/RTP audio/video live push component developed by the EasyDarwin streaming-media team, supporting all major platforms (Windows, Linux (32 & 64-bit), the various ARM platforms, Android, and iOS). With EasyPusher you avoid the somewhat involved RTSP/RTP/RTCP push workflow: calling a handful of EasyPusher API functions is enough to push audio/video streams to an RTSP streaming server for forwarding and distribution, easily and reliably. It integrates seamlessly with the open-source EasyDarwin RTSP streaming server and the EasyPlayer-RTSP player, and its stability has been proven over long periods of use by enterprise users and projects.
## Branches ##
- The master branch is the project behind the EasyPusher app (https://fir.im/EasyPusher). Use it to build and run the app if you want to try out the pusher features; any version of Android Studio will do.
- The library branch targets developers who want to integrate the pusher into an existing app. It uses features of the Android Architecture Components (https://developer.android.com/topic/libraries/architecture/index.html), which makes integration straightforward, and requires Android Studio 3.0 or later. The branch contains a library module (the library source) and a myapplication module (an integration demo).
## Features ##
- [x] Multiple resolutions;
- [x] `audio+video`, `audio-only`, and `video-only` push;
- [x] `Recording while capturing`;
- [x] Stable, decoupled recording and pushing: **start recording at any time while pushing, and start pushing at any time while recording;**
- [x] Front/rear camera switching during capture;
- [x] `Text watermarks and real-time timestamp watermarks` on Android;
- [x] `Real-time mute/unmute on the push side`;
- [x] Selectable software or hardware encoding;
- [x] Background-service push of the camera or the screen on Android (screen push requires Android 5.0+);
- [x] Configurable GOP interval, frame rate, bitrate, and (on Android) encoder profile and encoding speed;
- [x] [Audio] Noise suppression on Android;
- [x] [Audio] Automatic gain control on Android;
- [x] Combined with the open-source UVCCamera project (https://github.com/saki4510t/UVCCamera), supports **pushing UVC camera video as well as local UVC camera recording**;
- [x] Works with the free, open-source EasyDarwin streaming server;
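The "few API calls" push flow this README describes can be sketched in plain Java. Everything below is illustrative only: `StreamPusher`, `initPush`, `push`, and `stop` are hypothetical names modeled on the description above, not the actual EasyPusher API.

```java
// Hypothetical interface modeling the init -> push -> stop sequence the
// README describes; the real EasyPusher API may differ.
interface StreamPusher {
    void initPush(String serverHost, int port, String streamName);
    void push(byte[] encodedFrame, long timestampMs);
    void stop();
}

public class PushFlowSketch {
    public static void main(String[] args) {
        StringBuilder log = new StringBuilder();
        // Stand-in implementation that only records the call order.
        StreamPusher pusher = new StreamPusher() {
            public void initPush(String host, int port, String name) { log.append("init;"); }
            public void push(byte[] frame, long ts) { log.append("push;"); }
            public void stop() { log.append("stop;"); }
        };
        pusher.initPush("rtsp.example.com", 554, "stream1.sdp"); // connect to an RTSP server
        for (int i = 0; i < 3; i++) {
            pusher.push(new byte[] {0}, i * 40L); // feed encoded frames, e.g. every 40 ms at 25 fps
        }
        pusher.stop();
        System.out.println(log); // prints: init;push;push;push;stop;
    }
}
```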
## Workflow ##
![EasyPusher Work Flow](http://www.easydarwin.org/github/images/easypusher/easypusher_android_workfolw.png)
## Downloads ##
- Android: [https://fir.im/EasyPusher](https://fir.im/EasyPusher "EasyPusher_Android")
![EasyPusher_Android](http://www.easydarwin.org/skin/bs/images/app/EasyPusher_AN.png)
- iOS [https://itunes.apple.com/us/app/easypusher/id1211967057](https://itunes.apple.com/us/app/easypusher/id1211967057 "EasyPusher_iOS")
![EasyPusher_iOS](http://www.easydarwin.org/skin/bs/images/app/EasyPusher_iOS.png)
## Support ##
- Email: [support@easydarwin.org](mailto:support@easydarwin.org)
- Tel: 13718530929
- QQ group: 465901074
> EasyPusher is a very stable RTSP push/live-streaming component. Commercial use of any platform build requires a license; for licensing terms or deeper technical cooperation, get in touch through the channels above.
## More information ##
**EasyDarwin** open-source streaming server: [www.EasyDarwin.org](http://www.easydarwin.org)
**EasyDSS** commercial streaming solution: [www.EasyDSS.com](http://www.easydss.com)
**EasyNVR** plugin-free live-streaming solution: [www.EasyNVR.com](http://www.easynvr.com)
Copyright © EasyDarwin Team 2012-2018
![EasyDarwin](http://www.easydarwin.org/skin/easydarwin/images/wx_qrcode.jpg)

@ -0,0 +1,34 @@
// Top-level build file where you can add configuration options common to all sub-projects/modules.
buildscript {
repositories {
google()
jcenter()
}
dependencies {
classpath 'com.android.tools.build:gradle:7.1.2'
// NOTE: Do not place your application dependencies here; they belong
// in the individual module build.gradle files
}
}
allprojects {
repositories {
google()
jcenter()
maven { url "https://jitpack.io" }
}
}
task clean(type: Delete) {
delete rootProject.buildDir
}
gradle.projectsEvaluated {
tasks.withType(JavaCompile) {
options.compilerArgs << "-Xmaxerrs" << "500" // or whatever number you want
}
}

Binary file not shown.

@ -0,0 +1,6 @@
#Sat Aug 17 10:22:30 CST 2019
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists
distributionUrl=https\://services.gradle.org/distributions/gradle-7.2-all.zip

gradlew vendored

@ -0,0 +1,160 @@
#!/usr/bin/env bash
##############################################################################
##
## Gradle start up script for UN*X
##
##############################################################################
# Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
DEFAULT_JVM_OPTS=""
APP_NAME="Gradle"
APP_BASE_NAME=`basename "$0"`
# Use the maximum available, or set MAX_FD != -1 to use that value.
MAX_FD="maximum"
warn ( ) {
echo "$*"
}
die ( ) {
echo
echo "$*"
echo
exit 1
}
# OS specific support (must be 'true' or 'false').
cygwin=false
msys=false
darwin=false
case "`uname`" in
CYGWIN* )
cygwin=true
;;
Darwin* )
darwin=true
;;
MINGW* )
msys=true
;;
esac
# Attempt to set APP_HOME
# Resolve links: $0 may be a link
PRG="$0"
# Need this for relative symlinks.
while [ -h "$PRG" ] ; do
ls=`ls -ld "$PRG"`
link=`expr "$ls" : '.*-> \(.*\)$'`
if expr "$link" : '/.*' > /dev/null; then
PRG="$link"
else
PRG=`dirname "$PRG"`"/$link"
fi
done
SAVED="`pwd`"
cd "`dirname \"$PRG\"`/" >/dev/null
APP_HOME="`pwd -P`"
cd "$SAVED" >/dev/null
CLASSPATH=$APP_HOME/gradle/wrapper/gradle-wrapper.jar
# Determine the Java command to use to start the JVM.
if [ -n "$JAVA_HOME" ] ; then
if [ -x "$JAVA_HOME/jre/sh/java" ] ; then
# IBM's JDK on AIX uses strange locations for the executables
JAVACMD="$JAVA_HOME/jre/sh/java"
else
JAVACMD="$JAVA_HOME/bin/java"
fi
if [ ! -x "$JAVACMD" ] ; then
die "ERROR: JAVA_HOME is set to an invalid directory: $JAVA_HOME
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
else
JAVACMD="java"
which java >/dev/null 2>&1 || die "ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
# Increase the maximum file descriptors if we can.
if [ "$cygwin" = "false" -a "$darwin" = "false" ] ; then
MAX_FD_LIMIT=`ulimit -H -n`
if [ $? -eq 0 ] ; then
if [ "$MAX_FD" = "maximum" -o "$MAX_FD" = "max" ] ; then
MAX_FD="$MAX_FD_LIMIT"
fi
ulimit -n $MAX_FD
if [ $? -ne 0 ] ; then
warn "Could not set maximum file descriptor limit: $MAX_FD"
fi
else
warn "Could not query maximum file descriptor limit: $MAX_FD_LIMIT"
fi
fi
# For Darwin, add options to specify how the application appears in the dock
if $darwin; then
GRADLE_OPTS="$GRADLE_OPTS \"-Xdock:name=$APP_NAME\" \"-Xdock:icon=$APP_HOME/media/gradle.icns\""
fi
# For Cygwin, switch paths to Windows format before running java
if $cygwin ; then
APP_HOME=`cygpath --path --mixed "$APP_HOME"`
CLASSPATH=`cygpath --path --mixed "$CLASSPATH"`
JAVACMD=`cygpath --unix "$JAVACMD"`
# We build the pattern for arguments to be converted via cygpath
ROOTDIRSRAW=`find -L / -maxdepth 1 -mindepth 1 -type d 2>/dev/null`
SEP=""
for dir in $ROOTDIRSRAW ; do
ROOTDIRS="$ROOTDIRS$SEP$dir"
SEP="|"
done
OURCYGPATTERN="(^($ROOTDIRS))"
# Add a user-defined pattern to the cygpath arguments
if [ "$GRADLE_CYGPATTERN" != "" ] ; then
OURCYGPATTERN="$OURCYGPATTERN|($GRADLE_CYGPATTERN)"
fi
# Now convert the arguments - kludge to limit ourselves to /bin/sh
i=0
for arg in "$@" ; do
CHECK=`echo "$arg"|egrep -c "$OURCYGPATTERN" -`
CHECK2=`echo "$arg"|egrep -c "^-"` ### Determine if an option
if [ $CHECK -ne 0 ] && [ $CHECK2 -eq 0 ] ; then ### Added a condition
eval `echo args$i`=`cygpath --path --ignore --mixed "$arg"`
else
eval `echo args$i`="\"$arg\""
fi
i=$((i+1))
done
case $i in
(0) set -- ;;
(1) set -- "$args0" ;;
(2) set -- "$args0" "$args1" ;;
(3) set -- "$args0" "$args1" "$args2" ;;
(4) set -- "$args0" "$args1" "$args2" "$args3" ;;
(5) set -- "$args0" "$args1" "$args2" "$args3" "$args4" ;;
(6) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" ;;
(7) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" ;;
(8) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" ;;
(9) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" "$args8" ;;
esac
fi
# Split up the JVM_OPTS And GRADLE_OPTS values into an array, following the shell quoting and substitution rules
function splitJvmOpts() {
JVM_OPTS=("$@")
}
eval splitJvmOpts $DEFAULT_JVM_OPTS $JAVA_OPTS $GRADLE_OPTS
JVM_OPTS[${#JVM_OPTS[*]}]="-Dorg.gradle.appname=$APP_BASE_NAME"
exec "$JAVACMD" "${JVM_OPTS[@]}" -classpath "$CLASSPATH" org.gradle.wrapper.GradleWrapperMain "$@"

gradlew.bat vendored

@ -0,0 +1,90 @@
@if "%DEBUG%" == "" @echo off
@rem ##########################################################################
@rem
@rem Gradle startup script for Windows
@rem
@rem ##########################################################################
@rem Set local scope for the variables with windows NT shell
if "%OS%"=="Windows_NT" setlocal
@rem Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
set DEFAULT_JVM_OPTS=
set DIRNAME=%~dp0
if "%DIRNAME%" == "" set DIRNAME=.
set APP_BASE_NAME=%~n0
set APP_HOME=%DIRNAME%
@rem Find java.exe
if defined JAVA_HOME goto findJavaFromJavaHome
set JAVA_EXE=java.exe
%JAVA_EXE% -version >NUL 2>&1
if "%ERRORLEVEL%" == "0" goto init
echo.
echo ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:findJavaFromJavaHome
set JAVA_HOME=%JAVA_HOME:"=%
set JAVA_EXE=%JAVA_HOME%/bin/java.exe
if exist "%JAVA_EXE%" goto init
echo.
echo ERROR: JAVA_HOME is set to an invalid directory: %JAVA_HOME%
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:init
@rem Get command-line arguments, handling Windowz variants
if not "%OS%" == "Windows_NT" goto win9xME_args
if "%@eval[2+2]" == "4" goto 4NT_args
:win9xME_args
@rem Slurp the command line arguments.
set CMD_LINE_ARGS=
set _SKIP=2
:win9xME_args_slurp
if "x%~1" == "x" goto execute
set CMD_LINE_ARGS=%*
goto execute
:4NT_args
@rem Get arguments from the 4NT Shell from JP Software
set CMD_LINE_ARGS=%$
:execute
@rem Setup the command line
set CLASSPATH=%APP_HOME%\gradle\wrapper\gradle-wrapper.jar
@rem Execute Gradle
"%JAVA_EXE%" %DEFAULT_JVM_OPTS% %JAVA_OPTS% %GRADLE_OPTS% "-Dorg.gradle.appname=%APP_BASE_NAME%" -classpath "%CLASSPATH%" org.gradle.wrapper.GradleWrapperMain %CMD_LINE_ARGS%
:end
@rem End local scope for the variables with windows NT shell
if "%ERRORLEVEL%"=="0" goto mainEnd
:fail
rem Set variable GRADLE_EXIT_CONSOLE if you need the _script_ return code instead of
rem the _cmd.exe /c_ return code!
if not "" == "%GRADLE_EXIT_CONSOLE%" exit 1
exit /b 1
:mainEnd
if "%OS%"=="Windows_NT" endlocal
:omega

@ -0,0 +1,2 @@
/build
.apk

@ -0,0 +1,40 @@
apply plugin: 'com.android.library'
android {
compileSdkVersion 31
defaultConfig {
minSdkVersion 19
targetSdkVersion 31
versionCode 13190817
versionName "1.3.19.0817"
}
buildTypes {
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
}
}
}
repositories {
flatDir {
dirs 'libs'
}
mavenCentral()
}
dependencies {
implementation fileTree(include: ['*.jar'], dir: 'libs')
testImplementation 'junit:junit:4.13.2'
implementation 'androidx.lifecycle:lifecycle-extensions:2.2.0'
implementation 'androidx.lifecycle:lifecycle-reactivestreams:2.4.1'
annotationProcessor 'androidx.lifecycle:lifecycle-compiler:2.0.0'
implementation(name: 'libuvccamera-release', ext: 'aar') {
exclude module: 'support-v4'
exclude module: 'appcompat-v7'
}
}

@ -0,0 +1,17 @@
# Add project specific ProGuard rules here.
# By default, the flags in this file are appended to flags specified
# in D:\AndroidStudio\StudioSDK/tools/proguard/proguard-android.txt
# You can edit the include path and order by changing the proguardFiles
# directive in build.gradle.
#
# For more details, see
# http://developer.android.com/guide/developing/tools/proguard.html
# Add any project specific keep options here:
# If your project uses WebView with JS, uncomment the following
# and specify the fully qualified class name to the JavaScript interface
# class:
#-keepclassmembers class fqcn.of.javascript.interface.for.webview {
# public *;
#}

@ -0,0 +1 @@
rtmp

@ -0,0 +1,19 @@
/*
Copyright (c) 2013-2016 EasyDarwin.ORG. All rights reserved.
Github: https://github.com/EasyDarwin
WEChat: EasyDarwin
Website: http://www.easydarwin.org
*/
package org.easydarwin.easypusher;
import android.app.Application;
import android.test.ApplicationTestCase;
/**
* <a href="http://d.android.com/tools/testing/testing_android.html">Testing Fundamentals</a>
*/
public class ApplicationTest extends ApplicationTestCase<Application> {
public ApplicationTest() {
super(Application.class);
}
}

@ -0,0 +1,80 @@
package org.easydarwin.easypusher;
import android.support.test.espresso.ViewInteraction;
import android.support.test.rule.ActivityTestRule;
import android.support.test.runner.AndroidJUnit4;
import android.test.suitebuilder.annotation.LargeTest;
import org.junit.Rule;
import org.junit.Test;
import org.junit.runner.RunWith;
import static android.support.test.espresso.Espresso.onView;
import static android.support.test.espresso.Espresso.pressBack;
import static android.support.test.espresso.action.ViewActions.click;
import static android.support.test.espresso.action.ViewActions.scrollTo;
import static android.support.test.espresso.matcher.ViewMatchers.isDisplayed;
import static android.support.test.espresso.matcher.ViewMatchers.withId;
import static android.support.test.espresso.matcher.ViewMatchers.withParent;
import static android.support.test.espresso.matcher.ViewMatchers.withText;
import static org.hamcrest.Matchers.allOf;
@LargeTest
@RunWith(AndroidJUnit4.class)
public class SplashActivityTest {
@Rule
public ActivityTestRule<SplashActivity> mActivityTestRule = new ActivityTestRule<>(SplashActivity.class);
@Test
public void splashActivityTest() {
// Added a sleep statement to match the app's execution delay.
// The recommended way to handle such scenarios is to use Espresso idling resources:
// https://google.github.io/android-testing-support-library/docs/espresso/idling-resource/index.html
ViewInteraction appCompatButton = onView(
allOf(withId(R.id.btn_switch), withText("推送"), isDisplayed()));
appCompatButton.perform(click());
ViewInteraction appCompatButton2 = onView(
allOf(withId(R.id.btn_setting), withText("设置"), isDisplayed()));
appCompatButton2.perform(click());
pressBack();
ViewInteraction appCompatCheckBox = onView(
allOf(withId(R.id.only_push_audio), withText("仅推送音频")));
appCompatCheckBox.perform(scrollTo(), click());
ViewInteraction appCompatButton3 = onView(
allOf(withId(R.id.btn_save), withText("保存")));
appCompatButton3.perform(scrollTo(), click());
ViewInteraction appCompatButton4 = onView(
allOf(withId(R.id.btn_setting), withText("设置"), isDisplayed()));
appCompatButton4.perform(click());
pressBack();
ViewInteraction appCompatCheckBox2 = onView(
allOf(withId(R.id.only_push_audio), withText("仅推送音频")));
appCompatCheckBox2.perform(scrollTo(), click());
ViewInteraction appCompatButton5 = onView(
allOf(withId(R.id.btn_save), withText("保存")));
appCompatButton5.perform(scrollTo(), click());
pressBack();
ViewInteraction appCompatButton6 = onView(
allOf(withId(android.R.id.button2), withText("取消"),
withParent(allOf(withId(R.id.buttonPanel),
withParent(withId(R.id.parentPanel)))),
isDisplayed()));
appCompatButton6.perform(click());
}
}

@ -0,0 +1,38 @@
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
package="org.easydarwin.easypusher">
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.SYSTEM_ALERT_WINDOW" />
<uses-feature android:name="android.hardware.camera" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-feature
android:name="android.hardware.usb.host"
android:required="true" />
<uses-feature
android:glEsVersion="0x00020000"
android:required="true" />
<application>
<service
android:name="org.easydarwin.push.PushScreenService"
android:enabled="true" />
<service
android:name="org.easydarwin.push.UVCCameraService"
android:enabled="true" />
<service
android:name="org.easydarwin.push.MediaStream"
android:enabled="true" />
</application>
</manifest>

@ -0,0 +1,315 @@
package com.android.webrtc.audio;
import android.content.Context;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioRecord;
import android.media.AudioTrack;
import android.media.MediaRecorder;
import android.media.audiofx.AudioEffect;
import android.media.audiofx.AutomaticGainControl;
import android.media.audiofx.EnvironmentalReverb;
import android.media.audiofx.LoudnessEnhancer;
import android.os.Build;
import android.os.Process;
import android.preference.PreferenceManager;
import android.util.Log;
import com.android.webrtc.audio.MobileAEC;
import org.easydarwin.easypusher.BuildConfig;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.ShortBuffer;
/**
* Created by John on 2016/11/12.
*/
public class AudioIO {
private static final String TAG = "AudioIO";
private final Context mContext;
private final boolean mStereo;
private final int mSample;
private final int mAudioFormat;
private AudioTrack mAudioTrack;
private AudioEffect[] mAes = new AudioEffect[10];
private AudioRecord mAudioRecoder;
private MobileAEC aecm;
private PipedOutputStream os_farend;
private PipedInputStream is_farend;
private PipedOutputStream os_nearend;
private PipedInputStream is_nearend;
private class AudioThread extends Thread {
public AudioThread() {
super("AudioIO");
}
@Override
public void run() {
Process.setThreadPriority(Process.THREAD_PRIORITY_AUDIO);
aecm = new MobileAEC(MobileAEC.SamplingFrequency.FS_8000Hz);
if (BuildConfig.DEBUG) {
aecm.setAecmMode(getMode()).prepare();
} else {
aecm.setAecmMode(MobileAEC.AggressiveMode.HIGH).prepare();
}
final int sampleRateInHz = mSample;
final int channelConfig = mStereo ? AudioFormat.CHANNEL_OUT_STEREO : AudioFormat.CHANNEL_OUT_MONO;
final int bfSize = AudioTrack.getMinBufferSize(sampleRateInHz, channelConfig, mAudioFormat);
// number of bytes in 10 ms of audio
final int unit_length = sampleRateInHz * 10 / 1000 * 2;
mAudioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRateInHz, channelConfig, mAudioFormat, bfSize * 2, AudioTrack.MODE_STREAM);
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT) {
int i = 0;
try {
EnvironmentalReverb er = new EnvironmentalReverb(0, mAudioTrack.getAudioSessionId());
if (er != null) {
er.setEnabled(true);
mAes[i++] = er;
}
} catch (Throwable ex) {
ex.printStackTrace();
}
try {
LoudnessEnhancer le = new LoudnessEnhancer(mAudioTrack.getAudioSessionId());
le.setEnabled(true);
mAes[i++] = le;
} catch (Throwable ex) {
ex.printStackTrace();
}
}
mAudioTrack.play();
int CC = AudioFormat.CHANNEL_IN_MONO;
int minBufSize = AudioRecord.getMinBufferSize(sampleRateInHz, CC, mAudioFormat);
final int audioSource = MediaRecorder.AudioSource.MIC;
// At init time, smaller is not better for this parameter. It is (presumably) the size of the low-level audio buffer; if it is too small and reads are not timely enough, it may overflow and degrade audio quality.
minBufSize *= 2;
if (minBufSize < unit_length) {
minBufSize = unit_length;
}
mAudioRecoder = new AudioRecord(audioSource, sampleRateInHz, CC, mAudioFormat, minBufSize);
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT) {
int i = 0;
try {
AutomaticGainControl er = AutomaticGainControl.create(mAudioRecoder.getAudioSessionId());
if (er != null) {
er.setEnabled(true);
mAes[i++] = er;
}
} catch (Throwable ex) {
ex.printStackTrace();
}
}
mAudioRecoder.startRecording();
byte[] buffer = new byte[unit_length];
int sizeInShorts = unit_length / 2;
short[] farendPCM = new short[sizeInShorts];
short[] nearendPCM = new short[sizeInShorts];
short[] nearendCanceled = new short[sizeInShorts];
ByteBuffer bb = ByteBuffer.allocate(unit_length).order(ByteOrder.LITTLE_ENDIAN);
try {
while (t != null) {
boolean fillfarend = fillFarendBuffer(buffer);
if (fillfarend) {
save(buffer, "/sdcard/farend.pcm", true);
ByteBuffer.wrap(buffer).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().get(farendPCM);
mAudioTrack.write(farendPCM, 0, sizeInShorts);
}
if (readNearendBuffer(nearendPCM)) {
bb.clear();
if (fillfarend) {
int delay_level = PreferenceManager.getDefaultSharedPreferences(mContext).getInt("delay_level", 3);
if (BuildConfig.DEBUG) {
try {
delay_level = getDelay();
} catch (Exception ex) {
}
}
aecm.farendBuffer(farendPCM, sizeInShorts);
bb.asShortBuffer().put(nearendPCM);
save(bb.array(), "/sdcard/nearend.pcm", true);
aecm.echoCancellation(nearendPCM, null, nearendCanceled, (short) (sizeInShorts), (short) (10 * delay_level));
bb.clear();
bb.asShortBuffer().put(nearendCanceled);
save(bb.array(), "/sdcard/nearendCanceled.pcm", true);
os_nearend.write(bb.array());
} else {
bb.asShortBuffer().put(nearendPCM);
save(bb.array(), "/sdcard/nearend.pcm", true);
os_nearend.write(bb.array());
}
}
}
} catch (IOException e) {
e.printStackTrace();
} catch (Exception e) {
e.printStackTrace();
} finally {
mAudioTrack.release();
mAudioRecoder.release();
for (Object ae : mAes) {
if (ae != null) {
AudioEffect aet = (AudioEffect) ae;
aet.release();
}
}
}
}
private boolean readNearendBuffer(short[] pcm) {
int offset = 0;
do {
int i = mAudioRecoder.read(pcm, offset, pcm.length - offset);
if (i < 0 || t == null) return false;
offset += i;
} while (offset < pcm.length);
if (BuildConfig.DEBUG) {
Log.d(TAG, String.format("readNearendBuffer : %d", pcm.length));
}
return true;
}
private MobileAEC.AggressiveMode getMode() {
try {
// String[] arr = AddVideoOverlay.AddText.split("_");
// if the value is even, skip reading the far end.
int delay_level = 1;
return new MobileAEC.AggressiveMode(delay_level);
} catch (Exception ex) {
return MobileAEC.AggressiveMode.HIGH;
}
}
private int getDelay() {
try {
// String[] arr = AddVideoOverlay.AddText.split("_");
// if the value is even, skip reading the far end.
// int delay_level = Integer.parseInt(10);
return 10;
} catch (Exception ex) {
return 1;
}
}
private boolean fillFarendBuffer(byte[] bufferPCM) throws IOException {
if (is_farend.available() < 1) {
return false;
}
int left = bufferPCM.length;
do {
int i = is_farend.read(bufferPCM, bufferPCM.length - left, left);
if (i < 0 || t == null) return false;
left -= i;
} while (left > 0);
if (BuildConfig.DEBUG) {
Log.d(TAG, String.format("fillFarendBuffer : %d", bufferPCM.length));
}
return true;
}
}
private Thread t;
public AudioIO(Context context, int sample, boolean stereo) {
this(context, sample, stereo, AudioFormat.ENCODING_PCM_16BIT);
}
public AudioIO(Context context, int sample, boolean stereo, int audioFormat) {
mContext = context.getApplicationContext();
mSample = sample;
mStereo = stereo;
mAudioFormat = audioFormat;
}
public synchronized void start() throws IOException {
if (t != null) {
return;
}
is_farend = new PipedInputStream(1024);
os_farend = new PipedOutputStream(is_farend);
is_nearend = new PipedInputStream(1024);
os_nearend = new PipedOutputStream(is_nearend);
t = new AudioThread();
t.start();
}
public synchronized void release() throws IOException, InterruptedException {
if (is_farend != null) {
is_farend.close();
}
if (os_nearend != null)
os_nearend.close();
if (is_nearend != null)
is_nearend.close();
if (os_farend != null)
os_farend.close();
Thread t = this.t;
this.t = null;
if (t != null) {
t.interrupt();
t.join();
}
}
public void pumpAudio(short[] pcm, int offset, int length) throws InterruptedException, IOException {
ByteBuffer bb = ByteBuffer.allocate(length * 2).order(ByteOrder.LITTLE_ENDIAN);
ShortBuffer sb = bb.asShortBuffer();
sb.put(pcm, offset, length);
os_farend.write(bb.array());
}
public int retrieveAudio(byte[] pcm, int offset, int length) throws IOException {
return is_nearend.read(pcm, offset, length);
}
public static void save(byte[] buffer, String path, boolean append) {
FileOutputStream fos = null;
try {
fos = new FileOutputStream(path, append);
fos.write(buffer);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
if (fos != null) {
try {
fos.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
}
}
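The far-end/near-end hand-off in `AudioIO` relies on `PipedOutputStream`/`PipedInputStream` pairs plus little-endian `ByteBuffer` conversion between `short[]` PCM and raw bytes (see `pumpAudio` and `fillFarendBuffer`). A minimal, self-contained sketch of that round-trip, using the same 10 ms frame-size arithmetic as the class (class and variable names here are illustrative, not from the project):

```java
import java.io.IOException;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class PcmPipeDemo {
    public static void main(String[] args) throws IOException {
        // 10 ms of mono 16-bit PCM at 8000 Hz, as in AudioIO:
        // 8000 samples/s * 10 ms / 1000 * 2 bytes = 160 bytes (80 samples).
        int sampleRate = 8000;
        int unitLength = sampleRate * 10 / 1000 * 2;
        short[] pcm = new short[unitLength / 2];
        for (int i = 0; i < pcm.length; i++) pcm[i] = (short) (i * 100);

        // Producer side (cf. pumpAudio): shorts -> little-endian bytes -> pipe.
        PipedInputStream in = new PipedInputStream(1024);
        PipedOutputStream out = new PipedOutputStream(in);
        ByteBuffer bb = ByteBuffer.allocate(unitLength).order(ByteOrder.LITTLE_ENDIAN);
        bb.asShortBuffer().put(pcm);
        out.write(bb.array());

        // Consumer side (cf. fillFarendBuffer): loop until a full frame is read.
        byte[] buf = new byte[unitLength];
        int off = 0;
        while (off < buf.length) {
            int n = in.read(buf, off, buf.length - off);
            if (n < 0) break;
            off += n;
        }
        short[] decoded = new short[pcm.length];
        ByteBuffer.wrap(buf).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().get(decoded);
        System.out.println("first=" + decoded[0] + " last=" + decoded[decoded.length - 1]);
    }
}
```

Note that, unlike this single-threaded demo, `AudioIO` writes and reads the pipes from different threads; the fixed 1024-byte pipe capacity then acts as a small jitter buffer between producer and consumer.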

@ -0,0 +1,441 @@
package com.android.webrtc.audio;
/**
* This class supports the acoustic echo cancellation for mobile edition. Please <b>bug me</b> if you find any bugs in
* this toolkit.<br>
* <br>
* <b>[Notice]</b><br>
* 1. there are 5 more native interfaces that I'm not trying to provide in this MobileAEC toolkit.<br>
* But I think I should mention it out as a list below, for secondary development if necessary: <br>
* <ul>
* <li>WebRtc_Word32 WebRtcAecm_get_config(void *, AecmConfig *);</li>
* <li>WebRtc_Word32 WebRtcAecm_InitEchoPath(void* , const void* , size_t);</li>
* <li>WebRtc_Word32 WebRtcAecm_GetEchoPath(void* , void* , size_t);</li>
* <li>size_t WebRtcAecm_echo_path_size_bytes();</li>
* <li>WebRtc_Word32 WebRtcAecm_get_error_code(void *);</li>
* </ul>
* 2. if you are working on an android platform, put the shared library "libwebrtc_aecm.so"<br>
* into path "/your project/libs/armeabi/", if the dir does not exist, you should create it, otherwise you<br>
will get an UnsatisfiedLinkError at run time.<br>
* 3. you should always call <b>close()</b> method <b>manually</b> when all things are finished.<br>
* <br>
* <b>[Usage]</b> <br>
* <ul>
* 1. You create a MobileAEC object first (setting the constructor parameters or passing null are both OK; if null,
* default values are used instead).<br>
* 2. change the aggressiveness or sampling frequency of the AECM instance if necessary.<br>
* 3. call <b>prepare()</b> method to make the AECM instance prepared. <br>
* 4. then call "farendBuffer" to set far-end signal to AECM instance. <br>
* 5. now you call "echoCancellation()" to deal with the acoustic echo things.<br>
* The order of step 1,2,3,4 and 5 is significant, when all settings are done or you changed previous<br>
* settings, <b>DO NOT</b> forget to call prepare() method, otherwise your new settings will be ignored by AECM
* instance. <br>
* 6. finally you should call <b>close()</b> method <b>manually</b> when all things are done, after that, the AECM
* instance is no longer available until next <b>prepare()</b> is called.<br>
* </ul>
* <b>[Samples]</b><br>
* <ul>
* see doAECM() in {@link com.billhoo.android.aec.demo.DemoActivity DEMO}
* </ul>
*
* @version 0.1 2013-3-8
*
* @author billhoo E-mail:billhoo@126.com
*/
public class MobileAEC {
static {
System.loadLibrary("webrtc_aecm"); // to load the libwebrtc_aecm.so library.
}
// /////////////////////////////////////////////////////////
// PUBLIC CONSTANTS
/**
* constant disable mode for Aecm configuration settings.
*/
public static final short AECM_UNABLE = 0;
/**
* constant enable mode for Aecm configuration settings.
*/
public static final short AECM_ENABLE = 1;
// /////////////////////////////////////////////////////////
// PUBLIC NESTED CLASSES
/**
* For safety reasons, this class supports only the constant sampling frequency values in
* {@link SamplingFrequency#FS_8000Hz FS_8000Hz}, {@link SamplingFrequency#FS_16000Hz FS_16000Hz}
*/
public static final class SamplingFrequency {
public long getFS() {
return mSamplingFrequency;
}
/**
* This constant represents sampling frequency in 8000Hz
*/
public static final SamplingFrequency FS_8000Hz = new SamplingFrequency(
8000);
/**
* This constant represents sampling frequency in 16000Hz
*/
public static final SamplingFrequency FS_16000Hz = new SamplingFrequency(
16000);
private final long mSamplingFrequency;
private SamplingFrequency(long fs) {
this.mSamplingFrequency = fs;
}
}
/**
* For safety reasons, this class supports only the constant aggressiveness modes of the AECM instance in
* {@link AggressiveMode#MILD MILD}, {@link AggressiveMode#MEDIUM MEDIUM}, {@link AggressiveMode#HIGH HIGH},
* {@link AggressiveMode#AGGRESSIVE AGGRESSIVE}, {@link AggressiveMode#MOST_AGGRESSIVE MOST_AGGRESSIVE}.
*/
public static final class AggressiveMode {
public int getMode() {
return mMode;
}
/**
* This constant represents the aggressiveness of the AECM instance in MILD_MODE
*/
public static final AggressiveMode MILD = new AggressiveMode(
0);
/**
* This constant represents the aggressiveness of the AECM instance in MEDIUM_MODE
*/
public static final AggressiveMode MEDIUM = new AggressiveMode(
1);
/**
* This constant represents the aggressiveness of the AECM instance in HIGH_MODE
*/
public static final AggressiveMode HIGH = new AggressiveMode(
2);
/**
* This constant represents the aggressiveness of the AECM instance in AGGRESSIVE_MODE
*/
public static final AggressiveMode AGGRESSIVE = new AggressiveMode(
3);
/**
* This constant represents the aggressiveness of the AECM instance in MOST_AGGRESSIVE_MODE
*/
public static final AggressiveMode MOST_AGGRESSIVE = new AggressiveMode(
4);
private final int mMode;
public AggressiveMode(int mode) {
mMode = mode;
}
}
// /////////////////////////////////////////////////////////
// PRIVATE MEMBERS
private int mAecmHandler = -1; // the handler of AECM instance.
private AecmConfig mAecmConfig = null; // the configurations of AECM instance.
private SamplingFrequency mSampFreq = null; // sampling frequency of input speech data.
private boolean mIsInit = false; // whether the AECM instance is initialized or not.
// /////////////////////////////////////////////////////////
// CONSTRUCTOR
/**
* Generates a new AECM instance. Setting the sampling frequency parameter or passing null are both OK.
*
* @param sampFreqOfData
* - sampling frequency of input audio data. if null, then {@link SamplingFrequency#FS_16000Hz
* FS_16000Hz} is set.
*/
public MobileAEC(SamplingFrequency sampFreqOfData) {
setSampFreq(sampFreqOfData);
mAecmConfig = new AecmConfig();
// create new AECM instance but without initialize. Init things are in prepare() method instead.
mAecmHandler = nativeCreateAecmInstance();
}
// /////////////////////////////////////////////////////////
// PUBLIC METHODS
/**
* set the sampling rate of speech data.
*
* @param fs
* - sampling frequency of speech data, if null then {@link SamplingFrequency#FS_16000Hz FS_16000Hz} is
* set.
*/
public void setSampFreq(SamplingFrequency fs) {
if (fs == null)
mSampFreq = SamplingFrequency.FS_16000Hz;
else
mSampFreq = fs;
}
/**
* set the far-end signal of AECM instance.
*
* @param farendBuf
* @param numOfSamples
* @return the {@link MobileAEC MobileAEC} object itself.
* @throws Exception
* - if farendBuffer() is called on an unprepared AECM instance or you pass an invalid parameter.<br>
*/
public MobileAEC farendBuffer(short[] farendBuf, int numOfSamples)
throws Exception {
// check if AECM instance is not initialized.
if (!mIsInit) {
// TODO(billhoo) - create a custom exception instead of using java.lang.Exception
throw new Exception(
"setFarendBuffer() called on an unprepared AECM instance.");
}
if (nativeBufferFarend(mAecmHandler, farendBuf, numOfSamples) == -1)
// TODO(billhoo) - create a custom exception instead of using java.lang.Exception
throw new Exception(
"setFarendBuffer() failed due to invalid arguments.");
return this;
}
/**
* Core process of the AECM instance; must be called on a prepared AECM instance. Only 80 or 160 sample blocks
* of data are supported.
*
* @param nearendNoisy
* - In buffer containing one frame of reference nearend+echo signal. If noise reduction is active,
* provide the noisy signal here.
* @param nearendClean
* - In buffer containing one frame of nearend+echo signal. If noise reduction is active, provide the
* clean signal here. Otherwise pass a NULL pointer.
* @param out
* - Out buffer, one frame of processed nearend.
* @param numOfSamples
* - Number of samples in nearend buffer
* @param delay
* - Delay estimate for sound card and system buffers <br>
* delay = (t_render - t_analyze) + (t_process - t_capture)<br>
* where<br>
* - t_analyze is the time a frame is passed to farendBuffer() and t_render is the time the first sample
* of the same frame is rendered by the audio hardware.<br>
* - t_capture is the time the first sample of a frame is captured by the audio hardware and t_process is
* the time the same frame is passed to echoCancellation().
*
* @throws Exception
* - if echoCancellation() is called on an unprepared AECM instance or an invalid parameter is passed.<br>
*/
public void echoCancellation(short[] nearendNoisy, short[] nearendClean,
short[] out, short numOfSamples, short delay) throws Exception {
// check if AECM instance is not initialized.
if (!mIsInit) {
// TODO(billhoo) - create a custom exception instead of using java.lang.Exception
throw new Exception(
"echoCancelling() called on an unprepared AECM instance.");
}
if (nativeAecmProcess(mAecmHandler, nearendNoisy, nearendClean, out,
numOfSamples, delay) == -1)
// TODO(billhoo) - create a custom exception instead of using java.lang.Exception
throw new Exception(
"echoCancellation() failed due to invalid arguments.");
}
/**
* Sets the aggressiveness mode of the AECM instance; the higher the mode, the more aggressive the instance will be.
*
* @param mode
* - the aggressiveness mode to set.
* @return the {@link MobileAEC MobileAEC} object itself.
* @throws NullPointerException
* - if mode is null.
*/
public MobileAEC setAecmMode(AggressiveMode mode)
throws NullPointerException {
// check the mode argument.
if (mode == null)
throw new NullPointerException(
"setAecMode() failed due to null argument.");
mAecmConfig.mAecmMode = (short) mode.getMode();
return this;
}
/**
* Call this after the preliminary setup is finished or whenever any settings are changed, to prepare the AECM
* instance; otherwise the new settings will be ignored by the AECM instance.
*
* @return the {@link MobileAEC MobileAEC} object itself.
*/
public MobileAEC prepare() {
if (mIsInit) {
close();
mAecmHandler = nativeCreateAecmInstance();
}
mInitAecmInstance((int) mSampFreq.getFS());
mIsInit = true;
// set AecConfig to native side.
nativeSetConfig(mAecmHandler, mAecmConfig);
return this;
}
/**
* Releases the resources of the AECM instance; the instance is no longer available until the next
* <b>prepare()</b> is called.<br>
* You should <b>always</b> call this <b>manually</b> when everything is done.
*/
public void close() {
if (mIsInit) {
nativeFreeAecmInstance(mAecmHandler);
mAecmHandler = -1;
mIsInit = false;
}
}
// ////////////////////////////////////////////////////////
// PROTECTED METHODS
@Override
protected void finalize() throws Throwable {
super.finalize();
// TODO(billhoo) need a safety one.
if (mIsInit) {
close();
}
}
// ////////////////////////////////////////////////////////
// PRIVATE METHODS
/**
* Initializes the AECM instance.
*
* @param SampFreq
* - sampling frequency of the data.
*/
private void mInitAecmInstance(int SampFreq) {
if (!mIsInit) {
nativeInitializeAecmInstance(mAecmHandler, SampFreq);
// initialize configurations of AECM instance.
mAecmConfig = new AecmConfig();
// set default configuration of AECM instance
nativeSetConfig(mAecmHandler, mAecmConfig);
mIsInit = true;
}
}
// ////////////////////////////////////////////////////////
// PRIVATE NESTED CLASSES
/**
* Acoustic Echo Cancellation for Mobile configuration class; holds the configuration of the AECM instance.<br>
* [NOTE] <b>DO NOT</b> rename the members, or you must change the native code to match; otherwise the native
* code cannot find the pre-bound member names.<br>
*
*/
@SuppressWarnings("unused")
public class AecmConfig {
private short mAecmMode = (short) AggressiveMode.AGGRESSIVE.getMode(); // default AggressiveMode.AGGRESSIVE
private short mCngMode = AECM_ENABLE; // AECM_UNABLE, AECM_ENABLE (default)
}
// ///////////////////////////////////////////
// PRIVATE NATIVE INTERFACES
/**
* Allocates the memory needed by the AECM. The memory needs to be initialized separately using the
* nativeInitializeAecmInstance() method.
*
* @return -1: error<br>
* other values: created AECM instance handler.
*
*/
private static native int nativeCreateAecmInstance();
/**
* Release the memory allocated by nativeCreateAecmInstance().
*
* @param aecmHandler
* - handler of the AECM instance created by nativeCreateAecmInstance()
* @return 0: OK<br>
* -1: error
*/
private static native int nativeFreeAecmInstance(int aecmHandler);
/**
* Initializes an AECM instance.
*
* @param aecmHandler
* - Handler of AECM instance
* @param samplingFrequency
* - Sampling frequency of data
* @return 0: OK<br>
* -1: error
*/
private static native int nativeInitializeAecmInstance(int aecmHandler,
int samplingFrequency);
/**
* Inserts an 80 or 160 sample block of data into the farend buffer.
*
* @param aecmHandler
* - Handler to the AECM instance
* @param farend
* - In buffer containing one frame of farend signal for L band
* @param nrOfSamples
* - Number of samples in farend buffer
* @return 0: OK<br>
* -1: error
*/
private static native int nativeBufferFarend(int aecmHandler,
short[] farend, int nrOfSamples);
/**
* Runs the AECM on an 80 or 160 sample block of data.
*
* @param aecmHandler
* - Handler to the AECM instance
* @param nearendNoisy
* - In buffer containing one frame of reference nearend+echo signal. If noise reduction is active,
* provide the noisy signal here.
* @param nearendClean
* - In buffer containing one frame of nearend+echo signal. If noise reduction is active, provide the
* clean signal here. Otherwise pass a NULL pointer.
* @param out
* - Out buffer, one frame of processed nearend.
* @param nrOfSamples
* - Number of samples in nearend buffer
* @param msInSndCardBuf
* - Delay estimate for sound card and system buffers <br>
* @return 0: OK<br>
* -1: error
*/
private static native int nativeAecmProcess(int aecmHandler,
short[] nearendNoisy, short[] nearendClean, short[] out,
short nrOfSamples, short msInSndCardBuf);
/**
* Enables the user to set certain parameters on-the-fly.
*
* @param aecmHandler
* - Handler to the AECM instance
* @param aecmConfig
* - the new configuration of AECM instance to set.
*
* @return 0: OK<br>
* -1: error
*/
private static native int nativeSetConfig(int aecmHandler,
AecmConfig aecmConfig);
}
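The delay estimate that echoCancellation() documents above, delay = (t_render - t_analyze) + (t_process - t_capture), is plain arithmetic over four timestamps. A minimal sketch, with a hypothetical class name and hypothetical timestamp values in milliseconds:

```java
// Sketch of the sound-card delay estimate described in the
// echoCancellation() Javadoc; class name and values are illustrative.
public class DelayEstimateSketch {
    /**
     * delay = (t_render - t_analyze) + (t_process - t_capture),
     * i.e. render-path latency plus capture-path latency.
     */
    public static long estimateDelay(long tAnalyze, long tRender,
                                     long tCapture, long tProcess) {
        return (tRender - tAnalyze) + (tProcess - tCapture);
    }
}
```

For example, if a far-end frame is buffered at t=0 and rendered at t=40, while the matching near-end frame is captured at t=10 and processed at t=50, the estimate is 80 ms.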

@ -0,0 +1,244 @@
package org.easydarwin.audio;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.media.MediaRecorder;
import android.os.Process;
import androidx.annotation.Nullable;
import android.util.Log;
import org.easydarwin.easypusher.BuildConfig;
import org.easydarwin.muxer.EasyMuxer;
import org.easydarwin.push.Pusher;
import java.nio.ByteBuffer;
public class AudioStream {
EasyMuxer muxer;
private int samplingRate = 8000;
private int bitRate = 16000;
private int BUFFER_SIZE = 1920;
int mSamplingRateIndex = 0;
AudioRecord mAudioRecord;
MediaCodec mMediaCodec;
Pusher easyPusher;
private Thread mThread = null;
String TAG = "AudioStream";
//final String path = Environment.getExternalStorageDirectory() + "/123450001.aac";
protected MediaCodec.BufferInfo mBufferInfo = new MediaCodec.BufferInfo();
protected ByteBuffer[] mBuffers = null;
/**
* There are 13 sampling frequencies supported by ADTS.
**/
public static final int[] AUDIO_SAMPLING_RATES = {96000, // 0
88200, // 1
64000, // 2
48000, // 3
44100, // 4
32000, // 5
24000, // 6
22050, // 7
16000, // 8
12000, // 9
11025, // 10
8000, // 11
7350, // 12
-1, // 13
-1, // 14
-1, // 15
};
private Thread mWriter;
private MediaFormat newFormat;
public AudioStream(Pusher easyPusher) {
this.easyPusher = easyPusher;
int i = 0;
for (; i < AUDIO_SAMPLING_RATES.length; i++) {
if (AUDIO_SAMPLING_RATES[i] == samplingRate) {
mSamplingRateIndex = i;
break;
}
}
}
/**
* Starts audio capture and AAC encoding on a background thread.
*/
public void startRecord() {
mThread = new Thread(new Runnable() {
@Override
public void run() {
Process.setThreadPriority(Process.THREAD_PRIORITY_AUDIO);
int len = 0, bufferIndex = 0;
try {
int bufferSize = AudioRecord.getMinBufferSize(samplingRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
mAudioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, samplingRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
mMediaCodec = MediaCodec.createEncoderByType("audio/mp4a-latm");
MediaFormat format = new MediaFormat();
format.setString(MediaFormat.KEY_MIME, "audio/mp4a-latm");
format.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
format.setInteger(MediaFormat.KEY_CHANNEL_COUNT, 1);
format.setInteger(MediaFormat.KEY_SAMPLE_RATE, samplingRate);
format.setInteger(MediaFormat.KEY_AAC_PROFILE,
MediaCodecInfo.CodecProfileLevel.AACObjectLC);
format.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, BUFFER_SIZE);
mMediaCodec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
mMediaCodec.start();
mWriter = new WriterThread();
mWriter.start();
mAudioRecord.startRecording();
final ByteBuffer[] inputBuffers = mMediaCodec.getInputBuffers();
long presentationTimeUs = 0;
while (mThread != null) {
bufferIndex = mMediaCodec.dequeueInputBuffer(1000);
if (bufferIndex >= 0) {
inputBuffers[bufferIndex].clear();
len = mAudioRecord.read(inputBuffers[bufferIndex], BUFFER_SIZE);
long timeUs = System.nanoTime() / 1000;
// Log.i(TAG, String.format("audio: %d [%d] ", timeUs, timeUs - presentationTimeUs));
presentationTimeUs = timeUs;
if (len == AudioRecord.ERROR_INVALID_OPERATION || len == AudioRecord.ERROR_BAD_VALUE) {
mMediaCodec.queueInputBuffer(bufferIndex, 0, 0, presentationTimeUs, 0);
} else {
mMediaCodec.queueInputBuffer(bufferIndex, 0, len, presentationTimeUs, 0);
}
}
}
} catch (Exception e) {
Log.e(TAG, "Record___Error!!!!!");
e.printStackTrace();
} finally {
Thread t = mWriter;
mWriter = null;
while (t != null && t.isAlive()) {
try {
t.interrupt();
t.join();
} catch (InterruptedException e) {
e.printStackTrace();
}
}
try {
if (mAudioRecord != null) {
mAudioRecord.stop();
mAudioRecord.release();
mAudioRecord = null;
}
} catch (Throwable ex) {
ex.printStackTrace();
}
try {
if (mMediaCodec != null) {
mMediaCodec.stop();
mMediaCodec.release();
mMediaCodec = null;
}
} catch (Throwable ex) {
ex.printStackTrace();
}
}
}
}, "AACRecorder");
mThread.start();
}
public synchronized void setMuxer(EasyMuxer muxer) {
if (muxer != null) {
if (newFormat != null)
muxer.addTrack(newFormat, false);
}
this.muxer = muxer;
}
private class WriterThread extends Thread {
@Override
public void run() {
int index = 0;
if (android.os.Build.VERSION.SDK_INT < android.os.Build.VERSION_CODES.LOLLIPOP) {
mBuffers = mMediaCodec.getOutputBuffers();
}
ByteBuffer mBuffer = ByteBuffer.allocate(10240);
do {
index = mMediaCodec.dequeueOutputBuffer(mBufferInfo, 10000);
if (index >= 0) {
if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
continue;
}
mBuffer.clear();
ByteBuffer outputBuffer = null;
if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.LOLLIPOP) {
outputBuffer = mMediaCodec.getOutputBuffer(index);
} else {
outputBuffer = mBuffers[index];
}
if (muxer != null)
muxer.pumpStream(outputBuffer, mBufferInfo, false);
outputBuffer.get(mBuffer.array(), 7, mBufferInfo.size);
outputBuffer.clear();
mBuffer.position(7 + mBufferInfo.size);
addADTStoPacket(mBuffer.array(), mBufferInfo.size + 7);
mBuffer.flip();
easyPusher.push(mBuffer.array(), 0, mBufferInfo.size + 7, mBufferInfo.presentationTimeUs / 1000, 0);
if (BuildConfig.DEBUG)
Log.i(TAG, String.format("push audio stamp:%d", mBufferInfo.presentationTimeUs / 1000));
mMediaCodec.releaseOutputBuffer(index, false);
} else if (index == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
mBuffers = mMediaCodec.getOutputBuffers();
} else if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
synchronized (AudioStream.this) {
Log.v(TAG, "output format changed...");
newFormat = mMediaCodec.getOutputFormat();
if (muxer != null)
muxer.addTrack(newFormat, false);
}
} else if (index == MediaCodec.INFO_TRY_AGAIN_LATER) {
// Log.v(TAG, "No buffer available...");
} else {
Log.e(TAG, "Message: " + index);
}
} while (mWriter != null);
}
}
/**
* Fills in the first 7 bytes of the packet with an ADTS header (MPEG-4 AAC LC, mono, no CRC).
*/
private void addADTStoPacket(byte[] packet, int packetLen) {
packet[0] = (byte) 0xFF; // syncword (high 8 bits)
packet[1] = (byte) 0xF1; // syncword (low 4 bits), MPEG-4, layer 0, no CRC
packet[2] = (byte) (((2 - 1) << 6) + (mSamplingRateIndex << 2) + (1 >> 2)); // profile (AAC LC), sampling index, channel config (high bit, 0 for mono)
packet[3] = (byte) (((1 & 3) << 6) + (packetLen >> 11)); // channel config (low 2 bits), frame length (bits 12..11)
packet[4] = (byte) ((packetLen & 0x7FF) >> 3); // frame length (bits 10..3)
packet[5] = (byte) (((packetLen & 7) << 5) + 0x1F); // frame length (bits 2..0), buffer fullness (high bits)
packet[6] = (byte) 0xFC; // buffer fullness (low bits), one raw data block
}
public void stop() {
try {
Thread t = mThread;
mThread = null;
if (t != null) {
t.interrupt();
t.join();
}
} catch (InterruptedException e) {
e.printStackTrace();
}
}
}
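The 7-byte ADTS header written by addADTStoPacket() can be checked off-device. The sketch below repeats the same bit layout (MPEG-4 AAC LC, mono, no CRC) as a static method and reads the 13-bit frame length back out of the header; the class and method names are illustrative:

```java
// Standalone sketch of the ADTS header layout used by addADTStoPacket();
// class/method names are illustrative.
public class AdtsSketch {
    public static byte[] header(int packetLen, int samplingRateIndex) {
        byte[] p = new byte[7];
        p[0] = (byte) 0xFF;                                        // syncword (high 8 bits)
        p[1] = (byte) 0xF1;                                        // syncword, MPEG-4, no CRC
        p[2] = (byte) (((2 - 1) << 6) + (samplingRateIndex << 2)); // profile=AAC LC, sampling index
        p[3] = (byte) (((1 & 3) << 6) + (packetLen >> 11));        // channel config=1, length bits 12..11
        p[4] = (byte) ((packetLen & 0x7FF) >> 3);                  // length bits 10..3
        p[5] = (byte) (((packetLen & 7) << 5) + 0x1F);             // length bits 2..0, buffer fullness
        p[6] = (byte) 0xFC;                                        // buffer fullness, one raw block
        return p;
    }

    /** Reads the 13-bit frame length back out of an ADTS header. */
    public static int frameLength(byte[] p) {
        return ((p[3] & 0x03) << 11) | ((p[4] & 0xFF) << 3) | ((p[5] & 0xFF) >> 5);
    }
}
```

Note that the frame length field covers the whole ADTS frame including the 7 header bytes, which is why the caller passes mBufferInfo.size + 7.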

@ -0,0 +1,8 @@
package org.easydarwin.bus;
/**
* Created by apple on 2017/7/21.
*/
public class StartRecord {
}

@ -0,0 +1,8 @@
package org.easydarwin.bus;
/**
* Created by apple on 2017/7/21.
*/
public class StopRecord {
}

@ -0,0 +1,14 @@
package org.easydarwin.bus;
/**
* Created by apple on 2017/5/14.
*/
public class StreamStat {
public final int fps, bps;
public StreamStat(int fps, int bps) {
this.fps = fps;
this.bps = bps;
}
}

@ -0,0 +1,8 @@
package org.easydarwin.bus;
/**
* Created by apple on 2017/8/29.
*/
public class SupportResolution {
}

@ -0,0 +1,30 @@
/*
Copyright (c) 2013-2016 EasyDarwin.ORG. All rights reserved.
Github: https://github.com/EasyDarwin
WEChat: EasyDarwin
Website: http://www.easydarwin.org
*/
package org.easydarwin.config;
/**
* Config
*/
public class Config {
public static final String SERVER_IP = "serverIp";
public static final String SERVER_PORT = "serverPort";
public static final String STREAM_ID = "streamId";
public static final String STREAM_ID_PREFIX = "";
public static final String DEFAULT_SERVER_IP = "cloud.easydarwin.org";
public static final String DEFAULT_SERVER_PORT = "554";
public static final String DEFAULT_STREAM_ID = STREAM_ID_PREFIX + String.valueOf((int) (Math.random() * 1000000 + 100000));
public static final String PREF_NAME = "easy_pref";
public static final String K_RESOLUTION = "k_resolution";
public static final String SERVER_URL = "serverUrl";
public static final String DEFAULT_SERVER_URL = "rtmp://www.easydss.com:10085/live/stream_"+String.valueOf((int) (Math.random() * 1000000 + 100000));
}
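DEFAULT_STREAM_ID and DEFAULT_SERVER_URL above both draw a pseudo-random numeric suffix from the same expression. A quick range check of that expression, with an illustrative class name:

```java
// Range sketch for the expression used by Config.DEFAULT_STREAM_ID:
// (int) (Math.random() * 1000000 + 100000) always lands in [100000, 1099999],
// so the generated ID is always 6 or 7 digits. Class name is illustrative.
public class StreamIdSketch {
    public static int randomId() {
        return (int) (Math.random() * 1000000 + 100000);
    }
}
```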

@ -0,0 +1,112 @@
package org.easydarwin.easypusher;
import android.app.Application;
import android.content.SharedPreferences;
import android.content.res.AssetManager;
import android.preference.PreferenceManager;
import org.easydarwin.config.Config;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
public class EasyApplication extends Application {
public static final String KEY_ENABLE_VIDEO = "key-enable-video";
private static EasyApplication mApplication;
public long mRecordingBegin;
public boolean mRecording;
@Override
public void onCreate() {
super.onCreate();
mApplication = this;
// for compatibility
resetDefaultServer();
File youyuan = getFileStreamPath("SIMYOU.ttf");
if (!youyuan.exists()){
AssetManager am = getAssets();
try {
InputStream is = am.open("zk/SIMYOU.ttf");
FileOutputStream os = openFileOutput("SIMYOU.ttf", MODE_PRIVATE);
byte[] buffer = new byte[1024];
int len = 0;
while ((len = is.read(buffer)) != -1) {
os.write(buffer, 0, len);
}
os.close();
is.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
private void resetDefaultServer() {
SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(this);
String defaultIP = sharedPreferences.getString(Config.SERVER_IP, Config.DEFAULT_SERVER_IP);
if ("114.55.107.180".equals(defaultIP)
|| "121.40.50.44".equals(defaultIP)
|| "www.easydarwin.org".equals(defaultIP)){
sharedPreferences.edit().putString(Config.SERVER_IP, Config.DEFAULT_SERVER_IP).apply();
}
String defaultRtmpURL = sharedPreferences.getString(Config.SERVER_URL, Config.DEFAULT_SERVER_URL);
int result1 = defaultRtmpURL.indexOf("rtmp://www.easydss.com/live");
int result2 = defaultRtmpURL.indexOf("rtmp://121.40.50.44/live");
if(result1 != -1 || result2 != -1){
sharedPreferences.edit().putString(Config.SERVER_URL, Config.DEFAULT_SERVER_URL).apply();
}
}
public static EasyApplication getEasyApplication() {
return mApplication;
}
public void saveStringIntoPref(String key, String value) {
SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(this);
SharedPreferences.Editor editor = sharedPreferences.edit();
editor.putString(key, value);
editor.commit();
}
public String getIp() {
SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(this);
String ip = sharedPreferences.getString(Config.SERVER_IP, Config.DEFAULT_SERVER_IP);
return ip;
}
public String getPort() {
SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(this);
String port = sharedPreferences.getString(Config.SERVER_PORT, Config.DEFAULT_SERVER_PORT);
return port;
}
public String getId() {
SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(this);
String id = sharedPreferences.getString(Config.STREAM_ID, Config.DEFAULT_STREAM_ID);
if (!id.contains(Config.STREAM_ID_PREFIX)) {
id = Config.STREAM_ID_PREFIX + id;
}
saveStringIntoPref(Config.STREAM_ID, id);
return id;
}
public String getUrl() {
SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(this);
String defValue = Config.DEFAULT_SERVER_URL;
String ip = sharedPreferences.getString(Config.SERVER_URL, defValue);
if (ip.equals(defValue)){
sharedPreferences.edit().putString(Config.SERVER_URL, defValue).apply();
}
return ip;
}
}
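The font-copy loop in onCreate() is a standard buffered stream copy. The same pattern extracted into a helper (class name illustrative), which can be exercised with in-memory streams instead of an AssetManager:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// The buffered copy loop from EasyApplication.onCreate(), extracted;
// the class name is illustrative.
public class StreamCopySketch {
    /** Copies is to os in 1 KiB chunks and returns the number of bytes copied. */
    public static long copy(InputStream is, OutputStream os) throws IOException {
        byte[] buffer = new byte[1024];
        long total = 0;
        int len;
        while ((len = is.read(buffer)) != -1) {
            os.write(buffer, 0, len);
            total += len;
        }
        return total;
    }
}
```

In production code the streams should also be closed in a finally block (or try-with-resources), which the original onCreate() skips on the error path.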

@ -0,0 +1,165 @@
/*
* Copyright (C) 2011-2014 GUIGUI Simon, fyhertz@gmail.com
*
* This file is part of Spydroid (http://code.google.com/p/spydroid-ipcamera/)
*
* Spydroid is free software; you can redistribute it and/or modify
* it under the terms of the GNU General Public License as published by
* the Free Software Foundation; either version 3 of the License, or
* (at your option) any later version.
*
* This source code is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this source code; if not, write to the Free Software
* Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
*/
package org.easydarwin.hw;
import java.util.ArrayList;
import java.util.HashSet;
import java.util.Set;
import android.annotation.SuppressLint;
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.util.Log;
@SuppressLint("InlinedApi")
public class CodecManager {
public final static String TAG = "CodecManager";
public static final int[] SUPPORTED_COLOR_FORMATS = {
MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar,
MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar,
MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar,
MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar,
MediaCodecInfo.CodecCapabilities.COLOR_TI_FormatYUV420PackedSemiPlanar
};
private static Codec[] sEncoders = null;
private static Codec[] sDecoders = null;
static class Codec {
public Codec(String name, Integer[] formats) {
this.name = name;
this.formats = formats;
}
public String name;
public Integer[] formats;
}
/**
* Lists all encoders that claim to support a color format that we know how to use.
* @return A list of those encoders
*/
@SuppressLint("NewApi")
public synchronized static Codec[] findEncodersForMimeType(String mimeType) {
if (sEncoders != null) return sEncoders;
ArrayList<Codec> encoders = new ArrayList<Codec>();
// We loop through the encoders, apparently this can take up to a sec (tested on a GS3)
for(int j = 0; j < MediaCodecList.getCodecCount(); j++){
MediaCodecInfo codecInfo = MediaCodecList.getCodecInfoAt(j);
if (!codecInfo.isEncoder()) continue;
String[] types = codecInfo.getSupportedTypes();
for (int i = 0; i < types.length; i++) {
if (types[i].equalsIgnoreCase(mimeType)) {
try {
MediaCodecInfo.CodecCapabilities capabilities = codecInfo.getCapabilitiesForType(mimeType);
Set<Integer> formats = new HashSet<Integer>();
// And through the color formats supported
for (int k = 0; k < capabilities.colorFormats.length; k++) {
int format = capabilities.colorFormats[k];
for (int l=0;l<SUPPORTED_COLOR_FORMATS.length;l++) {
if (format == SUPPORTED_COLOR_FORMATS[l]) {
formats.add(format);
break;
}
}
}
Codec codec = new Codec(codecInfo.getName(), (Integer[]) formats.toArray(new Integer[formats.size()]));
encoders.add(codec);
} catch (Exception e) {
Log.wtf(TAG,e);
}
}
}
}
sEncoders = (Codec[]) encoders.toArray(new Codec[encoders.size()]);
if (sEncoders.length == 0) {
sEncoders = new Codec[]{new Codec(null, new Integer[]{0})};
}
return sEncoders;
}
/**
* Lists all decoders that claim to support a color format that we know how to use.
* @return A list of those decoders
*/
@SuppressLint("NewApi")
public synchronized static Codec[] findDecodersForMimeType(String mimeType) {
if (sDecoders != null) return sDecoders;
ArrayList<Codec> decoders = new ArrayList<Codec>();
// We loop through the decoders, apparently this can take up to a sec (tested on a GS3)
for(int j = MediaCodecList.getCodecCount() - 1; j >= 0; j--){
MediaCodecInfo codecInfo = MediaCodecList.getCodecInfoAt(j);
if (codecInfo.isEncoder()) continue;
String[] types = codecInfo.getSupportedTypes();
for (int i = 0; i < types.length; i++) {
if (types[i].equalsIgnoreCase(mimeType)) {
try {
MediaCodecInfo.CodecCapabilities capabilities = codecInfo.getCapabilitiesForType(mimeType);
Set<Integer> formats = new HashSet<Integer>();
// And through the color formats supported
for (int k = 0; k < capabilities.colorFormats.length; k++) {
int format = capabilities.colorFormats[k];
for (int l=0;l<SUPPORTED_COLOR_FORMATS.length;l++) {
if (format == SUPPORTED_COLOR_FORMATS[l]) {
formats.add(format);
break;
}
}
}
Codec codec = new Codec(codecInfo.getName(), (Integer[]) formats.toArray(new Integer[formats.size()]));
decoders.add(codec);
} catch (Exception e) {
Log.wtf(TAG,e);
}
}
}
}
sDecoders = (Codec[]) decoders.toArray(new Codec[decoders.size()]);
// We will use the decoder from google first, it seems to work properly on many phones
for (int i=0;i<sDecoders.length;i++) {
if (sDecoders[i].name.equalsIgnoreCase("omx.google.h264.decoder")) {
Codec codec = sDecoders[0];
sDecoders[0] = sDecoders[i];
sDecoders[i] = codec;
}
}
return sDecoders;
}
}
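The inner loops of findEncodersForMimeType() and findDecodersForMimeType() simply intersect a codec's reported color formats with SUPPORTED_COLOR_FORMATS. The same filtering in plain Java, runnable off-device; the numeric constants mirror the standard MediaCodecInfo.CodecCapabilities values, and the class name is illustrative:

```java
import java.util.LinkedHashSet;
import java.util.Set;

// Off-device sketch of the color-format intersection done in
// CodecManager. Constants mirror MediaCodecInfo.CodecCapabilities.
public class ColorFormatSketch {
    static final int[] SUPPORTED = {
        21,         // COLOR_FormatYUV420SemiPlanar
        39,         // COLOR_FormatYUV420PackedSemiPlanar
        19,         // COLOR_FormatYUV420Planar
        20,         // COLOR_FormatYUV420PackedPlanar
        0x7F000100, // COLOR_TI_FormatYUV420PackedSemiPlanar
    };

    /** Keeps only the reported formats that we know how to feed. */
    public static Set<Integer> usableFormats(int[] reported) {
        Set<Integer> formats = new LinkedHashSet<>();
        for (int format : reported) {
            for (int supported : SUPPORTED) {
                if (format == supported) {
                    formats.add(format);
                    break;
                }
            }
        }
        return formats;
    }
}
```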

@ -0,0 +1,579 @@
/*
* Copyright (C) 2011-2014 GUIGUI Simon, fyhertz@gmail.com
*
* This file is part of Spydroid (http://code.google.com/p/spydroid-ipcamera/)
*
* Spydroid is free software; you can redistribute it and/or modify
* it under the terms of the GNU General Public License as published by
* the Free Software Foundation; either version 3 of the License, or
* (at your option) any later version.
*
* This source code is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this source code; if not, write to the Free Software
* Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
*/
package org.easydarwin.hw;
import java.io.IOException;
import java.io.PrintWriter;
import java.io.StringWriter;
import java.nio.ByteBuffer;
import java.util.Arrays;
import org.easydarwin.easypusher.BuildConfig;
import org.easydarwin.hw.CodecManager.Codec;
import android.annotation.SuppressLint;
import android.content.Context;
import android.content.SharedPreferences;
import android.content.SharedPreferences.Editor;
import android.media.MediaCodec;
import android.media.MediaCodec.BufferInfo;
import android.media.MediaFormat;
import android.os.Build;
import android.preference.PreferenceManager;
import android.util.Base64;
import android.util.Log;
/**
* The purpose of this class is to detect and by-pass some bugs (or
* underspecified configuration) that encoders available through the MediaCodec
* API may have. <br />
* Feeding the encoder with a surface is not tested here. Some bugs you may have
* encountered:<br />
* <ul>
* <li>U and V panes reversed</li>
* <li>Some padding is needed after the Y pane</li>
* <li>stride!=width or slice-height!=height</li>
* </ul>
*/
@SuppressLint("NewApi")
public class EncoderDebugger {
public final static String TAG = "EncoderDebugger";
/**
* Prefix that will be used for all shared preferences saved by
* libstreaming.
*/
private static final String PREF_PREFIX = "libstreaming-";
/**
* If this is set to false the test will be run only once and the result
* will be saved in the shared preferences.
*/
private static final boolean DEBUG = BuildConfig.DEBUG;
/**
* Set this to true to see more logs.
*/
private static final boolean VERBOSE = false;
/**
* Will be incremented every time this test is modified.
*/
private static final int VERSION = 3;
/**
* Bitrate that will be used with the encoder.
*/
private final static int BITRATE = 1000000;
/**
* Framerate that will be used to test the encoder.
*/
private final static int FRAMERATE = 20;
private final static String MIME_TYPE = "video/avc";
private final static int NB_DECODED = 34;
private final static int NB_ENCODED = 50;
private int mEncoderColorFormat;
private String mEncoderName, mErrorLog;
private MediaCodec mEncoder;
private int mWidth, mHeight, mSize;
private byte[] mSPS, mPPS;
private byte[] mData, mInitialImage;
private NV21Convertor mNV21;
private SharedPreferences mPreferences;
private byte[][] mVideo, mDecodedVideo;
private String mB64PPS, mB64SPS;
public synchronized static void asyncDebug(final Context context,
final int width, final int height) {
new Thread(new Runnable() {
@Override
public void run() {
try {
SharedPreferences prefs = PreferenceManager
.getDefaultSharedPreferences(context);
debug(prefs, width, height);
} catch (Exception e) {
// ignored: asyncDebug() is a best-effort background probe
}
}
}).start();
}
public synchronized static EncoderDebugger debug(Context context, int width, int height) {
SharedPreferences prefs = PreferenceManager
.getDefaultSharedPreferences(context);
return debug(prefs, width, height);
}
public synchronized static EncoderDebugger debug(SharedPreferences prefs, int width, int height) {
EncoderDebugger debugger = new EncoderDebugger(prefs, width, height);
debugger.debug();
return debugger;
}
public String getB64PPS() {
return mB64PPS;
}
public String getB64SPS() {
return mB64SPS;
}
public String getEncoderName() {
return mEncoderName;
}
public int getEncoderColorFormat() {
return mEncoderColorFormat;
}
/**
* This {@link NV21Convertor} will do the necessary work to properly feed
* the encoder.
*/
public NV21Convertor getNV21Convertor() {
return mNV21;
}
/**
* A log of all the errors that occurred during the test.
*/
public String getErrorLog() {
return mErrorLog;
}
private EncoderDebugger(SharedPreferences prefs, int width, int height) {
mPreferences = prefs;
mWidth = width;
mHeight = height;
mSize = width * height;
reset();
}
private void reset() {
mNV21 = new NV21Convertor();
mVideo = new byte[NB_ENCODED][];
mDecodedVideo = new byte[NB_DECODED][];
mErrorLog = "";
mPPS = null;
mSPS = null;
}
private void debug() {
// If testing the phone again is not needed,
// we just restore the result from the shared preferences
if (!checkTestNeeded()) {
String resolution = mWidth + "x" + mHeight + "-";
boolean success = mPreferences.getBoolean(PREF_PREFIX + resolution
+ "success", false);
if (!success) {
throw new RuntimeException(
"Phone not supported with this resolution (" + mWidth
+ "x" + mHeight + ")");
}
mNV21.setSize(mWidth, mHeight);
mNV21.setSliceHeigth(mPreferences.getInt(PREF_PREFIX + resolution
+ "sliceHeight", 0));
mNV21.setStride(mPreferences.getInt(PREF_PREFIX + resolution
+ "stride", 0));
mNV21.setYPadding(mPreferences.getInt(PREF_PREFIX + resolution
+ "padding", 0));
mNV21.setPlanar(mPreferences.getBoolean(PREF_PREFIX + resolution
+ "planar", false));
mNV21.setColorPanesReversed(mPreferences.getBoolean(PREF_PREFIX
+ resolution + "reversed", false));
mEncoderName = mPreferences.getString(PREF_PREFIX + resolution
+ "encoderName", "");
mEncoderColorFormat = mPreferences.getInt(PREF_PREFIX + resolution
+ "colorFormat", 0);
mB64PPS = mPreferences.getString(PREF_PREFIX + resolution + "bps",
"");
mB64SPS = mPreferences.getString(PREF_PREFIX + resolution + "sps",
"");
return;
}
if (VERBOSE)
Log.d(TAG, ">>>> Testing the phone for resolution " + mWidth + "x"
+ mHeight);
// Builds a list of available encoders and decoders we may be able to
// use because they support some nice color formats
Codec[] encoders = CodecManager.findEncodersForMimeType(MIME_TYPE);
Codec[] decoders = CodecManager.findDecodersForMimeType(MIME_TYPE);
int count = 0, n = 1;
for (int i = 0; i < encoders.length; i++) {
count += encoders[i].formats.length;
}
// Tries available encoders
for (int i = 0; i < encoders.length; i++) {
for (int j = 0; j < encoders[i].formats.length; j++) {
reset();
mEncoderName = encoders[i].name;
mEncoderColorFormat = encoders[i].formats[j];
if (VERBOSE)
Log.v(TAG, ">> Test " + (n++) + "/" + count + ": "
+ mEncoderName + " with color format "
+ mEncoderColorFormat + " at " + mWidth + "x"
+ mHeight);
// Converts from NV21 to YUV420 with the specified parameters
mNV21.setSize(mWidth, mHeight);
mNV21.setSliceHeigth(mHeight);
mNV21.setStride(mWidth);
mNV21.setYPadding(0);
mNV21.setEncoderColorFormat(mEncoderColorFormat);
// /!\ NV21Convertor can directly modify the input
createTestImage();
mData = mNV21.convert(mInitialImage);
try {
// Starts the encoder
configureEncoder();
searchSPSandPPS();
saveTestResult(true);
Log.v(TAG, "The encoder " + mEncoderName
+ " is usable with resolution " + mWidth + "x"
+ mHeight);
return;
} catch (Exception e) {
StringWriter sw = new StringWriter();
PrintWriter pw = new PrintWriter(sw);
e.printStackTrace(pw);
String stack = sw.toString();
String str = "Encoder " + mEncoderName
+ " cannot be used with color format "
+ mEncoderColorFormat;
if (VERBOSE)
Log.e(TAG, str, e);
mErrorLog += str + "\n" + stack;
e.printStackTrace();
} finally {
releaseEncoder();
}
}
}
saveTestResult(false);
Log.e(TAG, "No usable encoder was found on the phone for resolution "
+ mWidth + "x" + mHeight);
throw new RuntimeException(
"No usable encoder was found on the phone for resolution "
+ mWidth + "x" + mHeight);
}
private boolean checkTestNeeded() {
String resolution = mWidth + "x" + mHeight + "-";
// Forces the test
if (DEBUG || mPreferences == null)
return true;
// If the sdk has changed on the phone, or the version of the test
// it has to be run again
if (mPreferences.contains(PREF_PREFIX + resolution + "lastSdk")) {
int lastSdk = mPreferences.getInt(PREF_PREFIX + resolution
+ "lastSdk", 0);
int lastVersion = mPreferences.getInt(PREF_PREFIX + resolution
+ "lastVersion", 0);
if (Build.VERSION.SDK_INT > lastSdk || VERSION > lastVersion) {
return true;
}
} else {
return true;
}
return false;
}
/**
* Saves the result of the test in the shared preferences; we will run it
* again only if the SDK has changed on the phone, or if this test has been
* modified.
*/
private void saveTestResult(boolean success) {
String resolution = mWidth + "x" + mHeight + "-";
Editor editor = mPreferences.edit();
editor.putBoolean(PREF_PREFIX + resolution + "success", success);
if (success) {
editor.putInt(PREF_PREFIX + resolution + "lastSdk",
Build.VERSION.SDK_INT);
editor.putInt(PREF_PREFIX + resolution + "lastVersion", VERSION);
editor.putInt(PREF_PREFIX + resolution + "sliceHeight",
mNV21.getSliceHeigth());
editor.putInt(PREF_PREFIX + resolution + "stride",
mNV21.getStride());
editor.putInt(PREF_PREFIX + resolution + "padding",
mNV21.getYPadding());
editor.putBoolean(PREF_PREFIX + resolution + "planar",
mNV21.getPlanar());
editor.putBoolean(PREF_PREFIX + resolution + "reversed",
mNV21.getUVPanesReversed());
editor.putString(PREF_PREFIX + resolution + "encoderName",
mEncoderName);
editor.putInt(PREF_PREFIX + resolution + "colorFormat",
mEncoderColorFormat);
editor.putString(PREF_PREFIX + resolution + "bps", mB64PPS);
editor.putString(PREF_PREFIX + resolution + "sps", mB64SPS);
}
editor.commit();
}
/**
* Creates the test image that will be used to feed the encoder.
*/
private void createTestImage() {
mInitialImage = new byte[3 * mSize / 2];
for (int i = 0; i < mSize; i++) {
mInitialImage[i] = (byte) (40 + i % 199);
}
for (int i = mSize; i < 3 * mSize / 2; i += 2) {
mInitialImage[i] = (byte) (40 + i % 200);
mInitialImage[i + 1] = (byte) (40 + (i + 99) % 200);
}
}
/**
* Instantiates and starts the encoder.
*
* @throws IOException
*/
private void configureEncoder() throws IOException {
mEncoder = MediaCodec.createByCodecName(mEncoderName);
MediaFormat mediaFormat = MediaFormat.createVideoFormat(MIME_TYPE,
mWidth, mHeight);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, BITRATE);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, FRAMERATE);
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
mEncoderColorFormat);
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
mEncoder.configure(mediaFormat, null, null,
MediaCodec.CONFIGURE_FLAG_ENCODE);
mEncoder.start();
}
private void releaseEncoder() {
if (mEncoder != null) {
try {
mEncoder.stop();
} catch (Exception ignore) {
}
try {
mEncoder.release();
} catch (Exception ignore) {
}
}
}
/**
* Tries to obtain the SPS and the PPS for the encoder.
*/
private long searchSPSandPPS() {
long elapsed = 0, now = timestamp();
ByteBuffer[] inputBuffers = mEncoder.getInputBuffers();
ByteBuffer[] outputBuffers = mEncoder.getOutputBuffers();
BufferInfo info = new BufferInfo();
byte[] csd = new byte[128];
int len = 0, p = 4, q = 4;
while (elapsed < 3000000 && (mSPS == null || mPPS == null)) {
// Some encoders won't give us the SPS and PPS unless they receive
// something to encode first...
int bufferIndex = mEncoder.dequeueInputBuffer(1000000 / FRAMERATE);
if (bufferIndex >= 0) {
check(inputBuffers[bufferIndex].capacity() >= mData.length,
"The input buffer is not big enough.");
inputBuffers[bufferIndex].clear();
inputBuffers[bufferIndex].put(mData, 0, mData.length);
mEncoder.queueInputBuffer(bufferIndex, 0, mData.length,
timestamp(), 0);
} else {
if (VERBOSE)
Log.e(TAG, "No buffer available !");
}
// We are looking for the SPS and the PPS here. Android is
// inconsistent here: some encoders expose those parameters
// through the MediaFormat object (the normal behaviour),
// but others do not. In that case we try to find a NAL unit
// of type 7 (SPS) or 8 (PPS) in the byte stream output by
// the encoder...
int index = mEncoder.dequeueOutputBuffer(info, 1000000 / FRAMERATE);
if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
// The SPS and PPS should be there
MediaFormat format = mEncoder.getOutputFormat();
ByteBuffer spsb = format.getByteBuffer("csd-0");
ByteBuffer ppsb = format.getByteBuffer("csd-1");
mSPS = new byte[spsb.capacity() - 4];
spsb.position(4);
spsb.get(mSPS, 0, mSPS.length);
mPPS = new byte[ppsb.capacity() - 4];
ppsb.position(4);
ppsb.get(mPPS, 0, mPPS.length);
break;
} else if (index == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
outputBuffers = mEncoder.getOutputBuffers();
} else if (index >= 0) {
len = info.size;
if (len < 128) {
outputBuffers[index].get(csd, 0, len);
if (len > 0 && csd[0] == 0 && csd[1] == 0 && csd[2] == 0
&& csd[3] == 1) {
// Parses the SPS and PPS, they could be in two
// different packets and in a different order
// depending on the phone so we don't make any
// assumption about that
while (p < len) {
while (!(csd[p + 0] == 0 && csd[p + 1] == 0
&& csd[p + 2] == 0 && csd[p + 3] == 1)
&& p + 3 < len)
p++;
if (p + 3 >= len)
p = len;
if ((csd[q] & 0x1F) == 7) {
mSPS = new byte[p - q];
System.arraycopy(csd, q, mSPS, 0, p - q);
} else {
mPPS = new byte[p - q];
System.arraycopy(csd, q, mPPS, 0, p - q);
}
p += 4;
q = p;
}
}
}
mEncoder.releaseOutputBuffer(index, false);
}
elapsed = timestamp() - now;
}
check(mPPS != null && mSPS != null, "Could not determine the SPS & PPS.");
mB64PPS = Base64.encodeToString(mPPS, 0, mPPS.length, Base64.NO_WRAP);
mB64SPS = Base64.encodeToString(mSPS, 0, mSPS.length, Base64.NO_WRAP);
return elapsed;
}
static int getXPS(byte[] data, int offset, int length, byte[] dataOut,
int[] outLen, int type) {
int i;
int pos0;
int pos1;
pos0 = -1;
for (i = offset; i < length - 4; i++) {
if ((0 == data[i]) && (0 == data[i + 1]) && (1 == data[i + 2])
&& (type == (0x0F & data[i + 3]))) {
pos0 = i;
break;
}
}
if (-1 == pos0) {
return -1;
}
pos1 = -1;
for (i = pos0 + 4; i < length - 4; i++) {
if ((0 == data[i]) && (0 == data[i + 1]) && (0 == data[i + 2])) {
pos1 = i;
break;
}
}
if (-1 == pos1) {
return -2;
}
if (pos1 - pos0 + 1 > outLen[0]) {
return -3; // input buffer too small
}
dataOut[0] = 0;
System.arraycopy(data, pos0, dataOut, 1, pos1 - pos0);
// memcpy(pXPS+1, pES+pos0, pos1-pos0);
// *pMaxXPSLen = pos1-pos0+1;
outLen[0] = pos1 - pos0 + 1;
return 0;
}
private void check(boolean cond, String message) {
if (!cond) {
if (VERBOSE)
Log.e(TAG, message);
throw new IllegalStateException(message);
}
}
private long timestamp() {
return System.nanoTime() / 1000;
}
@Override
public String toString() {
return "EncoderDebugger [mEncoderColorFormat=" + mEncoderColorFormat
+ ", mEncoderName=" + mEncoderName + ", mErrorLog=" + mErrorLog + ", mEncoder="
+ mEncoder + ", mWidth=" + mWidth
+ ", mHeight=" + mHeight + ", mSize=" + mSize + ", mSPS="
+ Arrays.toString(mSPS) + ", mPPS=" + Arrays.toString(mPPS)
+ ", mData=" + Arrays.toString(mData) + ", mInitialImage="
+ Arrays.toString(mInitialImage) + ", mNV21=" + mNV21 + ", mPreferences="
+ mPreferences + ", mVideo=" + Arrays.toString(mVideo)
+ ", mDecodedVideo=" + Arrays.toString(mDecodedVideo)
+ ", mB64PPS=" + mB64PPS + ", mB64SPS=" + mB64SPS
+ "]";
}
}
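The parameter extraction above (`searchSPSandPPS` and `getXPS`) relies on scanning the byte stream for Annex-B start codes and reading the NAL unit type from the low five bits of the byte that follows. A minimal standalone sketch of that scan (`NalScan` is a hypothetical helper, not part of this project):

```java
public class NalScan {
    // Returns the NAL unit type (low 5 bits of the byte after the first
    // 4-byte 00 00 00 01 start code at or after 'from'), or -1 if no
    // start code is found.
    static int nalTypeAfterStartCode(byte[] b, int from) {
        for (int p = from; p + 4 < b.length; p++) {
            if (b[p] == 0 && b[p + 1] == 0 && b[p + 2] == 0 && b[p + 3] == 1) {
                return b[p + 4] & 0x1F;
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        // 0x67 -> type 7 (SPS), 0x68 -> type 8 (PPS)
        byte[] stream = {0, 0, 0, 1, 0x67, 0x42, 0, 0, 0, 1, 0x68, (byte) 0xCE};
        System.out.println(nalTypeAfterStartCode(stream, 0)); // 7
        System.out.println(nalTypeAfterStartCode(stream, 5)); // 8
    }
}
```

This is the same classification `searchSPSandPPS` performs with `(csd[q] & 0x1F) == 7` when the encoder does not expose `csd-0`/`csd-1` through its `MediaFormat`.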

@ -0,0 +1,174 @@
/*
* Copyright (C) 2011-2014 GUIGUI Simon, fyhertz@gmail.com
*
* This file is part of Spydroid (http://code.google.com/p/spydroid-ipcamera/)
*
* Spydroid is free software; you can redistribute it and/or modify
* it under the terms of the GNU General Public License as published by
* the Free Software Foundation; either version 3 of the License, or
* (at your option) any later version.
*
* This source code is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this source code; if not, write to the Free Software
* Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
*/
package org.easydarwin.hw;
import java.nio.ByteBuffer;
import android.media.MediaCodecInfo;
import android.util.Log;
/**
* Converts from NV21 to YUV420 semi planar or planar.
*/
public class NV21Convertor {
private int mSliceHeight, mHeight;
private int mStride, mWidth;
private int mSize;
private boolean mPlanar, mPanesReversed = false;
private int mYPadding;
private byte[] mBuffer;
ByteBuffer mCopy;
public void setSize(int width, int height) {
mHeight = height;
mWidth = width;
mSliceHeight = height;
mStride = width;
mSize = mWidth * mHeight;
}
public void setStride(int width) {
mStride = width;
}
public void setSliceHeigth(int height) {
mSliceHeight = height;
}
public void setPlanar(boolean planar) {
mPlanar = planar;
}
public void setYPadding(int padding) {
mYPadding = padding;
}
public int getBufferSize() {
return 3 * mSize / 2;
}
public void setEncoderColorFormat(int colorFormat) {
switch (colorFormat) {
case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar:
case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar:
case MediaCodecInfo.CodecCapabilities.COLOR_TI_FormatYUV420PackedSemiPlanar:
setPlanar(false);
break;
case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:
case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar:
setPlanar(true);
break;
}
}
public void setColorPanesReversed(boolean b) {
mPanesReversed = b;
}
public int getStride() {
return mStride;
}
public int getSliceHeigth() {
return mSliceHeight;
}
public int getYPadding() {
return mYPadding;
}
public boolean getPlanar() {
return mPlanar;
}
public boolean getUVPanesReversed() {
return mPanesReversed;
}
public void convert(byte[] data, ByteBuffer buffer) {
byte[] result = convert(data);
int min = Math.min(buffer.capacity(), result.length);
buffer.put(result, 0, min);
}
public byte[] convert(byte[] data) {
// A buffer large enough for every case
if (mBuffer == null || mBuffer.length != 3 * mSliceHeight * mStride / 2 + mYPadding) {
mBuffer = new byte[3 * mSliceHeight * mStride / 2 + mYPadding];
}
if (!mPlanar) {
if (mSliceHeight == mHeight && mStride == mWidth) {
// Swaps U and V
if (!mPanesReversed) {
for (int i = mSize; i < mSize + mSize / 2; i += 2) {
mBuffer[0] = data[i + 1];
data[i + 1] = data[i];
data[i] = mBuffer[0];
}
}
if (mYPadding > 0) {
System.arraycopy(data, 0, mBuffer, 0, mSize);
System.arraycopy(data, mSize, mBuffer, mSize + mYPadding, mSize / 2);
return mBuffer;
}
return data;
}
}
else {
if (mSliceHeight == mHeight && mStride == mWidth) {
// De-interleave U and V
if (!mPanesReversed) {
for (int i = 0; i < mSize / 4; i += 1) {
mBuffer[i] = data[mSize + 2 * i + 1];
mBuffer[mSize / 4 + i] = data[mSize + 2 * i];
}
}
else {
for (int i = 0; i < mSize / 4; i += 1) {
mBuffer[i] = data[mSize + 2 * i];
mBuffer[mSize / 4 + i] = data[mSize + 2 * i + 1];
}
}
if (mYPadding == 0) {
System.arraycopy(mBuffer, 0, data, mSize, mSize / 2);
}
else {
System.arraycopy(data, 0, mBuffer, 0, mSize);
System.arraycopy(mBuffer, 0, mBuffer, mSize + mYPadding, mSize / 2);
return mBuffer;
}
return data;
}
}
return data;
}
}
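The planar branch of `convert()` above de-interleaves NV21's VU plane into separate U and V planes (I420 order). The core of that transform, restated as a small self-contained sketch (`Nv21Demo` is a hypothetical name):

```java
import java.util.Arrays;

public class Nv21Demo {
    // NV21 is the full Y plane followed by interleaved V/U samples;
    // I420 wants Y, then all U, then all V.
    static byte[] nv21ToI420(byte[] nv21, int width, int height) {
        int size = width * height;
        byte[] out = new byte[size * 3 / 2];
        System.arraycopy(nv21, 0, out, 0, size);           // Y plane is unchanged
        for (int i = 0; i < size / 4; i++) {
            out[size + i] = nv21[size + 2 * i + 1];        // U (V comes first in NV21)
            out[size + size / 4 + i] = nv21[size + 2 * i]; // V
        }
        return out;
    }

    public static void main(String[] args) {
        // 2x2 frame: 4 Y samples, then one V/U pair
        byte[] nv21 = {10, 11, 12, 13, /*V*/ 50, /*U*/ 60};
        System.out.println(Arrays.toString(nv21ToI420(nv21, 2, 2))); // [10, 11, 12, 13, 60, 50]
    }
}
```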

@ -0,0 +1,148 @@
package org.easydarwin.muxer;
import android.annotation.TargetApi;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import android.os.Build;
import android.util.Log;
import org.easydarwin.bus.StartRecord;
import org.easydarwin.bus.StopRecord;
import org.easydarwin.easypusher.BuildConfig;
import org.easydarwin.easypusher.EasyApplication;
import org.easydarwin.push.EasyPusher;
import java.io.File;
import java.io.IOException;
import java.nio.ByteBuffer;
/**
* Created by John on 2017/1/10.
*/
public class EasyMuxer {
private static final boolean VERBOSE = BuildConfig.DEBUG;
private static final String TAG = EasyMuxer.class.getSimpleName();
private final String mFilePath;
private MediaMuxer mMuxer;
private final long durationMillis;
private int index = 0;
private int mVideoTrackIndex = -1;
private int mAudioTrackIndex = -1;
private long mBeginMillis;
private MediaFormat mVideoFormat;
private MediaFormat mAudioFormat;
@TargetApi(Build.VERSION_CODES.JELLY_BEAN_MR2)
public EasyMuxer(String path, long durationMillis) throws IOException {
if(path.endsWith(".mp4")) {
path = path.substring(0, path.lastIndexOf(".mp4"));
}
mFilePath = path;
this.durationMillis = durationMillis;
mMuxer = new MediaMuxer(path + "-" + index++ + ".mp4", MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
}
public synchronized void addTrack(MediaFormat format, boolean isVideo) {
// now that we have the Magic Goodies, start the muxer
if (mAudioTrackIndex != -1 && mVideoTrackIndex != -1)
throw new RuntimeException("all tracks have already been added");
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN_MR2) {
int track = mMuxer.addTrack(format);
if (VERBOSE)
Log.i(TAG, String.format("addTrack %s result %d", isVideo ? "video" : "audio", track));
if (isVideo) {
mVideoFormat = format;
mVideoTrackIndex = track;
if (mAudioTrackIndex != -1) {
if (VERBOSE)
Log.i(TAG, "both audio and video added, and muxer is started");
mMuxer.start();
mBeginMillis = System.currentTimeMillis();
}
} else {
mAudioFormat = format;
mAudioTrackIndex = track;
if (mVideoTrackIndex != -1) {
mMuxer.start();
mBeginMillis = System.currentTimeMillis();
}
}
}
}
public synchronized void pumpStream(ByteBuffer outputBuffer, MediaCodec.BufferInfo bufferInfo, boolean isVideo) {
if (mAudioTrackIndex == -1 || mVideoTrackIndex == -1) {
Log.i(TAG, String.format("pumpStream [%s] but muxer has not started; ignoring..", isVideo ? "video" : "audio"));
return;
}
if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
// The codec config data was pulled out and fed to the muxer when we got
// the INFO_OUTPUT_FORMAT_CHANGED status. Ignore it.
} else if (bufferInfo.size != 0) {
if (isVideo && mVideoTrackIndex == -1) {
throw new RuntimeException("muxer hasn't started");
}
// adjust the ByteBuffer values to match BufferInfo (not needed?)
outputBuffer.position(bufferInfo.offset);
outputBuffer.limit(bufferInfo.offset + bufferInfo.size);
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN_MR2) {
mMuxer.writeSampleData(isVideo ? mVideoTrackIndex : mAudioTrackIndex, outputBuffer, bufferInfo);
}
if (VERBOSE)
Log.d(TAG, String.format("sent %s [" + bufferInfo.size + "] with timestamp:[%d] to muxer", isVideo ? "video" : "audio", bufferInfo.presentationTimeUs / 1000));
}
if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
if (VERBOSE)
Log.i(TAG, "BUFFER_FLAG_END_OF_STREAM received");
}
if (System.currentTimeMillis() - mBeginMillis >= durationMillis && isVideo && ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_KEY_FRAME) != 0)) {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN_MR2) {
if (VERBOSE)
Log.i(TAG, "record file reached expiration; creating new file: " + index);
mMuxer.stop();
mMuxer.release();
mMuxer = null;
mVideoTrackIndex = mAudioTrackIndex = -1;
try {
mMuxer = new MediaMuxer(mFilePath + "-" + ++index + ".mp4", MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
addTrack(mVideoFormat, true);
addTrack(mAudioFormat, false);
pumpStream(outputBuffer, bufferInfo, isVideo);
} catch (IOException e) {
e.printStackTrace();
}
}
}
}
public synchronized void release() {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN_MR2) {
if (mMuxer != null) {
if (mAudioTrackIndex != -1 && mVideoTrackIndex != -1) {
if (VERBOSE)
Log.i(TAG, "muxer is started; now it will be stopped.");
try {
mMuxer.stop();
mMuxer.release();
} catch (IllegalStateException ex) {
ex.printStackTrace();
}
if (System.currentTimeMillis() - mBeginMillis <= 1500){
new File(mFilePath + "-" + index + ".mp4").delete();
}
mAudioTrackIndex = mVideoTrackIndex = -1;
// EasyApplication.BUS.post(new StopRecord());
}
}
}
}
}
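`EasyMuxer` rotates to a new MP4 segment when `durationMillis` is exceeded and a video key frame arrives; each segment's name is the base path (with any `.mp4` suffix stripped) plus a running index. That naming rule in isolation (`SegmentName` is a hypothetical helper):

```java
public class SegmentName {
    static String segmentPath(String basePath, int index) {
        // Strip a trailing ".mp4" so the index can be appended cleanly
        if (basePath.endsWith(".mp4")) {
            basePath = basePath.substring(0, basePath.lastIndexOf(".mp4"));
        }
        return basePath + "-" + index + ".mp4";
    }

    public static void main(String[] args) {
        System.out.println(segmentPath("/sdcard/record.mp4", 1)); // /sdcard/record-1.mp4
        System.out.println(segmentPath("/sdcard/record", 2));     // /sdcard/record-2.mp4
    }
}
```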

@ -0,0 +1,151 @@
/*
Copyright (c) 2013-2016 EasyDarwin.ORG. All rights reserved.
Github: https://github.com/EasyDarwin
WEChat: EasyDarwin
Website: http://www.easydarwin.org
*/
package org.easydarwin.push;
import android.content.Context;
import android.util.Log;
import org.easydarwin.bus.StreamStat;
import org.easydarwin.easypusher.BuildConfig;
public class EasyPusher implements Pusher {
private static String TAG = "EasyPusher";
static {
System.loadLibrary("easypusher");
}
private long pPreviewTS;
private long mTotal;
private int mTotalFrms;
public interface OnInitPusherCallback {
public void onCallback(int code);
static class CODE {
public static final int EASY_ACTIVATE_INVALID_KEY = -1; // invalid key
public static final int EASY_ACTIVATE_TIME_ERR = -2; // time error
public static final int EASY_ACTIVATE_PROCESS_NAME_LEN_ERR = -3; // process name length mismatch
public static final int EASY_ACTIVATE_PROCESS_NAME_ERR = -4; // process name mismatch
public static final int EASY_ACTIVATE_VALIDITY_PERIOD_ERR = -5; // validity period mismatch
public static final int EASY_ACTIVATE_PLATFORM_ERR = -6; // platform mismatch
public static final int EASY_ACTIVATE_COMPANY_ID_LEN_ERR = -7; // licensed company mismatch
public static final int EASY_ACTIVATE_SUCCESS = 0; // activation succeeded
public static final int EASY_PUSH_STATE_CONNECTING = 1; // connecting
public static final int EASY_PUSH_STATE_CONNECTED = 2; // connected
public static final int EASY_PUSH_STATE_CONNECT_FAILED = 3; // connect failed
public static final int EASY_PUSH_STATE_CONNECT_ABORT = 4; // connection aborted
public static final int EASY_PUSH_STATE_PUSHING = 5; // pushing
public static final int EASY_PUSH_STATE_DISCONNECTED = 6; // disconnected
public static final int EASY_PUSH_STATE_ERROR = 7;
}
}
private long mPusherObj = 0;
// public native void setOnInitPusherCallback(OnInitPusherCallback callback);
/**
 * Initializes the pusher and registers the state callback.
 *
 * @param key the license key
 * @return a native handle identifying the pusher instance
 */
public native long init(String key, Context context, OnInitPusherCallback callback);
public native void setMediaInfo(long pusherObj, int videoCodec, int videoFPS, int audioCodec, int audioChannel, int audioSamplerate, int audioBitPerSample);
/**
 * Starts pushing to the RTSP server.
 *
 * @param pusherObj the handle returned by init
 * @param serverIP the server IP address
 * @param serverPort the server port
 * @param streamName the stream name
 * @param transType transport type: 1 for TCP, 2 for UDP
 */
public native void start(long pusherObj, String serverIP, String serverPort, String streamName, int transType);
/**
 * Pushes one encoded (e.g. H264) frame.
 *
 * @param data the encoded frame data
 * @param timestamp the frame timestamp
 */
private native void push(long pusherObj, byte[] data, int offset, int length, long timestamp, int type);
/**
 * Stops pushing.
 */
private native void stopPush(long pusherObj);
public synchronized void stop() {
Log.i(TAG, "PusherStop");
if (mPusherObj == 0) return;
stopPush(mPusherObj);
mPusherObj = 0;
}
@Override
public synchronized void initPush(Context context, final InitCallback callback) {
Log.i(TAG, "PusherStart");
mPusherObj = init("", context, new OnInitPusherCallback() {
int code = Integer.MAX_VALUE;
@Override
public void onCallback(int code) {
if (code != this.code) {
this.code = code;
if (callback != null) callback.onCallback(code);
}
}
});
}
@Override
public void initPush(String url, Context context, InitCallback callback) {
throw new RuntimeException("not support");
}
@Override
public void initPush(String url, Context context, InitCallback callback, int fps) {
throw new RuntimeException("not support");
}
public synchronized void setMediaInfo(int videoCodec, int videoFPS, int audioCodec, int audioChannel, int audioSamplerate, int audioBitPerSample){
if (mPusherObj == 0) return;
setMediaInfo(mPusherObj, videoCodec, videoFPS, audioCodec, audioChannel, audioSamplerate, audioBitPerSample);
}
public synchronized void start(String serverIP, String serverPort, String streamName, int transType){
if (mPusherObj == 0) return;
start(mPusherObj, serverIP, serverPort, streamName, transType);
}
public synchronized void push(byte[] data, int offset, int length, long timestamp, int type) {
if (mPusherObj == 0) return;
mTotal += length;
if (type == 1){
mTotalFrms++;
}
long interval = System.currentTimeMillis() - pPreviewTS;
if (interval >= 3000){
long bps = mTotal * 1000 / (interval);
long fps = mTotalFrms * 1000 / (interval);
Log.i(TAG, String.format("bps:%d, fps:%d", bps, fps));
pPreviewTS = System.currentTimeMillis();
mTotal = 0;
mTotalFrms = 0;
}
push(mPusherObj, data, offset, length, timestamp, type);
}
public synchronized void push(byte[] data, long timestamp, int type) {
push( data, 0, data.length, timestamp, type);
}
}
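`push()` above logs throughput every 3 seconds by converting the accumulated byte and frame counters into per-second rates. The arithmetic, isolated (`PushStats` is a hypothetical name):

```java
public class PushStats {
    // Returns {bytesPerSecond, framesPerSecond} for counters accumulated
    // over intervalMs milliseconds.
    static long[] rates(long totalBytes, long totalFrames, long intervalMs) {
        return new long[]{totalBytes * 1000 / intervalMs, totalFrames * 1000 / intervalMs};
    }

    public static void main(String[] args) {
        long[] r = rates(3_000_000L, 75, 3000); // 3 MB and 75 frames over 3 s
        System.out.println(r[0] + " Bps, " + r[1] + " fps"); // 1000000 Bps, 25 fps
    }
}
```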

@ -0,0 +1,263 @@
package org.easydarwin.push;
import android.content.Context;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.os.Build;
import android.os.Bundle;
import android.preference.PreferenceManager;
import android.util.Log;
import org.easydarwin.easypusher.BuildConfig;
import org.easydarwin.muxer.EasyMuxer;
import org.easydarwin.sw.JNIUtil;
import java.io.IOException;
import java.nio.ByteBuffer;
import static android.media.MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar;
import static android.media.MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar;
import static android.media.MediaCodecInfo.CodecCapabilities.COLOR_TI_FormatYUV420PackedSemiPlanar;
/**
* Created by apple on 2017/5/13.
*/
public class HWConsumer extends Thread implements VideoConsumer {
private static final String TAG = "Pusher";
private final MediaStream.CodecInfo info;
public EasyMuxer mMuxer;
private final Context mContext;
private final Pusher mPusher;
private int mHeight;
private int mWidth;
private MediaCodec mMediaCodec;
private ByteBuffer[] inputBuffers;
private ByteBuffer[] outputBuffers;
private volatile boolean mVideoStarted;
private MediaFormat newFormat;
public HWConsumer(Context context, Pusher pusher, MediaStream.CodecInfo info) {
mContext = context;
mPusher = pusher;
this.info = info;
}
@Override
public void onVideoStart(int width, int height) throws IOException {
newFormat = null;
this.mWidth = width;
this.mHeight = height;
startMediaCodec();
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP_MR1) {
inputBuffers = outputBuffers = null;
} else {
inputBuffers = mMediaCodec.getInputBuffers();
outputBuffers = mMediaCodec.getOutputBuffers();
}
start();
mVideoStarted = true;
}
final int millisPerframe = 1000 / 20;
long lastPush = 0;
@Override
public int onVideo(byte[] data, int format) {
if (!mVideoStarted) return 0;
try {
if (lastPush == 0) {
lastPush = System.currentTimeMillis();
}
long time = System.currentTimeMillis() - lastPush;
if (time >= 0) {
time = millisPerframe - time;
if (time > 0) Thread.sleep(time / 2);
}
if (info.mColorFormat == COLOR_FormatYUV420SemiPlanar) {
JNIUtil.yuvConvert(data, mWidth, mHeight, 6);
} else if (info.mColorFormat == COLOR_TI_FormatYUV420PackedSemiPlanar) {
JNIUtil.yuvConvert(data, mWidth, mHeight, 6);
} else if (info.mColorFormat == COLOR_FormatYUV420Planar) {
JNIUtil.yuvConvert(data, mWidth, mHeight, 5);
} else {
JNIUtil.yuvConvert(data, mWidth, mHeight, 5);
}
int bufferIndex = mMediaCodec.dequeueInputBuffer(0);
if (bufferIndex >= 0) {
ByteBuffer buffer = null;
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
buffer = mMediaCodec.getInputBuffer(bufferIndex);
} else {
buffer = inputBuffers[bufferIndex];
}
buffer.clear();
buffer.put(data);
buffer.clear();
mMediaCodec.queueInputBuffer(bufferIndex, 0, data.length, System.nanoTime() / 1000, 0);
}
if (time > 0) Thread.sleep(time / 2);
lastPush = System.currentTimeMillis();
} catch (InterruptedException ex) {
ex.printStackTrace();
}
return 0;
}
@Override
public void run() {
MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
int outputBufferIndex = 0;
byte[] mPpsSps = new byte[0];
byte[] h264 = new byte[mWidth * mHeight];
do {
outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 10000);
if (outputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
// no output available yet
} else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
// not expected for an encoder
outputBuffers = mMediaCodec.getOutputBuffers();
} else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
synchronized (HWConsumer.this) {
newFormat = mMediaCodec.getOutputFormat();
EasyMuxer muxer = mMuxer;
if (muxer != null) {
// should happen before receiving buffers, and should only happen once
muxer.addTrack(newFormat, true);
}
}
} else if (outputBufferIndex < 0) {
// let's ignore it
} else {
ByteBuffer outputBuffer;
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
outputBuffer = mMediaCodec.getOutputBuffer(outputBufferIndex);
} else {
outputBuffer = outputBuffers[outputBufferIndex];
}
outputBuffer.position(bufferInfo.offset);
outputBuffer.limit(bufferInfo.offset + bufferInfo.size);
EasyMuxer muxer = mMuxer;
if (muxer != null) {
muxer.pumpStream(outputBuffer, bufferInfo, true);
}
boolean sync = false;
if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {// sps
sync = (bufferInfo.flags & MediaCodec.BUFFER_FLAG_SYNC_FRAME) != 0;
if (!sync) {
byte[] temp = new byte[bufferInfo.size];
outputBuffer.get(temp);
mPpsSps = temp;
mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
continue;
} else {
mPpsSps = new byte[0];
}
}
sync |= (bufferInfo.flags & MediaCodec.BUFFER_FLAG_SYNC_FRAME) != 0;
int len = mPpsSps.length + bufferInfo.size;
if (len > h264.length) {
h264 = new byte[len];
}
if (sync) {
System.arraycopy(mPpsSps, 0, h264, 0, mPpsSps.length);
outputBuffer.get(h264, mPpsSps.length, bufferInfo.size);
mPusher.push(h264, 0, mPpsSps.length + bufferInfo.size, bufferInfo.presentationTimeUs / 1000, 1);
if (BuildConfig.DEBUG)
Log.i(TAG, String.format("push i video stamp:%d", bufferInfo.presentationTimeUs / 1000));
} else {
outputBuffer.get(h264, 0, bufferInfo.size);
mPusher.push(h264, 0, bufferInfo.size, bufferInfo.presentationTimeUs / 1000, 1);
if (BuildConfig.DEBUG)
Log.i(TAG, String.format("push video stamp:%d", bufferInfo.presentationTimeUs / 1000));
}
mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
}
}
while (mVideoStarted);
}
@Override
public void onVideoStop() {
do {
newFormat = null;
mVideoStarted = false;
try {
join();
} catch (InterruptedException e) {
e.printStackTrace();
}
} while (isAlive());
if (mMediaCodec != null) {
stopMediaCodec();
mMediaCodec = null;
}
}
@Override
public synchronized void setMuxer(EasyMuxer muxer) {
if (muxer != null) {
if (newFormat != null)
muxer.addTrack(newFormat, true);
}
mMuxer = muxer;
}
/**
 * Instantiates and starts the hardware encoder.
 */
private void startMediaCodec() throws IOException {
/*
 * Recommended encoding parameters:
 *                     SD (low)    SD (high)   HD 720p      HD 1080p
 * Video resolution    320x240     720x480     1280x720     1920x1080
 * Video frame rate    20 fps      30 fps      30 fps       30 fps
 * Video bitrate       384 Kbps    2 Mbps      4 Mbps       10 Mbps
 */
int framerate = 20;
// if (width == 640 || height == 640) {
// bitrate = 2000000;
// } else if (width == 1280 || height == 1280) {
// bitrate = 4000000;
// } else {
// bitrate = 2 * width * height;
// }
int bitrate = (int) (mWidth * mHeight * 20 * 2 * 0.05f);
if (mWidth >= 1920 || mHeight >= 1920) bitrate *= 0.3;
else if (mWidth >= 1280 || mHeight >= 1280) bitrate *= 0.4;
else if (mWidth >= 720 || mHeight >= 720) bitrate *= 0.6;
mMediaCodec = MediaCodec.createByCodecName(info.mName);
MediaFormat mediaFormat = MediaFormat.createVideoFormat(info.mime, mWidth, mHeight);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, bitrate);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, framerate);
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, info.mColorFormat);
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
mMediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
mMediaCodec.start();
Bundle params = new Bundle();
params.putInt(MediaCodec.PARAMETER_KEY_REQUEST_SYNC_FRAME, 0);
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT) {
mMediaCodec.setParameters(params);
}
}
/**
 * Stops and releases the encoder.
 */
private void stopMediaCodec() {
mMediaCodec.stop();
mMediaCodec.release();
}
}
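`startMediaCodec()` above derives the encoder bitrate from the resolution: `width * height * fps * 2 * 0.05` bits as a base, then damped for larger frames. The heuristic on its own (hypothetical `BitrateHeuristic` class; the constants are the ones used above):

```java
public class BitrateHeuristic {
    static int suggestedBitrate(int width, int height) {
        // Base: roughly 0.1 bit per pixel per frame at 20 fps
        int bitrate = (int) (width * height * 20 * 2 * 0.05f);
        // Larger frames compress better, so damp the linear scaling
        if (width >= 1920 || height >= 1920) bitrate *= 0.3;
        else if (width >= 1280 || height >= 1280) bitrate *= 0.4;
        else if (width >= 720 || height >= 720) bitrate *= 0.6;
        return bitrate;
    }

    public static void main(String[] args) {
        System.out.println(suggestedBitrate(320, 240));  // 153600
        System.out.println(suggestedBitrate(1280, 720)); // 737280
    }
}
```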

@ -0,0 +1,9 @@
package org.easydarwin.push;
/**
* Created by john on 2017/5/6.
*/
public interface InitCallback {
public void onCallback(int code);
}

File diff suppressed because it is too large

@ -0,0 +1,371 @@
package org.easydarwin.push;
import android.annotation.TargetApi;
import android.app.Application;
import android.app.Notification;
import android.app.PendingIntent;
import android.app.Service;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.media.projection.MediaProjection;
import android.media.projection.MediaProjectionManager;
import android.os.Binder;
import android.os.Build;
import android.os.Environment;
import android.os.IBinder;
import androidx.annotation.Nullable;
import androidx.annotation.RequiresApi;
import android.text.TextUtils;
import android.util.DisplayMetrics;
import android.util.Log;
import android.view.Surface;
import android.view.WindowManager;
import org.easydarwin.easypusher.BuildConfig;
import org.easydarwin.easypusher.R;
import java.io.IOException;
import java.nio.ByteBuffer;
import static android.app.PendingIntent.FLAG_CANCEL_CURRENT;
public class PushScreenService extends Service {
private static final String TAG = "RService";
public static final String ACTION_CLOSE_PUSHING_SCREEN = "ACTION_CLOSE_PUSHING_SCREEN";
private String mVideoPath;
private MediaProjectionManager mMpmngr;
private MediaProjection mMpj;
private VirtualDisplay mVirtualDisplay;
private int windowWidth;
private int windowHeight;
private int screenDensity;
private Surface mSurface;
private MediaCodec mMediaCodec;
private WindowManager wm;
MediaStream.CodecInfo info = new MediaStream.CodecInfo();
private MediaCodec.BufferInfo mBufferInfo = new MediaCodec.BufferInfo();
private Thread mPushThread;
private byte[] mPpsSps;
private BroadcastReceiver mReceiver = new BroadcastReceiver() {
@Override
public void onReceive(Context context, Intent intent) {
Application app = (Application) context.getApplicationContext();
MediaStream.stopPushScreen(app);
}
};
private final Pusher mEasyPusher = new EasyPusher();
private String ip;
private String port;
private String id;
private MediaStream.PushingScreenLiveData liveData;
public class MyBinder extends Binder
{
public PushScreenService getService(){
return PushScreenService.this;
}
}
MyBinder binder = new MyBinder();
@Nullable
@Override
public IBinder onBind(Intent intent) {
return binder;
}
@RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
@Override
public void onCreate() {
super.onCreate();
mMpmngr = (MediaProjectionManager) getApplicationContext().getSystemService(MEDIA_PROJECTION_SERVICE);
createEnvironment();
registerReceiver(mReceiver,new IntentFilter(ACTION_CLOSE_PUSHING_SCREEN));
}
@TargetApi(Build.VERSION_CODES.JELLY_BEAN_MR2)
private void configureMedia() throws IOException {
MediaStream.initEncoder(this, info);
if (TextUtils.isEmpty(info.mName) && info.mColorFormat == 0){
throw new IOException("media codec init error");
}
MediaFormat mediaFormat = MediaFormat.createVideoFormat(info.mime, windowWidth, windowHeight);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 1200000);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 25);
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
mediaFormat.setInteger(MediaFormat.KEY_CHANNEL_COUNT, 1);
mediaFormat.setInteger(MediaFormat.KEY_CAPTURE_RATE, 25);
mediaFormat.setInteger(MediaFormat.KEY_REPEAT_PREVIOUS_FRAME_AFTER, 1000000);
mMediaCodec = MediaCodec.createByCodecName(info.mName);
mMediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
mSurface = mMediaCodec.createInputSurface();
mMediaCodec.start();
}
private void createEnvironment() {
mVideoPath = Environment.getExternalStorageDirectory().getPath() + "/";
wm = (WindowManager) getSystemService(Context.WINDOW_SERVICE);
windowWidth = wm.getDefaultDisplay().getWidth();
windowHeight = wm.getDefaultDisplay().getHeight();
DisplayMetrics displayMetrics = new DisplayMetrics();
wm.getDefaultDisplay().getMetrics(displayMetrics);
screenDensity = displayMetrics.densityDpi;
while (windowWidth > 480){
windowWidth /= 2;
windowHeight /=2;
}
windowWidth /= 16;
windowWidth *= 16;
windowHeight /= 16;
windowHeight *= 16;
}
private void startPush() {
// liveData.postValue(new MediaStream.PushingState(0, "not started", true));
mPushThread = new Thread(){
@TargetApi(Build.VERSION_CODES.LOLLIPOP)
@Override
public void run() {
startForeground(111, new Notification.Builder(PushScreenService.this).setContentTitle(getString(R.string.screen_pushing))
.setSmallIcon(R.drawable.ic_pusher_screen_pushing)
.addAction(new Notification.Action(R.drawable.ic_close_pushing_screen, "Close",
PendingIntent.getBroadcast(getApplicationContext(), 10000, new Intent(ACTION_CLOSE_PUSHING_SCREEN), FLAG_CANCEL_CURRENT))).build());
final String url = String.format("rtsp://%s:%s/%s.sdp", ip, port, id);
InitCallback _callback = new InitCallback() {
@Override
public void onCallback(int code) {
String msg = "";
switch (code) {
case EasyPusher.OnInitPusherCallback.CODE.EASY_ACTIVATE_INVALID_KEY:
msg = ("invalid key");
break;
case EasyPusher.OnInitPusherCallback.CODE.EASY_ACTIVATE_SUCCESS:
msg = ("not started");
break;
case EasyPusher.OnInitPusherCallback.CODE.EASY_PUSH_STATE_CONNECTING:
msg = ("connecting");
break;
case EasyPusher.OnInitPusherCallback.CODE.EASY_PUSH_STATE_CONNECTED:
msg = ("connected");
break;
case EasyPusher.OnInitPusherCallback.CODE.EASY_PUSH_STATE_CONNECT_FAILED:
msg = ("connect failed");
break;
case EasyPusher.OnInitPusherCallback.CODE.EASY_PUSH_STATE_CONNECT_ABORT:
msg = ("connection aborted");
break;
case EasyPusher.OnInitPusherCallback.CODE.EASY_PUSH_STATE_PUSHING:
msg = ("pushing");
break;
case EasyPusher.OnInitPusherCallback.CODE.EASY_PUSH_STATE_DISCONNECTED:
msg = ("disconnected");
break;
case EasyPusher.OnInitPusherCallback.CODE.EASY_ACTIVATE_PLATFORM_ERR:
msg = ("platform mismatch");
break;
case EasyPusher.OnInitPusherCallback.CODE.EASY_ACTIVATE_COMPANY_ID_LEN_ERR:
msg = ("licensed company mismatch");
break;
case EasyPusher.OnInitPusherCallback.CODE.EASY_ACTIVATE_PROCESS_NAME_LEN_ERR:
msg = ("process name length mismatch");
break;
}
liveData.postValue(new MediaStream.PushingState(url, code, msg, true));
}
};
// startStream(ip, port, id, _callback);
mEasyPusher.initPush( getApplicationContext(), _callback);
MediaStream.PushingState.sCodec = (info.hevcEncode ? "hevc":"avc");
mEasyPusher.setMediaInfo(info.hevcEncode ? Pusher.Codec.EASY_SDK_VIDEO_CODEC_H265:Pusher.Codec.EASY_SDK_VIDEO_CODEC_H264, 25, Pusher.Codec.EASY_SDK_AUDIO_CODEC_AAC, 1, 8000, 16);
mEasyPusher.start(ip, port, String.format("%s.sdp", id), Pusher.TransType.EASY_RTP_OVER_TCP);
try {
byte[] h264 = new byte[102400];
while (mPushThread != null) {
int index = mMediaCodec.dequeueOutputBuffer(mBufferInfo, 10000);
Log.d(TAG, "dequeue output buffer index=" + index);
if (index == MediaCodec.INFO_TRY_AGAIN_LATER) { // dequeue timed out
try {
// wait 10ms before retrying
Thread.sleep(10);
} catch (InterruptedException e) {
Thread.currentThread().interrupt(); // restore the interrupt flag
}
} else if (index >= 0) { // got a valid output buffer
ByteBuffer outputBuffer = mMediaCodec.getOutputBuffer(index);
outputBuffer.position(mBufferInfo.offset);
outputBuffer.limit(mBufferInfo.offset + mBufferInfo.size);
boolean sync = false;
if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) { // codec config (SPS/PPS)
sync = (mBufferInfo.flags & MediaCodec.BUFFER_FLAG_SYNC_FRAME) != 0;
if (!sync) {
byte[] temp = new byte[mBufferInfo.size];
outputBuffer.get(temp);
mPpsSps = temp;
mMediaCodec.releaseOutputBuffer(index, false);
continue;
} else {
mPpsSps = new byte[0];
}
}
sync |= (mBufferInfo.flags & MediaCodec.BUFFER_FLAG_SYNC_FRAME) != 0;
int len = mPpsSps.length + mBufferInfo.size;
if (len > h264.length) {
h264 = new byte[len];
}
if (sync) {
System.arraycopy(mPpsSps, 0, h264, 0, mPpsSps.length);
outputBuffer.get(h264, mPpsSps.length, mBufferInfo.size);
mEasyPusher.push(h264, 0, mPpsSps.length + mBufferInfo.size, mBufferInfo.presentationTimeUs / 1000, 1);
if (BuildConfig.DEBUG)
Log.i(TAG, String.format("push i video stamp:%d", mBufferInfo.presentationTimeUs / 1000));
} else {
outputBuffer.get(h264, 0, mBufferInfo.size);
mEasyPusher.push(h264, 0, mBufferInfo.size, mBufferInfo.presentationTimeUs / 1000, 1);
if (BuildConfig.DEBUG)
Log.i(TAG, String.format("push video stamp:%d", mBufferInfo.presentationTimeUs / 1000));
}
mMediaCodec.releaseOutputBuffer(index, false);
}
}
stopForeground(true);
}finally {
mEasyPusher.stop();
liveData.postValue(new MediaStream.PushingState("", 0, "未开始", true));
}
}
};
mPushThread.start();
}
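The push loop above caches the codec-config NAL units (mPpsSps) when BUFFER_FLAG_CODEC_CONFIG arrives, and prepends them to every sync frame before calling push(), so that an RTSP player joining mid-stream can decode from the next keyframe. The concatenation step, as a stand-alone sketch (names and sample bytes are mine):

```java
import java.util.Arrays;

public class KeyFramePacker {
    /** Prepend the cached SPS/PPS bytes to a sync frame, as the push loop does. */
    static byte[] packKeyFrame(byte[] ppsSps, byte[] frame) {
        byte[] out = new byte[ppsSps.length + frame.length];
        System.arraycopy(ppsSps, 0, out, 0, ppsSps.length);
        System.arraycopy(frame, 0, out, ppsSps.length, frame.length);
        return out;
    }

    public static void main(String[] args) {
        byte[] config = {0, 0, 0, 1, 0x67};  // e.g. an Annex-B SPS NAL start
        byte[] idr = {0, 0, 0, 1, 0x65, 42}; // e.g. an IDR NAL
        System.out.println(Arrays.toString(packKeyFrame(config, idr)));
    }
}
```

Non-sync frames are pushed as-is; only keyframes pay the extra copy.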
@TargetApi(Build.VERSION_CODES.LOLLIPOP)
private void stopPush(){
Thread t = mPushThread;
if (t != null){
mPushThread = null;
try {
t.join();
} catch (InterruptedException e) {
e.printStackTrace();
}
}
}
@TargetApi(Build.VERSION_CODES.LOLLIPOP)
void startVirtualDisplay(int resultCode, Intent resultData, String ip, String port, String id, final MediaStream.PushingScreenLiveData liveData) {
try {
configureMedia();
} catch (IOException e) {
e.printStackTrace();
liveData.postValue(new MediaStream.PushingState("",-1, "编码器初始化错误", true));
return;
}
if (mMpj == null) {
mMpj = mMpmngr.getMediaProjection(resultCode, resultData);
}
if (mMpj == null) {
liveData.postValue(new MediaStream.PushingState("",-1, "未知错误", true));
return;
}
mVirtualDisplay = mMpj.createVirtualDisplay("record_screen", windowWidth, windowHeight, screenDensity,
DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR|DisplayManager.VIRTUAL_DISPLAY_FLAG_PUBLIC|DisplayManager.VIRTUAL_DISPLAY_FLAG_PRESENTATION, mSurface, null, null);
this.ip = ip;
this.port = port;
this.id = id;
this.liveData = liveData;
startPush();
}
@TargetApi(Build.VERSION_CODES.LOLLIPOP)
private void encodeToVideoTrack(int index) {
ByteBuffer encodedData = mMediaCodec.getOutputBuffer(index);
if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) { // codec-specific config data, not media data
// The codec config data was pulled out and fed to the muxer when we got
// the INFO_OUTPUT_FORMAT_CHANGED status.
// Ignore it.
Log.d(TAG, "ignoring BUFFER_FLAG_CODEC_CONFIG");
mBufferInfo.size = 0;
}
if (mBufferInfo.size == 0) {
Log.d(TAG, "info.size == 0, drop it.");
encodedData = null;
} else {
Log.d(TAG, "got buffer, info: size=" + mBufferInfo.size
+ ", presentationTimeUs=" + mBufferInfo.presentationTimeUs
+ ", offset=" + mBufferInfo.offset);
}
if (encodedData != null) {
encodedData.position(mBufferInfo.offset);
encodedData.limit(mBufferInfo.offset + mBufferInfo.size);
// mMuxer.writeSampleData(mVideoTrackIndex, encodedData, mBufferInfo); // write the sample to the muxer
Log.i(TAG, "sent " + mBufferInfo.size + " bytes to muxer...");
}
}
@TargetApi(Build.VERSION_CODES.KITKAT)
private void release() {
Log.i(TAG, " release() ");
if (mMediaCodec != null) {
mMediaCodec.stop();
mMediaCodec.release();
mMediaCodec = null;
}
if (mSurface != null){
mSurface.release();
mSurface = null;
}
if (mVirtualDisplay != null) {
mVirtualDisplay.release();
mVirtualDisplay = null;
}
}
@TargetApi(Build.VERSION_CODES.LOLLIPOP)
@Override
public void onDestroy() {
super.onDestroy();
stopPush();
release();
if (mMpj != null) {
mMpj.stop();
}
unregisterReceiver(mReceiver);
}
}

@ -0,0 +1,40 @@
package org.easydarwin.push;
import android.content.Context;
/**
* Created by john on 2017/5/6.
*/
public interface Pusher {
public static class Codec {
/* video codecs */
public static final int EASY_SDK_VIDEO_CODEC_H264 = 0x1C;
public static final int EASY_SDK_VIDEO_CODEC_H265 = 0x48323635;
/* audio codecs */
public static final int EASY_SDK_AUDIO_CODEC_AAC = 0x15002;
public static final int EASY_SDK_AUDIO_CODEC_G711U = 0x10006;
public static final int EASY_SDK_AUDIO_CODEC_G711A = 0x10007;
public static final int EASY_SDK_AUDIO_CODEC_G726 = 0x1100B;
}
public static class TransType {
public static final int EASY_RTP_OVER_TCP = 1; // push over TCP
public static final int EASY_RTP_OVER_UDP = 2; // push over UDP
}
public void stop();
public void initPush(final Context context, final InitCallback callback);
public void initPush(final String url, final Context context, final InitCallback callback, int pts);
public void initPush(final String url, final Context context, final InitCallback callback);
public void setMediaInfo(int videoCodec, int videoFPS, int audioCodec, int audioChannel, int audioSamplerate, int audioBitPerSample);
public void start(String serverIP, String serverPort, String streamName, int transType);
public void push(byte[] data, int offset, int length, long timestamp, int type);
public void push(byte[] data, long timestamp, int type);
}
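Worth noting about the constants above: EASY_SDK_VIDEO_CODEC_H265 = 0x48323635 is simply the ASCII fourcc "H265" packed big-endian into an int, while the H.264 value 0x1C is a plain enum, not a fourcc. A small sketch that unpacks it (helper name is mine):

```java
public class Fourcc {
    /** Unpack a big-endian ASCII fourcc from an int. */
    static String fourcc(int code) {
        return "" + (char) ((code >>> 24) & 0xFF) + (char) ((code >>> 16) & 0xFF)
                + (char) ((code >>> 8) & 0xFF) + (char) (code & 0xFF);
    }

    public static void main(String[] args) {
        System.out.println(fourcc(0x48323635)); // prints "H265"
    }
}
```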

@ -0,0 +1,129 @@
package org.easydarwin.push;
import android.content.Context;
import android.util.Log;
import org.easydarwin.muxer.EasyMuxer;
import org.easydarwin.sw.JNIUtil;
import org.easydarwin.sw.X264Encoder;
import java.util.concurrent.ArrayBlockingQueue;
/**
* Created by apple on 2017/5/13.
*/
public class SWConsumer extends Thread implements VideoConsumer {
private static final String TAG = "SWConsumer";
private int mHeight;
private int mWidth;
private X264Encoder x264;
private final Pusher mPusher;
private volatile boolean mVideoStarted;
public SWConsumer(Context context, Pusher pusher){
mPusher = pusher;
}
@Override
public void onVideoStart(int width, int height) {
this.mWidth = width;
this.mHeight = height;
x264 = new X264Encoder();
int bitrate = (int) (mWidth*mHeight*20*2*0.07f);
x264.create(width, height, 20, bitrate/500);
mVideoStarted = true;
start();
}
class TimedBuffer {
byte[] buffer;
long time;
public TimedBuffer(byte[] data) {
buffer = data;
time = System.currentTimeMillis();
}
}
private ArrayBlockingQueue<TimedBuffer> yuvs = new ArrayBlockingQueue<TimedBuffer>(2);
private ArrayBlockingQueue<byte[]> yuv_caches = new ArrayBlockingQueue<byte[]>(10);
@Override
public void run(){
byte[] h264 = new byte[mWidth * mHeight * 3 / 2];
byte[] keyFrm = new byte[1];
int[] outLen = new int[1];
do {
try {
int r = 0;
TimedBuffer tb = yuvs.take();
byte[] data = tb.buffer;
long begin = System.currentTimeMillis();
r = x264.encode(data, 0, h264, 0, outLen, keyFrm);
if (r > 0) {
Log.i(TAG, String.format("encode spend:%d ms. keyFrm:%d", System.currentTimeMillis() - begin, keyFrm[0]));
// only push when the encoder actually produced NAL units
mPusher.push(h264, 0, outLen[0], tb.time, 1);
}
keyFrm[0] = 0;
yuv_caches.offer(data);
} catch (InterruptedException e) {
e.printStackTrace();
}
}while (mVideoStarted);
}
final int millisPerframe = 1000/20;
long lastPush = 0;
@Override
public int onVideo(byte[] data, int format) {
try {
if (lastPush == 0) {
lastPush = System.currentTimeMillis();
}
long time = System.currentTimeMillis() - lastPush;
if (time >= 0) {
time = millisPerframe - time;
if (time > 0) Thread.sleep(time / 2);
}
byte[] buffer = yuv_caches.poll();
if (buffer == null || buffer.length != data.length) {
buffer = new byte[data.length];
}
System.arraycopy(data, 0, buffer, 0, data.length);
JNIUtil.yuvConvert(buffer, mWidth, mHeight, 4);
yuvs.offer(new TimedBuffer(buffer));
if (time > 0) Thread.sleep(time / 2);
lastPush = System.currentTimeMillis();
}catch (InterruptedException ex){
ex.printStackTrace();
}
return 0;
}
@Override
public void onVideoStop() {
do {
mVideoStarted = false;
try {
interrupt();
join();
} catch (InterruptedException e) {
e.printStackTrace();
}
}while (isAlive());
if (x264 != null) {
x264.close();
}
x264 = null;
}
@Override
public void setMuxer(EasyMuxer muxer) {
}
}
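onVideoStart above sizes the bitrate as width * height * fps * 2 * 0.07 and then hands the native encoder bitrate/500. The unit the native layer expects is not documented here, so treat the numbers below purely as an illustration of the arithmetic (the helper is mine; fps is hardcoded to 20 in the original):

```java
public class BitrateHeuristic {
    /** Mirrors SWConsumer.onVideoStart: bitrate = w * h * fps * 2 * 0.07, truncated to int. */
    static int estimate(int width, int height, int fps) {
        return (int) (width * height * fps * 2 * 0.07f);
    }

    public static void main(String[] args) {
        int bitrate = estimate(640, 480, 20);
        // 640 * 480 * 20 * 2 * 0.07 = 860160; the encoder is created with bitrate / 500
        System.out.println(bitrate + " -> passed to x264.create as " + bitrate / 500);
    }
}
```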

@ -0,0 +1,200 @@
package org.easydarwin.push;
import android.app.Service;
import androidx.lifecycle.LiveData;
import android.content.Intent;
import android.hardware.usb.UsbDevice;
import android.os.Binder;
import android.os.IBinder;
import android.util.Log;
import android.util.SparseArray;
import android.widget.Toast;
import com.serenegiant.usb.DeviceFilter;
import com.serenegiant.usb.IButtonCallback;
import com.serenegiant.usb.IStatusCallback;
import com.serenegiant.usb.USBMonitor;
import com.serenegiant.usb.UVCCamera;
import org.easydarwin.easypusher.BuildConfig;
import org.easydarwin.easypusher.R;
import java.nio.ByteBuffer;
public class UVCCameraService extends Service {
public static class UVCCameraLiveData extends LiveData<UVCCamera>{
@Override
protected void postValue(UVCCamera value) {
super.postValue(value);
}
}
public static final UVCCameraLiveData liveData = new UVCCameraLiveData();
public static class MyUVCCamera extends UVCCamera {
boolean prev = false;
@Override
public synchronized void startPreview() {
if (prev ) return;
super.startPreview();
prev = true;
}
@Override
public synchronized void stopPreview() {
if (!prev )return;
super.stopPreview();
prev = false;
}
@Override
public synchronized void destroy() {
prev = false;
super.destroy();
}
}
private static final String TAG = "OutterCamera";
private USBMonitor mUSBMonitor;
private UVCCamera mUVCCamera;
private SparseArray<UVCCamera> cameras = new SparseArray<>();
public class MyBinder extends Binder {
public UVCCameraService getService() {
return UVCCameraService.this;
}
}
MyBinder binder = new MyBinder();
@Override
public IBinder onBind(Intent intent) {
return binder;
}
public UVCCamera getCamera() {
return mUVCCamera;
}
private void releaseCamera() {
if (mUVCCamera != null) {
try {
mUVCCamera.close();
mUVCCamera.destroy();
mUVCCamera = null;
} catch (final Exception e) {
//
}
}
}
@Override
public void onCreate() {
super.onCreate();
mUSBMonitor = new USBMonitor(this, new USBMonitor.OnDeviceConnectListener() {
@Override
public void onAttach(final UsbDevice device) {
Log.v(TAG, "onAttach:" + device);
mUSBMonitor.requestPermission(device);
}
@Override
public void onConnect(final UsbDevice device, final USBMonitor.UsbControlBlock ctrlBlock, final boolean createNew) {
releaseCamera();
if (BuildConfig.DEBUG) Log.v(TAG, "onConnect:");
try {
final UVCCamera camera = new MyUVCCamera();
camera.open(ctrlBlock);
camera.setStatusCallback(new IStatusCallback() {
@Override
public void onStatus(final int statusClass, final int event, final int selector,
final int statusAttribute, final ByteBuffer data) {
Log.i(TAG, "onStatus(statusClass=" + statusClass
+ "; " +
"event=" + event + "; " +
"selector=" + selector + "; " +
"statusAttribute=" + statusAttribute + "; " +
"data=...)");
}
});
camera.setButtonCallback(new IButtonCallback() {
@Override
public void onButton(final int button, final int state) {
Log.i(TAG, "onButton(button=" + button + "; " + "state=" + state + ")");
}
});
// camera.setPreviewTexture(camera.getSurfaceTexture());
mUVCCamera = camera;
liveData.postValue(camera);
Toast.makeText(UVCCameraService.this, "UVCCamera connected!", Toast.LENGTH_SHORT).show();
if (device != null)
cameras.append(device.getDeviceId(), camera);
}catch (Exception ex){
ex.printStackTrace();
}
}
@Override
public void onDisconnect(final UsbDevice device, final USBMonitor.UsbControlBlock ctrlBlock) {
Log.v(TAG, "onDisconnect:");
// Toast.makeText(MainActivity.this, R.string.usb_camera_disconnected, Toast.LENGTH_SHORT).show();
// releaseCamera();
if (device != null) {
UVCCamera camera = cameras.get(device.getDeviceId());
if (mUVCCamera == camera) {
mUVCCamera = null;
Toast.makeText(UVCCameraService.this, "UVCCamera disconnected!", Toast.LENGTH_SHORT).show();
liveData.postValue(null);
}
cameras.remove(device.getDeviceId());
}else {
Toast.makeText(UVCCameraService.this, "UVCCamera disconnected!", Toast.LENGTH_SHORT).show();
mUVCCamera = null;
liveData.postValue(null);
}
// if (mUSBMonitor != null) {
// mUSBMonitor.destroy();
// }
//
// mUSBMonitor = new USBMonitor(OutterCameraService.this, this);
// mUSBMonitor.setDeviceFilter(DeviceFilter.getDeviceFilters(OutterCameraService.this, R.xml.device_filter));
// mUSBMonitor.register();
}
@Override
public void onCancel(UsbDevice usbDevice) {
releaseCamera();
}
@Override
public void onDettach(final UsbDevice device) {
Log.v(TAG, "onDettach:");
releaseCamera();
// AppContext.getInstance().bus.post(new UVCCameraDisconnect());
}
});
mUSBMonitor.setDeviceFilter(DeviceFilter.getDeviceFilters(this, R.xml.device_filter));
mUSBMonitor.register();
}
@Override
public void onDestroy() {
releaseCamera();
if (mUSBMonitor != null) {
mUSBMonitor.unregister();
}
super.onDestroy();
}
}

@ -0,0 +1,19 @@
package org.easydarwin.push;
import org.easydarwin.muxer.EasyMuxer;
import java.io.IOException;
/**
* Created by apple on 2017/5/13.
*/
public interface VideoConsumer {
public void onVideoStart(int width, int height) throws IOException;
public int onVideo(byte[] data, int format);
public void onVideoStop();
public void setMuxer(EasyMuxer muxer);
}

@ -0,0 +1,81 @@
package org.easydarwin.sw;
/**
* JNI helpers for in-place YUV plane conversion and rotation.
*/
public class JNIUtil {
static {
System.loadLibrary("Utils");
}
/**
* Convert a YV12 frame to YUV420P (I420) in place by swapping the U and V planes.
*
* @param buffer frame data, width * height * 3 / 2 bytes
* @param width frame width
* @param height frame height
*/
public static void yV12ToYUV420P(byte[] buffer, int width, int height) {
callMethod("YV12ToYUV420P", null, buffer, width, height);
}
/**
* Convert an NV21 frame to 420SP (NV12) in place by swapping the interleaved chroma bytes.
*
* @param buffer frame data, width * height * 3 / 2 bytes
* @param width frame width
* @param height frame height
*/
public static void nV21To420SP(byte[] buffer, int width, int height) {
callMethod("NV21To420SP", null, buffer, width, height);
}
/**
* Rotate a matrix of single-byte elements in place.
*
* @param data matrix data
* @param offset start offset into data
* @param width matrix width
* @param height matrix height
* @param degree rotation in degrees
*/
public static void rotateMatrix(byte[] data, int offset, int width, int height, int degree) {
callMethod("RotateByteMatrix", null, data, offset, width, height, degree);
}
/**
* Rotate a matrix of two-byte (short) elements in place.
*
* @param data matrix data
* @param offset start offset into data
* @param width matrix width
* @param height matrix height
* @param degree rotation in degrees
*/
public static void rotateShortMatrix(byte[] data, int offset, int width, int height, int degree) {
callMethod("RotateShortMatrix", null, data, offset, width, height, degree);
}
private static native void callMethod(String methodName, Object[] returnValue, Object... params);
/**
* Convert between YUV layouts in place; mode selects the conversion:
* 0 NULL,
* 1 yuv_to_yvu,
* 2 yuv_to_yuvuv,
* 3 yuv_to_yvuvu,
* 4 yuvuv_to_yuv,
* 5 yuvuv_to_yvu,
* 6 yuvuv_to_yvuvu,
*
* @param data frame data
* @param width frame width
* @param height frame height
* @param mode conversion mode from the list above
*/
public static native void yuvConvert(byte[] data, int width, int height, int mode);
}

@ -0,0 +1,59 @@
package org.easydarwin.sw;
import android.content.Context;
import android.text.TextUtils;
import java.io.File;
/**
* Created by John on 2017/2/23.
*/
public class TxtOverlay {
static {
System.loadLibrary("TxtOverlay");
}
private final Context context;
public TxtOverlay(Context context){
this.context = context;
}
private long ctx;
public void init(int width, int height,String fonts) {
if (TextUtils.isEmpty(fonts)){
throw new IllegalArgumentException("the font path must not be empty!");
}
if (!new File(fonts).exists()){
throw new IllegalArgumentException("the font file must exist!");
}
ctx = txtOverlayInit(width, height,fonts);
}
public void overlay(byte[] data,
String txt) {
// txt = "drawtext=fontfile="+context.getFileStreamPath("SIMYOU.ttf")+": text='EasyPusher 2017':x=(w-text_w)/2:y=H-60 :fontcolor=white :box=1:boxcolor=0x00000000@0.3";
// txt = "movie=/sdcard/qrcode.png [logo];[in][logo] "
// + "overlay=" + 0 + ":" + 0
// + " [out]";
// if (ctx == 0) throw new RuntimeException("init should be called at first!");
if (ctx == 0) return;
txtOverlay(ctx, data, txt);
}
public void release() {
if (ctx == 0) return;
txtOverlayRelease(ctx);
ctx = 0;
}
private static native long txtOverlayInit(int width, int height, String fonts);
private static native void txtOverlay(long ctx, byte[] data, String txt);
private static native void txtOverlayRelease(long ctx);
}

@ -0,0 +1,62 @@
package org.easydarwin.sw;
/**
* Created by John on 2016/11/13.
* mail:251139896@qq.com
*/
public class X264Encoder {
static {
System.loadLibrary("x264enc");
}
private long mHandle;
/**
* Create a native x264 encoder instance.
*
* @param w frame width
* @param h frame height
* @param frameRate frames per second
* @param bitrate target bitrate, in the unit expected by the native layer
*/
public void create(int w, int h, int frameRate, int bitrate) {
long[] handle = new long[1];
create(w, h, frameRate, bitrate, handle);
mHandle = handle[0];
}
/**
* Encode one YV12 frame.
*
* @param yv12 input frame in YV12 layout, w * h * 3 / 2 bytes
* @param offset start offset into yv12
* @param out buffer receiving the encoded NAL units
* @param outOffset start offset into out
* @param outLen outLen[0] receives the encoded byte count
* @param keyFrame keyFrame[0] is set non-zero when the output is a keyframe
* @return negative on error, zero if no NAL units were returned.
*/
public int encode(byte[] yv12, int offset, byte[] out, int outOffset, int[] outLen, byte[] keyFrame) {
return encode(mHandle, yv12, offset, out, outOffset, outLen, keyFrame);
}
/**
* Release the native encoder.
*/
public void close() {
close(mHandle);
}
private static native void create(int width, int height, int frameRate, int bitRate, long[] handle);
private static native int encode(long handle, byte[] buffer, int offset, byte[] out, int outOffset, int[] outLen, byte[] keyFrame);
private static native void close(long handle);
//
// x264_ecoder_handle x264_ecoder_init(int nWidth, int nHeight, int bitRate, x264_PixelFormat pixelFromat);
//
// int x264_enocode(x264_ecoder_handle handle, unsigned char*pYUVData,
// unsigned int length, unsigned char*outData, int*nLen, unsigned char*keyFrame);
//
// void x264_close(x264_ecoder_handle handle);
}

@ -0,0 +1,30 @@
package org.easydarwin.util;
import org.reactivestreams.Subscriber;
import org.reactivestreams.Subscription;
/**
* Created by apple on 2017/10/21.
*/
public abstract class AbstractSubscriber<T> implements Subscriber<T> {
@Override
public void onSubscribe(Subscription s) {
}
@Override
public void onNext(T t) {
}
@Override
public void onError(Throwable t) {
t.printStackTrace();
}
@Override
public void onComplete() {
}
}

@ -0,0 +1,214 @@
/*
Copyright (c) 2013-2016 EasyDarwin.ORG. All rights reserved.
Github: https://github.com/EasyDarwin
WEChat: EasyDarwin
Website: http://www.easydarwin.org
*/
package org.easydarwin.util;
import android.content.Context;
import android.content.SharedPreferences;
import android.text.TextUtils;
import android.util.Log;
import org.easydarwin.config.Config;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
/**
* YUV frame and preference utility helpers.
*
* @author HELONG 2016/3/8 17:42
*/
public class Util {
/**
* Rotate a YUV420SP (NV21) frame by 90 degrees clockwise.
*
* @param data source frame, imageWidth * imageHeight * 3 / 2 bytes
* @param imageWidth source width
* @param imageHeight source height
* @return the rotated frame (imageHeight x imageWidth)
*/
public static byte[] rotateNV21Degree90(byte[] data, int imageWidth, int imageHeight) {
byte[] yuv = new byte[imageWidth * imageHeight * 3 / 2];
// Rotate the Y luma
int i = 0;
for (int x = 0; x < imageWidth; x++) {
for (int y = imageHeight - 1; y >= 0; y--) {
yuv[i] = data[y * imageWidth + x];
i++;
}
}
// Rotate the U and V color components
i = imageWidth * imageHeight * 3 / 2 - 1;
for (int x = imageWidth - 1; x > 0; x = x - 2) {
for (int y = 0; y < imageHeight / 2; y++) {
yuv[i] = data[(imageWidth * imageHeight) + (y * imageWidth) + x];
i--;
yuv[i] = data[(imageWidth * imageHeight) + (y * imageWidth) + (x - 1)];
i--;
}
}
return yuv;
}
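As written, rotateNV21Degree90 emits each output Y row as an input column read bottom-up, i.e. a 90° clockwise rotation, and copies the interleaved chroma pairs back-to-front to match. A self-contained sanity check on a tiny 4x2 NV21 frame, numbering all 12 bytes 0..11 (the class below simply reproduces the algorithm so it can run outside Android):

```java
import java.util.Arrays;

public class Nv21RotateCheck {
    // Same algorithm as Util.rotateNV21Degree90, reproduced for a stand-alone check.
    static byte[] rotate90(byte[] data, int w, int h) {
        byte[] yuv = new byte[w * h * 3 / 2];
        int i = 0;
        for (int x = 0; x < w; x++)              // Y: each output row is an input
            for (int y = h - 1; y >= 0; y--)     // column read bottom-up
                yuv[i++] = data[y * w + x];
        i = w * h * 3 / 2 - 1;
        for (int x = w - 1; x > 0; x -= 2)       // VU pairs, filled from the end
            for (int y = 0; y < h / 2; y++) {
                yuv[i--] = data[w * h + y * w + x];
                yuv[i--] = data[w * h + y * w + (x - 1)];
            }
        return yuv;
    }

    public static void main(String[] args) {
        byte[] in = {0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11}; // 4x2 Y plane + 4 VU bytes
        // Y column 0 bottom-up is {4, 0}, column 1 is {5, 1}, and so on:
        System.out.println(Arrays.toString(rotate90(in, 4, 2)));
        // [4, 0, 5, 1, 6, 2, 7, 3, 8, 9, 10, 11]
    }
}
```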
/**
* Rotate a YUV420SP (NV21) frame by 90 degrees counter-clockwise.
*
* @param src source frame, srcWidth * height * 3 / 2 bytes
* @param srcWidth source width
* @param height source height
* @return the rotated frame
*/
public static byte[] rotateNV21Negative90(byte[] src, int srcWidth, int height)
{
byte[] dst = new byte[srcWidth * height * 3 / 2];
int nWidth = 0, nHeight = 0;
int wh = 0;
int uvHeight = 0;
if(srcWidth != nWidth || height != nHeight)
{
nWidth = srcWidth;
nHeight = height;
wh = srcWidth * height;
uvHeight = height >> 1;//uvHeight = height / 2
}
// rotate the Y plane
int k = 0;
for(int i = 0; i < srcWidth; i++){
int nPos = srcWidth - 1;
for(int j = 0; j < height; j++)
{
dst[k] = src[nPos - i];
k++;
nPos += srcWidth;
}
}
for(int i = 0; i < srcWidth; i+=2){
int nPos = wh + srcWidth - 1;
for(int j = 0; j < uvHeight; j++) {
dst[k] = src[nPos - i - 1];
dst[k + 1] = src[nPos - i];
k += 2;
nPos += srcWidth;
}
}
return dst;
}
/**
* Rotate a YUV420SP (NV21) frame by 90 degrees (maps source columns to destination rows).
*
* @param src source frame, srcWidth * srcHeight * 3 / 2 bytes
* @param srcWidth source width
* @param srcHeight source height
* @return the rotated frame
*/
public static byte[] rotateNV21Positive90(byte[] src, int srcWidth, int srcHeight)
{
byte[] dst = new byte[srcWidth * srcHeight * 3 / 2];
int nWidth = 0, nHeight = 0;
int wh = 0;
int uvHeight = 0;
if(srcWidth != nWidth || srcHeight != nHeight)
{
nWidth = srcWidth;
nHeight = srcHeight;
wh = srcWidth * srcHeight;
uvHeight = srcHeight >> 1;//uvHeight = height / 2
}
// rotate the Y plane
int k = 0;
for(int i = 0; i < srcWidth; i++) {
int nPos = 0;
for(int j = 0; j < srcHeight; j++) {
dst[k] = src[nPos + i];
k++;
nPos += srcWidth;
}
}
for(int i = 0; i < srcWidth; i+=2){
int nPos = wh;
for(int j = 0; j < uvHeight; j++) {
dst[k] = src[nPos + i];
dst[k + 1] = src[nPos + i + 1];
k += 2;
nPos += srcWidth;
}
}
return dst;
}
/**
* Write a byte range to a file, optionally appending.
*
* @param buffer source bytes
* @param offset start offset
* @param length number of bytes to write
* @param path destination file path
* @param append true to append instead of truncating
*/
public static void save(byte[] buffer, int offset, int length, String path, boolean append) {
FileOutputStream fos = null;
try {
fos = new FileOutputStream(path, append);
fos.write(buffer, offset, length);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
if (fos != null) {
try {
fos.flush();
fos.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
}
/**
* Read the cached list of supported preview resolutions.
*
* @param context used to open the shared preferences
* @return the resolutions, split from a ';'-separated string
*/
public static List<String> getSupportResolution(Context context){
List<String> resolutions=new ArrayList<>();
SharedPreferences sharedPreferences=context.getSharedPreferences(Config.PREF_NAME, Context.MODE_PRIVATE);
String r=sharedPreferences.getString(Config.K_RESOLUTION, "");
if(!TextUtils.isEmpty(r)){
String[] arr=r.split(";");
if(arr.length>0){
resolutions=Arrays.asList(arr);
}
}
return resolutions;
}
/**
* Persist the supported resolutions as a single ';'-separated string.
*
* @param context used to open the shared preferences
* @param value ';'-separated resolution string
*/
public static void saveSupportResolution(Context context,String value){
SharedPreferences sharedPreferences=context.getSharedPreferences(Config.PREF_NAME, Context.MODE_PRIVATE);
sharedPreferences.edit().putString(Config.K_RESOLUTION, value).commit();
}
}
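getSupportResolution / saveSupportResolution above round-trip the supported preview sizes through one ';'-separated SharedPreferences string. Stripped of the Android preference plumbing, the codec looks like this (the "WxH" value format is an assumption; only the ';' separator is established by the code above):

```java
import java.util.Arrays;
import java.util.List;

public class ResolutionCodec {
    /** Join resolutions into the stored form. */
    static String encode(List<String> resolutions) {
        return String.join(";", resolutions);
    }

    /** Split the stored form back into a list; empty input yields an empty list. */
    static List<String> decode(String value) {
        if (value == null || value.isEmpty()) return Arrays.asList();
        return Arrays.asList(value.split(";"));
    }

    public static void main(String[] args) {
        String stored = encode(Arrays.asList("640x480", "1280x720"));
        System.out.println(stored);         // 640x480;1280x720
        System.out.println(decode(stored)); // [640x480, 1280x720]
    }
}
```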

@ -0,0 +1,4 @@
<?xml version="1.0" encoding="utf-8"?>
<resources>
<string name="screen_pushing">正在推送屏幕</string>
</resources>

@ -0,0 +1,22 @@
/*
Copyright (c) 2013-2016 EasyDarwin.ORG. All rights reserved.
Github: https://github.com/EasyDarwin
WEChat: EasyDarwin
Website: http://www.easydarwin.org
*/
package org.easydarwin.easypusher;
import org.junit.Test;
import static org.junit.Assert.*;
/**
* To work on unit tests, switch the Test Artifact in the Build Variants view.
*/
public class ExampleUnitTest {
@Test
public void addition_isCorrect() throws Exception {
assertEquals(4, 2 + 2);
}
}

@ -0,0 +1 @@
/build

@ -0,0 +1,57 @@
apply plugin: 'com.android.application'
android {
compileSdkVersion 31
lintOptions {
checkReleaseBuilds false
// Or, if you prefer, you can continue to check for errors in release builds,
// but continue the build even when errors are found:
abortOnError false
}
defaultConfig {
applicationId "com.example.myapplication"
minSdkVersion 19
targetSdkVersion 31
versionCode 1
versionName "1.0"
ndk {
// native library ABIs to package
abiFilters "armeabi-v7a", "x86" // add "arm64-v8a", "x86_64" as needed
}
}
buildTypes {
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
}
}
}
repositories {
flatDir {
dirs 'libs'
}
mavenCentral()
}
dependencies {
implementation fileTree(include: ['*.jar'], dir: 'libs')
implementation 'androidx.appcompat:appcompat:1.4.1'
implementation 'androidx.constraintlayout:constraintlayout:2.1.3'
implementation project(':library')
implementation(name: 'libuvccamera-release', ext: 'aar') {
exclude module: 'support-v4'
exclude module: 'appcompat-v7'
}
implementation 'io.reactivex.rxjava2:rxjava:2.1.6'
implementation 'io.reactivex.rxjava2:rxandroid:2.0.1'
implementation 'androidx.lifecycle:lifecycle-extensions:2.2.0'
implementation 'androidx.lifecycle:lifecycle-reactivestreams:2.4.1'
annotationProcessor 'androidx.lifecycle:lifecycle-compiler:2.0.0'
}

@ -0,0 +1,21 @@
# Add project specific ProGuard rules here.
# You can control the set of applied configuration files using the
# proguardFiles setting in build.gradle.
#
# For more details, see
# http://developer.android.com/guide/developing/tools/proguard.html
# If your project uses WebView with JS, uncomment the following
# and specify the fully qualified class name to the JavaScript interface
# class:
#-keepclassmembers class fqcn.of.javascript.interface.for.webview {
# public *;
#}
# Uncomment this to preserve the line number information for
# debugging stack traces.
#-keepattributes SourceFile,LineNumberTable
# If you keep the line number information, uncomment this to
# hide the original source file name.
#-renamesourcefileattribute SourceFile

@ -0,0 +1,54 @@
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
package="com.example.myapplication">
<uses-feature
android:name="android.hardware.usb.host"
android:required="true" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.SYSTEM_ALERT_WINDOW" />
<uses-feature
android:glEsVersion="0x00020000"
android:required="true" />
<application
android:allowBackup="true"
android:icon="@mipmap/ic_launcher"
android:label="@string/app_name"
android:roundIcon="@mipmap/ic_launcher_round"
android:supportsRtl="true"
android:theme="@style/AppTheme">
<activity
android:name=".MainActivity"
android:exported="true"
android:launchMode="singleInstance">
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.LAUNCHER" />
</intent-filter>
</activity>
<activity
android:name=".UVCActivity"
android:exported="false"
android:launchMode="singleInstance">
<intent-filter>
<action android:name="android.hardware.usb.action.USB_DEVICE_ATTACHED" />
</intent-filter>
<intent-filter>
<action android:name="android.hardware.usb.action.USB_DEVICE_DETACHED" />
</intent-filter>
<meta-data
android:name="android.hardware.usb.action.USB_DEVICE_ATTACHED"
android:resource="@xml/device_filter" />
</activity>
</application>
</manifest>

@ -0,0 +1,238 @@
package com.example.myapplication;
import androidx.lifecycle.Observer;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.graphics.SurfaceTexture;
import android.media.projection.MediaProjectionManager;
import android.os.Build;
import android.os.Bundle;
import android.preference.PreferenceManager;
import androidx.annotation.Nullable;
import androidx.core.app.ActivityCompat;
import androidx.appcompat.app.AppCompatActivity;
import android.view.TextureView;
import android.view.View;
import android.widget.CheckBox;
import android.widget.CompoundButton;
import android.widget.TextView;
import android.widget.Toast;
import org.easydarwin.push.MediaStream;
import io.reactivex.Single;
import io.reactivex.functions.Consumer;
public class MainActivity extends AppCompatActivity {
private static final int REQUEST_CAMERA_PERMISSION = 1000;
private static final int REQUEST_MEDIA_PROJECTION = 1001;
public static final String HOST = "cloud.easydarwin.org";
private MediaStream mediaStream;
private Single<MediaStream> getMediaStream() {
Single<MediaStream> single = RxHelper.single(MediaStream.getBindedMediaStream(this, this), mediaStream);
if (mediaStream == null) {
return single.doOnSuccess(new Consumer<MediaStream>() {
@Override
public void accept(MediaStream ms) throws Exception {
mediaStream = ms;
}
});
} else {
return single;
}
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
CheckBox hevc_enable = findViewById(R.id.enable_265);
hevc_enable.setChecked(PreferenceManager.getDefaultSharedPreferences(this).getBoolean("try_265_encode", false));
hevc_enable.setOnCheckedChangeListener(new CompoundButton.OnCheckedChangeListener() {
@Override
public void onCheckedChanged(CompoundButton buttonView, boolean isChecked) {
PreferenceManager.getDefaultSharedPreferences(MainActivity.this).edit().putBoolean("try_265_encode", isChecked).apply();
}
});
// start the MediaStream background service
Intent intent = new Intent(this, MediaStream.class);
startService(intent);
getMediaStream().subscribe(new Consumer<MediaStream>() {
@Override
public void accept(final MediaStream ms) throws Exception {
ms.observeCameraPreviewResolution(MainActivity.this, new Observer<int[]>() {
@Override
public void onChanged(@Nullable int[] size) {
Toast.makeText(MainActivity.this, "当前摄像头分辨率为:" + size[0] + "*" + size[1], Toast.LENGTH_SHORT).show();
}
});
final TextView pushingStateText = findViewById(R.id.pushing_state);
final TextView pushingBtn = findViewById(R.id.pushing);
ms.observePushingState(MainActivity.this, new Observer<MediaStream.PushingState>() {
@Override
public void onChanged(@Nullable MediaStream.PushingState pushingState) {
if (pushingState.screenPushing) {
pushingStateText.setText("屏幕推送");
// update the screen-pushing button state.
TextView tview = findViewById(R.id.pushing_desktop);
if (ms.isScreenPushing()) {
tview.setText("取消推送");
} else {
tview.setText("推送屏幕");
}
findViewById(R.id.pushing_desktop).setEnabled(true);
} else {
pushingStateText.setText("推送");
if (ms.isCameraPushing()) {
pushingBtn.setText("停止");
} else {
pushingBtn.setText("推送");
}
}
pushingStateText.append(":\t" + pushingState.msg);
if (pushingState.state > 0) {
pushingStateText.append(pushingState.url);
pushingStateText.append("\n");
if ("avc".equals(pushingState.videoCodec)) {
pushingStateText.append("视频编码方式:H264硬编码");
} else if ("hevc".equals(pushingState.videoCodec)) {
pushingStateText.append("视频编码方式:H265硬编码");
} else if ("x264".equals(pushingState.videoCodec)) {
pushingStateText.append("视频编码方式:x264");
}
}
}
});
TextureView textureView = findViewById(R.id.texture_view);
if (textureView.isAvailable()) {
ms.setSurfaceTexture(textureView.getSurfaceTexture());
} else {
textureView.setSurfaceTextureListener(new SurfaceTextureListenerWrapper() {
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int i, int i1) {
ms.setSurfaceTexture(surfaceTexture);
}
@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture surfaceTexture) {
ms.setSurfaceTexture(null);
return true;
}
});
}
if (ActivityCompat.checkSelfPermission(MainActivity.this, android.Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED ||
ActivityCompat.checkSelfPermission(MainActivity.this, android.Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(MainActivity.this, new String[]{android.Manifest.permission.CAMERA, android.Manifest.permission.RECORD_AUDIO}, REQUEST_CAMERA_PERMISSION);
}
}
}, new Consumer<Throwable>() {
@Override
public void accept(Throwable throwable) throws Exception {
Toast.makeText(MainActivity.this, "创建服务出错!", Toast.LENGTH_SHORT).show();
}
});
}
public void onPushing(View view) {
getMediaStream().subscribe(new Consumer<MediaStream>() {
@Override
public void accept(MediaStream mediaStream) throws Exception {
MediaStream.PushingState state = mediaStream.getPushingState();
if (state != null && state.state > 0) { // currently pushing: stop the stream and close the preview
mediaStream.stopStream();
mediaStream.closeCameraPreview();
} else { // open the preview and start pushing
mediaStream.openCameraPreview();
String id = PreferenceManager.getDefaultSharedPreferences(MainActivity.this).getString("camera-id", null);
if (id == null) {
double v = Math.random() * 1000;
id = "c_" + (int) v;
PreferenceManager.getDefaultSharedPreferences(MainActivity.this).edit().putString("camera-id", id).apply();
}
mediaStream.startStream(HOST, "554", id);
}
}
});
}
@Override
public void onRequestPermissionsResult(int requestCode,
String[] permissions, int[] grantResults) {
switch (requestCode) {
case REQUEST_CAMERA_PERMISSION: {
if (grantResults.length > 1
&& grantResults[0] == PackageManager.PERMISSION_GRANTED && grantResults[1] == PackageManager.PERMISSION_GRANTED) {
getMediaStream().subscribe(new Consumer<MediaStream>() {
@Override
public void accept(MediaStream mediaStream) throws Exception {
mediaStream.notifyPermissionGranted();
}
});
} else {
finish();
}
break;
}
}
}
// Push the screen.
public void onPushScreen(final View view) {
getMediaStream().subscribe(new Consumer<MediaStream>() {
@Override
public void accept(MediaStream mediaStream) {
if (mediaStream.isScreenPushing()) { // already pushing: cancel it.
mediaStream.stopPushScreen();
} else { // not pushing: start it.
if (Build.VERSION.SDK_INT < Build.VERSION_CODES.LOLLIPOP) { // screen capture is not supported before Lollipop.
return;
}
MediaProjectionManager mMpMngr = (MediaProjectionManager) getApplicationContext().getSystemService(MEDIA_PROJECTION_SERVICE);
startActivityForResult(mMpMngr.createScreenCaptureIntent(), REQUEST_MEDIA_PROJECTION);
// Disable the button to avoid repeated clicks.
view.setEnabled(false);
}
}
});
}
@Override
protected void onActivityResult(int requestCode, final int resultCode, final Intent data) {
if (requestCode == REQUEST_MEDIA_PROJECTION) {
getMediaStream().subscribe(new Consumer<MediaStream>() {
@Override
public void accept(MediaStream mediaStream) {
mediaStream.pushScreen(resultCode, data, HOST, "554", "screen111");
}
});
}
}
public void onSwitchCamera(View view) {
getMediaStream().subscribe(new Consumer<MediaStream>() {
@Override
public void accept(MediaStream mediaStream) throws Exception {
mediaStream.switchCamera();
}
});
}
public void onUVCCamera(View view) {
Intent intent = new Intent(this, UVCActivity.class);
startActivity(intent);
}
}
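The onPushing handler above generates a random stream id once and then reuses it on later launches. A minimal sketch of that logic, using a plain HashMap as a hypothetical stand-in for SharedPreferences (StreamIdHelper and its key are illustration only, not part of the app's code):

```java
import java.util.HashMap;
import java.util.Map;

public class StreamIdHelper {
    // Return the cached id for `key`, or generate and cache a new one.
    // The Map stands in for SharedPreferences in this sketch.
    static String getOrCreateStreamId(Map<String, String> prefs, String key, String prefix) {
        String id = prefs.get(key);
        if (id == null) {
            id = prefix + "_" + (int) (Math.random() * 1000); // e.g. "c_123"
            prefs.put(key, id);
        }
        return id;
    }

    public static void main(String[] args) {
        Map<String, String> prefs = new HashMap<>();
        String first = getOrCreateStreamId(prefs, "camera-id", "c");
        String second = getOrCreateStreamId(prefs, "camera-id", "c");
        System.out.println(first.equals(second)); // prints "true": the id is reused
    }
}
```

Caching the id keeps the RTSP URL stable across app restarts, so a player can reconnect to the same path.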

@ -0,0 +1,47 @@
package com.example.myapplication;
import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import org.easydarwin.util.AbstractSubscriber;
import org.reactivestreams.Publisher;
import io.reactivex.Single;
import io.reactivex.subjects.PublishSubject;
/**
* Created by apple on 2017/12/22.
*/
public class RxHelper {
static boolean IGNORE_ERROR = false;
public static <T> Single<T> single(@NonNull Publisher<T> t, @Nullable T defaultValueIfNotNull){
if (defaultValueIfNotNull != null) return Single.just(defaultValueIfNotNull);
final PublishSubject<T> sub = PublishSubject.create();
t.subscribe(new AbstractSubscriber<T>() {
@Override
public void onNext(T t) {
super.onNext(t);
sub.onNext(t);
}
@Override
public void onError(Throwable t) {
if (IGNORE_ERROR) {
super.onError(t);
sub.onComplete();
} else {
sub.onError(t);
}
}
@Override
public void onComplete() {
super.onComplete();
sub.onComplete();
}
});
return sub.firstOrError();
}
}
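RxHelper.single above returns a cached value immediately when one exists, and otherwise resolves with the first value the Publisher emits. A simplified analogue of that contract, sketched with java.util.concurrent.CompletableFuture instead of RxJava (an assumption made only to keep the example self-contained):

```java
import java.util.concurrent.CompletableFuture;

public class SingleHelper {
    // If a cached value exists, complete immediately and never consult the
    // producer; otherwise resolve with whatever the producer delivers first.
    static <T> CompletableFuture<T> single(CompletableFuture<T> producer, T cached) {
        if (cached != null) return CompletableFuture.completedFuture(cached);
        return producer;
    }

    public static void main(String[] args) {
        CompletableFuture<String> producer = new CompletableFuture<>();
        System.out.println(single(producer, "cached").join()); // prints "cached"
        CompletableFuture<String> cold = single(producer, null);
        producer.complete("first");
        System.out.println(cold.join()); // prints "first"
    }
}
```

The same shape explains why the activities can call getMediaStream() repeatedly: after the first emission the cached MediaStream short-circuits the service binding.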

@ -0,0 +1,27 @@
package com.example.myapplication;
import android.graphics.SurfaceTexture;
import android.view.TextureView;
/**
* Created by apple on 2017/9/11.
*/
public abstract class SurfaceTextureListenerWrapper implements TextureView.SurfaceTextureListener{
@Override
public void onSurfaceTextureSizeChanged(SurfaceTexture surfaceTexture, int i, int i1) {
}
@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture surfaceTexture) {
return true;
}
@Override
public void onSurfaceTextureUpdated(SurfaceTexture surfaceTexture) {
}
}

@ -0,0 +1,208 @@
package com.example.myapplication;
import androidx.lifecycle.Observer;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.graphics.SurfaceTexture;
import android.os.Environment;
import android.preference.PreferenceManager;
import androidx.annotation.Nullable;
import androidx.core.app.ActivityCompat;
import androidx.appcompat.app.AppCompatActivity;
import android.os.Bundle;
import android.view.TextureView;
import android.view.View;
import android.widget.TextView;
import android.widget.Toast;
import org.easydarwin.push.MediaStream;
import io.reactivex.Single;
import io.reactivex.functions.Consumer;
public class UVCActivity extends AppCompatActivity {
private MediaStream mediaStream;
private static final int REQUEST_CAMERA_PERMISSION = 1000;
private Single<MediaStream> getMediaStream() {
Single<MediaStream> single = RxHelper.single(MediaStream.getBindedMediaStream(this, this), mediaStream);
if (mediaStream == null) {
return single.doOnSuccess(new Consumer<MediaStream>() {
@Override
public void accept(MediaStream ms) throws Exception {
mediaStream = ms;
}
});
} else {
return single;
}
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_uvc);
// Start the service...
Intent intent = new Intent(this, MediaStream.class);
startService(intent);
getMediaStream().subscribe(new Consumer<MediaStream>() {
@Override
public void accept(final MediaStream ms) throws Exception {
final TextView pushingStateText = findViewById(R.id.pushing_state);
final TextView pushingBtn = findViewById(R.id.pushing);
ms.observePushingState(UVCActivity.this, new Observer<MediaStream.PushingState>() {
@Override
public void onChanged(@Nullable MediaStream.PushingState pushingState) {
if (pushingState.screenPushing) {
pushingStateText.setText("屏幕推送");
} else {
pushingStateText.setText("推送");
if (pushingState.state > 0) {
pushingBtn.setText("停止");
} else {
pushingBtn.setText("推送");
}
}
pushingStateText.append(":\t" + pushingState.msg);
if (pushingState.state > 0) {
pushingStateText.append(pushingState.url);
}
}
});
TextureView textureView = findViewById(R.id.texture_view);
textureView.setSurfaceTextureListener(new SurfaceTextureListenerWrapper() {
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int i, int i1) {
ms.setSurfaceTexture(surfaceTexture);
}
@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture surfaceTexture) {
ms.setSurfaceTexture(null);
return true;
}
});
if (ActivityCompat.checkSelfPermission(UVCActivity.this, android.Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED ||
ActivityCompat.checkSelfPermission(UVCActivity.this, android.Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(UVCActivity.this, new String[]{android.Manifest.permission.CAMERA, android.Manifest.permission.RECORD_AUDIO}, REQUEST_CAMERA_PERMISSION);
}
}
}, new Consumer<Throwable>() {
@Override
public void accept(Throwable throwable) throws Exception {
Toast.makeText(UVCActivity.this, "创建服务出错!", Toast.LENGTH_SHORT).show();
}
});
}
// Handle the permission request result.
@Override
public void onRequestPermissionsResult(int requestCode,
String[] permissions, int[] grantResults) {
switch (requestCode) {
case REQUEST_CAMERA_PERMISSION: {
if (grantResults.length > 1
&& grantResults[0] == PackageManager.PERMISSION_GRANTED && grantResults[1] == PackageManager.PERMISSION_GRANTED) {
getMediaStream().subscribe(new Consumer<MediaStream>() {
@Override
public void accept(MediaStream mediaStream) throws Exception {
mediaStream.notifyPermissionGranted();
}
});
} else {
// Permission denied: stop the service and quit...
Intent intent = new Intent(this, MediaStream.class);
stopService(intent);
finish();
}
break;
}
}
}
public void onPush(View view) {
// Fetch the MediaStream object asynchronously.
getMediaStream().subscribe(new Consumer<MediaStream>() {
@Override
public void accept(final MediaStream mediaStream) throws Exception {
// Check the current pushing state.
MediaStream.PushingState state = mediaStream.getPushingState();
if (state != null && state.state > 0) { // currently pushing: stop the stream and the preview
mediaStream.stopStream();
mediaStream.closeCameraPreview();
} else {
// switchCamera: 0 = back camera, 1 = front camera, 2 = UVC camera
RxHelper.single(mediaStream.switchCamera(2), null).subscribe(new Consumer<Object>() {
@Override
public void accept(Object o) throws Exception {
String id = PreferenceManager.getDefaultSharedPreferences(UVCActivity.this).getString("uvc-id", null);
if (id == null) {
double v = Math.random() * 1000;
id = "uvc_" + (int) v;
PreferenceManager.getDefaultSharedPreferences(UVCActivity.this).edit().putString("uvc-id", id).apply();
}
mediaStream.startStream("cloud.easydarwin.org", "554", id);
}
}, new Consumer<Throwable>() {
@Override
public void accept(final Throwable t) throws Exception {
t.printStackTrace();
runOnUiThread(new Runnable() {
@Override
public void run() {
Toast.makeText(UVCActivity.this, "UVC摄像头启动失败.." + t.getMessage(), Toast.LENGTH_SHORT).show();
}
});
}
});
}
}
});
}
public void onRecord(View view) { // start or stop recording.
final TextView txt = (TextView) view;
getMediaStream().subscribe(new Consumer<MediaStream>() {
@Override
public void accept(MediaStream mediaStream) throws Exception {
if (mediaStream.isRecording()) { // recording: stop it.
mediaStream.stopRecord();
txt.setText("录像");
} else { // not recording: start recording...
// Each segment is at most 30 seconds long; if recording has not stopped after 30 seconds, a new file is created, and so on...
// Files are named test_uvc_0.mp4, test_uvc_1.mp4, test_uvc_2.mp4, test_uvc_3.mp4
String path = getExternalFilesDir(Environment.DIRECTORY_MOVIES) + "/test_uvc.mp4";
mediaStream.startRecord(path, 30000);
final TextView pushingStateText = findViewById(R.id.pushing_state);
pushingStateText.append("\n录像地址:" + path);
txt.setText("停止");
}
}
});
}
public void onQuit(View view) { // quit
finish();
// Stop the service...
Intent intent = new Intent(this, MediaStream.class);
stopService(intent);
}
public void onBackground(View view) { // move to background
finish();
}
}
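The onRecord comments above describe segmented recording: a base path like test_uvc.mp4 yields test_uvc_0.mp4, test_uvc_1.mp4, and so on, one file per 30-second segment. A sketch of that naming scheme (segmentFileName is a hypothetical helper for illustration, not part of the MediaStream API):

```java
public class RecordSegments {
    // Insert the segment index before the file extension:
    // test_uvc.mp4 + index 1 -> test_uvc_1.mp4
    static String segmentFileName(String basePath, int index) {
        int dot = basePath.lastIndexOf('.');
        return basePath.substring(0, dot) + "_" + index + basePath.substring(dot);
    }

    public static void main(String[] args) {
        for (int i = 0; i < 3; i++) {
            System.out.println(segmentFileName("/sdcard/Movies/test_uvc.mp4", i));
        }
    }
}
```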

@ -0,0 +1,113 @@
<?xml version="1.0" encoding="utf-8"?>
<vector xmlns:android="http://schemas.android.com/apk/res/android"
android:width="24dp"
android:height="24dp"
android:viewportHeight="108.0"
android:viewportWidth="108.0">
<path
android:fillColor="#26A69A"
android:pathData="M0,0h108v108h-108z"
android:strokeColor="#66FFFFFF"
android:strokeWidth="0.8" />
<path
android:fillColor="#00000000"
android:pathData="M19,0L19,108"
android:strokeColor="#33FFFFFF"
android:strokeWidth="0.8" />
<path
android:fillColor="#00000000"
android:pathData="M9,0L9,108"
android:strokeColor="#66FFFFFF"
android:strokeWidth="0.8" />
<path
android:fillColor="#00000000"
android:pathData="M39,0L39,108"
android:strokeColor="#66FFFFFF"
android:strokeWidth="0.8" />
<path
android:fillColor="#00000000"
android:pathData="M29,0L29,108"
android:strokeColor="#66FFFFFF"
android:strokeWidth="0.8" />
<path
android:fillColor="#00000000"
android:pathData="M59,0L59,108"
android:strokeColor="#66FFFFFF"
android:strokeWidth="0.8" />
<path
android:fillColor="#00000000"
android:pathData="M49,0L49,108"
android:strokeColor="#66FFFFFF"
android:strokeWidth="0.8" />
<path
android:fillColor="#00000000"
android:pathData="M79,0L79,108"
android:strokeColor="#66FFFFFF"
android:strokeWidth="0.8" />
<path
android:fillColor="#00000000"
android:pathData="M69,0L69,108"
android:strokeColor="#66FFFFFF"
android:strokeWidth="0.8" />
<path
android:fillColor="#00000000"
android:pathData="M89,0L89,108"
android:strokeColor="#33FFFFFF"
android:strokeWidth="0.8" />
<path
android:fillColor="#00000000"
android:pathData="M99,0L99,108"
android:strokeColor="#66FFFFFF"
android:strokeWidth="0.8" />
<path
android:fillColor="#00000000"
android:pathData="M0,89L108,89"
android:strokeColor="#33FFFFFF"
android:strokeWidth="0.8" />
<path
android:fillColor="#00000000"
android:pathData="M0,99L108,99"
android:strokeColor="#66FFFFFF"
android:strokeWidth="0.8" />
<path
android:fillColor="#00000000"
android:pathData="M0,69L108,69"
android:strokeColor="#66FFFFFF"
android:strokeWidth="0.8" />
<path
android:fillColor="#00000000"
android:pathData="M0,79L108,79"
android:strokeColor="#66FFFFFF"
android:strokeWidth="0.8" />
<path
android:fillColor="#00000000"
android:pathData="M0,49L108,49"
android:strokeColor="#66FFFFFF"
android:strokeWidth="0.8" />
<path
android:fillColor="#00000000"
android:pathData="M0,59L108,59"
android:strokeColor="#66FFFFFF"
android:strokeWidth="0.8" />
<path
android:fillColor="#00000000"
android:pathData="M0,29L108,29"
android:strokeColor="#66FFFFFF"
android:strokeWidth="0.8" />
<path
android:fillColor="#00000000"
android:pathData="M0,39L108,39"
android:strokeColor="#66FFFFFF"
android:strokeWidth="0.8" />
<path
android:fillColor="#00000000"
android:pathData="M0,19L108,19"
android:strokeColor="#33FFFFFF"
android:strokeWidth="0.8" />
<path
android:fillColor="#00000000"
android:pathData="M0,9L108,9"
android:strokeColor="#66FFFFFF"
android:strokeWidth="0.8" />
</vector>

@ -0,0 +1,106 @@
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context="com.example.myapplication.MainActivity">
<TextureView
android:id="@+id/texture_view"
android:layout_width="0dp"
android:layout_height="0dp"
app:layout_constraintDimensionRatio="h,640:480"
app:layout_constraintLeft_toLeftOf="parent"
app:layout_constraintRight_toRightOf="parent"
app:layout_constraintTop_toTopOf="parent" />
<CheckBox
android:id="@+id/enable_265"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginStart="8dp"
android:layout_marginLeft="8dp"
android:layout_marginTop="8dp"
android:text="265格式编码"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toBottomOf="@+id/texture_view" />
<TableLayout
android:layout_width="match_parent"
android:layout_gravity="bottom"
app:layout_constraintBottom_toBottomOf="parent"
android:stretchColumns="*"
android:layout_height="wrap_content">
<TextView
android:id="@+id/pushing_state"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
app:layout_constraintBottom_toTopOf="@+id/pushing"
android:background="#66ffffff"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent" />
<TableRow>
<Button
android:id="@+id/pushing"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="推送"
android:onClick="onPushing"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintStart_toStartOf="parent" />
<Button
android:id="@+id/switching_camera"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:onClick="onSwitchCamera"
android:text="切换"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintTop_toTopOf="parent" />
<Button
android:id="@+id/uvc_camera"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:onClick="onUVCCamera"
android:text="UVC"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintTop_toTopOf="parent" />
<Button
android:id="@+id/pushing_desktop"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:onClick="onPushScreen"
android:text="推送屏幕"
app:layout_constraintRight_toLeftOf="@+id/switching_camera"
app:layout_constraintTop_toTopOf="parent" />
</TableRow>
<TableRow>
<Button
android:id="@+id/press_record"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="按住录像"
app:layout_constraintBottom_toBottomOf="parent" />
<TextView
android:id="@+id/record_time"
style="@style/Base.TextAppearance.AppCompat.Large"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="00:00"
android:textColor="#ff0000"
android:visibility="gone"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toTopOf="@+id/texture_view" />
</TableRow>
</TableLayout>
</androidx.constraintlayout.widget.ConstraintLayout>

@ -0,0 +1,68 @@
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context="com.example.myapplication.UVCActivity">
<TextureView
android:id="@+id/texture_view"
android:layout_width="0dp"
android:layout_height="0dp"
app:layout_constraintDimensionRatio="h,640:480"
app:layout_constraintLeft_toLeftOf="parent"
app:layout_constraintRight_toRightOf="parent"
app:layout_constraintTop_toTopOf="parent" />
<TableLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_marginBottom="8dp"
app:layout_constraintBottom_toBottomOf="parent">
<TextView
android:id="@+id/pushing_state"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:background="#66ffffff"
app:layout_constraintBottom_toTopOf="@+id/pushing"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent" />
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content">
<Button
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:id="@+id/pushing"
android:onClick="onPush"
android:text="推送" />
<Button
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:onClick="onRecord"
android:text="录像" />
<Button
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:onClick="onQuit"
android:text="退出" />
<Button
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:onClick="onBackground"
android:text="后台" />
</LinearLayout>
</TableLayout>
</androidx.constraintlayout.widget.ConstraintLayout>

@ -0,0 +1,6 @@
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout
xmlns:android="http://schemas.android.com/apk/res/android" android:layout_width="match_parent"
android:layout_height="match_parent">
</androidx.constraintlayout.widget.ConstraintLayout>

@ -0,0 +1,5 @@
<?xml version="1.0" encoding="utf-8"?>
<adaptive-icon xmlns:android="http://schemas.android.com/apk/res/android">
<background android:drawable="@drawable/ic_launcher_background" />
<foreground android:drawable="@mipmap/ic_launcher_foreground" />
</adaptive-icon>

@ -0,0 +1,5 @@
<?xml version="1.0" encoding="utf-8"?>
<adaptive-icon xmlns:android="http://schemas.android.com/apk/res/android">
<background android:drawable="@drawable/ic_launcher_background" />
<foreground android:drawable="@mipmap/ic_launcher_foreground" />
</adaptive-icon>

@ -0,0 +1,6 @@
<?xml version="1.0" encoding="utf-8"?>
<resources>
<color name="colorPrimary">#3F51B5</color>
<color name="colorPrimaryDark">#303F9F</color>
<color name="colorAccent">#FF4081</color>
</resources>

@ -0,0 +1,3 @@
<resources>
<string name="app_name">My Application</string>
</resources>

@ -0,0 +1,11 @@
<resources>
<!-- Base application theme. -->
<style name="AppTheme" parent="Theme.AppCompat.Light.DarkActionBar">
<!-- Customize your theme here. -->
<item name="colorPrimary">@color/colorPrimary</item>
<item name="colorPrimaryDark">@color/colorPrimaryDark</item>
<item name="colorAccent">@color/colorAccent</item>
</style>
</resources>

@ -0,0 +1,5 @@
<?xml version="1.0" encoding="utf-8"?>
<usb>
<usb-device class="239" subclass="2" /> <!-- all UVC devices -->
</usb>
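The filter above matches USB class 239, subclass 2 (the UVC interface association). It is typically referenced from AndroidManifest.xml so the activity is offered when a matching camera attaches; a sketch of that wiring, assuming the filter is saved as res/xml/device_filter.xml (the actual file name and manifest are not shown in this diff):

```xml
<activity android:name=".UVCActivity">
    <intent-filter>
        <action android:name="android.hardware.usb.action.USB_DEVICE_ATTACHED" />
    </intent-filter>
    <meta-data
        android:name="android.hardware.usb.action.USB_DEVICE_ATTACHED"
        android:resource="@xml/device_filter" />
</activity>
```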
