AR Body Pose Recognition: Human-Machine Interaction Without Boundaries (Part 2)


3. Call the ARBodyTrackingConfig interface to initialize the body tracking scene.
mArSession = new ARSession(context);
ARBodyTrackingConfig config = new ARBodyTrackingConfig(mArSession);
config.setEnableItem(ARConfigBase.ENABLE_DEPTH | ARConfigBase.ENABLE_MASK);
// Configure the session.
mArSession.configure(config);

4. Initialize the BodyRelatedDisplay interface, which renders body-related AR data.
public interface BodyRelatedDisplay {
    void init();
    void onDrawFrame(Collection<ARBody> bodies, float[] projectionMatrix);
}
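The concrete renderers defined in the later steps implement this interface, so it is convenient to keep them in one list and initialize and draw them together. A minimal sketch, assuming a hypothetical mBodyRelatedDisplays field and an initBodyDisplays() helper (both names are illustrative, not from the sample):

import java.util.ArrayList;
import java.util.List;

// Hypothetical wiring: keep all BodyRelatedDisplay implementations in one list.
private final List<BodyRelatedDisplay> mBodyRelatedDisplays = new ArrayList<>();

private void initBodyDisplays() {
    // BodySkeletonDisplay and BodySkeletonLineDisplay are defined in steps 6 and 7.
    mBodyRelatedDisplays.add(new BodySkeletonDisplay());
    mBodyRelatedDisplays.add(new BodySkeletonLineDisplay());
    for (BodyRelatedDisplay display : mBodyRelatedDisplays) {
        display.init();
    }
}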

5. Initialize the BodyRenderManager class, which renders the body data obtained by AR Engine.
public class BodyRenderManager implements GLSurfaceView.Renderer {
    // Implement the onDrawFrame() method.
    @Override
    public void onDrawFrame(GL10 unused) {
        ARFrame frame = mSession.update();
        ARCamera camera = frame.getCamera();
        // Obtain the projection matrix of the AR camera
        // (destination array, offset, near clipping plane, far clipping plane).
        float[] projectionMatrix = new float[16];
        camera.getProjectionMatrix(projectionMatrix, 0, 0.1f, 100.0f);
        // Obtain the set of all trackables of the specified type. Passing
        // ARBody.class returns the body skeleton tracking results.
        Collection<ARBody> bodies = mSession.getAllTrackables(ARBody.class);
    }
}
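With the renderer list from the earlier sketch, onDrawFrame() can then hand the tracking results to every registered display. A hedged sketch of that forwarding step (the dispatch loop is an assumption about how the pieces fit together, not the sample's verbatim code):

// Assumed continuation of BodyRenderManager.onDrawFrame():
// forward the tracked bodies and the projection matrix to each renderer.
for (BodyRelatedDisplay display : mBodyRelatedDisplays) {
    display.onDrawFrame(bodies, projectionMatrix);
}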
6. Initialize BodySkeletonDisplay, which obtains the skeleton data and passes it to OpenGL ES; OpenGL ES then renders the data and displays it on the screen.
public class BodySkeletonDisplay implements BodyRelatedDisplay {
    // Initialization method.
    public void init() {
    }

    // Update the skeleton point data and draw it with OpenGL.
    public void onDrawFrame(Collection<ARBody> bodies, float[] projectionMatrix) {
        for (ARBody body : bodies) {
            if (body.getTrackingState() == ARTrackable.TrackingState.TRACKING) {
                float coordinate = 1.0f;
                if (body.getCoordinateSystemType() == ARCoordinateSystemType.COORDINATE_SYSTEM_TYPE_3D_CAMERA) {
                    coordinate = DRAW_COORDINATE;
                }
                findValidSkeletonPoints(body);
                updateBodySkeleton();
                drawBodySkeleton(coordinate, projectionMatrix);
            }
        }
    }

    // Find the valid skeleton points.
    private void findValidSkeletonPoints(ARBody arBody) {
        int index = 0;
        int[] isExists;
        int validPointNum = 0;
        float[] points;
        float[] skeletonPoints;
        if (arBody.getCoordinateSystemType() == ARCoordinateSystemType.COORDINATE_SYSTEM_TYPE_3D_CAMERA) {
            isExists = arBody.getSkeletonPointIsExist3D();
            points = new float[isExists.length * 3];
            skeletonPoints = arBody.getSkeletonPoint3D();
        } else {
            isExists = arBody.getSkeletonPointIsExist2D();
            points = new float[isExists.length * 3];
            skeletonPoints = arBody.getSkeletonPoint2D();
        }
        for (int i = 0; i < isExists.length; i++) {
            if (isExists[i] != 0) {
                points[index++] = skeletonPoints[3 * i];
                points[index++] = skeletonPoints[3 * i + 1];
                points[index++] = skeletonPoints[3 * i + 2];
                validPointNum++;
            }
        }
        mSkeletonPoints = FloatBuffer.wrap(points);
        mPointsNum = validPointNum;
    }
}
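The updateBodySkeleton() and drawBodySkeleton() helpers called above are not shown in this excerpt. Below is one plausible OpenGL ES 2.0 sketch of them; the fields mVbo, mProgram, mPositionAttribute, mProjectionMatrixUniform, and mCoordinateUniform are hypothetical GL handles assumed to be created in init():

import android.opengl.GLES20;

// Hypothetical GL handles created during init(): a vertex buffer, a shader
// program, and its attribute/uniform locations.
private int mVbo;
private int mProgram;
private int mPositionAttribute;
private int mProjectionMatrixUniform;
private int mCoordinateUniform;

// Upload the valid skeleton points into the vertex buffer.
private void updateBodySkeleton() {
    GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, mVbo);
    mSkeletonPoints.rewind();
    // 3 floats per point, 4 bytes per float.
    GLES20.glBufferData(GLES20.GL_ARRAY_BUFFER, mPointsNum * 3 * 4,
            mSkeletonPoints, GLES20.GL_DYNAMIC_DRAW);
    GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, 0);
}

// Draw the uploaded points as GL_POINTS; the shader is assumed to scale
// positions by 'coordinate' and apply the projection matrix.
private void drawBodySkeleton(float coordinate, float[] projectionMatrix) {
    GLES20.glUseProgram(mProgram);
    GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, mVbo);
    GLES20.glEnableVertexAttribArray(mPositionAttribute);
    GLES20.glVertexAttribPointer(mPositionAttribute, 3, GLES20.GL_FLOAT, false, 0, 0);
    GLES20.glUniformMatrix4fv(mProjectionMatrixUniform, 1, false, projectionMatrix, 0);
    GLES20.glUniform1f(mCoordinateUniform, coordinate);
    GLES20.glDrawArrays(GLES20.GL_POINTS, 0, mPointsNum);
    GLES20.glDisableVertexAttribArray(mPositionAttribute);
    GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, 0);
}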
7. Obtain the skeleton point connection data and pass it to OpenGL ES for rendering on the screen, as sketched after the next snippet.
public class BodySkeletonLineDisplay implements BodyRelatedDisplay {
    // Render the lines between body skeleton points.
    public void onDrawFrame(Collection<ARBody> bodies, float[] projectionMatrix) {
        for (ARBody body : bodies) {
            if (body.getTrackingState() == ARTrackable.TrackingState.TRACKING) {
                float coordinate = 1.0f;
                if (body.getCoordinateSystemType() == ARCoordinateSystemType.COORDINATE_SYSTEM_TYPE_3D_CAMERA) {
                    coordinate = COORDINATE_SYSTEM_TYPE_3D_FLAG;
                }
                updateBodySkeletonLineData(body);
                drawSkeletonLine(coordinate, projectionMatrix);
            }
        }
    }
}

8. For the detailed implementation, see the sample code: AR Engine sample code - AR Engine | HUAWEI Developers (huawei.com)
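For reference, one plausible shape for the updateBodySkeletonLineData() helper called in step 7 is sketched below. It assumes ARBody exposes a getBodySkeletonConnection() method returning a flat array of joint-index pairs (verify the name against the SDK reference); the fields mLinePoints and mPointsLineNum are hypothetical:

import java.nio.FloatBuffer;

// Hypothetical fields for the line renderer.
private FloatBuffer mLinePoints;
private int mPointsLineNum;

// For every connected joint pair, copy both endpoints into a line-vertex buffer.
private void updateBodySkeletonLineData(ARBody arBody) {
    // Assumed API: two consecutive indices describe one line segment.
    int[] connections = arBody.getBodySkeletonConnection();
    float[] skeletonPoints = arBody.getCoordinateSystemType()
            == ARCoordinateSystemType.COORDINATE_SYSTEM_TYPE_3D_CAMERA
            ? arBody.getSkeletonPoint3D()
            : arBody.getSkeletonPoint2D();
    float[] linePoints = new float[connections.length * 3];
    for (int i = 0; i < connections.length; i++) {
        linePoints[3 * i] = skeletonPoints[3 * connections[i]];
        linePoints[3 * i + 1] = skeletonPoints[3 * connections[i] + 1];
        linePoints[3 * i + 2] = skeletonPoints[3 * connections[i] + 2];
    }
    mLinePoints = FloatBuffer.wrap(linePoints);
    mPointsLineNum = connections.length / 2; // number of line segments
}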
Learn more >>
Visit the HUAWEI Developers official website for development guides. Huawei Mobile Services open-source repositories: GitHub and Gitee.
Follow us to get the latest HMS Core technology news first.
