Markerless 3D tracking based on CAD data uses the EdgeBasedInitializationSensorSource, which estimates an initial pose from either a dummy tracking configuration or GPS/inertial sensor information. The system then refines this initial estimate to find a more accurate camera pose.
Once a pose has been found, markerless 3D tracking takes over and continues tracking based on texture.
TrackingValues output
(Before initialization)
- COS 1: the corrected pose. Quality is 1.0 on success, 0.0 on failure.
- COS 2: the initial pose. Quality is 0.0 if the correction succeeded, 1.0 if it failed.
(After initialization)
- COS 1: the tracked pose. Quality is 1.0 while tracking succeeds, 0.0 when it fails.
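The convention above can be captured in a small helper. This is a hypothetical sketch, not part of the SDK API: given the quality values reported for COS 1 and COS 2 during initialization, it tells which pose, if any, is currently valid.

```javascript
// Hypothetical helper: interprets the per-COS quality values reported before
// initialization, following the convention above (COS 1 = corrected pose,
// COS 2 = initial pose; at most one of them carries quality 1.0).
function interpretInitQuality(cos1Quality, cos2Quality) {
  if (cos1Quality === 1.0) return "corrected"; // correction succeeded, COS 1 is valid
  if (cos2Quality === 1.0) return "initial";   // correction failed, only the initial guess in COS 2
  return "none";                               // no pose available yet
}
```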
Example configuration
The SDK ships with a simple example: using a predefined tracking configuration, it detects and tracks a wheel or other circular object with a (planar) ring line model.
In the previous tutorial we created a line model and a surface model from an existing 3D model. In this tutorial we set up tracking configurations for those generated models, covering the free-view and outdoor modes, and show representative code excerpts.
Configuration
The example application uses the models and files in the Assets folder. If you want to configure the models and initial poses you created earlier yourself, you can download the configuration templates from the download section and adapt them. Templates are provided for the indoor fixed-view, indoor free-view, and outdoor free-view setups.
To use a tracking configuration file, place it in the same directory as the models, adjust the parameters to your setup, and load the models and the tracking configuration from code. For a start, setting the following parameters is sufficient:
- Line model: set the .obj line model in <EdgeAlignment/LineModel>
- Surface model: set the .obj surface model in <TriangleMesh>
- Initial pose parameters (one of the following is already set in each template):
  - Dummy tracking pose: for the indoor modes, set a dummy tracking configuration XML file containing the initial camera pose in <TrackingConfiguration>
  - Origin coordinates: for the geo-referenced outdoor mode, set the model origin's geo coordinates in <OriginCoordinates/Longitude> and <OriginCoordinates/Latitude>
- Sensor usage: for indoor applications, set <UseSensorsForPoseCreation> to off or gravity; for outdoor applications, set it to all
- Visibility test: for the fixed-view mode, set <EdgeAlignment/VisibilityTest> to off; for the free-view modes, set it to on. Note that in this BETA release the visibility test is computationally expensive for complex models of hundreds of triangles or more (see the next tutorial).
- Accuracy: higher values of <EdgeAlignment/NumFeatures> give higher accuracy but lower speed
- Search range: <EdgeAlignment/SearchRange> should be a fraction of the model size (e.g. 20%). If the model "sticks" to repetitive structures (such as the windows of a high-rise), reduce this value.
- Minimum quality: <EdgeAlignment/MinQuality> trades off detection rate against detection accuracy; it should lie between 0.5 and 0.9
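The search-range rule of thumb above can be turned into a quick calculation. This is a hedged sketch, assuming you know the model's bounding box in millimeters; `suggestSearchRange` is a hypothetical helper, not an SDK function:

```javascript
// Hypothetical helper: derives a starting value for <EdgeAlignment/SearchRange>
// (in mm, like the model) as a fraction of the model's bounding-box diagonal.
// 20% is the rule of thumb above; shrink it if the model sticks to repetitive structures.
function suggestSearchRange(bboxMinMm, bboxMaxMm, fraction) {
  fraction = fraction === undefined ? 0.2 : fraction;
  const dx = bboxMaxMm[0] - bboxMinMm[0];
  const dy = bboxMaxMm[1] - bboxMinMm[1];
  const dz = bboxMaxMm[2] - bboxMinMm[2];
  return fraction * Math.sqrt(dx * dx + dy * dy + dz * dz);
}
```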
For an overview of all parameters and their purpose, see the tracking configuration tutorial.
Indoor mode (fixed view)
The previous tutorial showed how to create a tracking configuration for the indoor fixed-view mode with the EdgeConfig tool of Creator 3.0. To use it, simply load the tracking configuration file and the corresponding models. If you want to track from several viewpoints, load several tracking configuration files, each with its corresponding set of models.
Note: the code examples below assume all data is stored in the TutorialEdgeBasedInitialization/Assets/Custom/ folder.

```javascript
switch (orientation)
{
    case LEFT: arel.Scene.setTrackingConfiguration("Assets/Custom/TrackingData_Left.xml"); break;
    case RIGHT: arel.Scene.setTrackingConfiguration("Assets/Custom/TrackingData_Right.xml"); break;
}
```
```java
boolean result = false;
switch (orientation)
{
    case LEFT: result = setTrackingConfiguration("TutorialEdgeBasedInitialization/Assets/Custom/TrackingData_Left.xml"); break;
    case RIGHT: result = setTrackingConfiguration("TutorialEdgeBasedInitialization/Assets/Custom/TrackingData_Right.xml"); break;
}
```
```objectivec
switch (orientation)
{
    case LEFT: [self setTrackingConfiguration:@"tutorialContent_crossplatform/TutorialEdgeBasedInitialization/Assets/Custom/TrackingData_Left.xml"]; break;
    case RIGHT: [self setTrackingConfiguration:@"tutorialContent_crossplatform/TutorialEdgeBasedInitialization/Assets/Custom/TrackingData_Right.xml"]; break;
}
```
```cpp
bool result = false;
switch (orientation)
{
    case LEFT: result = m_pMetaioSDK->setTrackingConfiguration("TutorialEdgeBasedInitialization/Assets/Custom/TrackingData_Left.xml"); break;
    case RIGHT: result = m_pMetaioSDK->setTrackingConfiguration("TutorialEdgeBasedInitialization/Assets/Custom/TrackingData_Right.xml"); break;
}
```
We load the augmentation into COS 1 and a visualization aid model into COS 2; the latter mainly shows the user the approximate viewpoint for fixed-view initialization. If you want augmentations to disappear behind the tracked object, you can additionally assign an occlusion model, which may be identical to the surface model used for edge-based initialization.

```javascript
var vizModel = arel.Scene.getObject("augmentation");
var vizAidModel = arel.Scene.getObject("visualization");
// optional occlusion model
var vizOcclusionModel = arel.Scene.getObject("occlusion");
vizOcclusionModel.setOccluding(true);
```
```java
public IGeometry loadModel(final String path)
{
    String modelPath = AssetsManager.getAssetPath(path);
    return metaioSDK.createGeometry(modelPath);
}

@Override
protected void loadContents()
{
    mVizModel = loadModel("TutorialEdgeBasedInitialization/Assets/Custom/VisualizationModel.obj");
    mVizAidModel = loadModel("TutorialEdgeBasedInitialization/Assets/Custom/VisualizationAidModel.obj");
    mOcclusionModel = loadModel("TutorialEdgeBasedInitialization/Assets/Custom/SurfaceModel.obj"); // optional occlusion model
    mVizModel.setCoordinateSystemID(1);
    mVizAidModel.setCoordinateSystemID(2);
    mOcclusionModel.setOcclusionMode(true);
    loadTrackingConfig();
}
```
```objectivec
-(metaio::IGeometry*)loadModel:(NSString*)path
{
    NSString* modelPath = [[NSBundle mainBundle] pathForResource:path ofType:@"" inDirectory:@""];
    return m_metaioSDK->createGeometry([modelPath UTF8String]);
}

- (void) viewDidLoad
{
    [super viewDidLoad];
    mVizModel = [self loadModel:@"tutorialContent_crossplatform/TutorialEdgeBasedInitialization/Assets/Custom/VisualizationModel.obj"];
    mVizAidModel = [self loadModel:@"tutorialContent_crossplatform/TutorialEdgeBasedInitialization/Assets/Custom/VisualizationAidModel.obj"];
    mOcclusionModel = [self loadModel:@"tutorialContent_crossplatform/TutorialEdgeBasedInitialization/Assets/Custom/SurfaceModel.obj"]; // optional occlusion model
    mVizModel->setCoordinateSystemID(1);
    mVizAidModel->setCoordinateSystemID(2);
    mOcclusionModel->setOcclusionMode(true);
}
```
```cpp
void loadContent()
{
    m_pVizModel = m_pMetaioSDK->createGeometry("Assets/Custom/VisualizationModel.obj");
    m_pVizAidModel = m_pMetaioSDK->createGeometry("Assets/Custom/VisualizationAidModel.obj");
    m_pOcclusionModel = m_pMetaioSDK->createGeometry("Assets/Custom/SurfaceModel.obj"); // optional occlusion model
    m_pVizModel->setCoordinateSystemID(1);
    m_pVizAidModel->setCoordinateSystemID(2);
    m_pOcclusionModel->setOcclusionMode(true);
    loadTrackingConfig();
}
```
The mode and tuning parameters mentioned above are stored in the tracking configuration XML file. You can also change them at runtime with the following commands:

```javascript
// Change mode parameters (result will be the new mode if successful)
arel.Scene.sensorCommand("setVisibilityTestEnabled", "off"); // "off" or "on"
arel.Scene.sensorCommand("setSensorUsage", "off"); // "off", "gravity" or "all"
// Change tuning parameters (result will be "Done" if successful)
arel.Scene.sensorCommand("setNumFeatures", "500"); // should be > 100
arel.Scene.sensorCommand("setSearchRange", "50"); // in [mm]
arel.Scene.sensorCommand("setMinQuality", "0.7"); // should be in [0.55, 0.85]
```
```java
String result;
// Change mode parameters (result will be the new mode if successful)
result = metaioSDK.sensorCommand("setVisibilityTestEnabled", "off"); // "off" or "on"
result = metaioSDK.sensorCommand("setSensorUsage", "off"); // "off", "gravity" or "all"
// Change tuning parameters (result will be "Done" if successful)
result = metaioSDK.sensorCommand("setNumFeatures", "500"); // should be > 100
result = metaioSDK.sensorCommand("setSearchRange", "50"); // in [mm]
result = metaioSDK.sensorCommand("setMinQuality", "0.7"); // should be in [0.55, 0.85]
```
```objectivec
std::string result;
// Change mode parameters (result will be the new mode if successful)
result = m_metaioSDK->sensorCommand("setVisibilityTestEnabled", "off"); // "off" or "on"
result = m_metaioSDK->sensorCommand("setSensorUsage", "off"); // "off", "gravity" or "all"
// Change tuning parameters (result will be "Done" if successful)
result = m_metaioSDK->sensorCommand("setNumFeatures", "500"); // should be > 100
result = m_metaioSDK->sensorCommand("setSearchRange", "50"); // in [mm]
result = m_metaioSDK->sensorCommand("setMinQuality", "0.7"); // should be in [0.55, 0.85]
```
```cpp
std::string result;
// Change mode parameters (result will be the new mode if successful)
result = m_pMetaioSDK->sensorCommand("setVisibilityTestEnabled", "off"); // "off" or "on"
result = m_pMetaioSDK->sensorCommand("setSensorUsage", "off"); // "off", "gravity" or "all"
// Change tuning parameters (result will be "Done" if successful)
result = m_pMetaioSDK->sensorCommand("setNumFeatures", "500"); // should be > 100
result = m_pMetaioSDK->sensorCommand("setSearchRange", "50"); // in [mm]
result = m_pMetaioSDK->sensorCommand("setMinQuality", "0.7"); // should be in [0.55, 0.85]
```
```xml
<Sensor Type="EdgeBasedInitializationSensorSource">
    <Parameters>
        <!-- this is only a subset of all parameters for tuning and mode selection -->
        <TriangleMesh>SurfaceModel.obj</TriangleMesh>
        <UseSensorsForPoseCreation>off</UseSensorsForPoseCreation> <!-- "off", "gravity" or "all" -->
        <EdgeAlignment>
            <LineModel>LineModel.obj</LineModel>
            <MinQuality>0.7</MinQuality> <!-- should be in [0.55, 0.85] -->
            <SearchRange>50</SearchRange> <!-- in [mm] -->
            <NumFeatures>500</NumFeatures> <!-- should be bigger than 100 -->
            <VisibilityTest>
                <Enabled>off</Enabled> <!-- "off" or "on" -->
            </VisibilityTest>
        </EdgeAlignment>
    </Parameters>
</Sensor>
```
Free-view mode
Free-view mode is meant for viewing the model from arbitrary angles and for manipulating the pose freely. The only difference from the indoor fixed-view mode is that you do not define camera poses with different line and surface models; a single complete line model and surface model is enough, because you can rotate and place them arbitrarily relative to the camera pose.
Edge-based initialization lets you rotate the camera around the coordinate origin or translate it sideways with simple commands, which has the same effect as moving the model on screen.
To move or rotate the camera, you can use code along these lines:

```javascript
// Add to implementation
arel.sceneReady(function()
{
    console.log("sceneReady");
    // set a listener on tracking to be notified when the image is tracked
    arel.Events.setListener(arel.Scene, function(type, param){trackingHandler(type, param);});
    EInteraction = {ROTATION : 0, TRANSLATION : 1};
    var interaction = EInteraction.ROTATION;
    var posX = 0.0, posY = 0.0, movedX = 0.0, movedY = 0.0;
    var moving = false;
    // Perform interaction while the mouse button is pressed
    document.onmousedown = function(event){moving = true;};
    document.onmouseup = function(event){moving = false;};
    document.onmousemove = function(evt){
        if (moving)
        {
            movedX = evt.clientX - posX;
            movedY = evt.clientY - posY;
            // Perform command depending on interaction mode
            if (interaction == EInteraction.ROTATION)
            {
                // rotate around z-axis (in degrees)
                arel.Scene.sensorCommand("rotatePose", movedX + " " + movedY, function(result){
                    if (result != "Done") console.log("Error: " + result);}
                );
            }
            else if (interaction == EInteraction.TRANSLATION)
            {
                // move camera up/down and sideways (in mm)
                // Note: the camera is moved in millimeters, so for a large model you may want
                // to scale the movement to be linear to the model size in the image
                arel.Scene.sensorCommand("translatePose", movedX + " " + movedY, function(result){
                    if (result != "Done") console.log("Error: " + result);}
                );
            }
        }
        // Remember current touch location for the next frame
        posX = evt.clientX;
        posY = evt.clientY;
    };
});
```
```java
// Extend / add to implementation
enum Interaction
{
    ROTATE,
    TRANSLATE
};

Interaction mInteractionMode;
int mTouchPosX;
int mTouchPosY;

final class MetaioSDKCallbackHandler extends IMetaioSDKCallback
{
    @Override
    public boolean onTouch(View v, MotionEvent event)
    {
        if (event.getAction() == MotionEvent.ACTION_MOVE)
        {
            // Obtain current touch location
            final int distX = (int) event.getX() - mTouchPosX;
            final int distY = (int) event.getY() - mTouchPosY;
            // Call the sensor command that manipulates the pose, depending on the
            // interaction mode, with the touch motion since the last frame as parameters
            if (mInteractionMode == Interaction.ROTATE)
            {
                // rotate around z-axis (in degrees)
                metaioSDK.sensorCommand("rotatePose", distX + " " + distY);
            }
            else if (mInteractionMode == Interaction.TRANSLATE)
            {
                // move camera up/down and sideways (in mm)
                // Note: the camera is moved in millimeters, so for a large model you may want
                // to scale the movement to be linear to the model size in the image
                metaioSDK.sensorCommand("translatePose", distX + " " + distY);
            }
        }
        // Remember current touch location for the next frame
        mTouchPosX = (int) event.getX();
        mTouchPosY = (int) event.getY();
        // Handle base events
        return super.onTouch(v, event);
    }
}
```
```objectivec
// Add to header
enum EInteractionMode
{
    TRANSLATION,
    ROTATION
};

CGPoint mTouchLocation;
EInteractionMode mInteractionMode;

// Add to implementation
- (void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Obtain and remember initial touch location
    UITouch *touch = [touches anyObject];
    mTouchLocation = [touch locationInView:glView];
}

- (void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Obtain current touch location
    UITouch *touch = [touches anyObject];
    CGPoint curTouchLocation = [touch locationInView:glView];
    // Set up command depending on interaction mode
    std::string command;
    switch (mInteractionMode)
    {
        case ROTATION: command = "rotatePose"; break;
        case TRANSLATION: command = "translatePose"; break;
    }
    // Use touch motion since last frame as parameters
    std::stringstream parameter;
    parameter << (curTouchLocation.x - mTouchLocation.x) << " " << (curTouchLocation.y - mTouchLocation.y);
    // Call the sensor command that performs the pose manipulation
    m_metaioSDK->sensorCommand(command, parameter.str());
    // Remember current touch location for the next frame
    mTouchLocation = curTouchLocation;
}
```
```cpp
// Extend / add to header
class Example : protected metaio::IMetaioSDKCallback
{
    Q_OBJECT

    enum EInteraction
    {
        ROTATION,
        TRANSLATION
    };

    EInteraction m_interactionMode;

    /// From IMetaioSDKCallback
    virtual void mouseMoveEvent(QGraphicsSceneMouseEvent *mouseEvent) override;

    metaio::IMetaioSDKWin32* m_pMetaioSDK;
};

// Extend / add to implementation
void Example::mouseMoveEvent(QGraphicsSceneMouseEvent *mouseEvent)
{
    if (m_leftMouseDown)
    {
        // Obtain current touch location
        QPointF moved = mouseEvent->screenPos() - mouseEvent->lastScreenPos();
        // Set up the command depending on the interaction mode,
        // using the touch motion since the last frame as parameters
        std::string command;
        std::stringstream parameter;
        if (m_interactionMode == ROTATION)
        {
            // rotate around z-axis (in degrees)
            command = "rotatePose";
            parameter << moved.x() << " " << moved.y();
        }
        else // m_interactionMode == TRANSLATION
        {
            // move camera up/down and sideways (in mm)
            // Note: the camera is moved in millimeters, so for a large model you may want
            // to scale the movement to be linear to the model size in the image
            command = "translatePose";
            parameter << moved.x() << " " << moved.y();
        }
        // Call the sensor command that performs the pose manipulation
        m_pMetaioSDK->sensorCommand(command, parameter.str());
    }
}
```
(Note: this example runs in the ARELPlayer. To run it on a mobile device, replace the mouse event listeners onmousedown, onmouseup, onmousemove with the corresponding touch event listeners ontouchstart, ontouchend, ontouchmove.)
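The comments in the code above note that "translatePose" moves the camera in millimeters, so for large models the raw pixel movement should be scaled to the model size in the image. One way to sketch that scaling, with a hypothetical helper and assuming a roughly isotropic on-screen scale:

```javascript
// Hypothetical helper: maps a touch movement in pixels to millimeters based on
// how large the model currently appears on screen, and builds the parameter
// string expected by the "translatePose" sensor command.
function touchToTranslateParams(movedXPx, movedYPx, modelSizeMm, modelSizePx) {
  const mmPerPixel = modelSizeMm / modelSizePx; // assume roughly isotropic scale
  const x = movedXPx * mmPerPixel;
  const y = movedYPx * mmPerPixel;
  return x + " " + y; // parameter string for sensorCommand
}
```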
In free-view mode, the system has to compute the visibility of the lines in the line model, so that lines occluded from the current viewpoint are correctly identified and removed.
You need to set the following two parameters:

```javascript
arel.Scene.sensorCommand("setVisibilityTestEnabled", "on"); // "off" or "on"
arel.Scene.sensorCommand("setVisibilityTestRate", "0.2"); // value between 0.001 (higher error) and 1.0 (higher accuracy), default 0.2
```
```java
String result;
result = metaioSDK.sensorCommand("setVisibilityTestEnabled", "on"); // "off" or "on"
result = metaioSDK.sensorCommand("setVisibilityTestRate", "0.2"); // value between 0.001 (higher error) and 1.0 (higher accuracy), default 0.2
```
```objectivec
std::string result;
result = m_metaioSDK->sensorCommand("setVisibilityTestEnabled", "on"); // "off" or "on"
result = m_metaioSDK->sensorCommand("setVisibilityTestRate", "0.2"); // value between 0.001 (higher error) and 1.0 (higher accuracy), default 0.2
```
```cpp
std::string result;
result = m_pMetaioSDK->sensorCommand("setVisibilityTestEnabled", "on"); // "off" or "on"
result = m_pMetaioSDK->sensorCommand("setVisibilityTestRate", "0.2"); // value between 0.001 (higher error) and 1.0 (higher accuracy), default 0.2
```
```xml
<Sensor Type="EdgeBasedInitializationSensorSource">
    <Parameters>
        <EdgeAlignment>
            <VisibilityTest>
                <Enabled>on</Enabled> <!-- "off" or "on" -->
                <TestRate>0.2</TestRate> <!-- value between 0.001 (higher error) and 1.0 (higher accuracy), default 0.2 -->
            </VisibilityTest>
        </EdgeAlignment>
    </Parameters>
</Sensor>
```
Note: you can manipulate the pose in fixed-view mode as well. But since the models were created for one specific viewpoint, you must make sure yourself that the camera pose stays within a suitable range, or switch automatically to another available pose.
After a successful edge-based initialization, the event listener receives a tracking event as usual, notifying you of the change in tracking state. When the system switches to markerless 3D tracking, an event with state REGISTERED fires. React to it as needed, e.g. by changing the visible models or other interaction content.

```javascript
function trackingHandler(type, param)
{
    if (param[0] != undefined)
    {
        if (type && type == arel.Events.Scene.ONTRACKING && param[0].getState() == arel.Tracking.STATE_REGISTERED)
        {
            console.log("received STATE_REGISTERED from tracking");
        }
    }
}
```
```java
final class MetaioSDKCallbackHandler extends IMetaioSDKCallback
{
    @Override
    public void onTrackingEvent(TrackingValuesVector trackingValues)
    {
        if (trackingValues.size() > 0 &&
            trackingValues.get(0).getState() == ETRACKING_STATE.ETS_REGISTERED)
        {
            mState = EState.TRACKING;
        }
    }
}
```
```objectivec
- (void) onTrackingEvent:(const metaio::stlcompat::Vector<metaio::TrackingValues>&)poses
{
    if (poses.size() > 0 && poses[0].state == metaio::ETRACKING_STATE::ETS_REGISTERED)
    {
        mState = EState::TRACKING;
    }
}
```
```cpp
void onTrackingEvent(const metaio::stlcompat::Vector<metaio::TrackingValues>& trackingValues)
{
    if (trackingValues.size() > 0 &&
        trackingValues[0].state == metaio::ETS_REGISTERED)
    {
        m_state = STATE_TRACKING;
    }
}
```
Outdoor mode
Outdoor mode determines the camera pose relative to a geo-referenced object, so only two things are required:
- The model must be geo-referenced: we need the GPS latitude and longitude of the origin of the model's coordinate system (see the LLA marker tutorial), and 1000.0f model units must correspond to 1 meter.
- The device must provide GPS, compass, and gravity sensor information.
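The geo convention in the first requirement (origin pinned at a latitude/longitude, 1000.0f model units per meter) can be illustrated with a small conversion sketch. This is not SDK code; it uses a simple equirectangular approximation that is only reasonable for short distances from the origin:

```javascript
// Hypothetical sketch: convert a nearby LLA position to local model coordinates
// (east/north, in model units) relative to the configured origin, using an
// equirectangular approximation. 1 meter corresponds to 1000 model units.
const EARTH_RADIUS_M = 6378137.0; // WGS84 equatorial radius

function llaToModelUnits(originLatDeg, originLonDeg, latDeg, lonDeg) {
  const toRad = Math.PI / 180.0;
  const northM = (latDeg - originLatDeg) * toRad * EARTH_RADIUS_M;
  const eastM = (lonDeg - originLonDeg) * toRad * EARTH_RADIUS_M * Math.cos(originLatDeg * toRad);
  return { east: eastM * 1000.0, north: northM * 1000.0 }; // meters -> model units
}
```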
Outdoor mode is configured in the tracking configuration file, where the origin coordinates of the model must be specified:

```xml
<Sensor Type="EdgeBasedInitializationSensorSource">
    <Parameters>
        <!-- this is only a subset of all parameters for outdoor pose configuration -->
        <OriginCoordinates>
            <Longitude>0.0</Longitude>
            <Latitude>0.0</Latitude>
        </OriginCoordinates>
        <UseSensorsForPoseCreation>all</UseSensorsForPoseCreation> <!-- outdoor requires "all" device sensors -->
    </Parameters>
</Sensor>
```
Note: only two GPS coordinates (latitude and longitude) are used, because altitude readings are very inaccurate. The default altitude is 1.65 meters above ground level, the average height at which a mobile device is held. Typically the model sits in a roughly planar environment whose ground plane nearly coincides with the zero-altitude plane, and then the 1.65 m default is accurate enough. The altitude of the initial pose is only used for the first estimate anyway; the subsequent refinement (correction) re-estimates it.
If your application must work from significantly different altitudes with a viewing-angle difference of 30 degrees or more, e.g. when users should be able to view the object from the foot and the summit of a hill or from different floors, let the user set the current altitude manually by moving the virtual camera, just as in the indoor free-view mode.
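A hedged sketch of that decision and of the corresponding camera move, assuming "translatePose" takes millimeter offsets as in the free-view examples; both helper names and the sign convention are hypothetical:

```javascript
// Hypothetical helper: does the elevation difference between user and object
// exceed the ~30 degree viewing-angle threshold mentioned above?
function needsHeightAdjustment(heightDiffM, horizontalDistM, thresholdDeg) {
  thresholdDeg = thresholdDeg === undefined ? 30 : thresholdDeg;
  const angleDeg = Math.atan2(Math.abs(heightDiffM), horizontalDistM) * 180 / Math.PI;
  return angleDeg >= thresholdDeg;
}

// Hypothetical helper: build the "translatePose" parameter string for a purely
// vertical camera move of the given height difference (meters -> mm).
function heightToTranslateParams(heightDiffM) {
  return "0 " + heightDiffM * 1000.0;
}
```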
Results
The fixed-view and free-view modes work as follows:
The visualization aid helps the user find the right viewpoint on screen. Once the user is close enough to the object (left image), edge-based initialization fires and markerless 3D tracking takes over (right image). In this example the intended augmentation is the aligned line model itself, so the same tube model is used in both COS (both load the same file).
In free-view mode, the user does not have to move around the physical object to initialize. Instead, you move the virtual camera with your finger/mouse or by tilting the device, rotating the model on screen. Initialization triggers once the virtual camera comes close to the device's real camera pose.
In outdoor mode, the device sensors provide the information needed for initialization, and the augmentations and other models carry geo coordinates.
Note: GPS and compass readings can be off by a wide margin, leaving the estimated pose too far away for edge-based initialization to succeed immediately. In unconstrained outdoor settings you should therefore let the user assist initialization interactively via camera rotation or translation, just as in the indoor free-view mode.
Summary
You should now know how to use edge-based initialization for tracking and be able to create your own tracking configurations. The next tutorial covers more advanced techniques, several additional parameters and options, and performance considerations.