ARKit Official Documentation Translation (1): Understanding Augmented Reality

ZHANGKEFA posted on 2017-8-7 16:37:47
Understanding Augmented Reality
Notice: this article was first published on the ARinChina technical forum; please credit the source when reposting.
ARKit technical discussion QQ group: 482631386
ARVR training camp: www.arvrthink.com
Discover concepts, features, and best practices for building great AR experiences.
In other words: grasp the concepts, understand the features, put the best practices into use, and build awesome AR experiences.
The basic requirement for any AR experience—and the defining feature of ARKit—is the ability to create and track a correspondence between the real-world space the user inhabits and a virtual space where you can model visual content. When your app displays that content together with a live camera image, the user experiences augmented reality: the illusion that your virtual content is part of the real world.
The basic requirement for any AR experience (and the defining capability of ARKit) is to establish tracking between the real-world space the user occupies and a virtual space (world tracking, the same concept as in HoloLens), and to visualize virtual content once that tracking is established. When your app displays that virtual content within the live camera image, it gives the user the illusion that the content exists in the real world, and that illusion is the augmented-reality experience we deliver to the user.
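To make this concrete, here is a minimal sketch (my addition, not part of the original post) of an ARKit + SceneKit setup: an ARSCNView renders the live camera feed while world tracking keeps its SceneKit coordinate space aligned with the real world. The ViewController class name and the storyboard-connected sceneView outlet are assumptions. Note also that the early ARKit betas current when this translation was written named the configuration class ARWorldTrackingSessionConfiguration; the shipping API calls it ARWorldTrackingConfiguration.

import UIKit
import SceneKit
import ARKit

class ViewController: UIViewController {
    // Assumed to be connected in a storyboard. ARSCNView draws the camera image
    // and a SceneKit scene whose coordinates ARKit keeps aligned with the real world.
    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Place a 10 cm cube half a meter in front of the initial camera position.
        // Thanks to world tracking it appears fixed in the room, not glued to the screen.
        let cube = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0))
        cube.position = SCNVector3(0, 0, -0.5)
        sceneView.scene.rootNode.addChildNode(cube)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Start world tracking (ARWorldTrackingSessionConfiguration in early betas).
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}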
How World Tracking Works
To create a correspondence between real and virtual spaces, ARKit uses a technique called visual-inertial odometry. This process combines information from the iOS device’s motion sensing hardware with computer vision analysis of the scene visible to the device’s camera. ARKit recognizes notable features in the scene image, tracks differences in the positions of those features across video frames, and compares that information with motion sensing data. The result is a high-precision model of the device’s position and motion.
To establish tracking between real and virtual space, ARKit uses a technique called visual-inertial odometry. In essence, it combines hardware (the motion sensors) with software (computer-vision analysis). ARKit recognizes notable feature points in the real scene (that is, feature points in the images the camera is currently capturing), tracks how the positions of those features change across video frames, and compares that data against the motion-sensor readings. The result is a high-precision model of the device's position and motion.
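As a small illustration of consuming that result (my addition, assuming the view controller from the earlier sketch is set as sceneView.session.delegate), the sketch below reads the per-frame camera transform, which is exactly the "high-precision model of the device's position and motion" that visual-inertial odometry produces.

import ARKit

extension ViewController: ARSessionDelegate {
    // Called for every new camera frame; requires sceneView.session.delegate = self.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // frame.camera.transform is the device pose in world coordinates (a 4x4 matrix);
        // its fourth column holds the translation.
        let position = frame.camera.transform.columns.3
        print(String(format: "device at x: %.2f  y: %.2f  z: %.2f (meters)",
                     position.x, position.y, position.z))
    }
}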
World tracking also analyzes and understands the contents of a scene. Use hit-testing methods (see the ARHitTestResult class) to find real-world surfaces corresponding to a point in the camera image. If you enable the planeDetection setting in your session configuration, ARKit detects flat surfaces in the camera image and reports their position and sizes. You can use hit-test results or detected planes to place or interact with virtual content in your scene.
Once this data has been gathered, world tracking also analyzes and understands the contents of the scene (comparable to Spatial Understanding in HoloLens). Use hit-testing (see the ARHitTestResult class for details) to find where a real-world surface lies behind a given point in the camera image, much like a ray cast. If you enable planeDetection in the session configuration, ARKit detects flat surfaces and reports their positions and sizes in the camera image, so you can place virtual objects on those planes.
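A sketch of both APIs mentioned above (my addition): enabling planeDetection on the session configuration and hit-testing a screen point against detected planes. The runSessionWithPlaneDetection and handleTap names are hypothetical helpers, and sceneView is the ARSCNView from the earlier sketch.

import UIKit
import SceneKit
import ARKit

extension ViewController {
    // Call this instead of the plain configuration in viewWillAppear to also
    // detect flat horizontal surfaces.
    func runSessionWithPlaneDetection() {
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // Hypothetical tap handler, wired to a UITapGestureRecognizer on sceneView.
    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)
        // Ask ARKit which real-world surfaces lie under this screen point,
        // preferring planes it has already detected and measured.
        let results = sceneView.hitTest(point, types: [.existingPlaneUsingExtent,
                                                       .estimatedHorizontalPlane])
        if let nearest = results.first {
            // Anchor virtual content at the hit location; content attached to this
            // anchor in the renderer delegate will appear to sit on the real surface.
            sceneView.session.add(anchor: ARAnchor(transform: nearest.worldTransform))
        }
    }
}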
Best Practices and Limitations
World tracking is an inexact science. This process can often produce impressive accuracy, leading to realistic AR experiences. However, it relies on details of the device’s physical environment that are not always consistent or are difficult to measure in real time without some degree of error. To build high-quality AR experiences, be aware of these caveats and tips.
World tracking is not perfectly precise, yet it often produces impressive accuracy and lets users feel the magic of AR. However (and the "however" is the important part), the experience depends on the physical environment around you, which is not always consistent and sometimes introduces error. To build high-quality AR experiences, keep the following tips in mind:
Design AR experiences for predictable lighting conditions. World tracking involves image analysis, which requires a clear image. Tracking quality is reduced when the camera can’t see details, such as when the camera is pointed at a blank wall or the scene is too dark.
Design AR experiences for predictable lighting conditions: world tracking relies on image analysis and therefore needs a clear image, so tracking quality depends heavily on the lighting of the real environment. Tracking quality drops when the camera cannot see details, for example when it is pointed at a blank wall or the scene is too dark, and we should avoid putting the camera in such situations.
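Apple's advice here is a design guideline rather than an API, but ARKit does deliver a per-frame light estimate (enabled by default for world tracking) that an app can use to notice a very dark scene and prompt the user. A small sketch under those assumptions, my addition; the 100-lumen cut-off is purely illustrative.

import ARKit

extension ViewController {
    // True when the latest frame's light estimate suggests the scene is too dark
    // for good tracking. ambientIntensity is in lumens; roughly 1000 corresponds
    // to a well-lit scene, and the 100 threshold is only an illustrative value.
    var sceneLooksTooDark: Bool {
        guard let estimate = sceneView.session.currentFrame?.lightEstimate else { return false }
        return estimate.ambientIntensity < 100
    }
}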
Use tracking quality information to provide user feedback. World tracking correlates image analysis with device motion. ARKit develops a better understanding of the scene if the device is moving, even if the device moves only subtly. Excessive motion—too far, too fast, or shaking too vigorously—results in a blurred image or too much distance for tracking features between video frames, reducing tracking quality. The ARCamera class provides tracking state reason information, which you can use to develop UI that tells a user how to resolve low-quality tracking situations.
Use tracking-quality information to give the user feedback: world tracking correlates image analysis with device motion, and ARKit understands the scene better when the device is moving, even only subtly. Moving too fast, moving too far, or shaking the device vigorously blurs the image or makes features travel too far between video frames, which degrades or breaks tracking. The ARCamera class provides the tracking state and the reason for it, which you can surface in your UI to help the user recover from low-quality tracking.
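A sketch of surfacing the ARCamera tracking state (my addition), assuming the view controller is the session delegate as in the earlier sketches and has a hypothetical statusLabel, a UILabel, for user-facing hints.

import UIKit
import ARKit

extension ViewController {
    // Part of ARSessionObserver (inherited by ARSessionDelegate); called whenever
    // tracking quality changes. statusLabel is a hypothetical UILabel outlet.
    func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) {
        switch camera.trackingState {
        case .normal:
            statusLabel.text = ""
        case .notAvailable:
            statusLabel.text = "Tracking is not available"
        case .limited(.excessiveMotion):
            statusLabel.text = "Slow down: the device is moving too fast"
        case .limited(.insufficientFeatures):
            statusLabel.text = "Point the camera at a more detailed, better-lit surface"
        case .limited:
            statusLabel.text = "Tracking quality is limited"
        }
    }
}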
Allow time for plane detection to produce clear results, and disable plane detection when you have the results you need. Plane detection results vary over time—when a plane is first detected, its position and extent may be inaccurate. As the plane remains in the scene over time, ARKit refines its estimate of position and extent. When a large flat surface is in the scene, ARKit may continue changing the plane anchor’s position, extent, and transform after you’ve already used the plane to place content.
Give ARKit time to detect and produce clear plane information, and turn plane detection off once you have the results you need. Plane-detection results change over time: when a plane is first detected, its position and extent may be inaccurate (not enough data has been gathered yet), and ARKit keeps refining them as the plane remains in view. When a large flat surface is in the scene, ARKit may keep changing the plane anchor's position (a plane anchor is similar to a World Anchor in HoloLens), extent, and transform even after you have already placed content on the plane, which is why you should disable plane detection once it has served its purpose; a sketch of this follows below.
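The sketch below (my addition) shows one way to follow this advice, assuming the view controller is also set as sceneView.delegate: watch ARKit refine a detected plane via the didUpdate callback, and once the plane is "good enough" (the 0.5 m threshold is an arbitrary example), re-run the session with plane detection switched off.

import SceneKit
import ARKit

extension ViewController: ARSCNViewDelegate {
    // ARKit calls this repeatedly as it refines an already-detected plane.
    // Remember to set sceneView.delegate = self, e.g. in viewDidLoad.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }
        print("plane refined: center \(plane.center), extent \(plane.extent)")

        // Arbitrary example criterion: once the plane is at least 0.5 m x 0.5 m,
        // stop plane detection so the anchor stops shifting under placed content.
        if plane.extent.x > 0.5 && plane.extent.z > 0.5 {
            DispatchQueue.main.async {
                let configuration = ARWorldTrackingConfiguration()
                configuration.planeDetection = []   // turn plane detection off
                // Re-running without reset options keeps existing anchors and tracking.
                self.sceneView.session.run(configuration)
            }
        }
    }
}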
Scarlett_1990 posted on 2017-11-16 10:04:55
ARKit beginner-to-advanced video tutorial:
Manew (蛮牛): http://edu.manew.com/course/396