Google Pixel 4 XL Teardown


Step 12
  • Next we pry out this hunk of ... stuff, which turns out to be an earpiece speaker, mic, ambient light sensor (AMS TMD3702VC), and the Soli chip, for interpreting your gestures using the power of radar.

  • Google calls this implementation of its in-house Project Soli Motion Sense.

  • Although radar technology has been in use for a long time and seems simple enough on paper, we're at a loss as to how Google stuffed the entire system into a tiny featureless rectangle with no moving parts.

  • Motion Sense works by emitting precisely tuned waves of electromagnetic energy. When those waves bounce off of something (like your hand), some of them reflect back to the antenna.

  • The Soli chip then studies the reflected waves and analyzes their time delay, frequency shift, and other data to learn the characteristics of the object that reflected them—how big it is, how fast it's moving, in which direction, etc.

  • Soli then runs that data against its known gesture database to determine what action, if any, needs to be performed in the OS.

  • TL;DR: magic rectangle knows your every move.
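The radar principles in the bullets above can be sketched in a few lines. This is an illustrative toy, not Google's actual Soli pipeline: the 60 GHz carrier is Soli's published band, but the numbers, the tiny gesture "database", and the nearest-neighbor matcher are all assumptions made up for this sketch.

```python
import math

C = 299_792_458.0   # speed of light, m/s
CARRIER_HZ = 60e9   # Soli radar operates around the 60 GHz band

def distance_from_delay(round_trip_s: float) -> float:
    """Time delay -> distance. The wave travels out and back, so halve it."""
    return C * round_trip_s / 2.0

def velocity_from_doppler(shift_hz: float) -> float:
    """Doppler frequency shift -> radial velocity (positive = approaching)."""
    return shift_hz * C / (2.0 * CARRIER_HZ)

# Hypothetical known-gesture database: (distance m, velocity m/s) -> label.
GESTURES = {
    (0.10, 0.5): "reach",
    (0.15, -0.5): "withdraw",
    (0.12, 0.0): "hover",
}

def classify(distance_m: float, velocity_ms: float) -> str:
    """Nearest-neighbor match of the measured features against the database."""
    nearest = min(
        GESTURES,
        key=lambda g: math.hypot(g[0] - distance_m, g[1] - velocity_ms),
    )
    return GESTURES[nearest]

# A reflection arriving 1 ns after emission, shifted up by 200 Hz:
d = distance_from_delay(1e-9)      # ~0.15 m from the antenna
v = velocity_from_doppler(200.0)   # ~0.5 m/s, moving toward the phone
print(classify(d, v))              # -> "reach"
```

The real chip works on whole reflected waveforms (and many more features than two numbers), but the shape of the pipeline is the same: physics turns raw echoes into object characteristics, then a matcher turns characteristics into a gesture.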



All contributions are licensed under Creative Commons (CC BY-NC-SA).