Over the past year, many customers have approached us asking about I3C, its benefits, and its market adoption. Compared to other sensor interfaces in use today, such as I2C and SPI, the benefits of I3C are clear. The proliferation of sensor interfaces causes unnecessary fragmentation, and the legacy interfaces are not optimized for power-efficient communication with sensors. These drawbacks limit the application and scalability of sensors in mobile, IoT, embedded, and automotive environments.
Posts by Hezi Saar:
Microsoft shared some interesting details about its 28nm SoC design targeting an augmented reality headset. It is not just a processor: it is a custom vision processor, which Microsoft calls the HoloLens Processing Unit (HPU), as it specifically targets augmented (and possibly virtual) reality needs. It is very interesting to see the computing power implemented on the chip to accommodate the imaging algorithms it runs. The interfaces on this chip are referenced as PCIe, DDR, and MIPI. Since the HPU uses several camera interfaces plus depth and motion sensors for image identification and processing and for recognizing gestures, it is clear that MIPI camera and display interfaces are probably used extensively. Per the die plot provided, the MIPI interfaces take up a very small area of the processor compared to the dominant computing blocks, which comprise 24 cores.
As electronics become smarter and require less human intervention, the machines around us are capable of doing more, making decisions based on their environment and conditions. To facilitate this, more sensors are used in electronic devices: it is common to see more than 12 sensors in the latest smartphones on the market, and this smarter-device trend extends beyond mobile to markets such as consumer, industrial, and automotive.
I received several inquiries about the adoption rate of physical layers across the mobile and adjacent industries after posting the video showing D-PHY v1.2 silicon on 16nm. I realize it is debatable whether it is fair to compare one spec against another. However, I would note that a de-facto standard carries a lot of weight, and that is what sets it apart from other potential specifications that only a few vendors select. Once a standard is well adopted across the industry, it establishes an entrenched position and cannot be replaced instantly. Any potential replacement standard needs to take backwards compatibility into consideration to ensure that vendors' investment in the de-facto standard continues to bear fruit. Replacing a successful standard requires a phased approach, and it will only succeed if the replacement has proven benefits over the de-facto standard and the transition period is neither long nor painful.
In my last post I discussed how to reduce display data transmission using display compression technology. Reducing transmitted traffic while supporting a higher link rate reduces the pin count, power consumption, and area (cost) of an implementation. At the October 2015 MIPI face-to-face meeting, we (Synopsys) showed the industry's first D-PHY v1.2 running at 2.5Gbps per lane in 16nm silicon. The setup comprised two D-PHY testchip boards, one D-PHY acting as Rx and the other as Tx, connected to test equipment to provide stimulus and capture the results.
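As a back-of-the-envelope illustration of why a higher per-lane rate and compression both reduce pin count, here is a minimal sketch. The resolution, refresh rate, bits per pixel, and 3:1 compression ratio are illustrative assumptions (not figures from the demo), and protocol and blanking overhead are ignored.

```python
import math

def lanes_needed(width, height, fps, bpp, lane_gbps, compression=1.0):
    """Number of serial lanes needed to carry a video stream's raw
    payload, optionally reduced by a compression ratio (overhead ignored)."""
    payload_gbps = width * height * fps * bpp / compression / 1e9
    return math.ceil(payload_gbps / lane_gbps)

# Illustrative 2560x1440 display at 60 fps, 24 bits per pixel (~5.3 Gbps):
print(lanes_needed(2560, 1440, 60, 24, 1.5))                 # 1.5 Gbps/lane -> 4 lanes
print(lanes_needed(2560, 1440, 60, 24, 2.5))                 # 2.5 Gbps/lane -> 3 lanes
print(lanes_needed(2560, 1440, 60, 24, 2.5, compression=3))  # with 3:1 compression -> 1 lane
```

The same arithmetic explains the trade-off in the post: raising the per-lane rate shrinks the lane count, and compressing the pixel stream shrinks it further.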
JEDEC UFS (Universal Flash Storage) v1.1 is a standard promoted by JEDEC JC-64.1 that aims to replace eMMC as a scalable, high-performance non-volatile memory interface in mobile and consumer electronics. JC-64.1 is the same group that develops eMMC, meaning it sees the transition from eMMC to UFS coming and is prepared for it. It is fair to assume that UFS will first be used in high-end mobile applications such as high-end smartphones, tablets, and Ultrabooks, and will compete with eMMC in some lower-end applications. Long term, assuming high-volume manufacturing reduces UFS device costs, we will see UFS replace eMMC, but there is a long way to go until we reach that point.
The second quarter of 2011 will be remembered as an inflection point in the mobile industry, with Apple becoming the world's top smartphone vendor by shipping 20.3 million smartphones. Nokia continued its decline, down 34 percent from the same quarter last year with 16.7 million units shipped in the second quarter, while vendors like HTC and Samsung continued to trail behind. It is important to note, however, that Samsung showed fantastic market share growth, from a slim 3% in the first quarter of 2010 to 13% of smartphones sold in the first quarter of this year. Other growing OEMs include HTC, Sony Ericsson, and LG. Apple's iPhone dominance in smartphones does not translate to leadership in smartphone operating systems, where Android leads, mostly because the Android operating system is used by multiple smartphone manufacturers. According to the latest report by IMS Research, smartphone shipments in 2011 will reach 420 million units, taking 28% of the total handset market. The research firm also predicts smartphone volumes will reach 1 billion units in 2016, driven mostly by lower-end smartphones. Here is a graph (courtesy of IMS Research) comparing OEM smartphone shipments in Q1 2011 with Q1 2010:
A subsystem is defined as "a group of independent but interrelated elements comprising a unified whole" (www.thefreedictionary.com/subsystem). What the term refers to depends on the context of the system in question. For example, a mobile SoC interfacing with a camera requires a camera subsystem to handle the entire interface, from the external image sensor (camera) up to the internal processor that receives the extracted data for further tasks such as sending to a display, post-processing, packing, storing, and transmitting. The camera subsystem therefore comprises a physical layer, which handles all high-speed transmission and signaling, and a protocol layer, which unpacks the pixels and sends them to the processor. Similarly, the SoC's display subsystem takes care of grabbing the data, packing it properly, generating all sync signals, and transmitting it out via the physical interface and through the traces to the embedded display. Here is a diagram that illustrates the building blocks of the camera subsystem (yellow square) and display subsystem (light blue square) on the SoC host device communicating with an external image sensor and display.
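To make the protocol layer's pixel-unpacking step concrete, here is a minimal software sketch of unpacking the CSI-2 RAW10 format, in which four 10-bit pixels are packed into five bytes: four bytes carrying the 8 MSBs of each pixel, followed by one byte holding the four 2-bit LSB pairs. This is an illustrative model of what the protocol layer does, not the hardware implementation.

```python
def unpack_raw10(payload):
    """Unpack CSI-2 RAW10 payload bytes into 10-bit pixel values.

    Each 5-byte group holds four pixels: bytes 0-3 are the 8 MSBs of
    pixels 0-3, and byte 4 packs the 2 LSBs of each pixel (pixel 0's
    LSBs in bits [1:0], pixel 1's in bits [3:2], and so on).
    """
    pixels = []
    for i in range(0, len(payload), 5):
        msbs, lsb_byte = payload[i:i + 4], payload[i + 4]
        for n, msb in enumerate(msbs):
            pixels.append((msb << 2) | ((lsb_byte >> (2 * n)) & 0x3))
    return pixels

# One 5-byte group yields four 10-bit pixels:
print(unpack_raw10([0xFF, 0x00, 0xAA, 0x55, 0b11100100]))  # [1020, 1, 682, 343]
```

In silicon this unpacking is done by dedicated logic in the camera subsystem's protocol layer, but the bit manipulation is the same.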