As the world expects the next wave of cars to talk to homes and refrigerators to communicate with phones, interoperability is a critical need for the Internet of Things (IoT). One step toward it is mobile cloud computing: if fully realized, it will simplify hardware architectures and domain-specific software needs, erasing integration borders among IoT-enabled devices.
Gartner estimates that approximately 4.9 billion objects will be connected to the Internet of Things (IoT) in 2015. Consumer applications will drive the number of connected devices, but enterprise deployments will account for most of the revenue, according to the research firm. Fitness bands and smartwatches will account for some of the roughly 2.9 billion consumer IoT devices in 2015, Gartner predicts.
Display technologies are evolving at an unprecedented rate. They have evolved so much that we can now see the minutest picture details on our devices, be they television sets, tablets or smartphones. Not long ago we started talking about high-definition content; then full HD gave a new dimension to the multimedia experience; and now we are talking about ultra HD (UHD), which delivers four times the picture resolution of 1080p full HD.
Video has long been one of the most computationally intensive workloads on mobile and consumer devices, and as display resolutions grow, the demands of video processing grow with them. The latest video coding standards, HEVC (H.265) and VP9, need far more processing power than their predecessors, H.264 and VP8 respectively. With current silicon technology, CPU clock speeds cannot be raised beyond a certain point due to thermal issues. However, chip makers have recently launched heterogeneous Systems on Chip (SoCs) with multiple processing units that can deliver the compute performance video algorithms increasingly demand. The Samsung® Exynos™, NVIDIA® Tegra® and Qualcomm® Snapdragon™ chipset series, to name just a few, are powered by the ARMv7 architecture and incorporate multiple CPU cores (clocked as high as 2.5GHz) along with GPU compute capability. These platforms undoubtedly give video software makers far greater computational power, but programmers must design and architect their software for parallelism to extract maximum performance from multi-core systems.
In tech, there are two ways to become “the next big thing.” The first is the invention that changes everything overnight and instantly becomes ubiquitous - think the first iPhone in 2007. The second is for a type of technology to grow slowly, almost unnoticed, until it reaches a point where you realize it is everywhere. That is what happened with wearable tech. Wearable devices have been around a long time, but the Internet of Things has made them far more versatile, and now every company with a credible technology department is trying its hand in this field. How can you increase the likelihood that your unique end product will rise to the top?
Aricent recently hosted a webinar titled “Engineering the future of IoT”. In this webinar, our experts discussed the evolution of IoT and the engineering challenges involved in developing and integrating IoT applications.