Even today, most mobile app developers and QA teams are busy working on Android and iOS applications, reworking their user interfaces almost every quarter and updating their apps for every new iOS or Android release. I am curious to know whether people still find this work as interesting and glamorous as it was five years ago. I personally feel it is not.
With the virtualization economy on the rise, numerous application workloads and virtual network functions (VNFs) are moving to virtualized infrastructure to reap the benefits of on-demand provisioning, increasing the pace of innovation and reducing cost. The security of virtualized workloads and of the virtual infrastructure itself, however, must be treated as a key constituent of the overall cloud defense-in-depth strategy.
We are never really happy living in the present and always love to discover or innovate 'the next big thing.' That is how we have advanced from 2G to 3G, and now to LTE and many of its advanced features, to satisfy our ever-growing appetite for high-speed data, a better user experience and higher quality. High-speed data demands are driving the use of higher-order MIMO configurations and higher modulation schemes. Although these advanced techniques are meant to deliver high data throughput, there are scenarios in which transmitting at a lower data rate actually yields better quality and greater user satisfaction. Maintaining quality becomes especially challenging when the user equipment (UE) has to perform hand-ins and hand-outs between different cells, or when the channel quality estimated by the UE is poor. That is where packet loss and degraded quality creep in. In this blog, I will describe one such problem scenario and an approach to a solution that supports my view, starting with a small sketch of the underlying trade-off below.
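To make that trade-off concrete, here is a minimal Python sketch of CQI-driven modulation selection. The thresholds, code rates and block error rate (BLER) figures are illustrative assumptions of mine, not values from any 3GPP specification (the real CQI-to-MCS mapping is defined in 3GPP TS 36.213 and is far more fine-grained); the point is only to show why a lower-order scheme can carry more usable data when the channel degrades during a handover.

```python
# Illustrative sketch: conservative modulation under poor channel quality.
# All thresholds, code rates and BLER values below are made up for
# illustration; real mappings come from 3GPP TS 36.213.

# (min_cqi, modulation, bits_per_symbol, approx_code_rate)
MCS_TABLE = [
    (13, "64QAM", 6, 0.85),
    (10, "64QAM", 6, 0.60),
    (7,  "16QAM", 4, 0.50),
    (4,  "QPSK",  2, 0.40),
    (0,  "QPSK",  2, 0.12),
]

def select_mcs(cqi: int):
    """Pick the most aggressive scheme whose CQI threshold the UE report meets."""
    for min_cqi, modulation, bits, rate in MCS_TABLE:
        if cqi >= min_cqi:
            return modulation, bits, rate
    return MCS_TABLE[-1][1:]

def effective_goodput(bits_per_symbol: float, code_rate: float, bler: float) -> float:
    """Relative goodput: raw spectral efficiency scaled by successful transmissions."""
    return bits_per_symbol * code_rate * (1.0 - bler)

modulation, bits, rate = select_mcs(cqi=3)      # poor reported channel quality
print(modulation)                               # -> QPSK

# At cell edge / during handover, a high-order MCS suffers heavy block errors,
# so the nominally "slower" scheme delivers more usable data:
print(effective_goodput(6, 0.85, bler=0.90))    # 64QAM under poor SINR -> 0.51
print(effective_goodput(2, 0.40, bler=0.05))    # QPSK under same SINR  -> 0.76
```

The numbers are deliberately extreme, but the shape of the result is the point I will build on: past a certain BLER, backing off to a lower modulation scheme improves both throughput and perceived quality.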
Solutions developed in the pre-cloud era may now come with many limitations when adapting to the latest trends in technology and tools. As a result, software product companies currently either offer their solutions through a distribution model or host them in a private data center. Migrating to the cloud is not impossible, but it presents many choices and therefore many challenges. It is not just about redesigning the current software; it is also about establishing a new culture in the software development process, becoming truly agile, and delivering changes to market more often without compromising quality.
Distributed, horizontally scalable platforms for storing and processing data, such as Hadoop, have seen very rapid development over the past six to seven years and have emerged as an obvious option for mainstream data applications. Still, much remains to be done to mature these technologies in enterprise-class areas such as security, monitoring and usability. In this blog, I will discuss different aspects and ideas that can help in building a secure big data solution ready for industrialization, beginning with the configuration sketch that follows.
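As a concrete starting point, the configuration sketch below shows the kind of core-site.xml settings that move a stock Hadoop cluster away from its default trust-the-client mode towards authenticated, authorized and encrypted RPC. The three property names are standard Hadoop settings; everything a real deployment also needs, such as Kerberos principals, keytabs and per-service configuration, is deliberately omitted here.

```xml
<!-- core-site.xml: a minimal hardening sketch, not a complete secure setup. -->
<configuration>
  <!-- Require Kerberos tickets instead of trusting the client-supplied user name. -->
  <property>
    <name>hadoop.security.authentication</name>
    <value>kerberos</value>
  </property>
  <!-- Enforce service-level authorization checks on RPC calls. -->
  <property>
    <name>hadoop.security.authorization</name>
    <value>true</value>
  </property>
  <!-- Protect RPC traffic: "privacy" adds encryption on top of
       authentication and integrity checking. -->
  <property>
    <name>hadoop.rpc.protection</name>
    <value>privacy</value>
  </property>
</configuration>
```

Authentication and wire protection of this sort are only one layer; the later sections of this discussion also have to cover authorization within HDFS, auditing and perimeter controls to arrive at a defensible end-to-end design.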
Lately, every organization that builds software products and/or services has been striving to develop new offerings for the cloud, or to move existing ones there.
Start-ups and small organizations see it as an obvious choice, or rather the only viable option, since it helps them avoid the upfront costs of infrastructure set-up and maintenance and significantly reduces their time-to-market. Larger organizations with well-established products and services see it as an opportunity to offer greater flexibility and cost efficiency to their customers, making their products more competitive and future-proof. It is a win-win proposition for all stakeholders: OEMs, vendors, service providers and customers.