Real-time analytics and dynamic policies are reshaping the way subscribers and interest groups are served by Communication Service Provider (CSP) networks. However, to keep pace with the increasing variance of QoS attributes driven by the specialization of digital streams across industries, CSPs must extend their approach and make their network platforms more reactive and tunable in response to granular demand. To achieve this goal, they must exploit new User Experience (UX) data sources to optimize the operation of their Carrier Cloud while reducing churn and opex. In this blog, we discuss the integration of Enterprise Application Performance Management (APM) UX KPIs with CSPs' policy frameworks to drive operational efficiencies and increased CSAT.
Even today, most mobile app developers and QA teams are busy working on Android/iOS applications, changing their user interfaces almost every quarter and updating their apps for every new iOS/Android release. I am curious whether people still find this work as interesting and glamorous as it was five years ago. I personally feel it's not.
With the virtualization economy on the rise, numerous application workloads and virtual network functions (VNFs) are leveraging virtualized infrastructure to reap the benefits of on-demand infrastructure, increasing the pace of innovation and reducing cost. The security of virtualized workloads and virtual infrastructure, however, needs to be considered a key constituent of the overall cloud defense-in-depth strategy.
We are never really happy living in the present and always love to discover or innovate 'the next big thing.' That is how we have advanced from 2G to 3G, and now to LTE and many of its advanced features, to satisfy our ever-increasing appetite for high-speed data, better user experience, and higher quality. High-speed data demands are leading to the use of higher MIMO configurations and higher-order modulation schemes. Although these advanced communication technologies are meant to provide high data throughput, there are certain scenarios where lower-rate transmission can provide better quality and increased user satisfaction. Maintaining quality becomes more challenging when the user equipment (UE) has to perform hand-ins/hand-outs between different cells, or when the channel quality estimated by the UE is poor. That is where the problem of packet loss and degraded quality creeps in. In this blog, I will describe one such problem scenario and an approach to a solution that supports this view.
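The trade-off above, that a lower-rate but more robust modulation can deliver better effective quality when channel conditions are poor, can be sketched with a toy goodput calculation. The bits-per-symbol values are standard (QPSK = 2, 16QAM = 4, 64QAM = 6), but the block error rate (BLER) figures below are purely hypothetical assumptions for a weak-signal scenario such as a cell edge during handover, chosen only to illustrate the point:

```python
# Illustrative sketch: effective throughput (goodput) of different
# modulation schemes under poor channel conditions.

def effective_throughput(bits_per_symbol, bler):
    """Goodput per symbol = raw rate x probability the block survives."""
    return bits_per_symbol * (1.0 - bler)

# Hypothetical BLER values at low SINR (assumed, not measured):
# the robust QPSK rarely loses blocks, the fragile 64QAM loses most.
schemes = {
    "QPSK":  (2, 0.05),   # 2 bits/symbol, 5% block loss (assumed)
    "16QAM": (4, 0.60),   # 4 bits/symbol, 60% block loss (assumed)
    "64QAM": (6, 0.80),   # 6 bits/symbol, 80% block loss (assumed)
}

for name, (bps, bler) in schemes.items():
    goodput = effective_throughput(bps, bler)
    print(f"{name}: {goodput:.2f} effective bits/symbol")
```

Under these assumed error rates QPSK comes out ahead, which is exactly why adaptive modulation and coding in LTE drops to a lower-order scheme when the UE reports poor channel quality.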
Solutions developed in the pre-cloud era may now struggle to adapt to the latest trends in technology and tools. Thus, software product companies currently either offer their solutions through a distribution model or host them in a private data center. Migrating to the cloud is not impossible, but it presents many choices and thus many challenges. It is not just about redesigning the current software, but also about establishing a new culture in the software development process: being truly agile and delivering changes to market more often without compromising quality.
Distributed, horizontally scalable platforms for storing and processing data, such as Hadoop, have seen very rapid development in the past six to seven years and have emerged as an obvious option for mainstream data applications. Still, much remains to be done to mature these technologies on enterprise-class features such as security, monitoring, and usability. In this blog, I will discuss different aspects and ideas that can help in building a secure big data solution ready for industrialization.