This article was contributed by Mayur Rustagi, cofounder and CTO at Sigmoid.
Data and analytics have been a dominant force in increasing the transparency, speed, and decision-making capability of organizations for years, and they will continue to be a driving force in 2022 and beyond. They make organizations more resilient, especially in the wake of a pandemic that led many to recognize the vast goldmine of unexplored data they hold. Companies are now reinventing themselves culturally and technologically to utilize that data to the fullest and become digital-first. Here is a look at some data and analytics trends that we expect to take center stage in 2022.
1. MLOps accelerates AI adoption
While computing power and massive datasets have made it easier than ever to build AI models, the harsh reality is that only 53% of AI proofs of concept make it to production. And even fewer manage to deliver the intended, measurable business value. MLOps, or AI engineering, is emerging as a dominant trend to define the best practices and processes for taking machine learning models to production and operationalizing them in real-world contexts. It enables data scientists, data engineers, and managed service professionals to work collaboratively to deploy, monitor, and govern machine learning services. Advances in MLOps will allow businesses to create stable machine learning models with application-level quality. In fact, Gartner predicts that by 2025, the 10% of enterprises that establish AI engineering best practices will generate at least three times more value from their AI efforts than the 90% of enterprises that do not.
2. Explosion of enterprise data opens new avenues for analytics and digital transformation in 2022
Increased consumer touchpoints have given companies access to significantly more consumer data than ever before. However, capturing enterprise data, such as manufacturing and sales records, has historically been a significant challenge. That is no longer the case. Enterprise data is being recorded, stored, and made accessible more than ever, thanks to the proliferation of IoT devices, sensors, transaction records, cloud migration initiatives, digitization, and the need to effectively manage the hierarchical master data generated across departments. Companies are now looking to explore this goldmine of data and leverage it to become more competitive, efficient, and innovative, providing an opportunity for enterprise architecture leaders to design data initiatives and enable faster decision-making.
3. Mature data governance ensures data availability and security
A large chunk of data remains unused due to data management and security challenges. Moreover, 30% of total enterprise time is spent on non-value-added tasks because of poor data quality and availability. Mature data governance advances the data management process and helps companies measure and understand how well they manage their data. A robust data governance policy allows companies to maximize the value of their data, manage the risk of data misuse, and ensure that the appropriate people have access to the right data; it also helps organizations tailor data approaches to specific teams and projects. A mature data governance framework addresses these challenges by standardizing data systems, promoting data transparency, and enabling self-service analytics.
4. Data analytics becomes a key driver for 2022 business resiliency
Businesses are focusing on operational agility and resilience to bounce back from adversity, and analytics is a key driver of both. This is especially true for supply chains, which saw significant disruption in transportation, organizational complexity, and raw material procurement due to COVID-19. A survey found that over 87% of supply chain professionals plan to invest in resilience and agility over the next two years. Sourcing raw materials from different geographies is a significant challenge for companies, making procurement analytics and forecasting a critical requirement. Procurement analytics surfaces potential risks in the supply chain and improves the ability to respond to internal and external obstacles, increasing transparency and supporting better decision-making.
5. Leveraging platform-specific services marks the evolution of multi-cloud
Traditional multi-cloud services eliminated reliance on a single cloud provider, leveraging two or more public or private clouds to distribute computing resources and minimize the risk of downtime and data loss. The multi-cloud approach is now evolving: companies want not only to switch between clouds but also to use specialized platforms such as Databricks and Snowflake to leverage the best of their offerings. While a traditional multi-cloud split might have looked like 50% Google and 50% Amazon, it can now be 60% Amazon, 20% Databricks, 10% Confluent, and 10% MongoDB. A company may be using Google or AWS but want Databricks for Spark or Snowflake for data warehousing because those vendors are best-in-class for those offerings.
Companies are experimenting with these specialized multi-cloud models using niche services from the best vendors. This approach is not only cost-effective but also allows companies to utilize highly competitive services and add developmental capabilities without adding specialized staff.