11 December 2024
Big data has been likened to the oxygen fuelling the fire of Industry 4.0 but as data sets become more complex and distributed, the way businesses analyse them will need to change.
Although data analytics has become a major component of modern business decision-making, data quality issues continue to cost businesses an average of US$12.9 million a year.
While there are myriad reasons for these data quality issues, one major driver is that the traditional model of analysis requires businesses to create a centralised platform for all their data – duplicating information from unconnected sets into a new unified format and then processing it.
Moving data from multiple sources (and often in multiple formats) into a ‘single source of truth’ is an expensive and time-consuming process, which BlackSwan Technologies chief digital transformation officer Dr David Amzallag noted can create problems for businesses.
These include data not being up-to-date, difficulties managing how data is used across the organisation, and the sizeable cost of storing large volumes of data and the computational power to process it all.
In an opinion piece for IDG Connect, Dr Amzallag pointed to ‘data fabric’ as a possible solution to these issues.
“The traditional, centralised approach to data is leaving organisations with many challenges to overcome,” he said.
“An alternative strategy is to take a decentralised approach. The design concept of a data fabric can help with this; it’s based on multiple data management technologies working in tandem, streamlining data ingestion and integration across a company’s ecosystem.”
Put simply, data fabric serves as a connective tissue that allows end-users to access and use data from multiple sources without having to duplicate it into a single, centralised source.
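The 'connective tissue' idea can be illustrated with a minimal sketch: a thin virtual access layer that answers queries by reading each source in place, rather than copying everything into a central store first. The `FabricLayer` class, source names, and record shapes below are illustrative assumptions, not the API of any real data-fabric product.

```python
class FabricLayer:
    """Registers heterogeneous sources behind one query interface."""

    def __init__(self):
        self._sources = {}  # name -> (fetch_fn, normalise_fn)

    def register(self, name, fetch_fn, normalise_fn):
        # fetch_fn pulls raw records on demand; normalise_fn maps them
        # onto a common schema at query time -- no upfront duplication.
        self._sources[name] = (fetch_fn, normalise_fn)

    def query(self, predicate):
        # Evaluate the predicate against each source's live data.
        results = []
        for name, (fetch, normalise) in self._sources.items():
            for raw in fetch():
                record = normalise(raw)
                record["source"] = name
                if predicate(record):
                    results.append(record)
        return results


# Two hypothetical sources with different native formats:
# a CRM exposing tuples and an ERP exposing dictionaries.
crm_rows = [("Acme Ltd", 120_000), ("Birch Co", 45_000)]
erp_docs = [{"client": "Acme Ltd", "open_orders": 3}]

fabric = FabricLayer()
fabric.register("crm", lambda: crm_rows,
                lambda r: {"customer": r[0], "revenue": r[1]})
fabric.register("erp", lambda: erp_docs,
                lambda d: {"customer": d["client"], "open_orders": d["open_orders"]})

# One query spans both sources without either being copied anywhere.
acme = fabric.query(lambda rec: rec["customer"] == "Acme Ltd")
```

Because each source is read on demand, results reflect whatever the source currently holds, which is how a fabric sidesteps the stale-copy problem described above.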
This decentralised approach to data analytics has started to gather pace in the post-COVID economy, with global data and insights firm Gartner listing it as its No.1 strategic technology trend for 2022.
“Data fabric provides a flexible, resilient integration of data sources across platforms and business users, making data available everywhere it’s needed regardless where the data lives,” Gartner’s report said.
“Data fabric can use analytics to learn and actively recommend where data should be used and changed. This can reduce data management efforts by up to 70%.”
Market forecasts point to increased use of this technology, too, with the market tipped to grow from roughly US$1.24 billion in 2021 to almost US$2.5 billion by 2028.
Sources
- World Economic Forum, Federated Data Systems: Balancing Innovation and Trust in the Use of Sensitive Data
- Gartner, Gartner Identifies Top 10 Data and Analytics Technology Trends for 2021
- Gartner, Gartner Top Strategic Technology Trends for 2022
- IDG Connect, Enhancing data governance and quality with a Data Fabric strategy
- IBM, Data Fabric
- GlobeNewswire, Data Fabric Market Forecast to 2028 – COVID-19 Impact and Global Analysis By Deployment, Component, Solution, and End User