‘Bringing AI to your data’ was the concept proudly emblazoned on the T-shirt of Krish Vitaldevara, Senior Vice President and General Manager at NetApp, during his appearance on the main stage.
His talk addressed the future of intelligent data infrastructure and artificial intelligence – an often-discussed concept – alongside the complexities of moving data to where AI computation traditionally occurs.
“It’s actually incredibly hard because our data is everywhere,” Vitaldevara shared, highlighting issues related to geographic distribution, data permissions, and the monumental gravity of data.
The traditional approach often results in what’s known as data silos: fragmented, redundant repositories of information that complicate AI implementation. To address these challenges, Vitaldevara outlined two crucial steps: making data AI-ready and bringing AI to the data. This approach allows for inferencing and creating data embeddings in place, enabling data to be turned into actionable insights wherever it resides, rather than moving it across various platforms and risking fragmentation.
Vitaldevara provided further insights into the ongoing debate around data retention and sustainability, pointing out the importance of distinguishing between core business data and operational or transactional data.
“The data that lets [businesses] create the most shareholder value and serve their customers well is generally the most valued,” Vitaldevara stated.
He also addressed the ever-present issue of biases within AI models and data, a problem that arises from the human element involved in building these systems.
“The goal is to anchor the data so that you can ground it as close to the truth as possible,” added Vitaldevara.
He stressed the necessity for explainability in AI models to identify and eliminate biases introduced either inadvertently or through malicious data poisoning.
Vitaldevara acknowledged the additional burden this places on cybersecurity professionals already grappling with a laundry list of issues. Existing cybersecurity practices, alongside new technological investments, can mitigate many of these risks.
“Everything you are doing already as a cybersecurity person—protecting against ransomware, ensuring data immutability, having the right security postures—will help,” Vitaldevara commented.
Looking ahead, Vitaldevara suggested that the companies best prepared for the “data tsunami” will be those that have laid a solid foundation for data quality and protection, especially in regulated industries like healthcare and finance.
“Data preparation is where 80% of data scientists’ and engineers’ time is spent,” he noted, stressing the importance of a well-prepared data infrastructure.
NetApp’s role in this is to make both infrastructure and data AI-ready. This includes offering scalable, efficient performance and providing structured oversight for vast amounts of unstructured data. NetApp aims to create an ecosystem where AI can seamlessly interface with existing data, pushing the envelope toward more efficient and actionable insights that can be leveraged to drive business outcomes.