VIDEO | Novartis Global Head: Data Needs to Be Highly Accessible

(US and Canada) Ravi Prasad, Novartis Global Head of Data, AI Strategy and Operations, speaks with Girish Bhat, Acceldata Senior Vice President of Marketing, about leading the organization's digital transformation journey, ensuring the reliability of data, and managing data pipelines.

Prasad begins by stating that digital and data have been front and center in Novartis’ transformation journey.

From a people standpoint, he indicates that the organization has not only been able to attract and retain good talent, but also to empower them with creativity and an innovative environment.

Speaking about processes, Prasad says that alignment with the business is key and the teams are working towards a two-pronged approach:

  1. Driving a high degree of operational efficiency and cost efficiency in operations.

  2. Being proactive in engaging with customers and enhancing the speed of service.

On the products and technology side, Prasad shares that, while the focus has mostly been on people and processes, Novartis has invested heavily in infrastructure as well.

Prasad goes on to talk about ensuring the reliability of the data and building trust across all stakeholders. He suggests four key steps:

  1. Establishing a robust internal data life cycle process and data pipeline.

  2. Ensuring stringent privacy, security, and ethical standards.

  3. Ensuring high applicability readiness: the accessibility of the data needs to be very high.

  4. Having transparency and trusted relationships with the cloud and external data platform providers.

When asked about his approach to managing data pipelines and the role of technologies like AI and ML, Prasad says that Novartis has many data pipelines and that, with the digitalization of services, he expects them to grow exponentially.

He reveals that the organization is looking to leverage ML and other data science methods to automate and scale existing pipelines. Certain functions within the pipeline process, such as gathering, harmonizing, cleansing, and anonymizing data and moving it to downstream systems and applications, could be automated, he adds.

CDO Magazine
www.cdomagazine.tech