02.04.2025
5 min read

Key Insights on Asset Management Data Strategy: Webinar Takeaways

Asset managers now face an ever-growing challenge: how to effectively manage and leverage vast amounts of information. Our recent webinar, "Overloaded with Data? Learn to Future-Proof Your Data Strategy," brought together finance industry experts to discuss strategies for navigating data complexities, using data effectively, modernizing outdated systems, and integrating AI into data management.

Hosted by Ed Simmons from Branch Brook Advisors, the panel included Alexey Utkin, Principal Solution Consultant, and Andrey Ivanov, VP of Finance Practice, both from DataArt, who shared their experience in the financial sector to provide perspectives on evolving data strategies in asset management.

Data as a Competitive Advantage for Asset Managers

The webinar began by exploring how data serves as a critical competitive advantage in asset management. Andrey Ivanov emphasized that asset managers are essentially "gigantic data processing machines" that collect and analyze vast amounts of data about markets, companies, and economic trends.

At the end of the day, every asset manager seeks to generate alpha, beat competitors, and minimize operating expenses. The better your data stack is, the easier it becomes to be operationally efficient and focus the attention and time of your most valuable employees on generating alpha.
Andrey Ivanov

A robust data strategy enables asset managers to launch new products faster, generate more alpha from existing products, and optimize costs. However, as data volumes grow and analytics become more sophisticated, delivering these advantages has become increasingly challenging.

Data Products and Platforms: A Natural Evolution

A key focus of the discussion was data products and platforms. Alexey Utkin explained that the data-as-a-product paradigm became popular about 4-5 years ago with the rise of data mesh architecture. This approach focuses on the value data delivers to consumers in a business context rather than just making data available.

It really focuses on the value your data delivers to the consumers in the business context... instead of just dumping data on users with tables and whatnot, you give all this rich context around the data and turn it to a product which is available for self-service.
Alexey Utkin

Andrey Ivanov added that data products aren't entirely new concepts but rather an evolution of practices asset managers have been using for years. He cited security masters as early examples of structured data products that had well-defined interfaces and standards. The key difference now is applying this product thinking across all data assets in an organization.
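The security-master example can be made concrete. The sketch below is purely illustrative (none of these field names come from the webinar): it shows the kind of lightweight metadata contract that, applied consistently, turns a raw dataset into a discoverable, self-service data product.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class DataProductDescriptor:
    """Metadata that turns a raw dataset into a self-service data product."""
    name: str
    domain: str
    owner: str
    schema: dict          # column name -> type; the product's contract
    refresh_sla: str      # freshness guarantee consumers can rely on
    tags: list = field(default_factory=list)

# An illustrative security master: a well-defined interface and standard
# that downstream consumers can discover and build against.
security_master = DataProductDescriptor(
    name="security-master",
    domain="reference-data",
    owner="reference-data-team@example.com",
    schema={"isin": "str", "ticker": "str", "issuer": "str", "currency": "str"},
    refresh_sla="daily by 06:00 UTC",
    tags=["golden-source", "reference"],
)

print(security_master.name, sorted(security_master.schema))
```

Publishing a descriptor like this for every data asset — not just the security master — is essentially what "product thinking across all data assets" means in practice.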

Data Mesh: Decentralization with Governance

The webinar analyzed the concept of data mesh, which advocates for decentralization rather than relying on central data platforms and teams. This approach enables domain-oriented teams aligned with business functions to produce data products efficiently without building their own infrastructure.

Alexey Utkin highlighted that data mesh balances decentralization with self-service capabilities and standardized governance, ensuring interoperability between independently produced data products while giving teams the autonomy to innovate quickly.

Ed Simmons drew an apt analogy comparing this approach to the relationship between state and federal governments—local control speeds time to market, while some central standards ensure interoperability and economies of scale, maintaining cohesion and efficiency.

Modernizing Legacy Systems: Practical Strategies

For many asset managers, legacy infrastructure—often built between 2005 and 2010 on SQL Server, Sybase, or Oracle databases—remains a bottleneck.

The speakers outlined several practical approaches to modernization:

  • Assessment and Profiling: Understanding what's valuable before planning migration.
  • Selective Modernization: Not everything needs to be rebuilt.
  • Facade Pattern: Building modern interfaces around legacy systems that perform critical functions.
  • AI-Assisted Migration: Using AI tools to streamline translation, data modeling, and validation.

You don't always have to rewrite the system or refactor the system. What you can do is build a facade around that—let it do its core function and build, for example, a REST API around it to expose it to more modern parts of your data stack.
Andrey Ivanov
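A minimal sketch of the facade idea Andrey describes (the class and field names are invented for illustration): the legacy store keeps doing its core job and returning opaque row tuples, while the facade translates them into JSON-friendly dicts that a REST layer could then serve to the modern parts of the stack.

```python
import json

class LegacySybaseStore:
    """Stand-in for a legacy database: raw tuples in a cryptic column order."""
    def run_query(self, table):
        # (sec_id, qty, px) -- the kind of opaque rows old systems return
        return [("US0378331005", 100, 189.5), ("US5949181045", 40, 410.2)]

class PositionsFacade:
    """Modern interface wrapped around the legacy store.

    The legacy system is left untouched; the facade exposes its output
    in a clean shape that a REST endpoint could return as-is.
    """
    def __init__(self, legacy):
        self._legacy = legacy

    def get_positions(self):
        rows = self._legacy.run_query("positions")
        return [{"isin": r[0], "quantity": r[1], "price": r[2],
                 "market_value": r[1] * r[2]} for r in rows]

facade = PositionsFacade(LegacySybaseStore())
print(json.dumps(facade.get_positions(), indent=2))
```

The design choice is the point: the risky, expensive rewrite is deferred, and only the interface is modernized.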

AI's Expanding Role in Data Management

The final segment explored AI's impact on data management across three key areas:

  • Enhancing the Data Supply Chain: Using AI for data quality checks, integration, and discovery.
  • Alpha Generation: Applying AI models for investment decisions (with regulatory considerations).
  • Personalized Customer Experiences: Delivering more intuitive and personalized user interactions.

Practical examples included using neural networks to detect pricing anomalies and building natural language interfaces that allow users to query data using plain English within tools like Excel.
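The webinar's example used neural networks; as a deliberately simple stand-in, the sketch below shows the same idea—automated pricing-anomaly checks in the data supply chain—with a basic median-deviation rule (the function, data, and tolerance are all illustrative).

```python
import statistics

def flag_price_anomalies(prices, tolerance=0.10):
    """Return indices of prices deviating from the series median by more
    than `tolerance` (as a fraction of the median).

    A simplified statistical stand-in for the neural-network anomaly
    detectors discussed in the webinar.
    """
    mid = statistics.median(prices)
    return [i for i, p in enumerate(prices) if abs(p - mid) / mid > tolerance]

# A quiet series with one bad tick (e.g., a fat-fingered 150.0)
prices = [100.0, 100.2, 100.1, 100.3, 150.0, 100.4, 100.5]
print(flag_price_anomalies(prices))  # flags the bad tick at index 4
```

A production detector would of course learn what "normal" looks like per instrument rather than rely on a fixed tolerance, which is where the neural-network approach comes in.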

Future-Proofing Your Data Strategy

The key takeaway? Future-proofing data strategy isn't a one-time project but a continuous evolution. Organizations must balance modernization with practical considerations, leverage AI thoughtfully, and embrace new paradigms like data products and data mesh while maintaining strong governance.

As data volumes continue to grow and technologies evolve, asset managers who build flexible, well-governed data platforms will be best positioned to maintain a competitive edge, drive operational efficiencies, and generate alpha in an increasingly complex environment.

If your company is facing data challenges, our team of experts can help you navigate modernization, optimize your data strategy, and leverage AI for maximum impact. Reach out to us for personalized advice and support.