Mastek modernizes legacy quantitative pricing system with cloud-native data program

For investment and asset management companies, quantitative investing is a niche discipline that applies diverse models to equity and fixed income markets, as well as to currencies, commodities, and structured products. It employs a finely tuned blend of mathematical models, probability and statistical analysis, data analytics, and machine learning to develop innovative and sound trading strategies. Pairing the right technologies with an expert understanding of the domain and its analytics is critical to monitoring the markets for opportunities and developing investment strategies.

As the client’s strategic partner, Mastek delivered data modernization services, rebuilding the pricing ecosystem on a scalable, secure, and cloud-optimized architecture.

Geography

North America

Service Line

Data and AI Services


Client Snapshot

As the global asset management arm of a leading financial institution, our client manages trillions in assets for institutional and individual investors across the globe. Noted for its expertise in equities, fixed income, cash, index funds and multi-asset strategies, it provides innovative investment products and institutional solutions, including ESG.


Challenge

Our client relied heavily on IBM Netezza to run the business-critical quantitative pricing models used in the valuation and market pricing of funds across global markets. With Netezza approaching end of life, the client needed to proactively mitigate the imminent risks of escalating infrastructure maintenance costs and insufficient elasticity to scale compute for peak pricing windows. The client also looked to modernize the legacy system, which was severely constrained by capacity limitations and its inability to integrate with modern cloud-native analytics and AI/ML frameworks.

Solution

The data modernization was delivered as a fully cloud-native ‘data and compute’ modernization program, ensuring zero disruption to business-critical pricing runs.

A secure data migration foundation was built with an SFTP pipeline and a customized SFTP-to-Amazon S3 movement utility. AWS Transfer Family jobs were leveraged for automation, retries, and lineage tracking, and metadata-driven orchestration was created to allow teams to onboard new datasets without code changes. Delta pipelines were configured on Databricks to ingest raw S3 data into optimized Delta Lake tables, which were tuned with Z-ORDER, data skipping, Auto-Optimize, and Bloom filters to accelerate key pricing queries. Additionally, a multi-layer architecture was designed to ensure auditability, data quality, and efficient transformations.
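
In broad strokes, the ingestion pattern above can be sketched in PySpark as below. This is a minimal illustration under assumptions, not the delivered pipeline: the S3 path, table name, and column names are invented for the example.

```python
# Minimal sketch of the raw-to-Delta ingestion step on Databricks.
# The S3 path, table name, and column names are illustrative only.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # supplied automatically in a Databricks notebook

raw_path = "s3://pricing-raw/market-data/"   # landing zone fed by the SFTP-to-S3 utility
bronze_table = "pricing.bronze_market_data"  # first layer of the multi-layer design

# Land raw files in a Delta Lake table.
(spark.read
      .format("csv")
      .option("header", "true")
      .load(raw_path)
      .write
      .format("delta")
      .mode("append")
      .saveAsTable(bronze_table))

# Turn on Auto-Optimize so small files are compacted as they are written.
spark.sql(f"ALTER TABLE {bronze_table} "
          "SET TBLPROPERTIES (delta.autoOptimize.optimizeWrite = true)")

# Bloom filter index for selective lookups (applies to files written after creation).
spark.sql(f"CREATE BLOOMFILTER INDEX ON TABLE {bronze_table} FOR COLUMNS(instrument_id)")

# Z-ORDER co-locates rows on common pricing filters so data skipping is effective.
spark.sql(f"OPTIMIZE {bronze_table} ZORDER BY (instrument_id, pricing_date)")
```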

The Databricks Photon engine (a vectorized query engine written in C++) was enabled to improve query performance for heavy analytical workloads, and auto-scaling clusters were configured to provide on-demand compute elasticity during daily and monthly pricing cycles.
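
As a rough illustration of these two settings, a Databricks job-cluster specification might carry them as shown below; the runtime version, node type, and worker counts are assumptions, not the client’s actual configuration.

```python
# Illustrative Databricks Jobs API cluster spec enabling Photon and autoscaling.
# All concrete values here (runtime, node type, worker counts) are assumptions.
pricing_job_cluster = {
    "spark_version": "13.3.x-scala2.12",
    "runtime_engine": "PHOTON",        # vectorized C++ engine for analytical queries
    "node_type_id": "i3.2xlarge",
    "autoscale": {
        "min_workers": 2,              # quiet periods between pricing runs
        "max_workers": 24,             # daily and monthly pricing peaks
    },
}
```

Such a spec would typically be attached to the pricing jobs through the Databricks Jobs API or an infrastructure-as-code tool.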

Apache Spark Structured Streaming pipelines were customized to support near real-time ingestion for certain trading and market datasets, with fault tolerance ensured through checkpointing, exactly-once processing, and event-driven orchestration. Consistent deployments across development, QA, and production were achieved with a Dockerized pricing pipeline environment, integrated with CI/CD pipelines using GitHub Actions/Jenkins for version control, automated testing, and repeatability.
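
A minimal sketch of the streaming leg, assuming a Kafka feed of market ticks (the source, topic, schema, and paths are hypothetical), shows how checkpointing and a Delta sink combine to give the exactly-once behavior described:

```python
# Hypothetical Structured Streaming path for near real-time market data.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructType, TimestampType

spark = SparkSession.builder.getOrCreate()

tick_schema = (StructType()
               .add("instrument_id", StringType())
               .add("price", DoubleType())
               .add("event_time", TimestampType()))

ticks = (spark.readStream
              .format("kafka")                              # assumed source
              .option("kafka.bootstrap.servers", "broker:9092")
              .option("subscribe", "market-ticks")
              .load()
              .select(from_json(col("value").cast("string"), tick_schema).alias("t"))
              .select("t.*"))

# Checkpointing plus Delta's idempotent stream writes provide the
# end-to-end exactly-once guarantee on failure and restart.
(ticks.writeStream
      .format("delta")
      .option("checkpointLocation", "s3://pricing-raw/_checkpoints/market_ticks/")
      .outputMode("append")
      .toTable("pricing.silver_market_ticks"))
```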

End-to-end pipeline monitoring was achieved with Databricks Jobs, Amazon CloudWatch, and custom alerting. Lineage and audit trails were built for compliance with financial regulations (PCI DSS, GDPR). Additionally, cluster policies and FinOps best practices were introduced to ensure cost governance.
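
The custom-alerting idea can be illustrated with a small helper that publishes a run-status metric to Amazon CloudWatch, where an alarm on the metric drives the notifications; the namespace, metric, and dimension names below are invented for the example.

```python
# Sketch of custom alerting: publish a pipeline health metric to CloudWatch.
# Namespace, metric name, and dimensions are illustrative assumptions.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

def report_pipeline_status(pipeline: str, succeeded: bool) -> None:
    """Emit 1 on success, 0 on failure; a CloudWatch alarm on this
    metric can page the support team when a pricing run fails."""
    cloudwatch.put_metric_data(
        Namespace="PricingPipelines",
        MetricData=[{
            "MetricName": "RunSucceeded",
            "Dimensions": [{"Name": "Pipeline", "Value": pipeline}],
            "Value": 1.0 if succeeded else 0.0,
            "Unit": "Count",
        }],
    )

# Called at the end of a Databricks job task, e.g.:
report_pipeline_status("daily-nav-pricing", succeeded=True)
```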

Impact

Data modernization of the asset management giant’s quantitative pricing workloads delivered impactful and wide-ranging outcomes.

  • 40–60% faster pricing workloads, directly improving the speed of daily NAV calculations. Photon acceleration reduced processing times for large historical models from hours to minutes.
  • 25–30% reduction in compute spend through optimized cluster utilization and Delta performance.
  • Up to 5–10x speed improvements on large datasets used in pricing computations.
  • Elimination of legacy risk through complete removal of the Netezza dependency.
  • On-demand cloud scaling that allowed large pricing jobs to run during peak windows without resource contention.
  • Enhanced data quality and transparency through the strong ACID guarantees of the Delta Lake architecture.
  • Cost optimization due to serverless and auto-scaling patterns that eliminated static hardware costs.
  • Future-ready platform compatible with AI/ML model development, advanced analytics, and future quantitative research.
Additional value delivered:
  • Standard operating procedures (SOPs), runbooks, and knowledge transfer for self-sufficiency.
  • ‘Follow-the-sun’ support to reduce internal support burden for the client.
  • Automated error-handling, retry logic, and failure notifications.
  • Performance benchmarking and tuning workshops with the client’s quant teams.
  • Architectural accelerators and re-usable ingestion frameworks to onboard additional datasets.
