Shrijan Shrestha

Data Engineer / Analyst
Gurugram, India

About

Analytical, results-driven Data Engineer with a track record of designing and optimizing scalable data pipelines and transforming complex data into actionable insights. Skilled in Azure ecosystems, SQL, Python, and PySpark, consistently delivering solutions that improve performance, reduce costs, and support strategic decision-making. Seeking to apply advanced data engineering skills to build robust, high-impact data solutions in a dynamic environment.

Work

Zinn Software | Data Engineer

Gurugram, Haryana, India

Summary

Currently engineering robust real-time data ingestion pipelines and optimizing data lake architectures within Azure ecosystems for high-throughput transaction monitoring.

Highlights

Engineered robust real-time data ingestion pipelines using Azure Data Factory and ADLS, ensuring high-throughput transaction monitoring for critical business operations.

Optimized PySpark-based fraud detection workflows, decreasing false positives by 22% and improving detection accuracy.

Revamped the data lake architecture on Azure Blob Storage and ADLS Gen2, delivering scalable, cost-effective storage.

Spearhead | Market Analyst

Bangalore, Karnataka, India

Summary

Executed algorithmic and discretionary trades while monitoring global financial markets and applying quantitative techniques to forecast trends.

Highlights

Executed algorithmic and discretionary trades across crypto, forex, and commodities, achieving strong risk-adjusted returns through strategic market analysis.

Monitored global financial markets, utilizing quantitative techniques to accurately forecast trends and inform trading strategies.

Deployed AI-enhanced trading bots and ML-based signal generators to automate and optimize trade execution processes.

Aries Corporate Ltd. | Data Analyst

Gurugram, Haryana, India

Summary

Designed and maintained scalable ETL pipelines and improved data retrieval speed using Azure Data Factory and PySpark.

Highlights

Designed and maintained scalable ETL pipelines for diverse business units, streamlining data integration using Azure Data Factory.

Improved data retrieval speed by over 30% through advanced query tuning and strategic partitioning implementations.

Leveraged PySpark for distributed data transformations and seamless integration with downstream Business Intelligence tools.

Enforced robust data governance using Azure Key Vault and RBAC policies to secure and manage data assets effectively.

Coin Trade | Analyst

Hyderabad, Telangana, India

Summary

Performed deep-dive analysis on blockchain protocols and built data pipelines to enhance token evaluation and trade logging efficiency.

Highlights

Performed deep-dive analysis on over 100 blockchain protocols, directly informing investment strategy and token due diligence processes.

Developed data pipelines and interactive dashboards, increasing token evaluation and trade logging efficiency by 40%.

Collaborated with product and engineering teams to validate smart contracts and Decentralized Autonomous Organization (DAO) structures.

Education

Vellore Institute of Technology
Vellore, Tamil Nadu, India

B.Tech

Biotechnology

Certificates

IBM Data Analyst Professional Certificate

Issued by Coursera

Skills

Data Engineering & ETL

Scalable Data Pipelines, ETL Pipelines, Real-time Data Ingestion, Data Transformation, Data Lake Architecture, Batch ETL Optimization, Data Quality Frameworks, Schema Evolution, Data Contracts, Data Reconciliation, PySpark Transformations, Query Tuning, Partitioning Strategies, Distributed Data Transformations.

Cloud Technologies

Azure Data Factory, Azure Data Lake Storage (ADLS) Gen2, Azure Blob Storage, Azure Key Vault, Azure Event Hubs, Azure Databricks, Azure Synapse Analytics.

Programming & Scripting

Python, PySpark, SQL.

Data Analysis & Business Intelligence

Data Analysis, Actionable Insights, Dashboards, KPI Calculation, Trend Forecasting, Live Metrics, BI Tools, Retail Analytics.

Security & Governance

Data Governance, RBAC Policies, Encryption.

Development Practices

CI/CD (GitHub Actions), Python Design Patterns, Config-Driven Design, Unit Testing, API Design, MLOps.

Financial & Blockchain

Algorithmic Trading, Discretionary Trading, Cryptocurrency, Forex, Commodities, Blockchain Protocols, Token Due Diligence, Smart Contracts, DAO Structures, Quantitative Techniques, AI-enhanced Trading Bots, ML-based Signal Generators.

Projects

Credit Card Fraud Detection System

Summary

Developed a robust system leveraging Azure Data Factory, ADLS, Databricks, and PySpark for real-time credit card fraud detection, focusing on batch ETL optimization, Spark SQL tuning, and data quality frameworks.

Real-time ETL for Retail Analytics

Summary

Designed and implemented a real-time streaming architecture for retail transaction monitoring and analysis, utilizing Azure Event Hubs, Databricks, PySpark, and Synapse.

Automated Trade Logging Dashboard

Summary

Designed and deployed an internal analytics tool for traders using Python, SQL, Streamlit, Databricks, and PySpark, focusing on CI/CD, Python design patterns, and API design.