About
Software Engineer and Data Scientist with a Master's in Data Science, specializing in advanced AI/ML models and scalable data solutions. Proven ability to leverage AWS services, Python, and SQL to design robust data warehousing systems, streamline payroll operations, and improve data query efficiency by up to 40%, delivering critical business insights.
Work
Genese Solution
Software Engineer II
Kathmandu, Bagmati, Nepal
Summary
As a Software Engineer II at Genese Solution, I specialize in developing Generative AI systems and custom payroll solutions, leveraging AWS services to enhance business intelligence and operational efficiency.
Highlights
Spearheaded the design and development of Gandalf Insights, a Generative AI system powered by AWS Services (S3, Lambda, Bedrock), enabling natural language querying of databases.
Implemented interactive, data-rich dashboards within Gandalf Insights, improving data query efficiency by 40% for business analysts and enhancing user self-service capabilities.
Engineered a bespoke Zoho Payroll System utilizing AWS Services (EC2, RDS) tailored to Nepal's specific payroll policies for diverse employee groups.
Automated payroll calculations and processing, reducing manual errors by 35% and streamlining workflows for HR operations.
Genese Solution
Data Engineer
Kathmandu, Bagmati, Nepal
Summary
As a Data Engineer at Genese Solution, I was responsible for designing, implementing, and optimizing data warehousing systems and ETL pipelines to improve data accessibility, consistency, and analytical performance.
Highlights
Designed and implemented a comprehensive data warehousing system, overseeing data migration, processing, and visualization workflows.
Leveraged Apache NiFi, Apache Airflow, and Apache Spark to build robust data pipelines, reducing overall data processing time by 30% and enhancing data accessibility for stakeholders.
Developed and optimized complex ETL pipelines using Python and SQL, integrating data from multiple disparate sources into a consolidated data warehouse.
Ensured high data consistency and reliability across the data warehouse, enabling faster and more accurate analytics for critical business reporting and decision-making.
Fusemachines
AI Fellow
Kathmandu, Bagmati, Nepal
Summary
As an AI Fellow at Fusemachines, I focused on developing and deploying advanced machine learning models to provide actionable insights and enhance data-driven decision-making for critical operational strategies.
Highlights
Developed a sophisticated machine learning model for accident severity prediction, leveraging advanced algorithms to analyze complex datasets and identify key predictive factors.
Enhanced analytical capabilities by integrating the ML model into existing systems, providing actionable insights that informed data-driven decision-making processes.
Focused on improving prediction accuracy through rigorous hyperparameter tuning and model validation, directly supporting the development of safer and more effective operational strategies.
Collaborated with a research team to explore novel AI methodologies and apply them to real-world problems, contributing to the advancement of predictive analytics within the organization.
CodeRush
Data Engineering Apprentice
Kathmandu, Bagmati, Nepal
Summary
As a Data Engineering Apprentice at CodeRush, I gained practical experience in executing data engineering projects, building scalable data pipelines, and optimizing data reliability using AWS cloud services.
Highlights
Executed foundational data engineering projects, applying core principles of data extraction, transformation, and loading (ETL) in real-world scenarios.
Utilized AWS cloud services (e.g., S3, EC2, Lambda) to build scalable data pipelines for diverse applications, including web scraping and cryptocurrency data integration.
Contributed to improving data reliability and integrity by implementing robust data validation and cleaning processes within the pipelines.
Enabled faster analytics and reporting capabilities by optimizing data flow and storage, providing timely access to processed data for analysis.
Education
Charles Darwin University
Master's
Data Science
Khwopa College of Engineering
Bachelor's
Computer Engineering
Grade: 74.84%
Languages
English
Nepali
Skills
Programming Languages
Python, SQL, C#, Java, JavaScript, HTML/CSS, LaTeX.
Cloud Platforms & Services
AWS (S3, Lambda, EC2, RDS, Redshift, DynamoDB, SQS, Bedrock), Microsoft (Azure DevOps), DigitalOcean, Heroku.
Big Data & Data Engineering
Apache Spark, Apache Airflow, Apache NiFi, Kafka, ETL Pipelines, Data Warehousing.
Machine Learning & AI
LangChain, LangGraph, AI Agents, Agentic AI, Machine Learning Models, Predictive Analytics, Generative AI.
Web Development & Frameworks
Django.
Tools & Methodologies
Git/GitHub, Atlassian (Jira, Confluence, Bitbucket), Visual Studio, IntelliJ IDEA, Selenium, Tableau, Data Visualization.