Accomplished DevOps Engineer & Data Engineer with 5+ years of experience in cloud infrastructure, data pipeline orchestration, and automation. Demonstrated expertise in designing scalable AWS-based data platforms and orchestrating data workflows with Amazon Managed Workflows for Apache Airflow (MWAA), Apache Airflow, and Argo Workflows. Experienced in building robust ETL pipelines with AWS Glue, dbt, Redshift, and Snowflake, and in data ingestion with Airbyte and Fivetran. Skilled in SQL, Python, Terraform, and Kubernetes, with hands-on CI/CD pipeline automation using GitHub Actions, Jenkins, and GitLab. Proficient in container orchestration with Docker, Helm, and Kubernetes (AWS EKS managed via Rancher). Proven ability to optimize data workflows, ensure secure and compliant data architecture, and build real-time monitoring solutions with QuickSight, Grafana, and Prometheus. I'm committed to:
- Designing and implementing data pipelines
- Building CI/CD pipelines for data workflows
- Deploying scalable data infrastructure
- Automating data processing and deployment processes
- Optimizing system performance for data-intensive applications
- Implementing data monitoring solutions
My path into technology has been one of transformation and growth. What began as a career change has evolved into a passion for combining data engineering with DevOps practices. Every challenge has been an opportunity to learn and innovate in the data space, and I'm dedicated to continual improvement.
I believe in the power of data and automation to solve complex problems and create efficient systems. My approach combines technical expertise in data processing with infrastructure automation, always focused on delivering scalable data solutions that make a difference.
I'm always open to discussing data projects, pipeline architectures, innovative ideas, or opportunities to collaborate. Feel free to reach out through any of the channels below: