· Bachelor’s degree/Diploma in Computer Science, Computer Studies, Information Technology, or related disciplines
· 5+ years of experience in software development
· Experience with Kafka, .NET, CI/CD, SQL and/or dbt
· Cloud-native development experience, preferably with AWS and Docker
· Sound knowledge of best practices in data engineering and data security
· Knowledge of cloud security controls, DevOps and CI/CD pipelines
· Experience with Amazon EKS, Amazon Aurora, Amazon RDS, Amazon S3 and Amazon CloudWatch
· Experience using one or more AWS / Azure / GCP data and analytics services in combination with custom solutions, e.g. Spark, Azure Data Lake, Databricks, Snowflake, HDInsight, Azure SQL DW, DocumentDB, Glue, Athena, Elastic Pool
· Experience with multiple data storage solutions for analytics, operational and archival purposes, such as MongoDB, Cassandra, HBase, Redis, PostgreSQL, MySQL, DB2, Neo4j and S3
· Experience with data transformation tools/platforms such as dbt, Alteryx, Datameer, Dataform, Informatica and Talend, and their data quality management features
· Mastery of at least one core language: Python, Scala or Java
· Track record of delivering scalable data pipeline services running in production
· Excellent communication and presentation skills
· Interested candidates can send an updated CV to sharon@trinityconsulting.asia