Ayan Infotech has an urgent and immediate requirement for an AWS Cloud Data Engineer for a full time permanent job opportunity in Perth.
All applicants must have full work rights in Australia.
Title: Cloud (AWS) Data Engineer
Type: Full time Permanent
Required Skills and Experience:
- Work closely with clients to understand their data requirements, develop and model data structures, and design and build the ingestion processes that provide access to data from operational and enterprise source systems
- Design and develop data integration and data pipelines (ETL) for various on-premises and cloud-based data platforms using cutting-edge technologies and services
- Contribute to the delivery of secure, best-practice data integration strategies and approaches
- Work closely with database teams on topics such as data requirements, cleanliness, and quality
- Strong experience in traditional data warehousing / ETL tools (Informatica, Talend, Pentaho, DataStage)
- Demonstrated experience working across structured, semi-structured, and unstructured data
- Experience with data processing systems such as Hadoop, Spark, Storm, Impala, etc.
- Strong understanding of traditional ETL tools, RDBMS, and end-to-end data pipelines
- Knowledge of Data Governance and strong understanding of data lineage and data quality
- AWS Cloud experience, including:
- Experience with AWS cloud services: S3, EC2, EMR, RDS, Redshift, and Kinesis.
- Experienced in multiple database technologies such as Distributed Processing (Spark, Hadoop, EMR), Traditional RDBMS (MS SQL Server, Oracle, MySQL, PostgreSQL), MPP (AWS Redshift, Teradata), and NoSQL (MongoDB, DynamoDB, Cassandra, Neo4j, Titan)
- Experience in designing and building streaming data ingestion, analysis, and processing pipelines using Kafka, Kafka Streams, Spark Streaming, and similar cloud-native technologies.
- Ideally have experience with Infrastructure as Code using Terraform.
- Strong experience with Python and at least two of the following technologies: Scala, SQL, Java
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience deploying applications into production environments e.g. code packaging, integration testing, monitoring, release management.
- Experience with source control tools, along with branching and merging concepts. Ideally experience with GitLab for source control and CI/CD.
- Experience in software engineering best practices such as code reviews, testing frameworks, maintainability and readability
- Ideally have experience with making data available for consumption (e.g. APIs, event-based publish/subscribe, data mart provisioning)
- Ideally have experience with MuleSoft, Solace, and StreamSets.
- Bachelor's degree required, with a minimum of 4 years of experience in a data engineering or architecture role
- Experience working in DevOps, Agile, Scrum, Continuous Delivery and/or Rapid Application Development environments
Contact: 02 9412 4178 for more details.