
AWS BigData Application Developer

LOCATIONS: Bengaluru

About Accenture: Accenture is a leading global professional services company, providing a broad range of services in strategy and consulting, interactive, technology and operations, with digital capabilities across all of these services. We combine unmatched experience and specialized capabilities across more than 40 industries — powered by the world’s largest network of Advanced Technology and Intelligent Operations centers. With 514,000 people serving clients in more than 120 countries, Accenture brings continuous innovation to help clients improve their performance and create lasting value across their enterprises. Visit us at www.accenture.com


  • Project Role: Application Developer
  • Project Role Description: Design, build and configure applications to meet business process and application requirements.
  • Management Level: 10
  • Work Experience: 4-6 years
  • Work Location: Bengaluru
  • Must Have Skills: Apache Spark, AWS BigData, Hadoop, Python Programming Language
  • Good To Have Skills:
  • Job Requirements:

    • Key Responsibilities:
      1. 3 years of experience with the Python programming language required
      2. 2 years of experience with the Scala programming language required
      3. Proficient in using relational databases and writing SQL
      4. AWS: experience with AWS ecosystem tools (S3, Athena, CloudFormation, EC2)
      5. Strong understanding of and experience with functional and object-oriented design and development in Python or Scala
      6. AWS Glue has been finalized for data ingestion; strong SQL skills for data ingestion and transformations are required
    • Technical Experience: Must Have Skills: AWS Glue, AWS Lambda, AWS Redshift. Good To Have: SQL, Python.
      1. Experience in automated testing: TDD, BDD, mocking, unit, functional and integration testing
      2. AWS Glue, Terraform and QuickSight knowledge required
      3. Hands-on experience with a distributed version control system such as Git or Mercurial
      4. Understanding of software design patterns
      5. Experience with Unix-like operating systems such as Linux or Solaris required
      6. Hadoop ecosystem tools: Hive/Impala, HDFS, Kafka, Spark
    • Professional Attributes: Good communication skills; ready to work in shifts (12 PM to 10 PM)
    • Additional Information: Ready to work in shifts (12 PM to 10 PM)


15 years of full-time education
