
Apache Spark Application Developer


About Accenture: Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Interactive, Technology and Operations services, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 514,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com.


  • Project Role: Application Developer
  • Project Role Description: Design, build and configure applications to meet business process and application requirements.
  • Management Level: 10
  • Work Experience: 4-6 years
  • Work Location: Bengaluru
  • Must-Have Skills: Apache Spark
  • Good-to-Have Skills: Python Programming Language
  • Job Requirements:

    • Key Responsibilities:
      a) Provide subject matter expertise and hands-on delivery of data capture, curation and consumption pipelines on Azure and Hadoop; ability to build cloud data solutions.
      b) Provide a domain perspective on storage, big data platform services, serverless architectures, the Hadoop ecosystem, vendor products, RDBMS, DW/DM, NoSQL databases and security; participate in deep architectural discussions to build confidence and ensure customer success when building new solutions and migrating existing data.
    • Technical Experience:
      a) PySpark
      b) Spark/Scala
      c) Python
      d) Must be a subject matter expert in Spark
      e) Proficiency with big data processing technologies: Hadoop, Spark, AWS
      f) Experience in building data pipelines and analysis tools using Python, PySpark and Scala
      g) Create Scala/Spark jobs for data transformation and aggregation
      h) Produce unit tests for Spark transformations and helper methods
      i) Write Scaladoc-style documentation with all code; design data processing pipelines
      j) Good to have: experience with Hadoop/AWS
    • Professional Attributes:
      a) Proven ability to build, manage and foster a team-oriented environment
      b) Proven ability to work creatively and analytically in a problem-solving environment
      c) Desire to work in an information systems environment
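For candidates unfamiliar with the transformation-and-aggregation jobs and unit tests mentioned above, here is a minimal illustrative sketch. It is written in plain Python (so it runs without a Spark cluster) and mirrors the shape such a job would take with PySpark DataFrame operations (filter/groupBy/agg); all record fields and values are hypothetical, not taken from any Accenture project.

```python
# Stand-in for a Spark-style transformation-and-aggregation job.
# Record fields ("region", "amount") are hypothetical examples.
from collections import defaultdict

def clean_records(records):
    """Transformation step: drop malformed rows and normalise the region key
    (the analogue of a DataFrame filter + column transformation)."""
    return [
        {**r, "region": r["region"].strip().upper()}
        for r in records
        if r.get("region") and r.get("amount") is not None
    ]

def total_by_region(records):
    """Aggregation step: sum `amount` per region
    (the analogue of groupBy("region").agg(sum("amount")))."""
    totals = defaultdict(float)
    for r in records:
        totals[r["region"]] += r["amount"]
    return dict(totals)

if __name__ == "__main__":
    raw = [
        {"region": " south ", "amount": 10.0},
        {"region": "SOUTH", "amount": 5.0},
        {"region": "north", "amount": 7.5},
        {"region": None, "amount": 3.0},  # malformed row: dropped
    ]
    result = total_by_region(clean_records(raw))
    # Unit-test-style check, mirroring requirement (h) above
    assert result == {"SOUTH": 15.0, "NORTH": 7.5}
    print(result)
```

The same two steps keep the pipeline testable in isolation: each function can be covered by a small unit test before the job is wired to a real data source.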


15 years of full-time education
