Hewlett Packard (HPE) Recruitment for Big Data Developers at Bangalore
Table of Contents - Job - Hewlett Packard Enterprise (HPE)
- Big Data Developer
Job ID / Advertisement No.
Number of Openings
Knowledge and Skills
- Extensive experience with multiple software systems design and languages.
- Experience in the overall architecture of software systems for products and solutions on the latest platforms and technologies.
- Excellent analytical and problem-solving skills.
- Excellent written and verbal communication skills. Ability to effectively communicate product architectures, design proposals, and strategies, and to negotiate options at senior management levels.
- Hands-on proficiency in the specific skills needed for data engineering.
Must-have Skills for Specialist/Developer (Overall experience: 2-11 Years)
- Software Development background – 2-7 Years
- Python, Scala, or Java – 1+ Years
- SQL – 1+ Years
- AWS (EC2, S3, Redshift, API Gateway) or any cloud such as Cloudera, Azure, or Google Cloud – 2+ Years
- Spark – 1+ Years
- REST Web Services – 1+ Years
- End-to-end deployment experience – 1+ projects
Good-to-have Skills (both Expert and Junior levels)
- AWS – CLI, Kinesis, Snowball, Lambda, Elastic Beanstalk; Airflow, Databricks, Linux, Docker, GitHub, Jenkins
Software Development – Big Data
- This role is in HPE's Bangalore R&D Center.
- HPE is keen on establishing a platform of capabilities that enables cost-effective, high-performance data collection, processing, archiving, analysis, and reporting.
- This role will engage with analytics teams across HPE's business units to understand data requirements, and with HPE's R&D teams to drive telemetry requirements.
- The scope will include the entire project lifecycle – understanding requirements, design, delivery, and qualification.
How to Apply?
Please read all job details carefully and apply exactly as described below, only if you meet the eligibility criteria.
Note: Please apply before the Job URL expires.
Job Tagged in
Hewlett Packard (HPE)
This is our calling. This is a new HP.