Experience: 8-12 years | Location: Noida | Education: Graduation/Post Graduation
Responsibilities
- Define client needs & oversee project milestones to ensure expectations & budgets are met
- Define data platform architecture & design; review code & make changes as needed
- Leverage knowledge of distributed systems, process flows & procedures to aid analyses & recommendations for solution offerings
- Responsible for the overall quality of project deliverables & successful implementation of defined solution
- Build superior client relationships & proactively manage client expectations; ensure that change control is used when scope boundaries are exceeded
- Communicate a compelling & inspired vision to all customer levels, from CTOs/CIOs to engineering managers & programmers
- Maintain a strong network & represent the organisation at meetings, forums, panels, publications & conferences to establish thought leadership in the industry
- Responsible for POCs on any newly introduced tech stacks & possible version upgrades
Requirements
- 8+ years' experience in design & development using various database technologies, with the most recent 4+ years on the Hadoop technology stack & associated programming languages
- Hands-on/technical lead experience in 2 or more of the following areas:
- Hadoop, HDFS, MapReduce
- High Availability architecture & DR (disaster recovery) setup
- Spark Streaming, Spark SQL, Spark ML
- Kafka/Flume
- Apache NiFi
- Worked with Hortonworks Data Platform as an architect, or with CDH (Cloudera Distribution for Hadoop) as a developer/administrator
- Hive/Pig/Sqoop
- NoSQL databases: HBase/Cassandra/Neo4j/MongoDB
- Visualisation & reporting frameworks like D3.js, Zeppelin, Grafana, Kibana, Tableau, Pentaho
- Scrapy for crawling websites
- Good to have: knowledge of Elasticsearch
- Good to have: an understanding of Google Analytics data streaming
- Data security (Kerberos/OpenLDAP/Knox/Ranger)
- Knowledge of Big Data integration with third-party/in-house Metadata Management, Data Quality & Master Data Management solutions, for structured/unstructured data
- Active in speaking engagements at industry conferences