Location: Sunnyvale, CA
Experience: minimum 13+ years overall IT experience; minimum 5+ years architect experience; minimum 3+ years AWS/Hadoop/Hive/Big Data experience
Bachelor's degree in Computer Science or Engineering
8+ years of relevant consulting or industry experience
2+ years in a technical or functional lead role
Experience working independently with minimal guidance
Strong problem-solving and troubleshooting skills, with experience exercising mature judgment
Proven ability to prioritize workload effectively to meet deadlines and work objectives
Demonstrated ability to write clearly, succinctly, and in a manner that appeals to a wide audience
Proficiency in word processing, spreadsheet, and presentation creation tools, as well as Internet research tools
6+ years of relevant technology architecture consulting or industry experience, including information delivery, analytics, and business intelligence based on data from a hybrid of the Hadoop Distributed File System (HDFS), non-relational (NoSQL, e.g. MongoDB, Cassandra), and relational data warehouses
3+ years of hands-on experience with data lake implementations, core modernization, and data ingestion
1+ years of hands-on experience with cloud platforms: Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP)
At least 3 years of hands-on working experience with Big Data technologies: MapReduce, Pig, Hive, HBase, Sqoop, Spark, Flume, YARN, Kafka, Storm, etc.
Experience working with commercial Hadoop distributions (Hortonworks, Cloudera, Pivotal HD, MapR)
1+ years of hands-on experience designing and implementing data ingestion techniques for real-time and batch processing of video, voice, weblog, sensor, machine, and social media data into Hadoop ecosystems and HDFS clusters
1+ years of hands-on experience with data integration products such as Informatica PowerCenter Big Data Edition (BDE), IBM BigInsights, Pentaho, Talend, etc.
Ability to obtain and maintain the required security clearance for the role